Create powerful AI workflows by connecting multiple MCP servers, including Jira, HubSpot, and Firecrawl Deep Research, for enhanced automation capabilities in Klavis AI.
Jira is a project management and issue tracking tool developed by Atlassian.
HubSpot is a developer and marketer of software products for inbound marketing, sales, and customer service.
Firecrawl Deep Research is a personal research assistant that analyzes sources across the web, built on Firecrawl.
Follow these steps to connect LangChain to these MCP servers:
1. Sign up for Klavis AI to access our MCP server management platform.
2. Add your desired MCP servers to LangChain and configure authentication settings.
3. Verify your connections work correctly and start using your enhanced AI capabilities.
import os
import asyncio
from klavis import Klavis
from klavis.types import McpServerName, ConnectionType
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
# Initialize clients
klavis_client = Klavis(api_key=os.getenv("KLAVIS_API_KEY"))
llm = ChatOpenAI(model="gpt-4o-mini", api_key=os.getenv("OPENAI_API_KEY"))
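
# Create a Klavis-managed MCP server instance for each service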
jira_mcp_instance = klavis_client.mcp_server.create_server_instance(
    server_name=McpServerName.JIRA,
    user_id="1234",
    platform_name="Klavis",
    connection_type=ConnectionType.STREAMABLE_HTTP,
)
hubspot_mcp_instance = klavis_client.mcp_server.create_server_instance(
    server_name=McpServerName.HUBSPOT,
    user_id="1234",
    platform_name="Klavis",
    connection_type=ConnectionType.STREAMABLE_HTTP,
)
firecrawl_deep_research_mcp_instance = klavis_client.mcp_server.create_server_instance(
    server_name=McpServerName.FIRECRAWL_DEEP_RESEARCH,
    user_id="1234",
    platform_name="Klavis",
    connection_type=ConnectionType.STREAMABLE_HTTP,
)
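
# Point LangChain's MultiServerMCPClient at each server's Streamable HTTP endpoint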
mcp_client = MultiServerMCPClient({
    "jira": {
        "transport": "streamable_http",
        "url": jira_mcp_instance.server_url,
    },
    "hubspot": {
        "transport": "streamable_http",
        "url": hubspot_mcp_instance.server_url,
    },
    "firecrawl_deep_research": {
        "transport": "streamable_http",
        "url": firecrawl_deep_research_mcp_instance.server_url,
    },
})
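
# Discover the tools exposed by all connected MCP servers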
tools = asyncio.run(mcp_client.get_tools())
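
# Build a ReAct agent that can call the MCP tools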
agent = create_react_agent(
    model=llm,
    tools=tools,
)
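
# Invoke the agent with a natural-language request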
response = asyncio.run(agent.ainvoke({
    "messages": [{"role": "user", "content": "Your query here"}]
}))
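
For example, a single request can chain all three servers in one workflow. The sketch below is illustrative: workflow_query is a placeholder name, the prompt text is an assumption, and it presumes the Jira, HubSpot, and Firecrawl Deep Research instances created above are already authorized for your user.

# Illustrative multi-server prompt (placeholder wording, not a required format)
workflow_query = (
    "Research our top competitor's latest product announcements, "
    "create a Jira ticket summarizing the key findings, "
    "and add a note to the matching HubSpot company record."
)
response = asyncio.run(agent.ainvoke({
    "messages": [{"role": "user", "content": workflow_query}]
}))
# The final assistant message contains the agent's summary of what it did
print(response["messages"][-1].content)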
Join developers who are already using Klavis AI to power their LangChain applications with these MCP servers.
Start For Free