Create powerful AI workflows in Klavis AI by connecting multiple MCP servers, including Confluence, Figma, and Resend, for enhanced automation capabilities.
Confluence is a team workspace where knowledge and collaboration meet.
Figma is a collaborative interface design tool for web and mobile applications.
Resend is a modern email API for sending and receiving emails programmatically.
Follow these steps to connect LangChain to these MCP servers:

1. Sign up for Klavis AI to access our MCP server management platform.
2. Add your desired MCP servers to LangChain and configure authentication settings.
3. Verify your connections work correctly and start using your enhanced AI capabilities.
import os
import asyncio
from klavis import Klavis
from klavis.types import McpServerName, ConnectionType
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
# Initialize clients
klavis_client = Klavis(api_key=os.getenv("KLAVIS_API_KEY"))
llm = ChatOpenAI(model="gpt-4o-mini", api_key=os.getenv("OPENAI_API_KEY"))
# Create a Klavis-hosted MCP server instance for each service
confluence_mcp_instance = klavis_client.mcp_server.create_server_instance(
    server_name=McpServerName.CONFLUENCE,
    user_id="1234",
    platform_name="Klavis",
    connection_type=ConnectionType.STREAMABLE_HTTP,
)

figma_mcp_instance = klavis_client.mcp_server.create_server_instance(
    server_name=McpServerName.FIGMA,
    user_id="1234",
    platform_name="Klavis",
    connection_type=ConnectionType.STREAMABLE_HTTP,
)

resend_mcp_instance = klavis_client.mcp_server.create_server_instance(
    server_name=McpServerName.RESEND,
    user_id="1234",
    platform_name="Klavis",
    connection_type=ConnectionType.STREAMABLE_HTTP,
)
# Point the LangChain MCP client at all three server URLs
mcp_client = MultiServerMCPClient({
    "confluence": {
        "transport": "streamable_http",
        "url": confluence_mcp_instance.server_url,
    },
    "figma": {
        "transport": "streamable_http",
        "url": figma_mcp_instance.server_url,
    },
    "resend": {
        "transport": "streamable_http",
        "url": resend_mcp_instance.server_url,
    },
})
# Discover the tools exposed by all connected servers
tools = asyncio.run(mcp_client.get_tools())

# Build a ReAct agent that can call those tools
agent = create_react_agent(
    model=llm,
    tools=tools,
)

# Run a query against the agent
response = asyncio.run(agent.ainvoke({
    "messages": [{"role": "user", "content": "Your query here"}]
}))
Join developers who are already using Klavis AI to power their LangChain applications with these MCP servers.