Connect to Jira and Firecrawl Deep Research MCP Servers

Create powerful AI workflows by connecting multiple MCP servers, including Jira and Firecrawl Deep Research, for enhanced automation capabilities in Klavis AI.

Jira

Jira is a project management and issue-tracking tool developed by Atlassian.

Available Tools:

  • jira_search
  • jira_get_issue
  • jira_search_fields
  • +11 more tools

Firecrawl Deep Research

A personal research assistant, built on Firecrawl, that analyzes sources across the web.

Available Tools:

  • firecrawl_deep_research

Quick Setup Guide

Follow these steps to connect LangChain to these MCP servers.

1. Create Your Account
   Sign up for Klavis AI to access our MCP server management platform.

2. Configure Connections
   Add your desired MCP servers to LangChain and configure authentication settings.

3. Test & Deploy
   Verify your connections work correctly and start using your enhanced AI capabilities.

LangChain + Klavis AI Integration Snippets

import os
import asyncio
from klavis import Klavis
from klavis.types import McpServerName, ConnectionType
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

# Initialize clients
klavis_client = Klavis(api_key=os.getenv("KLAVIS_API_KEY"))
llm = ChatOpenAI(model="gpt-4o-mini", api_key=os.getenv("OPENAI_API_KEY"))

# Create a Jira MCP server instance for this user
jira_mcp_instance = klavis_client.mcp_server.create_server_instance(
    server_name=McpServerName.JIRA,
    user_id="1234",
    platform_name="Klavis",
    connection_type=ConnectionType.STREAMABLE_HTTP,
)

# Create a Firecrawl Deep Research MCP server instance for the same user
firecrawl_deep_research_mcp_instance = klavis_client.mcp_server.create_server_instance(
    server_name=McpServerName.FIRECRAWL_DEEP_RESEARCH,
    user_id="1234",
    platform_name="Klavis",
    connection_type=ConnectionType.STREAMABLE_HTTP,
)

# Point LangChain's MultiServerMCPClient at both server URLs over streamable HTTP
mcp_client = MultiServerMCPClient({
    "jira": {
        "transport": "streamable_http",
        "url": jira_mcp_instance.server_url
    },
    "firecrawl_deep_research": {
        "transport": "streamable_http",
        "url": firecrawl_deep_research_mcp_instance.server_url
    }
})

# Load all tools exposed by the connected MCP servers
tools = asyncio.run(mcp_client.get_tools())

# Build a ReAct agent that can call the Jira and Firecrawl Deep Research tools
agent = create_react_agent(
    model=llm,
    tools=tools,
)

# Run the agent with a user query
response = asyncio.run(agent.ainvoke({
    "messages": [{"role": "user", "content": "Your query here"}]
}))
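
As a quick sanity check before you deploy (step 3 above), you can confirm that tool loading worked and then send a concrete request that exercises both servers. The following is a minimal sketch that reuses the objects created in the snippet; the example query wording is illustrative and not required by any API.

# Confirm the MCP connections returned tools (should include jira_search,
# jira_get_issue, firecrawl_deep_research, etc.)
print([tool.name for tool in tools])

# Illustrative query that exercises both servers
result = asyncio.run(agent.ainvoke({
    "messages": [{
        "role": "user",
        "content": (
            "Find open Jira issues that mention 'login timeout', then run a "
            "deep research on common causes of session timeouts and summarize "
            "what you find."
        )
    }]
}))

# The agent's final answer is the last message in the returned state
print(result["messages"][-1].content)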


Ready to Get Started?

Join developers who are already using Klavis AI to power their LangChain applications with these MCP servers.

Start For Free