Ollama Deep Researcher MCP Server
Ollama Deep Researcher MCP servers enable AI models to perform advanced topic research using web search and LLM synthesis, powered by a local MCP server.
Overview
The Ollama Deep Researcher MCP server enables AI models to perform advanced topic research using web search and LLM synthesis. It is part of the Model Context Protocol (MCP) ecosystem, providing a safe, standard way to connect AI with research capabilities.
Key Features
Web Search Integration
Utilize web search APIs (Tavily, Perplexity) to gather up-to-date information.
LLM Synthesis
Synthesize research findings using local Ollama LLMs for comprehensive answers.
Iterative Research Process
Iteratively improve summaries by identifying knowledge gaps and generating new search queries.
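As a rough illustration, the iterative loop can be sketched as follows. This is a simplified model of the process described above, not the server's actual implementation; `webSearch`, `synthesize`, and `nextQuery` are hypothetical stand-ins for the search-API and LLM calls.

```typescript
// Sketch of the iterative research loop: search, fold results into the
// summary, then generate a follow-up query aimed at remaining gaps.
type Finding = { query: string; snippet: string };

function webSearch(query: string): Finding {
  // Stub: a real implementation would call Tavily or Perplexity here.
  return { query, snippet: `results for "${query}"` };
}

function synthesize(summary: string, finding: Finding): string {
  // Stub: a real implementation would ask the local Ollama LLM to merge
  // the new finding into the running summary.
  return summary + `\n- ${finding.snippet}`;
}

function nextQuery(summary: string, loop: number): string {
  // Stub: a real implementation would ask the LLM to identify a
  // knowledge gap in the summary and phrase a new search query for it.
  return `follow-up query #${loop}`;
}

function researchLoop(topic: string, maxLoops: number): string {
  let summary = `# ${topic}`;
  let query = topic;
  for (let i = 1; i <= maxLoops; i++) {
    const finding = webSearch(query);
    summary = synthesize(summary, finding);
    query = nextQuery(summary, i + 1);
  }
  return summary;
}
```

Each pass folds new search results into the summary and derives the next query from what is still missing, which is why raising the loop count tends to produce broader, better-sourced summaries.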
Markdown Summary with Sources
Provides a final markdown summary of research with all sources used.
Available Tools
Quick Reference
| Tool | Purpose | Category |
|---|---|---|
| research | Research a given topic | Research |
| get_status | Get the status of ongoing research | Utility |
| configure | Configure research parameters | Configuration |
Detailed Usage
research
Initiate a deep research process on a specified topic.
```javascript
use_mcp_tool({
  server_name: "ollama-deep-researcher",
  tool_name: "research",
  arguments: {
    topic: "The impact of AI on climate change"
  }
});
```
Returns a research ID to track progress.
get_status
Retrieve the current status and results of an ongoing research task.
```javascript
use_mcp_tool({
  server_name: "ollama-deep-researcher",
  tool_name: "get_status",
  arguments: {
    researchId: "research-12345"
  }
});
```
Returns the current status, progress, and final summary if completed.
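Because research runs asynchronously, a client typically polls get_status until the task reports completion. A minimal sketch of that pattern, where `pollOnce` is a hypothetical stand-in for one `use_mcp_tool` get_status call and the status shape is assumed rather than documented:

```typescript
// Poll a status function until the research completes or we give up.
type Status = { state: "pending" | "running" | "completed"; summary?: string };

function waitForResearch(pollOnce: () => Status, maxAttempts = 10): Status {
  for (let i = 0; i < maxAttempts; i++) {
    const status = pollOnce();
    if (status.state === "completed") return status;
    // A real client would sleep between polls (e.g. setTimeout) here.
  }
  throw new Error("research did not complete within the polling budget");
}
```

Bounding the number of attempts keeps a stalled research task from blocking the client indefinitely.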
configure
Configure parameters for the research process, such as max loops, LLM model, and search API.
```javascript
use_mcp_tool({
  server_name: "ollama-deep-researcher",
  tool_name: "configure",
  arguments: {
    maxLoops: 5,
    llmModel: "deepseek-r1:8b",
    searchApi: "tavily"
  }
});
```
Updates the research configuration for subsequent tasks.
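Since the server's validation rules are not documented, a client may want to sanity-check configuration values before sending them. A small sketch using the field names from the example above; the constraints (e.g. maxLoops must be a positive integer) are assumptions, not documented server behavior:

```typescript
// Illustrative pre-send validation for the configure tool's arguments.
type ResearchConfig = {
  maxLoops: number;
  llmModel: string;
  searchApi: "tavily" | "perplexity";
};

function validateConfig(cfg: ResearchConfig): string[] {
  const errors: string[] = [];
  // Assumed constraint: at least one research loop must run.
  if (!Number.isInteger(cfg.maxLoops) || cfg.maxLoops < 1) {
    errors.push("maxLoops must be a positive integer");
  }
  // Assumed constraint: an Ollama model tag must be provided.
  if (!cfg.llmModel) {
    errors.push("llmModel must be a non-empty model tag");
  }
  return errors;
}
```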
Installation
```json
{
  "mcpServers": {
    "ollama-deep-researcher": {
      "command": "node",
      "args": [
        "path/to/mcp-server-ollama-deep-researcher/build/index.js"
      ],
      "env": {
        "TAVILY_API_KEY": "your_tavily_key",
        "PERPLEXITY_API_KEY": "your_perplexity_key",
        "LANGSMITH_API_KEY": "your-langsmith-key"
      }
    }
  }
}
```
Environment Variables:
Ensure TAVILY_API_KEY, PERPLEXITY_API_KEY, and LANGSMITH_API_KEY are set in your environment or directly in the env object.
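One way to catch a missing key early is a small preflight check before starting the server. `missingKeys` is an illustrative helper, not part of the server; it simply reports which of the three required variables are unset:

```typescript
// Report which required API keys are missing from an environment map.
function missingKeys(env: Record<string, string | undefined>): string[] {
  const required = ["TAVILY_API_KEY", "PERPLEXITY_API_KEY", "LANGSMITH_API_KEY"];
  return required.filter((key) => !env[key]);
}

// Usage: check the live process environment before launching.
const missing = missingKeys(process.env);
if (missing.length > 0) {
  console.error(`Missing environment variables: ${missing.join(", ")}`);
}
```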
Common Use Cases
1. Automated Literature Review
Perform systematic literature reviews on specific topics, aggregating information from various web sources.
2. Knowledge Gap Analysis
Identify missing information or inconsistencies in existing knowledge bases by iteratively researching and synthesizing findings.
3. Research Synthesis for Reports
Generate comprehensive summaries and reports on complex subjects, complete with citations and source validation.
Related Articles
S3 MCP Server
S3 MCP servers enable AI models to interact with Amazon S3 object storage, providing capabilities for file operations, metadata management, and versioning in a secure and scalable environment.
Flutter MCP Server
Flutter MCP servers enable AI models to interact with Flutter projects, providing capabilities for code analysis, formatting, testing, and documentation retrieval.
Notion MCP Server
Notion MCP servers enable AI models to interact with Notion workspaces, providing capabilities for database operations, page management, content creation, and collaborative workspace automation.