Ollama Deep Researcher: AI Model for Web Search & LLM Synthesis
The Ollama Deep Researcher MCP server enables AI models to perform advanced topic research, combining web search with local LLM synthesis through a locally run MCP server.
Overview
The Ollama Deep Researcher MCP server enables AI models to perform advanced topic research using web search and LLM synthesis. It is part of the Model Context Protocol (MCP) ecosystem, providing a safe, standard way to connect AI with research capabilities. For more advanced memory capabilities, consider integrating with a Knowledge Graph Memory server or Supavec. You can also explore other AI/ML tools for enhanced functionality.
Created by: Cameron Rohn
Key Features
Web Search Integration
Utilize web search APIs (Tavily, Perplexity) to gather up-to-date information.
LLM Synthesis
Synthesize research findings using local Ollama LLMs for comprehensive answers.
Iterative Research Process
Iteratively improve summaries by identifying knowledge gaps and generating new search queries; a sketch of this loop follows the feature list.
Markdown Summary with Sources
Provides a final markdown summary of research with all sources used.
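To make the iterative loop concrete, here is a minimal sketch of how such a researcher could be structured. This is an illustration, not the server's actual implementation; searchWeb, summarize, findKnowledgeGap, and generateQuery are hypothetical helpers standing in for the web search API calls and local Ollama LLM prompts.

```typescript
// Hypothetical sketch of the iterative research loop; not the server's real code.
interface SearchResult {
  url: string;
  content: string;
}

declare function searchWeb(query: string): Promise<SearchResult[]>;
declare function summarize(current: string, results: SearchResult[]): Promise<string>;
declare function findKnowledgeGap(summary: string): Promise<string | null>;
declare function generateQuery(gap: string): Promise<string>;

async function deepResearch(topic: string, maxLoops: number): Promise<string> {
  let query = topic;
  let summary = "";
  const sources: string[] = [];

  for (let i = 0; i < maxLoops; i++) {
    const results = await searchWeb(query);       // Tavily or Perplexity
    sources.push(...results.map((r) => r.url));
    summary = await summarize(summary, results);  // Ollama LLM synthesis
    const gap = await findKnowledgeGap(summary);  // what is still missing?
    if (!gap) break;                              // no gaps left: stop early
    query = await generateQuery(gap);             // new query targeting the gap
  }

  // Final markdown summary with every source that was consulted.
  return summary + "\n\nSources:\n" + sources.map((s) => `- ${s}`).join("\n");
}
```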
Available Tools
Quick Reference
| Tool | Purpose | Category |
|---|---|---|
| research | Research a given topic | Research |
| get_status | Get the status of ongoing research | Utility |
| configure | Configure research parameters | Configuration |
Detailed Usage
research
Initiate a deep research process on a specified topic.
```typescript
use_mcp_tool({
  server_name: "ollama-deep-researcher",
  tool_name: "research",
  arguments: {
    topic: "The impact of AI on climate change"
  }
});
```
Returns a research ID to track progress.
get_status
Retrieve the current status and results of an ongoing research task.
```typescript
use_mcp_tool({
  server_name: "ollama-deep-researcher",
  tool_name: "get_status",
  arguments: {
    researchId: "research-12345"
  }
});
```
Returns the current status, progress, and final summary if completed.
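Because research runs asynchronously, a client typically polls get_status until the task finishes. The sketch below assumes a use_mcp_tool helper shaped like the examples above and a response carrying status and summary fields; the server's actual response shape may differ.

```typescript
// Hypothetical polling helper; the status/summary response fields are assumptions.
declare function use_mcp_tool(request: {
  server_name: string;
  tool_name: string;
  arguments: Record<string, unknown>;
}): Promise<any>;

async function waitForResearch(researchId: string): Promise<string> {
  while (true) {
    const result = await use_mcp_tool({
      server_name: "ollama-deep-researcher",
      tool_name: "get_status",
      arguments: { researchId },
    });
    if (result.status === "completed") {
      return result.summary ?? "";
    }
    await new Promise((resolve) => setTimeout(resolve, 5000)); // check again in 5s
  }
}
```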
configure
Configure parameters for the research process, such as max loops, LLM model, and search API.
```typescript
use_mcp_tool({
  server_name: "ollama-deep-researcher",
  tool_name: "configure",
  arguments: {
    maxLoops: 5,
    llmModel: "deepseek-r1:8b",
    searchApi: "tavily"
  }
});
```
Updates the research configuration for subsequent tasks.
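Putting the three tools together, a typical session configures the researcher once, launches a task, and waits for the summary. The snippet below reuses the hypothetical use_mcp_tool and waitForResearch helpers sketched above; the researchId field on the research response is likewise an assumption, so check your server's actual output.

```typescript
// End-to-end sketch: configure, start research, then poll for the final summary.
await use_mcp_tool({
  server_name: "ollama-deep-researcher",
  tool_name: "configure",
  arguments: { maxLoops: 3, llmModel: "deepseek-r1:8b", searchApi: "tavily" },
});

const started = await use_mcp_tool({
  server_name: "ollama-deep-researcher",
  tool_name: "research",
  arguments: { topic: "The impact of AI on climate change" },
});

// Assumes the research call returns its tracking ID as researchId.
const summary = await waitForResearch(started.researchId);
console.log(summary);
```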
Installation
Add the server to your MCP client configuration:

```json
{
  "mcpServers": {
    "ollama-deep-researcher": {
      "command": "node",
      "args": [
        "path/to/mcp-server-ollama-deep-researcher/build/index.js"
      ],
      "env": {
        "TAVILY_API_KEY": "your_tavily_key",
        "PERPLEXITY_API_KEY": "your_perplexity_key",
        "LANGSMITH_API_KEY": "your-langsmith-key"
      }
    }
  }
}
```
Environment Variables:
Ensure TAVILY_API_KEY, PERPLEXITY_API_KEY, and LANGSMITH_API_KEY are set in your environment or directly in the env object.
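Because the server runs under Node, these keys are read from the process environment. As an illustrative sanity check (not part of the server itself), you can verify they are present before launching:

```typescript
// Fail fast if any of the required API keys is missing from the environment.
const required = ["TAVILY_API_KEY", "PERPLEXITY_API_KEY", "LANGSMITH_API_KEY"];
const missing = required.filter((name) => !process.env[name]);

if (missing.length > 0) {
  console.error(`Missing environment variables: ${missing.join(", ")}`);
  process.exit(1);
}
```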
Common Use Cases
1. Automated Literature Review
Perform systematic literature reviews on specific topics, aggregating information from various web sources.
2. Knowledge Gap Analysis
Identify missing information or inconsistencies in existing knowledge bases by iteratively researching and synthesizing findings.
3. Research Synthesis for Reports
Generate comprehensive summaries and reports on complex subjects, complete with citations and source validation.