PleasePrompto/notebooklm-mcp
MCP server for NotebookLM - Let your AI agents (Claude Code, Codex) research documentation directly with grounded, citation-backed answers from Gemini. Persistent auth, library management, cross-client sharing. Zero hallucinations, just your knowledge base.
Overview
PleasePrompto/notebooklm-mcp is a TypeScript MCP server licensed under MIT.
Ranked #6883 out of 25,632 indexed tools, with 1,375 GitHub stars.
Ecosystem
TypeScript MIT
Signal Breakdown
Stars: 1,375
Freshness: 2 months ago
Issue Health: 23%
Contributors: 2
Dependents: 0
Forks: 179
Description: Detailed
License: MIT
How to Improve
Freshness: high impact
Issue Health: high impact
Dependents: medium impact
Matched Queries
From the README
<div align="center">

# NotebookLM MCP Server

**Let your CLI agents (Claude, Cursor, Codex...) chat directly with NotebookLM for zero-hallucination answers based on your own notebooks**

[Installation](#installation) • [Quick Start](#quick-start) • [Why NotebookLM](#why-notebooklm-not-local-rag) • [Examples](#real-world-example) • [Claude Code Skill](https://github.com/PleasePrompto/notebooklm-skill) • [Documentation](./docs/)

</div>

---

## The Problem

When you tell Claude Code or Cursor to "search through my local documentation", here's what happens:

- **Massive token consumption**: Searching through documentation means reading multiple files repeatedly
- **Inaccurate retrieval**: Searches for keywords, misses context and connections between docs
- **Hallucinations**: When it can't find something, it invents plausible-sounding APIs
- **Expensive & slow**: Each question requires re-reading multiple files

## The Solution

Let your local agents chat directly with [**NotebookLM**](h

Read full README on GitHub →
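As an MCP server, this tool is wired into an agent by registering it in the client's MCP configuration. The snippet below is a minimal sketch using the common `mcpServers` config shape shared by Claude-family clients; the server name `notebooklm` and the assumption that the package runs via `npx notebooklm-mcp` are illustrative, not confirmed by this listing — check the project's README for the actual install command.

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "npx",
      "args": ["-y", "notebooklm-mcp"]
    }
  }
}
```

Once registered, the agent can call the server's research tools and receive citation-backed answers grounded in your NotebookLM notebooks instead of re-reading local files each turn.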