# langgraph-starter

A LangGraph-based starter template for building collections of AI agents. This template provides a foundation for creating multi-agent systems with capabilities like LinkedIn lead collection, web search, and Airtable data integration.
## Features

- Collection of configurable AI agents using DeepAgents and LangGraph
- Agent entrypoints defined in `langgraph.json` for easy management
- Example: LinkedIn Lead Collector agent
- Airtable integration for structured data storage
- Built-in web search tools
- VS Code debugging configuration included
- Docker-ready for containerized development & deployment
## Prerequisites

- Python 3.11+
- Docker (optional, for containerized deployment)
## Quick Start

1. Clone the repository:

   ```bash
   git clone <YOUR_REPO_URL> langgraph-starter
   cd langgraph-starter
   ```

2. Install dependencies with uv (recommended):

   ```bash
   uv venv
   source .venv/bin/activate
   uv sync
   ```

3. Set up environment variables:

   ```bash
   cp .env.example .env
   # Edit .env with your API keys and configuration
   ```

4. Start the development server:

   ```bash
   make dev
   ```
## MCP Server

This project includes an MCP (Model Context Protocol) server for enhanced AI integrations.

Start the MCP server:

```bash
python mcp-
```

- LangGraph Studio – Official Studio for visual graph editing & monitoring
- Agent Chat UI – Clean chat interface for testing your agents
## Debugging

`.vscode/launch.json` is pre-configured for:

- Attach to LangGraph – Debug a running server on port 5678
- Debug Script – Run and debug individual scripts
Steps:

1. Run:

   ```bash
   make debug
   ```

2. In VS Code, press F5 and select "Attach to LangGraph".
## Project Structure

```
langgraph-starter/
├── agents/
│   └── linkedin_leads.py   # LinkedIn lead collection agent (example)
├── tools/
│   ├── __init__.py
│   ├── airtable.py         # Airtable integration tools
│   └── search.py           # Web search tools
├── .vscode/
│   └── launch.json         # VS Code debug config
├── docker-compose.yml      # Docker services
├── langgraph.json          # LangGraph configuration
├── Makefile                # Dev commands
├── pyproject.toml          # Python project configuration
└── .env.example            # Example environment variables
```
## Commands

```bash
make dev      # Start development server
make debug    # Start with debugging enabled (port 5678)
make build    # Build and push Docker image
```

Or use the LangGraph CLI directly:

```bash
langgraph dev                     # Start dev server
langgraph dev --debug-port 5678   # Start with debugging
langgraph build                   # Build the application
```

## Architecture

- Modular Design – Each agent is a separate module under `agents/`
- LangGraph Integration – Agents and entrypoints are declared in `langgraph.json`
- Tooling – Shared tools under `tools/` (e.g., Airtable, search) can be reused
- State & Orchestration – LangGraph coordinates agent state and message flow
## Example Agents

- LinkedIn Lead Collector – Searches LinkedIn for potential leads (example logic)
- Note Taker – Normalizes and organizes collected data
- Airtable Integration – Persists structured results to Airtable
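To illustrate the kind of normalization the Note Taker step performs before results are persisted, here is a small sketch; the field names are hypothetical, not taken from this repo:

```python
def normalize_lead(raw: dict) -> dict:
    """Normalize one collected lead record before writing it to Airtable.

    Field names here are illustrative placeholders.
    """
    return {
        "Name": raw.get("name", "").strip().title(),
        "Company": raw.get("company", "").strip(),
        "Profile URL": raw.get("url", "").strip().lower(),
    }
```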
## Adding a New Agent

- Create a new file under `agents/` (e.g., `my_agent.py`)
- Register it in `langgraph.json` (entrypoints, graph configuration)
- Wire up any shared tools you need from `tools/`
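The registration step typically looks like this; the `my_agent` entry name and `graph` variable are illustrative, while the `graphs` mapping follows the LangGraph CLI's `"name": "path:variable"` convention:

```json
{
  "dependencies": ["."],
  "graphs": {
    "my_agent": "./agents/my_agent.py:graph"
  },
  "env": ".env"
}
```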
## Configuration

Key files:

- `langgraph.json` – Graphs and entrypoints
- `.env` – Environment variables and API keys
- `pyproject.toml` – Python dependencies and project metadata
- `.vscode/launch.json` – VS Code debugging configuration

Tip: Ensure any secrets (API keys, tokens) are only in `.env` and excluded from version control.
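A minimal pattern for reading those secrets at runtime is to fail fast when one is missing; the variable name in the comment is a hypothetical example:

```python
import os


def require_env(name: str) -> str:
    """Return a required environment variable or raise a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value


# e.g. api_key = require_env("AIRTABLE_API_KEY")  # hypothetical variable name
```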
## Docker

Build and run with Docker:

```bash
docker compose up --build
```

Mount local files (e.g., `/files`) via `docker-compose.yml` volumes for read/write access.
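A sketch of what that volume mount can look like in `docker-compose.yml`; the service name and host path are assumptions, not taken from this repo:

```yaml
services:
  langgraph:            # hypothetical service name
    build: .
    env_file:
      - .env
    volumes:
      - ./files:/files  # host ./files mounted read/write at /files
```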
## Contributing

1. Fork the repo

2. Create a feature branch:

   ```bash
   git checkout -b feature/amazing-feature
   ```

3. Commit:

   ```bash
   git commit -m "Add amazing feature"
   ```

4. Push:

   ```bash
   git push origin feature/amazing-feature
   ```

5. Open a Pull Request
## License

This project is licensed under the terms specified in the LICENSE file.