Installation

Get DRIP/KG-RAG up and running on your system. This guide covers system requirements, Docker setup, SDK installation, and configuration options.

System Requirements

Minimum Requirements

  • CPU: 4 cores, 2.4 GHz
  • RAM: 8 GB
  • Storage: 50 GB free space
  • OS: Linux, macOS, or Windows
  • Docker: 20.10+ and Docker Compose 2.0+

Recommended Requirements

  • CPU: 8+ cores, 3.0+ GHz
  • RAM: 16+ GB
  • Storage: 100+ GB SSD
  • GPU: NVIDIA GPU with CUDA support (optional, for faster processing)

Quick Start with Docker

The easiest way to get started is using Docker Compose, which sets up all required services automatically.

1. Clone the Repository

git clone https://github.com/functor-ai/drip-kg-rag.git
cd drip-kg-rag

2. Start the Services

# Start all services
docker-compose up -d
# Check service status
docker-compose ps

3. Verify Installation

# Check API health
curl http://localhost:8000/api/v1/health
# Expected response:
# {"status": "healthy", "system": "DRIP/KG-RAG", "version": "1.0.0"}

Manual Installation

For development or custom deployments, you can install components individually.

Core Services

Neo4j (Knowledge Graph Database)

# Using Docker
docker run -d \
--name neo4j \
-p 7474:7474 -p 7687:7687 \
-e NEO4J_AUTH=neo4j/password \
neo4j:5.15
# Or install locally
# Download from https://neo4j.com/download/
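
If you want to confirm the container is reachable from Python before wiring up the full stack, the official neo4j driver (installed separately with pip install neo4j) can verify connectivity against the credentials above. This is a minimal sketch, not part of the DRIP/KG-RAG setup scripts:

from neo4j import GraphDatabase

# Connect with the credentials passed via NEO4J_AUTH above
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
driver.verify_connectivity()  # raises if the server is unreachable
print("Neo4j is up")
driver.close()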

Qdrant (Vector Database)

# Using Docker
docker run -d \
--name qdrant \
-p 6333:6333 \
qdrant/qdrant:latest
# Or install the Python client library (the server itself runs via Docker or a native binary)
pip install qdrant-client
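
The qdrant-client package talks to a running Qdrant server; a quick connectivity check, assuming the default port mapping from the Docker command above:

from qdrant_client import QdrantClient

# Point the client at the container started above
client = QdrantClient(url="http://localhost:6333")
print(client.get_collections())  # empty on a fresh install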

Redis (Caching)

# Using Docker
docker run -d \
--name redis \
-p 6379:6379 \
redis:7-alpine
# Or install locally
# macOS: brew install redis
# Ubuntu: sudo apt install redis-server
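
Likewise, the redis Python package (pip install redis) can confirm the cache is reachable; a minimal sketch assuming the default port:

import redis

# PING returns True when the server is reachable
r = redis.Redis(host="localhost", port=6379)
print(r.ping())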

SDK Installation

Functor SDK (Python)

# Install from PyPI
pip install functor-sdk
# Or using uv (faster)
uv pip install functor-sdk
# Or from source
git clone https://github.com/functor-ai/functor-sdk.git
cd functor-sdk
pip install -e ".[dev]"

MCP SDK (Python)

# Install MCP SDK
pip install drip-mcp-client
# Or install from source
git clone https://github.com/functor-ai/drip-mcp.git
cd drip-mcp
pip install -e .

Node.js SDK

# Install via npm
npm install @functor-ai/drip-client
# Or via yarn
yarn add @functor-ai/drip-client

Configuration

Environment Variables

Create a .env file in your project root:

.env
# API Configuration
FUNCTOR_API_KEY=your-api-key-here
FUNCTOR_BASE_URL=http://localhost:8000
# Database Configuration
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=password
QDRANT_URL=http://localhost:6333
REDIS_URL=redis://localhost:6379
# Optional: LLM Configuration
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
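
The SDK examples later in this guide pass the API key explicitly; if you prefer to keep credentials in the .env file, you can load it with python-dotenv (pip install python-dotenv). The base_url keyword below is an assumption about the FunctorClient constructor, mirroring FUNCTOR_BASE_URL above:

import os
from dotenv import load_dotenv
from functor_sdk import FunctorClient

load_dotenv()  # reads .env from the current directory

# base_url is assumed to mirror FUNCTOR_BASE_URL; adjust to the actual constructor
client = FunctorClient(
    api_key=os.environ["FUNCTOR_API_KEY"],
    base_url=os.environ.get("FUNCTOR_BASE_URL", "http://localhost:8000"),
)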

API Key Generation

Generate an API key for authentication:

# Generate API key
curl -X POST http://localhost:8000/api/v1/auth/generate-key \
-H "Content-Type: application/json" \
-d '{"name": "my-app", "expires_in": 365}'
# Response:
# {"api_key": "drip_1234567890abcdef", "expires_at": "2025-01-01T00:00:00Z"}

Development Setup

Local Development

For development, you can run the system locally:

# Clone the repository
git clone https://github.com/functor-ai/drip-kg-rag.git
cd drip-kg-rag
# Install Python dependencies
pip install -r requirements.txt
# Install development dependencies
pip install -r requirements-dev.txt
# Set up pre-commit hooks
pre-commit install
# Run the development server
python -m uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
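
The uvicorn command above expects an ASGI application exported as app in app/main.py, which ships with the repository. Purely to illustrate the shape of that entry point and of the health response shown earlier, here is a minimal FastAPI stand-in (not the project's actual application):

from fastapi import FastAPI

app = FastAPI()

@app.get("/api/v1/health")
def health():
    # Mirrors the response format from the verification step above
    return {"status": "healthy", "system": "DRIP/KG-RAG", "version": "1.0.0"}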

Database Setup

# Initialize Neo4j database
python scripts/init_neo4j.py
# Create initial knowledge graphs
python scripts/create_default_kgs.py
# Verify setup
python scripts/verify_setup.py

Verification

Test Installation

Verify that everything is working correctly:

test_installation.py
from functor_sdk import FunctorClient
# Test basic connectivity
client = FunctorClient(api_key="your-api-key")
# Check system health
health = client.health.check()
print(f"System status: {health['status']}")
# Test knowledge graph listing
kgs = client.knowledge_graphs.list()
print(f"Available KGs: {len(kgs)}")
# Test a simple query
result = client.queries.execute("What is machine learning?")
print(f"Query result: {result.answer[:100]}...")
print("✅ Installation successful!")

Health Checks

# Check all services
curl http://localhost:8000/api/v1/health
# Check detailed health
curl http://localhost:8000/api/v1/health/detailed
# Check visualization service
curl http://localhost:8000/api/visualizations/health
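
If you run these checks often, a small Python helper can poll all three endpoints at once; a sketch using requests, assuming the endpoints listed above:

import requests

ENDPOINTS = [
    "http://localhost:8000/api/v1/health",
    "http://localhost:8000/api/v1/health/detailed",
    "http://localhost:8000/api/visualizations/health",
]

for url in ENDPOINTS:
    try:
        resp = requests.get(url, timeout=5)
        print(f"{url}: {resp.status_code}")
    except requests.ConnectionError:
        print(f"{url}: unreachable")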

Troubleshooting

Common Issues

Docker Issues

# Check Docker status
docker --version
docker-compose --version
# Restart services
docker-compose down
docker-compose up -d
# View logs
docker-compose logs -f

Port Conflicts

# Check port usage
netstat -tulpn | grep :8000
netstat -tulpn | grep :7474
netstat -tulpn | grep :6333
# Kill processes using ports
sudo kill -9 $(lsof -t -i:8000)

Database Connection Issues

# Test Neo4j connection
curl -u neo4j:password http://localhost:7474/
# Test Qdrant connection
curl http://localhost:6333/collections
# Test Redis connection
redis-cli ping

Performance Optimization

Docker Resource Limits

docker-compose.override.yml
version: '3.8'
services:
  neo4j:
    deploy:
      resources:
        limits:
          memory: 4G
        reservations:
          memory: 2G
  qdrant:
    deploy:
      resources:
        limits:
          memory: 2G
        reservations:
          memory: 1G

System Tuning

# Increase file descriptor limits (requires root)
echo "* soft nofile 65536" | sudo tee -a /etc/security/limits.conf
echo "* hard nofile 65536" | sudo tee -a /etc/security/limits.conf
# Optimize kernel parameters
echo "vm.max_map_count=262144" | sudo tee -a /etc/sysctl.conf
sudo sysctl -p

Next Steps

Now that you have DRIP/KG-RAG installed, you're ready to start building with the SDKs and the API.

Support

If you encounter issues during installation, review the Troubleshooting section above or open an issue on the project's GitHub repository.