Installation

Prerequisites

  • Python 3.9+
  • Docker & Docker Compose (for the memory server)
  • OpenAI API key (for embeddings)

Step 1: Install the SDK

pip install aegis-memory
With framework integrations:
# For CrewAI
pip install "aegis-memory[crewai]"

# For LangChain
pip install "aegis-memory[langchain]"

# All integrations
pip install "aegis-memory[all]"

Step 2: Start the Memory Server

From the directory containing docker-compose.yml, start the services in the background:
docker-compose up -d

Confirm the containers are running:
docker-compose ps

Step 3: Configure Environment

Create a .env file or set environment variables:
# Required
export AEGIS_API_KEY="dev-key"  # For local dev, any string works
export OPENAI_API_KEY="sk-..."  # For embeddings

# Optional
export AEGIS_BASE_URL="http://localhost:8000"  # Default
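The variables above map directly onto client configuration. A minimal sketch of reading them with the documented defaults (load_aegis_config is a hypothetical helper for illustration, not part of the SDK):

```python
import os

# Hypothetical helper: collects the documented environment variables,
# falling back to the default base URL when AEGIS_BASE_URL is unset.
def load_aegis_config() -> dict:
    return {
        "api_key": os.environ.get("AEGIS_API_KEY", "dev-key"),
        "base_url": os.environ.get("AEGIS_BASE_URL", "http://localhost:8000"),
    }

config = load_aegis_config()
print(config["base_url"])
```

The same dictionary can then be unpacked into the client constructor shown in Step 4, e.g. AegisClient(**config).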

Step 4: Verify Installation

from aegis_memory import AegisClient

client = AegisClient(
    api_key="dev-key",
    base_url="http://localhost:8000"
)

# Add a test memory
result = client.add("Test memory - Aegis is working!")
print(f"Memory stored: {result.id}")

# Query it back
memories = client.query("test memory")
print(f"Retrieved: {memories[0].content}")
You should see your test memory returned!

What’s Running

After docker-compose up, you have:
Service          | Port | Purpose
-----------------|------|--------------------------------
Aegis API        | 8000 | REST API for memory operations
PostgreSQL       | 5432 | Storage with pgvector extension
Redis (optional) | 6379 | Caching and rate limiting
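Whether each service is actually listening can also be checked from Python with a plain TCP probe (port_open is a hypothetical helper written for this guide, not part of the SDK):

```python
import socket

# Hypothetical helper: returns True if something accepts a TCP
# connection on the given host and port within the timeout.
def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in [("Aegis API", 8000), ("PostgreSQL", 5432)]:
    status = "up" if port_open("localhost", port) else "down"
    print(f"{name} ({port}): {status}")
```

If a service shows as down, see Troubleshooting below.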

Next Steps


Troubleshooting

Port conflicts

Check whether ports 8000 or 5432 are already in use:
lsof -i :8000
lsof -i :5432
Kill the conflicting processes or change the ports in docker-compose.yml.

Database not ready

Wait for PostgreSQL to finish initializing:
docker-compose logs postgres
Look for “database system is ready to accept connections”.

OpenAI errors

Ensure your API key is set and the account has credits:
echo $OPENAI_API_KEY
Test the key directly:
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"