Fix: Add vector store config, fix volume mount, add Ollama support #3858
+786
−17
Written with Cursor, works on my machine.
Pull Request: Fix Vector Store Configuration & Add Ollama Support
Summary
This PR fixes critical bugs preventing OpenMemory from working and adds comprehensive Ollama integration documentation.
Problems Fixed
1. Missing Vector Store Configuration (Critical)
Issue: Fresh installations fail with "Memory client is not available" error because vector store configuration is missing.
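For context, the kind of `vector_store` block whose absence triggers this error looks roughly like the following (a sketch; the collection name is an assumption, while the `mem0_store` host and port 6333 follow the Qdrant service described later in this PR):

```json
{
  "vector_store": {
    "provider": "qdrant",
    "config": {
      "collection_name": "openmemory",
      "host": "mem0_store",
      "port": 6333
    }
  }
}
```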
Root Cause:
`config.py` line 72 returns `"vector_store": None`, which gets saved to the database and prevents the memory client from initializing.
Files Changed:
- `openmemory/api/app/routers/config.py`: added a proper Qdrant configuration
- `openmemory/api/config.json`: added a vector_store section for consistency
- `openmemory/api/default_config.json`: added a vector_store section for consistency
2. Incorrect Volume Mount
Issue: Docker Compose volume mount overwrites the entire application directory, causing "Could not import module main" errors.
Root Cause: Mounting `./api:/usr/src/openmemory` replaces the Python code built into the container.
Fix: Changed to only mount the data directory: `./api/data:/usr/src/openmemory/data`
Files Changed:
- `openmemory/docker-compose.yml`: fixed volume mount path
3. Environment Variable Precedence
Issue: Database configuration overrides environment variables, preventing compose files from controlling Qdrant connection settings.
Root Cause: Database config always overwrote environment-based config, even when `QDRANT_HOST` and `QDRANT_PORT` were set.
Fix: Environment variables now take precedence over database configuration for vector store settings, allowing compose files to control connection settings.
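The precedence rule can be sketched as follows (illustrative only; the function name and the shape of the stored config are assumptions, not the actual `memory.py` code):

```python
import os


def resolve_qdrant_settings(db_config: dict) -> dict:
    """Merge Qdrant connection settings, letting environment variables win.

    db_config holds the vector store settings previously saved to the
    database; environment variables set by a compose file override them.
    """
    settings = {
        "host": db_config.get("host", "mem0_store"),
        "port": db_config.get("port", 6333),
    }
    # Environment variables take precedence over database configuration,
    # so compose files can control the Qdrant connection.
    if os.environ.get("QDRANT_HOST"):
        settings["host"] = os.environ["QDRANT_HOST"]
    if os.environ.get("QDRANT_PORT"):
        settings["port"] = int(os.environ["QDRANT_PORT"])
    return settings
```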
Files Changed:
- `openmemory/api/app/utils/memory.py`: added environment variable precedence check
Enhancements
Ollama Integration Documentation
Added comprehensive guide for switching from OpenAI to Ollama for local AI processing.
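As an illustration, the switch described in the guide typically amounts to a config along these lines (a sketch; the model names are assumptions based on common mem0 Ollama setups, and the URL uses the `host.containers.internal` gateway mentioned under Testing below):

```json
{
  "llm": {
    "provider": "ollama",
    "config": {
      "model": "llama3.1:latest",
      "ollama_base_url": "http://host.containers.internal:11434"
    }
  },
  "embedder": {
    "provider": "ollama",
    "config": {
      "model": "nomic-embed-text:latest",
      "ollama_base_url": "http://host.containers.internal:11434"
    }
  }
}
```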
Files Added:
- `openmemory/OLLAMA.md`: complete Ollama integration guide
- `openmemory/podman-compose.yml`: Podman-compatible compose file with SELinux support
Covers:
Testing
Tested Scenarios
✅ Fresh install with fixed configuration
✅ Memory creation with OpenAI
✅ Switching to Ollama
✅ Memory creation with Ollama (embeddings working)
✅ Environment variable precedence
✅ Podman compatibility
✅ Service name resolution (mem0_store)
✅ Host gateway access (host.containers.internal)
✅ Data persistence across restarts
Test Commands
Impact
Before This PR
After This PR
Files Changed
Backward Compatibility
✅ Fully backward compatible
Additional Notes
Why Vector Store Config is Critical
Mem0 requires vector store configuration to initialize the memory client. Without it:
- the `Memory()` constructor fails
Why Volume Mount Matters
The current volume mount (`./api:/usr/src/openmemory`) overwrites the application code built into the image, causing immediate startup failures. The fix mounts only the data directory, preserving the container's built code while maintaining data persistence.
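As a compose fragment, the change looks roughly like this (a sketch; the service name `openmemory-mcp` is an assumption, only the mount paths come from this PR):

```yaml
services:
  openmemory-mcp:
    # Before (broken): mounting the whole source tree hid the code
    # baked into the image:
    #   - ./api:/usr/src/openmemory
    # After (fixed): mount only the data directory, so the built code
    # survives while data persists across restarts.
    volumes:
      - ./api/data:/usr/src/openmemory/data
```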
Ollama Integration Benefits
Checklist
Known Issues
See `KNOWN_ISSUES.md` for details.
Future Improvements (Not in This PR)