  Configuration for using local LLMs and embedders instead of Mem0's cloud API (see the Local Configuration section below)
</ParamField>

<Tip>
  At least one of `user_id`, `agent_id`, or `run_id` must be provided to organize memories.
</Tip>
])
```

### Using Local Configuration

The `local_config` parameter allows you to use your own LLM and embedding providers instead of Mem0's cloud API. This is useful for self-hosted deployments or when you want more control over memory processing.

```python
local_config = {
    "llm": {
        "provider": str,  # LLM provider name (e.g., "anthropic", "openai")
        "config": {
            # Provider-specific configuration
            "model": str,    # Model name
            "api_key": str,  # API key for the provider
            # Other provider-specific parameters
        },
    },
    "embedder": {
        "provider": str,  # Embedding provider name (e.g., "openai")
        "config": {
            # Provider-specific configuration
            "model": str,  # Model name
            # Other provider-specific parameters
        },
    },
}

# Initialize Mem0 memory service with local configuration
memory = Mem0MemoryService(
    local_config=local_config,  # Use local LLM for memory processing
    user_id="user123",          # Unique identifier for the user
)
```
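
As a concrete sketch, here is what a filled-in `local_config` might look like for a fully local setup served through Ollama. The provider names and the `ollama_base_url` option follow Mem0's configuration format, but the model names and URL are illustrative assumptions; substitute the providers and models you actually run.

```python
# Illustrative local_config: a local chat model and a local embedding
# model served by Ollama (models and URL are example values).
local_config = {
    "llm": {
        "provider": "ollama",
        "config": {
            "model": "llama3.1:8b",  # local chat model used for memory processing
            "ollama_base_url": "http://localhost:11434",
        },
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",  # local embedding model
            "ollama_base_url": "http://localhost:11434",
        },
    },
}
```

Because both the LLM and the embedder point at local endpoints, no memory data leaves your machine during processing.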

<Warning>
  When using `local_config`, do not provide the `api_key` parameter.
</Warning>