- [Structure of an Interface Options Object](#structure-of-an-interface-options-object)
- [Caching](#caching)
- [MongoDB](#mongodb)
- [Memory Cache](#memory-cache)
- [Example Usage](#example-usage-4)
- [Support](#support)
- [Model Aliases](#model-aliases)
- [Embeddings Model Aliases](#embedding-model-aliases)
- [Jailbreaking](#jailbreaking)
- [Glossary](#glossary)
- [Examples](#examples)
## API Keys
To interact with different LLM providers, you will need API keys. Refer to [API Keys](api-keys.md) for detailed instructions on obtaining and configuring API keys for supported providers.
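As a quick illustration, the minimal sketch below supplies one key and sends a prompt. It assumes an `OPENAI_API_KEY` environment variable and the `setApiKey`/`sendMessage` helpers covered in [API Keys](api-keys.md) and [Usage](usage.md); see those pages for the full set of supported providers and configuration options.

```javascript
// Minimal sketch: supply a provider API key, then send a prompt.
// Assumes an OPENAI_API_KEY environment variable is set; see api-keys.md
// for all supported providers and configuration details.
const { LLMInterface } = require('llm-interface');

LLMInterface.setApiKey({ openai: process.env.OPENAI_API_KEY });

LLMInterface.sendMessage('openai', 'Explain low-latency LLMs in one sentence.')
  .then((response) => console.log(response))
  .catch((error) => console.error(error));
```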
## Usage
_processStream(stream) is defined in the [streaming mode example](/examples/misc/streaming-mode.js)._
_This is a legacy function and will be deprecated._
## Message Object
The message object is a critical component when interacting with the various LLM APIs through the LLMInterface npm module. It contains the data that will be sent to the LLM for processing and allows for complex conversations. Below is a detailed explanation of the structure of a valid message object.
A valid message object typically includes the following properties:
- `model`: A string specifying the model to use for the request (optional).
- `messages`: An array of message objects that form the conversation history.
Different LLMs may have their own message object rules. For example, both Anthropic and Gemini always expect the initial message to have the `user` role. Please be aware of this and structure your message objects accordingly.
_LLMInterface will attempt to auto-correct invalid objects where possible._
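For illustration, a minimal message object might look like the sketch below. The model name is only an example, and the conversation opens with a `user` message to satisfy providers such as Anthropic and Gemini.

```javascript
// Minimal sketch of a valid message object. The model name is illustrative;
// `model` is optional, and the conversation starts with a `user` message
// because some providers (Anthropic, Gemini) require it.
const message = {
  model: 'gpt-4o-mini',
  messages: [
    { role: 'user', content: 'What is the capital of France?' },
    { role: 'assistant', content: 'The capital of France is Paris.' },
    { role: 'user', content: 'Roughly how many people live there?' },
  ],
};
```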
## Options Object
The options object is an optional component that lets you send LLM provider-specific parameters. While parameter names are fairly consistent, they can vary slightly between providers, so it is important to check the names your provider expects.
However, `max_token` is a special value: it is automatically normalized and defaults to `1024`.
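As a hedged sketch, assuming the common `max_tokens`/`temperature` spellings and that the options object is passed alongside the provider name and prompt, usage might look like this:

```javascript
// Minimal sketch of an options object. The parameter names below follow the
// common OpenAI-style spelling and may differ slightly between providers.
const { LLMInterface } = require('llm-interface');

const options = {
  max_tokens: 256,  // normalized across providers; defaults to 1024 when omitted
  temperature: 0.7, // illustrative sampling parameter
};

// Assumed usage: options passed alongside the provider name and prompt.
LLMInterface.sendMessage('openai', 'Write a haiku about caching.', options)
  .then((response) => console.log(response))
  .catch((error) => console.error(error));
```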