Add Custom Model Support and Configurable API URL #32

Open · wants to merge 1 commit into master

Conversation


DDnim commented Dec 2, 2024

Why

  • Enable users to use newer or custom OpenAI models (or OpenAI-compatible models) that are not pre-defined in the plugin
  • Allow users to configure custom API endpoints for compatibility with OpenAI-compatible APIs
  • Skip token counting for custom models, since their context limits may vary

What

Added Features

  1. New "customize" option in the model selection dropdown

    • When selected, an additional input field appears for the custom model name
    • Users can enter any valid model identifier (e.g., gpt-4-1106-vision-preview)
  2. Made the OpenAI API URL configurable (see the sketch after this list)

    • Moved the hardcoded URL into a configurable setting
    • Keeps the default OpenAI endpoint as the fallback
    • Lets users point the plugin at alternative, OpenAI-compatible endpoints
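A minimal sketch of the resulting settings shape, in plain TypeScript. The field names (customModelName, apiUrl), the "customize" sentinel value, and the helper functions below are illustrative, based on the description above rather than on the actual diff:

```ts
// Illustrative settings shape for the custom-model and custom-URL options.
// All names here are assumptions drawn from the PR description, not the real diff.
export const DEFAULT_OPENAI_API_URL = 'https://api.openai.com/v1/chat/completions'

export interface PluginSettings {
  apiKey: string
  apiModel: string         // a predefined model name, or 'customize'
  customModelName: string  // used only when apiModel === 'customize'
  apiUrl: string           // falls back to the default OpenAI endpoint when empty
}

// Resolve the model name actually sent to the API.
export function resolveModel(settings: PluginSettings): string {
  return settings.apiModel === 'customize' && settings.customModelName
    ? settings.customModelName
    : settings.apiModel
}

// Resolve the endpoint, keeping the official OpenAI URL as the fallback.
export function resolveApiUrl(settings: PluginSettings): string {
  return settings.apiUrl.trim() !== '' ? settings.apiUrl : DEFAULT_OPENAI_API_URL
}
```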

Technical Changes

  1. Added a CUSTOMIZE option to CHAT_MODELS (sketched after this list)
  2. Added a customModelName field to the settings
  3. Modified the token counting logic to skip counting for custom models
  4. Updated the settings UI to dynamically show/hide the custom model input
  5. Enhanced type safety and error handling
  6. Maintained backward compatibility with existing configurations
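For the token-counting change, a sketch of how CHAT_MODELS and the skip logic could look. CHAT_MODELS and CUSTOMIZE are named in the change list above; the specific entries, token limits, and helper functions are placeholders:

```ts
// Sketch of CHAT_MODELS with a CUSTOMIZE entry; the limits here are placeholders.
export const CHAT_MODELS = {
  GPT_35_TURBO: { name: 'gpt-3.5-turbo', tokenLimit: 4096 },
  GPT_4: { name: 'gpt-4', tokenLimit: 8192 },
  CUSTOMIZE: { name: 'customize', tokenLimit: Infinity } // unknown context limit
}

export function isCustomModel(modelName: string): boolean {
  return modelName === CHAT_MODELS.CUSTOMIZE.name
}

// Custom models skip the limit check entirely, since their context sizes vary.
export function exceedsTokenLimit(modelName: string, tokenCount: number): boolean {
  if (isCustomModel(modelName)) return false
  const model = Object.values(CHAT_MODELS).find(m => m.name === modelName)
  return model !== undefined && tokenCount > model.tokenLimit
}
```

The same check can drive the settings UI, showing or hiding the custom model name input whenever the dropdown value changes.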

Testing

DDnim commented Dec 2, 2024

@rpggio
Hi friend,
I really like this plugin and would love to be able to use more models with it.
Please review this when you have time. Thanks!
