Fixes dottxt-ai#806
Fixes dottxt-ai#789
Closes dottxt-ai#910
Problem
For `outlines.models.transformers`, `SequenceGenerator` directly manages the automata instead of using logits processors, which encapsulate automata management. This divergent implementation resulted in the bug reported in dottxt-ai#789.

Solution
- Introduce `Transformers.generate` and `Transformers.stream`, which use the HF `transformers` `logits_processor` argument with `outlines.processors.OutlinesLogitsProcessor`
- Use `SequenceGeneratorAdapter` for transformers instead of `SequenceGenerator`
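For background, a logits processor constrains decoding by masking the scores of disallowed tokens at each step, so the automaton logic stays out of the generation loop. A minimal sketch of the idea (the `allowed_fn` interface here is illustrative, not the actual `OutlinesLogitsProcessor` API):

```python
import math


class MaskingLogitsProcessor:
    """Toy logits processor: at each decoding step, set the score of
    every token the guide forbids to -inf so it can never be sampled.
    `allowed_fn` maps the token ids generated so far to the set of
    allowed next-token ids (a hypothetical interface for illustration)."""

    def __init__(self, allowed_fn):
        self.allowed_fn = allowed_fn

    def __call__(self, input_ids, scores):
        allowed = self.allowed_fn(input_ids)
        return [s if i in allowed else -math.inf
                for i, s in enumerate(scores)]


# Toy guide: after token 0, only tokens {1, 2} are allowed; otherwise only {0}.
proc = MaskingLogitsProcessor(
    lambda ids: {1, 2} if ids and ids[-1] == 0 else {0}
)
masked = proc([0], [0.5, 1.0, 2.0, 3.0])  # tokens 0 and 3 get masked out
```

Passing such a processor through `generate(logits_processor=...)` is what lets HF `transformers` own the decoding loop while outlines only shapes the token distribution.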
TODO:

- `Transformers.generate` and `Transformers.stream`
- `SequenceGeneratorAdapter` version of `outlines.models.transformers` in `main`
- `llamacpp` and `vllm` changes, these will be in a separate PR

Bonus
Details
This new structure allows us to easily integrate multi-modal models by subclassing `models.Transformer`. Additionally, we can make `models.mamba` a `Transformer` model and just pass `model_class=MambaLMHeadModel`.

Multi-modal model example:
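A rough sketch of what such a subclass could look like. This is a stub for illustration only: the base class here stands in for outlines' transformers wrapper, and `DummyModel` and `prepare_inputs` are hypothetical names, not the library's API.

```python
class Transformers:
    """Stand-in for outlines' transformers model wrapper (stub for
    illustration; the real class also builds a tokenizer and exposes
    generate/stream)."""

    def __init__(self, model_class, model_name):
        # The model class is injected, which is what makes it possible
        # to pass e.g. model_class=MambaLMHeadModel for mamba.
        self.model = model_class(model_name)


class DummyModel:
    """Hypothetical model class playing the role of a HF model."""

    def __init__(self, name):
        self.name = name


class MultiModalTransformers(Transformers):
    """Hypothetical multi-modal subclass: it only overrides input
    preparation to carry an image alongside the text prompt."""

    def prepare_inputs(self, prompt, image):
        # A real implementation would run the model's processor on
        # (prompt, image) and return tensors; here we just bundle them.
        return {"text": prompt, "image": image}


mm = MultiModalTransformers(model_class=DummyModel, model_name="dummy-model")
inputs = mm.prepare_inputs("describe the image", b"<image bytes>")
```

The point is that generation and logits processing live in the base class, so a multi-modal variant only has to customize how inputs are built.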