Add Phi-4 to Tiktoken encoding map #7337
I got the model to load using the following code, and it generates text, but I can't seem to work out the stop sequence:

```csharp
using Microsoft.Extensions.AI;
using Microsoft.ML.GenAI.Core;
using Microsoft.ML.GenAI.Phi;
using Microsoft.ML.Tokenizers;
using static TorchSharp.torch;
using TorchSharp;
using System.Text.Json;

var weightFolder = @"C:\Users\maxim\source\repos\models\microsoft\phi-4\";
var device = "cuda";

if (device == "cuda")
{
    InitializeDeviceType(DeviceType.CUDA);
}

var defaultType = ScalarType.Float16;
manual_seed(1);
set_default_dtype(defaultType);

var model = Phi3ForCasualLM.FromPretrained(weightFolder, "config.json", layersOnTargetDevice: -1, quantizeToInt4: true);

// Read the model config (note: this is config.json, not the tokenizer config).
var configPath = Path.Combine(weightFolder, "config.json");
var fileConfig = File.ReadAllText(configPath);
var config = JsonSerializer.Deserialize<Phi3Config>(fileConfig)!;

var tokenizer = TiktokenTokenizer.CreateForModel("gpt-4");
var pipeline = new CausalLMPipeline<Tokenizer, Phi3ForCasualLM>(tokenizer, model, device);
var client = new Phi3CausalLMChatClient(pipeline);

var task = """
    Can you tell me a funny joke?
    """;
var chatMessage = new ChatMessage(ChatRole.User, task);
var options = new ChatOptions
{
    StopSequences = ["<|endoftext|>"],
};

await foreach (var response in client.CompleteStreamingAsync([chatMessage], options))
{
    Console.Write(response.Text);
}

Console.WriteLine();
Console.WriteLine("End!");
```
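One thing that might be worth trying (an assumption on my part, not confirmed in this thread): Phi-4's chat template is reportedly ChatML-style and ends each turn with `<|im_end|>` rather than `<|endoftext|>`, so a stop sequence matching only `<|endoftext|>` may never fire. A sketch of the adjusted options:

```csharp
using Microsoft.Extensions.AI;

// Sketch, assuming Phi-4's ChatML-style template: include "<|im_end|>" as a stop
// sequence alongside "<|endoftext|>" so generation halts at the end of a turn.
var options = new ChatOptions
{
    StopSequences = ["<|im_end|>", "<|endoftext|>"],
};
```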
@luisquintanilla looking at https://huggingface.co/microsoft/phi-4/tree/main, it looks like it is using
Looking at the tokenizer_config.json, it says it's using
I also ran the following:
Got the following confirmation:
Adding to this thread: it looks like there may have been a bug in the originally published Phi-4 tokenizer, which validates @MaxAkbar's observations.
Phi-4 uses the Tiktoken tokenizer (100k vocab).
2412.08905v1
Consider adding it as an option to the encoding map so it's easier to create.
machinelearning/src/Microsoft.ML.Tokenizers/Model/TiktokenTokenizer.cs
Lines 1025 to 1035 in 01c4164
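The requested change could look roughly like the standalone sketch below. The dictionary here is only an illustration of the idea, not the actual field in TiktokenTokenizer.cs, and mapping "phi-4" to the existing cl100k_base encoding (100k vocab) is the assumption discussed above:

```csharp
using System;
using System.Collections.Generic;

// Illustrative model-name → encoding-name table; the real map lives in the
// linked TiktokenTokenizer.cs source, and its name/shape may differ.
var modelToEncoding = new Dictionary<string, string>
{
    ["gpt-4"] = "cl100k_base",  // existing entry
    ["phi-4"] = "cl100k_base",  // proposed addition
};

Console.WriteLine(modelToEncoding["phi-4"]); // prints "cl100k_base"
```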