This repository has been archived by the owner on Nov 1, 2024. It is now read-only.

Maximum context length? #13

Open
Tylersuard opened this issue Mar 24, 2023 · 0 comments

Comments

@Tylersuard

Hello, in the paper a context length of 49k tokens was used, which is great, but can this model go any further? What is the maximum possible context length?
