
Syntax highlighting on long lines not working? #139

Open
werunom opened this issue Aug 16, 2017 · 9 comments

@werunom

werunom commented Aug 16, 2017

According to this fix, the latest release (1.1.0) should have no problem highlighting syntax past 100 tokens. But after updating to this release, it looks like the issue still persists.

See this example from my personal writing (screenshot: a long line with no syntax highlighting).

The same text gets highlighted once a line break is inserted (screenshot: syntax highlighting restored).

Let me know if you need any further info. Thanks!

@Aerijo
Collaborator

Aerijo commented Aug 16, 2017

Sorry about that, I think there is some confusion about what the problems are.

The first, which is the one fixed in that pull request, is that Atom only looks for highlighting matches within a limited number of characters on a line before giving up. The fix (now merged) tells Atom not to give up, effectively forcing it to scan the entire line.

The second is that Atom will give up after a limited number of matches. This can be seen by copying and pasting something as simple as \a\b\c\d\e\f\g\h\i\# until it no longer highlights anything. This issue has not been fixed by this package yet, though I am adapting a solution from grammar-token-limit. Until then, use that package to increase the limit. I find 150 works for me (the default is 100), but you can adjust it to suit your needs. Be careful, though, as it can potentially impact performance.
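
For reference, a minimal file along those lines (an illustrative sketch only; the specific commands do not matter, as long as a single line accumulates more than the default 100 tokens — this is about highlighting, not about what the document typesets):

```latex
\documentclass{article}
\begin{document}
% Keep everything below on ONE line. Once the line exceeds the grammar's
% token cap (100 by default), highlighting stops partway through the line.
\textbf{a}\emph{b}\texttt{c}\textsf{d}\textbf{a}\emph{b}\texttt{c}\textsf{d}\textbf{a}\emph{b}\texttt{c}\textsf{d} % ...repeat this run on the same line until highlighting cuts out
\end{document}
```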

Also, in the future please add a minimal non-working example that one of us can copy and paste into our own installations of Atom. This would speed up error identification a great deal, as we could experiment with it ourselves.

@werunom
Author

werunom commented Aug 16, 2017

Thanks for that clarification! I will use the package you recommended (setting the token limit to 150, for instance, resolved the example above). Nice to hear that you are working on fixing this within the package itself.

I will remember to include minimal source code for reproducibility. Thanks for your time...

@werunom werunom closed this as completed Aug 16, 2017
@Aerijo Aerijo reopened this Aug 16, 2017
@Aerijo
Collaborator

Aerijo commented Aug 16, 2017

I think it's best to leave this open until a fix is actually made. Others can then find this issue instead of opening new ones.

@lzkelley

Some more illustrations:

(screenshot: latex_atom_1)

(screenshot: latex_atom_2)

@Aerijo
Collaborator

Aerijo commented Dec 18, 2017

It was determined that this issue does not fall within the scope of this package. The currently accepted solution is to use the grammar-token-limit package.
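
For anyone else landing here, the workaround amounts to installing that package and then raising the token limit for the LaTeX grammar in its settings (the value of 150 mentioned earlier in this thread is a reasonable starting point):

```sh
# Install the workaround package from the command line
# (equivalently: Settings → Install → search for "grammar-token-limit")
apm install grammar-token-limit
```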

Hopefully, someday, a setting will be added in Atom core to allow this package to set a default value.

@Aerijo
Collaborator

Aerijo commented Jan 29, 2018

@yudai-nkt I found that the setting maxTokensPerLine can be added to the grammar directly, much like limitLineLength. I think we should add this property and set its value to something like 300. This would fix all but the most extreme cases, where the author should probably insert a line break anyway.
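
For illustration, a sketch of what that might look like at the top of the grammar CSON (the surrounding keys stand in for whatever the grammar file already declares; the two limit properties are the ones named in this thread, with 300 being the value suggested above):

```cson
# grammars/latex.cson (sketch)
name: 'LaTeX'
scopeName: 'text.tex.latex'
limitLineLength: false   # from the earlier fix: don't stop scanning after a fixed number of characters
maxTokensPerLine: 300    # proposed: raise the default cap of 100 tokens per line
patterns: [
  # ...existing grammar rules unchanged...
]
```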

@yudai-nkt
Collaborator

I don't have time to test right now, but can that option be set on the user's end (i.e., in ~/.atom/config.cson)? If so, I'd rather have each user set whatever value they prefer, because in LaTeX a line break can go basically anywhere in a sentence or paragraph.

@Aerijo Aerijo changed the title Syntax highlighting post 100 tokens still not working? Syntax highlighting on long lines not working? Apr 9, 2018
@aseyffert

aseyffert commented Aug 27, 2019

Would it not be a good idea to add a reference to grammar-token-limit in the README.md of this project?

Relying on that package as the fix for this (pretty common) issue seems like a perfectly good solution, but not documenting it seems like a bad idea.

EDIT: Case in point: I just ran into this issue on a new installation and had to search for my comment to find this issue to solve the problem. If I didn't have looming deadlines I might've put in a pull request adding such a mention, but alas...

@dazsmith

dazsmith commented Jul 4, 2020

Thank you, @aseyffert, this is exactly what I needed.

FWIW, there appears to have been some effort to circumvent this issue in #175 using the new tree-sitter grammar approach (https://github.com/yitzchak/tree-sitter-latex), but that has not been updated in over a year.
