Add streams docs #1091
Conversation
add permissions info
Co-authored-by: Giorgos Bamparopoulos <gbamparop@gmail.com>
Outdated review threads (resolved):
- solutions/observability/logs/streams/management/extract/dissect.md
- solutions/observability/logs/streams/management/extract/grok.md
LGTM! 🦖
Took a quick read through. Some thoughts below.
Outdated review threads (resolved):
- solutions/observability/logs/streams/management/extract/dissect.md
- solutions/observability/logs/streams/management/extract/grok.md (two threads)
Quoted text under review:

> Requires an LLM Connector to be configured.
> Instead of writing the grok patterns by hand, you can use the **Generate Patterns** button to generate the patterns for you.
>
> % TODO Elastic LLM?
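For context on what the **Generate Patterns** button produces: a Grok expression maps named patterns onto positions in a log line and emits structured fields. A minimal sketch in Python, using `re` named groups as a stand-in for Grok's `%{SYNTAX:field}` semantics (the log line and field names here are illustrative, not taken from this PR):

```python
import re

# A Grok expression such as:
#   %{IPORHOST:client_ip} %{WORD:method} %{URIPATH:path} %{NUMBER:status}
# compiles down to a regex with named capture groups, roughly:
pattern = re.compile(
    r"(?P<client_ip>\S+) (?P<method>[A-Z]+) (?P<path>/\S*) (?P<status>\d+)"
)

log_line = "203.0.113.7 GET /api/streams 200"
match = pattern.match(log_line)

# Extract the named groups into a dict of structured fields.
fields = match.groupdict() if match else {}
print(fields)
```

Pattern generation automates exactly this step: inferring which `%{SYNTAX:field}` building blocks fit the sample log lines so the user does not have to hand-write the regex.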
What's the plan for this line?
Not sure on this one. @LucaWintergerst are we saying we want to describe configuring the LLM in a future iteration?
Yes, we'll have an out-of-the-box LLM soon that we should talk about here in the future.
This PR adds the first iteration of the Streams documentation.