
Feat: Add Optional LLM Helper #20

Merged
vituri merged 11 commits into feat/add-config-theming from feat/add-optional-llm-helper
May 9, 2025

Conversation

@DeepanshKhurana (Collaborator) commented Apr 23, 2025

Closes #19

Description

  • Add a config.yml block for LLMs. For now, only the parameters shared by every function in ellmer's chat_* family are supported; these are also the most common and colloquial parameters.
  • Redo the log-processing logic to ensure we send the correct data to the LLM.
  • The prompt generates HTML, which can then be embedded in a modalDialog. There are other, slightly less brittle approaches, but I found this one to work with exceptional consistency since the CSS classes are specified in the system prompt.
  • This PR also enables box.linters properly and carries over some of the resulting lint fixes.

Demo

Screen.Recording.2025-04-13.at.8.38.46.PM.mov

Definition of Done

  • The change is thoroughly documented.
  • The CI passes (R CMD check, linter, unit tests, spelling).
  • Any generated files have been updated (e.g. .Rd files with roxygen2::roxygenise())

@DeepanshKhurana added the enhancement (New feature or request) label Apr 23, 2025
@DeepanshKhurana self-assigned this Apr 23, 2025
@DeepanshKhurana marked this pull request as ready for review April 23, 2025 07:55
@DeepanshKhurana requested review from jakubnowicki and removed the request for TymekDev April 23, 2025 07:58
@DeepanshKhurana linked an issue May 6, 2025 that may be closed by this pull request
@DeepanshKhurana requested a review from vituri May 6, 2025 07:57

@vituri left a comment


This PR is really nice!! Why read all the logs when we can have a robot read them and even suggest a solution? It will save the user several Google searches. As you know, we find this so useful that we already want to embed it in another app.

I wrote some remarks, but most of them do not require any action: they are just me thinking out loud or asking questions to understand your reasoning. You can mark those as resolved. If you make any crucial changes, ping me again for another review.

PS: I see that you also fixed lots of linter errors and styling issues; I did not focus on those changes since the CI is green.

#' @return A character vector of valid LLM providers
get_valid_providers <- function(
) {
ellmer_functions <- ls(ellmer)[

The ls function has an argument called pattern, which can be used as follows:

ls(ellmer, pattern = "^chat_")

Do you think this could replace the grepl() step here?
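A quick base-R sketch of the suggestion, using a throwaway environment in place of the ellmer namespace, showing that ls(pattern = ...) can do the filtering in one call:

```r
# Stand-in environment; in the real code this would be the ellmer namespace.
e <- new.env()
e$chat_openai <- function() NULL
e$chat_claude <- function() NULL
e$token_usage <- function() NULL

# One call instead of ls() followed by grepl():
chat_fns <- ls(e, pattern = "^chat_")
print(chat_fns)  # "chat_claude" "chat_openai" (ls() sorts alphabetically)

# Provider names are then just the suffixes:
providers <- sub("^chat_", "", chat_fns)
```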

get_llm_function <- function(
llm_config = get_llm_config()
) {
ellmer[[glue("chat_{llm_config$provider}")]]

Ah, now I get your observation after the "nolint" at the beginning.

The user passes a provider, you have to check whether that provider exists among the many chat_* functions, and finally, here, you have to return the matching function from the ellmer namespace.

config[
get
],
ellmer, #nolint: we do use this package just in a separate notation [[ ]]

Can we be specific as to which linter is being ignored? Something like

  ellmer, # nolint: box_unused_att_pkg_linter

api_key = llm_config$api_key,
model = llm_config$model,
system_prompt = llm_config$system_prompt,
seed = 42,

Does setting a seed guarantee that the results are always the same, given the same input?

system_prompt = llm_config$system_prompt,
seed = 42,
api_args = list(
temperature = 0

Genuine question: why set the temperature to zero? Is this too restrictive? I have never played with this parameter. What if the user finds the answer unhelpful and wants to retry the prompt again and again until something useful appears?

)
})

shiny$observeEvent(llm_result(), {

Just nitpicking: I often prefer observe({...}) |> bindEvent(...) over observeEvent() because the former can be cached more easily. Looking at the docs, it seems to be just a matter of taste for now (maybe they will deprecate observeEvent() in the future?):

bindEvent() was added in Shiny 1.6.0. When it is used with reactive() and observe(), it does the same thing as eventReactive() and observeEvent(). However, bindEvent() is more flexible: it can be combined with bindCache(), and it can also be used with render functions (like renderText() and renderPlot()).
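To make the two spellings concrete, here is a minimal sketch of the suggested style; llm_result is assumed to be the reactive from the surrounding module (the PR's box notation would write these calls as shiny$observe, shiny$bindEvent, etc.):

```r
library(shiny)

# Current style in the PR:
#   observeEvent(llm_result(), { ... })

# Equivalent observe() |> bindEvent() style (Shiny >= 1.6.0).
# bindEvent() can additionally be combined with bindCache(),
# or attached to render functions such as renderText().
obs <- observe({
  showModal(
    modalDialog(HTML(llm_result()))
  )
}) |>
  bindEvent(llm_result())
```

The observer body only runs when llm_result() invalidates, exactly as with observeEvent().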

processed_logs <- shiny$reactive({
logs_data() %>%
pmap_dfr(
~ {

Another nitpick (sorry!!): the formula style is regarded as "old style" in the purrr docs:

A formula, e.g. ~ .x + 1. You must use .x to refer to the first argument. Only recommended if you require backward compatibility with older versions of R.

If the required R version is at least 4.1, we can even start using the new pipe |> (supposing, of course, that we don't use anything fancy from the old pipe operator).
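Since both the base pipe |> and the \(x) lambda shorthand arrived in R 4.1.0, the two modernizations can be illustrated with base R alone (purrr accepts the same \(...) functions wherever it accepts formulas):

```r
# Old style: a formula, purrr-only, e.g. map_dbl(xs, ~ .x^2)
# New style: a base lambda, which works in purrr and base R alike:
squares <- sapply(1:3, \(x) x^2)

# The base pipe |> covers straightforward left-to-right chains:
total <- (1:3) |> sapply(\(x) x^2) |> sum()
print(total)  # 14
```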

Base automatically changed from refactor/simplify-state to feat/add-config-theming May 9, 2025 17:03
@vituri merged commit 687c311 into feat/add-config-theming May 9, 2025
1 check passed
@vituri deleted the feat/add-optional-llm-helper branch May 9, 2025 17:03

Linked issue: [Feature]: Add (Optional) LLM Support using {ellmer}