There are a growing number of services that let you call LLMs with HTTP requests. Integrating those services into LangCheck in the form of an EvalClient could expand LangCheck's user base.
I did some quick research on services we could potentially support:
- LLM hosting services
- APIs provided by model developers
(I personally feel the items listed toward the top are more important, but that ordering is subjective.)
We can split up the work of investigating the API spec and implementing the EvalClient for each service!
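As a rough sketch of the shared plumbing each new EvalClient would need, here is a minimal HTTP call helper. The endpoint path, payload fields, and response shape below are assumptions following the OpenAI-compatible chat-completion convention that many hosting services mimic; each service's actual API spec still needs to be verified as part of the investigation.

```python
import json
from urllib import request


def build_chat_payload(model: str, prompt: str) -> dict:
    """Build a chat-completion request body.

    The field names here (model/messages/temperature) follow the
    OpenAI-compatible convention; a given service may differ.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,  # deterministic output is preferable for evaluation
    }


def call_llm(base_url: str, api_key: str, payload: dict) -> str:
    """POST the payload and return the first completion's text.

    Assumes bearer-token auth and an OpenAI-style response shape;
    both must be confirmed against each service's docs.
    """
    req = request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

For services that deviate from this shape, only `build_chat_payload` and the response parsing in `call_llm` should need to change, so the per-service work stays small.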