Replies: 2 comments
-
🤖 Hello, thank you for your interest in LlamaIndex.

Based on the current version of LlamaIndex (v0.8.24.post1) and the provided context, it does not have built-in support for TAPAS or TaBERT for tabular data extraction and question answering. The available embeddings in this version are GoogleUnivSentEncoderEmbedding, LangchainEmbedding, OpenAIEmbedding, and LinearAdapterEmbeddingModel.

However, it's important to note that LlamaIndex is a data framework for your LLM application, designed to be flexible and extensible. If you have the necessary skills, you could potentially integrate TAPAS or TaBERT into LlamaIndex by creating a new embedding model that uses these technologies. Please refer to `llama_index/embeddings/__init__.py` for more information on how to create a new embedding model.

As for future plans to include TAPAS or TaBERT, I don't have that information at the moment. You might want to keep an eye on the repository for any updates or new releases that might include these features.

I hope this helps! If you have any other questions, feel free to ask.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
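As a rough illustration of the integration path the bot describes: LlamaIndex lets you plug in a custom embedding model by implementing the `_get_query_embedding` / `_get_text_embedding` methods that its `BaseEmbedding` class expects. The sketch below mirrors that interface shape only; the class name `TabularEmbedding` and the `toy_encode` helper are hypothetical stand-ins, and a real integration would subclass `BaseEmbedding` from `llama_index` and wrap an actual TAPAS model from Hugging Face `transformers` (pooling its hidden states over the serialized table). This is a self-contained sketch under those assumptions, not LlamaIndex's API.

```python
# Hypothetical sketch of a custom tabular embedding for LlamaIndex.
# A real version would subclass llama_index.embeddings.base.BaseEmbedding
# and call a TAPAS encoder; toy_encode here is a deterministic stand-in
# so the example runs without model downloads.
from typing import List

EMBED_DIM = 8  # TAPAS's actual hidden size is 768


def toy_encode(text: str) -> List[float]:
    """Stand-in encoder: L2-normalized bag-of-characters vector."""
    vec = [0.0] * EMBED_DIM
    for i, ch in enumerate(text):
        vec[i % EMBED_DIM] += ord(ch)
    norm = sum(v * v for v in vec) ** 0.5 or 1.0
    return [v / norm for v in vec]


class TabularEmbedding:
    """Mirrors the two methods LlamaIndex expects embedding subclasses
    to implement."""

    def _get_query_embedding(self, query: str) -> List[float]:
        return toy_encode(query)

    def _get_text_embedding(self, text: str) -> List[float]:
        # A real implementation would serialize the table cells and
        # run them through TAPAS, pooling the output hidden states.
        return toy_encode(text)


emb = TabularEmbedding()
vector = emb._get_text_embedding("city | population\nParis | 2.1M")
print(len(vector))  # prints 8
```

The point of the sketch is only the interface: once those two methods return fixed-length vectors, the object can be handed to LlamaIndex wherever an embedding model is accepted.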
-
@iriye no plan for this currently -- but would love a PR if you had something in mind 🤔 I'm not sure what the flow would look like tbh
-
Question Validation
Question
I want to know how to use LlamaIndex with LLM models such as TAPAS. Is there support, or a plan, to include TAPAS or TaBERT for tabular data extraction and question answering?