Questions tagged [langchain]

LangChain is an open-source framework for developing applications powered by language models. Use [py-langchain] for the python-specific package.

672 questions
3
votes
1 answer

LangChain, chromaDB Chroma.fromDocuments returns TypeError: Cannot read properties of undefined (reading 'data')

I am running a LangChain process on a local Node server. In my code: // Create docs with a loader const loader = new TextLoader("Documentation/hello.txt"); const docs = await loader.load(); // Create vector store and index the docs const…
qYUUU
  • 33
  • 1
  • 5
3
votes
1 answer

LangChain gpt-3.5-turbo model reads files - problem

I am making a really simple (and for fun) LangChain project. The model can read a PDF file and I can then ask it questions about that specific PDF file. Everything works fine (this is a working example): from PyPDF2 import PdfReader from…
devZ
  • 606
  • 1
  • 7
  • 23
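
A minimal sketch of the PDF question-answering flow this excerpt describes, assuming PyPDF2 for text extraction and a Chroma retriever; the file name, chunk sizes, and chain setup are placeholders, not the asker's code.

```python
from PyPDF2 import PdfReader
from langchain.text_splitter import CharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA

# Extract raw text from the PDF
reader = PdfReader("example.pdf")  # placeholder path
text = "".join(page.extract_text() or "" for page in reader.pages)

# Split into chunks and index them in a vector store
chunks = CharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_text(text)
db = Chroma.from_texts(chunks, OpenAIEmbeddings())

# Ask gpt-3.5-turbo questions grounded in the retrieved chunks
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    retriever=db.as_retriever(),
)
print(qa.run("What is this document about?"))
```
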
3
votes
3 answers

How to combine two Chroma databases

I created two dbs like this (same embeddings) using langchain 0.0.143: db1 = Chroma.from_documents( documents=texts1, embedding=embeddings, persist_directory=persist_directory1, ) db1.persist() db21 = Chroma.from_documents( …
randomQs
  • 31
  • 2
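
A hedged sketch of one common way to merge two Chroma stores built with the same embedding model: read every record out of the second store and add it to the first store's underlying collection. The persist directories are placeholders, and ._collection is an internal attribute that may change between versions.

```python
from langchain.vectorstores import Chroma
from langchain.embeddings import OpenAIEmbeddings  # assumes both dbs used the same embeddings

embeddings = OpenAIEmbeddings()
db1 = Chroma(persist_directory="db1", embedding_function=embeddings)  # placeholder paths
db2 = Chroma(persist_directory="db2", embedding_function=embeddings)

# Pull everything (including the stored embeddings) out of db2
records = db2.get(include=["documents", "metadatas", "embeddings"])

# Re-insert the records into db1 without re-embedding them
db1._collection.add(
    ids=records["ids"],
    embeddings=records["embeddings"],
    metadatas=records["metadatas"],
    documents=records["documents"],
)
db1.persist()
```
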
3
votes
0 answers

How do you resolve this error when you pickle a LangChain VectorStoreIndexCreator object?

I want to pickle/save my VectorStore index in LangChain. Below is my code: import pickle from langchain.document_loaders import PyPDFLoader from langchain.indexes import VectorstoreIndexCreator # Load the PDF file pdf_path = "Los Angeles County, CA…
L12345
  • 31
  • 2
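
A sketch of the usual workaround, assuming the index wraps a Chroma store: persist the underlying vector store to disk and rebuild it later, rather than pickling the wrapper (which holds a live client that typically cannot be pickled). Paths and the embedding model are placeholders.

```python
from langchain.document_loaders import PyPDFLoader
from langchain.indexes import VectorstoreIndexCreator
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma

loader = PyPDFLoader("document.pdf")  # placeholder path
index = VectorstoreIndexCreator(
    vectorstore_kwargs={"persist_directory": "index_dir"}
).from_loaders([loader])
index.vectorstore.persist()  # write the Chroma data to disk instead of pickling

# Later: rebuild the store from the persisted directory rather than unpickling
db = Chroma(persist_directory="index_dir", embedding_function=OpenAIEmbeddings())
```
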
3
votes
2 answers

How does LlamaIndex select nodes based on the query text?

When I query a simple vector index created using LlamaIndex, it returns a JSON object that has the response for the query and the source nodes (with the score) it used to generate an answer. How does it calculate which nodes to use? (I'm guessing…
shardgon
  • 33
  • 2
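
Roughly speaking, a vector index embeds the query and scores every stored node by cosine similarity, keeping the top-k as source nodes. The toy sketch below illustrates only that scoring step with placeholder vectors; the real library delegates embedding and retrieval to its configured models.

```python
import numpy as np

def cosine(a, b):
    # standard cosine similarity between two vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder node embeddings and query embedding
node_vectors = {"node-1": np.random.rand(8), "node-2": np.random.rand(8), "node-3": np.random.rand(8)}
query_vector = np.random.rand(8)

# Score every node against the query and keep the top-2
scores = {node_id: cosine(vec, query_vector) for node_id, vec in node_vectors.items()}
top_k = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:2]
print(top_k)  # (node, score) pairs of the kind reported as source nodes
```
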
3
votes
2 answers

Is it possible to access the history of calls made by a LangChain LLM object to an external API?

When we create an Agent in LangChain we provide a Large Language Model object (LLM), so that the Agent can make calls to an API provided by OpenAI or any other provider. For example: llm = OpenAI(temperature=0) agent = initialize_agent( …
Roman
  • 124,451
  • 167
  • 349
  • 456
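
A hedged sketch of one way to inspect what the agent actually sends: attach a callback handler to the LLM and record each on_llm_start invocation. The tool list and agent type below are illustrative assumptions.

```python
from langchain.llms import OpenAI
from langchain.agents import initialize_agent, load_tools, AgentType
from langchain.callbacks.base import BaseCallbackHandler

class PromptLogger(BaseCallbackHandler):
    """Records the prompt of every LLM call made through this handler."""
    def __init__(self):
        self.calls = []
    def on_llm_start(self, serialized, prompts, **kwargs):
        # Each API call made on behalf of the agent passes through here
        self.calls.append(prompts)

logger = PromptLogger()
llm = OpenAI(temperature=0, callbacks=[logger])
tools = load_tools(["llm-math"], llm=llm)  # placeholder tool
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION)

agent.run("What is 7 raised to the 0.5 power?")
print(logger.calls)  # the recorded history of prompts sent to the API
```
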
3
votes
5 answers

langchain: No module named 'langchain.document_loaders'

First and foremost, I'm using the latest Python (==3.11.2) and the most recent version of langchain (==0.0.128). Following the latest docs on DirectoryLoader, the following line should work: from langchain.document_loaders import…
Brandon
  • 308
  • 1
  • 2
  • 10
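
Before changing any code, it can help to confirm which langchain installation the interpreter actually imports; a stale install or a local file named langchain.py that shadows the package is a common cause of this error. A small check, assuming nothing beyond the standard library and langchain itself:

```python
import importlib.metadata
import langchain

print(importlib.metadata.version("langchain"))  # installed version of the package
print(langchain.__file__)  # should point at site-packages, not a local langchain.py

# If the above look right, this import should resolve
from langchain.document_loaders import DirectoryLoader
```
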
3
votes
1 answer

How to get around OpenAI completion token limit when trying to do text to SQL conversion?

I'm using langchain and OpenAI to implement a natural language to SQL query tool. It works okay for schemas with a small number of simple tables. However, when I try to use it for schemas that have many tables or fewer tables with many columns, the…
Dean H.
  • 31
  • 2
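
A common mitigation, sketched below under the assumption that the 0.0.x SQLDatabaseChain is in use, is to hand the model only the tables it needs and to trim the sample rows included in the schema description, so the prompt stays under the token limit. The connection string and table names are placeholders.

```python
from langchain.llms import OpenAI
from langchain.sql_database import SQLDatabase
from langchain.chains import SQLDatabaseChain

db = SQLDatabase.from_uri(
    "sqlite:///example.db",              # placeholder URI
    include_tables=["orders", "users"],  # restrict the schema sent in the prompt
    sample_rows_in_table_info=1,         # fewer (or zero) sample rows = fewer tokens
)
chain = SQLDatabaseChain.from_llm(OpenAI(temperature=0), db, verbose=True)
chain.run("How many orders did each user place last month?")
```
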
2
votes
1 answer

Is there a way I can handle context and general questions in LangChain QA Retrieval?

I want to make a chatbot that should answer questions from the context, in my case a vector database. It is doing that perfectly. But I also want it to answer questions which are not in the vector database, and it is unable to do so. It only is…
Usman Afridi
  • 179
  • 1
  • 11
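
One approach, sketched here as an assumption rather than the asker's setup, is to override the default "stuff" prompt of RetrievalQA so the model is explicitly allowed to fall back to general knowledge when the retrieved context lacks the answer. The prompt wording and the db vector store are placeholders.

```python
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA
from langchain.prompts import PromptTemplate

template = """Use the context below to answer the question.
If the context does not contain the answer, answer from your general knowledge instead.

Context: {context}

Question: {question}
Answer:"""
prompt = PromptTemplate(template=template, input_variables=["context", "question"])

qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(temperature=0),
    retriever=db.as_retriever(),   # `db` is an existing vector store (assumed)
    chain_type="stuff",
    chain_type_kwargs={"prompt": prompt},
)
```
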
2
votes
1 answer

Langchain: different knowledge depending on language

I'm trying to train a chatbot with domain-specific knowledge (in particular, real estate in Switzerland). I created a chatbot, which I feed some information from a PDF, and then I run it with a memory function. It works pretty well, in…
nicoe
  • 23
  • 4
2
votes
1 answer

LangChain specific default response

Using LangChain and OpenAI, how can I have the model return a specific default response? For instance, let's say I have these statement/response pairs: Statement: Hi, I need to update my email address. Answer: Thank you for updating us. Please text it…
iambdot
  • 887
  • 2
  • 10
  • 28
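
A hedged sketch of the few-shot approach the excerpt hints at: show the model a handful of statement/response pairs so that matching intents receive the canned answer. The example pair uses only what the excerpt shows; everything else is a placeholder.

```python
from langchain.llms import OpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate, FewShotPromptTemplate

# Canned statement/response pairs (placeholder content)
examples = [
    {"statement": "Hi, I need to update my email address.",
     "response": "Thank you for updating us."},
]
example_prompt = PromptTemplate(
    input_variables=["statement", "response"],
    template="Statement: {statement}\nAnswer: {response}",
)
prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    prefix="Reply to the statement with the matching default answer.",
    suffix="Statement: {input}\nAnswer:",
    input_variables=["input"],
)

chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)
print(chain.run("Hello, I want to change my email."))
```
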
2
votes
1 answer

AttributeError: module 'chromadb' has no attribute 'config'

So I recently started to work with chromadb and I am facing this error: "module 'chromadb' has no attribute 'config'". Here is my code: from langchain.vectorstores import Chroma from sentence_transformers import SentenceTransformer model…
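
A sketch of the usual fix, assuming a recent chromadb release: import Settings from the chromadb.config submodule explicitly (and make sure no local file named chromadb.py shadows the package). The embedding model and persist directory below are placeholders.

```python
from chromadb.config import Settings   # explicit submodule import
from langchain.vectorstores import Chroma
from langchain.embeddings import HuggingFaceEmbeddings  # placeholder embedding model

client_settings = Settings(persist_directory="chroma_db", anonymized_telemetry=False)
db = Chroma(
    embedding_function=HuggingFaceEmbeddings(),
    persist_directory="chroma_db",
    client_settings=client_settings,
)
```
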
2
votes
1 answer

How to run multiprocess Chroma.from_documents() in Langchain

Can we somehow pass an option to run multiple threads/processes when we call Chroma.from_documents() in LangChain? I am trying to embed 980 documents (embedding model is mpnet on CUDA), and it takes forever. Specs: Software: Ubuntu 20.4 (on Win11…
Paris Char
  • 477
  • 4
  • 17
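
As far as I know, Chroma.from_documents() has no multiprocessing switch; the usual speed-up is to push larger batches through the GPU encoder and to index the documents in chunks. The model name, batch size, and chunking below are illustrative assumptions, and docs stands for the 980 documents from the question.

```python
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import Chroma

embeddings = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-mpnet-base-v2",  # placeholder mpnet model
    model_kwargs={"device": "cuda"},
    encode_kwargs={"batch_size": 64},  # larger batches keep the GPU busy
)

# `docs` is assumed to be the list of documents from the question
db = Chroma.from_documents(docs[:100], embeddings, persist_directory="chroma_db")
for start in range(100, len(docs), 100):
    db.add_documents(docs[start:start + 100])  # index the rest in chunks
db.persist()
```
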
2
votes
1 answer

How to make privateGPT retrieve info only from local documents?

I'm using privateGPT with the default GPT4All model (ggml-gpt4all-j-v1.3-groovy.bin) but also with the latest Falcon version. My problem is that I was expecting to get information only from the local documents and not from what the model "knows"…
2
votes
0 answers

How to stream the response from LangChain QAMapReduceChain

I am using LangChain to interact with a long piece of text. The text is split into chunks, which are passed in as the docs variable below. I have set up a chain that streams my response; however, I am finding that handleLLMNewToken is called whilst…