
I just created a langchain index in Python by passing in quite a few URLs. I used the UnstructuredURLLoader as my Document Loader.
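Roughly, the setup looks like this (the URLs below are just placeholders for my actual list):

from langchain.document_loaders import UnstructuredURLLoader
from langchain.indexes import VectorstoreIndexCreator

# Placeholder URLs; in reality this is a fairly long list
urls = ["https://example.com/docs/page-1", "https://example.com/docs/page-2"]
loader = UnstructuredURLLoader(urls=urls)
index = VectorstoreIndexCreator().from_loaders([loader])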

Now, to use UnstructuredURLLoader, I had to install several packages in my environment, including some very large ones like unstructured and chromadb.

What I want to know is: when I deploy my index to run queries on it, is there any way I can export the index and load it in the deployment environment? Sort of like how machine learning models are deployed for inference?

There are two main reasons I want to do this: first, to avoid installing all of the large packages mentioned above (if possible), and second, to bypass the lag of reading the documents in from their URLs.

Minura Punchihewa

1 Answer


Not sure about the entire context, but based on this question, you can save your vectorstore index by pointing it at a persist directory when you create it:

from langchain.indexes import VectorstoreIndexCreator

index = VectorstoreIndexCreator(vectorstore_kwargs={"persist_directory": "./custom_save_dir_path"}).from_loaders([loader])
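To load that persisted index back up on the deployment side, a minimal sketch would look like the following. This assumes the default Chroma vectorstore and OpenAI embeddings; swap in whatever embedding model you originally built the index with, since the embedding function has to match.

from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.chains import RetrievalQA
from langchain.llms import OpenAI

# Reload the vectorstore from the directory it was persisted to;
# the embedding function must match the one used when the index was built.
vectorstore = Chroma(persist_directory="./custom_save_dir_path", embedding_function=OpenAIEmbeddings())

# Query the reloaded index without re-fetching any of the original URLs.
qa = RetrievalQA.from_chain_type(llm=OpenAI(), retriever=vectorstore.as_retriever())
print(qa.run("your question here"))

Note that the deployment environment still needs chromadb and your embedding provider's package, but it no longer needs unstructured, since the documents are never read from their URLs again. Depending on your chromadb version, you may also need to call index.vectorstore.persist() after building the index to make sure it is flushed to disk.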