
I am trying to connect to the Redis semantic cache using the code below:

import redis
import langchain
from langchain.cache import RedisSemanticCache
from langchain.embeddings import OpenAIEmbeddings
from dotenv import load_dotenv

# Connect to the Redis cache
load_dotenv()
redis_host = 'hostname'
redis_port = 6379
redis_password = 'passwordxyz0'
#cache = redis.Redis(host=redis_host,port=6379,password=redis_password,db=0,ssl=True)
deployment_name = 'model_deployment'
langchain.llm_cache = RedisSemanticCache(
    redis_url=f'redis://{redis_host}:{redis_port}',
    embedding=OpenAIEmbeddings(deployment=deployment_name))
print(OpenAIEmbeddings(deployment=deployment_name))
print(langchain.llm_cache)

# Try a test lookup against the cache

key = f'{deployment_name}:redis-test-key'
value = 'redis-test-value'
llm_string = "hey"
prompt = "world"
print(langchain.llm_cache.lookup(llm_string=llm_string,prompt=prompt))

While running this piece of code just to check that the lookup function prints a result, I got the response error below:

 redis.exceptions.ResponseError: unknown command `MODULE`, with args beginning with: `LIST`,
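From what I can tell, RedisSemanticCache relies on the RediSearch module on the server, and the `unknown command MODULE` error suggests my Redis instance does not expose modules at all. A minimal check along these lines (a sketch reusing the connection details above; the ssl/password settings are assumed from the commented-out redis.Redis(...) line) should show whether the server supports modules:

import redis

# Connect directly with the same details as in the script above
# (ssl and password assumed, matching the commented-out redis.Redis call).
r = redis.Redis(host=redis_host, port=redis_port, password=redis_password, ssl=True)

try:
    # If RediSearch is loaded, 'search' should appear in this list.
    print(r.execute_command('MODULE', 'LIST'))
except redis.exceptions.ResponseError as e:
    print(f'MODULE command not available on this server: {e}')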

Earlier I was getting an argument validation error for the OpenAI API key, which I resolved by loading the key through load_dotenv().
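For reference, the key is picked up from a .env file roughly like this (placeholder value, not a real key; OPENAI_API_KEY is the variable name OpenAIEmbeddings looks for):

# .env
OPENAI_API_KEY=sk-xxxx

import os
from dotenv import load_dotenv

load_dotenv()
# Confirms the key was actually loaded into the environment (prints True/False).
print(os.getenv('OPENAI_API_KEY') is not None)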

Any suggestions on how to get the Redis semantic cache working?

