Questions tagged [huggingface-hub]

22 questions
3 votes • 1 answer

Indefinite wait while using Langchain and HuggingFaceHub in Python

from langchain import PromptTemplate, HuggingFaceHub, LLMChain import os os.environ['HUGGINGFACEHUB_API_TOKEN'] = 'token' # initialize HF LLM flan_t5 = HuggingFaceHub( repo_id="google/flan-t5-xl", model_kwargs={"temperature":…
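For reference, a minimal sketch of the setup this question describes, assuming the older langchain API that still exposes HuggingFaceHub and LLMChain; the prompt text and max_length value are illustrative.

    import os
    from langchain import PromptTemplate, HuggingFaceHub, LLMChain

    os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."  # placeholder token

    # Wrapper around the hosted Inference API for google/flan-t5-xl.
    flan_t5 = HuggingFaceHub(
        repo_id="google/flan-t5-xl",
        model_kwargs={"temperature": 0.1, "max_length": 64},
    )

    prompt = PromptTemplate(
        input_variables=["question"],
        template="Question: {question}\nAnswer:",
    )

    chain = LLMChain(llm=flan_t5, prompt=prompt)
    print(chain.run(question="Which city is the capital of France?"))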
3 votes • 1 answer

Google's flan-t5 models are not loading on HuggingFaceHub through Langchain

I am trying to replicate the example code provided on the Langchain website (link here), but I get the following error whether I run it on Google Colab or locally: HfHubHTTPError: 504 Server Error: Gateway Time-out for url:…
r1sh4bh • 61 • 5
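One way to tell whether the 504 comes from the hosted endpoint rather than from Langchain is to call the Inference API directly; a hedged sketch assuming the public api-inference endpoint and a token in HUGGINGFACEHUB_API_TOKEN.

    import os
    import requests

    # Query the hosted Inference API for google/flan-t5-xl directly,
    # bypassing Langchain, to see whether the endpoint itself times out.
    API_URL = "https://api-inference.huggingface.co/models/google/flan-t5-xl"
    headers = {"Authorization": f"Bearer {os.environ['HUGGINGFACEHUB_API_TOKEN']}"}

    response = requests.post(
        API_URL,
        headers=headers,
        json={"inputs": "Translate to German: How are you?"},
        timeout=60,  # fail fast instead of waiting on a hung gateway
    )
    print(response.status_code, response.text)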
3 votes • 1 answer

Using a custom-trained Hugging Face tokenizer

I’ve trained a custom tokenizer on a custom dataset using the code from the documentation. Is there a way for me to add this tokenizer to the Hub and use it like the other tokenizers by calling AutoTokenizer.from_pretrained()…
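A minimal sketch of one way to do this, assuming the tokenizer was built with the tokenizers library and saved to a JSON file; the file name and repo id are hypothetical.

    from tokenizers import Tokenizer
    from transformers import PreTrainedTokenizerFast, AutoTokenizer

    # Wrap the raw tokenizers.Tokenizer in a transformers-compatible class.
    raw_tokenizer = Tokenizer.from_file("my-tokenizer.json")
    fast_tokenizer = PreTrainedTokenizerFast(tokenizer_object=raw_tokenizer)

    # Push to the Hub (requires `huggingface-cli login` or an HF token).
    fast_tokenizer.push_to_hub("my-username/my-custom-tokenizer")

    # It can then be loaded back like any other tokenizer.
    tokenizer = AutoTokenizer.from_pretrained("my-username/my-custom-tokenizer")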
1 vote • 2 answers

How to load a Hugging Face dataset from a local path?

Take a simple example from this website, https://huggingface.co/datasets/Dahoas/rm-static: if I want to load this dataset online, I just use: from datasets import load_dataset dataset = load_dataset("Dahoas/rm-static"). What if I want to…
4daJKong • 1,825 • 9 • 21
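For reference, a sketch of two common ways to load the same data from a local copy, assuming the files were already downloaded; the paths are illustrative.

    from datasets import load_dataset, load_from_disk

    # Option 1: point load_dataset at local data files (e.g. Parquet shards).
    dataset = load_dataset(
        "parquet",
        data_files={"train": "rm-static/data/train-*.parquet"},  # illustrative path
    )

    # Option 2: if the dataset was saved with dataset.save_to_disk(...),
    # reload it with load_from_disk.
    dataset = load_from_disk("path/to/saved/rm-static")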
1 vote • 1 answer

create_csv_agent with HuggingFace: could not parse LLM output

I am using Langchain and applying create_csv_agent to a small CSV dataset to see how well google/flan-t5-xxl can answer queries over tabular data. As of now, I am running into 'OutputParserException: Could not parse LLM output: `0`'…
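A minimal sketch of the agent setup the question refers to, assuming an older langchain release that still ships create_csv_agent; the CSV file name and question are illustrative.

    from langchain import HuggingFaceHub
    from langchain.agents import create_csv_agent

    llm = HuggingFaceHub(
        repo_id="google/flan-t5-xxl",
        model_kwargs={"temperature": 0.1, "max_length": 256},
    )

    # Builds a pandas-backed agent over the CSV file.
    agent = create_csv_agent(llm, "data.csv", verbose=True)
    agent.run("How many rows are in the table?")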
1 vote • 0 answers

TypeError: HfApi.create_repo() got an unexpected keyword argument 'organization'

I am following the tutorial https://huggingface.co/blog/how-to-train-sentence-transformers. Its Colab notebook is…
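Recent huggingface_hub releases address the repository with a fully-qualified repo_id instead of a separate organization argument; a sketch of the current-style call, with a hypothetical organization and repo name.

    from huggingface_hub import HfApi

    api = HfApi()

    # Old style (rejected by recent versions):
    # api.create_repo("my-model", organization="my-org")

    # Current style: pass the namespace as part of repo_id.
    api.create_repo(repo_id="my-org/my-model", repo_type="model", exist_ok=True)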
1 vote • 1 answer

How to load Unbabel Comet model without nested wrapper initialization?

Unbabel COMET is a scoring library for machine translation. By default, loading the model as per the README works: from comet import download_model, load_from_checkpoint model_path = download_model("Unbabel/wmt22-comet-da") model =…
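For reference, a hedged completion of the README snippet the excerpt starts from; the sample segment is illustrative.

    from comet import download_model, load_from_checkpoint

    # Download the checkpoint from the Hub and load it.
    model_path = download_model("Unbabel/wmt22-comet-da")
    model = load_from_checkpoint(model_path)

    data = [{
        "src": "Dem Feuer konnte Einhalt geboten werden",
        "mt": "The fire could be stopped",
        "ref": "They were able to control the fire.",
    }]
    output = model.predict(data, batch_size=8, gpus=0)
    print(output.scores)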
0 votes • 0 answers

How can I find the list of all the environment variables supported by a Hugging Face model?

How can I find the list of all the environment variables supported by a Hugging Face model? E.g. for the https://huggingface.co/tiiuae/falcon-40b-instruct model, based on the comment provided in this post:…
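One rough way to see which variables the huggingface_hub library itself reads is to inspect its constants module; a sketch under the assumption that recent versions expose these knobs as module-level names mirroring the environment variables.

    import huggingface_hub.constants as hub_constants

    # Print every module-level name that looks like an environment-variable knob.
    for name in sorted(dir(hub_constants)):
        if name.startswith(("HF_", "HUGGINGFACE_")):
            print(name, "=", getattr(hub_constants, name))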
0 votes • 1 answer

How to use a locally saved UniTE MUP model with Unbabel COMET for Machine Translation Evaluation?

From https://huggingface.co/Unbabel/unite-mup, there's a model that comes from the UniTE: Unified Translation Evaluation paper. The usage is documented as follows: from comet import download_model, load_from_checkpoint model_path =…
alvas • 115,346 • 109 • 446 • 738
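load_from_checkpoint accepts a filesystem path, so a locally saved checkpoint can be pointed at directly; a minimal sketch with an illustrative local path.

    from comet import load_from_checkpoint

    # Point load_from_checkpoint at the locally saved checkpoint file
    # instead of calling download_model first.
    model = load_from_checkpoint("local/unite-mup/checkpoints/model.ckpt")  # illustrative path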
0 votes • 0 answers

Hugging Face HTTP error when loading data in Parquet format, when the only way to get it is from the website's data viewer: how to fix?

Because some Hugging Face datasets have disappeared, I've had to get the data from the data viewer using the Parquet option. But when I try to run it, there is some sort of HTTP error. I've tried downloading the data but can't. What is the recommended…
Charlie Parker • 5,884 • 57 • 198 • 323
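One hedged workaround once a Parquet shard has been exported from the data viewer is to read it with pandas and wrap it back into a datasets.Dataset; the file name is illustrative and pyarrow (or fastparquet) must be installed.

    import pandas as pd
    from datasets import Dataset

    # Read the Parquet file exported from the dataset viewer...
    df = pd.read_parquet("train-00000-of-00001.parquet")  # illustrative file name

    # ...and wrap it back into a datasets.Dataset for the usual API.
    dataset = Dataset.from_pandas(df)
    print(dataset)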
0 votes • 1 answer

How does one create a PyTorch DataLoader with a custom Hugging Face dataset without errors?

Currently my custom dataset gives None indices in the data loader, but NOT in the plain dataset. When I wrap it in a PyTorch DataLoader it fails. The code is in Colab, but I'll put it here in case Colab dies someday: pip install datasets pip install…
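For comparison, a sketch of the usual pattern for feeding a Hugging Face dataset to a PyTorch DataLoader: with_format("torch") restricts the output to tensor columns so the default collate_fn does not see unexpected fields; the dataset and column names are illustrative.

    from datasets import load_dataset
    from torch.utils.data import DataLoader

    dataset = load_dataset("imdb", split="train")  # stand-in for the custom dataset

    # Return PyTorch tensors for the listed columns only, so the default
    # collate_fn does not trip over string or None fields.
    dataset = dataset.with_format("torch", columns=["label"])

    loader = DataLoader(dataset, batch_size=8, shuffle=True)
    batch = next(iter(loader))
    print(batch["label"].shape)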
0 votes • 1 answer

Is there a way to search a Hugging Face repository for a specific filename?

I'd like to search a Hugging Face repository for a specific filename without having to clone it first, as it is a rather large repo with thousands of files. I couldn't find a way to do it with the web interface, so I installed the Python package…
asiera • 492 • 5 • 12
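huggingface_hub can list a repository's files without cloning it; a sketch using HfApi.list_repo_files, with an illustrative repo id and filename.

    from huggingface_hub import HfApi

    api = HfApi()

    # Returns the full file listing without downloading anything.
    files = api.list_repo_files("bigscience/bloom", repo_type="model")  # illustrative repo

    matches = [f for f in files if f.endswith("config.json")]
    print(matches)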
0 votes • 0 answers

Gradio is not auto-downloading Hugging Face model weights: it auto-downloads on Google Colab but not on Windows

The code below automatically downloads the necessary model files and weights on Google Colab but not on Windows. Any ideas how to fix it? import gradio as gr from diffusers import DiffusionPipeline import torch import base64 from io import…
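A hedged way to isolate the download step from Gradio is to call DiffusionPipeline.from_pretrained directly with an explicit cache_dir, so a Windows-side path or permission problem surfaces immediately; the model id and cache directory are illustrative.

    import torch
    from diffusers import DiffusionPipeline

    # Trigger the weight download outside Gradio, with an explicit cache location,
    # to see whether the problem is the download itself or the Gradio app.
    pipe = DiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",   # illustrative model id
        torch_dtype=torch.float16,
        cache_dir=r"C:\hf-cache",           # illustrative Windows cache dir
    )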
0 votes • 0 answers

In Hugging Face, is it possible to add files to an existing dataset instead of overwriting it each time?

I am creating a dataset from a generator and then want to save it to the Hub. However, I actually want to append to the dataset on the Hub, not overwrite it. Is this possible? My code is below. After running it 10 times, I would like each row…
Funzo • 1,190 • 2 • 14 • 25
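The Hub does not append on its own, but one hedged pattern is to pull the current dataset, concatenate the new rows locally, and push the combined result back; the repo id and column are hypothetical.

    from datasets import Dataset, concatenate_datasets, load_dataset

    new_rows = Dataset.from_dict({"text": ["another example row"]})

    # Pull what is already on the Hub, append locally, then push the union back.
    existing = load_dataset("my-username/my-dataset", split="train")  # hypothetical repo
    combined = concatenate_datasets([existing, new_rows])
    combined.push_to_hub("my-username/my-dataset")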
0 votes • 0 answers

Hugging Face: not able to run certain code in PySpark

I'm able to run the code in plain Python, but I get this error when running it in a Spark UDF: PythonException: 'ImportError: cannot import name 'CommitOperationAdd' from 'huggingface_hub'…
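This kind of ImportError often means the Spark executors see an older or shadowed huggingface_hub; a hedged check that can be run on a worker to confirm which version is importable there.

    import huggingface_hub

    # CommitOperationAdd only exists in reasonably recent huggingface_hub releases,
    # so printing the version on the executor narrows the problem down.
    print(huggingface_hub.__version__)

    from huggingface_hub import CommitOperationAdd  # raises ImportError on old versions
    print(CommitOperationAdd)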