
Hugging Face provides transformers and models that allow AI/ML processing offline: https://huggingface.co/ We currently use DigitalOcean (DO), and I would like to offload our ML work onto DO Functions. I know AWS already supports this with a few of its services: https://aws.amazon.com/blogs/compute/hosting-hugging-face-models-on-aws-lambda/ Has anyone done this on DO? It would scale well and be far cheaper than spinning up a droplet. Any help would be appreciated.

This can already be done on AWS, but can the same be done with DO Functions?


1 Answer


Using https://github.com/digitalocean/sample-functions-python-jokes as an example, you can copy that project and then change the requirements.txt file to install the datasets and transformers packages.
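
For example, a minimal requirements.txt might contain the following (torch is my assumption here, since the transformers pipeline needs a backend such as PyTorch installed to run):

transformers
datasets
torch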

Then in the __main__.py file, you can change it to look something like:

from transformers import pipeline

def main(args):
    # Load a GPT-2 text-generation pipeline (downloads the model on first use)
    generator = pipeline('text-generation', model='gpt2')
    # Generate 3 completions of up to 30 tokens each
    text = generator("Hello, I'm a language model", max_length=30, num_return_sequences=3)

    return {
      'body': {
        'response_type': 'in_channel',
        'text': text[0]['generated_text']
      }
    }
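
One caveat worth noting: with the pipeline constructed inside main(), the model is reloaded on every invocation. Here is a sketch of a variant that builds the pipeline once at module level, so warm invocations reuse it, and reads an optional prompt from the invocation parameters (the prompt key is my own choice for illustration, not anything DO requires):

from transformers import pipeline

# Built once per container, so warm invocations skip the model reload
generator = pipeline('text-generation', model='gpt2')

def main(args):
    # Fall back to a default prompt if the caller passes none
    prompt = args.get('prompt', "Hello, I'm a language model")
    text = generator(prompt, max_length=30, num_return_sequences=3)

    return {
      'body': {
        'response_type': 'in_channel',
        'text': text[0]['generated_text']
      }
    }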

Then change the project.yml to look like this:

packages:
  - name: generate
    actions:
      - name: generate
        runtime: 'python:default'
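
Loading a model is slow and memory-hungry, so you may also need to raise the function's limits. Assuming project.yml's limits schema applies here (timeout in milliseconds, memory in MB; the values below are illustrative guesses, not tested numbers):

packages:
  - name: generate
    actions:
      - name: generate
        runtime: 'python:default'
        limits:
          timeout: 60000
          memory: 1024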

Your project directory will ultimately look like this:

sample-transformers-generate/
    packages/
        generate/
            generate/
                __main__.py
                build.sh
                requirements.txt
    README.md
    project.yml

Then deploy from the CLI:

doctl serverless deploy sample-transformers-generate --remote-build

Then to call the function:

doctl sls fn invoke generate/generate
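
If you use the module-level variant sketched above that reads a prompt parameter, you should be able to pass it at invocation time with -p:

doctl sls fn invoke generate/generate -p prompt:"Once upon a time"
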
  • Hi alvas, thanks very much for getting back to me about this. I have one main concern: I just found out that DO only allows 48 MB zips for a function, and when I tried this, it came back saying that I had exceeded the allowed size. Does anyone know a way around this? – RodgerThat Mar 13 '23 at 06:14