
I created an npm repository in Google's Artifact Registry called npm-public. I have a project that, when published, has the following structure...

- server.mjs
- package.json

I publish the artifacts to the repository and can install them locally. Next I try to make reading from the registry public like this...

gcloud artifacts repositories add-iam-policy-binding npm-public --location=LOCATION --member=allUsers --role=roles/artifactregistry.reader

Everything seems correct, so now I would like to access an artifact via the browser. I have tried https://us-central1-npm.pkg.dev/my-project/npm-public/@my-scope/test-app/server.mjs but I get a 404 error.

How do I access a file in the npm Artifact Registry using a simple GET request?

I would like the actual files, not the tar files, similar to how unpkg works...

https://unpkg.com/react@16.7.0/umd/react.production.min.js

Jackie
  • [This](https://stackoverflow.com/questions/69713527/) question deals with the same issue, but for Maven repositories. Have you checked whether either of the two answers works with npm? – ErnestoC Mar 30 '22 at 15:58

1 Answer


After investigating further from my initial comment: you will not find this functionality in Artifact Registry by itself.

All npm packages are distributed as gzipped tarballs, and Artifact Registry behaves the same way. From the npm docs:

> A package is any of the following: a) A folder containing a program described by a package.json file. b) A gzipped tarball containing (a). c) A URL that resolves to (b)...

What UNPKG does (external to npm) is download these tarball files, decompress them, and serve the static files as a separate service (CDN).
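
For illustration, here is a minimal Node sketch of that flow against a public Artifact Registry repository: look up the tarball URL from the package metadata (Artifact Registry serves the standard npm registry JSON, which is what lets npm install resolve versions), then stream a single file out of the tarball. The project, scope, package, and version values are placeholders based on the question, and the tar-stream package is assumed:

```
// fetch-file.mjs — unpkg-style "give me one file from a package" sketch.
// Assumes Node 18+ (built-in fetch) and `npm i tar-stream`.
import { Readable } from 'node:stream';
import { createGunzip } from 'node:zlib';
import tar from 'tar-stream';

// Placeholder values taken from the question; replace with your own.
const REGISTRY = 'https://us-central1-npm.pkg.dev/my-project/npm-public';
const PKG = '@my-scope/test-app';
const VERSION = '1.0.0';            // hypothetical version
const FILE = 'package/server.mjs';  // npm tarballs nest files under package/

// 1. The npm registry protocol serves package metadata as JSON;
//    each version's dist.tarball field holds its download URL.
const meta = await (await fetch(`${REGISTRY}/${PKG}`)).json();
const tarballUrl = meta.versions[VERSION].dist.tarball;

// 2. Stream the tarball through gunzip + tar, keeping only the file we want.
const extract = tar.extract();
extract.on('entry', (header, stream, next) => {
  if (header.name === FILE) {
    stream.pipe(process.stdout); // or write it to disk / serve it over HTTP
  } else {
    stream.resume();             // drain and discard every other entry
  }
  stream.on('end', next);
});

const res = await fetch(tarballUrl);
Readable.fromWeb(res.body).pipe(createGunzip()).pipe(extract);
```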

One way to download the tarball with a GET request is to use curl with -L to take care of redirects (otherwise, there will be no response):

curl -L "https://<region>-npm.pkg.dev/<projectid>/<reponame>/<@scope>/<pkgname>/-/<@scope>/<pckname-version>.tgz" -o name.tgz

You can also pipe the tarball into tar xvz -C ./destination to download and extract it in one command, leaving only the loose source files:

curl -L "<artifact-registry-repo-package>" | tar xvz -C ./destination

You can even search the gzipped tar you fetch from your repository, extract only the specific files you need, and discard the rest. This is very close to plugging a URL into a browser and getting the file back. Note that npm tarballs nest their contents under a top-level package/ directory, so member paths start with package/:

curl -L "<artifact-registry-repo-package>" | tar zxvf - </path/to/required/file.js>

To have something like UNPKG, where URLs directly host your files, you would need to combine several solutions (to make your own CDN).

One way of achieving this is to store your artifacts in Cloud Storage automatically at build time (if you are using Cloud Build and Cloud Storage), or to implement a Cloud Function that reads your artifacts, decompresses the tarballs, and stores the files in Cloud Storage. From there, it's up to you how to implement it.
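
As a rough sketch of that Cloud Function idea (the bucket name and function signature are hypothetical; it assumes Node 18+ plus the @google-cloud/storage and tar-stream packages):

```
import { Readable } from 'node:stream';
import { createGunzip } from 'node:zlib';
import { Storage } from '@google-cloud/storage';
import tar from 'tar-stream';

// Hypothetical bucket that acts as the file-level origin for your "CDN".
const bucket = new Storage().bucket('my-unpkg-bucket');

// Mirror one published tarball into Cloud Storage, one object per file,
// keyed as <pkg>@<version>/<path> so every file gets its own URL.
export async function mirrorPackage(tarballUrl, pkg, version) {
  const res = await fetch(tarballUrl);
  const extract = tar.extract();

  extract.on('entry', (header, stream, next) => {
    if (header.type !== 'file') {
      stream.resume();
      return next();
    }
    const name = header.name.replace(/^package\//, ''); // strip tarball root
    stream
      .pipe(bucket.file(`${pkg}@${version}/${name}`).createWriteStream())
      .on('finish', next);
  });

  await new Promise((resolve, reject) => {
    extract.on('finish', resolve).on('error', reject);
    Readable.fromWeb(res.body).pipe(createGunzip()).pipe(extract);
  });
}
```

Making the bucket public (or fronting it with Cloud CDN) then gives you unpkg-style URLs for each file.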

As for off-the-shelf solutions, the closest I could find is this one, but I have not tested it with Artifact Registry.

ErnestoC
  • Can you speak a little more on this? Are you suggesting we upload to Cloud Storage as well after publish? – Jackie Mar 31 '22 at 13:51
  • Also, is there another out-of-the-box solution that could just handle the caching server-side? Avoiding the need for Cloud Storage (so I don't need two source-of-truth artifact locations) and allowing the dependencies to be extracted and cached on the local server? Right now, something like Express static seems the best solution. – Jackie Mar 31 '22 at 13:59
  • I can get the tar file like this `curl -L -v "https://us-central1-npm.pkg.dev/pure-plaform/npm-public/@pure-pm/bucky-barnes/-/@pure-pm/bucky-barnes-0.0.1.tgz"`. Do you know of any CDNs that support custom NPM registries? – Jackie Mar 31 '22 at 15:57
  • @Jackie I edited my answer to reply to those questions and elaborate a bit more. Let me know if it was useful. – ErnestoC Mar 31 '22 at 17:31
  • I looked at jsDelivr but it doesn't look like it supports custom registries, just certain ones. The Cloud Function approach may be the one I end up needing to go with because I really don't want to store it in two places (pay double storage costs), but I also want some caching so I don't use up my quota. I asked another question about specifically which CDNs work with Google Artifact Registry – Jackie Mar 31 '22 at 18:14
  • @Jackie As GET requests are concerned in this question, I also found out that you can download the tar, fetch only specific files from it and not keep everything else with this single command: `curl -L "<artifact-registry-repo-package>" | tar zxvf - <path/to/file>`. This is **very** close to what you need using GET requests. You can keep adding paths to files to fetch all the specific files you need easily. I added this to my answer as well. – ErnestoC Apr 01 '22 at 18:34
  • I ended up doing a simple Express server using tar-stream to allow me to manage the cache and avoid hitting the registry too much. It still seems weird that there isn't an open-source, out-of-the-box solution for this requirement – Jackie Apr 03 '22 at 14:54
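
For reference, here is a minimal sketch of the kind of server the last comment describes, using Express and tar-stream. The tarball URL is a placeholder, and this is an illustration rather than Jackie's actual code:

```
import { createWriteStream } from 'node:fs';
import { mkdir } from 'node:fs/promises';
import { dirname, join } from 'node:path';
import { Readable } from 'node:stream';
import { createGunzip } from 'node:zlib';
import express from 'express';
import tar from 'tar-stream';

const CACHE = './cache';
const TARBALL = '<artifact-registry-repo-package>'; // your .tgz URL

// Fetch the tarball once and unpack it into a local cache directory,
// so later requests are served from disk and never hit the registry.
async function warmCache() {
  const res = await fetch(TARBALL);
  const extract = tar.extract();
  extract.on('entry', async (header, stream, next) => {
    if (header.type !== 'file') {
      stream.resume();
      return next();
    }
    const dest = join(CACHE, header.name.replace(/^package\//, ''));
    await mkdir(dirname(dest), { recursive: true });
    stream.pipe(createWriteStream(dest)).on('finish', next);
  });
  await new Promise((resolve, reject) => {
    extract.on('finish', resolve).on('error', reject);
    Readable.fromWeb(res.body).pipe(createGunzip()).pipe(extract);
  });
}

await warmCache();
express().use(express.static(CACHE)).listen(3000);
```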