After investigating further from my initial comment: you will not find this functionality in Artifact Registry by itself.
All npm packages are bundled as tarballs, and Artifact Registry will behave the same way. From the npm docs, a package is any of the following:

a) A folder containing a program described by a package.json file.
b) A gzipped tarball containing (a).
c) A URL that resolves to (b)...
What UNPKG does (external to npm) is download these tarball files, decompress them, and serve the static files as a separate service (CDN).
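To make the tarball layout concrete, here is a local simulation (the "demo" package and its files are fabricated for the example; a tarball fetched from Artifact Registry can be listed the same way). Note that npm-style tarballs typically keep all their members under a top-level package/ directory:

```shell
# Build a stand-in npm-style tarball locally, then list its contents
# the way you could list one fetched from your registry.
mkdir -p package
printf '{"name":"demo","version":"1.0.0"}\n' > package/package.json
printf 'module.exports = () => "hi";\n' > package/index.js
tar czf demo-1.0.0.tgz package
tar tzf demo-1.0.0.tgz   # members are prefixed with package/
```

Knowing that package/ prefix matters later, when you want to pull a single file out of the archive.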
One way to download the tarball with a GET request is curl with -L
to follow redirects (otherwise you get an empty response):
curl -L "https://<region>-npm.pkg.dev/<projectid>/<reponame>/<@scope>/<pkgname>/-/<@scope>/<pkgname-version>.tgz" -o name.tgz
You can also pipe the tarball into tar xvzf - -C ./destination
to download and extract it in one command, leaving only the loose source files:
curl -L "<artifact-registry-repo-package>" | tar xvzf - -C ./destination
You can also search the gzipped tarball you fetch from your repository and keep only the specific files you need. This is very close to plugging a URL into a browser and getting a single file back (remember that members are prefixed with package/):
curl -L "<artifact-registry-repo-package>" | tar xvzf - "<package/path/to/required/file.js>"
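As a self-contained sketch of that single-file extraction (a locally built stand-in tarball replaces the real Artifact Registry URL, since the registry path and package name above are placeholders):

```shell
# Fabricate a stand-in tarball; this stands in for the curl download.
mkdir -p package
printf 'console.log("hello");\n' > package/lib.js
printf '{"name":"demo"}\n' > package/package.json
tar czf demo.tgz package
# Stream the tarball into tar and extract only the one member we want,
# exactly as you would with: curl -L "<url>" | tar xzf - package/lib.js
mkdir -p only
cat demo.tgz | tar xzf - -C only package/lib.js
ls only/package   # contains just lib.js, not package.json
```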
To get something like UNPKG, where URLs serve your files directly, you need to combine several services into your own CDN.
One way of achieving this is storing your artifacts in Cloud Storage automatically at build time (if you are using Cloud Build and Cloud Storage), or implementing a Cloud Function that reads your artifacts, decompresses the tarballs, and stores the loose files in Cloud Storage. From there, how you implement it is up to you.
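A minimal sketch of that tarball-to-Cloud-Storage step, under stated assumptions: the bucket name (gs://my-cdn-bucket), the demo package, and its version path are all placeholders I made up, and the upload command is printed rather than executed because it needs real credentials and a real bucket. In a real pipeline the tarball would come from the curl download shown earlier:

```shell
# Stand-in for: curl -L "<artifact-registry-repo-package>" -o work/pkg.tgz
mkdir -p work/package
printf '{"name":"demo","version":"1.0.0"}\n' > work/package/package.json
( cd work && tar czf pkg.tgz package )
# Decompress to loose files.
mkdir -p work/extracted
tar xzf work/pkg.tgz -C work/extracted
# Sync the loose files to a public bucket (shown as echo; placeholder bucket):
echo gsutil -m rsync -r work/extracted/package gs://my-cdn-bucket/demo/1.0.0
# Files would then be fetchable at, e.g.:
#   https://storage.googleapis.com/my-cdn-bucket/demo/1.0.0/package.json
```

The same extract-and-sync logic could live inside a Cloud Function or a Cloud Build step, triggered whenever a new package version is published.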
As for off-the-shelf solutions, the closest I could find is this one, but I have not tested it with Artifact Registry.