I'm trying to develop a data exploration environment for heavy processing of "Small Data" (10-30 GB). Reliability and stability are not concerns for these lightweight environments (which basically just contain Jupyter, Julia, Python, and R, plus some packages). Instead, I'd like to maximize performance, and the data sets I'm working with are small enough to fit into memory. Is there a way I can boot a Linux image directly into RAM on Google Compute Engine, bypassing the SSD altogether?
Google provides instructions on how to create a RAM disk for storing data (https://cloud.google.com/compute/docs/disks/mount-ram-disks), but I would like all of the OS files to be on the RAM disk as well.
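For reference, the approach in those docs amounts to mounting a tmpfs filesystem on the running instance, roughly like the sketch below (the mount point and size here are placeholders I picked for illustration, not values from the docs):

```
# Create a mount point and mount a tmpfs RAM disk on it
# (size and path are just examples)
sudo mkdir -p /mnt/ramdisk
sudo mount -t tmpfs -o size=32G tmpfs /mnt/ramdisk

# Verify that the RAM disk is mounted
df -h /mnt/ramdisk
```

That works fine for holding the data sets, but the root filesystem (and everything the OS reads at runtime) still lives on the persistent/SSD disk, which is what I'm hoping to avoid.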