In short: What are the limits of both cloud projects and service accounts per project? How can they be increased? Is the architecture a good idea at all?
I am developing an IoT application with tens of thousands of planned devices in the field: a few devices per customer, across hundreds of customers. Every device will continuously (24/7) stream measurement data directly into BigQuery, with one dataset (or table) per device, at sample rates of at least 100 Hz.
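For context, this is roughly what I have in mind on the device side (a minimal sketch assuming the `google-cloud-bigquery` Python client; the table naming and batch size are placeholders): samples are buffered locally and pushed with streaming inserts in batches rather than one row per sample, since 100 Hz of single-row inserts would be wasteful.

```python
from typing import Iterable, List


def batch_samples(samples: Iterable[dict], batch_size: int = 500) -> List[List[dict]]:
    """Group individual 100 Hz samples into batches so each
    streaming-insert call carries many rows instead of one."""
    batches: List[List[dict]] = []
    current: List[dict] = []
    for sample in samples:
        current.append(sample)
        if len(current) >= batch_size:
            batches.append(current)
            current = []
    if current:
        batches.append(current)
    return batches


def stream_to_bigquery(samples: Iterable[dict], table_id: str) -> None:
    """Hypothetical wiring: requires google-cloud-bigquery installed and
    the device authenticated with its own (restricted) credentials."""
    from google.cloud import bigquery

    client = bigquery.Client()
    for batch in batch_samples(samples):
        # Streaming insert of one batch of JSON rows into the device's table.
        errors = client.insert_rows_json(table_id, batch)
        if errors:
            raise RuntimeError(f"insert failed: {errors}")
```

The batching helper is independent of the client, so the insert cadence can be tuned per device without touching the upload path.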
Of course, every device needs to be authenticated and authorized so that it gets tightly restricted access to its own dataset. As stated in the Auth Guide, API keys are not very secure. The most appropriate solution therefore seems to be one service account per customer, with one account key per device (as suggested in this GCP article). However, the Cloud IAM FAQs state that the number of service accounts is limited to 100 per project.
- This limit could be reached quickly. If so, how easy (or costly) would it be to raise it to thousands or tens of thousands of service accounts per project?
- In such a scenario, the number of projects needed could also grow to hundreds or thousands. Would that be feasible?
- Is this overall concept practical, or are there better approaches within GCP?
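For reference, the per-customer provisioning I have in mind would look roughly like this (a sketch with `gcloud`; the customer, project, and device names are placeholders, and granting the service account dataset-level BigQuery access would happen in a separate step):

```shell
# Hypothetical provisioning for one customer (all names are placeholders).
CUSTOMER="customer-0042"
PROJECT="my-iot-project"

# One service account per customer...
gcloud iam service-accounts create "${CUSTOMER}" \
    --project="${PROJECT}" \
    --display-name="Streaming SA for ${CUSTOMER}"

# ...and one key per device, shipped to the device at provisioning time.
for DEVICE in device-a device-b device-c; do
  gcloud iam service-accounts keys create "${DEVICE}-key.json" \
      --iam-account="${CUSTOMER}@${PROJECT}.iam.gserviceaccount.com"
done
```

With hundreds of customers this loop is exactly what would run into the 100-service-accounts-per-project limit mentioned above.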