There are a few ways to manage resources across projects. Probably the most straightforward is to:
- Create a service account with appropriate permissions across your project(s).
- Set up an Airflow connection using the service account you created.
- Create workflows that use that connection, specifying the target project when you create a Cloud Dataproc cluster (see the sketch below).
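
As a minimal sketch, assuming a recent Airflow with the `apache-airflow-providers-google` package installed — the connection ID `my_gcp_conn`, project ID `other-project-id`, and cluster details are placeholders for illustration:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocCreateClusterOperator,
)

with DAG(
    dag_id="cross_project_dataproc",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # The connection "my_gcp_conn" holds the cross-project service account
    # credentials; project_id points at the *other* project, so the cluster
    # is created there rather than in the project Airflow runs in.
    create_cluster = DataprocCreateClusterOperator(
        task_id="create_cluster",
        gcp_conn_id="my_gcp_conn",       # Airflow connection for the service account
        project_id="other-project-id",   # placeholder: the target project
        region="us-central1",
        cluster_name="example-cluster",
        cluster_config={
            "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-2"},
            "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-2"},
        },
    )
```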
Alternative approaches that come to mind:
- Use something like the BashOperator or PythonOperator to execute Cloud SDK commands (sketched after this list).
- Use an HTTP operator to call the REST endpoints of the services you want to use.
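
A rough illustration of the first alternative, with the same placeholder names as above; it assumes `gcloud` is available on the worker and authenticated as the service account (for example via `GOOGLE_APPLICATION_CREDENTIALS`):

```python
from airflow.operators.bash import BashOperator

# Inside the same DAG context as above: shell out to the Cloud SDK and
# pass --project explicitly to target the other project.
create_cluster_cli = BashOperator(
    task_id="create_cluster_cli",
    bash_command=(
        "gcloud dataproc clusters create example-cluster "
        "--project=other-project-id --region=us-central1"
    ),
)
```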
That said, the first approach, using the Dataproc operators with a dedicated connection, is by far the easiest and is the recommended way to do what you want.
With respect to Dataproc, a job binds only to clusters within the same project; it's not possible to submit jobs in one project against clusters in another. This is because logging, auditing, and other job-related semantics get messy when the cluster lives in a different project.
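
In practice that means job submission has to target the same project the cluster was created in. A hedged sketch, reusing the placeholder names from above:

```python
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

# The job's project_id must match the project the cluster lives in;
# "placement" then names a cluster inside that same project.
submit_job = DataprocSubmitJobOperator(
    task_id="submit_job",
    gcp_conn_id="my_gcp_conn",
    project_id="other-project-id",   # same project as the cluster above
    region="us-central1",
    job={
        "placement": {"cluster_name": "example-cluster"},
        "pyspark_job": {"main_python_file_uri": "gs://some-bucket/job.py"},  # placeholder path
    },
)
```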