
I have a Cloud Composer DAG whose schedule is set to None, so it has to be triggered manually.

I have uploaded my DAG code to the Cloud Composer GCS folder and tried to trigger the DAG from my local machine using my local gcloud credentials, as suggested in the Python sample (https://github.com/GoogleCloudPlatform/python-docs-samples/blob/HEAD/composer/rest/composer2/composer2_airflow_rest_api.py). That worked fine: I could trigger the DAG from my local machine using the application-default credentials.

Now I want to trigger the same DAG from my local machine using Java code, but the documentation was not helpful for creating a client and triggering a DAG run. I know that Google provides the google-cloud-orchestration-airflow package (https://cloud.google.com/java/docs/reference/google-cloud-orchestration-airflow/latest/overview), but it does not have a class through which I can trigger a DAG run from our Java service code. I want to create a Java client for my DAG and trigger the DAG run manually.


1 Answer


Instead of using the google-cloud-orchestration-airflow library, I duplicated what the Python sample does and triggered the DAG run from my local machine by calling the Airflow REST API directly, adding an Authorization header set to an access token generated from the application-default credentials on my machine.

Attaching the working code:

import com.google.auth.oauth2.GoogleCredentials;
import com.google.gson.JsonObject;
import org.apache.http.HttpEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class Main {
    static Logger logger = LoggerFactory.getLogger(Main.class);

    public static void main(String[] args) throws IOException {
        // Build the request payload; "conf" can stay empty or carry DAG run configuration.
        JsonObject jsonPayload = new JsonObject();
        jsonPayload.add("conf", new JsonObject());

        // Get an access token from the Application Default Credentials
        // (set up locally with `gcloud auth application-default login`).
        GoogleCredentials credentials = GoogleCredentials.getApplicationDefault();
        credentials.refresh();
        String accessToken = credentials.getAccessToken().getTokenValue();

        // Airflow 2 stable REST API endpoint for creating a DAG run
        // (the same endpoint the Python sample calls).
        HttpPost httpPost = new HttpPost(
                "https://<env-url>.composer.googleusercontent.com/api/v1/dags/<dag-name>/dagRuns");
        httpPost.setHeader("Authorization", "Bearer " + accessToken);
        httpPost.setHeader("Content-Type", ContentType.APPLICATION_JSON.getMimeType());
        // Set the request body to trigger the DAG run.
        httpPost.setEntity(new StringEntity(jsonPayload.toString(), StandardCharsets.UTF_8));

        // Send the request and log the Airflow web server's response.
        try (CloseableHttpClient httpClient = HttpClients.createDefault();
             CloseableHttpResponse httpResponse = httpClient.execute(httpPost)) {
            HttpEntity entity = httpResponse.getEntity();
            String responseContent = EntityUtils.toString(entity);
            logger.info("Received response: {}", responseContent);
        } catch (IOException e) {
            logger.error("Failed to send request: {}", e.getMessage(), e);
        }
    }
}
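
For reference, the imports above come from google-auth-library-oauth2-http, Gson, Apache HttpClient 4.x and SLF4J; I'm assuming those are already on your classpath.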

In this way I could trigger a DAG run from my local machine. I couldn't find a way to do it through the google-cloud-orchestration-airflow package itself, but attaching the Authorization header took care of authentication.
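
That said, the package can still be handy for looking up the environment's Airflow web server URL instead of hard-coding the `<env-url>` part. Below is a minimal, unverified sketch of that idea; the class name AirflowUriLookup and the project/location/environment placeholders are mine, and it assumes the v1 EnvironmentsClient and EnvironmentConfig.getAirflowUri() behave as described in the library's API reference.

import com.google.cloud.orchestration.airflow.service.v1.Environment;
import com.google.cloud.orchestration.airflow.service.v1.EnvironmentsClient;
import com.google.cloud.orchestration.airflow.service.v1.GetEnvironmentRequest;

import java.io.IOException;

public class AirflowUriLookup {
    public static void main(String[] args) throws IOException {
        // Placeholder values: replace with your own project, Composer location and environment name.
        String name = String.format("projects/%s/locations/%s/environments/%s",
                "<project-id>", "<location>", "<environment-name>");

        // The client authenticates with Application Default Credentials as well.
        try (EnvironmentsClient client = EnvironmentsClient.create()) {
            Environment environment = client.getEnvironment(
                    GetEnvironmentRequest.newBuilder().setName(name).build());
            // airflowUri is the base URL of the Airflow web server, i.e. the
            // "https://<env-url>.composer.googleusercontent.com" part used in the code above.
            String airflowUri = environment.getConfig().getAirflowUri();
            System.out.println("Airflow web server: " + airflowUri);
        }
    }
}

You could then build the dagRuns URL from that value and reuse the HTTP call from the first snippet.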
