
I know that in Google Cloud Pub/Sub, messages will be lost after 7 days regardless of their acknowledgement state. Is there any way we can send and store those messages in a file, CSV, or MQ even after 7 days? My aim is that whenever the publisher publishes a message, it should also be stored somewhere else.

Thanks, santosh

santosh.a

1 Answer


There is no automatic way to store the messages that are published into Google Cloud Pub/Sub, but you could set up a subscriber that stores the messages as they are published. You would create a separate subscription on your topic that is used only for making the backups. Then, you would write a subscriber that reads messages from this subscription and immediately persists them in the desired place and format. You could use Cloud Dataflow to do this by connecting a PubsubIO source on the input side to a TextIO sink on the output side.
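
For example, here is a minimal sketch of the Dataflow approach, assuming the Apache Beam Java SDK with its PubsubIO and TextIO connectors. The project, subscription, and bucket names are placeholders, and the fixed windowing and single shard are assumptions added so the unbounded Pub/Sub stream can be written by a file-based sink; adjust them to your own setup.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.options.StreamingOptions;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    public class PubsubBackupPipeline {
      public static void main(String[] args) {
        StreamingOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(StreamingOptions.class);
        options.setStreaming(true);

        Pipeline pipeline = Pipeline.create(options);

        pipeline
            // Read raw message payloads from the dedicated backup subscription.
            .apply("ReadFromPubSub",
                PubsubIO.readStrings()
                    .fromSubscription("projects/my-project/subscriptions/backup-subscription"))
            // File sinks need windows to close before they can write an unbounded stream.
            .apply("Window", Window.into(FixedWindows.of(Duration.standardMinutes(5))))
            // Write one text file per window to Cloud Storage.
            .apply("WriteToGcs",
                TextIO.write()
                    .to("gs://my-backup-bucket/pubsub/backup")
                    .withWindowedWrites()
                    .withNumShards(1));

        pipeline.run();
      }
    }

With this layout, each five-minute window is flushed to Cloud Storage as a text file, so the backup lags the topic by at most the window length.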

Kamal Aboul-Hosn
  • Thank you for the reply. Also, is there any way we can save topics and subscriptions that are inactive for more than 30 days? – santosh.a Feb 08 '17 at 08:31
  • Topics will not be deleted after 30 days automatically, only subscriptions (the documentation needs to be updated to reflect this fact). The only way to prevent that is to perform an action like pull on the subscription or have a message successfully delivered via push. – Kamal Aboul-Hosn Feb 08 '17 at 16:15
  • Thank you for the reply. I'm just a beginner with Google Pub/Sub and OAuth tokens. Could you tell me how I can generate an access token? For example, for executing this command: curl -i -H "Accept: application/json" -H "Content-Type: application/json" -H "Content-Length: 0" -H "Authorization: Bearer " -X PUT "https://pubsub.googleapis.com/v1/projects/{project}/topics/{topicName}" – santosh.a Feb 10 '17 at 13:43
  • See the documentation about GCP authorization: https://cloud.google.com/docs/authentication – Kamal Aboul-Hosn Feb 10 '17 at 18:46
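
For quick tests, you can get a token for the Authorization header with the gcloud command "gcloud auth print-access-token" and paste it after "Bearer" in the curl call above. If you need to fetch a token programmatically, below is a minimal sketch using the Application Default Credentials flow from google-auth-library-java; the class name is just illustrative, and the credentials are assumed to come from a service-account key file or a local gcloud login.

    import com.google.auth.oauth2.AccessToken;
    import com.google.auth.oauth2.GoogleCredentials;
    import java.util.Collections;

    public class PrintPubsubAccessToken {
      public static void main(String[] args) throws Exception {
        // Application Default Credentials: picks up GOOGLE_APPLICATION_CREDENTIALS
        // (a service-account key file) or your local gcloud user credentials.
        GoogleCredentials credentials = GoogleCredentials.getApplicationDefault()
            .createScoped(Collections.singletonList("https://www.googleapis.com/auth/pubsub"));

        // Exchange the credentials for a short-lived OAuth 2.0 access token.
        AccessToken token = credentials.refreshAccessToken();

        // Paste this value after "Bearer " in the Authorization header of the curl call.
        System.out.println(token.getTokenValue());
      }
    }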