
Is there a way to export logs out of IBM Cloud? Mainly activity logs, which come from Activity Tracker. Also, does anyone know where these logs are stored? I can only view them inside Kibana but don't see any storage associated with it.

I tried

ibmcloud logging log-show

but it errors out with:

read: connection reset by peer
NoviceMe

1 Answer


First of all, you must choose a premium plan to make API calls and export the logs from Activity Tracker.

Once you create the Activity Tracker service, you can use the CLI to export the logs by creating a session.

ibmcloud at session help create                                                                                                           
NAME:
   bx at session create - Create a new session
USAGE:
   bx at session create [command options] [arguments...]

OPTIONS:
   --start value, -s value        start date, in format 2006-01-02 (default: "2018-11-09")
   --end value, -e value          end date, in format 2006-01-02 (default: "2018-11-22")
   --at-account-level, -a         include entire account (default: current space only)
   -T value, --search-time value  Specify search time with the hour of one day, the valid value is 0-23
   --json                         output in JSON format

E.g., ibmcloud at session create -s 2018-11-21 -e 2018-11-22 --json

Once the session is created, it returns a session id.
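If you are scripting this, the `--json` flag makes the session id easy to pick out of the CLI output. A minimal sketch, assuming the JSON output contains a session identifier field (the field name `session_id` and the sample payload below are assumptions for illustration, not the documented output shape):

```python
import json

def extract_session_id(cli_json_output: str) -> str:
    """Parse the --json output of `ibmcloud at session create` and
    return the session id. The "session_id" key is an assumption --
    check the actual JSON your CLI version prints."""
    data = json.loads(cli_json_output)
    return data["session_id"]

# Fabricated sample output, for illustration only:
sample = '{"session_id": "abc123", "start": "2018-11-21", "end": "2018-11-22"}'
print(extract_session_id(sample))  # abc123
```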

You can download the logs using this command

ibmcloud at download -o events.log <SESSION_ID>

For more details on downloading the events, refer to this link

You can also make a REST API call to download the events
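As a rough sketch of what that REST call could look like: the endpoint path, host, and header names below are placeholders/assumptions (consult the Activity Tracker API docs linked above for the real values), so this only builds the request rather than sending it:

```python
def build_download_request(host: str, session_id: str, api_token: str):
    """Assemble a hypothetical events-download request for a session.
    The /v1/sessions/<id>/download path and Bearer auth scheme are
    assumptions, not the documented Activity Tracker API."""
    url = f"https://{host}/v1/sessions/{session_id}/download"
    headers = {"Authorization": f"Bearer {api_token}"}
    return url, headers

url, headers = build_download_request("activity-tracker.example.cloud", "abc123", "TOKEN")
print(url)  # https://activity-tracker.example.cloud/v1/sessions/abc123/download
```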

Update: Reading through the documentation in-depth, I found out where the data is stored.

The Activity Tracker service includes two data repositories where your event data is stored:

One repository where event data is available for analysis through Kibana. The standard or lite plan only stores data in this repository. Data is kept for 3 days.

One long-term storage repository that hosts the event data for the premium plan. Event data is stored until you either configure a retention policy or delete them manually. By default, events are kept indefinitely.

The storage is encrypted. Also, to configure the retention policy, read this link

Vidyasagar Machupalli
  • This is paid? Is there a free service? Plus, it is not continuous; I need a permanent connection to push logs outside? – NoviceMe Nov 23 '18 at 12:57
  • Yes, Premium is paid. There is a Lite(free) plan but with limitations. You can find more information about the plans and what they offer here- https://console.bluemix.net/docs/services/cloud-activity-tracker/activity_tracker_ov.html#activity_tracker_ov – Vidyasagar Machupalli Nov 23 '18 at 13:50
  • Is there a way to continuously stream out data from paid plan? – NoviceMe Nov 23 '18 at 17:36
  • Getting the following error when running the above command: x509: certificate signed by unknown authority. – NoviceMe Nov 26 '18 at 15:53
  • @NoviceMe Updated my answer with details on where the Activity Tracker event data is stored. – Vidyasagar Machupalli Nov 27 '18 at 08:25
  • Any command I run gets the x509 error. For example: ibmcloud at session create -s 2018-11-21 -e 2018-11-22 --json – NoviceMe Nov 27 '18 at 14:47
  • I expect you are running this against an Activity Tracker service created under the Premium plan? If so, run this command to see the status of your Activity Tracker (at): `ibmcloud at status` – Vidyasagar Machupalli Nov 27 '18 at 15:03
  • Same thing, maybe proxies, will try later. Also, can I use this in a script somehow to continuously spit data out? – NoviceMe Nov 27 '18 at 15:25
  • Yes, you can write a script or a polling job to periodically pull the events data and store it somewhere like Cloud Object Storage, or as JSON in a Cloudant NoSQL DB. – Vidyasagar Machupalli Nov 28 '18 at 05:45
  • I was able to get the logs out. But the question still remains how can I poll continuously? As currently, it polls only during a particular date. I want continuous polling like every 5 minutes or so for new data coming in? – NoviceMe Nov 28 '18 at 20:01
  • This is what I use for polling - https://console.bluemix.net/docs/openwhisk/openwhisk_feeds.html#openwhisk_feeds. It comes with a cron job which you can configure. As data is persistent for 3 days, you can do a pull every three days or do it every minute. – Vidyasagar Machupalli Nov 29 '18 at 00:04
  • @NoviceMe FYI...IBM Log Analysis with LogDNA is available today on IBM Cloud. This service simplifies log management in the Cloud and provides a setting for long-term archiving to IBM Cloud Object Storage - Read more here -https://www.ibm.com/blogs/bluemix/2018/11/increase-observability-ibm-log-analysis-with-logdna-is-available-today-on-ibm-cloud/ – Vidyasagar Machupalli Nov 30 '18 at 02:46
  • You were saying that you are running a cron job to pull data out of AT. Can you please share the code, or do you have it anywhere on git? – NoviceMe Nov 30 '18 at 03:20
  • @NoviceMe I tried a serverless approach, but as the file is sometimes too large, serverless is not a good option here. So I created a Cloud Object Storage service and a NodeJS app which, when run, generates events and creates an object in a COS bucket. You can find the code here - https://repl.it/@aficionado/ActivityTrackerCloudObjectStorage . Follow the comments in the code – Vidyasagar Machupalli Dec 01 '18 at 12:15
  • Isn't there a simpler way in IBM to send audit logs out in JSON? Like something that pushes out logs as they come rather than pulling? – NoviceMe Dec 02 '18 at 06:34
  • As mentioned above, you can use LogDNA(released recently) with log analysis service where there is an option to easily archive logs to Cloud Object Storage - https://console.bluemix.net/docs/services/Log-Analysis-with-LogDNA/archive.html#archiving – Vidyasagar Machupalli Dec 02 '18 at 06:50
  • This does not make sense at all. The code you are suggesting above is not exporting data continuously; I have to wait for a day to export data. If I use the code you sent above and poll for data every 5 minutes, won't I get duplicate data? – NoviceMe Dec 05 '18 at 00:00
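The duplicate-data concern in the last comment is usually handled by keeping a high-water mark: remember the timestamp of the newest event you have already exported and discard anything at or before it on the next poll. A minimal sketch of that idea, where `fetch_events` is a stub standing in for the real `ibmcloud at` session/download calls (or the REST API) and the `"time"` field name is an assumption:

```python
from datetime import datetime, timezone

def poll_new_events(fetch_events, last_seen: datetime):
    """Return events strictly newer than last_seen, plus the updated
    high-water mark. fetch_events is any callable returning a list of
    dicts with a "time" key (a stub here, not the real Activity
    Tracker client)."""
    fresh = [e for e in fetch_events() if e["time"] > last_seen]
    if fresh:
        last_seen = max(e["time"] for e in fresh)
    return fresh, last_seen

# Illustration with stubbed data:
t0 = datetime(2018, 11, 21, tzinfo=timezone.utc)
events = [{"id": 1, "time": datetime(2018, 11, 21, 1, tzinfo=timezone.utc)},
          {"id": 2, "time": datetime(2018, 11, 21, 2, tzinfo=timezone.utc)}]

fresh, mark = poll_new_events(lambda: events, t0)
print([e["id"] for e in fresh])  # [1, 2]

fresh2, mark = poll_new_events(lambda: events, mark)
print(fresh2)  # [] -- already-seen events are not re-emitted
```

Run something like this from a cron job (or an OpenWhisk alarm trigger, as suggested above) every few minutes, persisting the high-water mark between runs.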