One idea would be the following:
Push the events from Event Grid to an Event Hub, and use Stream Analytics to store the data both in Cosmos DB (which you already have) and in Blob Storage with the path pattern {date}/{time}. The time granularity can be set to hourly.
This way your logs end up in files partitioned per date and per hour.
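Once the blobs land in that layout, picking up a single hour is just a prefix query. A minimal sketch, assuming a container named `logs`, a connection string in an app setting, and the default `{date}/{time}` formats (all of these are placeholders):

```python
# Sketch only: enumerate the blobs Stream Analytics wrote for one hour.
# Container name, connection-string setting and path layout are assumptions.
import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(os.environ["LOGS_STORAGE_CONNECTION"])
container = service.get_container_client("logs")

# With a {date}/{time} path pattern and hourly granularity, one hour of logs
# lives under a prefix such as "2024/05/01/09" (the exact shape depends on the
# date/time formats chosen in the blob output configuration).
for blob in container.list_blobs(name_starts_with="2024/05/01/09"):
    print(blob.name, blob.size)
```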
Now you could have an Azure Function with a blob trigger that reads each new file, counts the objects in the array, and writes the count to the SQL DB (a sketch of such a function follows). This way the count is event-driven, and the database stays more or less accurate at all times.
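A minimal sketch of such a function using the Python v2 programming model; the container name, connection settings, target table, and the assumption that each blob holds a JSON array are all placeholders to adapt:

```python
# Hypothetical blob-triggered Azure Function (Python v2 model).
# Container name, connection settings and the target table are assumptions.
import json
import logging
import os

import azure.functions as func
import pyodbc  # assumes the ODBC driver/package is available in the Function app

app = func.FunctionApp()

@app.blob_trigger(arg_name="logblob",
                  path="logs/{name}",                 # {name} includes the date/time virtual folders
                  connection="LogsStorageConnection")
def count_log_objects(logblob: func.InputStream) -> None:
    # Each hourly blob is assumed to contain a JSON array of log objects.
    events = json.loads(logblob.read())
    count = len(events)
    logging.info("Blob %s contains %d events", logblob.name, count)

    # Write the count for this blob (i.e. this date/hour) to the SQL DB.
    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        conn.execute(
            "INSERT INTO dbo.HourlyLogCounts (BlobName, EventCount) VALUES (?, ?)",
            logblob.name, count)
        conn.commit()
```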
An alternative is to skip the SQL DB entirely and do the count in Power BI, which also has a connector for Blob Storage.
Another idea would be to do the aggregation in the Azure Stream Analytics job itself with time-window functions (e.g. an hourly tumbling window). The tricky part there is partitioning, and getting insert-or-update to work properly against the SQL DB, since the Stream Analytics SQL output only appends rows. To judge that option we would need to know some implementation specifics.
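If you go that route, one way to handle the insert-or-update side is to route the windowed counts through a small function or script that issues a MERGE against the target table instead of letting Stream Analytics write to SQL directly. A hedged sketch, with hypothetical table and column names:

```python
# Hedged sketch of an insert-or-update for an hourly count, e.g. called from a
# function sitting between the Stream Analytics output and the SQL DB.
# Table/column names and the connection string are assumptions.
import os

import pyodbc

MERGE_SQL = """
MERGE dbo.HourlyLogCounts AS target
USING (SELECT ? AS WindowEnd, ? AS EventCount) AS source
    ON target.WindowEnd = source.WindowEnd
WHEN MATCHED THEN
    UPDATE SET target.EventCount = source.EventCount
WHEN NOT MATCHED THEN
    INSERT (WindowEnd, EventCount) VALUES (source.WindowEnd, source.EventCount);
"""

def upsert_hourly_count(window_end: str, count: int) -> None:
    # Idempotent write: re-running the same window just overwrites the count.
    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        conn.execute(MERGE_SQL, window_end, count)
        conn.commit()
```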