Colab offers free TPUs. It's easy to see how many cores you're given, but is it possible to see how much memory there is per core?
1 Answer
As far as I know, there isn't a TensorFlow op or similar for accessing memory info, though XRT exposes it. In the meantime, would something like the following snippet work?
import os
from tensorflow.python.profiler import profiler_client

# COLAB_TPU_ADDR points at the TPU's gRPC port (8470); the profiler
# service listens on port 8466 on the same host.
tpu_profile_service_address = os.environ['COLAB_TPU_ADDR'].replace('8470', '8466')

# Sample the TPU for 100 ms at monitoring level 2 (the most verbose).
print(profiler_client.monitor(tpu_profile_service_address, 100, 2))
Output looks like:
Timestamp: 22:23:03
TPU type: TPU v2
Utilization of TPU Matrix Units (higher is better): 0.000%
TPU v2 has 8 GB of HBM per core and TPU v3 has 16 GB per core (https://cloud.google.com/tpu).
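If you also want the core count from TensorFlow itself (see the comments below), a minimal sketch along these lines should work on a Colab TPU runtime; it uses the standard TPUClusterResolver and list_logical_devices APIs, though the exact initialization calls vary across TF versions:

import os
import tensorflow as tf

# Connect to the Colab TPU using the same COLAB_TPU_ADDR env var as above.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
    tpu='grpc://' + os.environ['COLAB_TPU_ADDR'])
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Each logical TPU device corresponds to one core.
tpu_devices = tf.config.list_logical_devices('TPU')
print('TPU core count:', len(tpu_devices))                # e.g. 8 on a v2-8
print('Approx. total HBM:', len(tpu_devices) * 8, 'GB')   # 8 GB/core on TPU v2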

jysohn
How do you check the number of available TPU cores? – HappyFace Nov 15 '21 at 11:06
If you're using JAX, then you can use `jax.devices()` to get the number of TPU cores (or devices, more generally). – joe Feb 11 '22 at 10:02
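To expand on that comment, a minimal sketch using the standard jax.devices() and jax.device_count() API:

import jax

# Each entry in jax.devices() is one TPU core (device).
for d in jax.devices():
    print(d.id, d.platform, d.device_kind)

print('TPU core count:', jax.device_count())  # e.g. 8 on a v2-8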