
I'm trying to use TPUs on Google Cloud and I'm trying to figure out how to specify the right TPU to use. I'm trying to follow the quickstart:

https://cloud.google.com/tpu/docs/quickstart

But it doesn't say how to select a TPU type; it only gives instructions to select a zone.

$ ctpu up --zone=us-central1-b  \
--tf-version=2.1 \
--name=tpu-quickstart

I am wondering how to select a v2-32. At first I figured I should just specify us-central1-a, but I noticed that a single zone can hold more than one TPU type here:

https://cloud.google.com/tpu/docs/types-zones

For example, us-central1-a has both v2-128 and v2-32, so the zone alone can't determine the TPU type. I'm sort of afraid of accidentally spinning up a paid TPU.

SantoshGupta7

2 Answers


You can select the TPU type by using the --tpu-size parameter, as per the documentation (also here).

For example:

ctpu up --zone=us-central1-a  \
--tf-version=2.1 \
--name=tpu-quickstart \
--tpu-size=v2-32

Remember that only v2-8 and v3-8 are available unless you have access to evaluation quota or have purchased a commitment.
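If you want to confirm what a zone actually offers before creating anything, the gcloud CLI can list the available accelerator types and TensorFlow versions (output depends on your project and quota):

```shell
# List the TPU accelerator types available in a zone
# (requires the gcloud SDK and an authenticated project).
gcloud compute tpus accelerator-types list --zone=us-central1-a

# List the TensorFlow versions supported by the TPU service in that zone.
gcloud compute tpus versions list --zone=us-central1-a
```

That way you can verify a v2-32 is actually offered in the zone before passing it to --tpu-size.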

robsiemb
  • I have been given some TPU credits for TFRC. Do you know if there something I need to do to use those credits specifically? – SantoshGupta7 Jul 18 '20 at 16:49
  • Sorry, I don't know anything about the details of [that program](https://www.tensorflow.org/tfrc). – robsiemb Jul 18 '20 at 17:30

You can also use the gcloud command to create TPUs:

gcloud compute tpus create tpu-quickstart \
      --zone=us-central1-a \
      --network=default \
      --accelerator-type=v2-32 \
      --version=2.1
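Since the question mentions worrying about cost: whichever tool you use to create the TPU, remember to delete it when you're done so it stops incurring charges. For example:

```shell
# Delete the TPU node created above (name and zone must match).
gcloud compute tpus delete tpu-quickstart --zone=us-central1-a
```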

Pradeep Bhadani