Questions tagged [google-cloud-tpu]

Google Cloud TPUs (Tensor Processing Units) accelerate machine learning workloads built with TensorFlow. Use this tag for questions about the Google Cloud TPU service. Topics range from the service user experience and issues with TensorFlow trainer programs to project quotas, security, and authentication.

Official website

188 questions
1 vote · 1 answer

Huge size of TF records file to store on Google Cloud

I am trying to modify a TensorFlow project so that it becomes compatible with TPUs. For this, I started with the code explained on this site. There, the COCO dataset is downloaded and its features are first extracted using an InceptionV3 model. I wanted to…
1 vote · 2 answers

How to load data for TPU inference in Colab, without using GCP?

For training models on the Colab TPUs, the data needs to be in GCS buckets. However, for small amounts of data, I am wondering if it's possible to run inference directly from the local Colab environment.
SantoshGupta7
1 vote · 0 answers

Using cloud-tpu fails with google ai-platform train api

I had been successfully using the ai-platform train API with tensor2tensor and the cloud-tpu backend until several days ago, but it seems something has changed and I haven't been able to get it to work since last week. The differences I see in logs between…
1 vote · 1 answer

Where is data cached when using a Cloud TPU?

I have a question regarding using TPUs. When I use .cache() with a dataset, where is the data cached? Is it cached in the RAM of the rented VM instance (e.g. n1-standard-2) or in the memory of the TPU? In other words, if I have a ~30G dataset, do I need…
Tony Chen
1 vote · 1 answer

Cannot create TPU nodes: RESOURCE_EXHAUSTED

I'm using a free trial account to train my deep learning models on TPUs, with my billing account enabled, and I still have more than $100 in promotional credits. Two days ago my preemptible TPU was preempted in the middle of a training…
Sea Otter
1 vote · 1 answer

Does xgboost use the TPU for `gpu-hist` (if TPU is available)?

I am curious whether XGBoost will use a TPU in Google Colab if one is available. It certainly makes very good use of the GPU…
Igor Rivin
1 vote · 1 answer

Google Cloud VM still using CPU instead of TPU to execute Python/TensorFlow script

I have set up a TPU machine on Google Cloud, and I think I have done it properly because when I run ctpu status it returns RUNNING. However, I have a Python script that I am trying to run, and I want it to use the TPU. It is still using the CPU though,…
Tendi
1 vote · 1 answer

Google Colab: Unsupported data type for TPU: double, caused by output cond_8/Merge:0

I'm using Talos and a Google Colab TPU to run hyperparameter tuning of a Keras model. Note that I'm using TensorFlow 1.15.0 and Keras 2.2.4-tf. import os import tensorflow as tf import talos as ta from tensorflow.keras.models import Sequential from…
1 vote · 2 answers

Colab TPU error InvalidArgumentError: Cannot assign a device for operation

In Google Colab, when using the TPU, I get the following error: InvalidArgumentError: Cannot assign a device for operation Adam/iterations/IsInitialized/VarIsInitializedOp: {{node Adam/iterations/IsInitialized/VarIsInitializedOp}} was explicitly…
1 vote · 1 answer

Error while using Tensorflow-Hub and Colab TPU

I am trying to use BERT for text classification with TensorFlow Hub. The code runs fine on a Colab GPU, but when I converted it for the Colab TPU it shows the following 'uninitialized layer' error. Following is the BERT layer: class…
1 vote · 1 answer

AssertionError: batch_size must be divisible by the number of TPU cores in use (1 vs 8) when using the predict function

Some details for context: working on Google Colab using a TPU; the model fits successfully without any issues; running into issues while attempting to use the predict function. Here is the code I'm using to train: tpu_model.fit(x, y, …
madsthaks
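The assertion above arises because the Keras TPU model shards each batch evenly across the 8 TPU cores, so the number of samples passed to predict must be divisible by the core count. A common workaround, sketched here with a hypothetical helper (`pad_batch` is not part of any TPU API), is to pad the input up to the next multiple of 8 and discard the predictions for the padding afterwards:

```python
def pad_batch(samples, num_cores=8):
    """Pad a list of samples so its length is divisible by num_cores.

    Returns the padded list and the original sample count, so the
    predictions for the padding can be trimmed off afterwards.
    """
    remainder = len(samples) % num_cores
    if remainder == 0:
        return samples, len(samples)
    pad = num_cores - remainder
    # Repeat the last sample as filler; its predictions are discarded later.
    return samples + [samples[-1]] * pad, len(samples)

# Example: 13 samples are padded up to 16 (divisible by 8 cores).
padded, n = pad_batch(list(range(13)))
predictions = padded            # stand-in for tpu_model.predict(padded)
predictions = predictions[:n]   # keep only predictions for real samples
```

The same trick works for any sharded predict call; only the trailing slice depends on remembering the original sample count.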
1 vote · 0 answers

Using TPU on Google Colab Error_Cannot find any TPU cores in the system

I am trying to run a BERT model on Google Colab using a TPU. What is TPU_NAME exactly in: !python run_classifier.py \ . . . --use_tpu=True \ --tpu_name=$TPU_NAME I tried to define it as: TPU_NAME = os.environ['COLAB_TPU_ADDR'] But I faced the…
userInThisWorld
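In a Colab TPU runtime, COLAB_TPU_ADDR holds a bare host:port address, while most TF 1.x tooling (including a --tpu_name flag) expects a grpc:// URL. A minimal sketch of the conversion — the sample address below is made up for illustration:

```python
import os

def colab_tpu_name(env=os.environ):
    """Build a grpc:// TPU address from Colab's COLAB_TPU_ADDR variable."""
    addr = env.get('COLAB_TPU_ADDR')
    if addr is None:
        raise RuntimeError('No TPU runtime attached (COLAB_TPU_ADDR unset).')
    return 'grpc://' + addr

# Illustration with a made-up address of the host:port form Colab provides.
print(colab_tpu_name({'COLAB_TPU_ADDR': '10.0.0.2:8470'}))
# grpc://10.0.0.2:8470
```

Failing fast when the variable is unset also catches the common case of the notebook runtime not actually being set to TPU.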
1 vote · 1 answer

Google Colab TPU and reading from disk while training

I have 100k pics, and they don't fit into RAM, so I need to read them from disk while training. dataset = tf.data.Dataset.from_tensor_slices(in_pics) dataset = dataset.map(extract_fn) def extract_fn(x): x = tf.read_file(x) x =…
had
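The general pattern behind the tf.data pipeline in that excerpt — keep only file paths in memory and decode each image on demand — can be sketched with a plain Python generator (an illustrative stand-in, not the tf.data API; `load_image` is a hypothetical decode function):

```python
def image_stream(paths, load_image):
    """Yield decoded images one at a time instead of holding 100k in RAM.

    `paths` is the list of file paths (cheap to keep in memory);
    `load_image` is whatever decode function the pipeline uses
    (tf.io.read_file + image decoding in the tf.data version).
    """
    for path in paths:
        yield load_image(path)

# Demo with a fake loader so the sketch is self-contained.
fake_load = lambda p: f'decoded:{p}'
stream = image_stream(['a.jpg', 'b.jpg'], fake_load)
batch = [next(stream), next(stream)]  # ['decoded:a.jpg', 'decoded:b.jpg']
```

tf.data adds prefetching, parallel decoding, and batching on top of this idea, which is why the map-over-filenames pipeline is the usual answer for datasets that don't fit in RAM.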
1 vote · 0 answers

How to manually use Google TPU with Tensorflow's Object Detection API?

I've successfully trained models using TensorFlow's Object Detection API, running both locally on a GPU (using model_main.py) and on Google's ML Engine (both GPU and TPU). However, I can't seem to use model_tpu_main.py to train a model,…
1 vote · 2 answers

Running cloud TPU profiler in Google Colab environment

I am running a Google Colab notebook and am trying to capture TPU profiling data for use in TensorBoard; however, I can't get capture_tpu_profile to run in the background while running my TensorFlow code. So far I have tried to run the capture process in…
Jann