I cannot connect to Datalab in any of my GCP projects.
I tried connecting in several ways (Cloud Shell, my local environment, and even another PC), but the same issue occurred every time.
The connect command itself succeeds, but when I open the URL in a browser, the Datalab home page is not displayed; instead a page like the one below appears.
I SSHed into the Datalab VM to check whether the Datalab Docker container was running correctly, and it appears the container failed to start.
$ docker ps -a
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
bee455ac9f6b gcr.io/cloud-datalab/datalab@sha256:913de629df5d90e7a2f85d9b9ec2986adb0f59d8a9e204360d84e770f1147ec8 "/datalab/run.sh" 27 seconds ago Exited (127) 25 seconds ago k8s_datalab_datalab-server-ryutah_default_3335ede982c9f3d4bef6a7f8dcf3948a_0
63a79825ae9e gcr.io/google_containers/fluentd-gcp@sha256:93ae1e71d71c4209947fcc20da66cba7c788ed6fa0b3ebf2b7ca0a32eca39ed4 "/bin/sh -c '/usr/..." 2 minutes ago Up 2 minutes k8s_logger_datalab-server-ryutah_default_3335ede982c9f3d4bef6a7f8dcf3948a_9
26da06887f72 gcr.io/google_containers/pause-amd64:3.0 "/pause" 2 minutes ago Up 2 minutes 127.0.0.1:8080->8080/tcp k8s_POD_datalab-server-ryutah_default_3335ede982c9f3d4bef6a7f8dcf3948a_9
The failed container produced the following log output:
$ docker logs bee455ac9f6b
Verifying that the /tmp directory is writable
The /tmp directory is writable
/ /
From https://github.com/googledatalab/notebooks
* branch master -> FETCH_HEAD
HEAD is now at 887cb95 Add a text classification sample with both DNN and LSTM for TensorFlow. (#130)
/
Already on 'master'
Your branch is up-to-date with 'origin/master'.
/root/startup.sh: line 1: !pip3: command not found
Is this the reason I cannot connect to Datalab?
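As far as I understand, `!pip3 ...` is IPython/Jupyter cell syntax (the leading `!` tells the notebook to run the rest of the line in a shell), not valid bash, so a startup script containing that line would fail exactly as in the log above. A minimal sketch reproducing the error (the `/tmp/startup.sh` path here is only for illustration, not the actual `/root/startup.sh` on the VM):

```shell
# In a Jupyter/Datalab notebook cell, a leading "!" runs the rest of the
# line in a shell, so "!pip3 install ..." works there. In a plain bash
# script there is no command named "!pip3", so bash fails with
# "command not found" (exit status 127) -- the same error as in the log.
cat > /tmp/startup.sh <<'EOF'
!pip3 install pandas
EOF

rc=0
# Prints "/tmp/startup.sh: line 1: !pip3: command not found" to stderr.
bash /tmp/startup.sh 2>/tmp/startup_err.txt || rc=$?
echo "exit status: $rc"   # -> exit status: 127

# Dropping the "!" turns the line into an ordinary shell command that a
# startup script could actually run:
sed 's/^!//' /tmp/startup.sh > /tmp/startup_fixed.sh
cat /tmp/startup_fixed.sh   # -> pip3 install pandas
```

So if that `!pip3` line came from a custom startup script, removing the `!` prefix (or running the installs inside a notebook instead) should let the script get past line 1.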
The VM has the following Docker images:
REPOSITORY TAG IMAGE ID CREATED SIZE
gcr.io/cloud-datalab/datalab latest 43e710d36020 6 weeks ago 2.82 GB
gcr.io/google_containers/pause-amd64 3.0 99e59f495ffa 17 months ago 747 kB
gcr.io/google_containers/fluentd-gcp 1.18 e0be8d052951 19 months ago 411 MB
Does anyone know how to solve this issue? Any help would be appreciated.