
I am trying to run a Python script from the Google Cloud Natural Language API Python samples:

https://github.com/GoogleCloudPlatform/python-docs-samples/tree/master/language/cloud-client/v1beta2/snippets.py

I have not made any modifications, so I expected it would just work. Specifically, I want to run entity analysis on a text file/document, and the relevant part of the code is below.

def entities_file(gcs_uri):
    """Detects entities in the file located in Google Cloud Storage."""
    client = language_v1beta2.LanguageServiceClient()

    # Instantiates a plain text document.
    document = types.Document(
        gcs_content_uri=gcs_uri,
        type=enums.Document.Type.PLAIN_TEXT)

    # Detects entities in the document. You can also analyze HTML with:
    #   document.type == enums.Document.Type.HTML
    entities = client.analyze_entities(document).entities

    # entity types from enums.Entity.Type
    entity_type = ('UNKNOWN', 'PERSON', 'LOCATION', 'ORGANIZATION',
                   'EVENT', 'WORK_OF_ART', 'CONSUMER_GOOD', 'OTHER')

    for entity in entities:
        print('=' * 20)
        print(u'{:<16}: {}'.format('name', entity.name))
        print(u'{:<16}: {}'.format('type', entity_type[entity.type]))
        print(u'{:<16}: {}'.format('metadata', entity.metadata))
        print(u'{:<16}: {}'.format('salience', entity.salience))
        print(u'{:<16}: {}'.format('wikipedia_url',
              entity.metadata.get('wikipedia_url', '-')))

I have put my text file (UTF-8 encoded) on Cloud Storage at gs://neotokyo-cloud-bucket/TXT/TTS-01.txt

I am running the script in Google Cloud Shell, and when I run the file:

python snippets.py entities-file gs://neotokyo-cloud-bucket/TXT/TTS-01.txt

I get the following error, which appears to be protobuf related.

[libprotobuf ERROR google/protobuf/wire_format_lite.cc:629] String field 'google.cloud.language.v1beta2.TextSpan.content' contains invalid UTF-8 data when parsing a protocol buffer. Use the 'bytes' type if you intend to send raw bytes.

ERROR:root:Exception deserializing message!
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/grpc/_common.py", line 87, in _transform
    return transformer(message)
DecodeError: Error parsing message
Traceback (most recent call last):
  File "snippets.py", line 336, in <module>
    entities_file(args.gcs_uri)
  File "snippets.py", line 114, in entities_file
    entities = client.analyze_entities(document).entities
  File "/usr/local/lib/python2.7/dist-packages/google/cloud/language_v1beta2/gapic/language_service_client.py", line 226, in analyze_entities
    return self._analyze_entities(request, retry=retry, timeout=timeout)
  File "/usr/local/lib/python2.7/dist-packages/google/api_core/gapic_v1/method.py", line 139, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/google/api_core/retry.py", line 260, in retry_wrapped_func
    on_error=on_error,
  File "/usr/local/lib/python2.7/dist-packages/google/api_core/retry.py", line 177, in retry_target
    return target()
  File "/usr/local/lib/python2.7/dist-packages/google/api_core/timeout.py", line 206, in func_with_timeout
    return func(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/google/api_core/grpc_helpers.py", line 56, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "/usr/local/lib/python2.7/dist-packages/six.py", line 737, in raise_from
    raise value
google.api_core.exceptions.InternalServerError: 500 Exception deserializing response!
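The error suggests some text in the request or response is not valid UTF-8. A quick way to check whether raw bytes decode cleanly, shown here as a sketch (`check_utf8` is a hypothetical helper; to test the actual file you would first copy it locally, e.g. with `gsutil cp`, and read it in binary mode):

```python
def check_utf8(data):
    """Return the byte offset of the first invalid UTF-8 sequence, or None."""
    try:
        data.decode('utf-8')
        return None
    except UnicodeDecodeError as e:
        return e.start

# Well-formed UTF-8 decodes cleanly.
print(check_utf8(u'Tokyo \u00e9'.encode('utf-8')))  # None
# A stray 0xFF byte is never valid UTF-8.
print(check_utf8(b'Tokyo \xff'))  # 6
```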

I do not know protobuf, so any help is appreciated!

  • Do you get the same error if you run it in Python 3? – Luke Sneeringer May 29 '18 at 16:21
  • Hi, the problem is, quote "the Google Cloud SDK requires Python 2.7.9 or later and does not currently work on Python 3". So I have to assume it will not work. I have posted my question as an issue on github repo too. https://github.com/GoogleCloudPlatform/google-cloud-python/issues/5411 – complexitocous May 30 '18 at 17:40

2 Answers


Where is your text file from?

Python's ParseFromString/SerializeToString operate on bytes. Try converting your text file to bytes before parsing.
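For example, one way to coerce file content into clean UTF-8 bytes before sending it, as a sketch (`to_clean_utf8` is a hypothetical helper; `errors='replace'` substitutes U+FFFD for any invalid sequences, so it loses the original bad bytes):

```python
def to_clean_utf8(raw):
    """Decode as UTF-8, replacing invalid sequences, then re-encode."""
    return raw.decode('utf-8', errors='replace').encode('utf-8')

# The invalid 0xFF byte becomes the U+FFFD replacement character.
print(to_clean_utf8(b'Tokyo \xff!'))  # b'Tokyo \xef\xbf\xbd!'
```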

Jie Luo
  • I have tried this: ```entities = client.analyze_entities(document, encoding_type='bytes').entities``` but it says ```ValueError: unknown enum label``` Or are you referring to something else, on the protobuf code side? – complexitocous May 31 '18 at 14:41
  • Yes, I am talking about protobuf python's wire format (I am in protobuf python team). Looks like the problem is more specific with language/cloud-client/v1beta2/snippets.py – Jie Luo May 31 '18 at 19:07

It looks like your file starts with a byte order mark (utf-8-sig). Try converting your content to standard UTF-8 before calling the client.
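A minimal sketch of that conversion (`strip_bom` is a hypothetical helper): decoding with the `utf-8-sig` codec consumes a leading BOM if present, so a round trip yields plain UTF-8 and leaves BOM-free input unchanged.

```python
import codecs

def strip_bom(raw):
    """Decode with 'utf-8-sig', which consumes a leading BOM, then re-encode."""
    return raw.decode('utf-8-sig').encode('utf-8')

data = codecs.BOM_UTF8 + b'Hello'  # BOM_UTF8 is b'\xef\xbb\xbf'
print(strip_bom(data))      # b'Hello'
print(strip_bom(b'Hello'))  # unchanged: b'Hello'
```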

mcasbon