
I'm trying to store an embedded entity with a field that is larger than 1,500 bytes. Based on the documentation, the recommendation is to exclude that field from indexes, after which the value can be up to 1 MB.

I added that field to the exclude_from_indexes key of the embedded entity. When I try to save the parent entity, it tells me the field is bigger than 1,500 bytes. If I save the embedded entity on its own, it works.

Is exclude_from_indexes ignored on embedded entities?

from gcloud import datastore

client = datastore.Client(dataset_id=projectID)

# Parent entity
record_key = client.key('Record', my_id)
record_entity = datastore.Entity(key=record_key)

# Embedded entity, with the oversized field excluded from indexes
embedded_key = client.key('Data', another_id)
embedded_entity = datastore.Entity(key=embedded_key,
                                   exclude_from_indexes=('big_field',))
embedded_entity['field1'] = '1234'
embedded_entity['big_field'] = 'large string bigger than 1500 bytes'

record_entity['RandomFieldName'] = embedded_entity

client.put(record_entity)
# Error: gcloud.exceptions.BadRequest: 400 The value of property
# "big_field" is longer than 1500 bytes.

client.put(embedded_entity)
# No error
igama
  • Embedded Entities are tricky! Ancestor keys might be a much better way to do this, but there are trade-offs. This might be a bug in the library or not supported by Datastore. – Sandeep Dinesh Oct 28 '15 at 17:00

1 Answer


This should be a valid call. You can try it out yourself by using the API explorer for Datastore.

It's likely a bug inside gcloud-python that doesn't properly pass along the indexing information when you use an embedded entity.
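A quick way to see whether that is what's happening, without going through the live API, is to inspect the protobuf the client builds for the parent entity. The sketch below assumes the current google-cloud-datastore package, whose helpers module exposes entity_to_protobuf; the 2015-era gcloud-python module layout is different, so treat it as an illustration rather than a drop-in check:

# Sketch: inspect what the client would send, without needing credentials.
# Assumes the current google-cloud-datastore package layout.
from google.cloud import datastore
from google.cloud.datastore import helpers

embedded = datastore.Entity(
    key=datastore.Key('Data', 1, project='demo-project'),
    exclude_from_indexes=('big_field',))
embedded['big_field'] = 'x' * 2000

record = datastore.Entity(key=datastore.Key('Record', 1, project='demo-project'))
record['RandomFieldName'] = embedded

pb = helpers.entity_to_protobuf(record)
nested = pb.properties['RandomFieldName'].entity_value
# If this prints False, the nested exclusion was dropped on the way to the
# wire format, which would explain the 400 from the server.
print(nested.properties['big_field'].exclude_from_indexes)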

I see you also filed a bug on the gcloud-python GitHub; I would recommend following along there.
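In the meantime, two directions are worth trying; both are sketches based on assumptions rather than anything confirmed in the docs. The simpler one is to also pass exclude_from_indexes=('RandomFieldName',) when building the parent entity, so the embedded value as a whole is kept out of the indexes. The other, suggested in the comments above, is to drop the embedding and store the large data as a separate child entity under the Record key, so exclude_from_indexes applies to the entity actually being put (the case that already works for you). A sketch of the ancestor-key variant, reusing the placeholder names from the question:

from gcloud import datastore

client = datastore.Client(dataset_id=projectID)

record_key = client.key('Record', my_id)
record_entity = datastore.Entity(key=record_key)

# Child entity whose key has the Record as its ancestor, instead of an
# embedded entity; exclude_from_indexes now applies to the entity being put.
data_key = client.key('Data', another_id, parent=record_key)
data_entity = datastore.Entity(key=data_key,
                               exclude_from_indexes=('big_field',))
data_entity['field1'] = '1234'
data_entity['big_field'] = 'large string bigger than 1500 bytes'

client.put_multi([record_entity, data_entity])

# Later, fetch the data for a record with an ancestor query.
query = client.query(kind='Data', ancestor=record_key)
results = list(query.fetch())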

Patrick Costello
  • Thanks, will continue the conversation via Github. – igama Oct 28 '15 at 18:23
  • https://github.com/googleapis/google-cloud-python/issues/5111#issuecomment-553331578 This is the answer, in case someone wants to jump straight to the GitHub issue. – Vipluv Jun 02 '22 at 09:13