
First time poster, and I'm not really a developer, so perspective is always appreciated :)

Objective:

  • I am attempting to PUT (or PATCH) json.dumps(mergedFile) into Firebase as one payload, without Firebase automatically creating numeric indexes (0, 1, etc.) in front of each object

Problem statement:

  • I am submitting the following json object into the /testObject path:

    [{"test1":"226.69"},{"test2":"7.48"}]

  • In Firebase the data is stored as:

    { "testObject": { "0": { "test1": "226.69" }, "1": { "test2": "7.48" } } }
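For what it's worth, json.dumps itself reproduces the submitted payload exactly; the array shape is present before the request ever goes out (a minimal standalone check, using the sample values from above):

```python
import json

mergedFile = [{"test1": "226.69"}, {"test2": "7.48"}]

# json.dumps faithfully serializes the list as a JSON array;
# it does not add the numeric keys -- the database adds those on storage.
data = json.dumps(mergedFile)
print(data)  # [{"test1": "226.69"}, {"test2": "7.48"}]
```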

Background:

  • The total number of items in the payload I need to store is just over 5,000
  • If I write each object individually via a for loop, the data is stored as expected; however, that issues a new request for every iteration of the loop and has a large overhead compared to dumping one large object in a single request.

Here is my Code:

import json
import requests
import xml.etree.ElementTree as ET

def get_data():
    try:
        print('hamsters are running...')

        # OFFLINE TESTING
        sourceFile = 'response.xml'
        tree = ET.parse(sourceFile)
        root = tree.getroot()

        # Build a list of single-key dicts from the XML attributes
        for symbol in root.iter('symbol'):
            company = symbol.attrib['company']
            location = symbol.attrib['location']
            destinationData = {company: location}
            mergedFile.append(destinationData)
        print('downloading the info was a success! :)')
    except Exception as e:
        print('Attempt to download information did not complete successfully :(')
        print(e)

def patch_data():
    try:
        print('attempting to upload info to database...')
        data = json.dumps(mergedFile)
        print(data)
        try:
            req = requests.put(url, data=data, headers=headers)
            req.raise_for_status()
        except requests.exceptions.HTTPError as e:
            print(e)
            print(req.json())
        print('upload to database complete!')
    except Exception as e:
        print('Attempt to upload information did not complete successfully :(')
        print(e)

if __name__ == "__main__":
    mergedFile = []

    auth = "*****"
    databaseURL = 'https://*****.firebaseio.com'
    headers = {"auth": auth, "print": "pretty"}
    # headers = {"auth": auth, "print": "pretty", "Accept": "text/event-stream"}
    requestPath = '/testObject.json?auth=' + auth
    url = databaseURL + requestPath

    get_data()
    patch_data()

I feel like it's storing an array, but I'm calling data = json.dumps(mergedFile) before the PUT request. Do I have a misunderstanding of how json.dumps works? Based on the output printed before the request, the payload looks right. I'm also using the requests Python module... is it converting the data to an array?

Any insight anyone could provide would be greatly appreciated!

Regards,

James.

JamesM
  • As Frank van Puffelen mentions, this is expected behavior based on the data that I am submitting. I was inadvertently creating an array instead of a dict. Adjusted two lines: mergedFile = [] to mergedFile = {} and then adjusted the updating of the dict from mergedFile.append(destinationData) to mergedFile.update(destinationData) – JamesM Jan 21 '17 at 18:39
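The two-line change described in this comment can be sketched in isolation (the parsed XML attributes are stubbed here with sample values):

```python
import json

# Sample (company, location) pairs standing in for the parsed XML attributes.
symbols = [("test1", "226.69"), ("test2", "7.48")]

mergedFile = {}  # was: mergedFile = []
for company, location in symbols:
    destinationData = {company: location}
    mergedFile.update(destinationData)  # was: mergedFile.append(destinationData)

# A single dict serializes to one JSON object, with no numeric indexes.
print(json.dumps(mergedFile))  # {"test1": "226.69", "test2": "7.48"}
```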

1 Answer


The Firebase Database stores arrays as regular key-value pairs, with the keys being numbers. So what you see is the expected behavior.
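In other words, a JSON array payload ends up stored as an object whose keys are the array indices; conceptually (a sketch of the storage behavior, not actual Firebase code):

```python
import json

payload = json.loads('[{"test1": "226.69"}, {"test2": "7.48"}]')

# Firebase stores an array as regular key-value pairs, keyed by index:
stored = {str(i): item for i, item in enumerate(payload)}
print(json.dumps(stored))  # {"0": {"test1": "226.69"}, "1": {"test2": "7.48"}}
```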

There are many reasons why Firebase recommends against storing arrays in the database. A few can be found in these links:

Frank van Puffelen
  • Thanks @Frank, I appreciate the links, I'll definitely review them. As you mentioned, this was intended behavior based on the data I was submitting... after converting it to a dict it is working as expected. – JamesM Jan 21 '17 at 18:41
  • Indeed! I do not have enough rep for it to show yet, but it did state that it was recorded. Thanks again Frank, appreciate it. – JamesM Jan 22 '17 at 22:21