First-time poster, and I'm not really a developer, so perspective is always appreciated :)
Objective:
- I am attempting to PUT (or PATCH) json.dumps(mergedFile) into Firebase as one payload, without Firebase auto-creating numeric indexes (0, 1, etc.) in front of each object
Problem statement:
I am submitting the following JSON to the /testObject path:
[{"test1":"226.69"},{"test2":"7.48"}]
In firebase the response is stored as:
{ "testObject": { "0": { "test1": "226.69" }, "1": { "test2": "7.48" } } }
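As I understand it, a JSON array carries implicit integer indexes, so when Firebase receives one it materializes those indexes as keys. A minimal sketch of the difference, using the same two test pairs: merging the single-pair dicts into one dict produces a JSON object with named keys instead of an array.

```python
import json

# A list of single-pair dicts serializes as a JSON array, which
# Firebase stores under integer keys ("0", "1", ...).
as_list = [{"test1": "226.69"}, {"test2": "7.48"}]
print(json.dumps(as_list))  # [{"test1": "226.69"}, {"test2": "7.48"}]

# Merging the pairs into one dict yields a JSON object instead,
# so each entry becomes a named child rather than an index.
as_dict = {}
for item in as_list:
    as_dict.update(item)
print(json.dumps(as_dict))  # {"test1": "226.69", "test2": "7.48"}
```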
Background:
- The payload I need to store contains just over 5,000 items
- If I write each object individually in a for loop, the data is stored as expected; however, this issues a new request for each iteration of the loop and has a large overhead compared to sending one large object in a single request.
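If the payload is merged into a single dict, those per-item requests could be collapsed into one PATCH. A sketch under stated assumptions (the URL and auth token are placeholders, the data is synthetic, and the request is only prepared here, never sent):

```python
import json
import requests

# Synthetic stand-in for the ~5000 company/location pairs, merged
# into ONE dict so no numeric indexes are created server-side.
payload = {"company%04d" % i: "location%04d" % i for i in range(5000)}

# Prepare (but do not send) a single PATCH carrying the whole payload;
# the URL and auth token below are placeholders.
req = requests.Request(
    "PATCH",
    "https://example-project.firebaseio.com/testObject.json",
    params={"auth": "*****"},
    data=json.dumps(payload),
).prepare()

print(req.method)   # PATCH
print(len(payload)) # 5000 items carried in one request
```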
Here is my Code:
    import json
    import requests
    import xml.etree.ElementTree as ET

    def get_data():
        try:
            print('hamsters are running...')
            # OFFLINE TESTING
            sourceFile = 'response.xml'
            tree = ET.parse(sourceFile)
            root = tree.getroot()
            for symbol in root.iter('symbol'):
                company = symbol.attrib['company']
                location = symbol.attrib['location']
                destinationData = {company: location}
                mergedFile.append(destinationData)
            print('downloading the info was a success! :)')
        except Exception as e:
            print('Attempt to download information did not complete successfully :(')
            print(e)

    def patch_data():
        try:
            print('attempting to upload info to database...')
            data = json.dumps(mergedFile)
            print(data)
            try:
                req = requests.put(url, data=data, headers=headers)
                req.raise_for_status()
            except requests.exceptions.HTTPError as e:
                print(e)
                print(req.json())
            print('upload to database complete!')
        except Exception as e:
            print('Attempt to upload information did not complete successfully :(')
            print(e)

    if __name__ == "__main__":
        mergedFile = []
        auth = "*****"
        databaseURL = 'https://*****.firebaseio.com'
        headers = {"auth": auth, "print": "pretty"}
        # headers = {"auth": auth, "print": "pretty", "Accept": "text/event-stream"}
        requestPath = '/testObject.json?auth=' + auth
        url = databaseURL + requestPath
        get_data()
        patch_data()
It seems to be storing an array, but I'm calling data = json.dumps(mergedFile) before the PUT request. Do I misunderstand how json.dumps works? The output printed just before the request looks correct to me. I'm also using the Python requests module... could that be converting the data to an array?
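For what it's worth, requests does not appear to transform a string passed via data=; the prepared body is byte-for-byte the string handed in. A quick check (the URL is a placeholder, and the request is prepared but never sent):

```python
import json
import requests

payload = json.dumps([{"test1": "226.69"}, {"test2": "7.48"}])

# Prepare the PUT without sending it; the URL is a placeholder.
req = requests.Request("PUT", "https://example.invalid/testObject.json",
                       data=payload).prepare()

# The body is exactly the dumped string -- requests did not wrap
# or re-encode it, so any array indexing happens server-side.
print(req.body == payload)  # True
```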
Any insight anyone could provide would be greatly appreciated!
Regards,
James.