So I have these two scripts:
redditScraper.py
# libraries
import urllib2
import json

# fetch the "new" listing as JSON (reddit asks for a descriptive
# User-Agent; the default urllib2 one is often rate-limited)
url = 'http://www.reddit.com/new.json?sort=new'
request = urllib2.Request(url, headers={'User-Agent': 'redditScraper by /u/yourusername'})
response = urllib2.urlopen(request)

# interpret as json
data = json.load(response)
response.close()

# print fields of one post in the listing
print data['data']['children'][3]['data']['title']
print data['data']['children'][3]['data']['permalink']
print data['data']['children'][3]['data']['subreddit']
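In case it matters, the same indexing can be applied to every post in the listing instead of just `[3]`. A minimal offline sketch, using a hand-made sample dict in the shape reddit returns (the sample data is made up, not a real response):

```python
import json

# Hand-made sample matching the shape of reddit's listing JSON
# (stands in for the live fetch above -- hypothetical data)
sample = json.loads('{"data": {"children": ['
                    '{"data": {"title": "First post", "permalink": "/r/a/1/", "subreddit": "a"}},'
                    '{"data": {"title": "Second post", "permalink": "/r/b/2/", "subreddit": "b"}}'
                    ']}}')

# walk every post rather than a single fixed index
titles = []
for child in sample['data']['children']:
    post = child['data']
    titles.append(post['title'])
    print(post['title'] + ' (r/' + post['subreddit'] + ')')
```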
and minerTweets.py
#!/usr/bin/env python
import sys
from twython import Twython

# Twitter app credentials (placeholders here)
CONSUMER_KEY = 'XXXXXXXXXXXXXXXX'
CONSUMER_SECRET = 'XXXXXXXXXXXXXXXX'
ACCESS_KEY = 'XXXXXXXXXXXXXXXX'
ACCESS_SECRET = 'XXXXXXXXXXXXXXXX'

# tweet the first command-line argument
api = Twython(CONSUMER_KEY, CONSUMER_SECRET, ACCESS_KEY, ACCESS_SECRET)
api.update_status(status=sys.argv[1])
This is for a Raspberry Pi that will update a Twitter account (it's for academic purposes). Being new to Python, I tackled the script I'm trying to write one part at a time. I have one script that successfully scrapes the title, link, and subreddit of reddit's "new" page and prints them, and another that successfully hits the Twython API to update a status, taking its text from sys.argv for now while testing. What I want the finished script to do is take the data printed by redditScraper.py and use it to update a Twitter account's status via minerTweets.py. I've looked all over, but since I'm just learning Python, I don't know the best way to accomplish this.
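To make the goal concrete, here is a rough offline sketch of the glue I'm imagining. The `build_tweet` helper and the sample JSON are made up for illustration; the real script would fetch the live listing as in redditScraper.py and pass the result to `api.update_status`:

```python
import json

# Hand-made sample in the shape of reddit's listing JSON
# (stands in for the live urllib2 fetch -- hypothetical data)
sample = json.loads('{"data": {"children": [{"data": {'
                    '"title": "Example post", '
                    '"permalink": "/r/test/comments/abc/example_post/", '
                    '"subreddit": "test"}}]}}')

def build_tweet(post, limit=140):
    # Combine title, subreddit, and link, trimming the title so the
    # whole status fits within Twitter's character limit.
    link = 'http://www.reddit.com' + post['permalink']
    tail = ' [r/' + post['subreddit'] + '] ' + link
    title = post['title'][:max(0, limit - len(tail))]
    return title + tail

post = sample['data']['children'][0]['data']
status = build_tweet(post)
print(status)
# in the combined script this would become: api.update_status(status=status)
```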
I appreciate any advice in advance. Thank you!