Questions tagged [scrape]

DO NOT USE THIS TAG. It is under an active cleanup: https://meta.stackoverflow.com/q/305314. Use [web-scraping] if your question is about scraping information from web resources (there is also [screen-scraping]), use [pdf-scraping] if your question is about scraping information from PDF files, and use [data-extraction] if you need to extract data from other resources.

1204 questions
-1
votes
3 answers

scraping dynamic updates of temperature sensor data from a website

I wrote the following Python code: from bs4 import BeautifulSoup import urllib2 url= 'http://www.example.com' page = urllib2.urlopen(url) soup = BeautifulSoup(page.read(),"html.parser") freq=soup.find('div', attrs={'id':'frequenz'}) print freq The…
Chris Weber
  • 1
  • 1
  • 1
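A minimal Python 3 sketch of what the question's snippet is doing (the original uses Python 2's urllib2 and print statement); the placeholder URL and the 'frequenz' id come from the excerpt, and a value that JavaScript injects after page load will not appear in the static HTML fetched this way.

from urllib.request import urlopen
from bs4 import BeautifulSoup

url = 'http://www.example.com'            # placeholder URL from the excerpt
page = urlopen(url)
soup = BeautifulSoup(page.read(), "html.parser")

# Element id taken from the excerpt; dynamic sensor values may need a
# browser-driven tool instead of a plain HTTP fetch.
freq = soup.find('div', attrs={'id': 'frequenz'})
print(freq.get_text(strip=True) if freq else "element not found")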
-1
votes
1 answer

Scrape "paginasamarillas.es" using Scrapy

Hi, I am using Scrapy to scrape paginasamarillas.es but I don't get any results; this is my code. Please can you help me with this? from scrapy.item import Item, Field class AyellItem(Item): name = Field() pass This is the spider: from scrapy.selector…
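A minimal Scrapy sketch along the lines the excerpt suggests; the spider name, start URL, and CSS selectors below are illustrative assumptions, not taken from paginasamarillas.es.

import scrapy

class AyellItem(scrapy.Item):
    name = scrapy.Field()

class AyellSpider(scrapy.Spider):
    name = "ayell"
    start_urls = ["https://www.paginasamarillas.es/"]   # assumed start page

    def parse(self, response):
        # Selectors are hypothetical; they need to match the real page markup.
        for listing in response.css("div.listado-item"):
            item = AyellItem()
            item["name"] = listing.css("h2 a::text").get()
            yield item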
-1
votes
1 answer

How to use PHP to echo entire word with occurrence of character sequence?

I have a PHP scraper that scrapes URLs and echoes out the material inside a given div. I want to modify that scraper to check the HTML on the page for the occurrence of a string, and then echo out the entire word the string occurs in. My Current…
Tim
  • 63
  • 1
  • 1
  • 10
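The question itself is about PHP, but the underlying idea (match the whole word surrounding a substring with a regex, which in PHP would go through preg_match_all) is language-agnostic. A minimal Python sketch of that pattern; the needle string and sample text are made up for illustration.

import re

def words_containing(needle, text):
    # \S*needle\S* grabs the whole whitespace-delimited "word" around each match.
    pattern = re.compile(r"\S*" + re.escape(needle) + r"\S*")
    return pattern.findall(text)

# Illustrative usage with made-up data.
html_text = "prefix-keyword-suffix and another keywording example"
print(words_containing("keyword", html_text))
# ['prefix-keyword-suffix', 'keywording']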
-1
votes
1 answer

WordPress Post Scraper in Visual Basic

I need the following: I need to scrape posts from my WordPress blog and show them in a VB application. But the news must be fresh: when the user clicks "Refresh" it should get new content from the website. So, posts from the first page and a link to each. Any way to do this? Is…
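The question asks for Visual Basic, but the general approach (re-fetch the blog's posts on every "Refresh" click) looks the same in any language. A minimal Python sketch using the standard WordPress REST API route that newer WordPress versions expose; the blog URL is a placeholder.

import requests

def fetch_latest_posts(blog_url, count=10):
    # WordPress 4.7+ serves posts as JSON at /wp-json/wp/v2/posts.
    resp = requests.get(f"{blog_url}/wp-json/wp/v2/posts",
                        params={"per_page": count}, timeout=10)
    resp.raise_for_status()
    return [(p["title"]["rendered"], p["link"]) for p in resp.json()]

# Called again on every "Refresh" so the content is always fresh.
for title, link in fetch_latest_posts("https://example.com"):  # placeholder URL
    print(title, "->", link)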
-1
votes
1 answer

R Shiny error for data scrape web app "number of items to replace is not a multiple of replacement length"

I am attempting to make a Shiny web app using a function that I found to scrape data off a NOAA website. So I did not create the function, but I have used it and it works well. I want to create the app for others at work who are not…
Jeff Tilton
  • 1,256
  • 1
  • 14
  • 28
-1
votes
2 answers

autostart node.js script at a given time

I am using a node.js/express.js script to scrape data from a website. The data I need are generated on a daily basis, so I need my script to launch automatically every day at a given hour. Is there a way to do that?
Mahmoud
  • 31
  • 3
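The question is about Node, where the usual answers are a system cron entry or an in-process scheduler such as node-cron. As a language-neutral illustration of the "run once a day at a fixed hour" idea, here is a minimal Python sketch; the hour and the job body are placeholders.

import time
from datetime import datetime, timedelta

RUN_AT_HOUR = 6  # placeholder: run every day at 06:00

def seconds_until_next_run(hour):
    now = datetime.now()
    next_run = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if next_run <= now:
        next_run += timedelta(days=1)
    return (next_run - now).total_seconds()

while True:
    time.sleep(seconds_until_next_run(RUN_AT_HOUR))
    print("running the daily scrape job...")  # placeholder for the real scrape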
-1
votes
1 answer

Did PHP file_get_contents just erase my database?

Yesterday, I was searching for techniques to push and pull information into an app that I'm playing with. In my research, I happened to come across the following PHP script:
cpardon
  • 487
  • 4
  • 24
-1
votes
2 answers

PHP: simple_html_dom - Find text inside a span with class

Given HTML like this: (...) <span class="LevelNum">23</span> (...) How do I get the 23 as a PHP var using simple_html_dom, given the wanted part is inside the span of class LevelNum? Thanks
bockzior
  • 199
  • 1
  • 6
  • 20
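The question is about PHP's simple_html_dom, where the lookup would be along the lines of $html->find('span.LevelNum', 0)->plaintext. The equivalent in Python with BeautifulSoup, assuming the markup really contains <span class="LevelNum">23</span> as the excerpt suggests:

from bs4 import BeautifulSoup

# Sample markup reconstructed from the excerpt.
html = '<div><span class="LevelNum">23</span></div>'
soup = BeautifulSoup(html, "html.parser")

level = soup.select_one("span.LevelNum")   # CSS selector: span with class LevelNum
value = int(level.get_text(strip=True))    # -> 23
print(value)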
-1
votes
4 answers

How do I make the result of this a variable?

Right now it's set up to write to a file, but I want it to output the value to a variable; not sure how. from BeautifulSoup import BeautifulSoup import sys, re, urllib2 import codecs woof1 = urllib2.urlopen('someurl').read() woof_1 =…
Pevo
  • 55
  • 2
  • 6
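A minimal Python 3 sketch of the "keep it in a variable instead of writing it to a file" part; the URL and the element being extracted are placeholders, since the excerpt is cut off before the interesting bit.

from urllib.request import urlopen
from bs4 import BeautifulSoup

html = urlopen('http://example.com/someurl').read()   # placeholder URL
soup = BeautifulSoup(html, "html.parser")

# Instead of file.write(...), bind the extracted text to a name:
result = soup.title.get_text(strip=True) if soup.title else ""
print(result)  # the value is now available as an ordinary Python variable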
-1
votes
1 answer

Access content within cross domain iframe

Is there ANY way to access IFRAME content across different domains? I am trying to load external pages into an iframe and scrape their info, but I get security errors in Chrome.
David
  • 1,051
  • 5
  • 14
  • 28
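Browsers block JavaScript access to the DOM of a cross-origin iframe (the same-origin policy), so for pages you don't control there is no client-side route; the usual workaround is to fetch the external page server-side and scrape it there. A minimal sketch of that server-side fetch in Python; the target URL is a placeholder.

import requests
from bs4 import BeautifulSoup

# Fetch the external page on the server, where the same-origin policy does not apply.
resp = requests.get("http://example.com/external-page", timeout=10)  # placeholder URL
soup = BeautifulSoup(resp.text, "html.parser")
print(soup.title.get_text(strip=True) if soup.title else "no title")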
-1
votes
1 answer

BeautifulSoup to scrape street address

I am using the code at the far bottom to get the weblink and the Masjid name. However, I would like to also get the denomination and street address. Please help, I am stuck. Currently I am getting the following weblink:
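Without the page's markup there is only a guess to go on; a minimal BeautifulSoup sketch assuming each listing is a block whose fields (name, denomination, address) sit in elements with the class names shown below. Those class names and the sample markup are placeholders, not taken from the question.

from bs4 import BeautifulSoup

# Illustrative markup; the real class names depend on the page being scraped.
html = """
<div class="masjid">
  <a class="name" href="/masjid/1">Example Masjid</a>
  <span class="denomination">Sunni</span>
  <span class="address">123 Example Street, Example City</span>
</div>
"""
soup = BeautifulSoup(html, "html.parser")
for entry in soup.select("div.masjid"):
    link = entry.select_one("a.name")
    print(link["href"], link.get_text(strip=True),
          entry.select_one("span.denomination").get_text(strip=True),
          entry.select_one("span.address").get_text(strip=True))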
-1
votes
1 answer

Looking for a regex to pull img src information from inside inline JavaScript using PHP

I'm using PHP to scrape a few websites. The image information is contained within a script.
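A minimal Python sketch of the regex idea (the question asks for PHP, where preg_match_all would play the same role); the sample script text is made up, and a regex like this is a rough tool tuned to one page, not a general HTML/JavaScript parser.

import re

# Made-up inline <script> content containing image paths.
script_text = "var imgs = ['/images/photo1.jpg', \"/images/photo2.png\"];"

# Pull anything that looks like a quoted image path out of the script text.
pattern = re.compile(r"""['"]([^'"]+\.(?:jpg|jpeg|png|gif))['"]""", re.IGNORECASE)
print(pattern.findall(script_text))
# ['/images/photo1.jpg', '/images/photo2.png']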