Questions tagged [rselenium]


The goal of RSelenium is to make it easy to connect to a Selenium Server/ Remote Selenium Server from within R. RSelenium provides R bindings for the Selenium Webdriver API.

Selenium is a project focused on automating web browsers. See http://docs.seleniumhq.org/

RSelenium allows you to carry out unit testing and regression testing on your web apps and web pages across a range of browser/OS combinations. This makes it possible, from within R, to test and manipulate projects such as Shiny apps and to integrate with services such as Sauce Labs.

Selenium Server is a standalone Java program which allows you to run HTML test suites in a range of different browsers, plus extra options like reporting.

See the vignette for more details: http://cran.r-project.org/web/packages/RSelenium/vignettes/RSelenium-basics.html
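As a quick illustration of the basic workflow described above, here is a minimal connection sketch, assuming a Selenium server is already running locally on the default port 4444 (the URL is only an example):

    library(RSelenium)

    # Connect to a locally running Selenium server and open a browser session.
    remDr <- remoteDriver(remoteServerAddr = "localhost", port = 4444,
                          browserName = "firefox")
    remDr$open()

    # Navigate to a page and read its title back into R.
    remDr$navigate("http://www.r-project.org")
    remDr$getTitle()

    remDr$close()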

944 questions
0
votes
1 answer

Create a time-out handler for web scraping with RSelenium

I'm creating a scraper with RSelenium and phantomjs. Sometimes my program queries a website that takes too long and never finishes, so I'm writing a time-out handler. library(RSelenium) library(R.utils) pJS <- phantom(pjs_cmd…
dax90
  • 1,088
  • 14
  • 29
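For the time-out question above, one possible approach (a sketch, not the asker's code) is to wrap the navigation in R.utils::withTimeout(), assuming a phantomjs session set up roughly as in the excerpt; the URL and the 30-second limit are placeholders:

    library(RSelenium)
    library(R.utils)

    pJS <- phantom()                            # start phantomjs (GhostDriver)
    remDr <- remoteDriver(browserName = "phantomjs")
    remDr$open()

    # Give up on the page after 30 seconds; time-outs are signalled as errors
    # by withTimeout(), so tryCatch() turns them into an NA result.
    page_source <- tryCatch(
      withTimeout({
        remDr$navigate("http://example.com")    # placeholder URL
        remDr$getPageSource()[[1]]
      }, timeout = 30, onTimeout = "error"),
      error = function(e) NA_character_
    )

    remDr$close()
    pJS$stop()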
0
votes
1 answer

Does readHTMLTable have a limitation on the number of characters in a cell?

I am using RSelenium to submit forms on the UN treaty collection website and save the results. Everything works fine, besides the fact that the names of the treaties are truncated in my final table. Is it because there is a limitation on the number…
desval
  • 2,345
  • 2
  • 16
  • 23
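For the readHTMLTable question above, a sketch of pulling the rendered source into XML and checking cell lengths directly, which can help locate where the truncation happens; remDr is assumed to be an open session already on the results page, and the table index is a placeholder:

    library(RSelenium)
    library(XML)

    # Parse the page as RSelenium sees it.
    doc <- htmlParse(remDr$getPageSource()[[1]])

    # Compare the cell text in the parsed document with the final table to see
    # whether the truncation happens in the page itself or later on.
    tbl <- readHTMLTable(doc, which = 1, stringsAsFactors = FALSE)  # placeholder index
    nchar(tbl[[1]])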
0
votes
0 answers

Using the RSelenium findElement function in an if/else statement

I have been trying to use RSelenium and cannot figure out how to write an 'if'/'else' statement when using the findElement function. Can anyone help with the basic problem below? This is just a test to understand a piece of my overall code.…
Weevils
  • 312
  • 3
  • 9
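For the if/else question above, a common pattern (sketched here with a placeholder selector) is to call findElements() rather than findElement(), because it returns an empty list instead of throwing an error when nothing matches:

    library(RSelenium)

    # Assuming remDr is an open session; "#my-button" is a placeholder selector.
    elems <- remDr$findElements(using = "css selector", value = "#my-button")

    if (length(elems) > 0) {
      elems[[1]]$clickElement()            # the element exists, interact with it
    } else {
      message("Element not found, skipping")
    }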
0
votes
0 answers

R - web scraping dynamic forms, skipping missing data

I am using RSelenium to scrape data off of a website that has a dynamic form where the multiple dropdown menus change depending on what is chosen. I am trying to pull the variable 'Number & Area of Operational Holdings' for every district in…
Weevils
  • 312
  • 3
  • 9
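For the dynamic-form question above, a sketch of looping over dropdown choices and skipping the ones that fail; the selector, option values and wait time are assumptions, not taken from the site:

    library(RSelenium)

    districts <- c("district1", "district2")   # placeholder option values

    results <- list()
    for (d in districts) {
      results[[d]] <- tryCatch({
        # Pick an option from the (placeholder) dropdown and wait for the page to update.
        opt <- remDr$findElement(using = "css selector",
                                 value = sprintf("select#district option[value='%s']", d))
        opt$clickElement()
        Sys.sleep(2)
        remDr$getPageSource()[[1]]
      }, error = function(e) NA_character_)    # skip missing or failing combinations
    }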
0
votes
0 answers

RSelenium error with phantom

I am running the following code and am getting an error which I do not know how to resolve: require(RSelenium) startServer() pJS <- phantom() remDr <- remoteDriver(browserName = "phantom") remDr$open() [1] "Connecting to remote server" Error: …
A Gore
  • 1,870
  • 2
  • 15
  • 26
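The error in the question above is cut off, so the cause is unclear; for reference, a sketch of the phantomjs setup roughly as documented in the RSelenium vignette (phantom() starts GhostDriver itself, so startServer() is not needed, and the documented browser name is "phantomjs"):

    library(RSelenium)

    pJS <- phantom()          # starts phantomjs in webdriver mode; no startServer() needed
    Sys.sleep(3)              # give the process a moment to come up

    remDr <- remoteDriver(browserName = "phantomjs")
    remDr$open()
    remDr$navigate("http://www.r-project.org")

    remDr$close()
    pJS$stop()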
0
votes
2 answers

RSelenium on Mac

I am using R 3.1.1 on OS X Yosemite (10.10.4). I have recently installed RSelenium and I constantly receive an unknown error. The code that I use is as follows: require(RSelenium) checkForServer() startServer() Sys.sleep(5) remDr <-…
Miros
  • 94
  • 1
  • 6
0
votes
0 answers

Is using a headless browser with RSelenium significantly faster?

I'd like to know if using a headless browser (such as PhantomJS) with RSelenium is significantly faster than running scripts with a regular browser (such as Chrome). Also, is driving it directly faster than going through a Selenium server? Is there a short function to…
Oren Bochman
  • 1,173
  • 3
  • 16
  • 37
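For the speed question above, one rough way to answer it for a given site is simply to time the same navigation in both set-ups; a sketch, assuming phantomjs is installed locally and a Selenium server with Firefox is available on the default port:

    library(RSelenium)

    time_page <- function(remDr, url) {
      system.time(remDr$navigate(url))[["elapsed"]]
    }

    # Headless: phantomjs driven directly through its built-in GhostDriver.
    pJS <- phantom()
    headless <- remoteDriver(browserName = "phantomjs")
    headless$open()
    t_headless <- time_page(headless, "http://www.r-project.org")
    headless$close(); pJS$stop()

    # Regular browser: Firefox via a running Selenium server.
    regular <- remoteDriver(browserName = "firefox")
    regular$open()
    t_regular <- time_page(regular, "http://www.r-project.org")
    regular$close()

    c(headless = t_headless, regular = t_regular)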
0
votes
1 answer

CasperJS equivalent to RSelenium for filling a form

I have an RSelenium script to fill in a form, but am trying to use CasperJS as I am finding RSelenium too slow. The following code will navigate the form as I expect. remote.driver$navigate("http://news.ceek.jp/search.cgi?kind=sports") search.form…
JSB
  • 351
  • 2
  • 24
0
votes
0 answers

RSelenium: clicking a link 100 times

I have to read the comments on LinkedIn, which shows only 10 comments at a time with a "Show Previous Comment" link. I have 1000 comments, so I need to click "Show Previous Comment" 100 times to see all of them. I want to know how to use RSelenium to…
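For the LinkedIn question above, a sketch of clicking such a link repeatedly until it disappears; only the link text comes from the question, the rest (an open session on the right page, the one-second wait) is assumed:

    library(RSelenium)

    # Assuming remDr is an open session on the page with the comments.
    repeat {
      link <- remDr$findElements(using = "partial link text",
                                 value = "Show Previous Comment")
      if (length(link) == 0) break     # link gone: all comments are loaded
      link[[1]]$clickElement()
      Sys.sleep(1)                     # give the extra comments time to load
    }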
0
votes
2 answers

Web scraping a password-protected website using R

I would like to web scrape Yammer data using R, but in order to do so I first have to log in to this page (which is authentication for an app that I created): https://www.yammer.com/dialog/authenticate?client_id=iVGCK1tOhbZGS7zC8dPjg I am able to get…
vinay
  • 57
  • 1
  • 12
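For the login question above, a generic sketch of filling a login form with RSelenium; the field selectors and credentials are placeholders and are not taken from the Yammer page:

    library(RSelenium)

    # Assuming remDr is an open session; all selectors below are placeholders.
    remDr$navigate("https://www.yammer.com/dialog/authenticate?client_id=iVGCK1tOhbZGS7zC8dPjg")

    user <- remDr$findElement(using = "css selector", value = "input[type='email']")
    user$sendKeysToElement(list("my.name@example.com"))

    pass <- remDr$findElement(using = "css selector", value = "input[type='password']")
    pass$sendKeysToElement(list("my-password"))

    remDr$findElement(using = "css selector", value = "button[type='submit']")$clickElement()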
0
votes
1 answer

R RSelenium and phantom in a loop

I have the following code, which I borrowed from a previous Stack Overflow discussion (Extracting data from javascript with R). I'm basically trying to web scrape some data for some pharmaceuticals. When I run the code for a single pharmaceutical code…
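For the loop question above, a sketch of reusing a single phantomjs session across all iterations rather than starting a new browser each time; the URL pattern and codes are placeholders:

    library(RSelenium)

    pJS <- phantom()
    remDr <- remoteDriver(browserName = "phantomjs")
    remDr$open()

    codes <- c("A01", "B02", "C03")              # placeholder pharmaceutical codes
    pages <- vector("list", length(codes))

    for (i in seq_along(codes)) {
      remDr$navigate(paste0("http://example.com/drug/", codes[i]))  # placeholder URL pattern
      Sys.sleep(2)                               # let the javascript render
      pages[[i]] <- remDr$getPageSource()[[1]]
    }

    remDr$close()
    pJS$stop()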
0
votes
1 answer

How to get the numbers of the rows in an HTML table that have a disabled radio button with R?

I'm scraping a webpage with R. I use the "RSelenium" and "XML" packages. The following table has a radio button in some rows. I need to know which rows (for instance the first and the third row) have a radio button disabled so as to skip the row…
Gianluca78
  • 794
  • 1
  • 9
  • 24
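For the radio-button question above, a sketch using an XPath that matches only table rows containing a disabled radio input; the generic //table path is an assumption and may need narrowing to the actual table:

    library(RSelenium)

    # Rows that contain a disabled radio button.
    disabled_rows <- remDr$findElements(
      using = "xpath",
      value = "//table//tr[.//input[@type='radio' and @disabled]]"
    )
    length(disabled_rows)    # how many rows would be skipped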
-1
votes
0 answers

How to web scrape a JavaScript-rendered website?

I want to extract the data from the links in the following page using R: https://uygulama.gtb.gov.tr/BTBBasvuru/Btbler However, I am not sure how I can access all the subsequent pages and the links in the table as well. I have tried using the…
lokij
  • 1
  • 1
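For the JavaScript question above, a generic sketch: let RSelenium render the page, then parse the links out of the rendered source with XML; the wait time and the XPath are assumptions:

    library(RSelenium)
    library(XML)

    # Assuming remDr is an open session.
    remDr$navigate("https://uygulama.gtb.gov.tr/BTBBasvuru/Btbler")
    Sys.sleep(3)                                   # wait for the table to render

    doc   <- htmlParse(remDr$getPageSource()[[1]])
    links <- xpathSApply(doc, "//table//a/@href")  # link targets (assumed layout)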
-1
votes
2 answers

RSelenium -> clicking checkboxes to download files

Basically, I would like to automatically download multiple files at once from a webpage: http://alertario.rio.rj.gov.br/download/dados-pluviometricos/ I am currently following the tutorials from: https://www.youtube.com/watch?v=BK_JBk_l5uQ; also…
Pedro Lima
  • 15
  • 4
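For the checkbox question above, a sketch that ticks every checkbox on the page and then clicks a download control; both selectors are placeholders that would need to be matched to the actual page:

    library(RSelenium)

    # Assuming remDr is an open session.
    remDr$navigate("http://alertario.rio.rj.gov.br/download/dados-pluviometricos/")

    # Tick every checkbox found on the page.
    boxes <- remDr$findElements(using = "css selector", value = "input[type='checkbox']")
    for (b in boxes) b$clickElement()

    # Placeholder selector for the download button.
    remDr$findElement(using = "css selector", value = "button.download")$clickElement()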
-1
votes
1 answer

Creating "for" loop when webscraping with Rselenium and saving as dataframe

I have been playing around with RSelenium and web scraping from a list of URLs. Naturally, I would want to combine the data from each URL I scrape into a dataframe. When I do that, the dataframe that is returned will have the data, along with…
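For the last question above, a sketch of the usual pattern: scrape each URL into a small data frame and bind the pieces together once at the end; the URLs and the extraction step are placeholders:

    library(RSelenium)

    urls <- c("http://example.com/page1", "http://example.com/page2")   # placeholders

    scrape_one <- function(url) {
      remDr$navigate(url)                  # remDr: an already-open session
      Sys.sleep(1)
      # Placeholder extraction: grab the page title; replace with the real fields.
      data.frame(url = url, title = remDr$getTitle()[[1]], stringsAsFactors = FALSE)
    }

    pieces <- lapply(urls, scrape_one)
    result <- do.call(rbind, pieces)       # one data frame, one row per URL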