
We are trying to use Support Vector Machines to make predictions on our dataset. Even with just 70,000 rows and 7 features, an SVM on Google DataLab does not finish training in any reasonable time on the DataLab VM.

We would like to use something that scales statistical methods across CPU cores, such as Revolution Analytics' distribution of R on Azure Machine Learning Studio, but our data lives in Google BigQuery.

How can we connect an R script in Azure Machine Learning Studio to our dataset in Google BigQuery?

Praxiteles
  • I managed to make the connection in the R console using library(bigrquery); however, I used the browser for authentication, which could be a blocker in Azure ML Studio. As I am not an expert in bigrquery: have you managed to create the connection in the R console using a server-side implementation (or any automated process without a browser)? If so, please share how, and I think I can help with getting it onto Azure ML for you. – Mohamed Yamama Apr 27 '16 at 12:39

1 Answer


You can pull the data in an "Execute Python Script" module, using either an HTTP request against the BigQuery REST API or the Google SDK for Python (https://cloud.google.com/bigquery/exporting-data-from-bigquery). Then add an "Execute R Script" module with your logic downstream of it.
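A minimal sketch of what that "Execute Python Script" module could look like, calling BigQuery's synchronous `jobs.query` REST endpoint. The project ID, dataset/table name, and access token here are placeholder assumptions, not values from the question; the token would need to come from a browser-free flow such as a service-account credential (e.g. the `google-auth` package) since Azure ML Studio cannot open an OAuth consent page.

```python
# Sketch: fetch BigQuery rows inside an Azure ML Studio "Execute Python
# Script" module via the REST API. Placeholder values are marked below.
import json

BIGQUERY_URL = "https://www.googleapis.com/bigquery/v2/projects/{}/queries"

# Placeholder: obtain a real token from a service-account flow (no browser).
ACCESS_TOKEN = "replace-with-service-account-access-token"


def build_query_request(project_id, sql):
    """Return the endpoint URL and JSON body for a synchronous query job."""
    body = {"query": sql, "useLegacySql": False}
    return BIGQUERY_URL.format(project_id), json.dumps(body)


def azureml_main(dataframe1=None, dataframe2=None):
    """Azure ML Studio entry point: returns a tuple of pandas DataFrames."""
    import pandas as pd
    import requests  # available in ML Studio's bundled Anaconda environment

    # Placeholder project and table names for illustration only.
    url, body = build_query_request(
        "my-project", "SELECT * FROM mydataset.mytable")
    resp = requests.post(
        url, data=body,
        headers={"Authorization": "Bearer " + ACCESS_TOKEN,
                 "Content-Type": "application/json"})
    resp.raise_for_status()
    result = resp.json()

    # Flatten BigQuery's {"f": [{"v": ...}, ...]} row encoding.
    names = [f["name"] for f in result["schema"]["fields"]]
    rows = [[cell["v"] for cell in row["f"]]
            for row in result.get("rows", [])]
    return pd.DataFrame(rows, columns=names),
```

The returned DataFrame becomes the module's output port, which you can wire straight into the downstream "Execute R Script" module.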

marnun