I am trying to load the result of a SQL query into a new table after some transformation, but even the simplest script is failing.
DECLARE @inquery nvarchar(max) = N'
SELECT TOP 2000000 * FROM [SQL Table]'
DECLARE @Rscript nvarchar(max) = N'
sqlConnString <- "Driver={SQL Server};SERVER='+@@SERVERNAME+N';DATABASE='+DB_NAME()+N';Trusted_Connection=True;"
outTabName <- "OutputTable"
outTabDS <- RxSqlServerData(table = outTabName, connectionString = sqlConnString)
rxDataStep(inData = InputDataSet, outFile = outTabDS, maxRowsByCols = NULL, rowsPerRead = 500000)
'
EXEC sp_execute_external_script @language = N'R'
, @script = @Rscript
, @input_data_1 = @inquery
WITH RESULT SETS NONE;
When I run this with 1 million rows it works, but it fails to write the output with 2 million rows. Since the RevoScaleR functions process data in chunks, why is a larger number of rows a problem? The same query returns results fine in SQL Server itself. The max memory percentage for external scripts is also set to 50% of the 32 GB of RAM on the server.
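For context, by the max memory percentage I mean the external resource pool that governs R Services memory. A minimal sketch of how that limit is set (assuming the default external pool is the one in use):

-- Allow external (R) scripts to use up to 50% of server memory
ALTER EXTERNAL RESOURCE POOL [default]
WITH (MAX_MEMORY_PERCENT = 50);
-- Apply the new resource governor configuration
ALTER RESOURCE GOVERNOR RECONFIGURE;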