
I am making a Shiny app whose input is an xlsb-format file and whose output is a prediction from a model I have developed. The code works fine on a 22 kb xlsb file, but when the file size grows to 2000 kb the R server takes about 7 minutes just to read it. Converting the xlsb file to csv inflates it from 2000 kb to 9000 kb, and the xlsb format is the one recommended for handling large datasets. Is there any way to reduce the 7-minute read time? Below is just a layout of what I am trying.

library(shiny)
library(shinydashboard)
library(excel.link)   # provides xl.read.file() for .xlsb files

ui <- dashboardPage(
  dashboardHeader(title = "Customized Input File"),
  dashboardSidebar(
    fileInput('file1', 'Choose Input File',
              accept = c('text/csv', 'text/comma-separated-values', 'text/plain', '.csv', '.xlsb'))
  ),
  dashboardBody(
    tableOutput('contents')
  )
)

server <- shinyServer(function(input, output, session) {

  output$contents <- renderTable({
    model_prediction()
  })

  model_prediction <- reactive({
    inFile <- input$file1
    if (is.null(inFile))
      return(NULL)
    data <- xl.read.file(inFile$datapath)   # slow step: ~7 minutes for a 2000 kb file
    ### Model for Prediction ###
    return(values)                          # 'values' holds the model predictions (placeholder)
  })
})

shinyApp(ui, server)
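
A rough sketch of the kind of workaround I have in mind for the slow read (file names are placeholders and I have not tested this on the real data): either convert the xlsb file once, outside the app, to R's binary .rds format so the Shiny session only has to call readRDS(), or read the csv export with data.table::fread() instead of read.csv().

library(excel.link)
library(data.table)

# One-time conversion outside the Shiny session
# ("bigdata.xlsb" / "bigdata.rds" / "bigdata.csv" are placeholder names):
data <- xl.read.file("bigdata.xlsb")   # slow step, done only once
saveRDS(data, "bigdata.rds")           # compact binary copy for later runs

# Inside the app, reading the cached binary copy is fast:
data <- readRDS("bigdata.rds")

# Alternatively, fread() is usually much faster than read.csv() on the csv export;
# all my variables are factors, hence stringsAsFactors = TRUE:
data <- fread("bigdata.csv", stringsAsFactors = TRUE)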

I am posting here for the first time, so if I have left out any details of the problem, please feel free to ask.

Ashita
  • Can you give an example of an xlsb file? – Batanichek Jan 20 '16 at 12:46
  • @Batanichek .xlsb is one of the Excel file formats, like .xlsx, .csv, etc. It is used to store large data because of its binary nature. You can read about it here: http://www.excel-for-consultants.com/xlsb-format/ – Ashita Jan 20 '16 at 15:43
  • I mean it would be better to post a link to a (small) file so that others can reproduce your app and test different ways to read it. – Batanichek Jan 20 '16 at 17:30
  • The data has dimensions of 86 columns and approx. 24k rows. All variables are factors, so it would be difficult to replicate the sample data. – Ashita Jan 21 '16 at 11:40

0 Answers