I am making a Shiny app whose input is an xlsb file and whose output is a prediction from a model I developed. The code works fine on a 22 KB xlsb file, but once the file grows to 2000 KB the R server takes about 7 minutes just to read it. Converting the xlsb file to CSV inflates it from 2000 KB to 9000 KB, and the xlsb format is the one recommended for large datasets. Is there any way to reduce the 7-minute read time? This is just a layout of what I am trying.
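One common workaround, sketched below under the assumption that the data file changes rarely: do the slow xlsb read once, outside the app, and cache the result as an `.rds` file that the Shiny server can load in seconds. The file names here (`big_input.xlsb`, `big_input.rds`) are placeholders, and the `readxlsb` package is suggested only as a possible faster reader than COM-based approaches; it is not part of the original code.

```r
## One-time conversion script, run outside the Shiny app.
## Assumes the 'readxlsb' package is installed (install.packages("readxlsb")).
library(readxlsb)

data <- read_xlsb("big_input.xlsb", sheet = 1)  # hypothetical file name
saveRDS(data, "big_input.rds")                  # compact binary cache

## Inside the Shiny server, load the cached copy instead of re-parsing xlsb:
data <- readRDS("big_input.rds")                # typically loads in seconds
```

This only helps if users can upload the pre-converted `.rds` (or you convert on the server once and reuse it); if every upload is a fresh xlsb, the parsing cost has to be paid somewhere.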
library(shiny)
library(shinydashboard)
library(excel.link)  # provides xl.read.file()

ui <- dashboardPage(
  dashboardHeader(title = "Customized Input File"),
  dashboardSidebar(
    fileInput('file1', 'Choose Data File',
              accept = c('text/csv', 'text/comma-separated-values,text/plain',
                         '.csv', '.xlsb'))  # allow xlsb uploads as well
  ),
  dashboardBody(
    tableOutput('contents')
  )
)
server <- shinyServer(function(input, output, session) {
  output$contents <- renderTable({
    model_prediction()
  })

  model_prediction <- reactive({
    inFile <- input$file1
    if (is.null(inFile))
      return(NULL)
    # this read is the slow step for large files
    data <- xl.read.file(inFile$datapath)
    ### Model for Prediction ###
    return(values)
  })
})

shinyApp(ui, server)
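For what it's worth, `excel.link::xl.read.file()` drives Excel over COM, which is typically the bottleneck here. If accepting a CSV upload instead is an option, a minimal sketch of the reactive using `data.table::fread()` (usually far faster than `read.csv()` on multi-megabyte files) might look like this; the model step is left as a placeholder exactly as in the original:

```r
library(data.table)

model_prediction <- reactive({
  inFile <- input$file1
  if (is.null(inFile))
    return(NULL)
  # fread() parses large delimited files much faster than base readers
  data <- fread(inFile$datapath)
  ### Model for Prediction ###
  return(values)
})
```

The 9000 KB CSV is larger on disk than the 2000 KB xlsb, but plain-text parsing with `fread()` is generally much quicker than reading xlsb through Excel, so the larger file may still load far faster.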
I am posting here for the first time, so if I have left out any detail of the problem, please feel free to ask.