Questions tagged [large-data-volumes]

302 questions
1 vote · 3 answers

Time Limit on a SQL Server

Is it possible to limit the amount of time SQL Server spends on a query, and have it return all the matching results it managed to find within that time frame?
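SQL Server itself cancels a statement when a timeout expires rather than handing back the rows it has found so far, so the "keep what you got" behaviour the question above asks about usually has to be built on the client. A purely illustrative sketch of one such pattern in Java/JDBC, reading keyset-paged chunks until a wall-clock budget runs out (the asker's client stack is not stated, and dbo.BigTable, id, and payload are made-up names for the example):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class TimeBoxedQuery {

    /** Reads keyset-paged chunks until a wall-clock budget runs out, keeping whatever was fetched. */
    public static List<String> readWithBudget(Connection con, long budgetMillis) throws SQLException {
        List<String> rows = new ArrayList<>();
        long deadline = System.currentTimeMillis() + budgetMillis;
        long lastId = 0;

        // dbo.BigTable, id and payload are placeholder names, not from the question.
        String sql = "SELECT TOP 1000 id, payload FROM dbo.BigTable WHERE id > ? ORDER BY id";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setQueryTimeout(5);                       // per-chunk safety net, in seconds
            while (System.currentTimeMillis() < deadline) {
                ps.setLong(1, lastId);
                boolean gotRows = false;
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        gotRows = true;
                        lastId = rs.getLong("id");
                        rows.add(rs.getString("payload"));
                    }
                }
                if (!gotRows) break;                     // table exhausted before the deadline
            }
        }
        return rows;                                     // partial results if the budget ran out
    }
}
```

Each chunk is bounded by TOP and a per-statement query timeout, so the caller keeps whatever rows arrived before the deadline instead of losing the whole result when the time limit hits.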
1 vote · 0 answers

Sending Large or Complex Objects from WCF server to clients

My application is a near-real-time data buffering application. I'm using net.TCP bindings to send complex objects, buffering data to the client through different callback functions. As soon as I enable certain events, the application slows down and…
1 vote · 2 answers

Fastest way to fetch a subset (200M) from a very large table (600M) in SQL Server

We are facing the following problem and are trying to come up with the best possible solution. We are using SQL Server 2008. We have a table with more than 600 million records and about 25 columns. One of the columns is an ID and is…
Nicolas • 403 • 1 • 6 • 18
1 vote · 1 answer

How to send a large response with Play Scala

I am using Play Framework 2 with Scala. In the controller I have an action method that needs to return an object containing 100,000 rows along with some other data, but during JSON serialization it throws an exception…
1 vote · 1 answer

ORM usage with potentially billions of records

I was thinking about this the other day: apps like Twitter deal with millions of users. I was wondering how the 'following' functionality would work, where the maximum number of users in the database can follow the maximum number of users in the…
Billworth Vandory • 5,003 • 5 • 29 • 34
1 vote · 1 answer

Parallel.ForEach throws exception when processing extremely large sets of data

My question centers on some Parallel.ForEach code that used to work without fail; now that our database has grown to five times its former size, it breaks almost regularly. Parallel.ForEach(…
1 vote · 1 answer

Zipping very many files in R

I have 7,300 *.csv files in a temp directory that I want to combine into a zip archive from within R. I'm using the following code, which is taking forever. Is there a way to do this faster, short of exiting R and using the WinZip program? fileListing …
Benjamin Levy • 333 • 6 • 19
1 vote · 1 answer

Organising large datasets in Matlab

I have a problem I hope you can help me with. I have imported a large dataset (a 200,000 x 5 cell array) into Matlab with the following structure: 'Year' 'Country' 'X' 'Y' 'Value'. Columns 1 and 5 contain numeric values, while columns 2 to 4 contain…
1 vote · 4 answers

Adding an autonumber to a SQL column which has more than 15 million records

I need to add an autonumber column to an existing table that has about 15 million records, in SQL Server 2005. How long do you think it will take? What is the best way to do it?
blue • 833 • 2 • 12 • 39
1 vote · 0 answers

MySQL splitting a large table

I have a huge table (100+ GB of data, ~1 billion rows) on which I need to perform SELECT queries that are very fast for recent data, as well as queries over older data where speed is unimportant. I'm thinking about creating a new table for…
pedmillon • 143 • 3 • 13
1 vote · 2 answers

Read tens of thousands of files and write to millions of files in Java

I am doing some unusual data manipulation. I have 36,000 input files, more than can be loaded into memory at once. I want to take the first byte of every file and put it into one output file, then do the same for the second byte, and so on. It does…
Audo Voice • 43 • 5
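The question above describes a byte-level transpose: output file i should receive byte i of every input, in input order. A minimal sketch of one way to approach it, streaming the inputs and handling output positions in batches so only a bounded number of files is open at any time (the directory names and batch size are assumptions for illustration, not taken from the question):

```java
import java.io.BufferedOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.RandomAccessFile;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class ByteTranspose {
    public static void main(String[] args) throws IOException {
        // Collect the input files ("input" and "output" directories are placeholders).
        List<Path> inputs = new ArrayList<>();
        try (DirectoryStream<Path> dir = Files.newDirectoryStream(Paths.get("input"))) {
            for (Path p : dir) inputs.add(p);
        }
        Files.createDirectories(Paths.get("output"));

        int batch = 1024;                       // output positions handled per pass (tuning knob)
        long maxLen = 0;
        for (Path p : inputs) maxLen = Math.max(maxLen, Files.size(p));

        for (long start = 0; start < maxLen; start += batch) {
            int count = (int) Math.min(batch, maxLen - start);

            // One output per byte position in this batch: "pos-i" collects byte i of every input.
            OutputStream[] outs = new OutputStream[count];
            for (int i = 0; i < count; i++) {
                outs[i] = new BufferedOutputStream(
                        Files.newOutputStream(Paths.get("output", "pos-" + (start + i))));
            }

            // Visit every input once, copying its bytes [start, start+count) to the matching outputs.
            byte[] buf = new byte[count];
            for (Path p : inputs) {
                try (RandomAccessFile raf = new RandomAccessFile(p.toFile(), "r")) {
                    if (raf.length() <= start) continue;   // input shorter than this batch's start
                    raf.seek(start);
                    int n = raf.read(buf, 0, (int) Math.min(count, raf.length() - start));
                    for (int i = 0; i < n; i++) outs[i].write(buf[i]);
                }
            }
            for (OutputStream o : outs) o.close();
        }
    }
}
```

Re-opening every input once per batch trades speed for a bounded number of open file handles; a larger batch means fewer passes over the inputs at the cost of more simultaneously open outputs.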
1 vote · 2 answers

Best way to have lots of pictures in an App? Android Studio

So I am making an app that has over 200 sunset pictures (taken by myself :P). I want to display them all via android.R.layout.simple_gallery_item. I tried tossing them all in the drawable folder, then bitmapping each one and drawing it on the screen,…
Pythogen • 591 • 5 • 25
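For the gallery question above, the usual memory problem is decoding every photo at full resolution. A small sketch of the standard two-pass decode that reads only the image bounds first and then loads a downsampled bitmap (the resource IDs and target sizes would come from the app; nothing here is taken from the asker's code):

```java
import android.content.res.Resources;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public final class Thumbnails {

    /** Decodes a drawable resource at roughly the requested size instead of full resolution. */
    public static Bitmap decodeScaled(Resources res, int resId, int reqWidth, int reqHeight) {
        // First pass: read only the image dimensions, no pixel allocation.
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inJustDecodeBounds = true;
        BitmapFactory.decodeResource(res, resId, opts);

        // Pick the largest power-of-two sample size that keeps the result >= the requested size.
        int sample = 1;
        while ((opts.outWidth / (sample * 2)) >= reqWidth
                && (opts.outHeight / (sample * 2)) >= reqHeight) {
            sample *= 2;
        }

        // Second pass: decode for real at the reduced size.
        opts = new BitmapFactory.Options();
        opts.inSampleSize = sample;
        return BitmapFactory.decodeResource(res, resId, opts);
    }
}
```

Decoding at thumbnail size, and only for the items currently visible in the gallery, keeps a couple hundred drawables from exhausting the heap.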
1 vote · 0 answers

How to deal with huge dataframes and lists in R

I recently asked a question about how to apply a function to the data frames inside a list. Here is the link, where executing the answer I received in the post worked perfectly: Apply a function to a List of dataframes in R. In this example it…
Saul Garcia • 890 • 2 • 9 • 22
1 vote · 2 answers

Improve database querying in MS SQL

What's a fast way to query large amounts of data (between 10,000 and 100,000 rows, and it will get bigger in the future, maybe 1,000,000+) spread across multiple tables (20+), involving left joins and aggregate functions (SUM, MAX, COUNT, etc.)? My solution would be…
Gabriel Andrei • 187 • 1 • 14
1 vote · 2 answers

Using MySQL to search through large data sets?

I'm a really advanced PHP developer and very knowledgeable about small-scale MySQL data sets; however, I'm now building a large infrastructure for a startup I've recently joined, and their servers push around 1 million rows of data every day using their…
Bilawal Hameed • 258 • 4 • 11