Questions tagged [large-data-volumes]
302 questions
0
votes
1 answer
More Efficient Way to Copy Large Data from One Table to Another
We have a few tables with over 300,000 records, and it takes several hours for records to be transferred/copied from one table to another. Is there a more efficient way than using cursors and copying each record one by one from one…

that_developer
- 319
- 2
- 9
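A single set-based `INSERT … SELECT` typically beats a row-by-row cursor by orders of magnitude, since the engine moves all rows in one pass inside one transaction. A minimal sketch of the idea, using SQLite in-memory tables as stand-ins for the (unspecified) source and target tables:

```python
import sqlite3

# In-memory database with hypothetical source/target tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source (id INTEGER PRIMARY KEY, payload TEXT)")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO source VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(300_000)])

# One set-based statement instead of fetching and re-inserting each row.
with conn:
    conn.execute("INSERT INTO target SELECT id, payload FROM source")

copied = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(copied)  # 300000
```

The same pattern applies to SQL Server or Oracle; on those engines, dropping or disabling indexes and triggers on the target before the bulk insert often helps further.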
0
votes
0 answers
3D lookup table to discretize the volume
I have a depth camera that returns measured distance values of the volume in millimeters. I need to create a 3D lookup table to store all possible distance values for each pixel in the image, so I am getting an array of size 640x480x2048.…

Razmik Avetisyan
- 25
- 6
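One thing worth checking before allocating anything: the element type dominates the memory cost of a 640×480×2048 table. A sketch of the index arithmetic and the resulting size, using the dimensions from the question (the function name is illustrative):

```python
# Dimensions from the question: 640x480 pixels, 2048 possible depth values (mm).
WIDTH, HEIGHT, DEPTH_LEVELS = 640, 480, 2048

def flat_index(x, y, d):
    """Map (pixel x, pixel y, depth level d) to an offset in a flat table."""
    return (y * WIDTH + x) * DEPTH_LEVELS + d

total = WIDTH * HEIGHT * DEPTH_LEVELS
print(total)           # 629145600 entries
print(total / 2**30)   # 0.5859375 GiB at one byte per entry
```

At one byte per entry the table is ~0.6 GiB, but at 8 bytes (e.g. a double) it is ~4.7 GiB, so choosing the smallest sufficient element type (or computing values on demand instead of storing them) is usually the deciding factor.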
0
votes
1 answer
MySql database slow response with multiple large tables
I am new to MySQL and I am having problems when trying to read large tables from the database. I have created the following database:
CREATE DATABASE `chamber_master_db` /*!40100 DEFAULT CHARACTER SET utf8 */
with 80 tables like the one below:
CREATE…
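Since the `CREATE TABLE` is truncated, this is only a guess at the shape of the problem, but 80 identically-structured tables are often better consolidated into one table with a discriminator column plus an index, so each read stays fast and the schema stays manageable. A sketch in SQLite (table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One consolidated table with a chamber_id column can replace 80
# identically-structured tables; an index keeps per-chamber reads fast.
conn.execute("CREATE TABLE readings (chamber_id INTEGER, ts INTEGER, value REAL)")
conn.execute("CREATE INDEX idx_chamber ON readings(chamber_id, ts)")
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)",
                 [(c, t, 0.0) for c in range(80) for t in range(100)])

# Reading one chamber's data is an index range scan, not a full scan.
rows = conn.execute(
    "SELECT COUNT(*) FROM readings WHERE chamber_id = ?", (42,)).fetchone()[0]
print(rows)  # 100
```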
0
votes
1 answer
SQL DB Redesign for Retail POS - High volume transactions & Batch Processing
We are in the midst of getting our POS backend DB redesigned to support growth. Currently, our POS transactions are passed to the DB via a web service.
I understand that scaling up hardware is an option to increase throughput (transactions per…

sajoshi
- 2,733
- 1
- 18
- 22
0
votes
1 answer
Counts over a large set of records in DB
I have a table [ID,ITEM_NAME,ITEM_PRICE,ITEM_STATUS,ITEM_TYPE,ITEM_OWNER,ITEM_DATE]
The application can query the table with any number of search conditions, such as item date and/or item owner.
In the resultset, I also need to fetch the…

Sripaul
- 2,227
- 9
- 36
- 60
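When the result set also needs counts, one option is to apply the same dynamic `WHERE` clause once and let the database aggregate with `GROUP BY`, rather than issuing a separate `COUNT(*)` per status from application code. A sketch in SQLite using the column names from the question (sample data is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE items (
    id INTEGER PRIMARY KEY, item_name TEXT, item_price REAL,
    item_status TEXT, item_type TEXT, item_owner TEXT, item_date TEXT)""")
conn.executemany(
    "INSERT INTO items(item_status, item_owner, item_date) VALUES (?, ?, ?)",
    [("OPEN", "alice", "2013-01-01"), ("OPEN", "alice", "2013-01-02"),
     ("CLOSED", "alice", "2013-01-03"), ("OPEN", "bob", "2013-01-04")])

# One pass over the filtered rows yields all per-status counts at once.
counts = dict(conn.execute(
    "SELECT item_status, COUNT(*) FROM items "
    "WHERE item_owner = ? GROUP BY item_status", ("alice",)))
print(sorted(counts.items()))  # [('CLOSED', 1), ('OPEN', 2)]
```

An index covering the filter columns (e.g. `item_owner`, `item_date`) lets the aggregate run off the index instead of the full table.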
0
votes
2 answers
Oracle representing hierarchy of data combinations
I'm having an Oracle data-volume issue, which I think is caused by a data representation problem. I have a hierarchy made from seven different lists, and I need to store all the unique potential combinations of these elements. I need to store…

Kieran
- 718
- 1
- 16
- 33
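The volume problem is usually the Cartesian product itself: the combination count is the product of the seven list lengths, so it can explode far beyond what is worth materializing as rows. Storing only the lists and generating (or joining) combinations on demand is often the better representation. A sketch with small placeholder lists:

```python
from itertools import product, islice

# Seven hypothetical levels of the hierarchy; sizes are placeholders.
levels = [["a1", "a2"], ["b1", "b2", "b3"], ["c1"], ["d1", "d2"],
          ["e1"], ["f1", "f2"], ["g1"]]

# The combination count is the product of the list lengths, so it can be
# checked (and often avoided) without materializing any rows.
count = 1
for lst in levels:
    count *= len(lst)
print(count)  # 24

# Generate combinations lazily instead of storing them all.
first_two = list(islice(product(*levels), 2))
print(first_two[0])  # ('a1', 'b1', 'c1', 'd1', 'e1', 'f1', 'g1')
```

With realistic list sizes (say 50 elements each), the count is 50^7 ≈ 7.8×10^11 rows, which is why storing every combination in Oracle becomes untenable.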
0
votes
1 answer
Oracle: Find previous record for a ranked list of forecasts
Hi, I am faced with a difficult problem:
I have a table (Oracle 9i) of weather forecasts (many hundreds of millions of records in size)
whose makeup looks like this:
stationid forecastdate forecastinterval forecastcreated …

Simon Edwards
- 145
- 6
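In Oracle this kind of "previous record in a ranked group" problem is usually a job for the `LAG()` analytic function, partitioned by the group key and ordered by `forecastcreated`. As a language-neutral illustration of the logic (column names are inferred from the excerpt; the sample rows are invented):

```python
from collections import defaultdict

# Toy rows: (stationid, forecastdate, forecastinterval, forecastcreated, value)
rows = [
    ("KLGA", "2012-01-05", 6, "2012-01-03", 41),
    ("KLGA", "2012-01-05", 6, "2012-01-04", 39),
    ("KLGA", "2012-01-05", 6, "2012-01-02", 44),
]

# Rank each (station, date, interval) group by creation time; the "previous"
# record is the one ranked immediately before the latest.
groups = defaultdict(list)
for r in rows:
    groups[r[:3]].append(r)

for key, grp in groups.items():
    grp.sort(key=lambda r: r[3])        # ORDER BY forecastcreated
    latest, previous = grp[-1], grp[-2]
    print(previous[3], previous[4])     # 2012-01-03 41
```

On hundreds of millions of rows, the analytic-function version with a supporting index on the partition/order columns avoids the self-join that makes the naive SQL formulation slow.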
0
votes
2 answers
Initializing large arrays efficiently in Xcode
I need to store a large number of different kinds of confidential data in my project.
The data can be represented as encoded NSStrings. I would rather initialize this in code than read it from a file, since that is more secure.
So I would need about…

user387184
- 10,953
- 12
- 77
- 147
0
votes
1 answer
out of memory error using ffdf in R
I would like to know how I can solve the following problem.
I've got a table in PostgreSQL with 100 million rows and 4 columns that I would like to use in R via ffdf. Here's my code:
query <- "select * from ratings"
drv <-…
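`select * from ratings` pulls the whole 100M-row result set into memory at once; the usual fix is to stream it in fixed-size chunks so memory stays bounded regardless of table size (in R, this is the chunked-fetch pattern that ffdf loaders rely on). A sketch of the pattern, with SQLite standing in for the PostgreSQL table and invented column names:

```python
import sqlite3

# Stand-in for the 100M-row PostgreSQL "ratings" table from the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ratings (user_id INTEGER, item_id INTEGER, "
             "rating REAL, ts INTEGER)")
conn.executemany("INSERT INTO ratings VALUES (?, ?, ?, ?)",
                 [(i, i, 1.0, i) for i in range(10_000)])

# Stream the result set in fixed-size chunks instead of all at once,
# keeping memory bounded no matter how many rows the table holds.
cur = conn.execute("SELECT * FROM ratings")
total = 0
while True:
    chunk = cur.fetchmany(1_000)
    if not chunk:
        break
    total += len(chunk)   # replace with real per-chunk processing
print(total)  # 10000
```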
0
votes
1 answer
MySQL full text table structure
I have a database with 60+ million records. The current setup is one table with 30+ million records and a couple of small tables with roughly 5 million each. The data structure is the same for each table. The person who created our search…

briandonor
- 63
- 3
- 11
0
votes
4 answers
Loading huge records into memory
There are 0.5 million records per day, each record is ~500 bytes, and we have to analyze a year's worth of records. To speed up the process it would be better to load all records at once, but we can't, as that would require ~88 GB of memory. Number of…

bjan
- 2,000
- 7
- 32
- 64
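The arithmetic from the question checks out in ballpark terms, and it also suggests the fix: a single day's worth of data is small enough to process as a chunk, so streaming day by day avoids loading the whole year. The figures worked through:

```python
records_per_day = 500_000
bytes_per_record = 500
days = 365

# Whole year at once: roughly the ~88 GB cited in the question.
total_bytes = records_per_day * bytes_per_record * days
print(total_bytes)               # 91250000000
print(round(total_bytes / 2**30))  # 85 (GiB)

# One day at a time fits comfortably in memory.
per_day = records_per_day * bytes_per_record
print(per_day // 2**20)          # 238 (MiB per day)
```

Processing 365 chunks of ~240 MiB each, and keeping only the running aggregates between chunks, gives the same analysis with a tiny fraction of the memory.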
0
votes
2 answers
How to add more than 500 entries to the datastore with put() in google app engine?
I tried adding batches of data in a list with a couple of calls to db.put(), but it still times out occasionally.
Anyone have some tips?

bluegray
- 35
- 1
- 3
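One common approach is to split the entity list into batches of at most 500 (the datastore's per-call limit for `db.put()`) and put each batch separately; for the residual timeouts, moving the puts into task-queue/deferred tasks gives automatic retries. A sketch of the batching helper (the entities here are placeholders):

```python
def chunked(items, size=500):
    """Yield successive slices of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# db.put() accepts at most 500 entities per call, so split the list
# and put each batch separately.
entities = list(range(1_234))       # stand-ins for datastore entities
batches = list(chunked(entities))
print([len(b) for b in batches])    # [500, 500, 234]

# On App Engine:
# for batch in batches:
#     db.put(batch)   # consider deferred/task queues for retry on timeout
```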
0
votes
1 answer
creating 155000 entities off of a openJPA query
I have a query which in the worst case will create upwards of 150K entities, probably with an upper limit of 300,000 entities. I have tried several ways to return this set of data back to the user... I run the query using SQL Developer and…

SoftwareSavant
- 9,467
- 27
- 121
- 195
0
votes
1 answer
Slow SELECT COUNT(*), information_schema, cardinality field
I have a large (60+ million records) table.
This table has a primary key (id, auto_increment, with an index on id).
I have a report that selects records from this table, and to browse and navigate through this report (written in PHP) I'm using the…

rinchik
- 2,642
- 8
- 29
- 46
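The `cardinality` field in `information_schema` is only an estimate for InnoDB, so it can drift badly on a 60M-row table. If the report needs an exact row count without a full scan, one alternative is a trigger-maintained counter table, which turns the count into an O(1) lookup. A sketch of the idea in SQLite (the MySQL trigger syntax differs slightly, and the table names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE big (id INTEGER PRIMARY KEY AUTOINCREMENT, payload TEXT);
CREATE TABLE row_counts (tbl TEXT PRIMARY KEY, n INTEGER);
INSERT INTO row_counts VALUES ('big', 0);
CREATE TRIGGER big_ins AFTER INSERT ON big
    BEGIN UPDATE row_counts SET n = n + 1 WHERE tbl = 'big'; END;
CREATE TRIGGER big_del AFTER DELETE ON big
    BEGIN UPDATE row_counts SET n = n - 1 WHERE tbl = 'big'; END;
""")
conn.executemany("INSERT INTO big(payload) VALUES (?)", [("x",)] * 1000)
conn.execute("DELETE FROM big WHERE id <= 10")

# O(1) lookup instead of scanning the whole table with SELECT COUNT(*).
n = conn.execute("SELECT n FROM row_counts WHERE tbl = 'big'").fetchone()[0]
print(n)  # 990
```

The trade-off is a small write overhead per insert/delete in exchange for instant, exact counts for the report's pagination.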
0
votes
3 answers
Strategies for writing expanding ordered files to disk
I am a graduate student in nuclear physics currently working on a data analysis program. The data consist of billions of multidimensional points.
Anyway, I am using space-filling curves to map the multiple dimensions to a single dimension, and I am…

James Matta
- 1,562
- 16
- 37
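For readers following the space-filling-curve approach mentioned above, the simplest such mapping is the Z-order (Morton) curve, which interleaves the bits of the coordinates so that nearby points tend to get nearby keys; sorting by the key then clusters the multidimensional data into a single writable order. A minimal 3D sketch (bit width and point values are arbitrary):

```python
def morton_key_3d(x, y, z, bits=10):
    """Interleave the bits of three coordinates into one Z-order key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)       # x occupies bit positions 0,3,6,...
        key |= ((y >> i) & 1) << (3 * i + 1)   # y occupies positions 1,4,7,...
        key |= ((z >> i) & 1) << (3 * i + 2)   # z occupies positions 2,5,8,...
    return key

# Sorting by the Morton key gives the on-disk write order.
pts = [(3, 1, 2), (0, 0, 0), (1, 1, 1)]
print(sorted(pts, key=lambda p: morton_key_3d(*p)))
# [(0, 0, 0), (1, 1, 1), (3, 1, 2)]
```

For billions of points the same keys also make external merge sorting natural: sort each in-memory chunk by key, spill it, then merge the sorted runs.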