Questions tagged [large-files]

Large files, whether binary or text, can sometimes be problematic even for an experienced programmer. Use this tag for issues that arise when opening or writing large files in a text editor, managing resources that run to gigabytes, or making strategic decisions about handling large amounts of data.

Think about how Notepad slows down appreciably when working with files that are hundreds of megabytes in size or larger. Some form of strategy is needed to work around such resource constraints, especially now that collecting large amounts of data is so easy.

Processing large amounts of text can also become a bottleneck when a lot of work has to be done on each record. Including this tag can also help draw out suggestions for optimising such code.
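
A minimal sketch of that strategy in Python, assuming a hypothetical log file and search term: iterate over the file instead of loading it, so memory use stays roughly flat regardless of file size.

```python
# Minimal sketch: process a multi-gigabyte text file without loading it into
# memory. The path and the per-line work are placeholders.

def count_matching_lines(path, needle):
    """Stream the file line by line; memory use stays roughly constant."""
    matches = 0
    with open(path, "r", encoding="utf-8", errors="replace") as handle:
        for line in handle:          # the file object iterates lazily
            if needle in line:
                matches += 1
    return matches

if __name__ == "__main__":
    print(count_matching_lines("big.log", "ERROR"))  # hypothetical inputs
```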

1690 questions
0
votes
1 answer

read large json file

I have a json file with complex structure. {"Objects":{"items":{"item":[ { "field1": "value1", "field2": "value2", "field3":[ { "label1":"1", "label2":"2" }, { "label1":"3", "label2":"4" …
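
For a document shaped like the one above, a streaming parser avoids holding the whole file in memory. A minimal sketch using the third-party ijson package (pip install ijson); the file name is a placeholder:

```python
# Sketch: walk the array from the question without reading the whole
# JSON document into memory.
import ijson

with open("data.json", "rb") as handle:              # hypothetical file name
    # For {"Objects": {"items": {"item": [ ... ]}}}, each array element
    # is reachable at the prefix "Objects.items.item.item".
    for record in ijson.items(handle, "Objects.items.item.item"):
        print(record.get("field1"), len(record.get("field3", [])))
```
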
0
votes
1 answer

Split a long file (on stdout) according to a pattern and input that into a loop

I have a very long file (yes, this is DNA in fasta format) that is actually a batch of several files patched together, output on the stdout. E.g.: >id1 ACGT >id2 GTAC = >id3 ACGT = >id4 ACCGT >id6 AACCGT I want to split this stream according to a…
Lionel Guy
  • 13
  • 4
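
The question asks for a shell solution, but the splitting logic is easy to see in a short Python sketch that treats a lone "=" line as the batch separator (an assumption taken from the excerpt):

```python
# Sketch: split a fasta-like stream on separator lines and handle each batch
# in a loop. stdin is read lazily, so arbitrarily long input is fine.
import sys

def batches(stream, separator="="):
    chunk = []
    for line in stream:
        if line.strip() == separator:
            if chunk:
                yield chunk
            chunk = []
        else:
            chunk.append(line)
    if chunk:                      # last batch has no trailing separator
        yield chunk

if __name__ == "__main__":
    for i, chunk in enumerate(batches(sys.stdin), start=1):
        sys.stdout.write(f"--- batch {i}: {len(chunk)} lines\n")
```
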
0
votes
1 answer

How to extract specific lines from a huge data file?

I have a very large data file, about 32 GB. The file is made up of about 130k lines, each of which mainly contains numbers, but also has a few characters. The task I need to perform is very clear: I have to extract 20 lines and write them to a new text…
Luca
  • 57
  • 1
  • 10
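
A minimal sketch of the usual approach: stream the file once and keep only the lines whose numbers are wanted. The path and line numbers below are placeholders.

```python
# Sketch: pull a handful of known line numbers out of a huge file by
# streaming it once; only the wanted lines are ever held in memory.
def extract_lines(path, wanted, out_path):
    wanted = set(wanted)                         # 1-based line numbers
    with open(path, "r", errors="replace") as src, open(out_path, "w") as dst:
        for number, line in enumerate(src, start=1):
            if number in wanted:
                dst.write(line)
                wanted.discard(number)
                if not wanted:                   # stop once every line is found
                    break

# Hypothetical usage: copy a few chosen lines from the big file.
extract_lines("huge.dat", [3, 17, 129500], "extracted.txt")
```
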
0
votes
1 answer

Download a Large file Async in ASP.NET C#

I have the code below which works well for small files but for large files it generates the zip as required but doesn't download it. I get all sorts of errors including Timeout (which I have managed to resolve). The other problem is that it runs in…
RealSollyM
  • 1,530
  • 1
  • 22
  • 35
0
votes
3 answers

How to quickly zip large files in PHP

I wrote a PHP script to dynamically pack files selected by the client into a zip file and force a download. It works well except that when the number of files is huge (like over 50000), it takes a very long time for the download dialog box to appear…
user371254
  • 1
  • 1
  • 2
  • 4
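
The question is about PHP, but the general tactics are language-agnostic: add files to the archive one at a time rather than building it in memory, and skip compression when a fast bundle matters more than a small one. A Python sketch with placeholder file names:

```python
# Sketch: build the archive entry by entry and store files uncompressed
# (ZIP_STORED) so CPU time is spent on I/O only.
import zipfile

def bundle(paths, archive_path):
    with zipfile.ZipFile(archive_path, "w", compression=zipfile.ZIP_STORED) as archive:
        for path in paths:
            archive.write(path)    # streams each file from disk into the zip

bundle(["a.bin", "b.bin"], "bundle.zip")   # hypothetical inputs
```
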
0
votes
4 answers

Sorting huge files with millions of lines

I have tens of millions of strings in a text file like these: aa kk bb mm cc tt ee ff aa xx bb ss cc gg ee rr And I want to make them look like: aa kk,xx bb mm,ss cc tt,gg ee ff,rr I have tried to sort and rearrange it with grep, sed and other…
sflk
  • 1
  • 1
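
The usual recipe for data that does not fit in memory is to sort on the key first (the Unix sort utility does an external sort, so it copes with files larger than RAM) and then merge adjacent lines in one streaming pass. A Python sketch of the merge step, assuming its input is already sorted:

```python
# Sketch: merge values that share a key, reading already-sorted lines from
# stdin, e.g.  sort input.txt | python merge.py
import itertools
import sys

def merge_sorted(stream, out):
    rows = (line.split(None, 1) for line in stream if line.strip())
    for key, group in itertools.groupby(rows, key=lambda row: row[0]):
        values = ",".join(value.strip() for _, value in group)
        out.write(f"{key} {values}\n")

if __name__ == "__main__":
    merge_sorted(sys.stdin, sys.stdout)
```
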
0
votes
1 answer

Transferring large application to Android Wear through Android Studio

I am developing a large application for Android Wear through Android Studio (~200 MB). Trying to test the application on my LG G Watch R through "Debugging over Bluetooth" is taking a lot of time to send the large app to the Watch. Are there any…
charbelfa
  • 171
  • 1
  • 1
  • 8
0
votes
1 answer

long text file to SAS Dataset

I am trying to load a large text file (report) as a single cell in a SAS dataset, but because of multiple spaces and formatting the data is getting split into multiple cells. Data l1.MD; infile 'E:\Sasfile\f1.txt' truncover; input text $char50.…
Akshata T
  • 37
  • 5
0
votes
1 answer

Is there a mercurial command which can generate a clone without largefiles?

Since I believe there is no way to strip largefiles out of a repository, I'm looking for a way to either: clone to (create) a new repo that contains at least all the same files, even without history (export tip revision only) if necessary, deleting…
Warren P
  • 65,725
  • 40
  • 181
  • 316
0
votes
2 answers

PHP read a large file line by line and string replace

I'd like to read a large file line by line, perform string replacement and save changes into the file, IOW rewriting 1 line at a time. Is there any simple solution in PHP/ Unix? The easiest way that came on my mind would be to write the lines into a…
user965748
  • 2,227
  • 4
  • 22
  • 30
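
The question targets PHP, but the safe pattern is the same in any language: write the rewritten lines to a temporary file in the same directory and swap it into place at the end. A Python sketch with placeholder inputs:

```python
# Sketch: rewrite a large file one line at a time via a temporary file,
# then atomically replace the original.
import os
import tempfile

def replace_in_file(path, old, new):
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory)
    try:
        with open(path, "r", errors="replace") as src, os.fdopen(fd, "w") as dst:
            for line in src:
                dst.write(line.replace(old, new))
        os.replace(tmp_path, path)        # atomic on the same filesystem
    except BaseException:
        os.unlink(tmp_path)
        raise

replace_in_file("big.txt", "foo", "bar")  # hypothetical inputs
```
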
0
votes
0 answers

Import large SQL dump file using sqlcmd

In Windows, I am trying to import a large SQL dump file (4.6 GB) into SQL Server 2008. Since it's a large file, I used sqlcmd -S -i C:\.sql -o C:\
snowmonkey
  • 287
  • 1
  • 5
  • 11
0
votes
1 answer

Fastest way to seek a word in very large binary files?

I am writing a function in an Android application which can seek a keyword in many large binary files. Currently, I am using the "grep" command to check whether the keyword exists in each file. Because there are many LARGE binary files, I am facing time…
Long Uni
  • 101
  • 3
  • 12
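
One language-agnostic approach is to scan each file in fixed-size chunks while keeping a small overlap, so a match that straddles two chunks is still found. A Python sketch with placeholder inputs:

```python
# Sketch: search a large binary file for a byte pattern in 1 MiB chunks,
# carrying len(needle) - 1 bytes of overlap between reads.
def contains(path, needle, chunk_size=1 << 20):
    keep = len(needle) - 1
    tail = b""
    with open(path, "rb") as handle:
        while True:
            block = handle.read(chunk_size)
            if not block:
                return False
            if needle in tail + block:
                return True
            tail = block[-keep:] if keep else b""

print(contains("firmware.bin", b"keyword"))   # hypothetical inputs
```
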
0
votes
1 answer

Uploading a large plugin to Microsoft Dynamics CRM

For a little background, I am trying to reference our own unsigned libraries as well as some third party ones in a CRM 2011 Plugin. I am registering it to the database and using the following ILMerge…
Corey
  • 466
  • 1
  • 5
  • 17
0
votes
0 answers

C# large switch/case block slow on first call?

For a C# simulation software module (3D mesh), I have a switch/case block consisting of 3^6 cases (3 conditions for 6 neighbours each). Along with the content, which is roughly the same size for each block, it results in about 9000 LoC. Given the…
Brokenmind
  • 41
  • 3
0
votes
0 answers

Write large files to disk in Objective C

I have a large file that I receive through a web service that I am trying to write to disk. I can write most files to disk, but this and other files of this size (or greater) seem to be causing issues. The app will crash when I try to convert the…
user3002092
  • 495
  • 2
  • 11
  • 29
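
The question concerns Objective-C, but the underlying fix is general: stream the response to disk in chunks instead of materialising the whole payload in memory first. A Python sketch with a placeholder URL and file name:

```python
# Sketch: copy a large download straight to disk in fixed-size chunks so the
# full payload never has to fit in memory.
import shutil
import urllib.request

def download(url, destination, chunk_size=1 << 20):
    with urllib.request.urlopen(url) as response, open(destination, "wb") as out:
        shutil.copyfileobj(response, out, length=chunk_size)

download("https://example.com/big.bin", "big.bin")   # hypothetical inputs
```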