For work I am tasked with finding out how many times a certain sequence of characters is used in a string. The files are all 1 GB+ in size and there are anywhere from 4 to 15 of them. For example: count every occurrence of "cat" in every file, including where it appears inside a longer word like "catastrophe".
In theory (at least to me) I would open one text file, look for the match line by line, and at the end of the file release it and move on to the next one... until all files have been searched.
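Here is a rough sketch of what I have in mind, assuming plain-text files and a simple non-overlapping ordinal substring count (the file names and search term below are just placeholders):

```csharp
using System;
using System.IO;

class OccurrenceCounter
{
    // Count non-overlapping occurrences of `needle` in `line` using ordinal comparison.
    static int CountInLine(string line, string needle)
    {
        int count = 0, index = 0;
        while ((index = line.IndexOf(needle, index, StringComparison.Ordinal)) >= 0)
        {
            count++;
            index += needle.Length;
        }
        return count;
    }

    static void Main()
    {
        string needle = "cat";                         // the character sequence to search for
        string[] files = { "file1.txt", "file2.txt" }; // hypothetical file names

        long total = 0;
        foreach (string path in files)
        {
            long fileTotal = 0;
            // StreamReader reads one line at a time, so only the current line
            // is held in memory rather than the whole 1 GB file.
            using (var reader = new StreamReader(path))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    fileTotal += CountInLine(line, needle);
                }
            }
            Console.WriteLine($"{path}: {fileTotal}");
            total += fileTotal;
        }
        Console.WriteLine($"Total: {total}");
    }
}
```

I don't know if reading line by line like this is actually the fastest approach for files this size, or whether reading bigger chunks (or the whole file) would be better.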
For the last few years I have mostly been writing scripts to automate tasks, and I have been out of the coding game long enough that I don't remember, or maybe never knew, the most efficient and fastest way to do this.
When I say speed I mean the elapsed time of the program, not how long it will take me to write it.
I would like to do this in C# because I am trying to get more comfortable with the language but really I could do it in any language. Preferably not assembly ...that was a joke...