
I apologize if this doesn't make sense. I am not sure what to Google.

Let's say I have two arrays:

string a_1[16];
string a_2[20];

I need to output these to a file with a function: first a_1[0] to a_1[n], then the a_2's.

It's also possible to run the function again to append more a_1's and a_2's to the output file.

So the format will be:

//output_file.txt

a_1[0].....a_1[n]
a_2[0].....a_2[M]
a_1[n+1]...a_1[15]
a_2[M+1]...a_2[19]

My question is: is there a way to read output_file.txt back so that it reads all of the a_1's back in order, a_1[0] to a_1[15], and then reads in a_2[0] to a_2[19]?

Maybe just put "something" between each group, so that when "something" is read, the function knows to stop reading a_1's and switch to reading a_2's?

    The answer to your question is "yes". If you need more details, you should provide what you have tried and indicate where you are stuck and feel there is a problem. – R Sahu Mar 09 '16 at 04:07

1 Answer


What the OP calls "something" is typically called a sentinel or canary value. To be used as a sentinel, you have to find a pattern that cannot exist in the data stream. This is hard because pretty much anything can be in a string. If you use, say, "XxXxXx" as your sentinel, then you have to be very careful that it is never written to the file.
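For illustration, here is a minimal sketch of the sentinel approach; the helper name read_group and the default sentinel are my own choices, and the whole thing breaks the moment a data string matches the sentinel:

#include <istream>
#include <string>
#include <vector>

// Read lines until the sentinel (or end of file) is hit.
// Assumes no data string is ever equal to the sentinel.
std::vector<std::string> read_group(std::istream & in,
                                    const std::string & sentinel = "XxXxXx")
{
    std::vector<std::string> group;
    std::string line;
    while (std::getline(in, line) && line != sentinel)
    {
        group.push_back(line);
    }
    return group;
}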

The concept of escape characters (look it up) can be used here, but a better approach is to store a count of the stored strings at the beginning of each group. Consider an output file that looks like this:

4
string a1_1
string a1_2
string a1_3
string a1_4
2
string a2_1
string a2_2

Read the count, four, and then read count strings; then read the next count and then read count more strings.

OK, so you're thinking this sucks: I can't just insert a new string into a1 without also changing the number at the front of the file.

Well, good luck with inserting data into the middle of a file without totally smurfing up the file. It can be done, but only after moving everything after the insertion over by the size of the insertion, and that's not as trivial as it sounds. At the point in a programming career where this is the sort of task to which you are assigned, and you have to ask for help, you are pretty much doomed to reading the file into memory, inserting the new values, and writing the file back out again, so just go with it.

So what does this look like in code? First we ditch the arrays in favour of std::vector. Vectors are smart. They grow to fit. They know how much stuff is in them. They look after themselves so there is no unnecessary new and delete nonsense. You gotta be stupid not to use them.

Reading:

#include <cstddef>
#include <fstream>
#include <limits>
#include <string>
#include <vector>

std::ifstream infile("output_file.txt");
std::vector<std::string> input;
std::size_t count; // unsigned, so it compares cleanly with input.size() below
if (infile >> count)
{
    infile.ignore(std::numeric_limits<std::streamsize>::max(), '\n'); // discard the rest of the count line
    std::string line;
    while (input.size() < count && std::getline(infile, line))
    {
        input.push_back(line);
    }
    if (input.size() != count)
    {
        // handle bad file
    }
}
else
{
    // handle bad file
}

Writing:

#include <fstream>
#include <string>
#include <vector>

std::ofstream outfile("output_file.txt");
std::vector<std::string> output; // the strings to write, filled in elsewhere
if (outfile << output.size() << '\n') // the count goes on its own line
{
    for (const std::string & out : output)
    {
        if (!(outfile << out << '\n')) // note the parentheses; !outfile << out is a precedence bug
        {
            // handle write error
        }
    }
}
else
{
    // handle write error
}
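Putting the two together for the question's a_1/a_2 layout, one possible round trip looks like this. The helper names and test data are my own invention, not anything standard:

#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

// Write one count-prefixed group: the count on its own line, then the strings.
void write_counted(std::ostream & out, const std::vector<std::string> & group)
{
    out << group.size() << '\n';
    for (const std::string & s : group)
    {
        out << s << '\n';
    }
}

// Read one count-prefixed group back. Returns what it got; the caller checks the size.
std::vector<std::string> read_counted(std::istream & in)
{
    std::vector<std::string> group;
    std::size_t count;
    if (in >> count)
    {
        in.ignore(); // discard the rest of the count line
        std::string line;
        while (group.size() < count && std::getline(in, line))
        {
            group.push_back(line);
        }
    }
    return group;
}

int main()
{
    std::vector<std::string> a1{ "alpha", "beta" };
    std::vector<std::string> a2{ "gamma" };
    {
        std::ofstream outfile("output_file.txt");
        write_counted(outfile, a1); // first group: the a_1's
        write_counted(outfile, a2); // then the a_2's
    } // outfile closes and flushes here
    std::ifstream infile("output_file.txt");
    std::vector<std::string> a1_in = read_counted(infile); // all the a_1's, in order
    std::vector<std::string> a2_in = read_counted(infile); // then all the a_2's
    // to insert into the middle: modify in memory, then rewrite the whole file
    a1_in.insert(a1_in.begin() + 1, "delta");
}

Because every group carries its own count, reading back simply alternates: one read_counted call per group, in the order the groups were written.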

But this looks like homework, so the OP is probably not allowed to use std::vector. In that case, the logic is the same, but you have to write

std::unique_ptr<std::string[]> inarray(new std::string[count]);

or

std::string * inarray = new std::string[count];

to allocate storage for the strings you are reading in. The second one looks like less work than the first. Looks are deceiving. The first one looks after your memory for you. The second requires at least one delete[] in your code at the right place to put the memory away. Miss it and you have a memory leak.

You also need to have a variable keeping track of the size of the array, because pointers don't know how big whatever they are pointing at is. This makes the write for loop less convenient.
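Put together, the array version of the reading code might look like the sketch below, with stored doing the bookkeeping the pointer cannot:

#include <cstddef>
#include <fstream>
#include <memory>
#include <string>

int main()
{
    std::ifstream infile("output_file.txt");
    std::size_t count;
    if (infile >> count)
    {
        infile.ignore(); // discard the rest of the count line
        std::unique_ptr<std::string[]> inarray(new std::string[count]);
        std::size_t stored = 0; // the pointer cannot tell us how many are filled
        std::string line;
        while (stored < count && std::getline(infile, line))
        {
            inarray[stored++] = line;
        }
        // when writing back out, loop with an index: for (std::size_t i = 0; i < stored; ++i)
    }
}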
