This code, which matches a string against an NFA and which I think uses O(N^2) memory, predictably crashes when the string size reaches 20,000 and the code is compiled without optimization. It then works when compiled with -O2, but breaks again with -O3. Compilation was done with -std=c++14 enabled. In my opinion, the problem is a stack overflow.
The input string was "ab" repeated 10,000 times, plus a 'c' at the end. The image below contains the NFA I'm trying to match against.
Specifically, my questions are:

1) What -O2 optimization is behind this (in my view impressive) fix?

2) And what -O3 optimization breaks it again?
#include <map>
#include <string>
#include <vector>
using namespace std;

struct State {
    map<char, vector<State*>> transitions;
    bool accepting = false;
};

bool match(State* state, string inp) {
    if (inp == "") return state->accepting;
    for (auto s : state->transitions[inp[0]])
        if (match(s, inp.substr(1))) return true;  // consume one character
    for (auto s : state->transitions['|'])         // epsilon transitions
        if (match(s, inp)) return true;
    return false;
}
The GCC documentation says that -O3 enables all optimizations of -O2, plus some more. I couldn't understand some of those extras, or their relevance to this problem. And I want to emphasize, given what I've seen in similar questions, that I'm not looking for specific ways to fix this problem.