An answer to this question cites page 340 of "Text Algorithms" by Crochemore and Rytter for a linear-time algorithm to compute the period of a string. However, that algorithm is quite complex, and the following, adapted from the maximal-suffix computation used in the Two-Way algorithm (by Crochemore and Perrin), seems correct for computing the period, i.e. the smallest p >= 1 such that x[i] == x[i+p] whenever both indices are in range:
size_t period_of(const char *x)
{
    /* p is the current candidate period; the suffix starting at j is
     * being matched against the prefix, with k characters matched. */
    size_t j = 1, k = 0, p = 1;
    while (x[j+k]) {
        if (x[j+k] != x[k]) {
            /* mismatch: move the candidate start to the mismatch
             * position (or forward by 1 if nothing matched) and take
             * that offset as the new candidate period */
            j += k ? k : 1; // Previously: j += k+1;
            k = 0;
            p = j;
        } else if (k != p) {
            /* still inside the current candidate period; extend the match */
            k++;
        } else {
            /* matched a full candidate period: slide forward by p and
             * restart the comparison against the prefix */
            j += p;
            k = 0;
        }
    }
    return p;
}
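For testing, a brute-force cross-check along the following lines can be used (a minimal sketch: naive_period and the random driver are my own, not from either reference, and it assumes period_of from above is in scope). The reference implementation checks each candidate p directly against the definition of a period:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Naive O(n^2) reference: the period is the smallest p >= 1 such that
 * x[i] == x[i+p] whenever both indices are in range. */
static size_t naive_period(const char *x)
{
    size_t n = strlen(x);
    for (size_t p = 1; p < n; p++) {
        size_t i = 0;
        while (i + p < n && x[i] == x[i+p])
            i++;
        if (i + p == n)
            return p; /* every comparable pair matched */
    }
    return n ? n : 1; /* a string with no shorter period has period n */
}

int main(void)
{
    char buf[12];
    for (int t = 0; t < 1000000; t++) {
        /* short random binary strings hit small-period cases densely */
        int len = 1 + rand() % 10;
        for (int i = 0; i < len; i++)
            buf[i] = 'a' + rand() % 2;
        buf[len] = '\0';
        if (period_of(buf) != naive_period(buf)) {
            printf("mismatch on \"%s\": got %zu, want %zu\n",
                   buf, period_of(buf), naive_period(buf));
            return 1;
        }
    }
    puts("no mismatch found");
    return 0;
}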
Their version in Two-Way, from which this is adapted, computes the period of the maximal suffix as a side effect of computing that suffix. However, unless I'm missing something, the validity of the logic does not seem to depend on the maximal-suffix property.
Is the above correct? If not, can you provide a counterexample that shows where it fails?