A sensor (camera) discretizes an observed continuous scene into distinct pixels.
The troublesome image sources you cite, LCD / OLED digital displays and CRTs, discretize the scene into their own pixels,
so we have double discretization,
and the two sampling grids beat against one another.
This accounts for the annoying moiré patterns visible in your example image.
Apparently you wish to detect such patterns. Consider using an FFT.
Pick a segment length L, and a segment center location.
Then rotate a line through several angles, reading pixels from the segment and handing them to the FFT.
A moiré pattern will exhibit periodicity for at least one angle.
And nearby segment centers, at the same angle, will exhibit similar periodicity.
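Here is a minimal sketch of that segment-and-rotate idea in Python, assuming a grayscale image as a 2-D numpy array; the function names, segment length, angle step, and peak-to-median threshold are all my own guesses to tune, not a prescription.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def segment_spectrum(img, center, angle_deg, length=128):
        """Sample `length` pixels along a line through `center` (row, col)
        at `angle_deg`, and return the magnitude spectrum of that profile."""
        theta = np.deg2rad(angle_deg)
        t = np.linspace(-length / 2, length / 2, length)
        rows = center[0] + t * np.sin(theta)
        cols = center[1] + t * np.cos(theta)
        profile = map_coordinates(img, [rows, cols], order=1, mode='reflect')
        profile = profile - profile.mean()              # drop the DC term
        return np.abs(np.fft.rfft(profile * np.hanning(length)))

    def looks_periodic(img, center, length=128, ratio=8.0):
        """Crude moire test: does any angle show one spectral peak that
        dominates the median bin by `ratio`?"""
        for angle in range(0, 180, 10):
            spec = segment_spectrum(img, center, angle, length)
            if spec[1:].max() > ratio * np.median(spec[1:]):
                return True
        return False

A genuine moiré hit should also have nearby segment centers, at the same angle, agreeing on the dominant frequency, which is a useful sanity check against one-off scene texture.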
Perhaps this is a solved problem?
https://github.com/AmadeusITGroup/Moire-Pattern-Detection
https://github.com/AmadeusITGroup/Moire-Pattern-Detection/blob/master/src/positiveImages/355_letterbox1024.jpg
But perhaps "rejecting" isn't quite your true goal at all.
Tell us about the failure mode, about what goes wrong when your algorithms encounter images from troublesome sources.
Could we maybe deal with the discretization by running a light Gaussian blur over the image?
Possibly followed by unsharp masking?
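A hedged sketch of that blur-then-sharpen pass, again with scipy; the sigmas and the sharpening amount are placeholders you would tune on your own images.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def soften_then_sharpen(img, blur_sigma=1.0, mask_sigma=2.0, amount=0.7):
        """Light Gaussian blur to suppress pixel-grid beating, followed by
        a simple unsharp mask to restore some edge contrast."""
        img = img.astype(np.float64)
        blurred = gaussian_filter(img, sigma=blur_sigma)
        low_pass = gaussian_filter(blurred, sigma=mask_sigma)
        sharpened = blurred + amount * (blurred - low_pass)
        return np.clip(sharpened, 0, 255)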
You didn't mention whether you have any control over how the images are acquired.
Obtaining a pair of images of the "same scene",
separated in time by a second or so,
would offer you lots more information.
If the sensor is tripod mounted,
consider moving it a millimeter or so
before snapping the second image.
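One way to spend that extra information (my assumption about what the pair buys you, not something you stated): scene content should be nearly identical across the two exposures, while the moiré beat pattern shifts markedly with even a millimeter of sensor motion, so a simple difference of roughly registered frames tends to surface it.

    import numpy as np

    def change_score(frame_a, frame_b):
        """Crude score for how much signal changed between two exposures
        of the same scene. Stable content cancels in the difference; a
        moire pattern that shifted between shots does not."""
        diff = frame_a.astype(np.float64) - frame_b.astype(np.float64)
        diff -= diff.mean()                  # ignore global exposure drift
        return diff.std() / (frame_a.std() + 1e-9)

A score near zero would suggest an ordinary photographic subject; a large score, with the residual itself showing stripes, would tend to point at a screen being photographed.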