Plausible reflex rates for limited functions. But won't replace keyboard and mouse.
A real answer could only come if someone tests the product in a laboratory and writes a solid review. But here's my educated guess:
Many non-invasive brain-computer interface (BCI) systems are based on measuring P300 signals with EEG; the P300 is an event-related potential that shows up roughly 200-600 ms after a stimulus. So even if that window gave you enough information, and your feature-processing algorithms were efficient enough, you're not going to do much better than about one classification per second -- that is, roughly 60 characters, or a dozen words, per minute. And the current state of the art is well below that ceiling: in these three papers, subjects select only about 4 or 5 characters per minute with P300-based BCI, or about 25 bits per minute.
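As a sanity check on those two figures, here's a back-of-the-envelope calculation. Assuming the classic 6x6 P300 speller grid (36 symbols -- an assumption on my part, since the papers' exact setups vary), 5 selections per minute works out to about 25 bits per minute, so the numbers are consistent:

```python
import math

# Assumption: a standard 6x6 P300 speller matrix (A-Z plus 0-9).
symbols = 36
selections_per_min = 5

bits_per_selection = math.log2(symbols)          # ~5.2 bits per pick
bits_per_min = selections_per_min * bits_per_selection

print(f"{bits_per_min:.1f} bits/min")            # ~25.8, matching the papers' ~25
```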
However, the rates they claim for "reflexes" are plausible, because the OCZ NIA is not really a brainwave reader: it is based not primarily on electroencephalography (EEG) but on electromyography (EMG), which measures slight muscle movements.
EMG is frequently used in prosthetic limbs, and the signal is trivial to process compared to brainwaves. For instance, this review describes using wavelet transforms to preprocess the data (read: efficient) and neural networks to classify motor commands. I've also seen talks that describe doing the classification with Linear Discriminant Analysis (read: easy peasy).
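To make the "easy peasy" part concrete, here's a minimal sketch of two-class LDA on EMG-like data. Everything here is synthetic and invented for illustration -- the real device's channels and features are unknown -- but it shows why the classification is cheap: on a single RMS-amplitude feature, Fisher LDA collapses to comparing a projected value against a threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Crude but classic EMG feature: RMS amplitude of a short window.
# "Rest" windows are low-amplitude noise; "twitch" windows are a burst.
def rms_feature(window):
    return np.sqrt(np.mean(window ** 2))

rest = np.array([rms_feature(rng.normal(0, 1.0, 100)) for _ in range(200)])
twitch = np.array([rms_feature(rng.normal(0, 3.0, 100)) for _ in range(200)])

# Two-class Fisher LDA on a 1-D feature: project along
# w = (mu1 - mu0) / pooled variance, then (with equal priors and
# roughly equal variances) cut halfway between the projected means.
mu0, mu1 = rest.mean(), twitch.mean()
pooled_var = 0.5 * (rest.var() + twitch.var())
w = (mu1 - mu0) / pooled_var
threshold = w * 0.5 * (mu0 + mu1)

def classify(window):
    """Return 1 for 'twitch', 0 for 'rest'."""
    return int(w * rms_feature(window) > threshold)

# Held-out check on fresh synthetic windows.
test_rest = [classify(rng.normal(0, 1.0, 100)) for _ in range(100)]
test_twitch = [classify(rng.normal(0, 3.0, 100)) for _ in range(100)]
accuracy = (test_rest.count(0) + test_twitch.count(1)) / 200
print(f"held-out accuracy: {accuracy:.2f}")
```

The whole "training" step is a handful of means and variances, and classification is one multiply and one compare per window -- which is why EMG interfaces can react at reflex-like speeds where EEG classification cannot.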
So they could be spot-on with their claim, if we assume that twitching a facial muscle takes less time than moving a mouse. But it's more like strapping an extra button to your forehead and saying "do this when I twitch my eyebrow" than having it read your brainwaves and predict what you're trying to do. The romance is all but gone: with those limitations, we're not eliminating the keyboard and mouse any time soon!