EDIT (Unity 5.6 and GVR SDK 1.3):
In the latest SDK you can simply put Image Effects on the main camera. They will not be visible in Unity's play mode, though, because the editor preview still relies on two separate cameras created by the SDK. A device build uses Unity's native VR rendering and applies the Image Effects correctly. But be careful: some Image Effects cause visual artifacts or flickering (see the links below). A minimal example of an image effect on the main camera follows the links.
(https://forum.unity3d.com/threads/image-effects-with-daydream-work-but-lead-to-flickering.443478/)
(https://github.com/googlevr/gvr-unity-sdk/issues/448)
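For reference, this is roughly what such a main-camera image effect looks like. It is only a minimal sketch: the shader name "Hidden/SimpleTint" and its "_Amount" property are hypothetical placeholders for whatever effect shader you actually use.

    using UnityEngine;

    // Minimal image effect attached to the main camera. With Unity 5.6's native
    // VR rendering this runs per eye in a device build, but (as noted above) the
    // result is not visible in the editor's play mode.
    [RequireComponent(typeof(Camera))]
    public class SimpleTintEffect : MonoBehaviour
    {
        public Shader tintShader;                   // hypothetical "Hidden/SimpleTint" shader
        [Range(0f, 1f)] public float amount = 0.5f;

        private Material material;

        void OnEnable()
        {
            if (tintShader != null)
                material = new Material(tintShader);
        }

        // Unity calls this after the camera has rendered; src holds the camera
        // image, dst is where the processed result must be written.
        void OnRenderImage(RenderTexture src, RenderTexture dst)
        {
            if (material == null)
            {
                Graphics.Blit(src, dst);            // no shader assigned: pass through
                return;
            }
            material.SetFloat("_Amount", amount);   // hypothetical shader property
            Graphics.Blit(src, dst, material);
        }
    }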
ORIGINAL ANSWER
Turning off the Direct Render checkbox on the StereoController of the main camera solves the problem. Also make sure to replicate the ImageEffects on both eye cameras and to keep them in the same order on each.
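If you prefer not to rely on the inspector, the same setting can be flipped at startup. This is only a sketch and assumes the Direct Render checkbox is backed by a public directRender field on the SDK's StereoController, as the inspector label suggests:

    using UnityEngine;

    // Hypothetical startup helper: disables Direct Render on the main camera's
    // StereoController instead of unchecking the box by hand. The image effects
    // themselves still have to be added to both eye cameras manually, in the
    // same order.
    public class DisableDirectRender : MonoBehaviour
    {
        void Awake()
        {
            var stereo = Camera.main.GetComponent<StereoController>();
            if (stereo != null)
                stereo.directRender = false;   // assumed field behind the "Direct Render" checkbox
        }
    }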
ImageEffects in Unity only work correctly if the camera using them draws to the full screen. With Cardboard (or VR in general) you have two cameras, each drawing only half the screen, which breaks the ImageEffects: one eye's image ends up stretched across both halves of the screen.
The problem can be solved by first drawing to a temporary texture that represents the full screen and afterwards splitting that texture between the two eyes. Keep in mind that this has a performance impact, on top of the cost of the ImageEffects themselves.
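A conceptual sketch of that idea (not the SDK's actual implementation): each eye camera renders into its own full-screen-sized RenderTexture, so image effects on those cameras see the size they expect, and a final step draws the two textures into the left and right halves of the screen. The leftEye/rightEye references and the OnGUI compositing are assumptions made for illustration:

    using UnityEngine;

    // Conceptual sketch only: render each eye into a full-screen-sized texture
    // (so image effects on the eye cameras behave), then split the results
    // across the two halves of the screen.
    public class StereoCompositor : MonoBehaviour
    {
        public Camera leftEye;    // hypothetical references to the two eye cameras
        public Camera rightEye;

        private RenderTexture leftTex;
        private RenderTexture rightTex;

        void Start()
        {
            // Full-screen-sized targets: the image effects now run at the size
            // and aspect ratio they assume.
            leftTex = new RenderTexture(Screen.width, Screen.height, 24);
            rightTex = new RenderTexture(Screen.width, Screen.height, 24);
            leftEye.targetTexture = leftTex;
            rightEye.targetTexture = rightTex;
        }

        // Runs after the eye cameras have rendered into their textures.
        void OnGUI()
        {
            if (Event.current.type != EventType.Repaint)
                return;
            float halfWidth = Screen.width * 0.5f;
            // The extra copies here (and the render textures above) are the
            // additional performance cost mentioned above.
            Graphics.DrawTexture(new Rect(0, 0, halfWidth, Screen.height), leftTex);
            Graphics.DrawTexture(new Rect(halfWidth, 0, halfWidth, Screen.height), rightTex);
        }
    }

In practice the SDK handles this compositing for you once Direct Render is off; the sketch is only meant to show where the temporary textures and the extra copy enter the picture.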
Further reference: https://developers.google.com/vr/unity/guide#deferred_rendering_and_image_effects