The "real" genie effect, as implemented in Mac OS X, is a non-linear transformation of the source image. One way to implement it is using Core Image filters (which is private / undocumented on iOS, but available on the Mac).
You write a filter with a 'time' parameter. For each value of the time parameter in a given interval (say [0, 2]), you figure out, for every destination coordinate, the corresponding source coordinate in the image. If that source coordinate is out of bounds, you set the alpha to zero; otherwise you return the input image's value at the source coordinate.
kernel vec4 ASGenieKernel(sampler src, float t, float D, float ytarget) {
    vec2 takeFrom;  // Source coordinate to sample for the current destination pixel.
    vec2 original = samplerCoord(src);
    vec2 size;
    float g, t2, a;
    vec4 c;

    size = samplerSize(src);

    // compare(x, y, z) returns y if x < 0 and z otherwise.
    t2 = compare(t-1.0, t, 1.0);   // min(t, 1.0): the squeeze ramps up while t is in [0,1].

    // The horizontal slide only kicks in while t is in [1,2].
    takeFrom.x = original.x + compare(t-1.0, 0.0, 1.0)*size.x*(t-1.0);

    // a is 1.0 only while 0 <= takeFrom.x < size.x.
    a = compare(takeFrom.x, 0.0, 1.0);
    a = compare(a-0.5, 0.0, compare(takeFrom.x-size.x, 1.0, 0.0));

    // Apply an envelope. This is where the non-linearity is introduced:
    // 1.57 ~ pi/2 and 0.78539 ~ pi/4, so the factor falls from 1 at the
    // left edge of the image to 0 at the right edge.
    t2 = t2 * (1.0 - tan(1.57*original.x/size.x - 0.78539))*0.5;

    // Squeeze the column vertically toward ytarget; at full squeeze it
    // occupies a band of height D.
    g = 1.0 - D / size.y;
    takeFrom.y = (original.y - t2*ytarget*g)/(1.0 - t2*g);

    // Also zero the alpha when takeFrom.y falls outside the image.
    a = compare(a-0.5, 0.0, compare(takeFrom.y, 0.0, 1.0));
    a = compare(a-0.5, 0.0, compare(takeFrom.y-size.y, 1.0, 0.0));

    // Clamp out-of-range coordinates to 0 so the sample stays in bounds
    // (alpha is already 0 for those pixels).
    takeFrom.x = compare(takeFrom.x, 0.0, takeFrom.x);
    takeFrom.x = compare(takeFrom.x-size.x, takeFrom.x, 0.0);
    takeFrom.y = compare(takeFrom.y, 0.0, takeFrom.y);
    takeFrom.y = compare(takeFrom.y-size.y, takeFrom.y, 0.0);

    c = sample(src, takeFrom);
    c.w = a;
    return c;
}
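To drive the kernel outside of Quartz Composer, you compile the kernel string and re-render the image for successive values of t. Here's a minimal Swift sketch of that host side; it's my own wrapper, not part of the original project, the function and parameter names are made up, and the CIKL source above is assumed to be passed in as kernelSource (string-based CIKernels are deprecated in favor of Metal on recent systems, but still compile):

import CoreImage

// Hypothetical helper (not from the original post): renders one frame of the
// genie animation for a given time t, opening height d, and target y coordinate.
func genieFrame(of image: CIImage, kernelSource: String,
                t: CGFloat, d: CGFloat, yTarget: CGFloat) -> CIImage? {
    // A real implementation would compile and cache the kernel once.
    guard let kernel = CIKernel(source: kernelSource) else { return nil }
    let extent = image.extent
    return kernel.apply(
        extent: extent,
        // The kernel reads arbitrary source coordinates, so the region of
        // interest is the whole input image.
        roiCallback: { _, _ in extent },
        arguments: [image, t, d, yTarget]
    )
}

Stepping t from 0 to 2 and drawing each returned frame gives the animation: the first half of the interval performs the vertical squeeze into the funnel shape, and the second half slides the squeezed image sideways until it disappears.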
I've got a blog post with some more details and a Quartz Composer project here: Genie Effect blog post