I have a method I use to render images from various views in my iOS application so that users can email screens. Most of the screens I capture render fine, but when I use this technique with a UIWebView I only get the visible portion of the screen; anything scrolled off screen is not included in the rendered image. I've been digging around here on Stack Overflow, but so far nothing I've found works.
Here is the method I currently use:
-(NSData *)getImageFromView:(UIView *)view
{
    NSData *pngImg;
    CGFloat max, scale = 1.0;
    CGSize size = [view bounds].size;

    // Scale down larger images, else we run into memory issues with the mail widget
    max = (size.width > size.height) ? size.width : size.height;
    if( max > 960 )
        scale = 960/max;

    UIGraphicsBeginImageContextWithOptions( size, YES, scale );
    CGContextRef context = UIGraphicsGetCurrentContext();
    [view.layer renderInContext:context];
    pngImg = UIImagePNGRepresentation( UIGraphicsGetImageFromCurrentImageContext() );
    UIGraphicsEndImageContext();

    return pngImg;
}
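
For reference, one approach that comes up repeatedly is to temporarily resize the web view to match its scroll view's contentSize before rendering, then restore the original frame afterwards. Below is a minimal sketch of that idea, not a confirmed fix; it assumes the page has finished loading so that contentSize reflects the full document (the scrollView property requires iOS 5+):

-(NSData *)getImageFromWebView:(UIWebView *)webView
{
    // Remember the original frame so it can be restored after rendering
    CGRect originalFrame = webView.frame;

    // Grow the web view to the full size of its content so nothing is clipped
    CGSize contentSize = webView.scrollView.contentSize;
    CGRect fullFrame = originalFrame;
    fullFrame.size = contentSize;
    webView.frame = fullFrame;

    // Render the now-fully-sized layer into an image context
    UIGraphicsBeginImageContextWithOptions( contentSize, YES, 1.0 );
    [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    NSData *pngImg = UIImagePNGRepresentation( UIGraphicsGetImageFromCurrentImageContext() );
    UIGraphicsEndImageContext();

    // Put the web view back the way it was
    webView.frame = originalFrame;
    return pngImg;
}

Presumably the same 960-point downscaling from the method above would still be needed here, since full page content can get very tall and the memory concern with the mail widget would only get worse.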