
I am upgrading my watch app from the first version of watchOS. In my first version I was layering UIImageViews on top of each other, flattening them off-screen into a single UIImage, encoding that as PNG data with UIImagePNGRepresentation() (which returns NSData), and transferring the data across to the watch. As we know, layout options on the Apple Watch are limited, so if you want cool blur effects behind images, or images on images, they have to be flattened off-screen.
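The off-screen flattening step described above can be sketched like this on the iPhone side. This is a minimal sketch, assuming two plain UIImages (`background` and `overlay`) rather than live UIImageViews; the names are illustrative:

```objc
#import <UIKit/UIKit.h>

// Flatten two images into one PNG blob suitable for transfer.
static NSData *FlattenedPNGData(UIImage *background, UIImage *overlay) {
    CGRect bounds = (CGRect){CGPointZero, background.size};
    // Opaque = NO, scale = 0.0 (use the device's screen scale).
    UIGraphicsBeginImageContextWithOptions(bounds.size, NO, 0.0);
    [background drawInRect:bounds];
    [overlay drawInRect:bounds]; // drawn on top; adjust rect/alpha as needed
    UIImage *flattened = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return UIImagePNGRepresentation(flattened); // already returns NSData
}
```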

Now that I have re-created my targets for watchOS 2, images transferred as NSData through [[WCSession defaultSession] sendMessage:replyHandler:errorHandler:] suddenly fail with an error saying the payload is too large!

So as far as I can see, I either have to work out how to combine images strictly via WatchKit APIs, or use the transferFile:metadata: option on WCSession and keep rendering them on the iPhone. The transferFile route sounds really slow and clumsy, since I would have to render the image, save it to disk on the iPhone, transfer it to the watch, and load it into something I can set on a WatchKit component.
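For reference, the transferFile route sketched as code, assuming the flattened PNG data already exists; `SendCompositePNG` and the metadata key are illustrative names, not part of any API:

```objc
#import <WatchConnectivity/WatchConnectivity.h>

// Write the rendered PNG to a temp file and queue it for transfer.
static void SendCompositePNG(NSData *pngData) {
    NSURL *tmpURL = [[NSURL fileURLWithPath:NSTemporaryDirectory()]
                     URLByAppendingPathComponent:@"composite.png"];
    [pngData writeToURL:tmpURL atomically:YES];
    // The transfer is queued and performed by the system in the background;
    // it does not complete before this call returns.
    [[WCSession defaultSession] transferFile:tmpURL
                                    metadata:@{@"kind": @"composite"}];
}
```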

Does anyone know how to merge images on the watch? QuartzCore doesn't seem to be available as a dependency in watch land.
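If the UIKit drawing functions (UIGraphicsBeginImageContextWithOptions and friends, which are Core Graphics based rather than QuartzCore/CALayer based) turn out to be usable in the watch extension, compositing on the watch itself might look like this. This is a speculative sketch under that assumption; all names are illustrative:

```objc
#import <UIKit/UIKit.h>

// Composite a badge image on top of a base image, entirely in the extension.
static UIImage *CompositeOnWatch(UIImage *base, UIImage *badge) {
    UIGraphicsBeginImageContextWithOptions(base.size, NO, 0.0);
    [base drawAtPoint:CGPointZero];
    [badge drawAtPoint:CGPointMake(8.0, 8.0)
             blendMode:kCGBlendModeNormal
                 alpha:1.0];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
```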

[Screenshot: composite image]

Mike S

1 Answer


Instead of sendMessage, use transferFile. Note that the actual transfer will happen in a background thread at a time that the system determines to be best.
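On the receiving (watch extension) side, the file arrives via the WCSessionDelegate callback. A minimal sketch, assuming the delegate is the interface controller and `self.imageView` is a hypothetical WKInterfaceImage outlet; note the system deletes the incoming file once the delegate method returns, so it must be moved first:

```objc
// WCSessionDelegate callback in the watch extension.
- (void)session:(WCSession *)session didReceiveFile:(WCSessionFile *)file {
    // Move the file before returning, or the system will delete it.
    NSURL *dest = [[NSURL fileURLWithPath:NSTemporaryDirectory()]
                   URLByAppendingPathComponent:file.fileURL.lastPathComponent];
    [[NSFileManager defaultManager] moveItemAtURL:file.fileURL
                                            toURL:dest
                                            error:NULL];
    dispatch_async(dispatch_get_main_queue(), ^{
        UIImage *image = [UIImage imageWithContentsOfFile:dest.path];
        // Hand the image to whichever interface element is on screen, if any.
        [self.imageView setImage:image];
    });
}
```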

Sorry, but I have no experience with manipulating images on the watch.

Drewf
  • transferFile greatly complicates my app. Using this technique, image files will just asynchronously arrive on my watch, and I have to determine whether the applicable view is even around, let alone find the place to slot the image in. These aren't gallery pictures; this is just composite UX artwork (see the screenshot I've added). Do you know of any way to create composite imagery using WatchKit APIs so I don't have to transfer? – Mike S Oct 14 '15 at 04:21