I'm using QWebView as a rendering layer for my application.
I would like to be able to buffer an image in memory and have the WebKit engine render the image progressively as it buffers, just as it would over an HTTP connection, by inserting an <img> tag.
A bit of Googling suggests that QWebView can access image content from a local file, an HTTP URL, or a Qt resource URL. My application uses none of these mechanisms; it receives and decodes the image data from a Unix socket stream.
What is the least-messy way I can get a buffered/buffering image to render in a QWebView img tag (e.g. without having to set up a localhost HTTP daemon or save files to disk first)?