Although your question isn't really about programming per se, I'll humour it.
In situations like this it is always best to build something that works first, and, if speed turns out to be an issue in practice, to pin down exactly what that issue is by timing things accurately. There are many posts on Stack Overflow that deal with timing, so I won't go into depth here; it's worth researching on its own.
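As a minimal sketch of that kind of timing, the High Resolution Time API gives you sub-millisecond timestamps in browsers and recent Node versions (the `timeIt` helper below is my own invention, not a standard API):

```javascript
// Time an arbitrary function with performance.now().
// timeIt is a hypothetical helper, shown only to illustrate the idea.
function timeIt(label, fn) {
  const start = performance.now();
  const result = fn();
  const elapsedMs = performance.now() - start;
  console.log(`${label}: ${elapsedMs.toFixed(1)} ms`);
  return { result, elapsedMs };
}

// Illustrative use: time a synthetic chunk of work.
const timing = timeIt("busy loop", () => {
  let sum = 0;
  for (let i = 0; i < 1e6; i++) sum += i;
  return sum;
});
```

Wrapping the code under test in a function like this keeps the measurement honest: you time exactly one thing, and you can rerun it to average out noise.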
That said, consider that for most people on the internet 400KiB is a minuscule thing to worry about. (While I'm typing this I'm downloading a torrent at 4.5MiB/s and streaming music in a separate tab. I know I'm luckier than much of the world, but it illustrates that you shouldn't worry too much about this.) I would strongly encourage you to do some profiling before worrying about it at all.
The simplest way to do this is on a fresh page load in your browser of choice with its developer tools open (usually F12 or Ctrl+Shift+I). Here is how the timing of loading today's guardian.co.uk looks in Chromium:

[screenshot: developer tools Network panel showing the request waterfall for guardian.co.uk]
After all that, however, the common advice is to minimise HTTP requests where possible, as each one adds latency to the page load. This is particularly true when assets are retrieved from separate domains, where extra DNS lookups add yet more latency.
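If you want to see exactly which requests are costing you, the standard Resource Timing API exposes the same per-request durations the Network panel draws (a sketch; `slowestResources` is a name I made up):

```javascript
// List the slowest resource fetches recorded so far, using the
// Resource Timing API. slowestResources is a hypothetical helper.
function slowestResources(limit = 5) {
  return performance
    .getEntriesByType("resource")            // one entry per fetched asset
    .sort((a, b) => b.duration - a.duration) // slowest first
    .slice(0, limit)
    .map((entry) => ({ url: entry.name, ms: Math.round(entry.duration) }));
}

console.log(slowestResources());
```

Run that in the console after a page load and the worst offenders (and any extra DNS-heavy domains) stand out immediately.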
So:
Profile!
Then minimise HTTP requests and compress your images.
To answer your specific questions
1a: Loading the entire image at once will technically take less time than the same-sized image divided into 4, because of the delay introduced by the extra HTTP requests. However, the browser may not render it until it has fully downloaded (rarely an issue in practice).
2: No. Unless you are doing something very fancy, let your browser and OS schedule the timing of your requests.
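To put a rough number on 1a: with illustrative figures (50 ms round-trip latency and 1 MB/s bandwidth, both assumptions rather than measurements), splitting one download into four serialised requests costs three extra round trips:

```javascript
// Back-of-envelope model: total load time = transfer time + one round
// trip of latency per request. Assumes the requests run one after another;
// all numbers are illustrative assumptions, not measurements.
function estimateLoadMs({ bytes, bytesPerSec, rttMs, requests }) {
  const transferMs = (bytes / bytesPerSec) * 1000;
  return transferMs + requests * rttMs;
}

const whole = estimateLoadMs({ bytes: 400 * 1024, bytesPerSec: 1e6, rttMs: 50, requests: 1 });
const split = estimateLoadMs({ bytes: 400 * 1024, bytesPerSec: 1e6, rttMs: 50, requests: 4 });
console.log(split - whole); // 150 — the 3 extra requests add 150 ms of latency
```

In reality browsers pipeline and parallelise requests, so the true penalty is smaller, but the direction of the effect is the same.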