You're pretty much guaranteed to have an uneven distribution, because client IPs won't hash perfectly evenly across your backends, and the rate of requests from those client IPs won't be perfectly balanced either. However, with a large enough number of distinct client IPs, the unevenness will be relatively small, and can be ignored for practical purposes.
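For concreteness, here's what client-IP hashing looks like in an nginx upstream block (assuming that's what you're using; the upstream name and addresses are placeholders):

```nginx
upstream app_backends {
    # Hash on the client IP: the first three octets of an IPv4
    # address (or the whole IPv6 address) pick the backend, so
    # a given client always lands on the same server.
    ip_hash;

    server 10.0.0.1:8080;
    server 10.0.0.2:8080;
    server 10.0.0.3:8080;
}
```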
If you need absolute evenness of distribution, you're in a world of hurt. Round-robin distribution will give you evenness of request count, but it's an incredibly rare system in which all requests take exactly the same amount of time to process, so the load on a set of round-robin-allocated backends won't be perfectly even. You can improve the load situation somewhat, in the face of wildly varying and unpredictable service times, by using a least-connections routing algorithm, where the load balancer sends each job to the backend with the fewest in-progress jobs, on the assumption that each concurrent connection makes a roughly equal contribution to the instantaneous system load.
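In nginx terms, switching to least-connections is a one-line change to the upstream block (same placeholder addresses as above):

```nginx
upstream app_backends {
    # Each new request goes to the server with the fewest
    # active connections, taking server weights into account.
    least_conn;

    server 10.0.0.1:8080;
    server 10.0.0.2:8080;
    server 10.0.0.3:8080;
}
```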
Improving the situation further requires one of any number of ever-more-complicated scheduling algorithms that try to even out the load, but without perfect a priori knowledge of the resources required to service every request, you're doomed to approximations. Note that, as far as I know, there aren't any advanced scheduling algorithms built into nginx.
If balancing backend load is important, the best thing you can do is add some sort of feedback into the system, so that the balancer is aware of the current state of the backends. Doing this properly involves a degree of basic control theory, due to the lag between a job being routed to a backend and that job's impact showing up in the backend's load. Without such control, you'll end up thundering-herding your system into dust: every decision sees the same momentarily idle backend, dumps jobs on it, then overcorrects the other way.
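Stock nginx won't do closed-loop feedback for you. The closest you can get with the stock directives is a static ceiling on per-backend concurrency via max_conns, which is crude admission control rather than feedback, but it does stop the herd from burying one box. A sketch, with caps that are made-up numbers you'd have to tune for your workload:

```nginx
upstream app_backends {
    # Shared-memory zone so connection counters are shared
    # across worker processes; without it, max_conns is
    # enforced per worker, not globally.
    zone app_backends 64k;

    least_conn;

    # max_conns is a hard cap on in-flight requests per backend.
    server 10.0.0.1:8080 max_conns=100;
    server 10.0.0.2:8080 max_conns=100;
    server 10.0.0.3:8080 max_conns=50;  # smaller box, lower cap
}
```

Anything beyond that means an external agent watching backend metrics (CPU, queue depth, response times) and feeding them back into the balancer, for example by adjusting weights, with enough smoothing that the lag described above doesn't turn into oscillation.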
Yes, this is a complicated problem. This is why everyone just configures their load balancer for round-robin or least-connections, stores their sessions and cache centrally, and buggers off down to the pub.