Judging by a conversation I had on #appengine at irc.freenode.net, I'm clearly not the only person baffled by GAE pricing, so I figured I'd throw this up on StackOverflow and ask for clarity. Essentially: given an app with the figures below, what should its "CPU time" bill be per year?
Suppose:
h = Google App Engine's charge per hour for CPU time. Currently, h = $0.10
f = Google App Engine's daily free quota of CPU hours. Currently, I think* f = 2493.5
t = total registered users
s = simultaneous users. Assume s = t * 0.2
e = (requests/second)/simultaneous user. Assume e = 0.5
r = requests/sec = s * e
R = requests/day = r * 3600 * 24
p = CPU hours/request. Assume 150ms/request. I.e. assume p = 0.15/3600
c = CPU hours/sec = r * p
C = CPU hours/day = c * 3600 * 24
y = average number of days in a year = 365.25
B = CPU time bill per year = (C - f) * h * y
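To save anyone reaching for a calculator, here is the same model as a small Python sketch (the function name annual_cpu_bill and the parameter active are just my own labels; the defaults encode the assumptions above):

```python
# Sketch of the calculation above; parameter names mirror my definitions,
# and the defaults encode my assumptions (h, f, e, p, y).
def annual_cpu_bill(t,
                    h=0.10,         # $ per CPU-hour
                    f=2493.5,       # free CPU-hours per day (my reading of the quota, see footnote)
                    active=0.2,     # fraction of registered users online simultaneously
                    e=0.5,          # requests/sec per simultaneous user
                    p=0.15 / 3600,  # CPU-hours per request (150 ms)
                    y=365.25):      # average days per year
    s = t * active        # simultaneous users
    r = s * e             # requests/sec
    c = r * p             # CPU-hours/sec
    C = c * 3600 * 24     # CPU-hours/day
    B = (C - f) * h * y   # CPU-time bill per year (ignores the case where C < f)
    return C, B
```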
Therefore, C = t * 0.2 * 0.5 * (0.15/3600) * 3600 * 24 = 0.36 * t
So suppose I get 10000 registered users, that means C = 3600.
In that case:
B = (3600 - f) * h * y = (3600 - 2493.5) * $0.10 * 365.25 = 1106.5 * $0.10 * 365.25 = $40,415 to the nearest dollar
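Running the sketch above for t = 10000 reproduces those figures:

```python
C, B = annual_cpu_bill(10000)
print(f"C = {C:.1f} CPU-hours/day")   # C = 3600.0 CPU-hours/day
print(f"B = ${B:,.0f} per year")      # B = $40,415 per year
```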
Is that right, or have I misunderstood what CPU time is, how it is priced, or how the quotas work?
*The free daily quota is not clearly expressed, but I think it's 6.5 hours for general use plus 2,487 hours for datastore manipulation: 2,493.5 hours/day in total, assuming that my app spends most of its time handling requests by using a controller to generate views on the models in the datastore, and allowing CRUD operations on those models.
NB. For a transcript of the IRC discussion, please see the edit history of this question.