
I am working on Parse and I want to know how scalable it is. I know that it serves 30 requests/second, but practically, how many concurrent users can access Parse? I know that different apps serve different request rates, and that these also vary with users' usage patterns. But let's say we have an app for a blood donation campaign; what scalability can we expect from Parse?

mahendra kawde

1 Answer


The parse.com free tier will indeed limit you to 30 requests/second.

How that translates into a number of users depends on two factors:

  • how many requests does one app make per day?
  • how bursty is the traffic?

The first factor is entirely determined by your app. Some apps "call home" all the time (and also frequently from the background through a Service). Other apps connect only occasionally and download a batch of content that lasts for ages. How your app behaves is determined by the number of messages, their latency requirements, and whether your service is smart enough to combine pending messages into a single request. As the app developer, you're the only one who can come up with a decent estimate.
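For instance, an app that checks in once every 2 hours makes 12 requests per device per day, i.e. a mean of 12 / 86400 ≈ 1/7200 requests per second per device; that is the per-device rate used in the example further below.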

For the second factor (burstiness), I can make a rough guess by assuming a Poisson distribution of the requests.
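Under that assumption, the probability of seeing exactly k requests in a given second, with a mean rate of lambda = nr_devs * device_rps, is P(k) = lambda^k * e^(-lambda) / k! (the standard Poisson probability mass function).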

Here's a simple Python script that gives insight into the Poisson probability function:

import math

def poisson_probability(nr_devs, device_rps, total_rps):
    # Probability of seeing exactly total_rps requests in one second,
    # assuming nr_devs devices, each making device_rps requests/s on average.
    # The mean arrival rate (lambda) is rounded up to stay conservative.
    lam = math.ceil(nr_devs * device_rps)
    return (lam ** total_rps) * math.exp(-lam) / math.factorial(total_rps)

def cumulative_poisson_probability(nr_devs, device_rps, cutoff_probability=0.999999):
    # Print the probability of each request rate and the cumulative
    # probability of seeing at most that rate, until the cumulative
    # probability exceeds the cutoff.
    c = 0.0
    for i in range(1000):
        p = poisson_probability(nr_devs, device_rps, i)
        c += p
        print("RPS: %d\tprobability: %1.6f\tcumulative: %1.6f" % (i, p, c))
        if c > cutoff_probability:
            break

Let's try 100k devices, each generating one request per 2 hours on average:

>>> cumulative_poisson_probability(100000, 1.0/7200.0)

...
RPS: 26 probability: 0.001299   cumulative: 0.998691
...
RPS: 35 probability: 0.000001   cumulative: 0.999999

With these numbers you have a 99.9% probability of seeing at most 26 requests in any given second, and a 99.9999% probability of seeing at most 35 requests in any given second.

This means that you would exceed 26 rps roughly once every 1,000 seconds (about 17 minutes) and 35 rps roughly once every million seconds (about 11 days).
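Turning the question around: as a minimal sketch of my own (the helper max_supported_devices is hypothetical, reusing poisson_probability from the script above), you could search for the largest device count whose traffic stays at or below a given request-rate cap with a chosen probability:

def max_supported_devices(limit_rps, device_rps, confidence=0.999, step=1000):
    # Hypothetical helper: largest device count (in multiples of step)
    # for which the probability of seeing at most limit_rps requests
    # in any given second is still at least the given confidence.
    nr_devs = 0
    while True:
        candidate = nr_devs + step
        p_within_limit = sum(poisson_probability(candidate, device_rps, i)
                             for i in range(limit_rps + 1))
        if p_within_limit < confidence:
            return nr_devs
        nr_devs = candidate

With the example's rate of one request per device per 2 hours and a 30 req/s cap, this lands somewhat above the 100k devices tried above. Treat it as a rough upper bound: real traffic is rarely perfectly Poisson, and daily usage peaks will be burstier than the average suggests.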

Freek Wiekmeijer