I have a problem that involves several machines, message queues, and transactions. For example, a user clicks on a web page, and the click sends a message to another machine, which adds a payment to the user's account. There may be many thousands of clicks per second, and every step of the transaction should be fault-tolerant.
I've never had to deal with anything like this before, but a bit of reading suggests this is a well known problem.
So, to my questions. Am I correct in assuming that a reliable way of doing this is with two-phase commit, but that the protocol is blocking and so I won't get the required performance? I usually write Ruby, but it appears that DBs like Redis and message-queuing systems like Resque, RabbitMQ, etc. don't really help me much here: even if I implement some sort of two-phase commit, the data will be lost if Redis crashes, because it is essentially memory-only.
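To make the blocking concern concrete, here is a minimal Ruby sketch of how I understand a two-phase commit coordinator to work (all class and method names are my own invention, not from any library); the point is that both phases wait synchronously on every participant:

```ruby
# Toy participant that always votes the way it was configured.
Participant = Struct.new(:will_vote_yes) do
  def prepare(_txn)
    will_vote_yes ? :yes : :no
  end

  def commit(_txn); :ok; end
  def abort(_txn);  :ok; end
end

class Coordinator
  def initialize(participants)
    @participants = participants
  end

  # Returns :committed or :aborted. Both phases are synchronous:
  # the coordinator blocks until every participant has answered,
  # which is (as I understand it) why 2PC caps throughput.
  def run(txn)
    votes = @participants.map { |p| p.prepare(txn) } # phase 1: prepare
    if votes.all?(:yes)
      @participants.each { |p| p.commit(txn) }       # phase 2: commit
      :committed
    else
      @participants.each { |p| p.abort(txn) }        # any :no vote aborts
      :aborted
    end
  end
end
```

If any participant is slow or unreachable during either phase, the whole transaction stalls, which is what worries me at thousands of clicks per second.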
All of this has led me to look at Erlang and Scala, but before I wade in and start learning a new language, I would really like to understand whether it's worth the effort. Specifically, am I right in thinking that, because of their parallel-processing capabilities, these languages are a better choice for implementing a blocking protocol like two-phase commit, or am I confused? And if so, is there any reason to pick one over the other (specifically in this context; I know there are plenty of threads and blog posts comparing the two more generally)?
Apologies for cross-posting: this was first posted to Stack Exchange, but I've since added to the question, and this version is probably a better fit here.