I have an online shop that I built a while ago, and I've been using LOCK TABLES on the product stock tables to make sure stock doesn't go below 0 and doesn't get updated incorrectly.
Lately there's been a lot of traffic and many concurrent operations on the database (some of them also using LOCK TABLES), so I've been asking myself whether there is a faster and safer way to keep the data consistent.
I'm trying to understand transactions and how they work, and I'm wondering whether the following logic would work for my checkout process:
1) START TRANSACTION (so that any following query can be rolled back if one of them fails, or if some other condition isn't met, e.g. stock is lower than needed)
2) insert customer in the 'customers' table
3) insert order info in the 'orders' table
4) similar queries to different tables
5) for each product in the cart:
5.1) UPDATE products SET stock = stock - x WHERE id = <product id> AND stock >= x (x is however many units the customer wants to buy; the stock >= x condition prevents the stock from ever going negative)
5.2) check the affected-row count, and if it is 0 then ROLLBACK and exit (not enough stock for the current product, so cancel the order / throw an error)
5.3) some other queries... etc..
6) COMMIT
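In code, my plan looks roughly like this. This is only a sketch in Python, with SQLite's sqlite3 module standing in for MySQL, and the table and column names are simplified; the point is the conditional UPDATE plus the affected-rows check inside one transaction:

```python
import sqlite3

def checkout(conn, cart):
    """Attempt the checkout flow; cart maps product id -> units wanted.
    Returns True on success, False (after rolling back) if stock ran out."""
    cur = conn.cursor()
    try:
        # step 1: sqlite3 implicitly opens a transaction before the first
        # INSERT/UPDATE below (with MySQL this would be START TRANSACTION)
        cur.execute("INSERT INTO customers (name) VALUES (?)", ("alice",))   # step 2
        cur.execute("INSERT INTO orders (customer_id) VALUES (?)",
                    (cur.lastrowid,))                                        # step 3
        for product_id, x in cart.items():                                   # step 5
            # 5.1: conditional decrement; the WHERE clause refuses to
            # touch the row if it would push stock below 0
            cur.execute(
                "UPDATE products SET stock = stock - ? "
                "WHERE id = ? AND stock >= ?",
                (x, product_id, x),
            )
            # 5.2: no row updated means not enough stock -> undo everything
            if cur.rowcount == 0:
                conn.rollback()
                return False
        conn.commit()                                                        # step 6
        return True
    except Exception:
        conn.rollback()
        raise

# tiny in-memory demo
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
    CREATE TABLE products (id INTEGER PRIMARY KEY, stock INTEGER);
    INSERT INTO products (id, stock) VALUES (1, 5), (2, 1);
""")
print(checkout(conn, {1: 2, 2: 1}))  # True: enough stock for both products
print(checkout(conn, {2: 1}))        # False: product 2 is now sold out
```

The second call shows the rollback path: the customer and order rows inserted in steps 2 and 3 disappear again when the stock check fails.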
Does that sound correct?
What I don't get (and don't know whether I should even be concerned about) is what happens to the 'products' table and its stock figures if a concurrent session (another customer) tries to place the same order, or an order containing some of the same products.
Would the second transaction wait for the first one to finish and then see the latest stock figures, or would they both run concurrently and possibly both fail in some cases, or what?