
I developed a Django app that uses beautifulsoup4 and urllib2 to parse data from web pages. It works well on my development server, but after deploying it to an Apache + mod_wsgi server, the scraping done for a data entry is very slow. It sometimes works and sometimes throws the error 'Lock wait timeout exceeded; try restarting transaction'. I can't figure out why it fails on the server. When I ran the same parsing function on the server from the command line, it took less than 5 seconds. So why does it fail when I add an entry from the Django admin? How should I configure the server to make it fast?

PS: Sometimes the data is fetched when I make an entry. Comment if any code snippet is required.
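To give an idea of the shape of the code (the actual model and parser aren't posted, so every name below is a placeholder, and I'm assuming the fetch happens inside save(); a post_save signal would behave the same way), an admin entry triggers roughly this:

```python
# Placeholder sketch -- Article, url and title are made-up names, and the
# real parsing logic is longer. The point is only that the network fetch
# runs inside save(), which the admin calls inside a transaction
# (recent Django wraps the add/change views in transaction.atomic).
import urllib2
from bs4 import BeautifulSoup
from django.db import models


class Article(models.Model):
    url = models.URLField()
    title = models.CharField(max_length=255, blank=True)

    def save(self, *args, **kwargs):
        # Network I/O here means MySQL row locks stay held while
        # urllib2 waits for the remote page.
        html = urllib2.urlopen(self.url, timeout=10).read()
        self.title = BeautifulSoup(html, "html.parser").title.string or ""
        super(Article, self).save(*args, **kwargs)
```

If that is the pattern, the locks are held for the full duration of the HTTP request, and a slow remote site can make other transactions hit 'Lock wait timeout exceeded'.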

Ashish Gupta
  • That error is a MySQL error. It could come from other DBs, but I'd focus there – Mike Nov 09 '14 at 14:36
  • Yes, it's a MySQL error, but the thing is I am not getting this error on the development server... so what could possibly be wrong on the production server? – Ashish Gupta Nov 09 '14 at 15:57
  • You have to see what MySQL is doing. It sounds like the scraper is going too fast for MySQL to catch up. One problem is you might be using a small VPS instance while your laptop is 50x more powerful – Mike Nov 10 '14 at 03:48
  • How do I check whether `the scraper is going too fast for MySQL to catch up`? On the server the scraping itself takes 2-5 seconds and works fine... the problem is somewhere with MySQL – Ashish Gupta Nov 10 '14 at 05:12
  • When you are getting those errors you need to debug MySQL, e.g. `show processlist`, enable the slow query log... (see the sketch below) – Mike Nov 10 '14 at 13:44
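A sketch of the checks Mike suggests, runnable from `python manage.py shell` on the server while a save is hanging. These are standard MySQL statements, nothing specific to this app, and the `SET GLOBAL` lines need a MySQL user with the SUPER privilege:

```python
# Run the MySQL diagnostics over Django's existing DB connection.
from django.db import connection

cursor = connection.cursor()

# Which connections are currently running or waiting on locks?
cursor.execute("SHOW FULL PROCESSLIST")
for row in cursor.fetchall():
    print(row)

# The TRANSACTIONS section shows what each open transaction is waiting for.
cursor.execute("SHOW ENGINE INNODB STATUS")
print(cursor.fetchone()[2])

# Turn on the slow query log and log anything over 1 second.
cursor.execute("SET GLOBAL slow_query_log = 'ON'")
cursor.execute("SET GLOBAL long_query_time = 1")
```

Comparing the output on the production box with the development machine should show whether MySQL itself is slow (a small VPS, as Mike suggests) or whether a long-open transaction is blocking the insert.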

0 Answers