I am implementing a project in PHP with MySQL. Right now I don't have much data, but I was wondering: in the future, when I have a large dataset, it will slow down searches in the table. To decrease that search time, I was thinking about caching techniques. Which kind of caching, i.e. client-side or server-side, will be good for a large dataset?

Thanks, aby

insomiac

2 Answers


Server, in my opinion.

A client-side caching technique will have one of two negative outcomes, depending on how you do it:

  1. If you cache only what the user has searched for before, the cache won't be of any use unless the user performs exactly the same search again.
  2. If you cache the whole dataset the user will have to download the whole thing, and that will slow your site down and incur bandwidth expenses.

The easiest thing you can do is just add appropriate indexes to the table you're searching. That will be sufficient for 99% of possible applications and should be the first thing you do, before you think about caching at all.
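For example, here's a minimal sketch of indexing the search column and querying it from PHP with PDO (the `devices` table, `name` column, and connection details are just placeholders; adapt them to your own schema):

```php
<?php
// Assumed example schema: a `devices` table searched by its `name` column.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');

// One-time setup: add an index on the column you search by.
$pdo->exec('CREATE INDEX idx_devices_name ON devices (name)');

// Searches on that column can now use the index instead of scanning the whole table.
$searchTerm = 'galaxy';
$stmt = $pdo->prepare('SELECT * FROM devices WHERE name = ?');
$stmt->execute([$searchTerm]);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```

You can run `EXPLAIN` on the SELECT in MySQL to confirm the index is actually being used.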

Apologies if I've pitched this answer below your level; I'm not sure exactly what you're doing, what you're planning to cache, or how much experience you have.

p.g.l.hall
  • I have a search page on my website which searches across all the different mobile devices available. I search everything by the device name, and I have already created an index for that. But I was wondering: what if I have a large dataset in the future, and what kind of caching should be done then? – insomiac Apr 17 '11 at 18:16

Pay close attention to indexing in your database schemas. If you do this part right, the database should be able to keep up until your data and traffic are large. The right caching scheme will depend significantly on what your usage patterns are like. You should do testing as your site grows to find out where the bottlenecks are and what the best caching scheme will be for your system.
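As one possibility, here's a rough sketch of a server-side result cache in PHP using Memcached in front of MySQL (the `devices` table, key naming, and the five-minute TTL are illustrative assumptions, not a recommendation tuned to your actual usage patterns):

```php
<?php
// Rough sketch only. Assumes the memcached PHP extension, a memcached
// server on localhost:11211, and a hypothetical `devices` table.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$cache = new Memcached();
$cache->addServer('localhost', 11211);

function searchDevices(PDO $pdo, Memcached $cache, $term)
{
    $key = 'device_search:' . md5($term);

    // Serve a recent identical search straight from the cache.
    $cached = $cache->get($key);
    if ($cached !== false) {
        return $cached;
    }

    // Cache miss: hit MySQL, then keep the result for five minutes.
    $stmt = $pdo->prepare('SELECT * FROM devices WHERE name LIKE ?');
    $stmt->execute(['%' . $term . '%']);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    $cache->set($key, $rows, 300);

    return $rows;
}

$results = searchDevices($pdo, $cache, 'galaxy');
```

Whether something like this helps at all depends on how often the same searches repeat, which is why measuring real usage first matters.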

Joshua Martell