
I'm currently trying to add glossary functionality to one of my web projects. It's all about:

  • 50 - 200 "words" browsable in a glossary
  • Up to 15 synonyms for each word
  • Mostly dynamic content that has to be searched for those words, which are then replaced with hyperlinks to the glossary entries

The main problem I'm thinking about is the performance of searching through the mostly dynamic content. My first approach loaded all words from the glossary into an array and search'n'replaced them with links in PHP using regular expressions.
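For reference, a minimal sketch of that first approach (table and column names such as glossary_terms and glossary_id, and the $pageContent variable, are just placeholders, not my real schema):

    <?php
    // Sketch of the first approach: load every term/synonym and
    // regex-replace it in the finished HTML on every request.
    // Table and column names (glossary_terms, term, glossary_id) are placeholders.

    $pdo  = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');
    $rows = $pdo->query('SELECT term, glossary_id FROM glossary_terms')
                ->fetchAll(PDO::FETCH_ASSOC);

    function linkGlossaryTerms($html, array $rows)
    {
        foreach ($rows as $row) {
            // \b = word boundary, u = Unicode, i = case-insensitive
            $pattern     = '/\b' . preg_quote($row['term'], '/') . '\b/ui';
            $replacement = '<a href="/glossary#' . $row['glossary_id'] . '">$0</a>';
            $html        = preg_replace($pattern, $replacement, $html);
        }
        return $html;
    }

    // $pageContent is the already rendered page HTML (assumed to exist).
    echo linkGlossaryTerms($pageContent, $rows);

With up to 200 words and 15 synonyms each, that means up to roughly 3,000 rows and 3,000 separate regex replacements per page view, which leads to the two problems below.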

Problem 1: The query

Loading up to 3,000 database tuples every time someone refreshes a page doesn't seem like a good idea.

Problem 2: The search

Running such long loops of regular expression replacements in PHP doesn't seem like a good idea either.

Solution 1: Cached JavaScript

My first and only idea is to generate a JavaScript-based list of the words and synonyms and let JavaScript do the replacements using regular expressions.
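Roughly, the server side would regenerate one static, cacheable file whenever the glossary changes, something like this sketch (file path, table and column names are placeholders):

    <?php
    // Sketch of the server side of Solution 1: whenever the glossary changes,
    // regenerate one static JavaScript file that the browser can cache.
    // File path, table and column names are placeholders.

    $pdo  = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');
    $rows = $pdo->query('SELECT term, glossary_id FROM glossary_terms')
                ->fetchAll(PDO::FETCH_ASSOC);

    $terms = [];
    foreach ($rows as $row) {
        $terms[] = ['t' => $row['term'], 'id' => (int) $row['glossary_id']];
    }

    // Plain JS assignment, served and cached like any other static asset.
    $js = 'var GLOSSARY_TERMS = ' . json_encode($terms, JSON_UNESCAPED_UNICODE) . ';';
    file_put_contents(__DIR__ . '/public/js/glossary-terms.js', $js);

Each page would then only include that one cached file and run a small script that builds a regular expression from GLOSSARY_TERMS and does the replacement in the browser.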

The final question

Is my solution a good idea, and is it the only way this could work?

Daniel Jäger

1 Answer


Some ideas: Store the words in a JSON file and let the client cache that file. Replace the words when saving content in your CMS. Or create the links in the generation step between your CMS and the server cache?
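For the second idea, a rough sketch of what a save hook could look like (table, column and function names are only assumptions):

    <?php
    // Rough sketch of the second idea: add the links once, in the CMS save hook,
    // instead of on every page view. Names are only assumptions.

    function addGlossaryLinks($html, PDO $pdo)
    {
        $rows = $pdo->query('SELECT term, glossary_id FROM glossary_terms')
                    ->fetchAll(PDO::FETCH_ASSOC);

        // One alternation pattern instead of up to 3,000 separate replacements.
        $escaped = array_map(function ($r) { return preg_quote($r['term'], '/'); }, $rows);
        $pattern = '/\b(' . implode('|', $escaped) . ')\b/ui';

        // Map lower-cased term => glossary id for the case-insensitive lookup.
        $byTerm = array_change_key_case(array_column($rows, 'glossary_id', 'term'), CASE_LOWER);

        return preg_replace_callback($pattern, function ($m) use ($byTerm) {
            $id = $byTerm[mb_strtolower($m[1])] ?? null;
            return $id ? '<a href="/glossary#' . $id . '">' . $m[1] . '</a>' : $m[1];
        }, $html);
    }

    // In the CMS save hook, e.g.:
    // $article->body = addGlossaryLinks($article->body, $pdo);

That way the regex work happens once per save instead of once per page view.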

Hackbard
    Part 1: Wouldn't it be better to generate "real javascript arrays" instead of a json file to enhance the performance? Part 2: Due to the different sources of content (backend/admin content, parsed content, user input) this won't be an option - but it's a nice approach. – Daniel Jäger Apr 18 '14 at 18:20
  • I think it will be the same in the end – Hackbard Apr 18 '14 at 18:21