I have a list that I would like to cache in Redis. I tried two ways of doing it with hashes.
Consider the first approach: I create a single hash and store each item as a hash field:
// ..
$apiArray = '[..]'; // raw JSON string returned by the API
if (!$c->keys('lista')) {
    foreach (json_decode($apiArray) as $item) {
        // one field per item, keyed by the item id
        $c->hset('lista', $item->id, serialize($item));
    }
}
foreach ($c->hgetall('lista') as $value) { // a single HGETALL returns every field
    $item = unserialize($value);
    echo '<p>';
    echo '<strong>id</strong>: '.$item->id.'<br>';
    echo '<strong>name</strong>: '.$item->name.'<br>';
    echo '<strong>email</strong>: '.$item->email.'<br>';
    echo '</p>';
}
Looping over 10,000 items takes about 0.5 seconds.
Now consider the second approach: a separate hash for every element of the original array:
if (!$c->keys('lista:*')) {
    foreach (json_decode($apiArray) as $item) {
        // one hash per item, with a single 'element' field
        $c->hset('lista:'.$item->id, 'element', serialize($item));
    }
}
foreach ($c->keys('lista:*') as $key) {
    $item = unserialize($c->hget($key, 'element')); // one HGET per key
    echo '<p>';
    echo '<strong>id</strong>: '.$item->id.'<br>';
    echo '<strong>name</strong>: '.$item->name.'<br>';
    echo '<strong>email</strong>: '.$item->email.'<br>';
    echo '</p>';
}
The same loop over 10,000 records takes 3 seconds.
This is very surprising to me, because the second one is the approach covered in the official Redis documentation, and it is also the one that supports secondary indexing (using ZADD and SADD).
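For example, on top of the second layout I was planning to build the index roughly like this (just a sketch; the 'lista:ids' and 'lista:by-id' key names are made up by me):

// sketch: alongside each per-item hash, keep a set of ids and a sorted set for range queries
foreach (json_decode($apiArray) as $item) {
    $c->hset('lista:'.$item->id, 'element', serialize($item));
    $c->sadd('lista:ids', $item->id);              // plain set of all ids
    $c->zadd('lista:by-id', $item->id, $item->id); // sorted set scored by id
}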
Why is it slower than the first approach? Am I doing something wrong?
I think it might be because I have to call the hget() method 10,000 times inside the loop. Can you confirm this?
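If those 10,000 round trips are indeed the problem, I guess I could batch the HGETs with a pipeline. This is an untested sketch of what I mean, assuming $c is a Predis\Client (with phpredis I believe it would be multi(Redis::PIPELINE) instead):

$keys = $c->keys('lista:*');
// queue every HGET and flush them to Redis in one batch instead of one round trip per key
$replies = $c->pipeline(function ($pipe) use ($keys) {
    foreach ($keys as $key) {
        $pipe->hget($key, 'element');
    }
});
foreach ($replies as $raw) {
    $item = unserialize($raw);
    // ... same echo block as above
}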
Should I prefer the first approach?
Thank you guys
M :)