I have the following code:
<?php
$start = 1;

// Time one million multiplications by 4.
$timestart = microtime(true);
for ($i = 0; $i < 1000000; $i++) {
    $result1 = $start * 4;
}
echo "\n";
echo microtime(true) - $timestart;
echo "\n";

// Time one million left shifts by 2 (also a multiplication by 4).
$timestart = microtime(true);
for ($i = 0; $i < 1000000; $i++) {
    $result2 = $start << 2;
}
echo "\n";
echo microtime(true) - $timestart;
echo "\n";
This outputs:
0.14027094841003
0.12061500549316
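As a side note on methodology (this variant is my own sketch, not part of the original test): microtime() only has microsecond resolution, so for differences this small the same loops could also be timed with hrtime(), which is available from PHP 7.3 and returns nanoseconds.

<?php
$start = 1;

// Multiplication loop, timed with hrtime() in nanoseconds (PHP 7.3+).
$t0 = hrtime(true);
for ($i = 0; $i < 1000000; $i++) {
    $result1 = $start * 4;
}
echo (hrtime(true) - $t0) / 1e9, "\n"; // elapsed seconds

// Shift loop, timed the same way.
$t0 = hrtime(true);
for ($i = 0; $i < 1000000; $i++) {
    $result2 = $start << 2;
}
echo (hrtime(true) - $t0) / 1e9, "\n"; // elapsed seconds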
I found a Google interview question online (for a developer position I had wanted to apply for, though I realize I can't), and one of the questions asked for the fastest way to multiply a number. My first thought was to use the * operator, so I tested it.
My question is: why is shifting bits faster than multiplication?
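For context on what the two loops actually compare: shifting an integer left by n multiplies it by 2^n, so << can only stand in for * when the factor is a power of two. A minimal sketch of that equivalence ($n is just an illustrative value, not taken from the benchmark above):

<?php
// Shifting an integer left by n is the same as multiplying it by 2**n.
$n = 7;
var_dump(($n * 4) === ($n << 2)); // bool(true): * 4 matches << 2
var_dump(($n * 8) === ($n << 3)); // bool(true): * 8 matches << 3
// A factor such as 5 is not a power of two, so no single shift can replace it.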