UPD: Warning, this turned out to be the slowest solution! See the benchmarks below.
Try this code:
$a = array(array('id' => 1234,
                 'name' => 'blablabla'),
           array('id' => 1235,
                 'name' => 'ababkjkj'),
           array('id' => 1236,
                 'name' => 'xyzxyzxyz'));

var_export(array_reduce($a, function ($res, $item) {
    $res[$item['id']] = $item['name'];
    return $res;
}));
It works fine even in PHP 5.3 and uses only one function, array_reduce.
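For the sample data above, the var_export() call prints the resulting id => name map:

array (
  1234 => 'blablabla',
  1235 => 'ababkjkj',
  1236 => 'xyzxyzxyz',
)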
UPD: Here are some benchmarks (PHP 5.6 on Debian 7, on a mid-range server):
// Build 150 000 test rows: a numeric id plus a shuffled 10-char name.
$a = [];
for ($i = 0; $i < 150000; $i++) {
    $a[$i] = ['id' => $i,
              'name' => str_shuffle('abcde') . str_shuffle('01234')];
}

$start = microtime(true);
// Only one branch is enabled per run; flip the booleans to time each approach.
if (false) {
    // 7.7489550113678 secs for 15 000 items
    $r = array_reduce($a, function ($res, $item) {
        $res[$item['id']] = $item['name'];
        return $res;
    });
}

if (false) {
    // 0.096649885177612 secs for 150 000 items
    $r = array_combine(array_column($a, 'id'),
                       array_column($a, 'name'));
}

if (true) {
    // 0.066264867782593 secs for 150 000 items
    $r = [];
    foreach ($a as $subarray) {
        $r[$subarray['id']] = $subarray['name'];
    }
}

if (false) {
    // 0.32427287101746 secs for 150 000 items
    $r = [];
    array_walk($a, function ($v) use (&$r) {
        $r[$v['id']] = $v['name'];
    });
}

echo (microtime(true) - $start) . ' secs' . PHP_EOL;
So, in conclusion: plain iteration with a simple foreach loop is the winner (as mentioned in this answer). In second place comes array_combine over two array_column calls, which needs a newer PHP (array_column requires PHP 5.5+). And the worst is my own solution with a closure and array_reduce; note that its 7.7 secs were measured for only 15 000 items, a tenth of the data set. The accumulator array is passed into and returned from the callback by value on every iteration, so copy-on-write most likely forces a full copy of the growing result each time.
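As a footnote: on PHP 5.5+ you don't even need array_combine, because array_column() accepts a third index_key argument and builds the same id => name map in a single call. A minimal sketch (not included in the benchmark above, so no timing claims):

// PHP 5.5+: the 3rd argument of array_column() names the column to index by,
// so this yields the same id => name map in one call.
$r = array_column($a, 'name', 'id');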