While the performance benefit is negligible on small wikis (like this site, although I still noticed roughly 1 MB less memory usage on some pages), the savings are huge for very large wikis, as detailed in the link above. Further savings for larger result sets are unlikely (1 million rows would still use around 100 MB of memory), but the link also describes selective fetching, which could help with such sets.
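To make the comparison concrete, here is a minimal sketch of the kind of benchmark behind these numbers: filling a plain PHP array versus an `SplFixedArray` of the same size and comparing `memory_get_usage()` before and after. The element count and the exact byte counts are illustrative; actual figures vary by PHP version.

```php
<?php
// Hypothetical benchmark: plain array vs. SplFixedArray memory footprint.
$n = 100000;

$before = memory_get_usage();
$plain = [];
for ($i = 0; $i < $n; $i++) {
    $plain[] = $i;
}
$plainBytes = memory_get_usage() - $before;
unset($plain);

$before = memory_get_usage();
$fixed = new SplFixedArray($n);
for ($i = 0; $i < $n; $i++) {
    $fixed[$i] = $i;
}
$fixedBytes = memory_get_usage() - $before;

printf("plain: %d bytes, SplFixedArray: %d bytes\n", $plainBytes, $fixedBytes);
```

On the versions I have tested, the `SplFixedArray` comes out noticeably smaller, since it allocates a flat, fixed-size block rather than a growable hash table.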
I have also added Judy support to the router for that same issue. Even though Judy is supposed to be the last word in array memory management, my implementation above still appears to be about 10% more efficient than the Judy-based one. This may be because Judy arrays cannot be serialized. After trying a hybrid implementation, it seems that the overhead of creating a Judy object exceeds that of creating an SplFixedArray when the number of elements is fixed and small (in this case 4). Judy should therefore only be used for arrays whose total number of elements is not known to be small.
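The hybrid rule above could be sketched as a small factory: pick `SplFixedArray` when the size is known and small, and fall back to Judy (from the PECL judy extension) otherwise. The threshold value and the function name are my own illustrative assumptions, not part of the actual router code.

```php
<?php
// Hypothetical sketch of the hybrid approach: the cutoff is assumed
// and should be tuned per workload (the router case above used 4).
const SMALL_THRESHOLD = 16;

function makeArray(?int $knownSize): ArrayAccess
{
    if ($knownSize !== null && $knownSize <= SMALL_THRESHOLD) {
        // Known, small size: SplFixedArray avoids the overhead of
        // constructing a Judy object for just a handful of elements.
        return new SplFixedArray($knownSize);
    }
    // Large or unknown size: Judy's compact trie storage pays off
    // as the element count grows (requires the pecl judy extension).
    return new Judy(Judy::INT_TO_MIXED);
}
```

For example, `makeArray(4)` would return an `SplFixedArray`, while `makeArray(null)` or `makeArray(1000000)` would return a Judy array.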
We can probably find similar array optimizations elsewhere throughout the wiki. The cache used by the fetcher is another prime candidate for Spl or Judy.