Non-existent pages should be returned with the noindex robots header, and wiki links to them carry rel="nofollow", so robots shouldn't reach them organically and shouldn't consider them for search results, regardless of the status code. Most likely the robot is following links from elsewhere; if you control those links, they should be fixed. A well-designed robot should be able to tell you how it landed on that page.
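For illustration, here's a minimal PHP sketch of that behavior; page_exists() is a hypothetical helper, not the wiki's actual code:

    <?php
    // Hypothetical request handler; page_exists() is an assumed helper.
    if (!page_exists($pagename)) {
        // noindex tells crawlers to keep this page out of search results,
        // whatever status code accompanies it.
        header('X-Robots-Tag: noindex');
    }
    ?>
    <!-- A wiki link to a missing page renders with rel="nofollow",
         so compliant robots won't follow it in the first place: -->
    <a rel="nofollow" href="/wiki/MissingPage?action=edit">MissingPage?</a>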
However, it may be confusing for a webmaster reading robot reports to see these pages filed under "blocked by robots" when "not found" would be the better category. So this has been fixed in the next release: non-existent wiki pages now return status code 404.
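In PHP terms the fix amounts to roughly this, again with hypothetical helper names:

    <?php
    // Sketch of the fix: send 404 for pages that don't exist, so robot
    // reports file them under "not found" instead of "blocked by robots".
    // page_exists() and render_missing_page() are assumed helpers.
    if (!page_exists($pagename)) {
        header('HTTP/1.1 404 Not Found');  // status line must precede any output
        header('X-Robots-Tag: noindex');   // the noindex hint is kept as well
        render_missing_page($pagename);    // e.g. the usual "create this page?" view
        exit;
    }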
I'm not sure we should have fixed it in vBulletin as well, since vBulletin's convention is for all error pages to return status 200. The fix was applied to that platform too, though, and I think the inconsistency is acceptable because it's still an improvement.