It had to be written in C# in the Unity Engine, so it's kinda slow. The other issue was that I had no access to the culling system beyond just turning it off, so every object in the scene was rendered every frame.
I think the bottleneck was on the GPU side, or at least it was on the system I was testing on.
I ended up in one of the longest, stupidest arguments of my life trying to add my two cents to a discussion about whether to wait for the new MacBook Air or jump on the current one.
The person mentioned they would be doing some gaming on it, so I pointed out that the new Air would almost certainly have Intel HD 3000 graphics and that the Nvidia chipset in the older MacBook Air was better overall for that sort of thing.
Someone proceeded to try to rip me a new one, chastising me for having the gall to recommend purchasing a previous-generation laptop. I said I owned an Intel HD 3000 MacBook Pro, and even comparing that against the Nvidia 320M MacBook Air, the Air performed better when it came to gaming.
He called me a liar, said it wasn't possible and that the HD 3000 chipset couldn't be the culprit. Here we are a year later, and I've been proven right countless times. I'm sure this is interesting to nobody, but it's nice to get a little vindication after someone laid into you that hard.