Filed Under: Journal - Development - Game Development
First off, my little aside about UT2003, Nautilus and OpenGL: it would seem that Nautilus was the culprit. Nautilus 2 is awesome, it really is. I find it faster and prettier than OS X's Finder (give or take) and definitely more flexible than Explorer, but the damn thing is causing system crashes. I know it's a developer snapshot and I should expect crashes, but I have to say this has me worried. I didn't think a userland application could cause a hard system crash. A few of my happy little dreams about Linux have been shattered. On the plus side, the system is back to being a rock, even if the fps in UT2003 still sucks the big whammy. Rumor is there's a Linux patch floating around for it. Maybe I'll give it a try when I'm out of 'work mode' again.
I WISH I had gotten the Ti instead of the MX. Oh well. Life is full of regrets.
Now back to semi-important things. I've begun to slap some library code back together, this time taking a page from the STL. I'm trying to decouple classes as much as possible and keep things simple so that more of the subsystems can be used separately. Hopefully that will reduce the amount of rework I have to do next time I find myself behind the times. But now I'm facing an old issue and asking myself the same question: how pure does a pure virtual class have to be? The new Streams interfaces have some similar methods that I'd rather not reimplement in every implementation class, but I always feel a little dirty adding implementation to an interface, especially when that interface is templated and all the code has to be inlined.
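One compromise I keep coming back to (just a sketch with made-up stream names, not my actual library): keep the interface itself completely pure virtual, and hang the shared convenience methods off an intermediate base class that builds them entirely out of the pure-virtual primitives. Concrete streams implement only the primitives, and clients that program against the interface never know the middle layer exists.

```cpp
#include <cstddef>
#include <cstring>
#include <string>

// Names here (IStream, StreamBase, MemoryStream) are invented for
// illustration -- this sketches the layering, not my real classes.

// The interface stays completely pure virtual.
class IStream {
public:
    virtual ~IStream() {}
    virtual std::size_t Write(const void* data, std::size_t bytes) = 0;
    virtual std::size_t Read(void* data, std::size_t bytes) = 0;
};

// Shared convenience methods live one level down, written only in
// terms of the pure-virtual primitives. No state, no new virtuals.
class StreamBase : public IStream {
public:
    std::size_t WriteString(const std::string& s) {
        return Write(s.data(), s.size());
    }
};

// A concrete stream implements just the primitives.
class MemoryStream : public StreamBase {
public:
    std::size_t Write(const void* data, std::size_t bytes) override {
        buffer_.append(static_cast<const char*>(data), bytes);
        return bytes;
    }
    // Toy Read: copies from the front without consuming anything.
    std::size_t Read(void* data, std::size_t bytes) override {
        const std::size_t n = bytes < buffer_.size() ? bytes : buffer_.size();
        std::memcpy(data, buffer_.data(), n);
        return n;
    }
    const std::string& Buffer() const { return buffer_; }
private:
    std::string buffer_;
};
```

The part I like is that StreamBase adds no state and no new virtuals, so if I decide later to pull the helpers out into free functions, nothing that only sees IStream has to change.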
What does everyone else do here? Just reimplement and bite the bullet? I'm really not sure what the best solution is. I guess the only GOOD thing is that since the interface won't change, I won't have to change any applications if I decide to extract the implementations from the interfaces later. That brings up another question I've wondered about: does the distance from an implementation to its interface in the class tree affect how long it takes to resolve the call at runtime? I suppose I could do some testing, but the compiler does so many optimizations that a naive benchmark might simply be optimized away. I guess I'll give it a try and see what happens.
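For the depth question, my understanding (worth verifying against real generated code) is that with the usual vtable implementation, a virtual call through a base pointer is a single indirection no matter how many inheritance levels sit between the interface and the final override, because the derived class's vtable slot already points straight at the override. Here's a throwaway hierarchy that at least shows the call resolving correctly through several empty levels; the class names mean nothing:

```cpp
// Throwaway classes purely to illustrate dispatch through several
// inheritance levels. With a typical vtable, the call through a
// Level0* costs the same whether the override is one level down
// or three.
struct Level0 {
    virtual ~Level0() {}
    virtual int Depth() const = 0;
};
struct Level1 : Level0 {};  // adds nothing
struct Level2 : Level1 {};  // adds nothing
struct Level3 : Level2 {    // final override, three levels deep
    int Depth() const override { return 3; }
};
```

If I do end up benchmarking it, reading the generated assembly is probably more trustworthy than timing, since the optimizer can devirtualize any call it can see through and turn the whole test into a constant.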
The most important thing though is that the interfaces won't change.