
Release date


Numsgil:
While switching between .NET languages is so easy it's almost trivial, moving from VB6 (the current Darwinbots code base) to .NET is decidedly not.

Moonfisher:
1.  Make it work.
2.  Make it right.
3.  Make it fast.

Heh, that's actually how I end up working a lot of the time... even for separate small features (but with less emphasis on making it fast)...
The thing is, you often know up front whether a feature will need to be expanded a lot and may run very often, so building it with speed and OO in mind can help a lot...
You don't need to make everything perfect in every sense, but you can predict uses for the system, make sure your structure supports them and is easy to expand, run some early performance tests, and generally keep performance in mind while building the feature... don't pass strings around by value, stick to pointers and references where you can.
Also, if you're going to store something and you know it will happen very often, you may want to think about how you're going to keep the file size down before building everything...
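
Just to show the kind of thing I mean by not passing strings around, here's a rough C++ sketch with made-up names (nothing from the actual Darwinbots code):

[code]
#include <cstddef>
#include <string>

// Copies the whole string on every call -- the kind of hidden cost
// that adds up if this runs every tick for every bot.
std::size_t countGenesByValue(std::string dna) {
    std::size_t genes = 0;
    for (char c : dna)
        if (c == 'G') ++genes;   // pretend 'G' marks a gene boundary
    return genes;
}

// Same logic, but takes a const reference, so nothing gets copied.
std::size_t countGenesByRef(const std::string& dna) {
    std::size_t genes = 0;
    for (char c : dna)
        if (c == 'G') ++genes;
    return genes;
}
[/code]

Nothing clever, just not paying for copies you never needed in the first place.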

I agree there's no need to optimize for something that may never be needed, but I also think it's important to think ahead or you'll spend too much time refactoring code... Sometimes making something work right has barely anything in common with just making it work, so that step should be skipped IMO, and sometimes performance is such a big issue that you need to remake virtually everything if you didn't have it in mind from the start...
So I agree that for a lot of features you can proceed this way, but there are also a lot of cases where you need to think things through before getting started (if you know that structure or performance can be an issue).

But where game engines are concerned... I don't think they're optimizing prematurely... I think they need to optimize everything they can... GTA 4 still lags at times, and so does virtually any game that didn't cut away anything that could slow it down with extreme prejudice. (Like Blizzard, who only accept perfect features, so IMO they often end up cutting out a lot of the fun. At least in WoW they did; all that was left was an MMO single-player game where you could brag to your friends about the gear you managed to grind, and if your friends weren't on your server... then it was actually no different from a single-player game, except monsters took ages to die.) (Anyway, that was off topic. I'm just not a big fan of MMORPGs where they cut out the R in favor of boring things like grinding and quests like "bring this letter to someone you've never heard of, and may never hear of again," or "kill X boars of Y color"... worst asset reuse I've ever seen. I was killing boars at level 1... and several more times at later levels, and then the expansion came... and the first thing I killed in the new world... was a boar... and it took me longer to kill it than when I was level 1... so by that logic my character spent WAY too many hours getting worse at killing boars... lamest game EVER... wouldn't play it again if I got paid by the hour... well...)

Anyway, just saying: game engines can always use better performance, and WoW is the worst wannabe MMORPG I've ever played, less fun than Pong... (To be fair, I did have some fun with PvP. One thing Blizzard knows how to do is balance PvP, but you get tired of the 3 maps they had pretty fast.)

Numsgil:
Ah, see, you fall into this mental trap of "the game needs to be fast, so I need to write fast code."  But the thing is, you need to worry about speed like this if and only if:

1.  Your game is CPU bound (I'll bet you $100 that GTA is fill rate bound, i.e. it's limited by the number of pixels it can draw), and
2.  The section of code you're working on is a bottleneck.  Note, however, that you can only determine this last one if you actually profile the code (see the timing sketch right after this list).  If you can't say exactly, with a number, how slow the code you're working on is, then it's premature optimization.  Meaning you do not write the code to be fast up front.  You write it to be readable and understandable 10 months from now when you go back and refactor it.
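
Even something as crude as this gives you that number (updatePhysics is just a hypothetical stand-in for whatever section you suspect, not real Darwinbots code):

[code]
#include <chrono>
#include <cstdio>
#include <ratio>

// Stand-in for whatever section of the sim you suspect is slow.
void updatePhysics() {
    double x = 0.0;
    for (int i = 0; i < 1000000; ++i)
        x += i * 0.5;
    static volatile double sink;
    sink = x;   // keep the compiler from optimizing the loop away
}

int main() {
    using clock = std::chrono::steady_clock;
    auto start = clock::now();
    updatePhysics();
    auto stop = clock::now();
    double ms = std::chrono::duration<double, std::milli>(stop - start).count();
    // Now "how slow is it?" has an actual number attached to it.
    std::printf("updatePhysics took %.3f ms\n", ms);
    return 0;
}
[/code]

A real profiler will tell you a lot more, but until you have at least a number like this, any speed-motivated rewrite is guesswork.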

All this said, choosing the right algorithm is of primary importance, and cuts way deeper than optimization.  If you write an O(n^2) algorithm, then you're screwed right from the beginning, because that algorithm has an inherent limit on how far it can scale.  If you bubble sort a list of 5 elements, you might feel justified because n is so small, and writing an n log n sorter might be "overkill", but that's where the STL and 3rd party libraries come in.
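
To put some code to that (a toy C++ sketch, not from any real codebase):

[code]
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// O(n^2): feels fine for 5 elements, becomes a wall at 50,000.
void bubbleSort(std::vector<int>& v) {
    for (std::size_t i = 0; i + 1 < v.size(); ++i)
        for (std::size_t j = 0; j + 1 < v.size() - i; ++j)
            if (v[j] > v[j + 1])
                std::swap(v[j], v[j + 1]);
}

// O(n log n), already written and debugged for you -- this is where
// the STL and 3rd party libraries come in.
void librarySort(std::vector<int>& v) {
    std::sort(v.begin(), v.end());
}
[/code]

Both are instant on 5 elements; the difference is what happens when the list grows.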

So my priority list when I code (and get my say) looks like this:

1.  Choose algorithms with low Big O
2.  Write tests to validate the results of my algorithm, and check for pathological errors (there's a small sketch of this right after the list).
3.  Write code to fulfill those tests by any means necessary.
4.  Refactor the code I just wrote to look pretty.
5.  Profile for performance issues if necessary.  Ignore performance issues if not necessary.
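
Step 2 can be as lightweight as a handful of asserts.  A toy sketch (wrapAngle is a made-up example, not anything from Darwinbots):

[code]
#include <cassert>
#include <cmath>

// Function under test: wrap an angle into [0, 360).
double wrapAngle(double degrees) {
    return degrees - 360.0 * std::floor(degrees / 360.0);
}

int main() {
    // Step 2: the tests, including the pathological cases
    // (these inputs are all exactly representable, so == is safe here).
    assert(wrapAngle(0.0) == 0.0);
    assert(wrapAngle(360.0) == 0.0);
    assert(wrapAngle(725.0) == 5.0);
    assert(wrapAngle(-90.0) == 270.0);
    // Step 3: write wrapAngle() by any means necessary until these pass.
    // Steps 4 and 5 come later, if they come at all.
    return 0;
}
[/code]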

So as an end result I have a lot of code that is easy to follow, with good algorithms and tests to ensure they work correctly, but that is written in an inefficient manner.  At the end of the day I might refactor those inefficiencies or I might not; it just depends on whether it's actually necessary.

Moonfisher:
The only time I saw GTA lag was when a large number of NPCs got hit and went into ragdoll mode... so it would seem like it was just taking too long to process the physics and collisions of all the ragdolls...
But I think a lot of the time, if you're going to build on something, you never know exactly how far you're going to take it, so when building a base it feels safer to optimize performance right away, so you don't risk having to make a change later on that will affect all the expansions or inherited... stuff... (I'm feeling very intellectual right now.)
I kind of had the impression that lighting and physics were the big bottlenecks, though... and I don't think most people have a physics card, so I would imagine that code optimization should help, even if it's not the main issue.
Anyway... I'm just an intern... what do I know...

Commander Keen:
Readability is more important than optimisation. I've got lots of unfinished QB programs which will never be completed because I can't understand my own code.
My VB6 programs have a much higher success rate, even if that may be because I had learnt to program properly by then.

At the end of the day, PCs are getting fast enough that you don't need to optimise everything. When I used to program on my 386 laptop in QB, I found optimisation vital, and I wrote several test programs that demonstrated this. Now, with computers capable of calculating orders of magnitude faster than that old laptop, we don't need so much optimisation, which makes interpreted and VM-based languages like Java and Perl usable for serious applications.

Ok, that's my raving done. No guarantees for reliability, because I am an amateur programmer, and most of what I know is self-taught.
