My main issue, whenever I've tried to write imaginary DNA to test ideas I've had, is that the current language has no good way to express variable-length arrays.
For instance, I could do something like: the array starts at X and its length is stored in Y. But then how would I write code in the current language that acts on all elements between X and X+Y (+1 depending on how you count)? Usually you'll have some function that you want to apply to each element of your array.
And then, assuming that I do have a way to do this, I need to be sure that my array doesn't cross over into the sysvars, i.e. protected memory.
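To make the scheme concrete, here's a minimal Python sketch of the idea above: a flat memory, an array described by a start address and a stored length, a helper that applies a function to each element, and a bounds check against a protected region. All the names and numbers (the helper, SYSVAR_START, the 1000-cell memory) are assumptions for illustration, not part of the actual DNA language.

```python
# Hypothetical sketch of the flat-memory array scheme described above.
# Memory is one flat list of cells; SYSVAR_START marks the assumed
# beginning of the protected sysvar region.

SYSVAR_START = 900

def for_each(memory, start, length, fn):
    """Apply fn to each element in memory[start : start + length],
    refusing to touch the protected sysvar region."""
    end = start + length
    if end > SYSVAR_START:
        raise IndexError("array would cross into sysvars")
    for addr in range(start, end):
        memory[addr] = fn(memory[addr])

memory = [0] * 1000
memory[10:15] = [1, 2, 3, 4, 5]        # array starts at X=10, length Y=5
for_each(memory, 10, 5, lambda v: v * 2)
# memory[10:15] is now [2, 4, 6, 8, 10]
```

The point of the sketch is that even this trivial "apply a function to every element" loop needs an explicit length, an index register, and a bounds check, none of which the current DNA can express directly.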
And then, this isn't a very evolvable structure. If there were an easy way for evolution to operate on large, potentially variably sized arrays, that would be a boon.
---------------------------------------------------------
Expanding on your idea Eric, and incorporating codules which is where this is really going to be useful anyway, I would say that if we go that route, we'd need operators like:
[array] codule calleach - passes each element of [array] (I'm ignoring for a moment how the DNA addresses arrays) to a separate instance of codule.
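A rough Python model of what `calleach` might mean, treating a codule as a plain function (the real proposal presumably spawns separate codule instances; this is just a sketch, and the example values are made up):

```python
# Toy model of the proposed `[array] codule calleach` operator:
# hand each element of the array to its own invocation of the codule
# and collect what each one produces.

def calleach(array, codule):
    return [codule(element) for element in array]

# Hypothetical usage: classify a set of distance readings.
readings = [40, 0, 75, 10]
threats = calleach(readings, lambda d: 1 if 0 < d < 50 else 0)
# threats == [1, 0, 0, 1]
```

The useful property is that the DNA never names a loop counter or an end address; the operator owns the iteration, which is also what would make it mutation-friendly.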
But in the end, I think using the classic "flat" memory model of DOS and the old days is a poor idea. If I were to completely reinvent the DNA, I would provide baseline support for vectors (2, 3, and 4 dimensional), matrices (2, 3, and 4 dimensional), scalars (floats, integers, enumerated types (say, for instance, colors), fixed-point numbers in [0,1] or [-1,1], etc.), stacks, queues, lists, and maybe a few other things.
But then, I have no idea what it would mean to add 3.6 and blue. So I think our flat memory will exist for a while longer.