I will no longer respond in the Crow post, since the discussion has drifted far from crows.
Numbers have been used to quantify ideas of behaviors, or other ideas, which are not measured by any object undergoing the implications of that number. For example, a rock with 100 kg of mass will weigh 980 newtons, but the rock does not need to know this for the result to hold. So the idea of 100,000 grams having an applied force of 9.8 times its mass matters only to a being that understands the concept of a number beyond quantifying how many offspring it had, or how much food it needs. The same is true of velocity vectors: the result comes out the same regardless of the numbers, because the numbers quantify the result and the result quantifies the numbers (otherwise the numbers do not truly quantify what it means, perceptively, to arrive at that result). This is because numbers are meaningless beyond the experience of using them to quantify something.
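The weight figure above is just W = m·g with the standard g ≈ 9.8 m/s²; a quick sketch (the helper name is my own, purely illustrative):

```python
# Weight under standard Earth gravity: W = m * g, with g ~ 9.8 m/s^2.
# (weight_newtons is an illustrative helper, not from the post.)

def weight_newtons(mass_kg: float, g: float = 9.8) -> float:
    """Gravitational force on a mass, in newtons."""
    return mass_kg * g

print(weight_newtons(100))  # ~980 N for the 100 kg rock
```

The rock, of course, produces this force whether or not anyone computes it, which is exactly the point being made.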
To understand how we quantify our thoughts, we must first understand how we categorize a thought. Since this is entirely up to the individual, the receiver must dictate how one thought correlates with another, allowing for connotations and denotations (our two main categories for how one sifts through the information presented). Once the intention is understood, the thought can be labeled an entity or a name (self-dependent or exo-dependent, meaning this can be determined by the speaker or the receiver; it does not matter whether they are words, which is part of my point about the counting system, but to avoid overlaps in the theory we will assume exo-dependent means the information category depends on another part of the speech, or on the speaker). To begin, we start with the basics: Who?
To know what a receiver is seeing, it must be able to tell whether it recognizes something or someone. If a being that has experienced nothing before comes across a rock face on a cliff, it will recognize that rock face later, which is key to knowing the environment. But for every object we perceive, there exists a finite number of further objects to be viewed and given a name (or names), to sidestep arguments over things like specialization versus the intelligence of the being. We can assume that if the being is alive, its field of perception holds some emptiness, meaning there is room to see. For every object seen, there remains an infinite amount of information withheld from simply knowing its name, or what we call something or someone. To really understand the environment, this being must ask more than just who; it needs some way to classify the type of entity (another kind of who), so that if it calls a rock a rose and later sees a similar rock, it knows that rock belongs to the rose group (or whatever it dictates a rock to be). This allows for a set of more complex questions which must be fulfilled before the being can have any real intelligence. It knows who and what, but it does not have a clue what its food is or what it needs to do in order to survive.
The next step is to assume the being quantified emotions much the way it quantified the categories of thought. These emotions can then be used to quantify another experience: time. The question of when something occurred is key to answering the next set of questions we have yet to quantify for this being. A change in emotional status, such as hunger or pain, or pleasure and abundance, can be used to relate events (something I will get to later). The being comes to a rock and takes the course of a lion, running up it; since the being enjoyed watching a mountain lion pounce from cliff rock to cliff rock, it attempts the same and falls. The change in emotional status changes the way the being perceives rocks, mountain lions, and/or itself simultaneously, as the experience challenges the being's limited experience, assuming it has yet to identify itself. For all it knows, it could have been a mountain lion; this result begs to differ with its memories. So through the grace of a mistake, the being now understands a concept of when: it saw a mountain lion previously, attempted to follow after it in the same fashion, and failed. A chronological order based on emotional memory. I won't go into quantifying emotions, as that would be like applying smileys to every number without a simple system.
Because it now understands a process of memories which dictate who, what, and when, it can begin to draw a map of its surroundings. The rock was there, I went somewhere else, and the mountain lion is still in its cave. It's not incredibly accurate, it's not GPS, but the being knows from its perspective that it is below a rock, and below a lion. It knows somewhat more about the mountain lion and the rock than about the earth it landed on, and has something of an idea of where it is. As it explores and experiences more, this map expands, and its way of quantifying distances changes with each alteration to the previously quantified thoughts.
Lastly comes why. Why did the being fall where the mountain lion had succeeded? For a being with nothing to go by, no previous generation, this is quite difficult to answer from one experience alone. It must attempt to follow the behaviors of many other beings to fully understand its place in the ecosystem, and/or to know its purpose in asking such a stupid question. It's quite obvious to us that if we saw a mountain lion jumping, we wouldn't attempt to do likewise; we'd shoot it, because we have been taught that the lion is for food. The being will eventually know what is food, after trying anything and everything it comes across (assuming nothing can kill it). So how long will it take the being to arrive at a full answer to all of its questions? That's another set of numbers to be given a meaning.
You're in murky waters, specifically cognition theory. While what you're saying isn't necessarily incorrect, you can't take it as fact that this is the way animals and people really work. The mind is a murky place, and there are as many theories as there are researchers. Most modern science treats the brain as a black box, where you can only guess at the input and try to observe the output, with the process in between unknowable (though the end results of whatever that process is are knowable). That methodology is falling out of favor, though.
For every thought, there exist five others that give that thought a meaning.
You mean the who, what, when, where, why of newspaper reporters?
5 Ws? First off, you're missing one: "How". Second, not all knowable data fits so nicely in that framework. It's more an information-gathering heuristic for quickly getting a grasp of a situation in a short time (very important for a reporter with a deadline who doesn't care about niggling details). In science settings it isn't used. Instead, the heuristic is the scientific method, which is more nebulous about what sort of questions it asks. And in another comparison, statistics doesn't use any of the 5 Ws. It uses yes/no questions like "does this data support the null hypothesis?". After these yes/no questions are answered, it asks things like "how confident am I about my conclusions?", which does use the 5 Ws, so they do have a place. They're just incomplete.
I guess the point I'm making is that you shouldn't use constructs of language to try to find a fundamental level of information and its processing, because you're already introducing bias (maybe there is information which is the answer to a question that cannot be asked in English). You have to delve into abstract mathematics. Thankfully you aren't the first person to think about this sort of thing, and there's a whole branch of mathematics that deals with exactly this, in its purely abstract form. It doesn't necessarily address how life forms process data, of course, but that's the realm of cognitive science and behavioral biology, maybe.
So we can take the total entries in the Encarta encyclopedia and assume that it contains ALL current and up-to-date information within its archives.
That's a mighty big assumption. Wikipedia generally scores higher in coverage tests than print encyclopedias, and it still is a far cry from covering all current and up to date information.
For every entry, there will exist 5 sets of information to explain the concept fully. For every set, there will be an entirely different set of information used in the Dictionary to again relate the 5 sets to themselves. So what is the total amount of information that can possibly be gained from the universe? Well, it's quite large; so large, in fact, that the Universe itself would fit inside it multiple times if it were quantified as volume.
You lost me. You're saying that the amount of information in the universe is so large that the universe wouldn't be able to hold an encyclopedia containing all that information, because it would be so freakin' huge? Possibly, but remember that most information in the universe is probably redundant. Statistical significance can be gained using only a small sample of the global population. Plus, data compression is possible for data that is highly patterned.
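The redundancy point can be demonstrated directly; here's a minimal sketch using Python's standard zlib module:

```python
import zlib

# Highly patterned data compresses dramatically, so redundant information
# need not be stored at full size.
patterned = b"rock and lion " * 1000  # 14,000 bytes of pure repetition
compressed = zlib.compress(patterned)

print(len(patterned), len(compressed))
assert len(compressed) < len(patterned) // 100  # well under 1% of the original
```

Truly random data, by contrast, would barely compress at all; the compressibility of a string is one rough measure of how much genuine information it carries.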
as n → ∞
C = (E^5 * D^25) * L, where C is a constant for universal translation and L is the total number of known languages in use (not including theoretical ones).
CB = all known celestial bodies in the Universe
LB = CB divided by all celestial bodies in the Universe on which life can exist.
((E^5 * D^25)^n * S * CB * LB)^5 / C = T, where E is Encyclopedia entries, D is Dictionary definitions, n is the current thought in the series, or information (assuming something like the letter "a" is 1 on the scale), T is the total number of ideas, and S is all the species of the world with more brain power than bacteria. (This assumes all information catalogued in the Encyclopedia on astrology and astronomy contains full details, such as a database of all known stars and systems.) If this assumption holds false, simply add those numbers to E and D before raising them to the 5th or 25th power. Also, I cannot find any information which holds the total information of any alien species, so we cannot fully calculate the entire Universal Unknown, but we can do our very limited perspective.
Leaving aside the pseudoscience and judging what you have on purely mathematical grounds: you can't just make up equations and raise them to various powers. Why are we raising dictionary definitions to the 25th power? Why not the 24th? Or the 26th? Plus, what sort of units are you using? You have to make sure that the units of your equations properly balance (à la stoichiometry). It looks to me like your variable T will have units of something like entries^5 * definitions^25, which is a bogus unit of measure if I ever saw one.
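The unit-balancing complaint can be made concrete with a toy bookkeeping class (entirely my own sketch; `Unit` is not a real library):

```python
from collections import Counter

class Unit:
    """Toy dimensional analysis: track the exponent of each base unit."""
    def __init__(self, **dims):
        self.dims = Counter(dims)
    def __mul__(self, other):
        return Unit(**(self.dims + other.dims))
    def __pow__(self, k):
        return Unit(**{u: e * k for u, e in self.dims.items()})
    def __repr__(self):
        return " * ".join(f"{u}^{e}" for u, e in sorted(self.dims.items()))

entries, definitions = Unit(entries=1), Unit(definitions=1)
T = entries ** 5 * definitions ** 25
print(T)  # definitions^25 * entries^5 -- the bogus unit called out above
```

A physically meaningful equation would have to cancel these exponents out, the way kg · m/s² cancels into newtons; nothing in the proposed formula does.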
Also, many of your quantities aren't quantifiable but are qualitative ideas. For instance, "all the species of the world with more brain power than bacteria" assumes, first of all, that a quantifiable measure of intelligence is possible (a shady prospect even among humans). It also has the obvious bias of calling intelligence "brain power", and thus gives preferential treatment to things with brains (though I can overlook that as a colorful use of language, if I need to). Third, you're assuming that bacteria can have no "ideas", or information. But that's clearly not the case. A bacterial genome, for instance, certainly contains a great deal of non-trivial information. In addition, bacteria, like most life, can contain data that lies outside their genome, called epigenetic information, plus probably a whole host of other things bacteria can do that I'm not touching on. Fourth, it seems that you're fundamentally assuming that all information in the universe is an idea that is knowable. That is certainly a basis for science, but I think it's a pretty big if. Maybe there's data that's fundamentally unknowable. Should that count?
To know how much of the total information has been collected (not assuming this information adds to the previous series), we take all the known information an individual possesses for, say, 100 people worldwide, multiply that by the total population of the Earth, and divide it by the Total.
100 people is not statistically significant, I don't think.
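For what it's worth, whether n = 100 is adequate depends on the margin of error you'll tolerate; a standard back-of-the-envelope for a sampled proportion (my own sketch, using the usual normal approximation):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a sampled proportion."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(100), 3))   # about 0.098, i.e. roughly +/-10 points
print(round(margin_of_error(1000), 3))  # shrinks to about +/-3 points
```

So 100 people can be statistically meaningful for coarse questions, but only if they are randomly sampled, which "100 people worldwide" gives no reason to believe.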
So while we learn more, the percentage of known information grows incredibly slowly, while the total number of known things grows exponentially.
I think you mistyped. Did you mean that the total number of unknown things grows exponentially? If so, what is your basis for this assertion? Your previously made-up equation? Something you read? I don't think you can just assume this a priori/from first principles.
We will assume the known-information average and call it I.
I * P / T = k, where I is the knowledge average (or IQ, if that suits you; it doesn't matter, as it requires the same amount of information), P is the population of the Earth, and T is total knowledge.
Again, you can't just make up equations to suit your needs. What proof have you given me that this equation is an accurate depiction of reality? I can make up an equation that would seem to say the exact opposite:
T/P = k
where T is the total information that is possible to know, P is the population of all substances (life, molecules, sentience, etc.) capable of storing and retrieving data, and k is the information constant, which is 1.0 data * holders, to an accuracy of ± 9.24 * 10^-20 data * holders. According to my (entirely fictitious, mind you) equation, as the total number of things to know increases, so does the number of things capable of understanding that new information. And the reverse is also true.
The unknown, I assume, is incalculable, since T is geometric to itself, or a reflection of itself, due to the quantifying problem that numbers can overlap and mean different things.
"Incalculable" is not a recognized term in science. Did you mean uncountably infinite? Also, in what way can something be geometric (exponential?) to itself? And what is "the quantifying problem"? I think the core problem here is that you hear people using terms you are not familiar with (countably vs. uncountably infinite, etc.) and think they are making stuff up, so you can make stuff up too (hint: they are not. You might not know everything, so you should actually ask questions about the terms another person is using if you do not understand them). Botsareus had this phenomenon as well, I think. But it's not true. If you do invent a new idea and try to describe it in words, you need to define it first. Being obtuse and being clever are not the same thing.
Since we know that E^5 is relative to time,
We do? Maybe the body of all known knowledge is constant, and the more we learn, the more an alien species somewhere else forgets. We are talking about universal knowledge here, which may have properties we are unaware of. Now, we can take it as a base assumption, but if we do, we need to use language like "if we assume that..." instead of "we know that...".
and multiplied by its numerical value, 1 for Who, 2 for What, 3 for When, 4 for Where, and 5 for Why,
Why is "Why" exactly 5 times as much information as "who"? By what reasoning can you make that claim?
as it requires the previous set of series to complete a thought, for those with trouble orienting thoughts and what is important. So someone with an IQ of 134 should remember a face in a room of 5 people in v^5 * T / (T - (IQ * T)),
You know my stance on made-up equations. And you didn't define v. Is it velocity? That would explain why knowledge is relative.
since it would require all known knowledge for someone with a lower IQ. The time can be found by multiplying the result by IQ; the result should then be in seconds (assuming, again, that the IQ provided is 99.9% accurate,
Really? Would the units change from seconds to meters if the IQ provided was only 98% accurate? And why 99.9%, and not, say, 99.89%, or 99.91%? You wouldn't be making up random error thresholds, would you? Because you know my stance on made up equations (I disapprove).
Te = 134*(1^5*5^1 /(5^1-(134*5^1))) = one minute, which is considerably fast for someone with such a low IQ.
You never defined Te. Also, your equation as presented gives an answer of Te = -1.007... seconds, which is not the same as one minute. Not to mention my disapproval of made-up equations (although I do seem to keep mentioning it...).
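Plugging the quoted numbers straight into the formula confirms the arithmetic:

```python
# Te = IQ * (v^5 * T / (T - IQ*T)), with the values given in the post:
# v = 1, T = 5 (the 5 people in the room), IQ = 134.
IQ, v, T = 134, 1, 5
Te = IQ * (v ** 5 * T / (T - IQ * T))
print(Te)  # about -1.0075 -- a negative second, nowhere near sixty of them
```

The denominator T - IQ*T is negative for any IQ above 1, so the formula can never produce a positive time at all.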
So we now have a value for how long it takes to learn something related to all previous knowledge, which assumes that the more you learn and remember, the harder learning becomes as you progress, or as you collect too much information. That assumes all humans have a finite amount of memory, which seems to hold true, since we can remember a lifetime but often lose the ability to learn as fast.
Indeed, the possible information retention of humans is largely unknown. Old people seem to have a harder time learning new things, but that might be a function of stubbornness rather than limitations of the mind. So I will concede a possible upper bound on the information a single human brain might hold. But I would also point out that, with the possibility of transhuman improvements, the capacity for information retention may not hold constant across all time (or even into the relatively near future).
If you don't want to assume this, then simply take AGE * log10(IQ) / v^5, where the log assumes that there is no limit on the information that can be learned, and that the amount of information stored is relative to time.
I fail to see how this equation is derived. Again, made-up equations are not cool. What would the Fonz say?
So the amount of total information in the known Universe could be defined using time, assuming that no information exists before we appreciate it. Since the definition of time, or the answer to "when", was related to emotional markers, we know that each experience draws on those markers, and thus as time progresses the amount of information would grow indefinitely.
It's possible that black holes might be able to destroy information, so it's possible that this could balance increases in information.
(15^(Time - Te) * S * CB * LB^v) / C = T, where 15 is the sum of all the numerical values (1 + 2 + 3 + 4 + 5), raised to the power of time, or the span from now until it is discovered, Time - Te.
You can't raise something to the power of time without balancing it on the other side of the equation with a similar exponential (though I doubt such an equation would be meaningful), since the stoichiometry won't balance and won't even be deterministic. Again, made-up equations are bad.
Feel free to tweak it some, and ask me if any of it is too confusing. Enjoy.
Indeed. I am especially confused by your cavalier use of mathematics without the accompanying statistical analysis of real world data or theoretical underpinnings in information theory.
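As a taste of those underpinnings: information theory does offer a principled way to quantify information, namely Shannon entropy. A minimal sketch:

```python
import math
from collections import Counter

def shannon_entropy(data: str) -> float:
    """Average bits per symbol: H = -sum(p * log2(p)) over symbol frequencies."""
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

print(shannon_entropy("aaaa"))  # zero bits: a constant string carries no surprise
print(shannon_entropy("abcd"))  # 2.0 bits: four equally likely symbols
```

Note that entropy is defined over a probability distribution, not over vague categories like "who" and "why", which is precisely the kind of rigor the equations above lack.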