6-10-06 Programming
Over the past two and a half decades, I've had occasion to work with different programming languages for computers. Different jobs have required learning various disciplines, using different tools, and so forth. What I've tended to be involved with has been using this sort of stuff to produce something specific, like Quality Assurance testing programs, diagnostic software tools for particular products, and so on. In other words, the world of programming was hardly my major interest, just a means to an end.
Computers and programming have come a long way in the past quarter century. We sometimes consider computers to have a life of their own these days, since they've become so sophisticated. Sooner or later, computers will become so small that we'll be able to make robots out of them that look and act almost like people!
Let's just suppose for a moment that somewhere along the line, robots will be produced that look like people, act like people, and seem to have a life of their own. People are working towards this kind of end product all the time right now, so it isn't outside the realms of future possibility.
These robots will run on software. There will have to be billions of lines of programming code written, and it will have to be stored on some fantastic future technology that holds a lot more information than we're capable of storing today, unless we make the robot's "brain" the size of a house.
But let's just imagine that at some point in the not so distant future, this is going to happen, and we'll be living in a world full of robots.
Then, to go one step further, there could come a time when humans die out, but the robots remain. They continue to function for indefinite time spans. They know how to repair themselves, and they know how to produce new robots. So, there could be a whole society of robots. Eventually, they begin to try to figure out what really makes them tick, so they study their insides and gather all the technical information they can, and sooner or later they discover that there are information bits (zeros and ones) stored in their robot brains. There are so many sequences of zeros and ones stored in their brains that it seems impossible to ever make any sense out of them.
But they know that it's the zeros and ones, and the sequences those zeros and ones are in, that "make them tick". Eventually, they build computers to sort out the zeros and ones, and they write programs to make sense of these vast quantities of zeros and ones. Then they start messing around with these zeros and ones, and put them into new robots, and observe what (if any) changes this produces. And they spend a lot of time messing around with little pieces of the software, finding that this sequence produces one thing, and that sequence produces another thing, and so on.
Well, that's about where the human race is at right now with DNA. We don't view the DNA as a programming code, but rather as something that "just happened" to evolve into what we see around us. It's all "by chance", so nobody's willing to consider that the genetic "coding" might have originated as a "programming language" at all.
But that's how I look at it.
Our current level of scientific technology is messing around with genetic coding, completely unwilling to believe or even consider that it's a programming language, and therefore just basically blundering around under the dogmatic view that it's all just "hardware". This would be akin to the robots working on their own software programming from the same stance, i.e., all those zeros and ones "just happened" to work out "by chance" to create the society of robots.
The way a programming language works on a computer is based on the binary electronic state of each "bit". A bit in a computer can be either "on" or "off": this is a hardware electronic state, and the place in memory where the bit resides is either on or off. Programming at that level is called "low-level programming", where you actually set each bit individually. To get more done in less time, programming "languages" were developed; otherwise we'd still be trying to program computers to change the dots on the screen from black to white one at a time.
"High level programming" languages include tools such as Visual Basic, C language, and so forth. These make it easy to do much more complicated things with the computer.
It's not difficult to look at DNA as being comprised of the "bits" of a low-level programming language for the "hardware" of plants and animals, except that this isn't a binary (base-2) programming language at its lowest level. It's a base-4 system: four separate chemicals, the bases adenine, cytosine, guanine, and thymine (A, C, G, T), are used to store the "bits", so each position in the sequence carries one of four states instead of two. Whatever the details turn out to be, that four-state code would be the low-level programming basis for the functions.
If it's a base-4 system, then the higher-level programming language would have to be written to accommodate it that way.
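To make the base-4 analogy concrete, here's an illustrative sketch that treats the four DNA bases as the four digits of a base-4 code, so that ordinary binary data can be round-tripped through a "DNA" string. The A/C/G/T-to-digit mapping is my own arbitrary choice for the example, not anything biology defines:

```python
# Illustrative only: DNA's four bases viewed as base-4 digits.
# Each base carries 2 bits, so four bases encode one 8-bit byte.
# The mapping A=0, C=1, G=2, T=3 is arbitrary, chosen for this sketch.

BASE_FOR_DIGIT = "ACGT"
DIGIT_FOR_BASE = {b: i for i, b in enumerate(BASE_FOR_DIGIT)}

def bytes_to_dna(data: bytes) -> str:
    """Encode each byte as four base-4 'bases', most significant first."""
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            out.append(BASE_FOR_DIGIT[(byte >> shift) & 0b11])
    return "".join(out)

def dna_to_bytes(seq: str) -> bytes:
    """Decode groups of four bases back into bytes."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | DIGIT_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

message = b"Hi"
encoded = bytes_to_dna(message)       # "CAGACGGC" under this mapping
assert dna_to_bytes(encoded) == message
```

None of this says anything about how real genetic information is organized; it only shows that a four-state chemical alphabet is a perfectly workable storage medium for a "low-level" code.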
Further, the "hardware" in these systems is chemical based, instead of electronics based, so the manner in which the bits are set is still not coming into focus in this society.
We don't have to speculate on "who wrote the software" or how it all came to be what it is today, in order to view the phenomena of life more objectively, and its most basic physical manifestations in DNA as a programming language that possibly came to exist AS a programming language.
To dogmatically rule such things out upon adherence to the "by chance" belief is to be quite unscientific about it.
Removing that kind of dogma from the applications of science would probably tend to make it all so much more interesting, don't you think?
Meanwhile, our genetic "breakthroughs" continue to filter down to the public like this...
http://www.sciencedaily.com/releases/2006/06/060609170032.htm
...In other words, (yawn)... how boring.