"The future ain't what it used to be."

bill nye trick insight


Guest

for anyone who wishes easy access to the bill nye thread, here it is:

http://xone.net/board/tti/posts/993.html

now, i've read the thread, Lee, and i agree totally. i should have read a little more and i would eventually have gotten to it. now, onto what i really have to say. (note to time travellers: this has nothing to do with time travel! sorry! *grin*)

my insight was actually about the binary counting system. all current mathematical work and theory (calculus, linear algebra, etc etc) is done in base 10. everything we know about math is done in base 10. suppose for a second that someone thought: "hey, wait! if we did things in base 2, could we possibly see things more easily?" suppose, for example, that a chain-rule integration that's messy in base 10 turns out to be quite simple when you do it in base 2? i have no idea where to go with this thought, but has anyone actually tried to convert advanced math theory to base 2 and looked at the results??
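[editor's note: a minimal sketch of why the answer turns out to be "the math doesn't change" — the base only changes how a value is *written*, not what the value is. The `to_base` helper here is an illustrative name, not from any post in the thread.]

```python
# Arithmetic is independent of the base we write numbers in: the *value* of a
# computation is the same, only its textual representation changes.

def to_base(n, base):
    """Render a non-negative integer n as a digit string in the given base."""
    if n == 0:
        return "0"
    digits = []
    while n:
        digits.append("0123456789abcdef"[n % base])
        n //= base
    return "".join(reversed(digits))

# 13 * 7 is 91 no matter how we spell it:
product = 13 * 7
print(to_base(product, 10))                  # "91"
print(to_base(product, 2))                   # "1011011"
print(int("1101", 2) * int("111", 2) == 91)  # same operands written in binary
```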

SteelGolem
 
In a "former life" I used to be an Electrical Engineer. We worked a lot with Fourier Transforms as they are very useful for developing electronic filters.

You can do Fourier Transforms in Base 10 by hand. We created analog devices that essentially did this. However, while we were designing analog Fourier Transform devices, others were developing digital Fourier Transform devices using specialty computer chips. These devices only know 1's and 0's, i.e. binary.

The algorithms they used were different than ours, but the results were the same. They were only better because they were tuned for the application. There were no new insights to be gained by doing the math in Base 2.
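[editor's note: the agreement described above can be sketched in a few lines — a direct O(N²) DFT next to a toy radix-2 FFT, a hypothetical stand-in for the algorithm those specialty chips run. Different algorithms, same spectrum.]

```python
import cmath

def dft(x):
    """Direct O(N^2) discrete Fourier transform (the 'by hand' definition)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def fft(x):
    """Recursive radix-2 FFT; requires len(x) to be a power of two."""
    N = len(x)
    if N == 1:
        return x[:]
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * N
    for k in range(N // 2):
        t = cmath.exp(-2j * cmath.pi * k / N) * odd[k]
        out[k] = even[k] + t
        out[k + N // 2] = even[k] - t
    return out

signal = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]  # a sampled sine
# Same input, same output, to within floating-point rounding:
print(all(abs(a - b) < 1e-9 for a, b in zip(dft(signal), fft(signal))))
```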

If you could create a transistor that could attain 10 well-defined states rather than the 2 states that current transistors attain, then you could create a Base 10 computer. This would probably be silly, but a Base 16 computer would be quite useful. You could save a lot more data in a lot less space if you have Base 16 memory units. I don't think you would get any new insights by solving problems with Base 16 memory, just quicker access to more data.
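[editor's note: the density claim is easy to quantify — counting printed digits is the same as counting memory cells, so here is the same value stored in 2-state, 10-state, and 16-state cells.]

```python
# How many cells does it take to hold the same value if each cell can sit in
# 2, 10, or 16 well-defined states? The digit count in each base tells us.
value = 1_000_000
widths = {base: len(format(value, fmt))
          for base, fmt in ((2, "b"), (10, "d"), (16, "x"))}
print(widths)  # 20 binary cells vs 7 decimal vs 5 hex
```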
 
Thanks for the compliment.

The chain-rule integration calculation would, of course, still produce the same value in binary as it would in decimal.

For us humans, decimal makes writing shorter numbers a hell of a lot easier tho, doesn't it. When numbers get so large that we go to scientific exponential notation, the two sorta come together there, huh.
 
Re:Re:bill nye trick insight

At least 10-state transistors would allow us to start a digital circuit design based on a decimal hierarchy from scratch, huh.

As for the 16-state transistor, to me, when you think about it, we've already accomplished a de-facto equivalent as it is. Every time we make a size breakthrough in technology that reduces a transistor to 1/16th its former size, (something we've done several times now), we have in effect done just what you suggest here. Not EXACTLY the same thing, but the result is virtually identical. A 16-state transistor would still need 16 circuits to communicate its combined gate logic in the same manner 16 individual transistors do now.
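[editor's note: one way to quantify the trade-off being debated here — the information a single element can carry grows only logarithmically with its number of states, so a 16-state element holds as much as four binary elements, a real but modest payoff.]

```python
import math

# Information capacity of one switching/storage element with S well-defined
# states is log2(S) bits.
for states in (2, 10, 16):
    print(f"{states}-state element carries {math.log2(states):.2f} bits")
```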

I doubt at this juncture we could engineer a material that could carry 16 different possible defined voltages along a single circuit at the near-molecular level we are now operating at. We can do this with copper wire, but I don't see how we can do it with silicon only a few molecules thick. This is what we would have to do to save any resultant circuit path definitions from the 16-state transistor. Otherwise, how would it be an advantage over what we have now? It isn't the transistors that take up most of the space on a chip, it's the circuit paths. Reducing the number of circuit paths would be great, but we're pushing silicon to its molecular limit now. At this level, I don't see how we can ever design a circuit that can be anything but "on" or "off".

We need a whole new technology here before we're going to get a major breakthrough at this juncture, in my opinion.

But it's a great idea nonetheless.
 
Re:Re:Re:bill nye trick insight

You're right. We will probably only go to systems with more than two states when we move to optical or biological computers.

I'm not up on the latest technology in these areas, but it seems I have heard that they can attain very stable systems with more than two states.

Then of course there is the Holy Grail of quantum computing where each bit can attain all states simultaneously! <g>
 