The Intel paradox... hehe.
Actually, I don't have a link, but the next real leap in computers will be optical. Reducing the size of a transistor down to nonexistence is nice, but what would you do with a computer running at light speed?
With the discovery of a process to not only stop light and start it again, but to embed information in that light, there are some incredible possibilities if they can make it small enough... which I'm sure they will.
Some of the advances in computers (and many other things) often rest on assumptions of discovery, as in the above "process to stop light and start it again." I don't know if this particular discovery has been made or not, but I know of some that haven't been made, and may never be made. (The extravagant science-fictiony details of nanotechnology come to mind.)
And, also, there are any number of stories that assume computers of the future will operate much like computers of the present...something I find unlikely. But others have made that kind of mistake with other technologies---classic SF stories set in the (still) far future where elevators have operators and still-publishing newspapers have multiple editions...
quote:
And right now, I have to figure out what I did today that made all the typefaces here (and elsewhere) suddenly go up a few sizes, and then undo it.
If you're talking about sites displayed by your internet browser, go to View|Text Size.
Actually, the original observation Moore made was that the number of components on a chip was doubling every year. He later revised that to every two years, and the popular version eventually settled on a doubling of computing power every 18 months "all else being equal" or whatever. There's a lesson in that, I think.
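Just as back-of-the-envelope arithmetic (the doubling periods are the ones people have quoted over the years, not data from any particular chip), here's how much the choice of doubling period matters over a decade:

```python
# Compound doubling: how much the assumed doubling period matters.
# Periods are illustrative values from the Moore's-law folklore above.

def growth_factor(years, doubling_period_years):
    """Multiplier on transistor count after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

decade = 10
for period in (1.0, 1.5, 2.0, 3.0):
    factor = growth_factor(decade, period)
    print(f"doubling every {period} years -> roughly x{factor:.0f} in a decade")
```

A one-year doubling gives you three orders of magnitude in a decade; a three-year doubling gives you barely one. Small changes in the assumed rate swamp everything else, which may be the lesson.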
It sums up my general feelings towards science (of any kind) without a large applied use.
Smaller chips/technology improvements = more powerful PDAs = eventually something I can WRITE with that's not a clunky computer and not endless reams of paper. (Of course that assumes that handwriting recognition evolves to the point that it'll recognize MY handwriting, which I can't even read half the time).
quote:
I find it strange that Intel manages to decrease the transistor size, but the power requirements and heat produced still continue to climb.
It isn't just Intel, but the entire industry, including some companies who aren't on speaking terms with Intel.
The fact of the matter is that smaller transistors do consume less power individually, but remember that with smaller transistors come more transistors in a given piece of real estate. The critical dimension of the shrinking transistor gate scales linearly, while the number of transistors scales with area, a quadratic relationship.
No paradox.
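The linear-versus-quadratic point above can be sketched in a few lines. All the numbers here are made up for illustration (the baseline node, count, and per-transistor power are not real process data); only the scaling relationships matter:

```python
# Sketch of the linear-shrink / quadratic-count argument.
# Baseline figures are hypothetical, chosen only to show the scaling.

BASE_NM = 180        # hypothetical baseline gate length, nm
BASE_COUNT = 1e7     # hypothetical transistor count at the baseline node
BASE_POWER_W = 1e-6  # hypothetical power per transistor at the baseline, watts

def transistors_per_die(gate_nm):
    """Count grows with area: the square of the linear shrink factor."""
    shrink = BASE_NM / gate_nm
    return BASE_COUNT * shrink ** 2

def total_power_w(gate_nm):
    """Even if per-transistor power drops linearly with the shrink,
    the quadratic growth in count makes total power climb."""
    shrink = BASE_NM / gate_nm
    per_transistor = BASE_POWER_W / shrink   # linear improvement
    return transistors_per_die(gate_nm) * per_transistor

for node in (180, 130, 90, 65):
    print(f"{node} nm: ~{transistors_per_die(node):.1e} transistors, "
          f"~{total_power_w(node):.1f} W total")
```

Per-transistor power falls, total power rises anyway: no paradox, just a linear gain losing to a quadratic one.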
quote:
the next real leap in computers will be optical. Reducing the size of a transistor down to nonexistence is nice, but what would you do with a computer running at light speed
I disagree. Light speed is already a very significant factor in chip design because the things are getting so small that differences in the lengths of metal lines matter. Most modern chips have at least one clock, and several clocks on one chip are not uncommon. Those clock pulses must arrive at certain places at certain times, and the electrical signal (not the electrons themselves) travels at a large fraction of the speed of light.
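To see why wire lengths matter, compare a cross-die wire delay to a clock period. The die path length, signal speed, and clock rate below are assumed round numbers for illustration, not a real design:

```python
# Rough arithmetic behind the clock-distribution point above.
# All figures are assumed round numbers, not taken from any real chip.

C = 3e8                      # speed of light in vacuum, m/s
SIGNAL_SPEED = 0.5 * C       # on-chip signals travel at a fraction of c (assumed 0.5)

path_m = 0.02                # a 2 cm routing path across a large die (assumed)
clock_hz = 3e9               # a 3 GHz clock (assumed)

delay_s = path_m / SIGNAL_SPEED
period_s = 1 / clock_hz
print(f"wire delay: {delay_s * 1e12:.0f} ps, clock period: {period_s * 1e12:.0f} ps")
```

Under these assumptions the wire delay is a substantial chunk of a single clock period, which is why the arrival times of clock pulses at different corners of the die have to be engineered, not ignored.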
The next breakthrough is probably going in one of two directions: a) using carbon nanotubes to create transistors, or b) exploiting quantum effects.
Of course this illustrates my point: that the Average Joe of One Hundred Years Hence (or, in this case, Me of Right Now) is only going to know a certain amount about the technology he uses. The computers will change...this will not. (Ask yourself: what do you really know about, say, your phone? How does it work? How do you work it? Did they do it that way a hundred years ago? Will they do it this way a hundred years from now?)
Remember that technology is driven by other factors too. Think of need. What is your world like in 100 years, geographically, politically, economically... These are all driving factors in determining where technology ends up. The computer does not determine whether I will have food on the table, but how much money I have will determine whether I buy that new computer.
It'd be better to have a character (say, a mail clerk, to cite an example I know something about) bring up something simple (mailing a letter) that everybody knows (mail gets picked up here, gets delivered there). A little specialized information (the machines that sort the mail) could be dropped in to give a little extra twist, but not too much (what happens when the machine breaks and what has to be done to repair it).
(Though I'm a little dubious about anybody being able to build something technically complicated *completely* from scratch---the tools and parts would have to be built to build it, along with the tools and parts to build *them*...)
quote:
The computer does not determine whether I will have food on the table,
That is true, but for employees of Intel and Microsoft, your computer does put food on their tables.