This is topic "The next step in computing" in forum "Open Discussions About Writing" at Hatrack River Writers Workshop.


To visit this topic, use this URL:
http://www.hatrack.com/ubb/writers/ultimatebb.php?ubb=get_topic;f=1;t=002788

Posted by Matt Lust (Member # 3031) on :
 
Moore's Law is still holding true after 40 years

New Step in Computing link
 


Posted by Lord Darkstorm (Member # 1610) on :
 
I find it strange that Intel manages to decrease the transistor size, but the power requirements and heat produced still continue to climb.

The Intel paradox...hehe.

Actually, I don't have a link, but the next real leap in computers will be optical. Reducing the size of a transistor down to near nonexistence is nice, but what would you do with a computer running at light speed?

With the discovery of a process to not only stop light but also start it again, that light can carry embedded information. There are some incredible possibilities if they can make it small enough...which I'm sure they will.
 


Posted by Robert Nowall (Member # 2764) on :
 
Most people---me certainly---only know that "you drag the arrow to this icon, and click, and something happens...you do something else and something else happens." My new computer is faster than my old computer. And right now, I have to figure out what I did today that made all the typefaces here (and elsewhere) suddenly go up a few sizes, and then undo it.

Some of the advances in computers (and many other things) often rest on assumptions of discovery, as, in the above "process to stop light and start it again." I don't know if this particular discovery has been made or not, but I know of some that haven't been made, and may never be made. (The extravagant science-fictiony details of nanotechnology come to mind.)

And, also, there are any number of stories that assume computers of the future will operate much like computers of the present...something I find unlikely. But others have made that kind of mistake with other technologies---classic SF stories set in the (still) far future where elevators have operators and still-publishing newspapers have multiple editions...
 


Posted by Survivor (Member # 213) on :
 
quote:
And right now, I have to figure out what I did today that made all the typefaces here (and elsewhere) suddenly go up a few sizes, and then undo it.

If you're talking about sites displayed by your internet browser, go to View|Text Size.

Actually, the rule Moore originally expressed was that transistor counts would double every year; it was later revised to every two years, and the popular version settled on 18 months, "all else being equal" or whatever. There's a lesson in all that revising, I think.
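
To make the difference concrete, here's a quick back-of-the-envelope sketch (my own illustration; the starting count and horizon are made-up numbers, not data from the linked article) of how much the assumed doubling period changes a Moore's-Law-style projection:

    # Illustration only: compare projections under different doubling periods.
    # The starting count and time horizon are invented for the example.

    def projected_count(start_count, years, doubling_period_years):
        """Project a count forward assuming a fixed doubling period."""
        return start_count * 2 ** (years / doubling_period_years)

    start = 2_300   # hypothetical starting transistor count
    years = 30      # projection horizon in years

    for period in (1.0, 1.5, 2.0, 3.0):
        count = projected_count(start, years, period)
        print(f"doubling every {period:g} years -> {count:,.0f} transistors after {years} years")

Over thirty years, the gap between a one-year doubling and a three-year doubling is a factor of about a million, so which version of the "law" you quote matters a great deal.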
 


Posted by Matt Lust (Member # 3031) on :
 
I like that "or whatever"

It sums up my general feelings towards science (of any kind) without a large applied use.
 


Posted by nimnix (Member # 2937) on :
 
I've accidentally changed text sizes a few times, since pressing [CTRL] and scrolling the mousewheel up (smaller) and down (larger) will do it.

Smaller chips/technology improvements = more powerful PDAs = eventually something I can WRITE with that's not a clunky computer and not endless reams of paper. (Of course, that assumes handwriting recognition evolves to the point that it'll recognize MY handwriting, which I can't even read half the time.)
 


Posted by Spaceman (Member # 9240) on :
 
quote:
I find it strange that Intel manages to decrease the transistor size, but the power requirements and heat produced still continue to climb.

It isn't just Intel, but the entire industry, including some companies who aren't on speaking terms with Intel.

The fact of the matter is that the smaller transistors do consume less power, but remember that with smaller transistors come more transistors in a given piece of real estate. The critical dimension of the shrinking transistor gate scales linearly, while the number of transistors that fit scales with area, which is a quadratic relationship.
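
A rough back-of-the-envelope illustration (the figures below are made up, not taken from any particular process node):

    # Illustration only: shrink the linear dimensions of a transistor by a
    # factor s and roughly 1/s^2 as many transistors fit in the same area.
    # Even if each transistor uses less power, the chip total can still climb.

    shrink = 0.7                  # assume each new node scales linear dimensions by ~0.7
    area_factor = 1 / shrink**2   # transistors per unit area roughly double per node

    transistors = 1_000_000       # hypothetical starting count
    power_per_transistor = 1.0    # arbitrary units

    for node in range(1, 6):
        transistors *= area_factor        # quadratic gain from a linear shrink
        power_per_transistor *= 0.8       # suppose each transistor gets 20% more efficient
        total = transistors * power_per_transistor
        print(f"node {node}: {transistors:>12,.0f} transistors, relative chip power {total / 1_000_000:.2f}x")

In this toy model the chip's total power keeps rising every node, even though each individual transistor gets more efficient.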

No paradox.
 


Posted by Kolona (Member # 1438) on :
 
Oooo...cool, nimnix.

 
Posted by Spaceman (Member # 9240) on :
 
quote:
the next real leap in computers will be optical. Reducing the size of a transistor down to near nonexistence is nice, but what would you do with a computer running at light speed

I disagree. Light speed is already a very significant factor in chip design because the things are getting so small that differences in the lengths of metal lines matter. Most modern chips have at least one clock; several clocks on one chip is not uncommon. Those clock pulses must arrive at certain places at certain times, and the electrical current (not the electrons themselves) travels at the speed of light.
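
Some rough numbers (my own illustration, assuming a 3 GHz clock and signals travelling on-chip at about half the speed of light, both round figures):

    # Back-of-the-envelope check of why propagation time matters on a chip.
    # The clock frequency and on-chip signal speed are rough assumptions.

    C = 3.0e8                  # speed of light in a vacuum, m/s
    clock_hz = 3.0e9           # assume a 3 GHz clock
    signal_speed = 0.5 * C     # assume on-chip signals travel at about half of c

    period_ps = 1e12 / clock_hz
    mm_per_cycle = signal_speed * (1.0 / clock_hz) * 1000

    print(f"clock period: {period_ps:.0f} ps")
    print(f"a signal covers about {mm_per_cycle:.0f} mm per clock cycle")

    # How much skew does a 5 mm mismatch in two wire lengths add?
    skew_ps = (5e-3 / signal_speed) * 1e12
    print(f"a 5 mm length mismatch costs roughly {skew_ps:.0f} ps, "
          f"about {skew_ps / period_ps * 100:.0f}% of the clock period")

At those figures, a few millimetres of extra wire already eats a tenth of the timing budget, which is why clock distribution is such a big deal.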

The next breakthrough is probably going in one of two directions: a) using carbon nanotubes to create transistors, or b) exploiting quantum effects.
 


Posted by Robert Nowall (Member # 2764) on :
 
That did it, though I had to go offline twice and come back on to get it back to what I wanted (or what I'm used to). Why it reset itself is a mystery...I suspect it has something to do with this new ergonomic keyboard, which has an odd "zoom" key in the middle of everything.

Of course this illustrates my point: that the Average Joe of One Hundred Years Hence (or, in this case, Me of Right Now) is only going to know a certain amount about the technology he uses. The computers will change...this will not. (Ask yourself: what do you really know about, say, your phone? How does it work? How do you work it? Did they do it that way a hundred years ago? Will they do it this way a hundred years from now?)
 


Posted by Survivor (Member # 213) on :
 
Since I have the technical/theoretical knowledge to build basically any human technology I currently use from scratch (given the time and resources, of course, which is a huge "given"), I can't really speak to this issue. But it is a useful POV device, I'll admit that much.
 
Posted by yanos (Member # 1831) on :
 
I'm pretty sure Survivor is not an average joe. For most people in life we know what we need to know about technology. For others, like me, we know what we want to know.

Remember that technology is driven by other factors too. Think of need. What is your world like in 100 years, geographically, politically, economically...? These are all driving factors in determining where technology ends up. The computer does not determine whether I will have food on the table, but how much money I have will determine whether I buy that new computer.
 


Posted by Robert Nowall (Member # 2764) on :
 
Having a character display unusual technical knowledge isn't without its risks---it's not impossible to have a character with that knowledge, but it risks leaving the reader behind unless handled carefully.

It'd be better to have a character (say, a mail clerk, to cite an example I know something about) bring up something simple (mailing a letter) that everybody knows (mail gets picked up here, gets delivered there). A little specialized information (the machines that sort the mail) could be dropped in to give a little extra twist, but not too much (what happens when the machine breaks and what has to be done to repair it).

(Though I'm a little dubious about anybody being able to build something technically complicated *completely* from scratch---the tools and parts would have to be built to build it, along with the tools and parts to build *them*...)
 


Posted by Spaceman (Member # 9240) on :
 
quote:
The computer does not determine whether I will have food on the table,

That is true, but for employees of Intel and Microsoft, your computer does put food on their tables.
 


Posted by Survivor (Member # 213) on :
 
Like I said, the time and resources are a huge given in that situation. It's much easier for me to repair human technology, but almost anyone can do that.
 


Copyright © 2008 Hatrack River Enterprises Inc. All rights reserved.
Reproduction in whole or in part without permission is prohibited.

