Techno-Rant Part Deux! Computers Suck!

May 7th, 2003, 3:45 pm #1

Ol Doc is getting edgy. You see, I needs me a new puter. But I refuse to buy one.

I have a fairly impressive rig. It's a Power Mac G4 533 dual CPU with a gig of RAM and a 17-inch Studio Display CRT. Yet there are some new software titles (games, mostly) that it won't play.

I HATE the computer game. The day you get it home, it's obsolete. With Macs it's a little better; I still have several old Macs running and being quite productive.

And now, silicon is nearing its death. I would estimate about the year 2010, maybe 2015 at the latest, for silicon to go the way of the dodo. Nanotube technology has scored the ultimate prize: the ability to emit light.

http://www.siliconstrategies.com/manufa ... 30501S0035

Great stuff, really; they will be making microchips from these pretty soon that will do things we can only dream of today. So, why buy a computer now?

I would like to play games, that's why. But could I hold out for a few more years? I am barely aware of the passing of time like normal folk... Most of the time I dunno what day of the week it is.

Games like NWN for the Mac call me, but the system reqs are OUTRAGEOUS! OK, so the Mac version will run better and smoother than the peecee counterpart, but I don't give a flying flip at a rolling donut. SimCity 4 looks promising, but at the projected Mac reqs, I might be able to run it, just not very well.

Before, I never really balked at tossing out top dollar for a good computer. I paid over 10 grand for a Macintosh IIfx when they first came out; people who know will know why I did... It was a whole new beast as far as computing goes... My current machine has several grand invested. But now I balk at the very idea of spending even a dime. Why, just the other day I considered boxing it up and getting rid of it, along with a host of other infernal gadgets that reside in my home. And I was going to box it up too, but I just had to play a game of Pocket Tanks and blow stuff up.

The Technology Trap. I think we are all screwed.

I am thinking of buying a new eMac though, maybe the $799.00 model, because I feel like a cheapskate. Why, when I was younger, and a lot less cranky about these sorts of things, I would have bought the new 17-inch Macintosh PowerBook on the day it came out... What a sucker I was.

Any bets on whether or not I could hold out for the new nanotube computers that are bound to be coming our way?

Occhi

May 7th, 2003, 4:37 pm #2

"In tests conducted by IBM, its CNFET prototypes, even in their first-generation un-optimized state, bested the fastest, highly optimized, next-generation silicon designs on several metrics. For example, they two to four times as much current carrying capability, the company said."

My brain says . . . carrying more current brings with it . . . more heat. Pumping more 'water through the pipe' has energy costs.

So, will they develop heat sinks, economically, that dissipate more heat? Or will we just pay a higher AC bill for this great leap forward?

Or, is it significant that the form of transmission, light waves, means the increase in current flow comes without rising heat?

Brain just hit limit switch of microelectronic knowledge, on that one.
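
For what it's worth, the intuition checks out on the crudest possible model: treat the channel as a plain resistor, and dissipated power goes as the square of the current, P = I^2 * R. A minimal sketch of that arithmetic; the fixed-resistance assumption is mine, the article says nothing about resistance:

```python
# Back-of-the-envelope Joule heating: P = I^2 * R.
# Assumes a purely resistive channel at fixed resistance; real CNFETs
# are not plain resistors, so this is only the crudest intuition.

def joule_heating(current_amps: float, resistance_ohms: float) -> float:
    """Resistive power dissipation in watts."""
    return current_amps ** 2 * resistance_ohms

baseline = joule_heating(1.0, 1.0)  # arbitrary silicon reference point
for factor in (2.0, 4.0):           # "two to four times" the current
    ratio = joule_heating(factor, 1.0) / baseline
    print(f"{factor:.0f}x the current -> {ratio:.0f}x the heat")
# 2x current gives 4x the heat; 4x gives 16x, if resistance stays fixed.
```

If that holds, the heat-sink question stands; if the resistance shrinks along with the device, the picture changes.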

Occhi

May 7th, 2003, 4:44 pm #3

Doc wrote: Any bets on whether or not I could hold out for the new nanotube computers that are bound to be coming our way?

"Extending Moore's Law

"This a step on the road to extend Moore's Law, but now what we need to find out is what kind of performance we can expect from chips built with carbon nanotubes," Theis said. "Right now we don't know what their performance will be for sure." With the new process, IBM will be able to determine within the next few years whether carbon nanotube-based transistors can achieve higher performance than silicon transistors. "If we can prove that they can outperform any future silicon transistor, then IBM will bring the forces to bear to engineer a technology for mass producing them in complex circuits — hopefully a chemical synthetic techniques that grows just the kind of nanotubes that we want, where we want them," said Theis.

Moore's Law is a piker when compared to Gumperson's, Murphy's, or even Boyle's Law and Charles's Law.

In short, Moore's Law ignores that some things can look like a straight-line slope for a long while, but end up as asymptotes eventually . . . once you get enough data points plotted.
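
To see why, compare an exponential curve with a logistic (S-shaped) one: they are nearly indistinguishable early on, and only the later data points reveal the ceiling. A toy sketch, with made-up numbers:

```python
import math

# Early on, exponential growth and logistic (capped) growth look alike;
# the asymptote only shows up once enough data points are plotted.
# K (the ceiling) and r (the growth rate) are made up for illustration.
K, r = 1000.0, 0.7

for t in range(0, 16, 3):
    exponential = math.exp(r * t)
    logistic = K / (1 + (K - 1) * math.exp(-r * t))
    print(f"t={t:2d}  exponential={exponential:10.1f}  logistic={logistic:7.1f}")
# The two columns track each other at first, then the logistic curve
# flattens toward K while the exponential keeps right on doubling.
```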


Zed

May 7th, 2003, 5:04 pm #4

Doc wrote: Any bets on whether or not I could hold out for the new nanotube computers that are bound to be coming our way?

It's going to be a lot more than just a couple of years until we see this in mass-production quantities and at affordable prices. We've been talking about nanotech, writing our names in atoms, and so forth for how many years now? We're not really all that much closer...

I'm not saying we'll have to wait for 2050, but it'll probably be closer to 2025 than 2010.

Lissa

May 7th, 2003, 7:00 pm #5

"In tests conducted by IBM, its CNFET prototypes, even in their first-generation un-optimized state, bested the fastest, highly optimized, next-generation silicon designs on several metrics. For example, they two to four times as much current carrying capability, the company said."

My brain says . . . carrying more current brings with it . . . more heat. Pumping more 'water through the pipe' has energy costs.

So, will they develop heat sinks, economically, that dissipate more heat? Or will we just pay a higher AC bill for this great leap forward?

Or, is it significant that the form of transmission, light waves, means the increase in current flow comes without rising heat?

Brain just hit limit switch of microelectronic knowledge, on that one.
That's the answer, in truth... diamond is the best heat conductor known to man. It's an unusual material: typically, a material that conducts heat well also conducts electricity well. Not so with diamond: great heat conductor, sucks as an electrical conductor.

Now, if we can just get the diamond consortium to stop freaking out every time someone wants to make artificial diamonds, we could get diamond heatsinks going for a fraction of the price.
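
Rough room-temperature numbers bear this out. The values below are ballpark figures for illustration, not reference data:

```python
# Approximate room-temperature thermal conductivities in W/(m*K).
# Ballpark figures for illustration, not reference data.
thermal_conductivity = {
    "diamond":  2000,  # roughly five times copper
    "copper":    400,
    "aluminum":  240,
    "silicon":   150,
}

for material, k in sorted(thermal_conductivity.items(), key=lambda kv: -kv[1]):
    print(f"{material:9s} ~{k:5d} W/(m*K)")
```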

Lissa

May 7th, 2003, 7:02 pm #6

"Extending Moore's Law

"This a step on the road to extend Moore's Law, but now what we need to find out is what kind of performance we can expect from chips built with carbon nanotubes," Theis said. "Right now we don't know what their performance will be for sure." With the new process, IBM will be able to determine within the next few years whether carbon nanotube-based transistors can achieve higher performance than silicon transistors. "If we can prove that they can outperform any future silicon transistor, then IBM will bring the forces to bear to engineer a technology for mass producing them in complex circuits — hopefully a chemical synthetic techniques that grows just the kind of nanotubes that we want, where we want them," said Theis.

Moore's Law is a piker when compared to Gumperson's, Murphy's, or even Boyle's Law and Charles's Law.

In short, Moore's Law ignores that some things can look like a straight-line slope for a long while, but end up as asymptotes eventually . . . once you get enough data points plotted.
...people don't realize what Moore's Law is about. Everyone thinks it's about processor speed when, in fact, it's about the number of transistors on the chip. The fact that speed has mirrored it pretty well is what brought about that fallacy, but Moore's Law still holds... every 18 months, the number of transistors on a chip has doubled.
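
That doubling compounds fast. A minimal sketch of the arithmetic, using the 18-month figure above and a made-up starting count:

```python
# Transistor count doubling every 18 months (the figure quoted above).
# The starting count is a made-up chip, for illustration only.
start_count = 42_000_000
months_per_doubling = 18

for year in (0, 3, 6, 9):
    doublings = year * 12 / months_per_doubling
    count = start_count * 2 ** doublings
    print(f"year {year}: ~{count:,.0f} transistors")
# Nine years is six doublings: a 64x increase.
```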

Anonymous

May 7th, 2003, 8:52 pm #7

It reminds me of parabolic functions. Not a straight line, and it reminds me of limits:

At some point, you won't be able to double 'infinity', since you will run into a variety of limits, be they quantum or purely physical, dimensional limitations.

Ozymandous

May 7th, 2003, 9:58 pm #8

Doc wrote: Any bets on whether or not I could hold out for the new nanotube computers that are bound to be coming our way?

Been there, done that, Doc. I used to be a "bleeding edge" hardware freak myself once upon a time, but now? Pfeh, give me a CPU that is affordable and will last me 3 years (virtually any P4 2.X PC should last that long and still be very good) and I am happy. I know you're buying a Mac, but the idea should remain the same: buy a machine that will last you for the next 3-4 years with only modest upgrades (if that), and save the money, because the PC you buy now for $2000 will be half that price in 6 months; no use wasting the money.

As for the cost of buying new hardware every 2-3 years, well, that's where donating the 'old' PC to a school, church, etc. comes in. As long as the PC isn't 5 years old, someone should be able to find a use for it, because spreadsheet or word processing applications rarely require the horsepower that new games do.

One more year and I can donate my P3 1 GHz machine that I bought for $600 to a good cause and get a base new PC that I will put all the "upgraded" parts into instead.
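
If the price really halves every 6 months, the decay is steep; a quick sketch of that claim, illustration only:

```python
# Price decay if a $2000 PC halves in street price every 6 months
# (the claim above). Illustration only, not market data.
price_now = 2000.0
halving_months = 6

for months in range(0, 25, 6):
    price = price_now * 0.5 ** (months / halving_months)
    print(f"month {months:2d}: ${price:,.0f}")
# month 0: $2,000 ... month 24: $125
```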

May 7th, 2003, 10:32 pm #9

Anonymous wrote:

It reminds me of parabolic functions. Not a straight line, and it reminds me of limits:

At some point, you won't be able to double 'infinity', since you will run into a variety of limits, be they quantum or purely physical, dimensional limitations.
Hi,

People get all wrapped around the axle on this one. Moore's "law" was just an observation of the situation in the early '60s, when the technology was new and only a few cycles had gone on. It isn't based on anything deeper than that. It's not a "law"; it's a rule of thumb.

What has made it fascinating is that it has held up for over 40 years and does not look to be slowing down yet. There have been all sorts of "limits" expected during that period. Limits on the purity of materials, the wavelengths of light sources, the methods of fabrication. And each limit was met and defeated.

Can Moore's law hold forever? Of course not. The game now is to predict how long it will continue to hold and what the limiting factor will be. The players have many ideas, but most agree that it has about three to six cycles left, or five to ten years. Me? I think that there's a bit more life left in it than that -- as Feynman said, "There's plenty of room at the bottom."
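
For anyone checking the cycle arithmetic, at roughly 18 months per cycle, three to six cycles does land near the five-to-ten-year figure:

```python
# Cycle arithmetic: at ~18 months per doubling (the period quoted
# elsewhere in the thread), 3-6 cycles lands near 5-10 years.
months_per_cycle = 18

for cycles in (3, 6):
    years = cycles * months_per_cycle / 12
    print(f"{cycles} cycles ~ {years:.1f} years")
# 3 cycles ~ 4.5 years, 6 cycles ~ 9.0 years
```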

--Pete

Nystul

May 7th, 2003, 11:52 pm #10

Ozymandous wrote:

Been there, done that, Doc. I used to be a "bleeding edge" hardware freak myself once upon a time, but now? Pfeh, give me a CPU that is affordable and will last me 3 years (virtually any P4 2.X PC should last that long and still be very good) and I am happy. I know you're buying a Mac, but the idea should remain the same: buy a machine that will last you for the next 3-4 years with only modest upgrades (if that), and save the money, because the PC you buy now for $2000 will be half that price in 6 months; no use wasting the money.

As for the cost of buying new hardware every 2-3 years, well, that's where donating the 'old' PC to a school, church, etc. comes in. As long as the PC isn't 5 years old, someone should be able to find a use for it, because spreadsheet or word processing applications rarely require the horsepower that new games do.

One more year and I can donate my P3 1 GHz machine that I bought for $600 to a good cause and get a base new PC that I will put all the "upgraded" parts into instead.
When I built my computer, I tried to look at where the performance jumps were and where the price jumps were, and get the one without the other. Since the whole game changes in 3 months' time, I try not to look back and say "What if..." But I am glad that I didn't spend an extra 300 dollars to get the best GeForce3 instead of the second best. And I am glad that I did not spend the extra 300 dollars to get the best Athlon or Pentium instead of one a few hundred MHz slower.

I would dare to say that nobody should get the top-of-the-line computer at any time unless they plan on buying a new computer every year (and in that case, you will always be outperforming the capabilities of your software anyway). Otherwise, it seems more practical to pay half as much for a system that is a notch below. Then you have the option of upgrading once in between computers, or buying a new computer twice as often.

That also used to be the turnoff for me about Macs. I don't know if it is still the case or not. Mac fans used to always rave to me about how much better a Mac was than a PC of the same speed, but neglected to mention that the Macs cost twice as much.