Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680
Rep:
I think you're confusing Moore's Law with "House's Law": https://en.wikipedia.org/wiki/Moore's_law
However, I think you're right that performance may not be growing as quickly as it was, though whether this is because more complex processors (multi-core, ever-growing instruction sets, larger registers) are being produced, or why those processors are being produced, I do not know.
As to hard drives -- if you look at prices for SSDs (where the money is being spent) they're getting cheaper very quickly. I think spinning rust is slowly being edged out. Though, you're right, spinning rust does seem to be approaching the edge of what is possible without using horrible kludges (IMO) like shingling: https://en.wikipedia.org/wiki/Shingl...etic_recording
To me progress is just taking other forms than "doubling x every y" -- things are becoming more complex.
In terms of CPU power, a couple of things have affected those numbers over the last 4-5 years. One is simply the way we compute now, with so much processing handed off to the GPU these days. We also slammed into a wall of physics at around the 9-11nm scale, limiting just how much power can fit in how much space, at what cost, and with what thermal considerations. IBM's breakthrough to stable chips at not 9, not 8, but SEVEN 7!! nm will fuel the progression for a few years, especially since this was achieved through the most fundamental change: the materials composition. In this case, oddly enough, mixing some germanium in with the silicon did the trick.
Additionally, the next slowest component after us humans behind the keyboard is hard drive throughput, and really we have just begun to tweak SSDs. On top of all that, we can't forget what a mover and shaker mobile computing is, which, despite the "authorities'" verdict, should include tablets and smartphones. There are presently in excess of 10,000,000,000 handheld mobile device contracts, many with multiple devices on one subscription, and it would be difficult to overestimate the motivation that represents. So hang on to your hats, the ride ain't over yet.
It's worth noting that we passed the point where signal propagation speed (a substantial fraction of the speed of light) had to be considered in the timing between longer and shorter copper traces at the circuit board level, somewhere around the Pentium II.
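As a rough back-of-the-envelope sketch (my own illustration, not from anyone's post; the figure of roughly half the speed of light for propagation in a PCB trace is an assumption), the distance a signal covers in one clock cycle shrinks fast as clocks rise:

```python
# Back-of-the-envelope: how far a signal travels in one clock cycle.
# Assumption: signals in copper PCB traces propagate at roughly 0.5c.
C = 299_792_458          # speed of light in vacuum, m/s
TRACE_SPEED = 0.5 * C    # assumed propagation speed in a PCB trace, m/s

def distance_per_cycle(clock_hz):
    """Metres a signal travels during one clock period."""
    return TRACE_SPEED / clock_hz

# Pentium II-era clock (~300 MHz): about half a metre per cycle,
# so trace-length differences of centimetres already start to matter.
print(round(distance_per_cycle(300e6), 3))   # ~0.5 m
# A modern 3 GHz clock: about 5 cm per cycle.
print(round(distance_per_cycle(3e9), 3))     # ~0.05 m
```

At board scale, a few centimetres of extra trace is a meaningful fraction of those distances, which is why trace-length matching became a routine design concern.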
Do you really expect the power of computers to continue increasing at exponential rates indefinitely? Computers are no different than any other technology. Eventually limits are reached. That technology then continues to be used with few or no changes, until a different technology is discovered or becomes feasible to use. Imagine the horror. A computer bought today may not be obsolete in ten years!
Assuming this was directed at me, or for that matter at anyone posting so far: I don't see where you got that idea. I don't see that anyone said anything that could be construed to mean they naively thought this unending expansion could continue indefinitely, nor that it should or needs to. I suppose there are some fields in which having a computer with the power of, say, "Big Blue" in a wristwatch would be useful, but I don't know what they would be, as ideally the tool needs to match the job. The old cliche about "hunting fleas with elephant guns" comes to mind. Certainly one of the reasons that desktop PC sales have fallen so far is that people perceive theirs is "powerful enough".
Not only are limits reached in capability, limits are reached in application. The overwhelming majority of automobile drivers don't do, nor can do, anything requiring 500 horsepower or more, with the single possible exception of picking up young girls. The most common desktop usage -- surfing the web, watching some movies or clips, gathering email -- can easily get by with a 2GHz single-core CPU, a gig of RAM, and a $50 video card. Anything beyond that level of hardware for those applications is just luxury, and more so every day as "push", "thin client", or "cloud" computing comes of age.
Desktops no longer drive the hardware end, with the exception of gaming on Windows, but that won't prevent manufacturers from taking advantage of Moore's Law continuing for another 10-20 years, IMHO. Not to mention that along with increased power to do work, smaller size means less power consumed and less heat generated, so the application will most certainly be found in tablets and smartphones. I'm betting we will all live to see a 3GHz CPU with 32GB RAM in a smartphone.
I purposely used catch phrases that span roughly 20 years because this is the period in which the wide adoption of broadband took place, effecting this limitation on client-side hardware needs. I have an ancient Sony laptop sporting a P2 400MHz CPU, maxed out at 256MB RAM, running Slackware 13.37, that while somewhat painful to watch boot, once the system is loaded behaves quite serenely for mail, forums, and surfing. It really only lacks in the RAM and video departments to fulfill multimedia desires. Routinely I have gotten 10-15 years out of my PCs.
This leveling off is also to blame for so many (still stuck in Windows) still using XP or wishing it were going to be supported indefinitely. People look at expanding hardware requirements, calculate that they will need to buy hardware in addition to "upgrading" the OpSys, and wonder why they are being forced to leave a comfortable, familiar environment that does all they perceive they need to do. They say "No!" or upgrade only begrudgingly, which generally accounts for increasing complaints and why MS decided it was smart business (to pretend) to offer Win 10 for free.
The only exceptions I see to this are intense business applications like CAD/CAM, or entertainment vis-a-vis gaming. Even in gaming, if one buys smart, one can easily get 10+ years out of a given PC platform. I'd likely still be using a P2 PC today were it not for low levels of RAM support and, to a lesser degree, the paucity of 8x AGP video cards, since even given those hardware upgrade limitations I could play AAA game titles at passable rates up until 2008 on a 440BX chipset with a release date of 1998. It just took a few relatively inexpensive hardware mods/add-ons to do it. Among them were a Coppermine-capable slotket, an IDE accelerator card, and a modified BIOS for full-time 8x AGP operation. I gave that system to a relative and got an i845 chipset board and did similar upgrades to bring it up to gaming par long past its intended years.
If you think about it, we hadn't maxed out AGP at 8x when 12x was achievable, although the move to serial-based peripherals was a very smart move. Even 64-bit computing has nowhere near reached its potential, partly due to hardware (the advent of multi-core) and the difficulty of coding in 64-bit compared to 32-bit. I'm typing this on a dual-core 64-bit CPU with 8GB RAM, but running my main OpSys, 32-bit Slackware with a custom realtime, low-latency kernel for DAW work, and I can run current AAA title games at far above passable rates even with the graphics cranked up.
Some things mentioned, like a wristwatch-sized supercomputer, got me to thinking that what we have at the moment is computers which are, generally, more powerful than and as cheap as we need for the tasks we put them to, but (as home users at least) not powerful enough for many of the things we'd like them to do.
For example, I would love an actual robot pet -- I don't mean an Aibo but a real dog-like or cat-like thing able to learn.
Or, more practically, robots which could drive autonomously, find people trapped in rubble, or fly low-altitude taxis, or systems that could really model the globe's atmospheric conditions in real time -- these and many other things are the stuff of dreams, and nightmares, but we still seem as far off from many of them as we were 20 or 30 years ago. I am sure most, if not all, will come, but they're a long way off the mainstream that iPhones (as an example) are at the moment.
Assuming this was directed to me, or for that matter at anyone posting so far.
At the OP, because it seems clear to me that that person expects continuous exponential increases in computer power and storage space. That time is over. We have reached the practical limit. There will continue to be occasional small improvements, but the next leap will come with a new type of technology that makes the current computer obsolete. When and what? We need to wait and see.
Last edited by Randicus Draco Albus; 08-13-2015 at 05:46 PM.
Quote:
Originally Posted by Randicus Draco Albus
At the OP, because it seems clear to me that that person expects continuous exponential increases in computer power and storage space. That time is over. We have reached the practical limit. There will continue to be occasional small improvements, but the next leap will come with a new type of technology that makes the current computer obsolete. When and what? We need to wait and see.
That's not how I read the OP at all. In fact, I read the OP as saying that it seems Moore's Law has come to an end. That is pretty much agreeing with you.
I thought that was the whole point of this thread? To discuss whether Moore's Law (or House's Law etc.) was coming to an end for current processes or not?
You are probably correct.
In which case I look pretty foolish.
Aside from the transitory emotional experience, blushing is a good thing. It generally signals a surprise or wake-up call that denotes an improved future. I was rather shocked that I recently endured a minor stroke, and embarrassed to discover the degree of rationalization I allowed in order to avoid updating my diet and exercise. Geez, only 8 years ago I had a damned 6-pack and wore pants of the same waist size I wore in college, which somehow let me ignore the fact that in 8 years it went from 31 to 40. Everybody's guilty 'round here.
Quote:
Originally Posted by Randicus Draco Albus
You are probably correct.
In which case I look pretty foolish.
No worries, it's hard to pick up on the exact meaning of the written word. It seems many of us are thinking along similar lines here.
Let's hope the near future brings something exciting.
Moore's law was a statement by a scientist about the number of transistors per spatial region doubling every two years.
It was a good, and probably valid, observation, but it's not a scientific, proven law. And he wrote that in 1965. Same year Gold's Gym was founded. Same year I was founded. That's why I tend to remember stuff that touches my life that is related to my birth year.
So back to the OP:
What does 20-30% faster CPU, the speed of GPU/video cards, and hard drive capacity have to do with Gordon Moore's learned paper on transistors?
Note also that Moore made a statement both about the past and how it had proceeded, and then proceeded to project that this would continue for some finite duration, one decade, although he revised this statement, probably once that decade passed and transistor-based semiconductor development continued. So we are now five decades past his observation.
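The doubling projection can be sketched numerically (my own illustration, not from the thread; the 1965 baseline of roughly 64 components per chip and a strict two-year doubling period are simplifying assumptions):

```python
def projected_transistors(year, base_year=1965, base_count=64, doubling_years=2):
    """Transistor count projected by doubling every `doubling_years` years.

    base_count=64 is an assumed rough figure for a 1965 chip.
    """
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

# One decade out, as in the original finite-duration projection:
print(int(projected_transistors(1975)))   # 64 * 2**5 = 2048
# Five decades out, roughly the date of this thread:
print(int(projected_transistors(2015)))   # 64 * 2**25 ≈ 2.1 billion
```

That five-decade extrapolation lands in the low billions of transistors, which is the right order of magnitude for 2015-era CPUs and shows why the projection held up far longer than its stated decade.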