Linux - Hardware: This forum is for Hardware issues. Having trouble installing a piece of hardware? Want to know if that peripheral is compatible with Linux?
05-07-2021, 06:12 AM | #1 | LQ Guru | Registered: Jan 2006 | Location: Ireland | Distribution: Slackware, Slarm64 & Android | Posts: 17,613
IBM's 2nm wafer - thoughts?
https://www.bbc.com/news/technology-57009930
Apparently IBM have made a 2nm wafer (of ICs) in their lab. That is not 7nm, not 5nm, but 2nm.
The breakthrough is remarkable, if it's real. The problem is that a FET consists of an 'N' or 'P' doped conductive channel with (at 7nm) gate material of the opposite polarity all around the channel. So your N-channel FET would have a P-doped gate, and vice versa for the P-channel devices. There must be complete insulation between gate and channel.
At the point where N meets P, the insulation was insufficient, electrons would get through, and all hell would break loose. That problem blocked progress; everyone was stuck on this issue. If IBM can get this working at industrial scale, they have leapfrogged generations of devices, and it is really huge news.
They are quoting a 45% performance increase and 75% less power consumption in comparison to today's devices. In fact, if manufacturing at 2nm were mastered, every chip on the market today would suck, and everything not on 5/7nm would become obsolete. Watch for politics getting in here, as this is a US breakthrough. It offers the US the chance to lock out Russia & China, until they hack their way in.
Any reflections?
05-07-2021, 10:06 AM | #2 | LQ Guru | Registered: Apr 2010 | Location: Continental USA | Distribution: Debian, Ubuntu, RedHat, DSL, Puppy, CentOS, Knoppix, Mint-DE, Sparky, VSIDO, tinycore, Q4OS, Manjaro | Posts: 6,197
Quote:
Originally Posted by business_kid
Apparently IBM have made a 2nm wafer (of ICs) in their lab. That is not 7nm, not 5nm, but 2nm.
Phones will get a LOT faster, batteries will get a LOT smaller (because no vendor really wants them to power our devices for much longer; otherwise we would have 1 lb cell phones that last a week on a charge by now), and tech will get a lot greener. I expect it to start around the end of 2024, and for me to be able to afford it by 2032. (Assuming, of course, that I still LIVE in 2032!)
05-07-2021, 10:21 AM | #3 | Senior Member | Registered: Jun 2015 | Location: Tucson, AZ USA | Distribution: LMDE 6 | Posts: 1,244
Quote:
Watch for politics getting in here, as this is a US breakthrough.
This part concerns me. Not the politics part, just that it is a US breakthrough. We have been behind on semiconductor development for a while now; even US companies do all their work overseas. All of a sudden, just when US influence is getting shaky, we have this massive jump that puts us on top? That alone gives me pause about whether it's real. Sadly, every world power uses propaganda that usually has no real substance; ours is just not as obvious as others' sometimes.
05-07-2021, 12:35 PM | #4 | LQ Guru | Registered: Sep 2011 | Location: Upper Hale, Surrey/Hants Border, UK | Distribution: One main distro, & some smaller ones casually. | Posts: 5,891
If true, it might make 'wearables' more of a reality, but do I want them? I don't think so.
A nice small quiet computer that can do all my daily tasks is all I need - that's why I've been using Raspberry Pi 4 series SBCs lately.
(Along with my silent 4" square desktop box.) 
05-07-2021, 05:25 PM | #5 | LQ Guru | Registered: Apr 2010 | Location: Continental USA | Distribution: Debian, Ubuntu, RedHat, DSL, Puppy, CentOS, Knoppix, Mint-DE, Sparky, VSIDO, tinycore, Q4OS, Manjaro | Posts: 6,197
IBM does research and business; it doesn't manufacture chips. They will license this to one or more chip makers, which may or may not all be in the USA.
Other researchers working on the problem may come out with alternate ways to make chips at that scale within a year or two.
If they can figure out how to steal the tech, China will come out with their version soon after.
Humans have a proven history of suckage when it comes to keeping djinn in bottles!
05-08-2021, 05:36 AM | #6 | LQ Guru | Registered: Jan 2006 | Location: Ireland | Distribution: Slackware, Slarm64 & Android | Posts: 17,613 | Original Poster
IBM over-engineers everything. It reminds me of the joke:
Quote:
Elephant, n: A mouse built to Government specifications.
IBM is in the elephant-building league. Personally, I find it difficult to imagine them bringing a wafer-fab machine to market.
@fatmac: RazPis are fine until you want the full range of software. Anything computationally or graphics-intensive isn't really there on Arm yet, and 64-bit has less software than 32-bit. There's an (Arm-based) Qualcomm Snapdragon 865 out with fairly decent specs, but the average CPU runs at ~2GHz, which makes for lousy single-threaded performance. I believe the RazPi can be overclocked; I have read you can go to 2.2GHz without changes, which I intend to try, although I was actually going to stop at 2.0GHz. Something like FreeCAD would crawl.
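For anyone who wants to try the same thing, the overclock on a Pi 4 is just a couple of lines in /boot/config.txt. The values below are the commonly reported starting point rather than anything I've verified on my own board, so treat them as a sketch and back off if the board throttles or won't boot:
Code:
# /boot/config.txt (Raspberry Pi 4) - commonly reported overclock; stable values vary per board
over_voltage=6      # raise the core voltage a little so higher clocks stay stable
arm_freq=2000       # CPU clock in MHz (stock is 1500 on most Pi 4 boards)
#arm_freq=2200      # some boards reportedly manage this, many don't
After a reboot, 'vcgencmd measure_clock arm' shows the actual running frequency and 'vcgencmd measure_temp' tells you whether it is thermal-throttling.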
05-09-2021, 05:16 AM | #7 | Member | Registered: Jun 2020 | Posts: 614
According to the AnandTech coverage of this, IBM isn't building a new fab machine; they're using equipment from ASML and perfecting manufacturing techniques. The actual gate sizes on this are probably more like 12nm, with some individual feature maybe hitting 2nm here or there. Basically, no FinFET process is 'really' its stated pitch (7nm is not all 7nm transistors, and so on), because once they moved to 3D land it all became 'equivalent node', a shorthand for density.
You can read more here:
https://www.anandtech.com/show/16656...first-2nm-chip (has a table breaking it all out by transistor density vs marketing nanometers)
This is also apparently the result of many years of work, not an 'all of a sudden' thing, and while IBM is the 'face' of it, it is apparently the result of considerable collaboration between Intel, IBM, GloFo, ASML, and so forth. I agree with wpeckham as well: it is probably many years before this actually sees the light of day in a real product, and it will likely go through various other revisions as it is scaled to production. It'll be interesting to see who gets there first with a production chip, and whether GloFo will be manufacturing on this node (if I remember right they threw in the towel on 10nm and related nodes some time ago, but with demand going through the roof recently maybe they'll change course).
05-09-2021, 09:00 AM | #8 | LQ Guru | Registered: Jan 2006 | Location: Ireland | Distribution: Slackware, Slarm64 & Android | Posts: 17,613 | Original Poster
Yeah, I know a 2nm chip doesn't have 2nm fab all the way through. Power components need big, wide chunks of silicon. There the emphasis is on getting a channel from open to closed in the shortest time possible. The possibility exists that you have a 50nm transistor channel handling, say, 10 Amps. Then, as you start turning off, the conducting area shrinks: the outer core goes off, and the inner core (capable of handling only 3A) is tasked with handling 10A, and it blows. It was a regular problem with power FETs.
Regardless of that, the critical problem holding up the best brains in the business was outlined in post #1. With big fab in times past, PN junctions used to look, atomically, like
++++NNNN----
where N marks doped silicon whose doping has been cancelled by electrons or ions from the silicon of the opposite polarity. Chemical mixes and doping have been optimised, and we now have
++NN--
but when we try to move to +N-, electrons will shoot through because the insulation is insufficient. Unless, of course, it's all smoke and mirrors. From my understanding, both the FinFET transistor design and the chemical optimisation of doping had reached their limits. So I can understand perhaps 4nm, but 2nm is way off scale and would require a world-class breakthrough.
Now the only places you need 2nm are CPU & GPU cores, where you're down to 0.8V-1.2V, but you need things to work there. The 'N' area in the middle grows with voltage: the bigger the reverse bias, the wider the depleted region.
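For what it's worth, in the textbook picture of a one-sided abrupt junction (only a rough stand-in for a real FinFET channel), the depleted width grows with the square root of the total voltage across the junction:
W_d = \sqrt{ \frac{2 \, \varepsilon_s \, (V_{bi} + V_R)}{q \, N} }
where ε_s is the permittivity of silicon, V_bi the built-in potential, V_R the applied reverse bias, q the electron charge and N the doping on the lightly doped side. The trend is the point: more reverse bias means more depleted silicon, and at 2nm there is precious little silicon left to deplete.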
05-12-2021, 10:12 AM | #9 | Member | Registered: Nov 2005 | Location: Land of Linux :: Finland | Distribution: Pop!_OS && Windows 10 && Arch Linux | Posts: 832
I am always excited about progress related to computers. I don't understand much, but I am a happy customer.
05-12-2021, 01:32 PM | #10 | LQ Guru | Registered: Jan 2006 | Location: Ireland | Distribution: Slackware, Slarm64 & Android | Posts: 17,613 | Original Poster
Well, there will come a point very soon when further miniaturization becomes prohibitively expensive and consequently impossible, or simply bad for business. IBM's "breakthrough" looks like it's too far, too fast. What I need to see is an in-depth technical article or paper explaining how the various challenges were overcome. I had predicted:
- 4nm, if it turns out to be possible.
- 3nm or a fraction, perhaps 3.3nm, if it turns out to be possible.
- 2nm or some fractional decrease, if it turns out to be possible.
- 1nm or some fractional decrease, if it turns out to be possible.
To see them leapfrogging 4nm & 3nm makes one suspect a 'smoke & mirrors' exercise.
The percentage decreases are interesting, and the size aimed at is always a gamble. Decreases are usually 20%-30%: 7nm -> 5nm = 29%; 5nm -> 4nm = 20%; 4nm -> 3nm = 25%; 3nm -> 2nm = 33%; 2nm -> 1nm = 50%! I expect it to go to fractions soon. I'm not thinking below 1nm, as I don't feel they will get that far.
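Those figures are just the linear step from one marketing number to the next, taking the node names at face value; a few throwaway lines of Python reproduce them:
Code:
# linear shrink between successive "marketing" node names, taken at face value
nodes = [7, 5, 4, 3, 2, 1]  # nm
for a, b in zip(nodes, nodes[1:]):
    print(f"{a}nm -> {b}nm: {100 * (a - b) / a:.0f}% smaller")
And since area goes with the square of the linear dimension, the nominal density gain at each step is bigger again.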
It's also debatable how much extra speed will come out of whatever transistor shape they find themselves saddled with, as the FinFET will probably have to change to a different shape with different properties. So I was not impressed to see the IBM propaganda talking FinFET @ 2nm.
Make no mistake: if you had the world's only fab with a couple of machines running at 2nm, 24/7/365, you would have the world's biggest profit machine.
EDIT: NOTE TO SELF: You must be boring people silly. Get off your hobby horse and make shorter posts
05-13-2021, 12:53 PM | #11 | Member | Registered: Nov 2005 | Location: Land of Linux :: Finland | Distribution: Pop!_OS && Windows 10 && Arch Linux | Posts: 832
Quote:
Originally Posted by business_kid
EDIT: NOTE TO SELF: You must be boring people silly. Get off your hobby horse and make shorter posts
Thanks for the explanation, and no, your posts are not too long.
05-13-2021, 01:10 PM | #12 | Moderator | Registered: Feb 2003 | Location: Arizona, USA | Distribution: Debian, EndeavourOS, OpenSUSE, KDE Neon | Posts: 4,030
Quote:
Originally Posted by business_kid
To see them leapfrogging 4nm & 3nm makes one suspect a 'smoke & mirrors' exercise.
Since IBM doesn't actually produce stuff, they really only do the "big" jumps to demonstrate technical feasibility and help their partners with production ramp-up. They would never have done a 4nm, as that's a "refinement" of the 5nm process for most fabs (TSMC's N4, which I BELIEVE is actually scheduled to go into limited production late this year). As discussed earlier in the thread, due to the 3D nature of modern transistor design for CPUs/GPUs, this COULD have been called 3nm. However, they get more news by calling it 2nm, without patently lying. Marketing.
The biggest thing from this, IMO, is that Intel is now an IBM partner (as is Samsung, if I recall correctly), so this could help Intel get back on course with their fabs in the long term.
05-14-2021, 06:56 AM | #14 | LQ Guru | Registered: Jan 2006 | Location: Ireland | Distribution: Slackware, Slarm64 & Android | Posts: 17,613 | Original Poster
You're very likely to see funny publicity about this stuff. We had this with the 32/64-bit changeover, and guys trying to convince each other that a Celeron was every bit as good.
FinFET @ 2nm is impossible, to the best of my knowledge. That's why you have to do a reality check.
EDIT: Marketing ploys don't actually go far. A lithography (or wafer fab) reduction must be accompanied by a performance increase, i.e. higher frequency at the same current. And for the size reduction to be real, the same device made in a smaller size must consume less current.
The relevant formula is W(atts) = ½CV²F. With a size reduction you reduce the C(apacitance), so you can either increase the F(requency) or reduce W, the power consumption. What you're suggesting doesn't really change the lithography.
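To put rough numbers on that trade-off, here is the same W = ½CV²F relation in a few lines of Python; the capacitance, voltage and frequency figures are purely illustrative, not anyone's real process numbers:
Code:
# Dynamic (switching) power: W = 1/2 * C * V^2 * F
def switching_power(c_farads, v_volts, f_hertz):
    return 0.5 * c_farads * v_volts**2 * f_hertz

baseline  = switching_power(1.0e-9, 1.0, 3.0e9)  # made-up reference device
half_c    = switching_power(0.5e-9, 1.0, 3.0e9)  # shrink halves C: half the power at the same clock
half_c_2x = switching_power(0.5e-9, 1.0, 6.0e9)  # or double the clock for the same power
print(baseline, half_c, half_c_2x)               # 1.5 0.75 1.5 (watts)
Which is exactly the choice a real shrink gives you: the same speed at lower power, or more speed at the same power, or some mix of the two.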
What has me wondering about this is the mechanical elements of tiny fab. Take ball bearings: they theoretically don't wear out because the ball has a film of oil or grease, so the balls never touch the inner or outer tracks. But that film is more than 2nm thick…
05-14-2021, 09:25 AM | #15 | Senior Member | Registered: Dec 2010 | Location: California, USA | Distribution: I run my own OS | Posts: 1,061
IBM likely has made a real advancement, but they are not saying much about it. Instead, we get the "2nm" marketing term. The nanometer wars are making the gigahertz wars seem honest by comparison.
IBM sold its microelectronics division to Global Foundries years ago. That IBM is still doing silicon process development is somewhat unexpected.
Ed