The coming dark age has begun
http://www.nordichardware.com/news/7...xclusive-.html
Quote:
Luckily the US is behind in all this, so when I return I will buy the most powerful system available and keep it safe. I think one day they will come for it, though. Anyway, better get used to losing your power and rights; you will be medieval peasants soon. You won't even eat meat, only the king gets meat. |
It's about time somebody stopped the endless upward march of GPU TDPs. Seriously. 200-watt+ GPUs have been around for a while now, and if you start looking at the big dual-GPU 'top end' cards you're looking at 300-watt+ TDPs.
The only thing that worries me about this is the artificial limit of 320 GB/sec and relating the energy consumption to bandwidth.
I'll be very interested in seeing more info about this as it comes out. |
The computer I'm using here doesn't even have a graphics card. And when it dies, rather than building another I might buy one of these:
http://www.aleutia.com/ 18W maximum power: that's my sort of computer! |
Actually, they have gotten it all wrong. If you read the articles they link to, you can only conclude that they have serious problems with reading and therefore shouldn't write articles at all.
1. The EU law they want to bring up is not about the general power consumption of video cards, but about idle/standby power. No one is limiting the power draw of these cards under load.
2. The 320 GB/sec is the limit above which this law no longer applies.
|
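For reference, the 320 GB/sec figure is a peak theoretical memory bandwidth. A minimal sketch of how you'd check a given card against that exemption threshold (the card figures below are illustrative, not taken from the draft):

```python
# Peak theoretical memory bandwidth:
#   bandwidth (GB/s) = effective memory clock (GHz) * bus width (bytes)
# The 320 GB/s threshold is the exemption limit quoted in the thread.

THRESHOLD_GB_S = 320

def peak_bandwidth_gb_s(effective_clock_ghz, bus_width_bits):
    """Peak theoretical memory bandwidth in GB/s."""
    return effective_clock_ghz * (bus_width_bits / 8)

# e.g. a hypothetical GDDR5 card, 4 GHz effective clock on a 256-bit bus:
bw = peak_bandwidth_gb_s(4.0, 256)
print(bw)                    # 128.0 -- well under the 320 GB/s threshold
print(bw < THRESHOLD_GB_S)   # True
```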
The paper is huge, so if you can post the page numbers I will read it.
|
Actually, what you are saying is: I read this article on the net. It was the only one with this content, despite its controversial nature. But I didn't bother to read the sources they gave, or at least the comments that are taking that article apart, and just repeated what they wrote under an attention-begging headline. Now that I have been made aware that this article is FUD, I still won't do my homework and just ask instead of reading.
But anyway, look at page 14 of this PDF, which is the actual draft of the law: http://www.eup-network.de/fileadmin/...ect-to-ISC.PDF |
Come on now, people ask me all the time for 'proof' and I do my best, and now when I ask for it, you still give me s*** for it. Let's be reasonable. You didn't even cite your source, which is plagiarism if we must get technical. I cite my sources.
Anyway, back to the topic. I see that the very next section, i.e. 1.2, says that 30 months after the regulation comes into force (1.2.2), the exemption in point 1.1.3 is no longer applicable. http://www.eup-network.de/fileadmin/...ect-to-ISC.PDF I will remind you that I am not a lawyer, so I don't claim to understand legal documents. I don't see why it would matter how much power they consume while idle versus under load. There is a minimum clock speed that they can run at, and that connects both: if you limit the idle power, you will limit the maximum power. |
Idle power and maximum power are not related through clock speed. CPUs and mobile GPUs shut down whole parts of the chip when they go idle to save power, and I don't see any reason why this shouldn't be possible with desktop video chips. Downclocking is simply not enough. Why should the 1000+ shader units in a high-end video card idle at a lower clock speed? Just shut them down; you won't need that many of them to accelerate your desktop anyway, and video decoding isn't done on the shaders either.
This law hasn't even been ratified yet, and as you already stated, the exemptions are no longer allowed 30 months after ratification, so the chip-makers have plenty of time to implement some power-saving. This may even be beneficial for Linux users, since it may lead at least AMD to release the documentation needed for proper power-saving in the radeon drivers. The sad thing about this article is that it is now cited as a source by other bloggers and even tech magazines, although it is pretty obvious that it is FUD once you read the sources. If a coming dark age has begun, it is the dark age of bad journalism. |
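The gating-versus-downclocking point above can be sketched with the usual first-order CMOS dynamic power model, P = C·V²·f. This is an illustrative toy model (the numbers are made up, not measurements): downclocking only reduces f (and perhaps V a little), while power-gating idle shader units reduces the switched capacitance C itself.

```python
# Toy model: CMOS dynamic power scales roughly as P = C * V^2 * f.
# All figures below are illustrative, in arbitrary units.

def dynamic_power(c_eff, voltage, freq):
    """Rough dynamic power: switched capacitance * voltage^2 * frequency."""
    return c_eff * voltage**2 * freq

full        = dynamic_power(c_eff=1.0, voltage=1.1, freq=1.0)  # all units at full clock
downclocked = dynamic_power(c_eff=1.0, voltage=1.0, freq=0.5)  # halve the clock, drop V a bit
gated       = dynamic_power(c_eff=0.2, voltage=1.0, freq=0.5)  # also gate 80% of the units

print(downclocked / full)  # downclocking alone still leaves a large fraction
print(gated / full)        # gating the idle shader array cuts far more
```

Under these made-up numbers, gating cuts idle power several times further than downclocking alone, which is why idle power and load power can be set almost independently.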
You still have not clarified what about it is FUD. What you said in post 6 is only true before the 30 months are up. As for shutting down shader units, that is easier said than done. Currently, the only way to do it is to downclock, or to shut down the card completely and use an integrated GPU, like Optimus does.
Either way, I'm not arguing about what should be done, but about what is being done. Why is it FUD? |
It is FUD because the author stated that the EU wants to ban video cards that exceed the bandwidth limit that is given as the threshold for exemptions. At least they have today made corrections to that article, admitting that they simply were wrong on this. So why is it FUD? Because it is a wrong statement, together with a nice title made to draw attention (although your title was even better for this).
So what does it say about the level of professionalism of this author that he had to admit he was wrong because his readers pointed it out, just because they had done the research he should have done in the first place? Quote:
It is just a matter of what is better for them in business terms: selling two video cores instead of one (AMD, and Nvidia with future Tegra products), selling a video chip and letting others bother with power-saving (Nvidia on the x86 mobile market), or being forced to actually reduce power consumption when there is no load on the GPU? It isn't hard to figure out that actually doing something about power consumption isn't the decision the shareholders want to see, because it is the least cost-efficient option. |
It doesn't really make sense to me now; why would they ban current graphics cards? Oh well, it doesn't matter, I will leave the EU before it comes to be. |
Oh me oh my!
|