LinuxQuestions.org
Old 10-16-2012, 10:09 AM   #1
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Rep: Reputation: 1301
The coming dark age has begun


http://www.nordichardware.com/news/7...xclusive-.html

Quote:
NordicHardware has seen exclusive information about a new energy law that will apply within the EU. The law requires that both discrete and integrated graphics cards live up to certain energy standards. AMD is worried that this will affect next generation graphics cards and have them barred from sales in the EU.

There are standardizations that make sure pre-built computers, but also discrete components, achieve a certain level of energy efficiency. Exactly how much depends on a row of criteria. These standards also include simple things, such as that after a certain amount of time the computer will enter sleep mode. The idea behind this is to have as energy efficient computers as possible to reduce the overall consumption of energy. The specification for the so called Eco design Lot 3 with the EC can be found here, where there are hundreds of pages to read for those with lots of time to spare.
Soon, they will regulate all other computer components and remove all the power. It will fall in line with all the other propaganda. You won't need a powerful computer to access the cloud, just a terminal. You will all be using locked-in mobile devices anyway. You won't care about it, right?

Luckily the US is behind in all this, so when I return I will buy the most powerful system available and keep it safe. I think one day they will come for it, though.

Anyway, better get used to losing your power and rights; you will be medieval peasants soon. You won't even eat meat, only the king gets meat.
 
Old 10-16-2012, 10:34 AM   #2
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,831
Blog Entries: 15

Rep: Reputation: 1669
Quote:
You won't even eat meat, only the king gets meat.
What, you don't LIKE rat meat?
 
Old 10-16-2012, 10:39 AM   #3
cascade9
Senior Member
 
Registered: Mar 2011
Location: Brisneyland
Distribution: Debian, aptosid
Posts: 3,753

Rep: Reputation: 935
It's about time somebody stopped the endless upward march of GPU TDPs. Seriously. 200 W+ GPUs have been around for a while now, and if you start looking at the big dual-GPU 'top end' cards you're looking at 300 W+ TDPs.

The only thing that worries me about this is the artificial limit of 320 GB/s and the tying of energy consumption to bandwidth.

Quote:
The commission wants to stop dedicated graphics cards of group G7 from going above 320 GB/s

snip!

Besides that the energy efficiency requirements will be tighter - in this case the energy consumption of the card in relation to its memory bandwidth.
How they will stop next gen (or at worst, the gen after that) cards from going over 320 GB/s is beyond me, and I can't see any point in limiting bandwidth.
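For anyone who wants to sanity-check that number: frame buffer bandwidth is just the effective memory data rate times the bus width. A quick Python sketch (hypothetical cards and my own example numbers, nothing from the draft):

Code:
# Peak frame buffer bandwidth: effective data rate (Gbit/s per pin)
# times bus width (pins), divided by 8 bits per byte.
def fb_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

# Hypothetical GDDR5 card: 6 Gbps effective rate on a 384-bit bus
print(fb_bandwidth_gbs(6.0, 384))  # 288.0 GB/s, under the 320 GB/s line
# Faster memory on the same bus
print(fb_bandwidth_gbs(7.0, 384))  # 336.0 GB/s, over it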

I'll be very interested in seeing more info about this as it comes out.
 
Old 10-16-2012, 11:14 AM   #4
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928

Original Poster
Blog Entries: 2

Rep: Reputation: 1301
Quote:
Originally Posted by cascade9 View Post
How they will stop next gen (or at worst, the gen after that) cards from going over 320 GB/s is beyond me, and I can't see any point in limiting bandwidth.

I'll be very interested in seeing more info about this as it comes out.
There is a point to limiting bandwidth: they want to limit how powerful your computer is. They will likely even reduce the limit to something mediocre. It's a complete regression in the computing industry ... as well as in everyday life, a dark age. I don't know how bad it will be, but I won't be sticking around to find out when things get tough. All aspects of the world will change, not for the better.

Quote:
Originally Posted by MensaWater View Post
What, you don't LIKE rat meat?
You'll be lucky to get rat meat.
 
Old 10-16-2012, 11:42 AM   #5
DavidMcCann
LQ Veteran
 
Registered: Jul 2006
Location: London
Distribution: PCLinuxOS, Debian
Posts: 6,140

Rep: Reputation: 2314
The computer I'm using here doesn't even have a graphics card. And when it dies, rather than building another I might buy one of these:
http://www.aleutia.com/
18W maximum power: that's my sort of computer!
 
Old 10-16-2012, 12:50 PM   #6
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
Actually, they have gotten it all wrong. If you read the articles they link to, you can only come to the conclusion that they have serious problems with reading and therefore shouldn't write articles at all.

1. The EU law they want to bring up is not about the general power consumption of video cards, but about idle/standby power. No one is limiting the power draw of these cards under load.
2. The 320 GB/s figure is the threshold above which this law no longer applies:
Quote:
Category D desktop computers and integrated desktop computers meeting all of the following technical parameters are exempt from the requirements specified in points 1.1.1 and 1.1.2:
(a) a minimum of six physical cores in the central processing unit (CPU); and
(b) discrete GPU(s) providing total frame buffer bandwidths above 320 GB/s; and
(c) a minimum 16GB of system memory; and
(d) a PSU with a rated output power of at least 1000 W.
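Note the "and" after every point: a machine is exempt only if it meets all four criteria at once. A minimal Python sketch of that test as I read the draft (my own function name and example values, obviously not part of the law):

Code:
# Exemption test for Category D desktops, per the draft quoted above:
# all four conditions must hold simultaneously.
def category_d_exempt(cpu_cores, gpu_bandwidth_gbs, ram_gb, psu_watts):
    return (cpu_cores >= 6
            and gpu_bandwidth_gbs > 320
            and ram_gb >= 16
            and psu_watts >= 1000)

print(category_d_exempt(6, 336, 16, 1000))  # True: a high-end rig is exempt
print(category_d_exempt(4, 336, 8, 650))    # False: an ordinary desktop is not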
Next time, before repeating such FUD articles here, I would recommend checking the sources first.
 
Old 10-17-2012, 06:26 AM   #7
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928

Original Poster
Blog Entries: 2

Rep: Reputation: 1301
The paper is huge, so if you can post the page numbers I will read it.
 
Old 10-17-2012, 06:54 AM   #8
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
Actually, what you are saying is: I read this article on the net. It was the only one with this content, despite its controversial nature. But I didn't bother to read the sources they gave, or at least the comments that take that article apart; I just repeated what they wrote, under an attention-begging headline. Now that I have been made aware that this article is FUD, I still won't do my homework; I will just ask instead of reading.

But anyways, look at page 14 of this PDF, which is the actual draft of the law: http://www.eup-network.de/fileadmin/...ect-to-ISC.PDF
 
Old 10-17-2012, 07:28 AM   #9
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928

Original Poster
Blog Entries: 2

Rep: Reputation: 1301
Come on now, people ask me all the time for 'proof' and I do my best. And now when I ask for it, you still give me s*** for it. Let's be reasonable now. You didn't even cite your source ... which is plagiarism if we must get technical. I cite my sources.

Anyway, back to the topic. I see that in the very next section, i.e. 1.2, it says that 30 months after the regulation comes into force (point 1.2.2) the exemption in point 1.1.3 is no longer applicable.
http://www.eup-network.de/fileadmin/...ect-to-ISC.PDF


I will remind you that I am not a lawyer, so I don't claim to understand legal documents.

I don't see why it would matter how much power they consume while idle versus under load. There is a minimum clock speed that they can run at, and that connects the two: if you limit the idle power, you will limit the maximum power.
 
Old 10-17-2012, 09:59 AM   #10
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
Idle power and maximum power are not related in terms of clock speed. CPUs and mobile GPUs shut down whole parts of the chip when they go idle to save power, and I don't see any reason why this shouldn't be possible with desktop video chips. Downclocking is simply not enough. Why should the 1000+ shader units in a high-end video card idle at a lower clock speed? Just shut them down: you won't need that many of them to accelerate your desktop anyway, and video decoding isn't done on the shaders either.
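To make that concrete, here is a toy model in Python. The coefficients are made up for illustration (nothing from any datasheet), but the shape is right: dynamic power scales roughly with f*V^2, while leakage stays as long as a unit is powered, so downclocking alone can never reach the idle power that gating the units off gets you.

Code:
# Toy power model: dynamic power ~ f * V^2 per active unit,
# plus per-unit leakage that only disappears when a unit is gated off.
def shader_power_w(units_on, freq_ghz, volts,
                   dyn_coeff=0.05, leak_per_unit=0.02):
    dynamic = dyn_coeff * units_on * freq_ghz * volts ** 2
    leakage = leak_per_unit * units_on
    return dynamic + leakage

print(shader_power_w(1024, 1.0, 1.1))  # ~82 W: all units at full clock
print(shader_power_w(1024, 0.2, 0.9))  # ~29 W: downclocked, all units still leaking
print(shader_power_w(64, 0.2, 0.9))    # ~1.8 W: 960 units power-gated off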
This law is not even ratified yet, and as you already stated, the exemptions are no longer allowed 30 months after it comes into force. So the chip-makers have plenty of time to implement some power-saving.
This may even be beneficial for Linux users, since it may lead at least AMD to release the documentation needed for proper power-saving in the radeon drivers.

The sad thing about this article is that it is now cited as a source by other bloggers and even tech magazines, although it is pretty obviously FUD once you read the sources.
If a coming dark age has begun, it is the dark age of bad journalism.

Last edited by TobiSGD; 10-17-2012 at 10:01 AM.
 
Old 10-17-2012, 10:13 AM   #11
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928

Original Poster
Blog Entries: 2

Rep: Reputation: 1301
You still have not clarified what about it is FUD. What you said in post 6 is only true for the first 30 months. As for shutting down shader units ... that is easier said than done. Currently, the only way to do it is to downclock, or to shut down the card completely and use an integrated GPU, like Optimus does.

Either way, I'm not arguing about what should be done, but about what is being done. Why is it FUD?
 
Old 10-17-2012, 11:07 AM   #12
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
It is FUD because the author stated that the EU wants to ban video cards that exceed the bandwidth figure, when that figure is actually given as a threshold for exemptions. At least today they have made corrections to the article, admitting that they were simply wrong on this. So why is it FUD? Because it is a wrong statement, paired with a nice title made to draw attention (although your title was even better for this).
So what does it say about the author's level of professionalism that he had to admit he was wrong because his readers pointed it out, after they did the research he should have done in the first place?

Quote:
As for shutting down shader units ... that is easier said than done. Currently, the only way to do it is to downclock or to shutdown the card completely and use an integrated card like Optimus does.
Currently it is the easiest way, not the only possible way. AMD's APUs are already able to shut down unused units:
Quote:
The AMD A6-3410MX employs a number of low-level enhancements, that significantly reduce processor's power consumption. For instance, power gating enhancements allows the APU to save power by switching off idle graphics processing unit, UVD, and/or one or more CPU cores.
http://www.cpu-world.com/CPUs/K10/AM...A6-3410MX.html
It is just a matter of what is better for them (in business terms): selling two video cores instead of one (AMD, and Nvidia with future Tegra products), selling a video chip and letting others bother with power-saving (Nvidia on the x86 mobile market), or being forced to actually reduce the power consumption when there is no load on the GPU.
It isn't hard to figure out that actually doing something about power consumption isn't the decision the shareholders want to see, because it is the least cost-efficient option.

Last edited by TobiSGD; 10-17-2012 at 11:08 AM.
 
Old 10-17-2012, 11:31 AM   #13
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928

Original Poster
Blog Entries: 2

Rep: Reputation: 1301
Quote:
Originally Posted by TobiSGD View Post
It is FUD because the author stated that the EU want to ban video-cards that exceed that bandwith limit that is given as limit for exemptions. At least they have made today corrections to that article, admitting that they simply were wrong on this. So why is it FUD? Because it is a wrong statement, together with a nice title that is made to draw attention (although your title was even better for this).
So what does it say about the level of professionalism of this author when he has to admit that he was wrong, because it was pointed out to him by his readers, just because they have done the research he should have done in the first place.
Ok, so it was FUD, but they fixed it.

It doesn't really make sense to me now: why would they ban current graphics cards? Oh well, it doesn't matter; I will leave the EU before it comes to be.
 
Old 10-17-2012, 12:04 PM   #14
TobiSGD
Moderator
 
Registered: Dec 2009
Location: Germany
Distribution: Whatever fits the task best
Posts: 17,148
Blog Entries: 2

Rep: Reputation: 4886
Quote:
Originally Posted by H_TeXMeX_H View Post
It doesn't really make sense to me now: why would they ban current graphics cards?
They don't. This law is expected to be ratified in late 2013, more likely 2014. That is more than a year for the chip-makers to come up with something, and more than a year for them to talk to the people designing this law and tell them that their demands are currently impossible to fulfill and would seriously damage the economy in that sector.
 
Old 10-17-2012, 01:58 PM   #15
k3lt01
Senior Member
 
Registered: Feb 2011
Location: Australia
Distribution: Debian Wheezy, Jessie, Sid/Experimental, playing with LFS.
Posts: 2,900

Rep: Reputation: 637
Oh me oh my!

Quote:
Originally Posted by H_TeXMeX_H View Post
they will come for it, though.
FUD or paranoia.
Quote:
Originally Posted by H_TeXMeX_H View Post
Anyway, better get used to losing your power and rights,
More FUD.
Quote:
Originally Posted by H_TeXMeX_H View Post
you will be medieval peasants soon.
More FUD.
Quote:
Originally Posted by H_TeXMeX_H View Post
You won't even eat meat,
So what?
Quote:
Originally Posted by H_TeXMeX_H View Post
only the king gets meat.
and Henry VIII had an incredibly painful condition because of it.
Quote:
Originally Posted by H_TeXMeX_H View Post
as well as in everyday life, a dark age.
Take a sugar pill and get some sleep.
Quote:
Originally Posted by H_TeXMeX_H View Post
I don't know how bad it will be, but I won't be sticking around to find out when things get tough.
What ya gonna do? Call Ghostbusters?
Quote:
Originally Posted by H_TeXMeX_H View Post
All aspects of the world will change, not for the better.
You are really on a roll, aren't you!
Quote:
Originally Posted by H_TeXMeX_H View Post
You'll be lucky to get rat meat.
More FUD.
Quote:
Originally Posted by H_TeXMeX_H View Post
Come on now, people ask me all the time for 'proof' and I do my best. And now when I ask for it, you still give me s*** for it. Let's be reasonable now. You didn't even cite your source ... which is plagiarism if we must get technical. I cite my sources.
This is so funny. You actually tell people to do their own research.
Quote:
Originally Posted by H_TeXMeX_H View Post
I will remind you that I am not a lawyer, so I don't claim to understand legal documents.
So instead of getting all hot and bothered, maybe you should ask what this all means instead of telling us we are in for a "dark age".
Quote:
Originally Posted by H_TeXMeX_H View Post
You still have not clarified what about it is FUD.
Tobi may not have, but I did! Read what is above this little line.
Quote:
Originally Posted by H_TeXMeX_H View Post
Either way, I'm not arguing about what should be done, but about what is being done. Why is it FUD?
It is FUD because you do not use reasonable language; instead you resort to doom and gloom (fear, uncertainty, and doubt), and that is what makes it FUD.

Last edited by k3lt01; 10-17-2012 at 01:59 PM.
 
  

