LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   Use GPU as CPU, big data mining. (https://www.linuxquestions.org/questions/linux-newbie-8/use-gpu-as-cpu-big-data-minning-4175656848/)

samir.18 07-04-2019 05:11 AM

Use GPU as CPU, big data mining.
 
I have read in some places that you can configure Linux to use the graphics card to process data as if it were a processor.

My goal with this post is to find out whether that is more or less feasible for someone at my level. I am not a programmer, but I am what you might call a "stuck amateur": I have modified some open source programs and made small additions, and I also use Excel a lot to build complex spreadsheets over large amounts of data. I work with computers all day, but there are many things about programming that I do not know, and I often do not even know why my fixes work: I manage to solve problems by changing things without understanding why that solves them. I do not know some basic things, like how to define an array; I only know what arrays are used for, but if you asked me to explain what one is, I would have no idea.

Going back to the main thing:

I use Windows 10 64-bit.

And I'm pretty short on computing power: I have an Intel Core i7-8700, and sometimes it runs at 100% usage for weeks calculating statistical data. I also have a video card that I do not use; it is mid-range (modern and powerful for graphics).
So I want to experiment a bit to see whether the calculation capacity can be increased.

Is it something easy for someone at my level to do? Is there any tutorial or documentation that would give me a rough idea of the subject? Or projects that have already been built or configured this way?

I would be very grateful if you could pass along links to some projects (if any exist).
It is worth mentioning that I have never really used Linux: once, as a child, I installed it on a computer to see what it was like, but nothing more. I do not have the slightest idea about Linux.

Thank you very much and greetings.

BW-userx 07-04-2019 08:01 AM

That is a good question. Wikipedia says:
Code:

Any language that allows the code running on the CPU to poll a GPU shader for return values,
can create a GPGPU framework.

Source: Wikipedia.

2. https://www.lifewire.com/graphics-ca...raphics-834089

It looks to be just like apps taking advantage of multiple CPUs: the application needs to be coded to do so.

dc.901 07-04-2019 09:12 AM

+1 BW-userx; if your application does not support using a GPU, then there is not much you can do (except get a machine with multiple CPUs for more CPU threads).
What type of GPU?

TB0ne 07-04-2019 10:27 AM

Quote:

Originally Posted by samir.18 (Post 6011817)
I have read in some places that you can configure Linux to use the graphics card to process data as if it were a processor. My goal with this post is to find out whether that is more or less feasible for someone at my level. I am not a programmer, but I am what you might call a "stuck amateur": I have modified some open source programs and made small additions, and I also use Excel a lot to build complex spreadsheets over large amounts of data. I work with computers all day, but there are many things about programming that I do not know, and I often do not even know why my fixes work: I manage to solve problems by changing things without understanding why that solves them. I do not know some basic things, like how to define an array; I only know what arrays are used for, but if you asked me to explain what one is, I would have no idea.

Going back to the main thing: I use Windows 10 64-bit.

And I'm pretty short on computing power: I have an Intel Core i7-8700, and sometimes it runs at 100% usage for weeks calculating statistical data. I also have a video card that I do not use; it is mid-range (modern and powerful for graphics). So I want to experiment a bit to see whether the calculation capacity can be increased.

Is it something easy for someone at my level to do? Is there any tutorial or documentation that would give me a rough idea of the subject? Or projects that have already been built or configured this way? I would be very grateful if you could pass along links to some projects (if any exist). It is worth mentioning that I have never really used Linux: once, as a child, I installed it on a computer to see what it was like, but nothing more. I do not have the slightest idea about Linux.

Sorry, but I don't think there's much anyone on a forum will be able to help you with here. It sounds like you're asking us to teach you programming from the ground up, which is a fairly complex subject. You don't say what language(s) you want to learn, and doing things in Excel doesn't help much. Add to that the fact that you're NOT using Linux right now, which means that asking on a Linux forum won't get you much help with Windows.

Short answer: yes, you can.
Longer answer: there are many projects that use GPUs to crunch numbers...and you can find them yourself with a Google search. Read the "Question Guidelines": we're happy to help with specific questions, but asking us to look things up for you isn't a good thing.

Linux is an operating system, just like Windows or macOS, except that all the things you pay for with the other two are free in Linux, including compilers, editors, debugging tools, and a LOT of professional-level advice. If you've NEVER programmed before, you will be in WAY over your head if you start with a project like this. It would be like saying "I've never driven a car before, but have started one in a driveway one time. I'd like to enter a professional race, where can I sign up?" Choose a programming language, start to follow some tutorials, look at sample code, and experiment. If you don't want to run Linux fully right now, install VirtualBox on your Windows system and run Linux in a virtual machine.

Lysander666 07-04-2019 11:11 AM

Quote:

Originally Posted by TB0ne (Post 6011910)
It would be like saying "I've never driven a car before, but have started one in a driveway one time. I'd like to enter a professional race, where can I sign up?"

Or, "I've never made a proper dessert before but I've had strawberries and ice cream. How do I make a Riz à l'impératrice?"

dugan 07-04-2019 11:15 AM

You're thinking of compute shaders.

Quote:

Originally Posted by samir.18 (Post 6011817)
Is it something easy for someone at my level to do?

No.

Quote:

Is there any tutorial or documentation that would give me a rough idea of the subject?
Absolutely. Here's one for a widely used library (NVIDIA only, I think) for writing programs that do exactly that:

http://supercomputingblog.com/cuda-tutorials/

There are also online courses you can take on this subject.
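To give a flavor of what those tutorials cover, here is a minimal, illustrative sketch of the basic GPGPU pattern: the CPU copies an array to the GPU, launches a kernel that runs one lightweight GPU thread per element, and copies the result back. It assumes an NVIDIA card with the CUDA toolkit installed (compile with nvcc square.cu -o square); the file name and variable names are made up for illustration, and real code would also check the return values of the cuda* calls.

```cuda
// The CPU fills an array, the GPU squares every element in parallel,
// and the CPU reads the result back over the PCIe bus.
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: each GPU thread handles exactly one array element.
__global__ void square(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * data[i];
}

int main() {
    const int n = 1024;
    float host[n];
    for (int i = 0; i < n; ++i) host[i] = (float)i;

    float *dev;
    cudaMalloc(&dev, n * sizeof(float));          // allocate GPU memory
    cudaMemcpy(dev, host, n * sizeof(float),      // copy input to the GPU
               cudaMemcpyHostToDevice);

    // Launch 4 blocks of 256 threads each = 1024 threads, one per element.
    square<<<(n + 255) / 256, 256>>>(dev, n);

    cudaMemcpy(host, dev, n * sizeof(float),      // copy result back
               cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("3 squared = %g\n", host[3]);          // expect 9
    return 0;
}
```

The point of the example is the division of labor: the launch line is the only place the CPU "asks" the GPU to compute, and everything else is just moving data across the bus, which is why GPU computing only pays off when the work per byte transferred is large.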

X-LFS-2010 07-05-2019 01:29 PM

Get back to me when you're finished writing your custom "use GPU not CPU" driver in 100 years.

BTW: you have about 1-2 years to develop, market, and COLLECT SALES, because beyond that the big-cheese manufacturers will have changed all the standards, your code and your model will be broken, and you'll have to do significant rewrites.

Good luck!

Hey, BTW: the new Intel Gen 9 chips have huge computing power and a very fast bus to their built-in GPU (which, by the way, professional software ALREADY uses for math acceleration and other purposes in Windows 10 and iMac software releases; perhaps they are using the free Intel compiler rather than Clang). AMD and NVIDIA discrete-card GPUs are much faster... but software often misses the fact that a 32-core CPU presents no real need to cross the graphics bus (the physical PCIe bus) unless you have a card and it's time to render.

Just things to think about: today's "use only the CPU", "use only the GPU", "the GPU can't be in the CPU"... these are all things that can be vastly different just 5 years from now.


All times are GMT -5. The time now is 08:31 PM.