Models of Parallel Processing

Posted 10-07-2011 at 08:38 PM by DJ Shaji

Although the human brain resembles conventional technology in one respect - each of its processing units can only handle one signal at a time - it differs starkly from contemporary computers in that it is divided into billions of tiny networked processing units, each specialized to perform a single function. Together, these units let us perform feats of pattern recognition and cognition that would otherwise be impossible. There are multiple ways to approximate this arrangement in an artificial setting.

1. To begin with, we could use concurrent processing, more popularly known as multiple threads. Although this would be a unified approach to the problem, it would not be feasible: data shared between threads must be synchronized, and since we want the system to evolve, it is logical to assume it would spawn more and more threads as it organized itself, until at some point the system would no longer be stable. (A minimal sketch of this appears after the list.)

2. A more feasible approach would be to make the system modular: divide it into independent plugins, for lack of a better word, although the term as it is used in popular software does not quite apply here. It would be better to say that we would design the program as a collection of smaller programs, distributed according to their purpose and usage within the system. This approach tailors the design to the underlying operating system, so instead of reinventing the wheel, we simply reuse the scheduling interface the operating system already implements.
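To make the first point concrete, here is a minimal sketch of the threaded approach, written by way of illustration with POSIX threads; the unit count and the shared counter are stand-ins of my own, not part of any real design. Even this toy version shows the problem: every exchange of data between units has to pass through a lock, and that coordination cost only grows as the number of threads does.

/* threads.c - toy version of approach 1: units as threads in one
 * address space. Compile with: gcc -pthread threads.c -o threads */
#include <pthread.h>
#include <stdio.h>

#define NUM_UNITS 8                      /* stand-in for "thousands" */

static long shared_signal = 0;           /* data the units exchange */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *unit(void *arg)
{
    long id = (long)arg;
    /* Every access to shared state must take the lock; with enough
     * units this serialization becomes the bottleneck. */
    pthread_mutex_lock(&lock);
    shared_signal += id;
    pthread_mutex_unlock(&lock);
    return NULL;
}

int main(void)
{
    pthread_t units[NUM_UNITS];
    for (long i = 0; i < NUM_UNITS; i++)
        pthread_create(&units[i], NULL, unit, (void *)i);
    for (int i = 0; i < NUM_UNITS; i++)
        pthread_join(units[i], NULL);
    printf("combined signal: %ld\n", shared_signal);
    return 0;
}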

Returning to the second approach: instead of creating one large monolithic program, we would design a collection of tiny individual programs - neurons, if you will - and organize them in a hierarchy. The core of the system - the brain, the critical sensory interfaces, and so on - would run at a lower nice value and would be protected by setting the appropriate permissions, and with modern kernels we could even have dedicated CPU bandwidth for these processes. To state it simply, we won't make a smart system; the computer will be the system.
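By way of illustration, here is a minimal sketch of what the core launcher might look like, under assumptions of my own: the "brain" lowers its own nice value (negative values need root or CAP_SYS_NICE) and then spawns the neurons as independent programs, leaving all scheduling to the kernel. The neuron names are hypothetical placeholders, and the dedicated CPU bandwidth mentioned above would be configured separately through the kernel's cgroup filesystem rather than from code, so it appears here only as a comment.

/* brain.c - toy launcher for approach 2: small independent programs
 * scheduled by the kernel. Compile with: gcc brain.c -o brain
 * (Dedicated CPU bandwidth would be granted by placing the core in a
 * cgroup via the cgroup filesystem; that step is not shown.) */
#include <stdio.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/resource.h>
#include <sys/wait.h>

int main(void)
{
    /* Give the core scheduling preference; nice values range from
     * -20 (most favored) to 19 (least favored). */
    if (setpriority(PRIO_PROCESS, 0, -10) == -1)
        perror("setpriority");       /* usually fails without root */

    /* Hypothetical neurons: each a tiny single-purpose executable. */
    const char *neurons[] = { "./edge-detector", "./tone-matcher" };
    int n = sizeof(neurons) / sizeof(neurons[0]);

    for (int i = 0; i < n; i++) {
        pid_t pid = fork();
        if (pid == 0) {              /* child becomes a neuron */
            nice(5);                 /* neurons run less favored
                                        than the core */
            execl(neurons[i], neurons[i], (char *)NULL);
            perror("execl");         /* reached only if exec fails */
            _exit(1);
        }
    }

    while (wait(NULL) > 0)           /* kernel schedules the neurons;
                                        the core just reaps them */
        ;
    return 0;
}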