LinuxQuestions.org
Old 12-02-2013, 02:26 PM   #1
CamTheSaxMan
Member
 
Registered: Nov 2013
Distribution: Linux Mint 13 Cinnamon Edition 64-bit, Windows 7 Home Premium SP1 64-bit, Arch Linux 32-bit
Posts: 161

Rep: Reputation: 9
Where can I find information and resources on creating GUI programs in Assembly?


I'm currently learning x86 assembly language. I can create simple command-line programs, but I'm looking to expand my skills to graphical applications. Does anyone know where I can find documentation on this? And please don't tell me to use C/C++ or any of that object-oriented crap. I'm interested in assembly.
 
Old 12-02-2013, 05:49 PM   #2
mtsinc
LQ Newbie
 
Registered: Jan 2002
Location: Jacksonville, FL
Distribution: RedHat
Posts: 26

Rep: Reputation: 16
Well, if all I wanted was an excruciating pain in the head, I'd go find a granite wall and smash my noggin against it repeatedly. But then, I got all the assembly language I needed back in my mainframe days where you didn't have any high-level languages capable of doing OS-level programming.

These days, it's quite the opposite, thank goodness. But you're going to have to do a fair amount of looking to find many examples of GUI assembly language programming for Linux because there's rarely any practical justification. All the GUI libraries are themselves written in C or C++ and even the graphics hardware itself is programmed at a high level.

You can certainly code assembly language application logic that calls these functions, same as you'd call any other library function from assembly language code. It's just a matter of stuffing the proper registers and/or stack (depending on what machine's assembly language you're speaking) all done according to the API definitions for the desired GUI library functions.

The days of raw brute force bit-blitting and so forth are long gone. We're no longer jamming pixels into RAM, we're talking to sophisticated graphics co-processing systems which are generally "black boxes" - or at least "No User Serviceable Parts".
 
Old 12-02-2013, 06:41 PM   #3
astrogeek
Moderator
 
Registered: Oct 2008
Distribution: Slackware [64]-X.{0|1|2|37|-current} ::12<=X<=15, FreeBSD_12{.0|.1}
Posts: 6,269
Blog Entries: 24

Rep: Reputation: 4196
Quote:
Originally Posted by CamTheSaxMan View Post
I'm currently learning x86 assembly language. I can create simple command-line programs, but I'm looking to expand my skills to graphical applications. Does anyone know where I can find documentation on this? And please don't tell me to use C/C++ or any of that object-oriented crap. I'm interested in assembly.
Speaking as an old assembly programmer, I'd say you may want to rethink your opinion about the appropriate uses of C/C++, OOP and other crap...

Assembly language is awesome, you should learn it! But it is not applicable to all problems.

C/C++ and OOP are not simply "alternative" programming languages. They are methods that were developed to allow mortal programmers to cope with growing complexity within a single human lifetime - the complexity of GUI programming being right near the top of the list!

You "could" probably implement a GUI entirely in assembly language, but you would reinvent fire, smelting of metals and quite a number of wheels along the way, if you lived long enough!

If you are having difficulty finding references for this, it is probably because there are none.

So, while I would encourage your learning programming at the assembly level as very worthwhile, I would also suggest that you learn the historical development of programming concepts and the lessons of assembly that led to OOP and the languages that support it as well. They complement each other for very good reasons.

Last edited by astrogeek; 12-02-2013 at 06:45 PM.
 
Old 12-02-2013, 08:19 PM   #4
Kenarkies
Member
 
Registered: Nov 2007
Location: South Australia
Distribution: Ubuntu 11.10
Posts: 81

Rep: Reputation: 23
Just some more (de)motivation

Another good reason to avoid assembler for anything but tutorial practice is that the code is only applicable to one type of processor, and often only to its particular hardware environment. Most likely your application will be obsolete before you finish it. I started to use assembler programming 40 years ago on 8080 systems with 16K memory, which made sense. Everything in the last few decades has been in C (for embedded work) as C is reputed to produce code very close in speed and size to the best that assembler can do. Even OO languages can be coaxed to do as well.
 
Old 12-02-2013, 09:13 PM   #5
CamTheSaxMan
Member
 
Registered: Nov 2013
Distribution: Linux Mint 13 Cinnamon Edition 64-bit, Windows 7 Home Premium SP1 64-bit, Arch Linux 32-bit
Posts: 161

Original Poster
Rep: Reputation: 9
I really don't get why people avoid assembly so much. Sure, it's more work and more lines of code, but the end product can be far more efficient than something written in high-level languages such as C/C++. There are also macros and include files to eliminate a lot of the grunt work. Try making something like KolibriOS in C++. The entire operating system, with its GUI, games, and programs, fits on a single 1.44 MB floppy. Tiny Core Linux without the GUI and programs is 8 MB. Assembly language is the reason why RollerCoaster Tycoon 2 ran so smoothly on crappy 90s hardware. I'm sure people questioned why Chris Sawyer decided to program the whole thing in assembly, but he pressed on and made this programming masterpiece.

I want to know how to invoke all of the various windowing system calls and X libraries in Assembly. I just want to know if anyone knows where I can find information about this. Even if it's obsolete, I still want to learn it.
 
Old 12-02-2013, 09:34 PM   #6
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,667
Blog Entries: 4

Rep: Reputation: 3945
"Bit-blitting?" Do I detect the presence of a fellow Amiga soul?

---

"Cam," it might actually surprise you to find that compiler-generated code these days is often more efficient than the stuff you might come up with on your own, because microprocessors today really are designed to process instructions that have been created by compilers ... not by humans. "The masters of silicon" ... ooommmm ... might In Their Infinite Wisdom™ Determine That "some particular instruction sequence™" is Optimal.™ Therefore, what's the very-best way to ensure that this sequence gets generated, therefore used, "at just the right time?" Yep, you guessed it! By arranging for gcc (say ...) to do so.

While it is certainly interesting, and informative, to delve into "the guts of a microprocessor," you should no longer presume that this endeavor will in fact lead to The Nirvana™ Of Efficiency. This is no longer true.
 
Old 12-02-2013, 10:08 PM   #7
sag47
Senior Member
 
Registered: Sep 2009
Location: Raleigh, NC
Distribution: Ubuntu, PopOS, Raspbian
Posts: 1,899
Blog Entries: 36

Rep: Reputation: 477
While I agree with all of the practical points made thus far, I can also understand the desire to learn something just for the heck of it (even if no real benefit comes of it, just as a hobby). Though I haven't done GUI programming in assembly myself (I have done other assembly work), it is best to understand that distributions usually package the X Window System alongside the Linux kernel. I'll give you my process for how I found the information.
  1. First I googled "x windows api reference"
  2. That led me to xlib.
  3. Then I googled "calling xlib assembly"
  4. That led me to many documents on the subject; however, this one stuck out in particular.

To quote that last document...
Quote:
As you can see, producing an XLib program in assembly language is rather
unwieldy. The code produced is primarily data manipulations and C calls; there
is not a lot that assembly has to offer, even in the event loop. In fact, the
only real optimization --aside from overhead added by the compiler, which in
the above case we do not bypass-- is in the use of straight calls rather than
the macros my original C "hello world" relied on.

While this in itself is somewhat of a triumph --for by coding the C application
in assembler you learn exactly how much superfluous code there was to get rid
of-- it is not enough. In the next issue, I will cover Xt programming in
assembler, which will use widgets/resources rather than create windows from
scratch, therefore placing the bulk of the code in existing system libraries
and therefore making the resultant application much smaller.
Which is pretty much the solution mtsinc stated in their first reply...

Quote:
Originally Posted by mtsinc View Post
You can certainly code assembly language application logic that calls these functions, same as you'd call any other library function from assembly language code. It's just a matter of stuffing the proper registers and/or stack (depending on what machine's assembly language you're speaking) all done according to the API definitions for the desired GUI library functions.
Granted, this requirement will change if you decide to go with another windowing system or completely invent your own GUI system. Since the asm mentioned was gcc asm, it is worth noting the gcc x86 constraints.

Last edited by sag47; 12-02-2013 at 10:17 PM.
 
Old 12-02-2013, 10:12 PM   #8
astrogeek
Moderator
 
Registered: Oct 2008
Distribution: Slackware [64]-X.{0|1|2|37|-current} ::12<=X<=15, FreeBSD_12{.0|.1}
Posts: 6,269
Blog Entries: 24

Rep: Reputation: 4196
Quote:
Originally Posted by CamTheSaxMan View Post
I want to know how to invoke all of the various windowing system calls and X libraries in Assembly. I just want to know if anyone knows where I can find information about this. Even if it's obsolete, I still want to learn it.
But that really illustrates a point being made here - all those libraries and calls are written in C/C++ and it is the compiler that maps all those calls for you.

So, while you "can" call them from an assembly based code block, you will mostly need to discover and map those calls, and any data structures they use, yourself.

There is no guide to doing it, because it is not done that way...

Last edited by astrogeek; 12-02-2013 at 10:15 PM.
 
Old 12-03-2013, 02:41 AM   #9
zhjim
Senior Member
 
Registered: Oct 2004
Distribution: Debian Squeeze x86_64
Posts: 1,748
Blog Entries: 11

Rep: Reputation: 233
Jump the train: http://www.menuetos.net/
 
Old 12-03-2013, 07:00 AM   #10
mtsinc
LQ Newbie
 
Registered: Jan 2002
Location: Jacksonville, FL
Distribution: RedHat
Posts: 26

Rep: Reputation: 16
Quote:
Originally Posted by Kenarkies View Post
Another good reason to avoid assembler for anything but tutorial practice is that the code is only applicable to one type of processor, and often only to its particular hardware environment. Most likely your application will be obsolete before you finish it. I started to use assembler programming 40 years ago on 8080 systems with 16K memory, which made sense. Everything in the last few decades has been in C (for embedded work) as C is reputed to produce code very close in speed and size to the best that assembler can do. Even OO languages can be coaxed to do as well.
Don't discount OO languages.

As I said, I spent years doing assembly on an IBM mainframe. Not always because it was more efficient (although for systems-level stuff, that's always kept in mind), but because the high-level languages of the day dragged in stuff that simply could not operate in the areas where we were working. My boss hated it, because whatever machine efficiency was gained was more and more cancelled out by how inefficiently it used his staff. We tried all sorts of things, including an elaborate system of macros from NASA.

The tipping point came circa 1985 when IBM's Pascal/VS came out. This was a compiler that could produce code as efficient as - or more efficient than - the average assembler programmer's, and it could globally re-optimize the entire program top to bottom every time you made a code change. Which simply wasn't going to happen with assembly language.

Returning to OO languages, Java has gone assembly one better. Some Java Virtual Machines actually monitor running code and re-write it on the fly. For example in the old days when timings were much simpler, often a conditional branch statement took a lot longer to execute a branch than to skip it. The JVM could monitor the ratios of branches to skips and invert the statement to take better advantage. That's just a primitive example now that we also have pipelined processors where you want to avoid "bubbles", context switches (such as page faults) and so forth. The JVM is a big hungry chunk of code all by itself, but a lot of apps these days are even bigger and hungrier, so it can pay off handsomely as things scale upwards.

I have nothing against academic exercises, but assembly code for an Intel x86_64 isn't going to run on a Raspberry Pi, and most of any X-based GUI app will consist of function calls to the C/C++ GUI services, so the price/performance ratio in practical terms sucks.

The primary uses for assembler these days are in the lowest-level interrupt and I/O routines. Almost everything else is in a higher-level language. In the case of Linux, this goes all the way back to Day 1 (Red Hat celebrated a profitable year by sending out a poster with the entire original Linux kernel source code on it).

Even a lot of embedded process controllers don't do assembly any more (for example, the Arduino). While assembler will never die, the career opportunities have become a lot more limited. Although I did see an ad for an IBM mainframe assembler programmer last week. Probably maintaining a system that's older than most people on this forum.
 
1 member found this post helpful.
Old 12-03-2013, 07:18 AM   #11
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,667
Blog Entries: 4

Rep: Reputation: 3945
We also did assembler because the "mainframes" of the day were tiny. It really did matter that the code be as compact as possible. The compiler technologies also were forgettable, again because the machines were so small. (Early books on compiler-writing talk extensively about how to design the compiler using overlays ... and how to organize space on the magnetic drum.) Even the venerable System/370 had a 24-bit address bus and therefore supported only 16 megabytes of RAM (which was "inconceivably plenty").

Yes, Pascal/VS changed everything. In fact, IIRC there was a full complement of "/VS" compilers, and all of them were good. They required a bigger machine in order to run them, but by then, that's what you had.

Within the Linux kernel you can see what I would call the "proper use" of assembler: highly-targeted asm {...} blocks within "otherwise straight-'C'" source code. (There are also a very few ".S" files, e.g. in the "trampoline" bootstrap code, where nothing else will do.) Trouble is, assembler code today might be slower than what the compiler, left to its own devisings, might cook up for you, because modern microprocessors make extremely heavy use of pipelining ... and the designers of those microprocessors work very closely with the designers of those compilers. They'll find a particular thing that needs "lubrication," and design an instruction or set of instructions, then work with compiler writers to generate the "proper" code sequences in the "proper" way at just the "proper" times.

The best thing to do as a programmer, IMHO, is to write clear, maintainable code that expresses your intent as well as, secondarily, your approach. And, as Kernighan and Plauger wrote in "The Elements of Programming Style," all those years ago:
Quote:
"Don't 'diddle' your code to make it faster: find a better algorithm."

Last edited by sundialsvcs; 12-03-2013 at 07:21 AM.
 
Old 12-03-2013, 08:05 AM   #12
mtsinc
LQ Newbie
 
Registered: Jan 2002
Location: Jacksonville, FL
Distribution: RedHat
Posts: 26

Rep: Reputation: 16
Some perspective.

In 2004, I introduced the company I worked for to Linux. The system in question was a "junk" 500 MHz Pentium. But what it was used for was mind-boggling.

I installed the Hercules mainframe emulator on it.

We didn't have a physical mainframe. We only ran Windows and Solaris servers and outsourced the mainframe work to other companies. But we wanted a way to rip data definitions out of the COBOL source for one of those systems so that we could import from their flat files into our database. The easiest way to do this was to compile their source code locally (we had a license) and extract the binary offsets from the compiler output listing.

Like I said. A "junk" computer. Emulating at roughly the same speed as an IBM System/370 Model 168, which in the late 1970s was a water-chilled, room-filling behemoth. Running the version of IBM's top-of-the-line OS/MVS operating system circa 1986.

Fast-forward about 5 years and my cellphone could do as much.

I recently brought up a copy of Hercules on a $35 Raspberry Pi Model B (A model A would do as much for $10 less). The "DASD Farm" is on a USB thumb drive and is larger than the one we had on our Amdahl mainframe in 1986.

But the smallest JVM these days requires roughly 8 times as much RAM as the entire System/370 address space.

That's why I don't agonize over bits and bytes and microseconds like I used to.


And yes, good algorithms are more important. Even back then. One app I worked with had a bubble sort in it. The reason it had a bubble sort was that a bubble sort is something you can do in one page of assembly code. However, the actual data was MOSTLY already sorted, with just enough out-of-line elements that a lot of bubbling had to be done.

The usual bulk sorting algorithms perform very poorly when data is already mostly in order. But a Shell-Metzner sort happens to be a good fit. It's a modified bubble that allows data to skip wide intervals instead of moving up and down step by step.

It was also 3 pages of assembler code versus the 1 page that a straight bubble would be and I didn't get any medals or bonuses for my pains, but I was proud of it.
 
Old 12-03-2013, 08:30 AM   #13
schneidz
LQ Guru
 
Registered: May 2005
Location: boston, usa
Distribution: fedora-35
Posts: 5,313

Rep: Reputation: 918
maybe you can find the source code to something like gedit and compile it with gcc's -S option so it will output the assembler instructions?

Last edited by schneidz; 12-03-2013 at 08:33 AM.
 
Old 12-03-2013, 07:54 PM   #14
CamTheSaxMan
Member
 
Registered: Nov 2013
Distribution: Linux Mint 13 Cinnamon Edition 64-bit, Windows 7 Home Premium SP1 64-bit, Arch Linux 32-bit
Posts: 161

Original Poster
Rep: Reputation: 9
I have found a good bit of information on GUI assembly for Windows, but I was wondering if there was anything like this for Linux.

Assembly still has its uses though, even if not specifically for graphical applications. It's used in device drivers, codecs, bootloaders, debugging, and video game hacking. ...and computer viruses.

@sundialsvcs How does that work? Is the processor just better at doing certain instructions and sequences than others? I know with assembly language, there are oodles of optimizations and "dirty tricks" that no compiler can touch.
 
Old 12-03-2013, 08:59 PM   #15
sag47
Senior Member
 
Registered: Sep 2009
Location: Raleigh, NC
Distribution: Ubuntu, PopOS, Raspbian
Posts: 1,899
Blog Entries: 36

Rep: Reputation: 477
Quote:
Originally Posted by CamTheSaxMan View Post
@sundialsvcs How does that work? Is the processor just better at doing certain instructions and sequences than others? I know with assembly language, there are oodles of optimizations and "dirty tricks" that no compiler can touch.
Research CPU pipelining (aka multi-cycle CPUs). Also research data hazards and control hazards, and the methods used to bypass them. Pipelining is what introduces the "better at doing certain sequences than others" behavior, primarily because shared components are simply not available, so the pipeline can be delayed if certain instructions are next to each other. Smarter CPUs also use branch prediction which, if the assembly is not properly optimized, can end up clearing the pipeline at each branch, leaving several wasted cycles in a program's runtime. It is estimated that some Intel processors have as many as 40-50 pipeline stages. Sometimes processor manufacturers release optimization information in their Instruction Set Architecture documentation, usually spawned from limitations in the pipeline. Sharing components across different stages of the CPU pipeline is often done to cut manufacturing costs and downsize real estate on the processor die; it's usually done as the design matures out of the prototype stages and into mass production.

Other compiler optimizations can include loop unrolling, which tends to be much faster in a pipeline than iterating through the loop one item at a time.

Last edited by sag47; 12-03-2013 at 09:20 PM.
 
  


Tags
assembly, gui, linux, programming






