Linux - Newbie: This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
One of my first computers was an IBM 286. It had DOS and Windows 3.1. I actually preferred using DOS over Windows. A few years later when I had a 586, I was introduced to Red Hat 6.0. I already felt comfortable with a CLI, and found the Linux equivalents to DOS commands were intuitive enough to learn quickly. I had Windows and Linux installed on separate hard drives. I never had luck getting Linux to connect to the Internet via dial-up, so I had to boot the Windows drive to connect to the Internet for help if I was really stuck on something.
Some things are easier to learn than others. I never really got into vi or vim. I've stuck with nano and it just works for me. My knowledge of vim is nothing more than :q!. I'm actually more familiar with ed.
My first "scripts" were really nothing more than a few lines of commands to run. Now I have scripts I've written that are hundreds of lines to automate some tasks.
Man pages in Linux are hit or miss for how helpful they are. I've seen man pages give a brief description of the program's purpose and then just tell you to see --help for options. The Arch wiki is a good resource, regardless of which distro you use.
For some things, like grep, I didn't find it difficult to learn. I think in my early days I picked up some bad habits from online tutorials giving examples like cat file.txt | grep 'something' instead of simply grep 'something' file.txt, or cat file.txt | less instead of simply less file.txt.
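To see why the tutorial habit is redundant, here is a minimal side-by-side comparison (the file name and contents are made up for the example); both forms print the same line, but the first spawns an extra cat process for nothing:

```shell
# Create a small throwaway file for the demo (hypothetical name).
printf 'alpha\nsomething here\nbeta\n' > /tmp/uuoc_demo.txt

# The tutorial habit: an extra cat process just to feed grep.
cat /tmp/uuoc_demo.txt | grep 'something'

# The simpler form: grep reads the file itself.
grep 'something' /tmp/uuoc_demo.txt     # prints: something here

# Same idea with a pager: prefer "less file.txt" over "cat file.txt | less".
rm /tmp/uuoc_demo.txt
```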
(1) Although most of the (different …) editors that I find myself using have some "support" for git version control, to me they are both "not the same" and "tedious." I simply open a terminal window, change to the project directory, and: … "biddy-boom!" it's done. I know the (small handful of) necessary commands by heart.
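The post doesn't list which commands make up that handful, but a typical everyday set looks something like this (a sketch only; the scratch-repo setup and placeholder identity exist purely so the example is self-contained and runnable):

```shell
# Build a throwaway repo so the everyday commands below have something to act on.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email "you@example.com"   # placeholder identity for the demo
git config user.name  "you"
echo "hello" > notes.txt

# The everyday handful:
git status                       # what changed?
git add -A                       # stage everything
git commit -m "add notes file"   # record it
git log --oneline                # confirm the commit landed
```

In a real project you would skip the setup lines and typically finish with a git pull and git push to sync with the remote.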
(2) So far as I know, many of the things that a developer routinely needs to do, such as updating [language-specific …] packages, have only a "command-line" interface.
Perhaps it’s just because I have to deal with so many “GUIs” at a time that I find “the CLI” more convenient.
Bottom line as I see it: “don’t be afraid of the console.” It enables you to “execute any program that you want to, anytime that you want to, as long as you know what you are doing.” Every GUI-tool builder “built on top of this.” And some of them did it extremely well.
Last edited by sundialsvcs; 01-19-2024 at 11:55 AM.
Where do yous like to go for underground scripts and tricks?
Start with the basic scripting provided on the link below: copy and paste two or three lines from it. Observe and check what each line does; don't immediately delve into complex scripts, or into how the algorithm works and why it produces such output. Just remember that for a given scenario or process, you need these particular commands.
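In that spirit, a first "script" really can be just a few harmless lines like these; run them one at a time and check what each one prints before moving on (the commands chosen here are my own illustrative picks, not from the linked page):

```shell
#!/bin/sh
# Three safe, read-only commands to start with.
date          # print the current date and time
whoami        # print your user name
df -h .       # show free space on the current filesystem
```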
The inflexibility of paper and ink has special cognitive properties << if you prefer this method, then I suggest printing whatever you find on the internet.
Good luck! Just enjoy the journey!
Last edited by JJJCR; 01-22-2024 at 12:28 AM.
Reason: edit
The inflexibility of paper and ink has special cognitive properties << if you prefer this method, then I suggest printing whatever you find on the internet.
I wonder how I could omit this in my previous post. Having a book lying beside the keyboard – provided the index at the end is worth the designation – and *knowing* that you can return to the very same chapter *now*, and as often as you want, has been a comfortable way to learn.
Although I also learned to discard some books which I had paid for but which turned out to be failures. In hindsight, that too was proof of my progress, like a small, paid certification.
Last edited by Michael Uplawski; 01-20-2024 at 03:18 AM.
Reason: language WIP
Although the syntax at first appears to be "daunting," the practical idea is, "well ... practical."
grep -rilw pattern << very nice trick
Yep, you'll get used to it in no time. << word of encouragement. I completely agree: instead of doing other things that don't really add value, just focus on what will help you attain the goal. In no time, fat fingers will be the only cause of issues.
There are probably thirty or forty “command-line commands and ‘pipe sequences’” that I know by heart. This is the practical knowledge that daily keeps me going. But it is impossible to know everything about anything, and it turns out that you don’t actually have to.
(N.B.: By “pipe sequence,” I mean where the output of one command is “piped” into a downstream command using the “|” operator.)
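A small, deterministic pipe sequence makes the idea concrete: each "|" hands one command's output to the next command downstream. Here three words go in, sort groups the duplicates, uniq collapses them, and wc counts the survivors:

```shell
# Three words in; duplicates grouped, collapsed, and the remainder counted.
printf 'apple\nbanana\napple\n' | sort | uniq | wc -l    # prints 2
```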
For example: git. A tool that I encounter almost every day. I know about it exactly what I need to know, and I won't learn more until my next engagement requires me to. Am I therefore very inclined to "puzzle out" how [this-or-that] editor tried to "GUI-ize" it? Frankly, no. Because I know that, if I can get to a terminal (CLI) and then get to the directory, I can accomplish the necessary task "by reflex."
Or, as above: [b]grep -rilw[/b] … "give me a list of every file, in the specified directory and every one of its subdirectories, which contains the following 'word' (delimited by 'whitespace')." How long would it take you to convince a GUI to do that, if its implementor thought of it and chose to implement it? Or did you presuppose that you would have to "write a brand-new program"?
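A tiny self-contained demo of those four flags (the directory and file names are invented for the example): -r recurses into subdirectories, -i ignores case, -l prints only matching file names, and -w requires a whole-word match.

```shell
# Build a throwaway tree with one word match and one near-miss.
mkdir -p /tmp/gril_demo/sub
printf 'Kernel panic\n'  > /tmp/gril_demo/a.txt       # "Kernel" as a word
printf 'kernels galore\n' > /tmp/gril_demo/sub/b.txt  # "kernels" is not the word

grep -rilw 'kernel' /tmp/gril_demo
# Lists only /tmp/gril_demo/a.txt: -i accepts "Kernel", -w rejects "kernels".

rm -r /tmp/gril_demo
```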
And then: "(pipe this output stream into) …" No. You have just stepped out of "GUI-land" and into "power tools."
The “command line” should be properly regarded as being alongside graphical environments. However, it is entirely understandable that this idea might be very alien to some. It may well be that you have never before encountered a computing environment which had this property. “We understand.”
Last edited by sundialsvcs; 01-23-2024 at 06:08 PM.
Or, as above: [b]grep -rilw[/b] … "give me a list of every file, in the specified directory and every one of its subdirectories, which contains the following 'word' (delimited by 'whitespace')." How long would it take you to convince a GUI to do that, if its implementor thought of it and chose to implement it? Or did you presuppose that you would have to "write a brand-new program"?
I haven't deliberately installed a file manager in a Linux system for years because it's so much quicker and simpler to use the coreutils tools with a grep filter when working in a big system directory.
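That workflow is just coreutils output with a grep filter bolted on; a self-contained sketch (the demo directory and file names are made up, since filtering a real /etc would give machine-dependent output):

```shell
# Stand-in for a big system directory.
mkdir -p /tmp/fm_demo
touch /tmp/fm_demo/hosts /tmp/fm_demo/fstab /tmp/fm_demo/resolv.conf

ls /tmp/fm_demo | grep 'conf'     # only the names matching "conf": resolv.conf
ls -lt /tmp/fm_demo | head -n 4   # most recently modified entries first

rm -r /tmp/fm_demo
```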
I haven't deliberately installed a file manager in a Linux system for years because it's so much quicker and simpler to use the coreutils tools with a grep filter when working in a big system directory.
I've been using nnn for my file manager. Runs in the terminal. Useful for navigating directories quickly or quickly selecting multiple files/dirs from multiple directories to process.
While I don't hail from the early days of computing (mainframes and early DOS), I was raised on DOS-based machines at school, where I learned to type with Mavis Beacon Teaches Typing on DOS 4.x through 6.x. The first computer my father purchased for our family, in the mid-nineties, was a Dell machine with meager specifications that came with MS-DOS version 4 or 5. We upgraded to Windows 95, then Windows 98, and finished the early Windows era with 98SE. As I became more skillful with computers, around the XP era, I decided to poke around with the BSDs and Linux.
It may seem ridiculous to certain veterans of the command-line ways that I (now a 23+ year Linux user) cut my Linux teeth by jumping in feet first and configuring a local file and print server with an early-2000s version of Red Hat Linux (pre-IBM days). I learned to solve my self-inflicted woes by turning to Google, a forum like LQ, or pure trial and error. That learning curve has stretched over the aforementioned 23+ years of using both GNU/Linux and the BSD distributions.
I learned a whole lot more about how the software works once I could come by second-hand PCs onto which I would slap either GNU/Linux or one of the many flavors of BSD. Heck, I've even gotten Broadcom-chipset WiFi adapters working under a flavor of BSD, and hard-to-support Nvidia video cards working via either the drivers from the FreeBSD ports system or the pre-compiled packages. I've also gained a great deal of knowledge of how to compile not just packages but also how to customize and compile my own kernels on both GNU/Linux and FreeBSD.
I've also gained a great deal of knowledge of how to compile not just packages but also how to customize and compile my own kernels on both GNU/Linux and FreeBSD.
I thought building your own kernel was frowned upon in the BSD world because of the tight integration of the base system. I used to do it in Linux back in the days when I used very limited hardware. I got sick and tired of watching those dots crawl across the screen as LILO read the kernel in, knowing that most of what it was reading was not actually going to be used.
One of my favorite distros in those days was Crux, because it ran fast even on old hardware, but they made you build your kernel locally. I was surprised to find out how easy it is to do that using menuconfig, though of course the actual compilation took ages. Kernel Help is the most amazing help system that I have ever seen, the only one I know of that actually tells you whether or not you need to use a given option. And of course a home-made kernel doesn't need an initrd, so that was another bit of mystification done away with.
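For anyone curious, the local-build workflow described above looks roughly like this on a modern kernel tree (a sketch only: it assumes you are in a kernel source directory with root privileges for the install steps, and the bootloader step varies by distribution):

```shell
# Rough sketch of a local kernel build; not a full recipe.
make menuconfig            # pick options; the built-in Kernel Help explains each one
make -j"$(nproc)"          # compile in parallel (the part that takes ages)
sudo make modules_install  # install the kernel modules
sudo make install          # install the kernel and update the bootloader
```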
But now that I'm old and tired and have a 4-core cpu and 4 GB of ram, I don't do that any more.