There are a lot of components inside a Linux distribution: hundreds, sometimes thousands, of pieces of software combined together to give you a running, usable operating system. Each Linux distribution has its own set of components; it may be very similar to other distributions in some areas and quite different in others.
Those components are not developed by a single vendor. The Linux kernel, for example, is developed by Linus Torvalds along with the kernel community and many companies (thousands of people in total), the GNU tools are developed by the GNU Project and the Free Software Foundation, the KDE desktop environment is developed by the KDE project, the Firefox browser by Mozilla, the X display server by the X.Org Foundation.. and so on.
What Linux distributions do is take the source code of all those applications and build it, converting it into packages that can be installed on users' computers, then put those packages in a repository. The distribution makers also build an ISO file from those packages that you can download, burn and install on your PC. Linux distributions are really just a way of shipping that software to end users in a specific format, and they differ from each other by the default set of components, software and features they offer.
Each component is written in a different programming language; you can't say that Ubuntu, for example, is developed in C, because it contains a lot of applications written in other languages like Python, Ruby, C++, Perl.. etc.
Here we list some main components of a Linux distribution.
Linux Kernel
The heart of any operating system is the kernel. In 1991 Linus Torvalds announced the first public release of the Linux kernel. It has evolved a lot since then, and today there are thousands of volunteers, companies and software developers working on maintaining it. It exists in almost every smart device you may see around you, from servers to Android phones, cars, IoT devices and a lot more. Linux is almost everywhere.
The Linux kernel is the part responsible for linking software to hardware and distributing resources among applications; it handles I/O requests and schedules processes on the CPU. It also drives hardware components like the GPU, sound card, network card, RAM, hard disks.. etc. The kernel is the heart of every single operating system; no system can work without one.
Today, there are more than 20 million lines of code in the Linux kernel.
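If you are on a Linux machine right now, you can ask the kernel to identify itself with the standard `uname` command, a quick sketch:

```shell
# Print the running kernel release, e.g. "6.1.0-18-amd64".
uname -r

# More detail: kernel name, release and machine architecture in one line.
uname -srm
```

The exact version string depends on your distribution and how recently it was updated.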
GNU Tools
GNU is a project started in the 1980s by Richard Stallman, who is also the founder of the Free Software Foundation (FSF). GNU's aim was to create a completely free operating system as an alternative to UNIX. GNU developers created the tools and programs surrounding the operating system, like the boot loader, the Bash shell and the GCC compiler, but they didn't get to finish the operating system kernel. So what actually happened in 1991 is that Linus Torvalds created the Linux kernel and combined it with the GNU software and tools to create a functional operating system.
The GNU project actually includes a lot of things, like the Bash shell, the GNU C Compiler (GCC), the GRUB boot loader, GTK+, Gzip, Nano and a lot of other software. Many of them have their own place in your Linux distribution right now.
One of the main GNU projects is the GRUB boot loader:
This is one of the first pieces of software that runs after you press the power button: the computer's firmware loads the boot loader from the MBR (or from the boot partition on GPT disks). The boot loader's job is to load the operating system kernel and the other components it needs to start. Almost all Linux distributions use a boot loader called "GRUB". There are other boot loaders like LILO and BURG, but they are old and no longer common.
We also have the Bash shell and its utilities, which form the command line interface where you enter the commands you want the system to execute directly; it's the spirit of every Linux distribution.
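A quick taste of what the shell does: you type a command, the shell runs it, and you can chain programs together with pipes. A minimal sketch:

```shell
# A shell variable and a pipeline: count the words in a greeting.
greeting="Hello from the Bash shell"
echo "$greeting"
echo "$greeting" | wc -w    # prints 5
```

The pipe (`|`) sends the output of one program to the input of the next, which is how small shell utilities get combined into bigger jobs.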
Display Server or Windowing System
A display server is the software responsible for drawing the graphical user interface on the screen. From icons to windows and menus, every graphical thing you see on the screen is drawn by a display server (also known as a windowing system). Without a display server, you would be left with a full-screen, text-only command line interface.
There are many display servers out there; for Unix-like systems and Linux distributions the most famous one is the X display server. The X11 protocol was released in 1987, even before the Linux kernel, and it is still in use today.
Because the X server is more than 30 years old, buggy and full of security problems, some developers backed by companies like Red Hat and Intel started working on a new display protocol called "Wayland", which is still under heavy development.
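You can usually tell which of the two your desktop session is using by looking at the `XDG_SESSION_TYPE` environment variable, which the session sets at login (outside a graphical session, e.g. on a text console or over SSH, it may be empty or `tty`):

```shell
# Prints "x11" or "wayland" inside a graphical session,
# and "tty" or "unknown" on a text console.
echo "${XDG_SESSION_TYPE:-unknown}"
```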
Display Manager
DMs are used to show the welcome screen after boot and to start desktop sessions by connecting to the X display server. Display managers are the welcome screens that ask for your username and password before letting you log in to your desktop environment. If you are using the GNOME environment then the default display manager is GDM; KDE traditionally shipped KDM, while modern KDE Plasma releases use SDDM.
You can use any display manager you want, even with a desktop environment other than its default one.
You can't run more than one display manager at a time, since only one of them can take control of the local graphical session.
Daemons
Daemons are programs that run in the background of the operating system instead of being normal windowed applications on the user interface. They handle specific jobs and processes needed by the operating system, like the NetworkManager daemon, which connects you to the network automatically when you log in to your system.
The most famous daemon is "systemd", the init system that manages the whole operating system's jobs. It is the first process executed after the Linux kernel is loaded. Its job is simply to control other daemons and run them when needed, at boot time or at any time you want; it controls all the services available on the operating system and can start, stop or modify them when needed.
In the past, "sysvinit" held this job, but because it became outdated, almost all Linux distributions use systemd now.
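You can check this yourself by looking at what process 1 is, and manage services with `systemctl`; a hedged sketch (inside a container or on a non-systemd system, PID 1 may be something else entirely and `systemctl` may be absent):

```shell
# Show the name of process 1 -- "systemd" on most modern distributions.
ps -p 1 -o comm=

# Typical service management commands (require systemd and root privileges):
# sudo systemctl status NetworkManager    # inspect a service
# sudo systemctl enable NetworkManager    # start it automatically at boot
# sudo systemctl disable NetworkManager   # stop starting it at boot
```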
Package Manager
On Linux, software is handled as "packages". If you want to install an application, a library, a game or anything else on a Linux system, you should use packages and a package manager to do it. On Linux you don't browse the internet trying to find good applications to download and install on your PC; you should never do that on Linux, this is not Windows.
What you simply do is open your package manager / software center to find and install the applications you need; you can search for them by name and install them in one click. If you don't know what you are looking for, you can search the package descriptions online on websites like Ubuntu Packages and Fedora Packages to get the package name and install it later.
The main thing here is the packaging system. The Red Hat family, and many other families of Linux distributions, use the "rpm" packaging system, where packages come in the .rpm format (roughly like .exe on Windows). The Debian family, on the other hand, uses the "dpkg" packaging system, which ships its packages in the .deb format.
You can't install .deb files on an rpm-powered Linux distribution; it won't work. You may try to convert them from .deb to .rpm using a tool called "Alien", but that may not work either, so you should grab packages from your system's official repositories.
Remember: the packaging system is the core system for managing software on Linux. You may use it directly to create packages, install them locally and many other things, but usually we don't deal with it on a daily basis; instead we use a "package manager", which is like the interface for managing software on Linux.
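To make the split concrete on the Debian family: `dpkg` is the low-level packaging system that operates on local `.deb` files, while `apt` is the package manager that resolves dependencies and downloads from repositories. A sketch (the commented commands assume a Debian-based system and root access, and `firefox.deb` is a hypothetical file name):

```shell
# Low level: the packaging system operates on local package files.
# sudo dpkg -i firefox.deb    # install one local .deb, no dependency handling
# dpkg -L firefox             # list the files a package installed on disk

# High level: the package manager talks to the repositories for you.
# sudo apt install firefox    # fetches firefox AND everything it depends on

# Harmless check you can run anywhere: is this a dpkg-based system?
command -v dpkg >/dev/null 2>&1 && echo "dpkg-based system" || echo "not dpkg-based"
```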
There are many different package managers on the Linux desktop, and they differ from one distribution to another: for example, Ubuntu uses the "apt" package manager while Fedora uses "dnf"; openSUSE uses "zypper" while Arch uses "pacman".
To make things clear, if you want – for example – to install Firefox on your system, you would run the following command on Ubuntu:
sudo apt install firefox
On Fedora:
sudo dnf install firefox
On openSUSE:
sudo zypper install firefox
And so on.. They are very similar as you can see, but underneath they are quite different, from the way they work to speed and security. "rpm" is the standard packaging format according to the Linux Standard Base (a Linux Foundation project), but .deb packages are far more numerous than .rpm packages, so in the end the choice between the two families is yours.
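Since the command names differ per family, a small script can tell you which of the common package managers your own system provides, a sketch:

```shell
# Detect which of the common package managers this system provides.
pm_found="none"
for pm in apt dnf zypper pacman; do
    if command -v "$pm" >/dev/null 2>&1; then
        pm_found="$pm"
        break
    fi
done
echo "package manager: $pm_found"
```

On an Ubuntu machine this would report `apt`, on Fedora `dnf`, and so on; on a system outside these four families it prints `none`.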
Desktop Environment
A desktop environment is a group of applications and libraries combined together to provide graphical applications to the user. They ship a lot of core libraries, services and programs for end users. The most famous desktop environments on the Linux desktop are GNOME and KDE.
GNOME, for example, uses the GTK library to draw the GUI (graphical user interface) widgets of its applications, while KDE uses the Qt library. GNOME comes with a desktop interface called "GNOME Shell" while KDE comes with "KDE Plasma". Desktop environments also provide other components like a display manager, file manager, session manager, archiver, web browser, UI toolkit, settings manager.. and a lot more.
User Applications
Finally, there are the normal applications that you use every day: Firefox, LibreOffice, a terminal emulator, VLC media player, Pidgin.. etc. Those applications all live in user space and can be completely different from one distribution to another.
Those applications (and all other packages) are downloaded from the distribution's repositories; a repository is a place that stores package files along with their metadata and distributes them to users, so that they can install them at any time they want.
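On your own machine, the repositories a distribution uses are defined in plain-text configuration files; a sketch that checks a few well-known locations (paths are illustrative and can vary between releases):

```shell
# Print which well-known repository configuration paths exist on this machine.
# Debian/Ubuntu: /etc/apt/sources.list and /etc/apt/sources.list.d/
# Fedora: /etc/yum.repos.d/   openSUSE: /etc/zypp/repos.d/   Arch: /etc/pacman.conf
for path in /etc/apt/sources.list /etc/apt/sources.list.d \
            /etc/yum.repos.d /etc/zypp/repos.d /etc/pacman.conf; do
    [ -e "$path" ] && echo "found: $path"
done
true  # absence of a path just means a different distribution family
```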
Conclusion
There are a lot of components that form a Linux distribution; those components may change from one distribution to another, and they may be completely different, but that's a good thing, since the code is open source and you can do whatever you want with your operating system. I hope you got a good view of all those things after this article. Share any questions you have in the comments section below.
With a B.Sc. and M.Sc. in Computer Science & Engineering, Hanny brings more than a decade of experience with Linux and open-source software. He has developed Linux distributions, desktop programs, web applications and much more, all of which have attracted tens of thousands of users over many years. He additionally maintains other open-source related platforms to promote open source in his local communities.
Hanny is the founder of FOSS Post.