Linus Benedict Torvalds with the Penguin, mascot of Linux
Linux celebrates its 25th anniversary: a quarter-century in which it truly changed the world. Luckily for me, I was an early convert and adopter, if not in practice then at least in mind. It was 1991, and I was living in Southwest Washington, DC. Somehow my MCIMail account was among the recipients of a mailing list message that has since become a memorable and historic announcement. It read:
From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Date: 25 Aug 91 20:57:08 GMT
Organization: University of Helsinki
Hello everybody out there using minix –
I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I’d like any feedback on things people like/dislike in minix, as my OS resembles it somewhat (same physical layout of the file-system (due to practical reasons) among other things).
I’ve currently ported bash (1.08) and gcc (1.40), and things seem to work. This implies that I’ll get something practical within a few months, and I’d like to know what features most people would want. Any suggestions are welcome, but I won’t promise I’ll implement them :-)
PS. Yes – it’s free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that’s all I have :-(.
I don’t recall giving Linus Torvalds any technical feedback, or even a broad suggestion. I was still a UNIX newbie, challenged by an entrenched industrial operating system. For a while, I looked into A/UX, Apple’s now-defunct version of UNIX. Next, I made unsuccessful efforts to run an Apache web server on MachTen UNIX, from Tenon Intersystems. That company’s Berkeley Software Distribution (BSD)-based OS targeted Macintosh computers built on the PowerPC, M68K or G3 chips.
Dr. Bob Kahn (left) and Dr. Vinton Cerf (right): inventors of the TCP/IP Internet, which made the creation of Linux possible and spurred its growth and popularity.
Months after receiving Torvalds’s message, I had the privilege of participating in the 1992 conference in Kobe, Japan. Dr. Vinton Cerf, co-inventor with Dr. Robert Kahn of the TCP/IP stack of standards and protocols that underlies the Internet, chaired the event. I was part of a group of technologists from eight African countries (Algeria, Tunisia, Egypt, Kenya, Zambia, Nigeria, Senegal, Guinea) invited to the meeting. There, with the other delegates, we witnessed and celebrated the founding of the Internet Society.
In hindsight, and for a social sciences and humanities researcher like me, the early 1990s proved serendipitous, challenging and groundbreaking. As Linux began to gain a foothold, I tested several of its distributions in turn: MkLinux, Red Hat, CentOS, Ubuntu, Debian… before settling on CentOS and Ubuntu. Ever since, I have kept busy managing my Linux Virtual Private Server (VPS), which hosts a fairly complex array of services, languages, utilities, applications, front-end frameworks (Bootstrap, Foundation), the Drupal, WordPress and Joomla content management systems, etc. The VPS runs in full compliance with rules, regulations and best practices for efficiency, availability, productivity and security. It delivers rich content on each of my ten websites, which together make up my webAfriqa Portal. Freely accessible since 1997, the sites offer quality online library collections and public services: history, anthropology, economy, literature, the arts, political science, health sciences, diplomacy, human rights, information technology, general topics, blogging, etc. They are searchable with the integrated Google Custom Search Engine.
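As an illustration, hosting several CMS-driven sites on one VPS typically comes down to one virtual host per site. The following is a minimal sketch only, assuming nginx with PHP-FPM; the hostname, paths and socket are hypothetical, not taken from the actual webAfriqa setup.

```nginx
# Hypothetical virtual host for one CMS site on a multi-site VPS.
server {
    listen 80;
    server_name example.webafriqa.net;   # hypothetical hostname
    root /var/www/example/public;        # hypothetical document root

    index index.php index.html;

    location / {
        # Route pretty URLs through the CMS front controller.
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        # Hand PHP requests to a PHP-FPM pool over a local socket.
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;
    }
}
```

Each of the ten sites would get its own such block, differing only in hostname and document root, which keeps them independently configurable on the same server.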
Obviously, with the onslaught of mobile devices, websites can double as apps. Beyond responsive web design, however, stands the Web 3.0 era, also known as the Semantic Web. Hence the raison d’être of the Semantic Africa project. It is still a parked site. Hopefully, though, it will evolve into an infrastructure capable of mining and processing Big Data and very large African databases (MySQL, MongoDB), with advanced indexing and sophisticated search features (Solr, Elasticsearch). The ultimate goal is to build networks of knowledge distribution aimed at fostering a fuller understanding of the African Experience, at home and abroad, from the dawn of humankind to today.
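To make the search ambition concrete, here is a sketch of the kind of full-text query the Elasticsearch query DSL supports; the field names and search term are hypothetical, invented purely for illustration.

```json
{
  "query": {
    "multi_match": {
      "query": "Fouta-Djallon history",
      "fields": ["title^2", "body", "author"],
      "fuzziness": "AUTO"
    }
  },
  "highlight": {
    "fields": { "body": {} }
  },
  "size": 10
}
```

Sent as the body of a `_search` request against an index, such a query matches across several fields at once, weights title matches double, tolerates minor misspellings, and returns highlighted snippets from the matching documents.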
Needless to say, such an endeavor remains a tall order. Worse, an impossible dream! The roadblocks stand tall; chief among them are the predicaments of under-development (illiteracy, schooling, training, health care, food production, water supply, manufacturing, etc.), compounded by the self-inflicted wounds and crippling “technological somnambulism” of African rulers and “elites.”
Looking back at the 2014 US-Africa Summit in Washington, DC, I will publish additional articles about the continent’s economic and technical situation and prospects. One such paper is titled “Obama and Takunda: a tale of digital Africa”; another, “African telecommunications revolution: hype and reality.”
For decades now, proprietary and open source software have been competing head to head around the world for mind and market share. I wonder, though, to what extent African countries seek to leverage this rivalry. Are they implementing policies and spending resources toward balancing commercial applications with free software? Are they riding the Linux wave? Or are they, instead, bucking the trend? To be determined!
Anyway, I share here Paul Venezia’s piece “Linux at 25: How Linux changed the world,” published today in InfoWorld. The author is profiled as “A devoted practitioner (who) offers an eyewitness account of the rise of Linux and the Open Source movement, plus analysis of where Linux is taking us now.”
Read also “A Salute To Shannon”
Tierno S. Bah
Linux at 25:
How Linux changed the world
I walked into an apartment in Boston on a sunny day in June 1995. It was small and bohemian, with the normal detritus a pair of young men would scatter here and there. On the kitchen table was a 15-inch CRT display married to a fat, coverless PC case sitting on its side, network cables streaking back to a hub in the living room. The screen displayed a mess of data, the contents of some logfile, and sitting at the bottom was a Bash root prompt decorated in red and blue, the cursor blinking lazily.
I was no stranger to Unix, having spent plenty of time on commercial Unix systems like OSF/1, HP-UX, SunOS, and the newly christened Sun Solaris. But this was different.
The system on the counter was actually a server, delivering file storage and DNS, as well as web serving to the internet through a dial-up PPP connection — and to the half-dozen other systems scattered around the apartment. In front of most of them were kids, late teens to early 20s, caught up in a maze of activity around the operating system running on the kitchen server.
Those enterprising youths were actively developing code for the Linux kernel and the GNU userspace utilities that surrounded it. At that time, this scene could be found in cities and towns all over the world, where computer science students and those with a deep interest in computing were playing with an incredible new toy: a free “Unix” operating system. It was only a few years old and growing every day. It may not have been clear at the time, but these groups were rebuilding the world.
A kernel’s fertile ground
This was a pregnant time in the history of computing. In 1993, the lawsuit by Bell Labs’ Unix System Laboratories against BSDi over copyright infringement was settled out of court, clearing the way for open source BSD variants such as FreeBSD to emerge and inspire the tech community.
The timing of that settlement turned out to be crucial. In 1991, a Finnish university student named Linus Torvalds had begun working on his personal kernel development project. Torvalds himself has said, had BSD been freely available at the time, he would probably never have embarked on his project.
Yet when BSD found its legal footing, Linux was already on its way, embraced by the types of minds that would help turn it into the operating system that would eventually run most of the world.
The pace of development picked up quickly. Userspace utilities from the GNU operating system collected around the Linux kernel, forming what most would call “Linux,” much to the chagrin of GNU founder Richard Stallman. At first, Linux was the domain of hobbyists and idealists. Then the supercomputing community began taking it seriously, and contributions ramped up further.
By 1999, this “hobby” operating system was making inroads in major corporations, including large banking institutions, and began whittling away at the entrenched players that held overwhelming sway. Large companies that paid enormous sums to major enterprise hardware and operating system vendors such as Sun Microsystems, IBM, and DEC were now hiring gifted developers, system engineers, and system architects who had spent the last several years of their lives working with freely available Linux distributions.
After major performance victories and cost savings were demonstrated to management, that whittling became a chainsaw’s cut. In a few short years, Linux was driving commercial Unix vendors out of thousands of entrenched customer accounts. The stampede had begun, and it’s still underway.
Adaptability at the core
A common misconception about Linux persists to this day: that Linux is a complete operating system. Linux, strictly defined, is the Linux kernel. The producer of a given Linux distribution — be it Red Hat, Ubuntu, or another Linux vendor — defines the remainder of the operating system around that kernel and makes it whole. Each distribution has its own idiosyncrasies, preferring certain methods over others for common tasks such as managing services, file paths, and configuration tools.
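Those idiosyncrasies are easy to observe from a shell. The sketch below is a minimal illustration rather than a definitive probe: it simply checks which service-management command a given distribution puts on the PATH.

```shell
#!/bin/sh
# Detect which service-management tool this system provides.
# systemd-based distributions ship systemctl; older SysV-style
# layouts rely on the service wrapper instead.
if command -v systemctl >/dev/null 2>&1; then
  SVC="systemd (systemctl)"
elif command -v service >/dev/null 2>&1; then
  SVC="SysV init (service)"
else
  SVC="unknown"
fi
echo "service manager: $SVC"
```

Run on different distributions, the same script reports different tools, which is exactly the kind of per-distribution variation the paragraph above describes.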
This elasticity explains why Linux has become so pervasive across so many different facets of computing: A Linux system can be as large or as small as needed. Adaptations of the Linux kernel can drive a supercomputer or a watch, a laptop or a network switch. As a result, Linux has become the de facto OS for mobile and embedded products while also underpinning the majority of internet services and platforms.
To grow in these ways, Linux needed not only to sustain the interest of the best software developers on the planet, but also to create an ecosystem that demanded reciprocal source code sharing. The Linux kernel was released under the GNU Public License, version 2 (GPLv2), which stated that the code could be used freely, but any modifications to the code (or use of the source code itself in other projects) required that the resulting source code be made publicly available. In other words, anyone was free to use the Linux kernel (and the GNU tools, also licensed under the GPL) as long as they contributed the resulting efforts back to those projects.
This created a vibrant development ecosystem that let Linux grow by leaps and bounds, as a loose network of developers began molding Linux to suit their needs and shared the fruit of their labor. If the kernel didn’t support a specific piece of hardware, a developer could write a device driver and share it with the community, allowing everyone to benefit. If another developer discovered a performance issue with a scheduler on a certain workload, they could fix it and contribute that fix back to the project. Linux was a project jointly developed by thousands of volunteers.
Changing the game
That method of development turned established practices on their head. Commercial enterprise OS vendors dismissed Linux as a toy, a fad, a joke. After all, they had the best developers working on operating systems that were often tied to hardware, and they were raking in cash from companies that relied on the stability of their core servers. The name of the game at that time was highly reliable, stable, and expensive proprietary hardware and server software, coupled with expensive but very responsive support contracts.
To those running the commercial Unix cathedrals of Sun, DEC, IBM, and others, the notion of distributing source code to those operating systems, or that enterprise workloads could be handled on commodity hardware, was unfathomable. It simply wasn’t done — until companies like Red Hat and Suse began to flourish. Those upstarts offered the missing ingredient that many customers and vendors required: a commercially supported Linux distribution.
The decision to embrace Linux at the corporate level was made not because it was free, but because it now had a cost and could be purchased for significantly less — and the hardware was significantly cheaper, too. When you tell a large financial institution that it can reduce its server expenses by more than 50 percent while maintaining or exceeding current performance and reliability, you have their full attention.
Add the rampant success of Linux as a foundation for websites, and the Linux ecosystem grew even further. The past 10 years have seen heavy Linux adoption at every level of computing, and importantly, Linux has carried the open source story with it, serving as an icebreaker for thousands of other open source projects that would have failed to gain legitimacy on their own.
The tale of Linux is more than the success of an open kernel and an operating system. It’s equally as important to understand that much of the software and services we rely on directly or indirectly every day exist only due to Linux’s clear demonstration of the reliability and sustainability of open development methods.
Anyone who fought through the days when Linux was unmentionable and open source was a threat to corporate management knows how difficult that journey has been. From web servers to databases to programming languages, the turnabout in this thinking has changed the world, stem to stern.
Open source code is long past the pariah phase. It has proven crucial to the advancement of technology in every way.
The next 25 years
While the first 15 years of Linux were busy, the last 10 have been busier still. The success of the Android mobile platform brought Linux to more than a billion devices. It seems every nook and cranny of digital life runs a Linux kernel these days, from refrigerators to televisions to thermostats to the International Space Station.
That’s not to say that Linux has conquered everything … yet.
Though you’ll find Linux in nearly every organization in one form or another, Windows servers persist in most companies, and Windows still has the lion’s share of the corporate and personal desktop market.
In the short term, that’s not changing. Some thought Linux would have won the desktop by now, but it’s still a niche player, and the desktop and laptop market will continue to be dominated by the goliath of Microsoft and the elegance of Apple, modest inroads by the Linux-based Chromebook notwithstanding.
The road to mainstream Linux desktop adoption presents serious obstacles, but given Linux’s remarkable resilience over the years, it would be foolish to bet against the OS over the long haul.
I say that even though various issues and schisms regularly arise in the Linux community — and not only on the desktop. The brouhaha surrounding systemd is one example, as are the battles over the Mir, Wayland, and ancient X11 display servers. The predilection of some distributions to abstract away too much of the underlying operating system in the name of user-friendliness has rankled more than a few Linux users. Fortunately, Linux is what you make of it, and the different approaches taken by various Linux distributions tend to appeal to different user types.
That freedom is a double-edged sword. Poor technological and functional decisions have doomed more than one company in the past, as they’ve taken a popular desktop or server product in a direction that ultimately alienated users and led to the rise of competitors.
If a Linux distribution makes a few poor choices and loses ground, other distributions will take a different approach and flourish. Linux distributions are not tied directly to Linux kernel development, so they come and go without affecting the core component of a Linux operating system. The kernel itself is mostly immune to bad decisions made at the distribution level.
That has been the trend over the past 25 years — from bare metal to virtual servers, from cloud instances to mobile devices, Linux adapts to fit the needs of them all. The success of the Linux kernel and the development model that sustains it is undeniable. It will endure through the rise and fall of empires.
The next 25 years should be every bit as interesting as the first.
“When a new government comes into power, especially an inexperienced one, there’s one phenomenon that never fails: every crook on earth shows up. And every crook on earth has the biggest promises, has access to billions of dollars of lines of credit, of loans.” (Mahmoud Thiam, former Minister of Mines)
Despite their categorical wording and apparent exaggeration, these remarks must be taken seriously. Their author, Mahmoud Thiam, speaks from first-hand knowledge. Holder of the mining portfolio in the government of Captain Moussa Dadis Camara (late 2008 to early 2010), he has the dual, and suspect, “quality” of observer of and participant in the scheming, deal-making and kickbacks that have characterized that ministry’s activities since Sékou Touré. In the article below, Jeune Afrique recounts a new episode of the outlandish and criminal machinations surrounding the management of the country’s mineral resources under the presidency of Alpha Condé. Thus continues the waltz of vampires, predators, parasites, corrupters and corrupted in Guinea.
Read also: (a) “Bribery Arrest May Expose African Mining Rights Scandal Tied to Och-Ziff” (b) “U.S. Charges Gabonese Fixer Tied to Hedge Fund Och-Ziff With Bribery. Prosecutors allege Samuel Mebiame bribed African officials for mining rights” (c) “Exploiting a State on the Brink of Failure: The Case of Guinea” (d) “Corruption: a Gabonese businessman jailed in the United States” (e) “Samuel Mebiame, Mahmoud Thiam’s interlocutor in the ‘secret’ audio recording, breaks his silence”
Tierno S. Bah
Daniel Och, chairman and CEO of Och-Ziff Capital Management
Mining: Gabonese national Samuel Mebiame arrested in New York on suspicion of corruption in Niger, Guinea and Chad
The son of Léon Mebiame, former Prime Minister of Gabon (1975-1990), was arrested on Tuesday in New York. US prosecutors accuse the Gabonese entrepreneur of paying bribes to officials in Niger, Guinea and Chad to obtain mining concessions.
Samuel Mebiame was arrested in Brooklyn (New York) on Tuesday, 16 August, by US federal officers, the New York Times and the Wall Street Journal report. The Gabonese entrepreneur, son of former Prime Minister Léon Mebiame, is suspected of corruption.
According to the New York Times, prosecutors accuse Samuel Mebiame of regularly paying bribes to public officials in Niger, Guinea and Chad in order to obtain mining concessions for an entity linked to an American hedge fund.
While the identity of the fund is not disclosed in the complaint, according to the New York daily it is Och-Ziff Capital Management Group, which manages more than 39 billion dollars in assets.
US prosecutors allege that, as a consultant and “fixer” for a joint venture between Och-Ziff Capital Management Group and a company registered in the Turks and Caicos Islands (a British territory in the Caribbean), Samuel Mebiame was involved in several illicit operations in the three aforementioned African countries.
Payments and “nice cars”
Among the allegations cited by the US press: in Niger, payments of more than 1.3 million dollars to a company owned by public officials, the supply of “nice cars” and the settlement of legal fees; in Guinea, US prosecutors suspect the Gabonese entrepreneur and his associates of having used official-looking documents to warn concession holders of legal problems surrounding their mining permits; in Chad, the son of the former Gabonese head of government is suspected of having used illicit methods to acquire permits for uranium deposits on behalf of Och-Ziff Capital Management Group and its partners.
While the executives of the American investment fund and Samuel Mebiame’s lawyer did not respond to requests from the US press, it should be noted that Och-Ziff Capital Management Group confirmed earlier this year that it is under investigation by the US Department of Justice and the Securities and Exchange Commission, the US stock-market regulator, over suspected bribe payments in Zimbabwe, Congo and Libya.
The American fund announced that it has set aside 400 million dollars in anticipation of a negotiated settlement of the case.
According to a statement released on Tuesday, 17 August, “the government of Guinea and the Guinean criminal justice authorities are closely following the Samuel Mebiame case and will carry out all necessary investigations to help the American authorities shed light on the allegations relating to Guinea.”
The Guinean executive says it is “at the disposal of the American authorities, with whom it has already collaborated and continues to collaborate closely, to provide all necessary assistance to the proceedings under way in the United States.”