Dr. I. Sow, psychiatre Pullo, analyse Kumen

Arɗo (pasteur, guide, astrologue, vétérinaire, chef) tenant son bâton de commandement et entouré de sa famille. Ces éleveurs tressaient les cheveux d'hommes et de femmes. Ils ont emporté dans l'au-delà les connaissances et le mode de vie du Pulaaku. Ni païens, ni fétichistes, ils étaient, au contraire, monothéistes. Ils croyaient en Geno, l'Etre Suprême. Ici, une calebasse de trayeuse est posée aux pieds d'une matriarche. Un lien spirituel fécond unit cette dernière à Foroforondu, la gardienne tutélaire du laitage et épouse de Kumen, l'archange des troupeaux. Photo <a href="http://www.webguinee.net/bbliotheque/histoire/arcin/1911/tdm.html">Arcin</a>, Fuuta-Jalon, 1911. — T.S. Bah.

Dr. Ibrahima Sow épelle Koumen (en réalité Kumen) dans un article détaillé doublé d’une exégèse élaborée et originale, qu’il intitule “Le Monde Peul à travers le Mythe du Berger Céleste”. Le document parut dans Ethiopiques. Revue Négro-Africaine de Littérature et de Philosophie. Numéro 19, juillet 1979. La contribution de Dr. Sow est basée sur Koumen, Texte initiatique des Pasteurs Peuls, le chef-d’oeuvre d’Amadou Hampâté Bâ, rédigé en français en collaboration avec l’éminente ethnologue française, Germaine Dieterlen. Gardée secrète par ses détenteurs Fulɓe, la version originale Pular/Fulfulde a peut-être disparu à jamais avec la mort de Hampâté.

Il ne faut pas confondre ce spécialiste avec Prof. Alfâ Ibrâhîm Sow.

Pour un glossaire sur le Pulaaku cosmogonique et culturel, on peut se référer à ma liste en appendice à Koumen.

Dr. Sow est l’auteur de deux autres textes dans la même revue:

  • “Le Listixaar est-il une pratique divinatoire ?”
  • “La littérature, la philosophie, l’art et le local”

Ma réédition complète de l’analyse de Kumen par Dr. Sow est accessible sur Semantic Africa. J’ai (a) composé la table des matières, (b) créé les hyperliens internes et externes, (c) ajouté des illustrations pertinentes ainsi que des liens Web.
La réflexion de l’auteur porte sur la cosmogonie, la centralité du Bovin, la religion, le divin, le couple Kumen/Foroforondu, le pastoralisme, les corrélations avec les sociétés voisines (Wolof, Jola, etc.). Le document met en exergue la croyance monothéiste en Geno, l’Etre Suprême, que les Fulɓe adoraient des millénaires avant l’arrivée de l’Islam. D’où l’interchangeabilité des noms sacrés Geno et Allah dans la littérature ajamiyya islamique, sous la plume des saints et érudits musulmans, sur toute l’aire culturelle du Pulaaku, de la Mauritanie au Cameroun. Par exemple, la treizième strophe (vers 16 et 17) de la sublime Introduction de Oogirde Malal déclare :

Geno On wi’a: « Kallaa ! ɗum waɗataa
Nafataa han nimse e wullitagol! »

L’Eternel dira : « Plus jamais ! Cela ne sera point !
A présent inutiles les regrets et les plaintes ! »

Le nom de Geno est fréquent sous la plume de Tierno Muhammadu Samba Mombeya, Usman ɓii Foduyee, Sheku Amadu Bari, Moodi Adama, Cerno Bokar Salif Taal, Tierno Aliyyu Ɓuuɓa Ndiyan, Amadou Hampâté Bâ, etc.

Table des matières

  • Introduction
  • Symbolisme et vision du monde peul
  • L’Autre féminin de Koumen
  • Le paradoxe, dimension du symbole
  • Le grand jeu de la réalité
  • Aux origines premières du monde
  • Le lion est un voyant
  • Foroforondou
  • Koumen le Pasteur divin
  • Une façon originale d’habiter le monde

Ardue mais bonne lecture, à la découverte du Pulaaku antique et ésotérique, ni banale ni vulgaire !

Tierno S. Bah

In Memoriam D. W. Arnott (1915-2004)

D.W. Arnott. The Nominal and Verbal Systems of Fula

This article creates the webAfriqa homage and tribute to the memory of Professor David W. Arnott (1915-2004), foremost linguist, researcher, teacher and publisher on Pular/Fulfulde, the language of the Fulbe/Halpular of West and Central Africa. It reproduces the obituary written in 2004 by Philip J. Jaggar. David Arnott belonged in the category of colonial administrators who managed to balance their official duties with in-depth social and cultural investigation of the societies their countries ruled. I publish quite a lot of them throughout the webAfriqa Portal: Vieillard, Dieterlen, Delafosse, Person, Francis-Lacroix, Germain, etc.
The plan is to contribute to disseminating Arnott’s intellectual legacy as widely as possible. Therefore, the links below are just part of the initial batch:

Tierno S. Bah


D. W. Arnott was a distinguished scholar and teacher of West African languages, principally Fulani (also known as Fula, Fulfulde and Pulaar) and Tiv. David Whitehorn Arnott, Africanist: born London 23 June 1915; Lecturer, then Reader, Africa Department, School of Oriental and African Studies 1951-66, Professor of West African Languages 1966-77 (Emeritus); married 1942 Kathleen Coulson (two daughters); died Bedale, North Yorkshire 10 March 2004.

He was one of the last members of a generation of internationally renowned British Africanists/linguists whose early and formative experience of Africa, with its immense and complex variety of peoples and languages, derived from the late colonial era.

Born in London in 1915, the elder son of a Scottish father, Robert, and mother, Nora, David Whitehorn Arnott was educated at Sheringham House School and St Paul’s School in London, before going on to Pembroke College, Cambridge, where he read Classics and won a “half-blue” for water polo. He received his PhD from London University in 1961, writing his dissertation on “The Tense System in Gombe Fula”.

Following graduation in 1939 Arnott joined the Colonial Administrative Service as a district officer in northern Nigeria, where he was posted to Bauchi, Benue and Zaria Provinces, often touring rural areas on a horse or by push bike. His (classical) language background helped him to learn some of the major languages in the area — Fulani, Tiv, and Hausa — and the first two in particular were to become his languages of published scientific investigation.

It was on board ship in a wartime convoy to Cape Town that Arnott met his wife-to-be, Kathleen Coulson, who was at the time a Methodist missionary in Ibadan, Nigeria. They married in Ibadan in 1942, and Kathleen became his constant companion on most of his subsequent postings in Benue and Zaria provinces, together with their two small daughters, Margaret and Rosemary.

From 1951 to 1977, David Arnott was a member of the Africa Department at the School of Oriental and African Studies (Soas), London University, as Lecturer, then Reader, and was appointed Professor of West African Languages in 1966. He spent 1955-56 on research leave in West Africa, conducting a detailed linguistic survey of the many diverse dialects of Fulani, travelling from Nigeria across the southern Saharan edges of Niger, Dahomey (now Benin), Upper Volta and French Sudan (now Burkina Faso and Mali, respectively), and eventually to Senegal, Gambia, and Guinea. Many of his research notes from this period are deposited in the Soas library (along with other notes, documents and teaching materials relating mainly to Tiv and Hausa poetry and songs).

He was Visiting Professor at University College, Ibadan (1961) and the University of California, Los Angeles (1963), and attended various African language and Unesco congresses in Africa, Europe, and the United States. Between 1970 and 1972 he made a number of visits to Kano, Nigeria, to teach at Abdullahi Bayero College (now Bayero University, Kano), where he also supervised (as Acting Director) the setting up of the Centre for the Study of Nigerian Languages, and I remember a mutual colleague once expressing genuine astonishment that “David never seemed to have made any real enemies”. This was a measure of his integrity, patience and even-handed professionalism, and the high regard in which he was held.

Arnott established his international reputation with his research on Fula(ni), a widely used language of the massive Niger-Congo family which is spoken (as a first language) by an estimated eight million people scattered throughout much of West and Central Africa, from Mauritania and Senegal to Niger, Nigeria, Cameroon, Central African Republic and Chad (as well as the Sudan), many of them nomadic cattle herders.

Between 1956 and 1998 he produced almost 30 (mainly linguistic) publications on Fulani and in 1970 published his magnum opus, The Nominal and Verbal Systems of Fula (an expansion of his PhD dissertation), supplementing earlier works by his predecessors, the leading British and German scholars F.W. Taylor and August Klingenheben. In this major study of the Gombe (north-east Nigeria) dialect, he described, in clear and succinct terms, the complex system of 20 or more so-called “noun classes” (a classificatory system widespread throughout the Niger-Congo family which marks singular/plural pairs, often distinguishing humans, animals, plants, mass nouns and liquids). The book also advanced our understanding of the (verbal) tense-aspect and conjugational system of Fulani. His published research encompassed, too, Fulani literature and music.

In addition to Fulani, Arnott also worked on Tiv, another Niger-Congo language mainly spoken in east/central Nigeria, and from the late 1950s onwards he wrote more than 10 articles, including several innovative treatments of Tiv tone and verbal conjugations, in addition to a paper comparing the noun-class systems of Fulani and Tiv (“Some Reflections on the Content of Individual Classes in Fula and Tiv”, La Classification Nominale dans les Langues Négro-Africaines, 1967). Some of his carefully transcribed Tiv data and insightful analyses were subsequently used by theoretical linguists following the generative (“autosegmental”) approach to sound systems. (His colleague at Soas the renowned Africanist R.C. Abraham had already published grammars and a dictionary of Tiv in the 1930s and 1940s.)

In addition to Fulani and Tiv, Arnott taught undergraduate Hausa-language classes at Soas for many years, together with F.W. (“Freddie”) Parsons, the pre-eminent Hausa scholar of his era, and Jack Carnochan and Courtenay Gidley. He also pioneered the academic study of Hausa poetry at Soas, publishing several articles on the subject, and encouraged the establishment of an academic pathway in African oral literature.

The early 1960s were a time when the available language-teaching materials were relatively sparse (we had basically to make do with cyclostyled handouts), but he overcame these resource problems by organising class lessons with great care and attention, displaying a welcome ability to synthesise and explain language facts and patterns in a simple and coherent manner. He supervised a number of PhD dissertations on West African languages (and literature), including the first linguistic study of the Hausa language written by a native Hausa speaker, M.K.M. Galadanci (1969). He was genuinely liked and admired by his students.

David Arnott was a quiet man of deep faith who was devoted to his family. Following his retirement he and Kathleen moved to Moffat in Dumfriesshire (his father had been born in the county). In 1992 they moved again, to Bedale in North Yorkshire (where he joined the local church and golf club), in order to be nearer to their two daughters, and grandchildren.

Philip J. Jaggar
The Independent

Fulani Proverbial Lore and Word-Play

D. W. Arnott first published this paper in 1957 under the title “Proverbial Lore and Word-Play of the Fulani” (Africa. Volume 27, Issue 4, October 1957, pp. 379-396). I have shortened it a bit for web search engines.
Only the summaries are posted here, pending publication of the full text of the document.
See also: In Memoriam  Professor D. W. Arnott (1915-2004)
Tierno S. Bah

Abstract

The wit and wisdom of the Fulani, as of other African peoples, are expressed most characteristically in their proverbs and riddles. Their proverbs are amply illustrated by the collections of H. Gaden and C. E. J. Whitting, and a selection of riddles appeared in a recent article in Africa by M. Dupire and the Marquis de Tressan. But there are other types of oral literature—both light and serious—which various writers have mentioned, without quoting examples. So Mlle Dupire refers to formes littéraires alambiquées and ritournelles des enfants Bororo, and G. Pfeffer, in his article on ‘Prose and Poetry of the Fulbe,’ speaks of jokes and tongue-twisters. The aim of this article is to present some examples of these types of proverbial lore and word-play—epigrams, tongue-twisters, and chain-rhymes—which were recorded, along with many more riddles and proverbs, in the course of linguistic research during a recent tour of the Fula-speaking areas of West Africa, and to consider their relation to proverbs and riddles. These types of oral literature are of course by no means peculiar to the Fulani, and a number of the examples here quoted may well have parallels in other languages of West Africa or farther afield. But an examination of such pieces in one language may perhaps contribute something to the general study of this kind of lore.

Résumé
Proverbes et devinettes peules

Bien que les proverbes et les devinettes soient l’expression la plus caractéristique de l’esprit et de la sagesse des Peuls, il existe d’autres types de littérature orale—des épigrammes, des phrases difficiles à prononcer et des rimes enchaînées — qui partagent certaines particularités avec eux.
Les devinettes ne sont pas basées sur un jeu de mots, comme la plupart des devinettes anglaises, mais sur un jeu d’idées ou d’images (généralement visuelles, mais quelquefois auditives, ou une combinaison des deux), la comparaison de deux phénomènes qui se ressemblent par leur situation, leur caractère ou leur comportement. Quelquefois la devinette est posée en termes généraux et celui qui veut la résoudre doit trouver la particularité appropriée; mais ordinairement une particularité est donnée et celui qui cherche à résoudre la devinette doit choisir correctement ses traits saillants et trouver un autre objet ayant les mêmes traits.

De même, certains proverbes énoncent un principe général, mais la grande majorité, tout en donnant un exemple d’un principe général, sont exprimés en termes d’une situation particulière. Leur application à d’autres situations entraîne un processus de comparaison analogue à celui associé à l’invention et à la solution de devinettes.

Les épigrammes, comme les proverbes, sont des considérations aphoristiques sur la vie, mais elles sont plus longues et plus compliquées. Elles consistent en un rapprochement de plusieurs phénomènes ayant des caractéristiques générales en commun qui sont habituellement disposés par trois ou par groupes de trois ; les caractéristiques générales peuvent être décrites ou rester implicites, tandis qu’un troisième type classe plusieurs objets apparentés en catégories nettes.

Ces épigrammes ont une structure formelle typique et diverses autres particularités qui les distinguent du langage ordinaire, et qu’elles partagent dans une mesure plus ou moins grande avec les proverbes et les devinettes — une légère anomalie grammaticale, une régularité cadencée et certains procédés stylistiques, tels que la répétition des phrases parallèles et l’assonance basée sur l’utilisation de suffixes identiques.

La structure de la langue peule se prête à de telles assonances et également à la ‘jonglerie’ verbale de phrases difficiles à prononcer, tandis que la subtilité de celles-ci égale l’ingéniosité des rimes enchaînées. Ces dernières consistent en un enchaînement d’idées où le dernier mot de chaque ligne évoque le thème de la ligne suivante. Elles montrent également quelques-unes des particularités stylistiques et autres, déjà constatées dans les épigrammes, les proverbes et les devinettes. Ainsi, les divers types de littérature orale peule, dont certains sont frivoles et d’autres sont sérieux, sont rapprochés par ces caractéristiques communes comme des éléments intimement liés d’une seule tradition littéraire.

D.W. Arnott

Portail webAfriqa et le Web Sémantique

Web sémantique, ou Toile Sémantique, est une extension du Web standardisée par le World Wide Web Consortium (W3C)

Le Portail webAfriqa accueillera bientôt les visiteurs des dix sites membres par des pages de couverture Drupal. Le onzième site, BlogGuinée, lui, continue de paraître sur WordPress. Au-delà de la page d’accueil, toutefois, tous les liens continueront à pointer pour l’instant vers les sites statiques originels. Cet arrangement conjoncturel sera graduellement remplacé par la seule version Drupal.

La publication du Portail webAfriqa sur les CMS (systèmes de gestion de contenu) Drupal et WordPress est dictée par l’évolution des technologies du Web. Elle obéit précisément à deux puissants courants:

  • Les changements et raffinements technologiques continus des normes et protocoles qui régissent le World Wide Web, sous l’égide du World Wide Web Consortium (W3C).
  • L’efficience inhérente au quatuor Linux-Apache-MySQL-PHP. Exemples : indexation automatique du contenu, moteur de recherche interne, possibilité d’intégrer le super-serveur de recherche Solr, publication multimédia (texte, son, photo, film, cartographie et GPS, localisation, internationalisation, etc.).

Depuis son irruption en 1992 sur Internet —qui, lui, date de 1969— le World Wide Web a connu une évolution en quatre phases:

  1. Le Web initial (Web 1.0), de 1992 à 1995, dominé par le langage HTML
  2. Le Web 2.0, de 1995 à 2001 avec l’introduction des styles de présentation cascadés, Cascading Style Sheets (CSS)
  3. Le Web basé sur les CMS (Content Management System), de 2001 à nos jours
  4. Le Web Sémantique, de 2008 à nos jours. Il met l’accent sur le modelage d’Ontologies et l’ingénierie du savoir à l’aide de (méta)-langages tels que XML, RDF/S, OWL, SPARQL (J’y reviendrai en détail dans un prochain article intitulé “Africa and the Semantic Web”)
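Pour fixer les idées, voici une esquisse minimale, en Python standard et sans bibliothèque RDF réelle, du modèle de triplets (sujet, prédicat, objet) sur lequel reposent RDF et SPARQL. Les préfixes, noms et données sont purement hypothétiques, à titre d’illustration.

```python
# Esquisse du modèle de triplets RDF : chaque fait est un (sujet, prédicat, objet).
# Données hypothétiques, pour illustration seulement.
triples = [
    ("ex:Kumen",  "rdf:type",   "ex:Ouvrage"),
    ("ex:Kumen",  "dc:creator", "Amadou Hampâté Bâ"),
    ("ex:Kumen",  "dc:language", "fr"),
    ("ex:Arnott", "rdf:type",   "ex:Auteur"),
]

def match(triples, s=None, p=None, o=None):
    """Filtre les triplets à la manière d'un patron SPARQL élémentaire :
    None joue le rôle d'une variable libre."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Équivalent approximatif de : SELECT ?o WHERE { ex:Kumen dc:creator ?o }
resultats = match(triples, s="ex:Kumen", p="dc:creator")
print([o for (_, _, o) in resultats])  # → ['Amadou Hampâté Bâ']
```

Dans la pratique, un véritable entrepôt RDF (avec un moteur SPARQL complet) remplace bien sûr ce filtrage naïf ; l’esquisse ne montre que le principe du graphe de données.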

Une question importante se pose ici : quel est le niveau, quantitatif et qualitatif, d’adoption de ces phases par les sites africains et guinéens ? Il serait intéressant et utile d’enquêter et de faire le point sur la situation.

Ces étapes se succèdent chronologiquement ; mais elles coexistent aussi et ne s’excluent pas mutuellement. Cela n’empêche pas qu’elles soient technologiquement très différentes les unes des autres. Chaque phase cherche à résoudre des problèmes cruciaux antérieurs. Elle s’engage à améliorer les technologies précédentes et contribue à faire évoluer sérieusement le Web en général. Cependant, chaque phase aussi apporte plus de complexité et élève ainsi le niveau requis pour l’acquisition et la compétence dans les technologies numériques de publication. De la sorte, la Phase 4 est plus complexe que la Phase 3, qui est, à son tour, plus exigeante que la Phase 2, etc.

Phase 3 et webAfriqa

Etant donné leur popularité et leur solide palmarès, les plateformes Drupal et WordPress ne sont plus à présenter. La première, par exemple, supporte des méga-sites gouvernementaux (Maison Blanche, NIH, etc.) et commerciaux (The Economist, Symantec, etc.). Pour sa part, avec un taux mondial d’adoption de 27 %, la seconde trône au premier rang des quelque 3,000 Content Management Systems (y compris Drupal) disponibles sur Internet.

Comme indiqué sur Gofundme, la migration des sites du Portail webAfriqa est nécessaire mais coûteuse au triple plan technologique, matériel et financier. Sa conception et son exécution demandent du temps. En conséquence, j’ai décidé de lancer une étape initiale à deux niveaux :

  • Reconfiguration du serveur web Apache pour faire cohabiter les sites statiques avec les sites dynamiques pour chacun des noms de domaine (moins BlogGuinée, qui paraît sur WordPress) :
  1. webFuuta
  2. webPulaaku
  3. webMande
  4. webCôte
  5. webForêt
  6. webGuinée
  7. Campboiro
  8. webAfriqa
  9. AfriXML/Semantic Africa
  10. webAmeriqa
  • Dans chaque cas le site Drupal sera le conteneur extérieur, le site classique, c’est-à-dire statique, continuant d’héberger le DocumentRoot, ou répertoire du contenu du site. Les hyper-liens de la page d’accueil Drupal pointeront donc vers le site statique, qui utilise des langages et outils “conventionnels” : XHTML/CSS2.1, CSS, jQuery, HTML5/CSS3, Bootstrap, Google Custom Search Engine.
  • Adaptation et aménagement de sous-thèmes du type Responsive Design (Bartik, Bootstrap, Scholarly, etc.)
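La cohabitation statique/dynamique décrite ci-dessus peut se traduire, côté Apache, par un hôte virtuel de ce genre. Simple esquisse hypothétique : les chemins, le nom de domaine et l’alias /drupal sont donnés à titre d’exemple et ne reflètent pas la configuration réelle du Portail.

```apache
# Esquisse hypothétique : le DocumentRoot reste le site statique existant,
# tandis que le conteneur Drupal est servi sous un alias distinct.
<VirtualHost *:80>
    ServerName www.webfuuta.example
    # Site classique (statique) : contenu inchangé
    DocumentRoot /var/www/webfuuta/static
    # Page d'accueil et conteneur Drupal
    Alias /drupal /var/www/webfuuta/drupal
    <Directory /var/www/webfuuta/drupal>
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
```

Le même schéma se répète pour chacun des noms de domaine de la liste, BlogGuinée (WordPress) mis à part.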

L’environnement Drupal

Comme indiqué plus haut, Drupal et WordPress appartiennent à l’initialisme LAMP, porte-flambeau des logiciels Open Source. Les initiales se déchiffrent comme suit :

L = Linux, le système d’exploitation le plus répandu sur le Web. Son noyau (kernel) supporte (a) la pile des normes et protocoles Internet (TCP/IP), (b) diverses distributions commerciales ou libres : CentOS, Ubuntu, Red Hat, etc.

A = Apache, le serveur web (httpd) intégré à Linux. En coopération étroite avec le serveur de nom de domaine (DNS), il décide de la présence ou de l’absence d’un site sur la Toile. Une mauvaise configuration conduit à la panne.

M = MySQL, la base de données qui est le magasin et le cerveau de la plupart des Content Management Systems (CMS)

P = PHP, le langage de programmation qui relie la base de données MySQL au navigateur, et vice versa.

Drupal, lui-même, est une application composée d’un noyau (Core) et d’additions (Contributed) tierces (modules, thèmes, bibliothèques, script DruSH, etc.). Le site officiel de la plateforme dénombre à ce jour 36,678 Modules (fonctionnalité) et 2,418 Thèmes (présentation, apparence).

Avertissement ! Si l’on filtre ces chiffres par le critère “activement mis à jour”, on obtient 11,381 modules et 702 thèmes. Autrement dit, de nombreux modules et thèmes sont périmés ou obsolescents. Certains d’entre eux ne fonctionnent que sur la version 6 de Drupal, en voie d’abandon en faveur des versions 7 et 8.

A l’image de ses confrères du LAMP, Drupal est donc modulaire. Ses composantes requièrent trois types complémentaires de spécialisation:

  • Programmation : PHP, JavaScript, Python, JAVA, etc.
  • Structuration du contenu, construction et administration
  • Design, interface graphique, esthétique

Drupal est flexible, productif, sécurisé et mûr. La plateforme offre aux utilisateurs (chercheurs, businesses, publicistes, artistes, etc.) des outils capables de transformer les idées et projets en réalité. Exemples :

  • Les modules taxonomy, rdf (Core), rdfa, rdfx, etc., résident soit dans le noyau (Core) soit parmi les Contributions (Contributed) de Drupal. Taxonomy assiste dans l’élaboration efficace et élégante de Vocabulaires, indispensables au modelage des Ontologies et à l’ingénierie du savoir dans le Web Sémantique.
  • Les modules book (Core) et biblio agissent en tandem pour faciliter la création de catalogues bibliographiques et la publication de collections complètes de bibliothèques virtuelles (livres, périodiques). En cela, ils sont précieux pour le Web Sémantique. C’est avec leur aide que j’ai reproduit l’ontologie Ebola, publiée dans le format PubMed par National Institutes of Health (NIH). Pour ce faire, j’ai d’abord collecté le code de dix chiffres assigné à chaque publication. J’ai ensuite rassemblé les codes dans un fichier, à raison d’un code par ligne. J’ai enfin soumis le fichier à la base de données PubMed. En quelques secondes elle a préparé et renvoyé des centaines de titres à Semantic Africa. Résultat : une bibliographie élaborée où tous les principaux champs sont remplis : auteur, titre, éditeur, lieu et date de publication, résumé, image de couverture, etc. Cette tâche aurait occupé une équipe humaine pendant quelques jours.
  • Le module metatag permet (a) d’intégrer le site aux moteurs de recherche et (b) de soumettre les données spécifiques à l’intention des logiciels du Web Sémantique. Il inclut Dublin Core, OpenGraph Protocol (FaceBook), Twitter Cards, Google Plus, etc.
  • Le module schemaorg ajoute à Drupal la possibilité d’intégrer les centaines de collections (personnes, évènements, etc.) reconnues par les moteurs de recherche.
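Le procédé décrit plus haut (un code par ligne, soumis en lot à PubMed) peut s’esquisser ainsi en Python, via l’interface E-utilities (efetch) de la NCBI. Le contenu du fichier et les identifiants sont hypothétiques ; seul l’assemblage de la requête est montré, l’envoi effectif demandant une connexion réseau.

```python
# Esquisse hypothétique : construire une requête E-utilities (efetch) de PubMed
# à partir d'un fichier texte contenant un identifiant par ligne.
from urllib.parse import urlencode

EFETCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"

def lire_codes(texte):
    """Extrait les identifiants, un par ligne, en ignorant les lignes vides."""
    return [ligne.strip() for ligne in texte.splitlines() if ligne.strip()]

def url_efetch(codes):
    """Assemble l'URL de requête pour un lot d'identifiants PubMed."""
    params = {"db": "pubmed", "id": ",".join(codes), "retmode": "xml"}
    return EFETCH + "?" + urlencode(params)

codes = lire_codes("26676074\n25714412\n")  # identifiants d'exemple
print(url_efetch(codes))
```

La réponse XML renvoyée par PubMed se prête ensuite à l’importation dans biblio, champ par champ (auteur, titre, résumé, etc.).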

webAfriqa, “Web of Documents” et “Web of Data”

Le Web Sémantique, alias Web of Data, alias Web 3.0, ne remplace pas son prédécesseur, le Web of Documents (Web 2.0). Il l’élargit, l’approfondit. Par-dessus tout, il accentue la dimension collaboration, et en fait sa pierre angulaire. Les compagnies multinationales (Apple, Microsoft, Google, IBM, Facebook, Amazon, etc.), les institutions gouvernementales américaines (Pentagon, Agriculture, etc.) et d’autres pays, l’ONU, etc. exploitent et tirent grand bénéfice du Web Sémantique.
La mission du Portail webAfriqa est d’être une plateforme de recherche, d’édition et de diffusion de l’Héritage Culturel de l’Afrique, en général, et de celui des Fulɓe, en particulier, en combinant les technologies du Web des Documents (Web of Documents) avec celles du Web des Données (Web of Data). Prise entre le marteau d’hégémonies extérieures rapaces et aliénatrices, d’une part, et l’enclume d’élites et de dirigeants nombrilistes et technologiquement somnambules, d’autre part, l’Afrique accuse des faiblesses systémiques aggravées par des obstacles structuraux. Ancrés dans l’histoire ancienne et récente, ainsi que dans le présent, ces barrages sont la cause et la conséquence du retard, technologique, économique, et dans une large mesure, culturel, du continent.
Et pourtant le Web Sémantique offre potentiellement à l’Afrique l’opportunité d’un nouveau saut en avant. Mais, comme le dit le proverbe, il n’y a pas de roses sans épines. Le continent regorge, certes, de contenus et de données uniques, qui se prêtent aux technologies de recherche du Web Sémantique. Mais la Révolution numérique a des exigences que l’Afrique ne satisfait pas. Les handicaps majeurs consistent en l’absence ou la faiblesse de l’infrastructure publique (énergie, eau, agriculture, élevage, pêche, forêts, finances, écoles, universités, hôpitaux, communications, transports, manufacture, etc.).
En définitive, les questions restent posées de savoir comment et quand l’Afrique remplira les conditions sine qua non pour qu’Internet et le Web Sémantique puissent jouer leur important rôle d’éperon du développement.
A suivre.

Tierno S. Bah

A quarter-century of Linux

Linus Benedict Torvalds with the Penguin, mascot of Linux

Linux celebrates its 25th anniversary: a quarter-century in which it truly changed the world. Luckily for me, I was an early convert. And an adopter, if not in practice at least in mind. It was 1991, and I was living in Washington, DC, Southwest. Somehow my MCIMail account was among the recipients of a mailing list message that is likely to remain a memorable and historic announcement. It read:

From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroups: comp.os.minix
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Message-ID: <1991Aug25.205708.9541@klaava.Helsinki.FI>
Date: 25 Aug 91 20:57:08 GMT
Organization: University of Helsinki
Hello everybody out there using minix –
I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I’d like any feedback on things people like/dislike in minix, as my OS resembles it somewhat (same physical layout of the file-system (due to practical reasons) among other things).
I’ve currently ported bash (1.08) and gcc (1.40), and things seem to work. This implies that I’ll get something practical within a few months, and I’d like to know what features most people would want. Any suggestions are welcome, but I won’t promise I’ll implement them 🙂
Linus (torvalds@kruuna.helsinki.fi)
PS. Yes – it’s free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that’s all I have :-(.

I don’t recall giving Linus Torvalds any technical feedback or even a broad suggestion. For I was still a UNIX newbie challenged by an entrenched industrial operating system. For a while, I looked into A/UX —Apple’s defunct version of UNIX. Next, I made unsuccessful efforts to run an Apache web server on MachTen UNIX, from Tenon Intersystems. That company’s Berkeley Software Distribution (BSD)-based OS targeted Macintosh computers built on either the PowerPC, M68K or G3 chips.…

Dr. Bob Kahn (left) and Dr Vinton Cerf (right): inventors of the TCP/IP Internet, which made the creation of Linux possible, and spurred its growth and popularity.

Months after receiving Torvalds’s email, I had the privilege of participating in the 1992 Kobe, Japan, conference. Dr. Vinton Cerf, co-inventor with Dr. Robert Kahn of the TCP/IP stack of standards and protocols that underlies the Internet, chaired the event. And I was part of a group of technologists from eight African countries (Algeria, Tunisia, Egypt, Kenya, Zambia, Nigeria, Senegal, Guinea) who were invited to the meeting. There, with the other delegates, we witnessed and celebrated the founding of the Internet Society.…
In hindsight — and for a social sciences and humanities researcher like me —, the early 1990s proved serendipitous, challenging and groundbreaking. As Linux began to gain a foothold, I alternately tested some of its distributions: MkLinux, Red Hat, CentOS, Ubuntu, Debian… before settling on CentOS and Ubuntu. Ever since, I have kept busy managing my Linux Virtual Private Server (VPS), which hosts a fairly complex array of services, languages, utilities, applications, front-end frameworks (Bootstrap, Foundation), the Drupal, WordPress and Joomla Content Management Systems, etc. The VPS runs in full compliance with rules, regulations and Best Practices for efficiency, availability, productivity and security. It delivers rich content on each of my ten websites, which, together, make up my webAfriqa Portal. Still freely accessible —since 1997—, the sites offer quality online library collections and public services: history, anthropology, economy, literature, the arts, political science, health sciences, diplomacy, human rights, Information Technology, general topics, blogging, etc. They are searchable with the integrated Google Custom Search Engine.
Obviously, with the mobile devices onslaught, websites can double up as apps. However, beyond responsive web design stands the Web 3.0 era, aka the Semantic Web. Hence the raison d’être of the Semantic Africa project. It is as yet a parked site. Hopefully, though, it will evolve into an infrastructure capable of mining and processing Big Data and Very Large African Databases (MySQL, MongoDB), with advanced indexing and sophisticated search features (Solr, Elasticsearch). The ultimate goal is to build networks of knowledge distribution aimed at fostering a fuller understanding of the African Experience, at home and abroad, from the dawn of humankind to today.
Needless to say, such an endeavor remains a tall order. Worse, an impossible dream! For the roadblocks stand tall; chief among them are the predicaments of under-development (illiteracy, schooling, training, health care, food production, water supply, manufacturing, etc.), compounded by the self-inflicted wounds and crippling “technological somnambulism” of African rulers and “elites.”

Looking back at the 2014 USA-Africa Summit in Washington, DC, I will publish additional articles about the continent’s economic and technical situation and prospects. One such paper is called “Obama and Takunda: a tale of digital Africa”; another is named “African telecommunications revolution: hype and reality.”

For decades now, proprietary and Open Source software have been competing head to head around the world for mind and market share. I wonder, though, to what extent African countries seek to leverage this rivalry. Are they implementing policies and spending resources toward balancing commercial applications with free software? Are they riding the Linux wave? Or are they, instead, bucking the trend? To be determined!
Anyway, I share here Paul Venezia’s piece “Linux at 25: How Linux changed the world,” published today in InfoWorld. The author is profiled as “A devoted practitioner (who) offers an eyewitness account of the rise of Linux and the Open Source movement, plus analysis of where Linux is taking us now.”
Read also “A Salute To Shannon.”
Tierno S. Bah

Linux at 25:
How Linux changed the world

I walked into an apartment in Boston on a sunny day in June 1995. It was small and bohemian, with the normal detritus a pair of young men would scatter here and there. On the kitchen table was a 15-inch CRT display married to a fat, coverless PC case sitting on its side, network cables streaking back to a hub in the living room. The screen displayed a mess of data, the contents of some logfile, and sitting at the bottom was a Bash root prompt decorated in red and blue, the cursor blinking lazily.

I was no stranger to Unix, having spent plenty of time on commercial Unix systems like OSF/1, HP-UX, SunOS, and the newly christened Sun Solaris. But this was different.

The system on the counter was actually a server, delivering file storage and DNS, as well as web serving to the internet through a dial-up PPP connection — and to the half-dozen other systems scattered around the apartment. In front of most of them were kids, late teens to early 20s, caught up in a maze of activity around the operating system running on the kitchen server.

Those enterprising youths were actively developing code for the Linux kernel and the GNU userspace utilities that surrounded it. At that time, this scene could be found in cities and towns all over the world, where computer science students and those with a deep interest in computing were playing with an incredible new toy: a free “Unix” operating system. It was only a few years old and growing every day. It may not have been clear at the time, but these groups were rebuilding the world.

A kernel’s fertile ground

This was a pregnant time in the history of computing. In 1993, the lawsuit by Bell Labs’ Unix System Laboratories against BSDi over copyright infringement was settled out of court, clearing the way for open source BSD variants such as FreeBSD to emerge and inspire the tech community.

The timing of that settlement turned out to be crucial. In 1991, a Finnish university student named Linus Torvalds had begun working on his personal kernel development project. Torvalds himself has said, had BSD been freely available at the time, he would probably never have embarked on his project.

Yet when BSD found its legal footing, Linux was already on its way, embraced by the types of minds that would help turn it into the operating system that would eventually run most of the world.

The pace of development picked up quickly. Userspace utilities from the GNU operating system collected around the Linux kernel, forming what most would call “Linux,” much to the chagrin of GNU founder Richard Stallman. At first, Linux was the domain of hobbyists and idealists. Then the supercomputing community began taking it seriously and contributions ramped up further.

By 1999, this “hobby” operating system was making inroads in major corporations, including large banking institutions, and began whittling away at the entrenched players that held overwhelming sway. Large companies that paid enormous sums to major enterprise hardware and operating system vendors such as Sun Microsystems, IBM, and DEC were now hiring gifted developers, system engineers, and system architects who had spent the last several years of their lives working with freely available Linux distributions.

After major performance victories and cost savings were demonstrated to management, that whittling became a chainsaw’s cut. In a few short years, Linux was driving out commercial Unix vendors from thousands of entrenched customers. The stampede had begun — and it’s still underway.

Adaptability at the core

A common misconception about Linux persists to this day: that Linux is a complete operating system. Linux, strictly defined, is the Linux kernel. The producer of a given Linux distribution — be it Red Hat, Ubuntu, or another Linux vendor — defines the remainder of the operating system around that kernel and makes it whole. Each distribution has its own idiosyncrasies, preferring certain methods over others for common tasks such as managing services, file paths, and configuration tools.
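The kernel/distribution distinction described above can be seen directly on any Linux machine: the kernel reports its own release string, while the distribution identifies itself separately, conventionally in `/etc/os-release`. Here is a minimal Python sketch of that idea (the `describe_system` function name is my own illustration, not from the article):

```python
# Minimal sketch: "Linux" strictly means the kernel; the distribution
# (Red Hat, Ubuntu, etc.) defines the rest of the operating system.
import platform

def describe_system():
    # platform.release() reports the kernel release, e.g. "5.15.0-91-generic"
    kernel = platform.release()
    # Most modern distributions declare their identity in /etc/os-release
    distro = "unknown"
    try:
        with open("/etc/os-release") as f:
            fields = dict(
                line.rstrip("\n").split("=", 1) for line in f if "=" in line
            )
        distro = fields.get("PRETTY_NAME", "unknown").strip('"')
    except FileNotFoundError:
        pass  # non-Linux systems, or very old distributions
    return kernel, distro

if __name__ == "__main__":
    kernel, distro = describe_system()
    print(f"kernel:       {kernel}")
    print(f"distribution: {distro}")
```

Two machines running the very same kernel release can report entirely different distributions, which is precisely the idiosyncrasy the article describes.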

This elasticity explains why Linux has become so pervasive across so many different facets of computing: A Linux system can be as large or as small as needed. Adaptations of the Linux kernel can drive a supercomputer or a watch, a laptop or a network switch. As a result, Linux has become the de facto OS for mobile and embedded products while also underpinning the majority of internet services and platforms.

To grow in these ways, Linux needed not only to sustain the interest of the best software developers on the planet, but also to create an ecosystem that demanded reciprocal source code sharing. The Linux kernel was released under the GNU General Public License, version 2 (GPLv2), which stated that the code could be used freely, but any modifications to the code (or use of the source code itself in other projects) required that the resulting source code be made publicly available. In other words, anyone was free to use the Linux kernel (and the GNU tools, also licensed under the GPL) as long as they contributed the resulting efforts back to those projects.

This created a vibrant development ecosystem that let Linux grow by leaps and bounds, as a loose network of developers began molding Linux to suit their needs and shared the fruit of their labor. If the kernel didn’t support a specific piece of hardware, a developer could write a device driver and share it with the community, allowing everyone to benefit. If another developer discovered a performance issue with a scheduler on a certain workload, they could fix it and contribute that fix back to the project. Linux was a project jointly developed by thousands of volunteers.

Changing the game

That method of development stood established practices on their ear. Commercial enterprise OS vendors dismissed Linux as a toy, a fad, a joke. After all, they had the best developers working on operating systems that were often tied to hardware, and they were raking in cash from companies that relied on the stability of their core servers. The name of the game at that time was highly reliable, stable, and expensive proprietary hardware and server software, coupled with expensive but very responsive support contracts.

To those running the commercial Unix cathedrals of Sun, DEC, IBM, and others, the notion of distributing source code to those operating systems, or that enterprise workloads could be handled on commodity hardware, was unfathomable. It simply wasn’t done — until companies like Red Hat and Suse began to flourish. Those upstarts offered the missing ingredient that many customers and vendors required: a commercially supported Linux distribution.

The decision to embrace Linux at the corporate level was made not because it was free, but because it now had a cost and could be purchased for significantly less — and the hardware was significantly cheaper, too. When you tell a large financial institution that it can reduce its server expenses by more than 50 percent while maintaining or exceeding current performance and reliability, you have their full attention.

Add the rampant success of Linux as a foundation for websites, and the Linux ecosystem grew even further. The past 10 years have seen heavy Linux adoption at every level of computing, and importantly, Linux has carried the open source story with it, serving as an icebreaker for thousands of other open source projects that would have failed to gain legitimacy on their own.

The tale of Linux is more than the success of an open kernel and an operating system. It’s equally as important to understand that much of the software and services we rely on directly or indirectly every day exist only due to Linux’s clear demonstration of the reliability and sustainability of open development methods.

Anyone who fought through the days when Linux was unmentionable and open source was a threat to corporate management knows how difficult that journey has been. From web servers to databases to programming languages, the turnabout in this thinking has changed the world, stem to stern.

Open source code is long past the pariah phase. It has proven crucial to the advancement of technology in every way.

The next 25 years

While the first 15 years of Linux were busy, the last 10 have been busier still. The success of the Android mobile platform brought Linux to more than a billion devices. It seems every nook and cranny of digital life runs a Linux kernel these days, from refrigerators to televisions to thermostats to the International Space Station.

That’s not to say that Linux has conquered everything … yet.

Though you’ll find Linux in nearly every organization in one form or another, Windows servers persist in most companies, and Windows still has the lion’s share of the corporate and personal desktop market.

In the short term, that’s not changing. Some thought Linux would have won the desktop by now, but it’s still a niche player, and the desktop and laptop market will continue to be dominated by the goliath of Microsoft and the elegance of Apple, modest inroads by the Linux-based Chromebook notwithstanding.

The road to mainstream Linux desktop adoption presents serious obstacles, but given Linux’s remarkable resilience over the years, it would be foolish to bet against the OS over the long haul.

I say that even though various issues and schisms regularly arise in the Linux community — and not only on the desktop. The brouhaha surrounding systemd is one example, as are the battles over the Mir, Wayland, and ancient X11 display servers. The predilection of some distributions to abstract away too much of the underlying operating system in the name of user-friendliness has rankled more than a few Linux users. Fortunately, Linux is what you make of it, and the different approaches taken by various Linux distributions tend to appeal to different user types.

That freedom is a double-edged sword. Poor technological and functional decisions have doomed more than one company in the past, as they’ve taken a popular desktop or server product in a direction that ultimately alienated users and led to the rise of competitors.

If a Linux distribution makes a few poor choices and loses ground, other distributions will take a different approach and flourish. Linux distributions are not tied directly to Linux kernel development, so they come and go without affecting the core component of a Linux operating system. The kernel itself is mostly immune to bad decisions made at the distribution level.

That has been the trend over the past 25 years — from bare metal to virtual servers, from cloud instances to mobile devices, Linux adapts to fit the needs of them all. The success of the Linux kernel and the development model that sustains it is undeniable. It will endure through the rise and fall of empires.


The next 25 years should be every bit as interesting as the first.

Paul Venezia
InfoWorld