The inclusivity paradox of the digital age
Behind every "age", almost by definition, lies a spark. Ironically, although the "digital age" may be the most profound of them all, judging by its own so-called "digital revolution", its time-span is too fluid, and that "revolution" is more revolutionary linguistically than it is on the ground. Still, "digital age" captures our minds, just as the "Cold War age" did after World War II, and the "age of imperialism" did until World War II. Though scholars and pundits typically trace the "digital revolution" to the 1980s or 1990s (when a sedate existing term, "digital", dramatically acquired new glow in everyday usage), whatever "digital" was ascribed to at the time had actually been foreseen long before, for example, in John von Neumann's "parlour games" paper of the late 1920s. Since nothing scientific validates a concrete "digital revolution" or a "digital age", a "digital transformation" in the second half of the 20th century may arguably be the more appropriate label.
Behind that thick introduction lies what the Economist posited on one of its 2014 covers as the emerging "digital age" fear: "Rise of the robots," and with it, a "robot invasion... [to]... change the way people think about technology." Yet "digits" (numbers) have not displaced "texts" (description), and "net jobs" have not been lost to "robots" or other new gadgets. What matters is to identify what catalysed this innocuous "digital transformation" into a force rattling our nervous system, serving both as a weapon we can wield and a threat we personally face.
Henning Meyer's five filters offer a useful starting point: ethical, social, corporate governance, legal, and productivity filters. Ethical filters get invoked, for example, when a new bio-technological contraption alters natural food components, triggering moral, religious, or political concerns. Of far broader relevance, social filters dig out the job-related consequences of new technologies, ranging from job-displacement by machines (as the assembly-line does to manual production) to job-transfers (from one skill-level, or profession-specific technology, to another). How Dhaka's automated metro-line will soon shift train-drivers and train-conductors into programme managers, managing several trains simultaneously from remote switchboards rather than from within each vehicle, exemplifies the point.
Corporate governance elicits a similar fear: Meyer's distinction between the short-term Anglo-American orientation and the long-term European counterpart in technology-related preferences has its own local analogue: whether meat should be prepared by human hands or by machines in a Muslim, and thereby halal-sensitive, society. Automobile accidents and insurance similarly raise legal questions in western countries, as do productivity filters, which flag, say, a new online technology that adds prints to hand-crafted fabric, thus modernising only one component amid an otherwise traditional production process.
After filtration, which usually is not time-consuming and may even run in parallel, the relationship between the human job and the new machine/technology invites other considerations, typically within a tripartite compartmentalisation: will the machine "substitute" the human, serve as a "complement", or become "creative", that is, generate new jobs?
Several empirical studies already question whether machines actually substitute humans: if new technology displaces jobs in one part of a company, it cannot but create new jobs in other segments of the company's production line. Since machines have to be programmed, serviced, repaired, and enhanced for the company to compete, new service-jobs get created as manual jobs get retrenched. That is modernisation, and it is particularly relevant to Bangladesh, since its leading export commodity, ready-made garments (RMG), faces automated external threats: manual labour-jobs get swallowed (60 percent, experts say, by 2030) as robots service RMG plants.
Machines elevate society to higher skill-levels. After all, losing manual knitting and sewing just when higher-skilled technicians create and supervise robot programmes represents advancement. Why, then, should the society or country be worried?
Two other important trade-offs beg Bangladesh's attention. The first is economic: against increasingly intense RMG competition from such countries as Cambodia, Myanmar, and Vietnam (and, in the near future, a string of African countries), might Bangladesh turn to automation just to retain its global competitive punch? The second has social bearings: what would happen to the 4-5.5 million-odd RMG workers (more than half of them women)? Would gender-balancing be constrained? Or would alternate jobs particularly suitable for women open up?
Although both trade-offs must be empirically tested, the very growth of information technologies (IT) requires the same meticulous manual attention and application as RMG production processes, so those same RMG workers qualify for higher-skill opportunities. Women have done just as well as, if not better than, men in this arena. Shifting laid-off RMG women workers to the IT industry releases two "sparks": more active recruitment of women in Bangladesh's IT sector, expanding the sector's size; and pushing basic intellectual training upwards in those women-dominated, RMG-specific sectors. The private sector caters to the first where market demand exists, promoting IT product transactions across the country and transferring women-power into this sector. The second needs robust governmental intervention enhancing comprehensive education from the very start.
"Digital Bangladesh" includes the newly initiated "Sheikh Russell Computer and Language Lab" (SRCLL) plan with the ICT (Information and Communications) Ministry and CRI (Center for Research and Information) to open 10,000 schools and colleges across the country. Bangladesh must now seize openings suitable for robotising the RMG industry, should "push" become "shove" globally.
Digressing into Bangladesh's RMG sector exposes five of Meyer's "digital age" cornerstones. His first adapts education to 21st-century, technology-tailored jobs and society. Past education curricula will not meet the needs of the practical and materialistic 21st century: theories have to go, meaning a greater blow to many social science disciplines than to the professions or natural sciences. The glacial a posteriori flexes of Bangladesh's University Grants Commission to intellectual changes must now be substituted by a priori calculations and assessments. Jobs on the streets and in the market must be actively supported, supplemented, and safeguarded by the Ministry of Education. It cannot but go online, instead of retaining tons of meticulously crafted, manually assembled office files and registers.
Meyer's second cornerstone, finding new jobs for robot-displaced workers, has already been addressed through Bangladesh's RMG industrial changes. So too has the third, of public policymakers reading the job-market accurately, but more importantly, pre-emptively, if only to retain a cutting edge in this fast-moving world. Many policies require a lengthy gestation from preparation to practice/enactment, and a jobs ministry working in tandem with a social affairs ministry should become a top-priority consideration. It also implies some degree of public-private partnership (PPP) between initiated and institutionalised sectors. In fact, the fourth cornerstone, of financing job-creation and job-cultivation, especially for digitally displaced workers, overlaps this PPP initiative, which carries other spill-overs worth exploiting. For instance, the private component could be broadened to include foreign entrepreneurs, since multi-nationalising economic behaviour is part and parcel of the digital revolution. Above all, the digital revolution now permeates almost every human function and production process.
Ultimately, the fifth cornerstone builds upon the other four and can materialise only if all of them become operational. Capital ownership can easily be democratised; an IT society cannot, setting up a tension, since new knowledge stems more from individual-level than from social-level skill-sets. Cultivated knowledge accelerates faster than socially streamlined knowledge: competitiveness, which is intrinsic to creating new knowledge, generates that private-public chasm. As it bears upon education, this tension must be allowed to grow, albeit along more tamed parameters and contours, at more sub-innovation levels, for the public. PPP democratises capital ownership by encouraging both new IT thinking at lower tiers and top-tier competition. Balancing both extracts the most, but necessitates governmental intervention.
That must be the digital age message: how to minimise the ever-widening gap between those who know and those who do not, between "active" and "passive" brain-power, since there is no space for even resurrected brain-power to close that gap. Claims of a digital revolution also fall apart when knowledge emanates from individuals more than from society. Pushed farther, even the reference to a digital age is not at all new within society: what is new to the public is unlikely to be so to the generator of that idea. Martin Krzywdzinski, Christine Gerber, and Maren Evers, among others, have been hammering away at this weak digital-revolution claim for some time now. In a 2018 piece, "Social consequences of the digital revolution," they correctly pointed out how the renowned social scientist Herbert Simon, along with Allen Newell, predicted in 1958, when there was no PC (personal computer), that a digital computer would defeat the world chess champion. Computers then were massive room-sized outfits like ENIAC (Electronic Numerical Integrator and Computer), built at the University of Pennsylvania's Moore School of Electrical Engineering and closely associated with the century's greatest mathematician, John von Neumann (there was then no Nobel Prize for economists to award him for his contributions).
Another brilliant social scientist, the Norwegian Johan Galtung (a repeated Nobel Peace Prize nominee), predicted the very pathway to digital preponderance in "A structural theory of imperialism". This was in 1971, long before the period to which the digital revolution is commonly traced. His final phase sees imperialism emanating from communications, which incorporates anything digital. Military and economic imperialism began the ballgame, he contended, before producing political imperialism, followed by the cultural; communications imperialism becomes the final straw. True, British imperialism did not begin with the military in South Asia, but the East India Company could come so far to scope economic opportunities only because "Britannia ruled the waves": its navy protected all passageways, from the English Channel through the Atlantic transit into the Indian Ocean. The rest became sordid imperial history.
Communications imperialism may be the recipe for creating and destroying new information: it allows us to prevent others from learning, thereby making access to anything cutting-edge more privileged. Dooming others becomes a vital interest under cut-throat competitiveness.
As a professor, I walk that line constantly: how to keep myself from withholding new knowledge from students, many of whose families have put their final paisas into educating their children. How that would implode at the cusp of becoming a developed country, yet still not finding the final push from its own future, that is, its present students, becomes burdensome. Students, with ever less spare time in this digital age, struggle to acquire new knowledge; but because it is being unloaded upon them, the unloading cannot go on forever without consequences. The less they learn, the wider the gaps in our collective, country-wide innovative capacity, and the more the forward pass we are so capable of invariably fails to connect.
This is where freelancing makes a crucial difference. Digital marketing has opened jobs that unemployed youths and married women can take up from their very homes. Outsourcing software programmes, especially to promote corporate public relations, is already a vast and viable industry: Bangladesh ranks just behind India in the freelancing business, earning USD 1 billion in foreign exchange with some 600,000 participants, perfectly complementing the SRCLL initiative.
There is more to the "inclusivity" potential of new technologies. Where knowledge is concerned, inclusivity cannot happen in the private market, given how corporate competitiveness over-rides the search for new, often social, knowledge: the government must be involved, and the more it is involved, the greater the inclusiveness of technological spread-effects. This we know from the onset of public education: its emergence coincided with the industrial revolutions for good reason, since those revolutions created and consolidated the firmest of gaps between the "haves" and "have-nots". Public universities may not be at the cutting edge of knowledge-creation all the time, but stopping the knowledge-gap from widening is their full-time and very crucial job: no other agency can do it, and without the government we would be under the complete command of knowledge-creators, for either good, that is, harmony and progress, or evil, that is, anarchy and the survival-of-the-fittest instinct.
Some authors (Sumeet Bhutani and Yashi Paliwal, among others) emphasise the 5 Cs as the founding pillars of digitalisation: consciousness, connectedness, compliance, collaboration, and contentment. These qualities did not spring from digitalisation, but without the government helping the "have-nots" to sharpen them, and thereby the country, they would have no chance in the "networked society" itself, an outcome of digitalisation in over-drive.
Other authors, more concerned about knowledge controls (such as Xiudian Dai), reaffirm the crucial need for governmental presence, and thereby social inclusivity. They distinguish between "market regulation" (where the fittest survive) and "state regulation" (where the less fit get a chance to breathe), arguing each plays a balancing role upon the other: the weaker one side gets, the stronger the other side gets, as if automatically, and that drift must be stopped.
Nobody disputes the key features of digitalisation: how ubiquitous or universal it is, and that it is so affordable anyone can jump in (perhaps to be bitten later). It is reliable ("numbers do not lie"), brisk (computers outpace even the brightest mind), and usable (applicable in "all" walks of life). Yet it is in shepherding each of the five cornerstones discussed above that maximum gains can be extracted at minimal costs, approximating what von Neumann and his equally brilliant mathematical partner, Oskar Morgenstern, dubbed maximin: securing the best outcome one can guarantee under worst-case conditions.
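For readers who want the formal version, here is a minimal sketch of the standard game-theoretic definition (the payoff numbers below are illustrative, not drawn from this essay). For a two-player zero-sum game with payoff matrix $A = (a_{ij})$, where $a_{ij}$ is the row player's payoff when she picks row $i$ and her opponent picks column $j$, the maximin value is
\[
\underline{v} \;=\; \max_{i}\,\min_{j}\, a_{ij}.
\]
For instance, with
\[
A = \begin{pmatrix} 3 & 1 \\ 2 & 4 \end{pmatrix},
\]
the row minima are $1$ and $2$, so $\underline{v} = \max(1,2) = 2$: by choosing the second row, the row player guarantees a payoff of at least $2$, whatever the opponent does.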
A careful dissection of "maximin" shows how it cannot be a one-person game, raising a digitalisation paradox: although digital information emanates from individual-level intellect (brutally exposing the "have/have-not" divide), no digital revolution can succeed without disseminating that know-how, making "collaboration" an essential factor. Inclusivity needs collaboration, for example, bridging economic, educational, political, societal, and all other divides for the outcome, as we Bangladeshis managed in 1971 to win the war. Our collaborative capacity had three components, each of which digitalisation also advocates: collaboration happens only because there is "trust", upon which "purpose" is cultivated, in turn producing the "energy" propelling collaboration. Rob Cross, Amy Edmondson, and Wendy Murphy emphasise precisely these features in articulating "the nuts and bolts of digital transformation" (MIT Sloan Management Review, Winter 2020, 37-43).
When the "enemy" shifts from another human being or a corporation to robots in this enveloping AI (artificial intelligence) age, just staying ahead demands comprehensive changes, as much in "tangibles", such as the tools we use (to get the data first, then play with the numbers, ultimately to build the architecture connecting "inputs" with desired "outputs" and fixing the playground), as in intangibles, like attitudes (cultural, economic, educational, political, social). Across today's combative, competitively inclined world, our "outcomes" have to settle for a maximin-defying second-best. While this is natural in a typically competitive business environment, governmental intervention may save the day (Iansiti and Lakhani discuss these dynamics in "Competing in the age of AI," Harvard Business Review, January-February 2020, 61-7).
In the final analysis, it is governmental intervention that breaks our biases and stereotypes to prepare the level digital playing-field demanded today. Have we lived up to that challenge? Time will certainly tell, but having prior knowledge displays our capacity to stay ahead of robotic power.
Dr Imtiaz A Hussain is Dean (Acting), School of Liberal Arts & Social Sciences, and Head of Global Studies & Governance Program, Independent University, Bangladesh.