[The second of a multi-article series on the digital revolution]
With the "digital revolution" neither pie in the sky nor a drop from the sky, how we comprehend this movement depends as much upon the length, breadth, and depth of the term as it does on what we intend to do with the numerous contraptions at our disposal. Crucial to both dimensions is recognising the "waves" that have led to the present. Bob Merritt, who has emerged as something of a pioneer in theorising this phenomenon, distinguishes three of them. Tracing the "digital revolution" to the 1970s, he identifies an "analog wave" running from the 1960s to the 1990s, challenged from the early 1980s by what he calls the "first digital wave," which overtook the analog stream by the mid-1990s. The "second digital wave" then did the same to the "first," posing a challenge from the 1990s before taking the reins on the eve of the Great Recession (2008-10). This "third wave" is poised, according to Merritt, to set the pace for at least another decade, if not longer.
The "analog" wave rested on contraptions we were once hooked on but which now seem old-fashioned (though still hard to abandon), like the television and the video cassette recorder (VCR). It yielded to the personal computer of the "first" digital wave, itself overtaken in the "second" digital wave by artificial intelligence (AI) contraptions built for speed and networking capabilities and, ultimately, by the robot family.
Those who have lived through them all can personally attest to the fundamental changes, yet the label "digital revolution" was applied to them far too belatedly and all too recently to carry as much legitimacy as the awe it instils. More profound, yet treated with less reverence, is how all of them expanded the social realm. With the television and the VCR we could actually see distant lands without ever stepping outside our own homes. They helped us compare, contrast, and form preferences accordingly, generating positive and negative accounts alike. These broadened our mindsets enormously, for good or for bad.
With the personal computer, not only could the length, breadth, and depth of those analog experiences be stretched farther, but substantive knowledge also expanded, if and when we wanted it to: we became more selective about whom we wanted in our company (or "social" gathering), as evident in Facebook likes, and whom we did not. Introverts could now break out expressively with birds of the same feather, while extroverts got the chance not only to add to their numbers but also to make more noise advocating pet issues. This was part of a "social revolution," albeit in slow motion, and multiple industries profited from the engagement. Media and communications networks multiplied until the entire globe could be covered, giving substance to the cliché of "the global village," while tourism blossomed to such an extent as to make Mother Nature cry out in many areas from saturation or exhaustion. Nor were these the only illustrations of the expanding human realm.
One crucial component of this social spread was the dissemination of education, reaching every nook and corner of the world for anyone who wanted it, as evident in the huge flows of students across national boundaries, but also, equally glaringly, the retreat of many well-off students from education beyond a certain level. As with the social spread, expanding education coincided with the average individual getting richer and net social wealth growing. Neither meant poverty was being eliminated, only that relative wealth was the new norm, with the "short" end dubbed the poorer pole, supposedly chosen by those who wanted to do nothing, or so the argument ran.
As the third digital wave unfolds before our eyes, we can see many of the costs and casualties. Education becomes more and more burdensome the higher or the lower the net personal wealth, and, regardless of class, it is being overtaken by the various "digital revolution" contraptions, like Facebook, the Internet, WhatsApp, and so forth. Social expansion now faces resistance and turns dysfunctional, as populism creeps worldwide for both spontaneous reasons (protest against slipping economic or political competitiveness) and networked ones (the capacity of populists to identify each other across the world through "digital revolution" contraptions). And the consequences of both converge: the retreat from education has an unlikely rendezvous with the spillovers of creeping populism, generating overall disorder in one arena or another, whether governance, welfare, or something else.
These are not incurable social, political, or economic diseases, but the steam to rectify them has been sapped by the unwillingness or inability to do so. Nowhere is that better understood than in the very unit upon which all of these changes have been having an impact, indeed, the very springboard of those changes: the individual. One of the unexamined assumptions of the "digital revolution" has always been that the subject (as creator or recipient) was keeping abreast of all technological changes. Few ever seriously analysed, early enough, how technological changes might be proceeding far more quickly than human beings could absorb them, leading to very superficial "outcomes" and increasingly quivering platforms. What the robotic turn confirms is that this has not only happened but, more disturbingly, that it may already be a tad too late for any corrective or streamlining gestures. The result: defeatist resignation to those changes, under arguments that technological developments have moved far faster than we humans can absorb, or the blame-game viewpoint that we never had the means to get on board and enjoy the technological upturn in the first place, so our inabilities are now being exploited without any entry-points to redress the widening gap.
The only problem with these arguments is that the human being remains the instrument of last resort, the unit most capable of modifying or halting progress as and when necessary, given the nature and volume of the costs. After all, we were able to stop any further nuclear bombings, however clumsily we did so: the costs were too high to permit any more, though not too high to stop us threatening others with them, even at the risk of making accidental nuclear explosions more likely. What is of the essence is that "measured control" is possible, an opportunity that cannot be swept under any carpet. This may ultimately help us on an even grander scale. Environmental plunder has proceeded just as recklessly as the nuclear-weapons race, and like that race it demands far more attention now than ever before, since we have reached tipping-points on several fronts, with resources running out or pollution growing too heavy for us to function at our best. The habits behind these crises are not new, but as correcting them becomes ever more essential, artificial intelligence, with the robot family in charge, may help us not only learn the full extent of the damage being done worldwide but also begin correcting the malaise, here and there, to begin with.
There is another secular reason why technological advances cannot be stopped: one group among us has been rising to the challenge. The next piece looks at what the "digital revolution" has done to our most dynamic humans: the youth.
Dr. Imtiaz A. Hussain is Professor & Head of the Department of Global Studies & Governance at Independent University, Bangladesh.