
Governing the unknown

24-07-2023 02:02 PM


Kaushik Basu
Technology is changing the world faster than policymakers can devise new ways to cope with it. As a result, societies are becoming polarised, inequality is rising and authoritarian regimes and corporations are doctoring reality and undermining democracy.

For ordinary people, there is ample reason to be “a little bit scared”, as OpenAI CEO Sam Altman recently put it. Major advances in artificial intelligence raise concerns about education, work, warfare and other risks that could destabilise civilisation long before climate change does. To his credit, Altman is urging lawmakers to regulate his industry.

In confronting this challenge, we must keep two concerns in mind. The first is the need for speed. If we take too long, we may find ourselves closing the barn door after the horse has bolted. That is what happened with the 1968 Nuclear Non-Proliferation Treaty: It came 23 years too late. If we had managed to establish some minimal rules after World War II, the NPT’s ultimate goal of nuclear disarmament might have been achievable.

The other concern involves deep uncertainty. This is such a new world that even those working on AI do not know where their inventions will ultimately take us. A law enacted with the best intentions can still backfire. When America’s founders drafted the Second Amendment conferring the “right to keep and bear arms”, they could not have known how firearms technology would change in the future, thereby changing the very meaning of the word “arms”. Nor did they foresee how their descendants would fail to realise this even after seeing the change.

But uncertainty does not justify fatalism. Policymakers can still effectively govern the unknown as long as they keep certain broad considerations in mind. For example, one idea that came up during a recent Senate hearing was to create a licensing system whereby only select corporations would be permitted to work on AI.

This approach comes with some obvious risks of its own. Licensing can often be a step towards cronyism, so we would also need new laws to deter politicians from abusing the system. Moreover, slowing your country’s AI development with additional checks does not mean that others will adopt similar measures. In the worst case, you may find yourself facing adversaries wielding precisely the kind of malevolent tools that you eschewed. That is why AI is best regulated multilaterally, even if that is a tall order in today’s world.

Another big concern is labour. Just as past technological advances reduced demand for manual labour, new applications like ChatGPT may reduce demand for a lot of white-collar labour. But this prospect need not be so worrying. If we can distribute the wealth and income generated by AI equitably across the population, eliminating plenty of work would not be a problem. Far from being diminished by not working, feudal lords were aggrandised by their leisure.

The problem, of course, is that most people do not know how to use free time. Pensioners often become anxious because they do not know what to do with themselves. Now, imagine that happening on a massive scale across younger cohorts. If left unchecked, crime, conflict and perhaps extremism would become more likely. Averting such outcomes would require modifying our education systems to prepare people for the leisure force. As in earlier eras, education would mean learning how to enjoy the arts, hobbies, reading and thinking.

A final major concern involves media and the truth. In “How to Stand Up to a Dictator”, the Nobel laureate journalist Maria Ressa laments that social media has become a powerful tool for promoting fake news. As Amal Clooney points out in her foreword to the book, autocratic leaders can now rely on “an army of bots” to create the impression that “there is only one side to every story”.

This is a bigger challenge than most people realise. It will not go away even if we pass laws prohibiting automated disinformation. As Amartya Sen pointed out more than 40 years ago, all description entails choice. Reality is so complex that we cannot possibly represent it without making decisions about what to include and what to omit. In a world that is drowning in information, savvy influencers do not need to make up news; they can simply be biased in what they choose to report. News outlets can influence voters’ opinions in ways both subtle and flagrant. Just compare the images of Donald Trump and of Joe Biden that Fox News chooses.

We cannot solve the problem of authoritarian influence by banning fake news. Our best hope again lies in education. We will need to do a better job teaching people to be discerning and less susceptible to manipulation.

Innovation in law and policy must go hand in hand with innovation in education, and all are necessary to keep up with innovation in technology.

Kaushik Basu, a former chief economist of the World Bank and chief economic adviser to the government of India, is a professor of economics at Cornell University and a non-resident senior fellow at the Brookings Institution. Copyright: Project Syndicate, 2023.

www.project-syndicate.org



