Opinion
In the AI Era, Privacy and Democracy Are in Peril
U.S. regulators should look to Europe's and India's models as two low-risk approaches to data protection.
By Vasant Dhar
Lawmakers should not let the impeachment inquiry distract them from addressing an equally pressing threat to U.S. democracy: the unchecked use of artificial intelligence (AI) by the major internet platforms, which is invading our privacy and putting our democracy at risk.
Google reads everyone’s email, analyzes our searches, where we surf, where we go in the physical world, who our friends are, who we have spoken to and much more. The internet giants have appropriated public and private data to create “prediction products” that can forecast individuals’ behavior and manipulate them without their awareness or consent. Pure profit maximization has led to a capitalistic form of surveillance that is arguably even worse than the Chinese model, which is ostensibly about maximizing the benefits to all of society. These two extreme objective functions, unconstrained profit maximization and state control, lead to the same result: the ceding of free will to AI algorithms that control us overtly or covertly. What does democracy mean if there is no free will?
I recommend regulating the use of personal data for prediction products. I also propose classifying certain platforms as “digital utilities” that aim to maximize public benefit and spur economic growth, much like the interstate highway system and the “Information superhighway” have done for physical and electronic commerce.
Read the full article in The Hill.
___
Vasant Dhar is a Professor of Information Systems.