Privacy is next to impossible to maintain in today’s digitally connected world. But, said digital privacy scholar Anita Allen, Ph.D., we should strive for it anyway.
“Identifying and doing what is ethical in a digital society will require deep, honest, selfless thinking,” Allen said in an appearance at Fordham’s Lincoln Center campus on March 7.
“We are drawn to technology like moths to a flame. We risk hurting ourselves. We also empower ourselves to hurt others.”
Allen, the Henry R. Silverman Professor of Law, professor of philosophy, and vice provost at the University of Pennsylvania, delivered her lecture, “Ethics and Digital Life,” as part of the Graduate School of Arts and Sciences’ Gannon Lecture series.
The lecture, which focused on the ethical implications of data mining and social media, was also part of a series of events celebrating the 20th anniversary of Fordham’s Center for Ethics Education, which co-sponsored the evening.
A Crisis of Confidence in Big Tech
In her talk, Allen noted that the public mistrust in organizations that harvest and store our data has reached crisis levels. American lawmakers have been slow to recognize this, but the European Union has not, Allen said, as evidenced by the fact that it invited her to speak in October at its annual conference on data privacy, the same day as Apple CEO Tim Cook.
“I really applaud the E.U. for its effort to bring ethics to the fore,” she said. “As a global community, we cannot afford to go mindlessly wherever technology innovators and adaptors may choose to drag us.”
Although it was written over 2,000 years ago, the Ring of Gyges, a story from Plato’s Republic, is worth considering as we contemplate these issues, Allen said. In the story, a shepherd named Gyges discovers a ring that grants him invisibility. He uses it to seduce a king’s wife, have the king killed, and finally take the place of the king himself.
“I think about the myth of Gyges a lot, because I actually believe that when we put technology in the hands of human beings, we presume that they will use that technology for good,” she said.
“We should presume there will be an awful lot of not-good uses, as well as a lot of accidental problems like data breaches or data leaks that no one intended.”
Our Own Ethical Responsibilities
Although it is impossible for us to control every aspect of our digital lives, she suggested that we all have five ethical responsibilities. In addition to supporting ethical, effective laws regarding data privacy, she said, we should support corporate digital ethics and accountability, stand behind technology and ethics education at all levels, and work to advance nonprofit and civil society organizations that champion data security and ethics.
And perhaps most importantly, we should mind our own behaviors to make sure we’re respecting our own privacy as well as that of others.
“To me, privacy is a very important good that has a relationship to respect and self-respect,” she said.
“Individuals of moral character will moderate their sharing, and will be mindful about the extent to which their virtues like reserve and prudence might be at stake when they engage in digital life, whether it’s using social media, using credit cards, using or not using passwords, encryption, security software, and so forth.”
There are some reasons for hope, she said. In a landmark 2017 case, for instance, the Supreme Court of India ruled for the first time that citizens there have a fundamental right to privacy. The decision, which came in response to a lawsuit related to the country’s biometric national ID—known as the Aadhaar card—cited Allen and several American colleagues in its written opinion. It led in turn to a unanimous decision by the country’s top court last year to overturn a colonial-era ban on consensual gay sex.
Allen also praised the European Union’s General Data Protection Regulation (GDPR), which took effect last year, for codifying a “right to be forgotten” that people can use to petition search engines to expunge embarrassing information about themselves from search results. In the United States, privacy laws exist to protect health data (HIPAA) and education data (FERPA), so it stands to reason that other forms of personal information should be protected, she said.
“It’s not that America doesn’t have a lot of privacy laws. The problem is a lot of our privacy laws are out of date and narrow,” she said, noting that the last major overhaul to U.S. telecommunications law was in 1996.
Think of the Future
Ultimately, she said, it’s up to us to embrace what she called a “quotidian practice” of commitment to privacy values, which matter because they’re linked to ideals of freedom and dignity and are essential to forming independent, self-respecting individuals.
“If we just completely give up on the idea of privacy now, we might wake up 20 years from now and discover that we wish we hadn’t,” she said.
“I’m not saying to anybody, you shouldn’t do 23andMe and you shouldn’t have an Alexa. What I’m saying is, if you have 23andMe and Alexa, and you live your life on Twitter and Facebook and Instagram, and you have one of those 24-hour web cameras in your house where you’re on YouTube all day long—if you’re living in the panopticon, then you’ll have a problem.”