In the wake of COVID-19, with the increased need for location tracking and the sharing of biometric and medical data to combat the pandemic, a dialogue has emerged among governments, industry, and civil society about the balance between personal privacy and the protection of the public. Calls for personal data sovereignty, the idea that individuals own their data and control its use, have been mounting.
IEEE addressed this topic during our Open Forum session “Personal Sovereignty: Digital Trust in the Algorithmic Age” at the 2020 Internet Governance Forum (IGF), the fifteenth annual meeting of the IGF and the first to be held entirely online.
The speakers, Dr. Salma Abassi, CEO of the eWorldwide Group, and John Havens, Executive Director of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, were joined by moderator Moira Patterson, Global Market Affairs and Community Engagement Director for the IEEE Standards Association (IEEE SA). Together with the attendees, they discussed personal data sovereignty. Topics explored in the conversation included the balance between personal data sharing and public health, the importance of a collective effort, the role of standards in keeping humanity at the center of technology, how industry can gain customer trust, and what we as individuals can do to gain control of our data. The discussants offered several proactive measures we could take in the face of this pressing issue.
Moira Patterson began by posing the question “Why is the emergence of data sovereignty so essential, especially in the algorithmic age?”
John Havens first defined data ownership as owning the narrative and understanding the particular technologies, tools, and policies involved. Right now, other people, including advertisers and governments, know more about us than we know about ourselves, he explained. We are tracked by businesses to help us make better purchases, and by governments to keep us safe in a pandemic, if things are working the way we want. However, data tracking takes on a dangerous form when you look at how people are being bombarded and manipulated through specific channels, political or commercial. We do not see the algorithms that track our data, Havens warned, but that does not mean they are not there.
Patterson added that there is a level of trust that exists in the physical world that can be lost in the online world, where more and more data can be used in new and potentially intrusive ways. She then asked Dr. Abassi “How do we reset and address the concerns around surveillance post-COVID-19 in the context of sovereignty, identity, and data governance?”
The case of COVID-19 and data collection: how to strike a balance between personal privacy and protection of the public
“There is conflict between how much data should be given up to address the pandemic and how much data should be protected, because we are moving into a very dangerous area of surveillance,” said Dr. Salma Abassi. There has been a huge acceleration in governments’ use of technology to support the response to COVID-19, through test, track, and trace systems. Some governments are abusing this power, sharing the medical status of individuals with their neighbors. The question becomes “How much surveillance is acceptable?” noted Abassi.
Policymakers also need to recognize that technology and surveillance systems are built by the private sector, she said, whose motives and ethics may not align with what private citizens consider acceptable. The situation is further complicated by intensified data gathering, driven not only by the pandemic but also, more generally, by the Internet boom, smartphone data collection, and the proliferation of IoT devices, leaving us as citizens unable to really understand exactly how our data is being used.
We have to begin to reset and understand what to do after the pandemic is over, she said. How do we protect ourselves, and how do states actually work in this space? Abassi called for the creation of a technical task force to examine how data is being collected and whether it is being used in a transparent way.
Call for a collective effort from government, private sector, and civil society
Without the collective effort of governments, the private sector, and civil society, it will be difficult to shift power away from companies and back to the people, warned Abassi. “Citizens cannot own their data unless there are enforceable laws to help them.” Abassi called for IEEE to help in this context, noting that governments cannot ask industry to set regulations for itself. To increase negotiating power and build a more balanced ethical understanding of how our data is being used and manipulated, she recommended that governments increase their collaboration with think tanks and organizations like IEEE.
Abassi called for individuals to demand their rights. She noted that AI is being used to measure trends for business, but analysis of trends in health or humanitarian issues will not happen unless it is driven by citizens and supported by governments. One such issue is child online protection, and IEEE is an excellent forum in which to discuss the topic. The Internet Governance Forum itself is another.
We have to accept that, in the future, governments will need to collect more data to keep us safe, said Abassi. To build trust, governments need to set policies that allow individuals to create their own Terms of Reference outlining who may store their data, how and when it may be shared, and when it must be deleted. Abassi pointed to the Government of Estonia as the gold standard to look to. After a massive cyber attack in 2007, the government built a strong security infrastructure in a very transparent way: citizens knew what data was being collected, how it would be used, how it would be protected, and which agency would use it.
Standards play a fundamental role in scaling solutions
Patterson pointed out that it is important to highlight the role of standards in the context of personal data sovereignty. As a standards organization, IEEE is deeply engaged and sees the critical role that standards can play in creating the ecosystems and tools necessary to empower people and scale solutions. Standards are the building blocks that make best practices more accessible to all stakeholders.
For instance, IEEE recently approved IEEE 3527.1™ Standard for Digital Intelligence (DQ), the world’s first global standard related to digital literacy, digital skills, and digital readiness. The standard helps measure and create frameworks that enable people to gain the digital literacy, skills, and readiness they need. But even more fundamentally, said Patterson, human dignity needs to be at the core of our thinking. Technology should serve people and people’s needs at a basic level. Developing standards to support that fundamental understanding will be very important.
How do we ensure trust amid the commercialization of data?
Havens discussed how companies can build trust within a model of data sovereignty. Buying data about customers’ preferences is not only an expensive practice; it also represents a lost opportunity to build customer trust. Directly asking customers what they want creates two-way trust. But to ask, you have to build data-sovereign channels that let your customers speak back to you; otherwise it does not matter that you think you are being trustworthy, because you have not empowered customers with the tools to actually answer you.
Individuals need to do their part in creating trust
First, there needs to be a paradigm shift toward personal data sovereignty. We should not be made to feel guilty by those who claim that personal data sovereignty is about having something to hide, said Abassi. It is not really about hiding, but about being in charge of our data and of what we choose to reveal, said Havens. Second, we need to reappropriate our data and stop giving it away for free, said Abassi. We need to agree that we own our data and that the new narrative is set by us, she said. We have to rethink how we give our data to business.
In his closing remark, Havens stated: “We can either be angry or we can just get the work done. The work is protecting our kids, honoring our dignity, taking all the tools that are available to advertisers and saying, ‘Thank you for developing all of these great tools; now we would like to use them as well.’”
IEEE wants to make these ideas implementable and practical, so it is helping to create scalable solutions and roll them out broadly. If you are interested in helping to create new standards for personal digital sovereignty, please get in touch with us.
Learn more about IEEE’s work in promoting personal sovereignty:
- Digital Inclusion, Identity, Trust, and Agency (DIITA) Program
- The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems
- IEEE P7000 – Draft Model Process for Addressing Ethical Concerns During System Design
Author: Kristin Little