
How to Give Users Control and Consent Over Their Health Wearable Data

IEEE SA at RightsCon 2021

A Paradigm Shift in Healthcare Data Management

Who should own patient data? Healthcare providers? Health wearables developers/manufacturers? The patient? Should the patient have the right to consent as to how their healthcare data is shared or accessed? These were all questions considered in the IEEE SA Community Lab session at the 10th annual RightsCon Conference. The discussion centered on pragmatic approaches to this complex issue. 

Organized by Access Now, an organization working for the digital rights of internet users, the RightsCon conference examines issues at the intersection of human rights and technology, privacy, and transparency. This year’s event saw record-breaking attendance, with 9,120 participants from 164 countries.

IEEE Standards Association (IEEE SA) organized a Community Lab session entitled “Giving People Control and Consent over their Health Wearable Data: How to Enable a Paradigm Shift.” Addressing a pressing topic, the session proposal was selected from more than 1,000 applications submitted by over 770 host institutions in 105 countries.

Maria Palombini, Director of Emerging Communities & Opportunities Development, Global Business Strategy and Intelligence (GBSI) and IEEE SA Healthcare and Life Sciences Practice Lead, moderated the discussion. The “Community Lab” session format enabled broad attendee participation and fostered a productive, lively discussion about how to address growing data sovereignty concerns, given the privatization of health information and the rising value of health data.

Ms. Palombini emphasized that the intent of the session was to discuss ways to increase trust between users and healthcare entities through pragmatic, realistic approaches to data sharing. She asked participants to view the problem from the vantage point of 1) a wearables company, 2) a policymaker, and 3) a user (patient). The discussion explored the pressures these stakeholders face, how those forces might be addressed, and what actions could be taken. Discussants recognized that this is not just a technological problem; it also has business, economic, policy, and ethical aspects that must be considered in the search for a solution.

We have seen an extreme shift in health data access. Not so long ago, patients carried their own data from doctor to doctor: if your doctor recommended a specialist, he or she handed you hard copies of your file to take along. The patient was the transporter of the information. Then, as health data began to be transmitted via fax and the Internet, patients lost that connection to it. Doctors now access the data and send it directly to the specialist without the patient ever seeing what was transmitted. We went from one extreme to the other.

Discussants pointed out that, at the moment, approaches advocating either extreme (patients control all data all the time, or healthcare providers control everything and share nothing) may not be immediately workable. They emphasized that there are no perfect solutions to this complex challenge, that it is important to focus on “the possible” rather than “the pie in the sky,” and that solutions must apply to everyone, including those who don’t yet have access to healthcare.

Raising Awareness About Sharing Personal Health Data for Research Purposes 

The discussion moved to the importance of understanding what it means to share your data, the responsibility tied to that, and the opportunity that data sharing represents to help your personal situation. 

Discussant Dr. Florence Hudson, Executive Director of the Northeast Big Data Innovation Hub at Columbia University, and working group chair of the IEEE P2933™ Standard for Clinical Internet of Things (IoT) Data and Device Interoperability with TIPPSS, pointed out that some people would be quite willing to share personal data as it could help with their situation—either in terms of a better diagnosis or to inform research for certain diseases. 

For example, a cancer patient might want to be able to quickly access his data and freely share it with doctors to help both his course of care as well as to contribute to new cancer research. Conversely, parents of a child in the NICU might be less apt to want to share their child’s health data as freely. In both situations, participants agreed that it is important to have an awareness of what it means to share one’s data. It was pointed out that we need to ensure that consumers/patients can make conscious decisions, and that technical standards can alleviate the uncertainty surrounding access control and data quality.

Balancing Data Ownership and Dignity

Discussant Greg Adamson, Principal of the firm Digital Risk Innovation and chair of the IEEE SA DIITA (Digital Inclusion Identity Trust and Agency) Industry Connections Program, suggested considering the phenomenon that users do not care about privacy and access to data until they do, at which point they want protection, assurance, and remediation.

He noted that it is hard to work out what the right balance is between 100% user-owned data and 100% healthcare provider-owned data. Neither extreme is practical. Data ownership is a concept that has a specific meaning and takes us into legal concepts. Instead, we should be considering issues of access, control, and data quality, remarked one participant. Take train timetables as an example. The train company does not own the timetables. Anyone could stand there and watch the trains pass and make a timetable. Really, we are talking about the control of the data—the right to share it and the right to be forgotten. 

He suggested that rather than focus on ownership, which is fraught, we need to focus on dignity. When deciding what should be asked of users, companies should be considering the importance of preserving the dignity of users, ensuring that even though they may not own the data, they have at least the right to consent to share it and manage it. In turn, this consideration for the consumer is a golden opportunity for the company to build user trust.

Whether this works, however, depends on users actually knowing they have the means to access their health data when they need it. One participant pointed out that we may not know and understand all the risks and impacts of data processing and sharing on individuals. “How can a rightsholder be expected to be aware and care about the consequences if they are sometimes unknown, even to those deploying and selling the tech?”

How to Empower Users to Understand and Control Their Health Data

When asked whether we should depend on companies to give people control and consent over their data, Sampathkumar Veeraraghavan, President of Brahmam Innovations and the 2020 IEEE Ted Hissey Outstanding Young Professional Award recipient, suggested that we should start with the users and work from there.

User awareness has multiple dimensions, and empowering users with the knowledge they need to advocate for themselves would be a smart place to start. With raised awareness, patients will demand what they need and will be able to make informed decisions about which healthcare providers to use.

He also noted that we are moving from an age of mass production to an age of mass personalization and that privacy is foundational for all users. Technologists should work backwards from the user perspective and build solutions that safeguard privacy. Companies need to foster a paradigm shift in which privacy strategies, design choices, and safeguard measures are driven by the interests and rights of both current and future users. He stressed that companies must proactively implement robust privacy measures to safeguard the rights of unconnected, marginalized populations whose data could potentially be collected.

Looking at the issue from the company perspective, participants agreed that businesses need to understand that partnering with the consumer may benefit them even more in the end. One area mentioned was user agreements. Often, when we sign an agreement to use a wearable health device, we know we must check the “I agree” box or we do not get to use the device. Most people, however, do not really realize what they are agreeing to. While we could focus on user education, we could also call on companies to build more user-focused systems. If they offered some return to the patient/consumer, both could potentially benefit more.
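To make the idea of a more user-focused system concrete, here is a minimal, hypothetical sketch of a granular consent record in Python. The class name, purpose labels, and revocation logic are illustrative assumptions, not part of any IEEE standard or existing product; the point is that consent can be scoped per purpose, default to “deny,” and be revocable at any time.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Hypothetical sketch: one user's consent decisions, scoped per purpose
    (e.g. "care", "research") rather than a single all-or-nothing checkbox."""
    user_id: str
    grants: dict = field(default_factory=dict)    # purpose -> granted?
    history: list = field(default_factory=list)   # audit trail of decisions

    def set_consent(self, purpose: str, granted: bool) -> None:
        """Record a consent decision and log it for auditability."""
        self.grants[purpose] = granted
        self.history.append((datetime.now(timezone.utc), purpose, granted))

    def may_share(self, purpose: str) -> bool:
        """Default-deny: data is shared only for explicitly granted purposes."""
        return self.grants.get(purpose, False)

# A patient grants sharing for their own care and for research,
# declines nothing explicitly, and later revokes the research grant.
record = ConsentRecord(user_id="patient-001")
record.set_consent("care", True)
record.set_consent("research", True)
print(record.may_share("research"))    # True
print(record.may_share("marketing"))   # False (never asked = never shared)
record.set_consent("research", False)  # revocation is just another decision
print(record.may_share("research"))    # False
```

A design like this gives the user the consent and management rights discussed above without settling the thornier legal question of ownership: the company may still hold the data, but every disclosure is gated by an explicit, revocable, auditable grant.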

Participants noted that overly restrictive or prescriptive policies could choke innovation and that we need to try to integrate the patient into the mix to improve the system. This would call for a market-driven approach to fixing the problem rather than exclusively relying on a policy-focused approach. 

Protect Data Privacy and Security with Standards for Connected Technologies

Consensus-developed technical standards can support both industry- and policy-driven approaches and make the process seamless. The purpose of a standard is to remove questions of credibility and uncertainty in the use of technologies and applications as they relate to responsible patient data governance, data interoperability, security, and other concerns with the sharing of health data.

For instance, if someone needs access and the ability to share their own health data for an urgent health matter, they should not be consumed by the question “Is this going to work? Is my data secure? How will they use it?” The point of the standard is to make those questions go away and let the user focus on the real problem they are trying to solve. Standards can make this happen.

To find out more about related standards and pre-standards working groups and activities in the areas of trust, identity, privacy, and protection, and connected medical technologies such as clinical and medical IoT, check out the resources below and join the conversation.

The Global Connected Healthcare Cybersecurity Virtual Workshop Series

IEEE SA and Northeast Big Data Innovation Hub offer a free virtual workshop series which convenes a global community of leaders in healthcare, technology, and policy to develop mutual understanding and recommendations for standards to improve connected healthcare cybersecurity. Workshops are recorded and are available following the event:

  • Workshop 1: Global Connected Healthcare Cybersecurity Risks and Roadmap (24 February 2021)
  • Workshop 2: Privacy, Ethics & Trust in Connected Healthcare (28 April 2021)
  • Workshop 3: Data & Device Identity, Validation & Interoperability in Connected Healthcare (16 June 2021)
  • Workshop 4: Connected Healthcare Integrated Systems Design (22 September 2021)
  • Workshop 5: Connected Healthcare Technology and Policy Considerations (17 November 2021)

IEEE SA Healthcare and Life Sciences Practice 

The IEEE SA Healthcare and Life Sciences Practice is a global center of excellence bringing together committed volunteer stakeholders to evaluate, validate, and develop solutions for establishing trust in new technology applications that will afford the right to safety, security and protection of life. The practice is focused on three main priority areas to address the obstacles to universal and sustainable quality of care for all individuals:

  • Clinical health
  • Bio/pharma
  • Wellness 

IEEE SA Digital Inclusion, Identity, Trust and Agency (DIITA)

The DIITA Industry Connections Program aims to drive innovation by identifying technology solutions that enable all to participate online without barriers and building consensus in the market.

IEEE P2933 Standard on Trust, Identity, Privacy, Protection, Safety, Security

Get involved with the P2933 working group on the Standard for Clinical Internet of Things (IoT) Data and Device Interoperability with TIPPSS – Trust, Identity, Privacy, Protection, Safety, Security. 

IEEE SA Global Wearables and Medical IoT Interoperability & Intelligence (WAMIII) Program

The WAMIII program cultivates a global community of multi-disciplinary stakeholders to openly collaborate, build consensus, and develop solutions for wearables and medical IoT interoperability and intelligence.

IEEE Humanitarian Action Committee (HAC)

The IEEE HAC provides a suite of resources that inspire and enable IEEE volunteers around the world to carry out and support impactful humanitarian technology and sustainable development activities at the local level.


Kristin Little

Senior Manager, Public Affairs, IEEE Standards Association (IEEE SA) - As Senior Manager of Public Affairs at IEEE Standards Association, Kristin Little works to build IEEE's government engagement programs with the aim of facilitating communication between technology experts and policymakers. Kristin has 15 years' experience with the World Bank, designing, conducting, and contributing to field-based, mixed-methods evaluations of over $200B of investments. This work helped to shape new World Bank policies and improve resource allocation in areas such as adoption of new technologies, infrastructure, disasters, water, cultural heritage, and social development. In 2020 Kristin was appointed Digital Cooperation and Diplomacy Fellow by The People-Centered Internet -- a not-for-profit organization founded by Vint Cerf and Mei Lin Fung working to ensure the Internet is a positive force for good. Kristin holds a bachelor’s degree in International Development Studies from UC Berkeley, and two master’s degrees from MIT—MCP (Urban Planning) and MArch (Architecture).
