Is AI Development "Research" Under HIPAA?
Using PHI to advance AI development could vastly improve health care and reduce costs — but is it HIPAA-sanctioned "research"?
WASHINGTON DC USA -- HIT/HIPAA UPDATE NEWS SERVICE -- JUNE 28, 2023: Continued advancement in artificial intelligence offers great promise to improve health care. But AI feeds on tremendous amounts of data, and using protected health information (PHI) to develop or improve AI often involves navigating the HIPAA Privacy Rule. This has led to a regulatory question of paramount importance: is the development and improvement of AI considered "research" for purposes of using PHI under HIPAA?
The HIPAA Privacy Rule does not allow covered entities or business associates to use or disclose PHI unless a specific permission or requirement in the Privacy Rule applies. AI may be employed as part of treatment or payment activities or as part of a covered entity's health care operations. A covered entity or, with appropriate permission, a business associate may use PHI to create de-identified information, which in turn may be used to develop or improve AI — although de-identified data may be suboptimal for AI development.
What is less clear is whether the development of AI potentially qualifies as "research" under HIPAA in certain circumstances.
HIPAA defines "research" as "a systematic investigation, including research development, testing, and evaluation, designed to develop or contribute to generalizable knowledge." This definition is the same as, and derived from, the definition of "research" found in the Common Rule governing protection of human subjects in research at 45 C.F.R. § 46.102. Based on this definition, there are two key elements: (1) the activity must be a "systematic investigation" and (2) the activity must be designed to develop or contribute to generalizable knowledge.
For the first element, AI development must be systematic in nature to potentially qualify as research. For example, the AI developer may help establish this by describing the goal of the AI development, the process for achieving that goal, and a means for evaluating the effectiveness of the result.
The second element — contributing to generalizable knowledge — is where much confusion and controversy arise. "Generalizable knowledge" is not defined in HIPAA or the Common Rule, but is commonly understood to include situations where the intended use of the research findings applies to populations or settings beyond those studied. Many interpret this element to require that results be published academically to qualify as "research" under HIPAA. But guidance from the HHS Office for Human Research Protections (OHRP) clarifies otherwise: "Whether or how an investigator shares results with the scientific community is not the deciding factor for whether the activity was designed to develop or contribute to generalizable knowledge. For example, lots of information is published that comes from activities that do not meet the Common Rule's definition of research. And sometimes results from research that meets the Common Rule definition never get published."
The preamble commentary to the Privacy Rule includes examples of commercial research, such as a pharmaceutical company recruiting patients for drug research. This research is not academic in nature and is for the purpose of creating and selling a drug, rather than publication of the results of the research. But commercial research still is regarded as "research" for purposes of HIPAA and the Privacy Rule.
Arguably, the same logic applies to development efforts in the area of AI. AI development, if systematic in nature, may qualify as "research" for purposes of HIPAA if the intent is to contribute to generalizable knowledge by applying the AI more broadly, regardless of whether there is an intent to publicly publish the results of the research and development efforts.
Another misconception is that if the AI development activity can qualify as "research," then that alone is sufficient to satisfy HIPAA. In actuality, HIPAA generally requires individuals' authorizations to use or disclose PHI for research purposes. One exception at 45 C.F.R. § 164.512(i) is if an institutional review board (IRB) or privacy board determines and documents a decision to waive HIPAA's authorization requirement. This type of waiver arguably could permit the use and disclosure of PHI for AI research and development.

But a number of safeguards must be met. The Privacy Rule requires the IRB or privacy board to meet certain criteria to promote impartiality. For example, a privacy board must include at least one member who is not affiliated with the covered entity, not affiliated with any entity conducting or sponsoring the research, and not related to any person who is affiliated with any of such entities. Additionally, the IRB or privacy board may waive the authorization requirement only if certain criteria are met, including that the use or disclosure of the PHI involves no more than a minimal risk to the privacy of individuals based on a number of prescribed factors. Accordingly, if parties take the position that AI development qualifies as "research" for purposes of HIPAA and seek waiver of HIPAA authorization requirements, then significant regulatory safeguards and processes remain to protect the privacy of individuals.
Finally, although covered entities may use and disclose PHI for research after meeting the direct HIPAA requirements, business associates are further limited by their business associate agreements (BAAs). Accordingly, a business associate that is seeking to use or disclose PHI for AI research not only must comply with HIPAA requirements, such as obtaining an IRB or privacy board's waiver of authorization, but also must verify that all applicable BAAs permit the business associate to use and disclose PHI for research purposes. This specific research permission is not typical in BAAs and often is a product of negotiation between the parties to the BAA.
Research is merely one potential basis under HIPAA to use PHI to systematically develop and improve AI in health care. Additionally, besides HIPAA, other laws may further restrict the use and disclosure of personal information. For more information on how health information potentially may be used to develop and improve AI and the various laws that must be navigated, you may listen to a recording of our webinar: Health Information Privacy in the World of AI.
AUTHORS
Adam Greene, JD, MPH
Partner and Co-chair, Health Information and HIPAA Practice, Davis Wright Tremaine LLP; Former Senior Health Information Technology and Privacy Specialist, Office for Civil Rights, HHS, Washington, DC
Rebecca L. Williams, RN, JD
Partner and Chair, Health Information Practice, Davis Wright Tremaine LLP, Seattle, WA