Advancement of AI Opens Health Data Privacy to Attack

A new study led by Professor Anil Aswani of the Industrial Engineering & Operations Research (IEOR) Department in the College of Engineering at UC Berkeley suggests that what we’ve seen as exciting advances in artificial intelligence may come with a serious downside.

He and his team found that these advances open the door to new threats to the privacy of health data: given current AI capabilities, existing laws and regulations are no longer enough to keep our health status private. The full study is available in JAMA Network Open.

Aswani’s study, partially funded by UC Berkeley’s Center for Long-Term Cybersecurity, showed that artificial intelligence can identify individuals by learning their daily patterns in step data (like that collected by activity trackers, smartwatches, and smartphones) and correlating those patterns with demographic data.

After analyzing two years of data from more than 15,000 Americans, he concluded that the privacy standards established by the 1996 Health Insurance Portability and Accountability Act (HIPAA) are in urgent need of revision.
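The paper itself details how the matching was done; purely as an illustration of how this kind of linkage attack works in general (and not a reconstruction of the authors’ method), the sketch below simulates a week of daily step counts for a population, strips the identifiers from one copy of the data, and re-links the records to a named copy by nearest-neighbor matching on the daily patterns. All names, sizes, and noise levels here are invented.

```python
# Hypothetical sketch of a linkage (re-identification) attack on step data.
# Illustrative only; not a reconstruction of the method in the JAMA paper.
import numpy as np

rng = np.random.default_rng(0)

# Simulate a week of daily step counts for 1,000 people. Each person has a
# characteristic activity level plus day-to-day variation.
n_people, n_days = 1000, 7
base = rng.uniform(2000, 15000, size=(n_people, 1))
steps = rng.normal(loc=base, scale=0.15 * base, size=(n_people, n_days)).clip(min=0)

# Dataset A: the same step records with identifiers stripped ("anonymized").
# Dataset B: an independent, noisier measurement of the same behavior, held
# by another party with names attached (e.g., a second app on the same phone).
steps_a = steps
steps_b = steps + rng.normal(0, 300, size=steps.shape)
names = np.array([f"person_{i}" for i in range(n_people)])

# Link each anonymized record in A to its nearest neighbor in B by comparing
# the daily step-count patterns directly.
dists = np.linalg.norm(steps_a[:, None, :] - steps_b[None, :, :], axis=2)
match = dists.argmin(axis=1)

rate = (match == np.arange(n_people)).mean()
print(f"re-identified {rate:.1%} of 'anonymous' records from step patterns alone")
print("e.g., anonymous record 0 links to", names[match[0]])
```

In this toy setting nearly every record links back correctly, because even a single week of daily step counts is close to a behavioral fingerprint. That is the crux of the finding: removing names and obvious identifiers does not remove identity.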

We Need to Look at That

“We wanted to use NHANES (the National Health and Nutrition Examination Survey) to look at privacy questions because this data is representative of the diverse population in the U.S.,” Aswani says.

“The results point out a major problem. If you strip all the identifying information, it doesn’t protect you as much as you’d think. Someone else can come back and put it all back together if they have the right kind of information.”

“In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying health care data from another company and matching the two,” he explains. “Now they would have health care data that’s matched to names, and they could either start selling advertising based on that or they could sell the data to others.”

The problem lies not with the devices themselves but with how the data they collect can be misused for personal gain.

“I’m not saying we should abandon these devices,” he says. “But we need to be very careful about how we are using this data. We need to protect the information. If we can do that, it’s a net positive.”

Though the study focused on step data, Aswani suggests its implications reach well beyond that one kind of measurement. “HIPAA regulations make your health care private, but they don’t cover as much as you think,” he says.

“Many groups, like tech companies, are not covered by HIPAA, and only very specific pieces of information are not allowed to be shared by current HIPAA rules. There are companies buying health data. It’s supposed to be anonymous data, but their whole business model is to find a way to attach names to this data and sell it.”

As AI advances, such information becomes easier to retrieve, and employers, mortgage lenders, credit card companies, and others could begin to use it in illegal and unethical ways.

“Ideally, what I’d like to see from this are new regulations or rules that protect health data,” he says. “But there is actually a big push to even weaken the regulations right now. For instance, the rule-making group for HIPAA has requested comments on increasing data sharing.

“The risk is that if people are not aware of what’s happening, the rules we have will be weakened. And the fact is the risks of us losing control of our privacy when it comes to health care are actually increasing and not decreasing.”
