How non-medical information helps health firms’ risk assessments

Research lays the case for how social factors can have an impact on individual health

Fitness trackers were exclusive to athletes and sports enthusiasts just a few years ago, but they are set to become the next disruption in life and health insurance. Aside from helping people commit to healthy living, they also give insurers data for assessing people's risk of various health conditions and diseases.

Trackers can help insurers gather useful medical data such as heart rate and number of steps taken per day. But as it turns out, that's just a fraction of the information insurers want to learn about people as they try to predict health risks.

“With the help of data-analytics companies, [health plans] are now making use of information such as how much people earn, how often they travel and even if they have a pet,” reported the Wall Street Journal.

Two trends are contributing to the rise in such efforts. Aside from a shift toward value-based care, where providers are rewarded for keeping patients healthy, there’s increasing recognition that social factors significantly contribute to health.

Citing a 2013 report by the National Conference of State Legislatures, the Journal said around 80% of a person's state of health is the result of socioeconomic status, physical environment, health behaviours, and biology. By analysing social and medical data with machine learning, then, health organizations could identify at-risk patients and direct them to appropriate prevention programs and qualified specialists.

An example of this effort is a new collaboration between insurer Health Alliance Medical Plans and data firm Carrot Health, which are both based in the US. Under the partnership, Carrot considers hundreds of data points about individuals, including pet ownership and their frequency of travel, which it has found to be statistically correlated with better health.

It gets the information from data brokers, public records, government agencies, and other sources, and uses it to generate a patient risk score; according to co-founder Steve Sigmond, the firm shares only the final score with clients, and doesn't include the data used to determine it. Once high-risk patients are found, Health Alliance aims to conduct outreach efforts to keep them healthy, said Chief Medical Officer Robert Good.
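Carrot Health's actual model isn't public, but the general shape of such a score is a weighted combination of social and medical features squashed into a probability. The sketch below is purely illustrative: the feature names, weights, and bias are hypothetical, and a real insurer would fit them to historical claims data.

```python
import math

# Hypothetical weights; a real model would learn these from claims data.
# Negative weights reflect the article's point that pet ownership and
# frequent travel correlate with better health (lower risk).
WEIGHTS = {
    "owns_pet": -0.4,
    "trips_per_year": -0.05,
    "age": 0.03,
    "smoker": 0.9,
}
BIAS = -1.5

def risk_score(member: dict) -> float:
    """Map a member's attributes to a 0-1 risk score via a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * float(member.get(k, 0)) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

member = {"owns_pet": 1, "trips_per_year": 4, "age": 55, "smoker": 0}
score = risk_score(member)  # only this number is shared, not the raw inputs
print(round(score, 3))
```

The design choice the article highlights is in the last line: the client sees a single score, while the underlying social data stays with the analytics firm.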

There’s also a partnership between Oklahoma health plan GlobalHealth Holdings and analytics start-up VitreosHealth, which began in 2014. More than half of the plan’s total costs come from emergency care; using clinical and social data such as income and education levels, VitreosHealth helped GlobalHealth prioritize members based on the risk of their needing such care.

According to David Thompson, the plan's chief operating officer, care managers were instructed to help the 20% of members most at risk with matters like taking their medications and connecting with social services. Thompson said the effort produced savings, adding that the plan doesn't see the personal social data used by VitreosHealth, and that members are told at enrolment that their social information will be used.
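GlobalHealth's pipeline isn't public either, but assuming each member already carries a risk score, the triage step Thompson describes amounts to ranking members and taking the riskiest fifth. A minimal sketch (member IDs and scores below are made up):

```python
def top_at_risk(members: dict[str, float], fraction: float = 0.2) -> list[str]:
    """Return member IDs in the top `fraction` by risk score, riskiest first."""
    ranked = sorted(members, key=members.get, reverse=True)
    cutoff = max(1, round(len(ranked) * fraction))
    return ranked[:cutoff]

scores = {"m01": 0.91, "m02": 0.12, "m03": 0.57, "m04": 0.78, "m05": 0.33,
          "m06": 0.64, "m07": 0.05, "m08": 0.49, "m09": 0.88, "m10": 0.21}
print(top_at_risk(scores))  # → ['m01', 'm09'], flagged for care-manager outreach
```

Care managers would then work from that short list, which keeps outreach focused on the members most likely to need emergency care.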

In the US, health insurers face state and federal regulations surrounding the use of socioeconomic data to set rates or premiums. An industry group spokesperson also noted that using social data for that purpose is not allowed.

Privacy laws, however, do not prohibit medical insurers from collecting public data and integrating it with protected health information. According to the co-head of the privacy and cybersecurity law practice of Morgan Lewis & Bockius LLP, it's advisable for plans to disclose the practice to patients.

Privacy advocates are also airing concerns. “This information, if collected for good, can be really helpful,” said Deven McGraw, chief regulatory officer for Ciitizen Corp, which aims to help consumers manage their medical records digitally. “If it is put to nefarious purposes, or breached, then [it] can have consequences.”