Dixon, for one, worries about the domino effect of these jumbled scoring methods. If your health insurance costs are tied to your neighbor’s credit scores and you’re a pioneer in a gentrification project, do you get redlined? “We have to re-examine predictive analysis,” she says. “We can’t let discrimination sneak back in.”
“Scoring of America,” which Dixon co-authored with Robert Gellman, a longtime privacy and information policy consultant, reveals a large collection of scores used on Americans that most of us have never heard of, let alone dreamt of. The scores grade your behaviors as well as those of the people closest to you, including your neighbors, which gives new meaning to having the nicest house on the block. The neighborhood score, for example, assesses the approximate credit capacity of the neighborhood as a whole, not of the individuals in it.
The Affordable Care Act, for example, has an “individual risk score” based on your demographic and health-status information, drawn primarily from data already collected by employers and health plans. That score then helps determine what you will pay for care under Obamacare, because it allows insurance companies to spread out risk.
Marketing experts will tell you that these scores are not nearly as invasive as you might think and amount to nothing more than perfecting a sales pitch to you — customizing it based on your preferences, a hallmark of Big Data and Big Content’s contribution to a merchant’s sales-and-marketing playground. What’s more, they claim this is what consumers consistently say they want: tailor-made information delivered where they want it and when they want it.
But the scoring track record is hardly perfect. The Federal Trade Commission’s 10-year study on credit scores, released last year, found that one in five consumers had an error on at least one of their three credit reports — mistakes that affected the scores dictating what interest rate they will pay on a mortgage or whether they will get certain jobs.
Those scores are derived from some pretty specific credit-card and loan data in your credit report, which itself can be a nest of misinformation. But at least we get to look at those reports and correct them if needed.
Imagine now that a report we can never see incorrectly determines you’re a dog lover who never follows the doctor’s orders. You could end up with a bevy of dog-grooming coupons, higher health-insurance rates and a denied application for a jumbo mortgage.
“The quality of data matters,” the report says. “Errors in data used to make a score create a score that is not predictive. With thousands of factors, error rates and false readings become a big issue.”
If you can’t see the scores, how do you know if they’re accurate? “When other consumer scores enter the marketplace without transparency or the limits that apply to credit scoring, consumer benefits are much more uncertain and unfairness is more likely,” according to the report.
To that end, Dixon and Gellman are calling on Congress to force data miners and marketers to make this information transparent to consumers, much like our credit scores and reports are.