On Oct. 6, news broke that 23andMe, the genomics company that collects genetic material from millions of individuals for ancestry and genetic predisposition tests, had suffered a large data breach.

But, as it turns out, the company’s servers weren’t hacked. Rather, hackers targeted hundreds of individual user accounts — allegedly ones with reused passwords. After gaining access to those accounts, hackers could leverage 23andMe’s “DNA relatives matches” feature to obtain information about hundreds of other people.

This data breach challenges how we think about privacy, data security and corporate accountability in the data economy.

Hackers targeted user passwords to access 23andMe’s user data.

Shared information

Genetic information databases have a notable feature: anyone’s DNA data also reveals details about others who share part of their genetic code. When someone sends a sample to 23andMe, the company gains genetic information about that person and their relatives, even when those relatives never sent a sample or consented to any data collection. Their data is inevitably intertwined.

This isn’t unique to genetic data. Most data is about more than one person, because data often describes features that people share.

The ramifications of overlooking how personal data affects others extend to the entire information economy. Every individual choice about personal data has spillover effects on others. People are exposed to consequences — ranging from financial loss to discrimination — stemming from data practices that depend not only on information about themselves, but also on information about others.

User data-collection agreements can result in indirect harm to third parties. For example, the negative impacts of the Cambridge Analytica scandal extended far beyond those whose data the company collected.

This predicament underscores the collective impact of individual data decisions.

Data analytics

Algorithms powered by artificial intelligence draw inferences by analyzing the relationships between data points. AI algorithms rely on databases containing information about many people to learn things about a particular person or a particular group.

Companies draw conclusions about people by analyzing data collected from others, making probabilistic assessments based on personal characteristics and relationships. Companies continue to add information about people to their datasets every day. And the more people a dataset like the one built by 23andMe includes, the less someone’s choice not to be part of it matters.

AI-powered algorithms analyze user information and its connections and relationships with other people’s data.
(Shutterstock)

Similarly, each time a user agrees to the collection, processing or sharing of personal information, that choice also affects others who share similarities with the user. These collective assessments make data processing profitable, such as through marketing, data sales and business decisions based on consumer behaviour.

Equity issues

The interconnected nature of data isn’t a coincidence — it’s at the core of how businesses operate in the data economy. It also creates equity issues.

In the 23andMe case, hackers are offering the assembled genetic information for sale, with lists that include hundreds of individuals. Hackers reportedly assembled and put up for sale lists of people with Ashkenazi Jewish ancestry.

Individuals on the list now face increased risk of discrimination or harassment, as the leaked data includes names and locations. Other information from the company would allow hackers to do the same for people with a propensity for Type 2 diabetes, Parkinson’s disease or dementia — all of which 23andMe measures — putting them at risk of other harms, from raised insurance premiums to employment discrimination.

Data’s collective risks

We often fail to recognize the interconnected nature of data because we’re fixated on the individual. As a consequence, companies can exploit one person’s agreement to legitimize data practices involving others. Companies’ legal obligations to obtain individual agreements for data collection fail to acknowledge broader interests beyond those of the person who agreed.

We need privacy laws attuned to how the data economy works. Providing consent on behalf of others, as 23andMe users did when they clicked “I agree,” would be illegitimate under any meaningful notion of consent. To contain group data harms like those this hack produced, we need substantive rules about what companies can and can’t do.

Prohibitions on indiscriminate data collection and dangerous data uses would avoid leaving unsuspecting individuals as collateral damage. Because corporate data practices can impact all of us, companies’ safety obligations should too.

This article was originally published at theconversation.com