3 Data Privacy Trends to Watch
This is the last post in a four-part series on data privacy and compliance.
In the third post of this series, we explained why data privacy frameworks don't apply to decentralized storage solutions in the usual ways, and how decentralization enables applications to store data more privately and securely.
Our final post presents three data privacy trends and shares how Storj is working to help our users navigate the complex web of privacy regulations so they can make better decisions about data storage. Here are the three trends we're keeping an eye on:
1. More Data Privacy Regulations — & More Data Breaches
In addition to the 140 data privacy regulations on the books globally, we expect to see more regulations and more complexity in the future.
China’s new data privacy law went into effect in November 2021, with critics saying it raises more questions about compliance than it answers. Meanwhile, a personal data protection bill in India has been pending since 2019. Other international data privacy laws are in the works, which may impact U.S. businesses.
Even as the number of regulations increases, there are likely to be more high-profile data breaches and more concerns about privacy. One reason is that fines and regulatory actions haven't yet achieved their intended impact.
For example, Google paid $170 million in 2019 to settle federal and state claims alleging that it violated children's privacy on YouTube by collecting personal information without parental consent. That sounds like a costly fine, until you learn that the company's daily revenue that year was over $443 million.
2. Bias & Data Privacy in Artificial Intelligence
Another trend we’re keeping an eye on is the intersection of bias and data privacy in artificial intelligence (AI).
A report published in 2019 by the AI Now Institute at New York University pointed out that only 18% of the authors at a leading AI conference were women and more than 80% of AI professors were men. It cited similar disparities among AI researchers at leading tech companies.
The report's authors argued that we need to reevaluate AI tools, some of which claim to detect sexuality from headshots, predict criminality from facial features, and assess worker competence from microexpressions.
As the authors observe, issues of discrimination in the workforce and in system building are deeply intertwined. This was demonstrated recently by the departure from Google of Timnit Gebru, a renowned Black woman AI researcher, over Google's handling of the review of a research paper she co-authored that highlighted issues with large language models. The episode caused outrage and raised concerns about the chilling effect discrimination can have on people's ability to voice concerns about the technology they are developing.
Even autonomous vehicles are subject to the biases of developers who use and interpret data to create their AI processes. Everything from decision-making in maps to suggestions for what to do in an emergency — and how those suggestions differ by socio-economic area — is influenced by the developers. Data generated and collected from these vehicles has profound privacy implications depending on who has access to that data and how it’s used.
3. Demand for More Private Alternatives Like Decentralized Storage
Finally, we anticipate that more applications, products and services will soon be privacy-focused, if for no other reason than to avoid the pitfalls of a highly complex regulatory environment. That will lead more people to look for viable alternatives like decentralized storage.
At Storj, we’re committed to helping our users make better decisions about data storage. Beyond our roadmap, which is always privacy focused, we’re exploring these additional ways to increase privacy and compliance:
- Geofencing. Many of our users have expressed interest in selecting the geographic regions of the nodes on which their data is stored. We're now evaluating best practices and looking at ways to turn this capability into a Storj feature.
- Data Processing Agreements. We’re considering whether we need to enhance our relationship with storage node operators by incorporating some aspects of a data processing agreement (potentially in our contracts or terms and conditions).
- Increased User Privacy. Today we offer end-to-end encryption in our client tools and server-side encryption on our hosted S3-compatible gateway. We're looking at the possibility of creating a lightweight encryption library for use with the gateway to enable end-to-end encryption across every part of the platform (see the sketch after this list).
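To illustrate the general pattern behind that last item, here is a minimal sketch of client-side (end-to-end) encryption in front of an S3-compatible gateway, written in Python with boto3 and the cryptography library. This is not Storj's actual encryption scheme or library; the endpoint, bucket name, and credentials are placeholders, and a production design would also handle key management and streaming of large objects.

```python
import boto3
from cryptography.fernet import Fernet

# Generate a symmetric key that never leaves the client.
# (In practice this key would be derived from or protected by a user passphrase.)
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"sensitive document contents")

# Hypothetical S3-compatible endpoint and credentials: placeholders only.
s3 = boto3.client(
    "s3",
    endpoint_url="https://gateway.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Upload only the ciphertext; the gateway never sees the plaintext or the key.
s3.put_object(Bucket="my-bucket", Key="document.enc", Body=ciphertext)

# Later: download the ciphertext and decrypt it locally with the same key.
obj = s3.get_object(Bucket="my-bucket", Key="document.enc")
plaintext = Fernet(key).decrypt(obj["Body"].read())
```

Because the object is encrypted before it leaves the client, the gateway only ever handles ciphertext, which is the end-to-end property that server-side encryption alone cannot provide.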
Conclusion
We hope the four blog posts in this series (and the webinar they’re based on) have given you a better sense of our approach to privacy and security as a decentralized platform — and the regulatory framework in which we operate.