A world where privacy is non-existent can be a very beneficial one. Scott Adams writes about this once in a while, and it is eye-opening to those who aren’t familiar with the rapid progress in data collection and analysis technology, or haven’t connected the dots on what can be done with that analysis. The feasible short-run achievements would include a drastic reduction in crime (physical, as well as white-collar crime and identity theft) and a drastic reduction in adverse selection in insurance and financial services (rewarding good actors with lower prices, better service, etc.); you get the idea. In the long run, the businesses that succeed will be those that can use the available data on your life and personal preferences to customize products and services to your needs in more granular ways, with less effort required of the customer.
Most people tend to get queasy and defensive when confronted with giving up privacy, even in exchange for a better life. Dystopian portrayals are plentiful; see Minority Report. And unfortunately for those people, there seems to be little that can be done to stop the erosion of privacy. Laws and regulations could certainly slow the process down, but the information will only become easier to obtain as technology improves, and pushing it into black markets and toward non-transparent uses will only increase the likelihood of a dystopian outcome. There are other ways…
The amount of data being collected on individuals, licitly and not, is reaching a tipping point. At the moment, we find ourselves surrounded by growing walled gardens hell-bent on knowing everything about us. Google, Facebook, Twitter, etc. have collectively built hundreds of billions of dollars of market value by providing us with convenience and better access to information about the world and the people we care about. In exchange, we let them harvest our data and sell it to interested third parties (mostly businesses selling us things). There are two major issues with the current model. First, we cede ownership of this data to services which, while perhaps trustworthy and well-intentioned, are not infallible. They are also not terribly transparent, so we must take them at their word; the degree to which our data is protected from theft by less benevolent actors is unclear. Second, and less obvious but potentially just as harmful: with the data locked up inside walled gardens, we stifle the innovative potential of new companies that could use it in even more beneficial ways.
In the long run, with the right technological foundation, cultural norms will change: we will not only be eager to let others access and use our data, but we will have better knowledge and control over how it’s used, and we may even be compensated directly (perhaps monetarily, which would rapidly accelerate widespread adoption, since frankly most people don’t care much about the security of their personal data, but everyone likes money). For this, we must align incentives.
I can’t propose a precise solution, but the general features of one come to mind.
1. We should insist that the data we generate, whether it’s from posting status updates, letting our phones track our location, or, in the future, allowing devices in public spaces to identify us and analyze our activity (actually, it’s happening already), is not only in our possession but also under our control. More precisely, what we seem to need is an open source framework for encrypting the data we generate and placing it into a storage system to which we can grant selected parties conditional, piecemeal access.
2. The open system will allow for transparency in understanding the security features surrounding our data. I’m not sure to what extent this would be feasible, but ideally you prevent bulk transfer of data out of the encrypted store (unless by the owner, obviously). This is tricky. Off the top of my head, the data could be categorized and stored in a query-friendly format; external access would take the form of queries, and the amount of data (as well as the specific data points) decrypted and exported as query results could be tracked. If you accept that personal data decays quickly (e.g. data from a year ago isn’t nearly as valuable as data from yesterday), an alternative approach could be as simple as changing the private key used to encrypt your data at some reasonable frequency (say, monthly). Then when a third party accesses your data, they will need a different key for each slice of history, and perhaps for each data type as well (one key for your status updates, a separate key for your pictures, etc.). A rough sketch of such a scheme follows this list.
3. To get the full benefit, the system would need a secure audit trail of access (what data, and by whom) as well as a gatekeeper mechanism which controls this access at some level of granularity. Interestingly, the recent innovation of a public, cryptographically signed chain of transactions (the blockchain, as in Bitcoin) seems potentially applicable here. In a nutshell, different pieces of data owned by an individual and stored at a trusted service running the open source protocol would be encrypted with various private keys (which could even change over time for the same data set). Access to data could be enabled by a transaction logged to a publicly visible blockchain, using a unit of account which must be transferred from accessor to owner in exchange for that access. Think of it as a cryptocurrency for your private data; a sketch of such a ledger also follows this list. You can see where that brings us…
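To make points 1 and 2 a bit more concrete, here is a minimal sketch of the per-category, per-period key scheme and the query gatekeeper, written in Python using the cryptography library’s Fernet for authenticated encryption. The class and function names, the category strings, and the monthly period format are illustrative assumptions, not part of any existing system; a real store would need hardened key management and a proper query engine.

```python
import base64
import hashlib
import hmac
import time

from cryptography.fernet import Fernet  # pip install cryptography


def derive_key(master_secret: bytes, category: str, period: str) -> bytes:
    """Derive one key per data category and per time period (e.g. "2014-05"),
    so access to one slice of history reveals nothing about the others."""
    digest = hmac.new(master_secret, f"{category}:{period}".encode(), hashlib.sha256).digest()
    return base64.urlsafe_b64encode(digest)  # Fernet expects a base64-encoded 32-byte key


class PersonalDataStore:
    """Encrypted store that answers queries instead of allowing bulk export,
    and records who decrypted what and how much."""

    def __init__(self, master_secret: bytes):
        self._secret = master_secret
        self._records = []    # (category, period, ciphertext)
        self.access_log = []  # (accessor, category, period, rows exported, timestamp)

    def put(self, category: str, period: str, plaintext: bytes) -> None:
        f = Fernet(derive_key(self._secret, category, period))
        self._records.append((category, period, f.encrypt(plaintext)))

    def query(self, accessor: str, category: str, period: str) -> list:
        """Gatekeeper: decrypt and export only the rows matching the query,
        and log the accessor along with the amount exported."""
        f = Fernet(derive_key(self._secret, category, period))
        rows = [f.decrypt(c) for cat, per, c in self._records
                if cat == category and per == period]
        self.access_log.append((accessor, category, period, len(rows), time.time()))
        return rows
```

With keys rotated this way, granting an advertiser access to your May status updates hands over only the May status-update key; your photos and every other month stay sealed.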
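And a similarly hedged sketch of point 3: each grant of access becomes an entry naming the owner, the accessor, the data slice, and the units of account transferred, with every entry chained to the previous one by hash so the log is tamper-evident. This is a toy append-only log under assumed field names and a flat price, not an implementation of Bitcoin’s actual protocol.

```python
import hashlib
import json
import time


class AccessLedger:
    """Publicly visible, hash-chained log of data-access grants."""

    def __init__(self):
        self.chain = []  # each entry embeds the hash of the one before it

    def grant(self, owner: str, accessor: str, category: str, period: str, price: int) -> dict:
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        entry = {
            "owner": owner,        # whose data is unlocked and who gets paid
            "accessor": accessor,  # who receives the decryption key
            "category": category,  # e.g. "status_updates"
            "period": period,      # e.g. "2014-05"
            "price": price,        # units of account transferred from accessor to owner
            "at": time.time(),
            "prev": prev_hash,
        }
        entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.chain.append(entry)
        return entry
```

Anyone can recompute the hashes to check that no grant was quietly removed or altered after the fact, which is exactly the audit trail the scheme calls for.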
A system similar to the one described above seems to align incentives across many parties:
– individuals become eager to rent out their data, but critically on their own terms and potentially for compensation
– an ecosystem emerges with incentives to improve the infrastructure for securing, storing and analyzing this data
– new businesses that need personal data to work well can be built outside the walled gardens of the current internet giants
– there is clearer accounting of who is accessing your data, including government agencies; if they want to know everything about us, let’s at least make that transparent
Adoption of such a mechanism seems like a chicken-and-egg problem at first, and it also seems vulnerable to exploitation by nefarious parties who collect data through the system and resell it themselves (or use it for purposes you didn’t intend to allow). It seems to me that grassroots adoption of such a framework by individuals currently eager to share their personal data with innovating companies (in exchange for whatever service they offer for free, basically) will seed the system and begin drawing in new entrants, who can use it both as a selling point and as a data source to feed their businesses. Eventually, we should see increasing pressure on large incumbents (Google, etc.) to move their data into such an open system as a show of transparency and good faith toward their users, especially once people see that they can be compensated directly for access to their curated data. As for access by nefarious parties, I would argue it’s no worse than the system (or lack thereof) that currently exists in the form of walled gardens and incompetent government agencies. At least with an open source mechanism we can make a concerted and transparent effort to prevent our data from being abused.
The ideas here are half-baked and create plenty of new problems beyond the ones addressed (how to prevent bulk extraction, how people can manage large sets of keychains, how to manage large permission lists, etc.), but the key is to create a platform of transparency within which these problems can be incrementally chipped away at. Currently, we’re flying blind.