
CHAPTER II - The Social Dynamics of Digital Assets


Digital Identity

If people can acquire greater control over their personal data and digital identity, they will not only become drivers for new sorts of business models keyed to “the person as platform,” they will enable new sorts of bottom-up social institutions to emerge.

John Seely Brown of the Deloitte Center for the Edge pointed out that so much of the attention paid to value and value capture has been focused on financial capital. But there is a real need to consider value that is based on social relationships and identity, and the role that those play in building new sorts of markets. Michael Fertik, Founder and CEO of Reputation.com, calls this phenomenon the “reputation economy.”

It is important to recognize social collectives and individuals as an integrated whole, not as separate and opposing phenomena, said John Clippinger. "There is a field that studies this concept called 'holonics,'" he said, referring to an empirically based theory of living systems that blends multiple scientific and humanistic disciplines. In a recent essay on holonics, theorist Mihaela Ulieru writes:

A recurrent problem is our failure to understand that human endeavors are part of holistic, living systems, natural and constructed, whose constitutive elements are mutually defining, expressive and constantly evolving. In actual circumstances, the individual cannot be cast against, below or above the group; the individual is in fact nested within dynamic forms of social organization.

By conceiving the new data environment as a holonic living system, we are more likely to do a better job of aligning individual and collective interests, and to design more value-enhancing network infrastructures, business models and software.

One starting point is to engineer data as a resource that is portable throughout an open network, not tethered to a particular proprietary platform. Portability of data is not the rule on most network platforms today; companies are generally too intent on capturing the value of their users' personal data. But Shane Green of Personal.com argued that data portability will enable all sorts of time-saving conveniences and innovations for people, helping them make better, smarter decisions with their own data. This would be a vast improvement over the current default model, "advertising-based monetization of data," which he criticized as "probably one of the most inefficient sorts of economies there is, if you look at it on a per-transaction basis."

If and when individual data portability becomes the norm, it could unleash innovative new types of value creation. "We are still laboring under the assumption that the organizational assets are the ones that matter, instead of the digital assets," said Michael Fertik of Reputation.com. "This asset is moveable, and it is growing in value."

But do individuals really want to manage their own data, let alone profit from it? Kara Sprague, Principal of the Business Technology Office of McKinsey & Company, was not so sure: “Facebook is making a profit of maybe $1 per user each year. I would gladly pay a dollar to have the services that Facebook provides. If I could become a broker of all of my own information, could I monetize directly what Google and Facebook do for me, by taking my eyeballs and matching them with advertisers? I do not know I would want to. What would that net me?”

This may be true, conceded Khaliah Barnes, Director of Student Privacy at the Electronic Privacy Information Center, the Washington, D.C. policy advocacy group for privacy. But individuals do have an affirmative interest in shielding their personal information from potential employers, credit agencies and law enforcement. When third parties like these have access to personal information, it tends to have a “chilling effect on individuals,” she argued.

How Data Can Secretly Shape and Limit Opportunities

Still, many conference participants remained concerned about the special nature of data and its use in limiting opportunities or manipulating individuals. One then-timely example was Facebook's manipulation of the news feeds of nearly 700,000 users in a study of "emotional contagion through social networks." By manipulating positive and negative posts on news feeds, the researchers found: "When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks." The apparent conclusion, that user happiness and sadness could be artificially induced through undisclosed manipulation of news feeds, caused widespread public concern when it came to light in July 2014.

Michael Fertik of Reputation.com cited his 2013 essay in Scientific American describing how insurance, medical and other companies use data-mining of personal information to deliver differential advertising to different demographic groups, including no marketing at all to some groups. "Some laud this trend as 'personalization'—which sounds innocuous and fun, evoking the notion that the ads we see might appear in our favorite color schemes," he said. But this "behind-the-scenes curation" of Internet content has serious consequences. Fertik wrote:

If you live on the wrong side of the digital tracks, you would not even see a credit offer from leading lending institutions, and you would not realize that loans are available to help with your current personal or professional priorities….Last September, Google received a patent on a technology that lets a company dynamically price electronic content. For instance, it can push the base price of an e-book up if it determines you are more likely to buy that particular item than an average user; conversely, it can adjust the price down as an incentive if you are judged less likely to purchase. And you would not even know you are paying more than others for the exact same item.

There are other stories of data being used to shape the social and commercial environment:

  • Airbnb not only allows tenants to rate landlords, but landlords to rate tenants.
  • OpenTable, a restaurant reservations service, will kick people off its system if they cancel too many reservations.
  • Uber allows drivers to rate passengers.
  • Some social network data can now be used to construct a more reliable, predictive metric of creditworthiness than conventional FICO scores.
  • Reputation.com has scored over 100 million American professionals as to their current and future earning power, concluding with some degree of statistical certainty that the best proxy for reputation is your personal income.

Such stories prompted Shane Green of Personal.com to note, "What is scary about data compared to tanks or guns is that you do not necessarily see how it is being used, and it does shape your environment." Robin Chase, founder of Zipcar and other startup ventures, said, "I am happy to have data be used to predict things that are of low consequence and of immediacy. But for things of large consequence and far into the future, I am really uncomfortable with that….We really need to preserve an 'infrastructure of opportunity' for people."

A few conference participants were fatalistic about such uses of data. "All of the horses have run out of the stable—six, seven, eight years ago," said Joseph Vardi, a leading Israeli venture capitalist and Chairman of International Technologies Ventures. "I think that to try to believe in 2014 that you can keep your privacy is very naïve….The whole system is wide open. Most of us are safe, and only a small percentage is being compromised. If you want privacy, you have one way: Try to disconnect from your telephone, credit card and the Internet, and go ten feet underground. Then you will get complete privacy." Vijay Sondhi, Senior Vice President of CyberSource, believes that "people are often choosing utility over privacy" in any case. "They want instant gratification…they are making a conscious tradeoff."

But other conference participants disagreed with these perspectives. “I think privacy is a big issue,” said John Clippinger of ID3, citing the worldwide response to the Snowden revelations of NSA snooping and European attitudes toward privacy. “Luxembourg is flooded with companies looking for more secure cloud computing because no one wants secret NSA access through back doors.”

Khaliah Barnes of the Electronic Privacy Information Center agreed that there is a great deal of public concern about data privacy. She cited the dismissal of the CEO of Target after the massive credit-card security breach of the company's computers, and the U.S. Government's new willingness to extend Privacy Act protections to EU citizens following the Snowden NSA disclosures. "We are seeing that privacy really matters, and that it is affecting our international relationships," said Barnes. The other point to stress, she said, is that "there is such an 'information asymmetry' in what people know about the uses of data." People simply do not know whether law enforcement, the NSA, credit bureaus or others will be able to access their data, she noted.

Shane Green predicted that "there will come a tipping point in the future where people, on whatever platforms, will start to care about how their data is used." In the meantime, he added, "information about how data is used will have to become more transparent so that people can know if their interests are truly aligned with the people using the data."

One structural approach to this problem, noted John Clippinger of ID3, is not to look to conventional regulation or court rulings, but rather to look to “algorithmic systems” embedded in the technology itself. He cited the distributed block-chain ledger used by Bitcoin as an example. Such secure digital structures are attractive alternatives to fallible, slow-moving regulatory and legal systems that can easily be gamed by wealthy, sophisticated players.
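
To make the notion of rules enforced by code more concrete, the following is a minimal sketch, not drawn from the conference and greatly simplified relative to Bitcoin's actual protocol: a hash-chained, append-only ledger written in Python. The Ledger class and the example records are hypothetical names introduced only for illustration; the point is that each entry commits to the hash of the one before it, so tampering with any earlier record is detectable by anyone who runs the verification routine, without recourse to a regulator or court.

    # Illustrative sketch only: a minimal hash-chained, append-only ledger.
    # It omits proof-of-work and distribution, but it shows the core idea of
    # "algorithmic" enforcement: editing any earlier record breaks the chain
    # of hashes and is immediately detectable.

    import hashlib
    import json
    import time


    def _hash_entry(fields: dict) -> str:
        """Deterministically hash an entry's contents."""
        payload = json.dumps(fields, sort_keys=True).encode("utf-8")
        return hashlib.sha256(payload).hexdigest()


    class Ledger:
        def __init__(self):
            self.entries = []

        def append(self, record: dict) -> dict:
            """Add a record that commits to the hash of the previous entry."""
            prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
            fields = {"timestamp": time.time(), "record": record, "prev_hash": prev_hash}
            entry = dict(fields, hash=_hash_entry(fields))
            self.entries.append(entry)
            return entry

        def verify(self) -> bool:
            """Recompute every hash; any edit to an earlier record breaks the chain."""
            prev_hash = "0" * 64
            for entry in self.entries:
                fields = {
                    "timestamp": entry["timestamp"],
                    "record": entry["record"],
                    "prev_hash": prev_hash,
                }
                if entry["prev_hash"] != prev_hash or entry["hash"] != _hash_entry(fields):
                    return False
                prev_hash = entry["hash"]
            return True


    if __name__ == "__main__":
        ledger = Ledger()
        ledger.append({"grant": "alice shares fitness data with researcher X"})
        ledger.append({"revoke": "alice revokes researcher X's access"})
        print(ledger.verify())   # True: the chain is intact

        ledger.entries[0]["record"] = {"grant": "forged entry"}
        print(ledger.verify())   # False: the tampering is detected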

Thomas Malone of M.I.T. wondered “whether privacy is even the right framework for thinking about a lot of these issues.” The more appropriate frame might be “ownership” as secured through property rights, he said. “Rather than having all-or-nothing laws about what you can or cannot do with information, maybe data should be treated like many other assets. This would mean that you would have an ownership right to control how the information is treated. There may well be some legal frameworks, like the Uniform Commercial Code, that could formalize certain kinds of contracts about how information is managed.” Clippinger agreed that “looking at personal data as an asset class,” and not as something managed through a privacy legal framework, has merit.
