Orwellian?

The US PCAST report puts forward the following scenario to illustrate how privacy mores change over time, and what the future could look like if digital natives place their full trust in the cloud. The authors admit that “Taylor’s world seems creepy to us”, but they want to demonstrate that “In such a world, major improvements in the convenience and security of everyday life become possible.”

Taylor Rodriguez prepares for a short business trip. She packed a bag the night before and put it outside the front door of her home for pickup. No worries that it will be stolen: The camera on the streetlight was watching it; and, in any case, almost every item in it has a tiny RFID tag. Any would‐be thief would be tracked and arrested within minutes. Nor is there any need to give explicit instructions to the delivery company, because the cloud knows Taylor’s itinerary and plans; the bag is picked up overnight and will be in Taylor’s destination hotel room by the time of her arrival.

Taylor finishes breakfast and steps out the front door. Knowing the schedule, the cloud has provided a self‐driving car, waiting at the curb. At the airport, Taylor walks directly to the gate – no need to go through any security. Nor are there any formalities at the gate: A twenty‐minute “open door” interval is provided for passengers to stroll onto the plane and take their seats (which each sees individually highlighted in his or her wearable optical device). There are no boarding passes and no organized lines. Why bother, when Taylor’s identity (as for everyone else who enters the airport) has been tracked and is known absolutely? When her known information emanations (phone, RFID tags in clothes, facial recognition, gait, emotional state) are known to the cloud, vetted, and essentially unforgeable? When, in the unlikely event that Taylor has become deranged and dangerous, many detectable signs would already have been tracked, detected, and acted on?

Indeed, everything that Taylor carries has been screened far more effectively than any rushed airport search today. Friendly cameras in every LED lighting fixture in Taylor’s house have watched her dress and pack, as they do every day. Normally these data would be used only by Taylor’s personal digital assistants, perhaps to offer reminders or fashion advice. As a condition of using the airport transit system, however, Taylor has authorized the use of the data for ensuring airport security and public safety.

Alluring.

Turning humans into robots

John Foreman, himself a data scientist, has written a (somewhat rambling but) funny and self-aware essay on machine learning:

Data Privacy, Machine Learning, and the Destruction of Mysterious Humanity

I highly recommend you read it. Keep an eye out for two coinages: “data-driven probabilistic determinism” and “data-laundered discrimination”. Foreman’s essay presents one side of the argument; for the other side I also recommend the book Big Data: A Revolution That Will Transform How We Live, Work, and Think by Viktor Mayer-Schönberger and Kenneth Cukier.

Here are a few juicy quotes from Foreman’s essay:

Our past data betrays our future actions, and rather than put us in a police state, corporations have realized that if they say just the right thing, we’ll put the chains on ourselves.

The promise of better machine learning is not to bring machines up to the level of humans but to bring humans down to the level of machines.

“A human being is a deciding being,” but if our decisions can be hacked by corporations then we have to admit that perhaps we cease to be human as we’ve known it.

A little bit of Huxley there, and reminiscent of Tim Wu, who called us humans “comfort-seeking missiles”:

… for most of us, our technological identities are determined by what companies decide to sell based on what they believe we, as consumers, will pay for. … Comfort-seeking missiles, we spend the most to minimize pain and maximize pleasure. When it comes to technologies, we mainly want to make things easy. Not to be bored. Oh, and maybe to look a bit younger.

The WALL-E imagery at the end of Foreman’s essay is an appropriate warning.