
Friday, June 14, 2019

On privacy and Surveillance Capitalism

I stored up for a bigger one, this time, in a topic wherein I actually know something! Though yes, in background we have worries about a looming U.S.-Iran war, which I've warned about since November 2017... and more recently... asking you to make sure your neighbors know terms like "Saddam's WMDs," "Tonkin Gulf Incident," "Gleiwitz," and "Reichstag fire." (And see what Navy vet Jim Wright says about this recently, here.)

Over the long haul, our way out of these messes will almost always be more light. Exposing the wicked. Which brings us to...

== Fear of exposure ==

Harvard Prof. Shoshana Zuboff’s new book - The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power - is a massive overview of the major quandary of our age. It's reviewed by Noah Smith at Bloomberg, who begins by citing a simplified view of my own Transparent Society. (In fact, a world awash in light won't end privacy. It is (I assert) the only possible way that citizens will be able to preserve some privacy.)

Zuboff's book is also reviewed in the Guardian – and yes, I’ve been asked my reaction. Here's a substantial and worthwhile extract from that review:

“Surveillance capitalism,” she writes, “unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later."

Zuboff thus connects to the recent works of Yuval Harari, who foresees a future society driven and propelled by "dataism." Back to the Guardian review.

"Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behaviour.”
  
Reviewer John Naughton continues: 
“While the general modus operandi of Google, Facebook et al has been known and understood (at least by some people) for a while, what has been missing – and what Zuboff provides – is the insight and scholarship to situate them in a wider context. She points out that while most of us think that we are dealing merely with algorithmic inscrutability, in fact what confronts us is the latest phase in capitalism’s long evolution – from the making of products, to mass production, to managerial capitalism, to services, to financial capitalism, and now to the exploitation of behavioural predictions covertly derived from the surveillance of users. In that sense, her vast (660-page) book is a continuation of a tradition that includes Adam Smith, Max Weber, Karl Polanyi and – dare I say it – Karl Marx.” 

(An aside: on a recent flight from DC, I sat across from a teenager who was reading Das Kapital. Old Karl has been re-awakened and is flying off the shelves, worldwide. And this resurrection was achieved by the gluttonous outrages of an oligarchy that seems bent on behaving exactly as KM described.)

== Simplistic, but with cause ==

Summarized in this interview, Zuboff correlates past episodes of rapacious colonialism with the way major data corporations treat us. Good line: Once we searched Google, but now Google searches us. Once we thought of digital services as free, but now surveillance capitalists think of us as free.

“Demanding privacy from surveillance capitalists,” says Zuboff, “or lobbying for an end to commercial surveillance on the internet is like asking old Henry Ford to make each Model T by hand. It’s like asking a giraffe to shorten its neck, or a cow to give up chewing. These demands are existential threats that violate the basic mechanisms of the entity’s survival.”

"At its core, surveillance capitalism is parasitic and self-referential. It revives Karl Marx's image of capitalism as a vampire that feeds on labor, but with an unexpected turn. Instead of labor, surveillance capitalism feeds on every aspect of every human experience."  She examines several major organizations -- notably Amazon, Apple, Facebook, Google, and Microsoft -- that are in various stages of developing a "technologically advanced and increasingly inescapable raw-material-extraction-operation." In the end, "surveillance capitalism operates through unprecedented asymmetries in knowledge and the power that accrues to knowledge." 

== And just like Marx... this model has fatal weaknesses ==

While describing valid complaints about info-greed by capitalists, Zuboff misses the key point that all elite accumulations of power will do this, trying to arrange for information to flow upwards, as it did into the manors, castles and cathedrals of old. This is monkey behavior; you see it in chimps. Hence, when she reflexively shouts "They're looking at you!" she and almost every other privacy paladin ignores the only possible conclusion from this tome: that being seen is inevitable.

Seriously, what does she aim to accomplish with her book, with all its alarums, if the failure of all constraints, and of freedom itself, is unavoidable?

Time to step back. Maybe take a whiff of how our ancestors were treated when past elites similarly knew everything of any importance about those toiling below them in the villages and fields, when the aim of collecting "data" about the peasants (via priests and local gossips and by torture) was not about "selling them stuff." It was about life and death. About eviction from your hovel, or being levied into a hopeless war. It was about starvation.

Sure, elites always had imbalanced advantages when it came to surveillance, and it's worrisome, as it always was! But it's what they can do to you that matters. And right now what they can do - the plaint of Zuboff and most privacy paladins - is intrusively try to sell you stuff.

Now, there are reasons why that business model is doomed, but that's beside the point. The way to limit what the mighty can do to you with your information is not to limit what elites know. There is not a scintilla of a chance that can happen, and no example across the history of our species when it ever actually occurred.

The solution is not to (impossibly) blind elites, but to strip them naked, so that - no matter what they know about you - they are severely hampered at using it against you.

That remedy has actually been used effectively, across the last 200 years. I give example after example, in The Transparent Society. 

== The reflex is addictive ==

Alas, our earnest and sincere paladins of progress and freedom keep issuing hysterical screams of "They're LOOKING at you!" without ever offering even a glimpse at the only remedy that can possibly work.

This power to shape behaviour for others’ profit or power is entirely self-authorising. It has no foundation in democratic or moral legitimacy, as it usurps decision rights and erodes the processes of individual autonomy that are essential to the function of a democratic society. The message here is simple: Once I was mine. Now I am theirs.

Yet the author displays stunning contempt for the masses: “There can be no exit from processes that are intentionally designed to bypass individual awareness and produce ignorance, especially when these are the very same processes upon which we must depend for effective daily life. So our participation is best explained in terms of necessity, dependency, the foreclosure of alternatives, and enforced ignorance.”

Mind you, I agree with the overall call to action: “Our societies have tamed the dangerous excesses of raw capitalism before, and we must do it again….  We need new paradigms born of a close understanding of surveillance capitalism’s economic imperatives and foundational mechanisms.” 

Um, sure. But doesn’t that imply that the solution is either state paternalism or else leveling the playing field?

Alas, the inevitable tilt is toward the former:  “GDPR [a recent EU law on data protection and privacy for all individuals within the EU] is a good start, and time will tell if we can build on that sufficiently to help found and enforce a new paradigm of information capitalism.”

Except… does she point to a single paternalistic privacy protection or restriction that has ever effectively limited the data-aggrandizement processes that she decries?

In spurning other suggestions, Prof. Zuboff commands that the tide go out: “For example, the idea of “data ownership” is often championed as a solution. But what is the point of owning data that should not exist in the first place?” 

“So what is to be done? In any confrontation with the unprecedented, the first work begins with naming. Speaking for myself, this is why I’ve devoted the past seven years to this work… to move forward the project of naming as the first necessary step toward taming. My hope is that careful naming will give us all a better understanding of the true nature of this rogue mutation of capitalism and contribute to a sea change in public opinion, most of all among the young.”

Vague, vague, vague arm-wavings after a 900-page, well-documented call for resignation and despair, avoiding any look at the one thing that ever worked. The only thing that can.


== A fictional perspective ==
  
Someone report back on this new novel - Golden State, by Ben H. Winters, author of the alternate history, Underground Airlines. As reviewed on NPR: The world as we know it has been destroyed, and though we never find out exactly how, it appears it had something to do with a pandemic of lies. In the Golden State, lies are against the law, and the main enforcers of the truth are known as Speculators. If it's against the law to lie, it must also be against the law "to hypothesize, to imagine versions of what might have happened. But when you are trying to solve, for example, a suspicious death, sometimes it is necessary to hypothesize so we can try to follow the leads and crack this case. So to do that there are individuals within the Golden State, a special sort of law enforcement officer who has license to speculate."

It doesn’t sound remotely human or plausible – like those absurd films and tales about dystopias that ban emotion – but perhaps an interesting thought experiment about a type of transparency.

And finally...

The object of the videogame DietDash is to travel through the aisles of a supermarket and avoid sugary foods. While it’s not an exciting game, overweight people who play it win in real life by losing up to 3.1 percent of their bodyweight after 8 weeks. The game was developed at Drexel University and researchers there are seeking recruits for a newer, highly gamified version of the shopping simulation.  Huh.

Wednesday, June 05, 2019

Disputation - and disinformation


I'm in DC for NASA and other meetings. So let's offer up a trove of observations about privacy, transparency and freedom....

A new debate platform - Kialo - goes some way toward the sort of “Disputation Arenas” I’ve been talking about and urging for 20 years. Kialo enables you to visualize discussions as an interactive tree of pro and con arguments. At the top is the thesis, which is supported or weakened by pro and con arguments underneath. Each one of these arguments can branch into subsequent arguments that support or attack them in turn.  (I do offer some added and important layers.)
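Kialo's structure is easy to picture as a simple recursive tree: a thesis node whose children are pro or con claims, each of which can sprout pro/con children of its own. Here's a minimal sketch of that data structure — the class names and the example claims are my own illustration, not Kialo's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    """One node in a pro/con argument tree: the thesis, or a claim about its parent."""
    text: str
    stance: str = "thesis"  # "thesis", "pro" (supports parent), or "con" (attacks parent)
    children: list = field(default_factory=list)

    def add(self, text: str, stance: str) -> "Claim":
        """Attach a supporting or attacking claim and return it, so it can branch further."""
        child = Claim(text, stance)
        self.children.append(child)
        return child

    def count_claims(self) -> int:
        """Total claims in this subtree, including this node."""
        return 1 + sum(c.count_claims() for c in self.children)

# Build a tiny tree: a thesis, supported and attacked in turn.
thesis = Claim("Transparency preserves more privacy than secrecy does")
pro = thesis.add("Sousveillance deters misuse of data by elites", "pro")
thesis.add("Elites adapt faster than citizens can watch them", "con")
pro.add("Street cell-cameras changed police behavior", "pro")

print(thesis.count_claims())  # 4
```

Each argument can itself be argued for or against, which is what lets the tree grow to arbitrary depth under the thesis.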

The greatest innovation of our enlightenment experiment wasn’t democracy, or freedom, or fair-competitive markets, though these are important. One thing is enabled by those things, and in turn enables them – reciprocal accountability.

Reciprocal accountability (RA) is what lets us criticize each other’s favorite delusions and errors, the synergy that led to all our recent successes... the thing that kings, lords, owners and priests all reflexively punished for 6000 years. RA can come via cooperation, or by negotiation, or via fair-open competitive argument. By applying RA, we got our five great positive-sum arenas for competitive creativity… democracy for policies and law, science for zeroing in toward better models of the world, markets for creating ever better goods and services, justice courts for adversarially and openly iterating justice, and sports –

-- the example that makes clear how necessary regulation is, to deter the otherwise inevitable cheating that always spoiled these arenas, in eras past. The kind of cheating that threatens to spoil everything today, as cooperation, negotiation, and fair-open competitive argument are all being directly and deliberately undermined in America and the West, by powers that want a return to those 60 centuries of feudal misrule.

Can we transform the Internet from a swamp of lies into a process by which RA works, as it has in the other five arenas, till now? For a rather intense look at how "truth" is determined in science, democracy, courts and markets, see "Disputation Arenas: Harnessing Conflict and Competition," now posted on my website.

== The poisons chilling reciprocal accountability ==

At the opposite extreme… "Substitute arguing services" offered in China
A number of online services in China offer professional arguers who will verbally or electronically assault other people for a fixed fee. According to Radii China, “20 RMB (3 USD) gets you the standard angry phone call or WeChat message; 40 RMB (6 USD) guarantees a full day of spam calls; and 100 RMB (15 USD) blows up your target’s phone with 999 hate calls.”

Also from the Institute for the Future (IFTF): social and issue-focused groups are particularly susceptible to disinformation campaigns and were targeted with computational propaganda during the 2018 mid-term elections. The research also shows why the targeting of these groups will continue, and potentially worsen, in 2020. The study, The Human Consequences of Computational Propaganda, led by IFTF’s Digital Intelligence Lab, provides recommendations for fighting back.

Do these new changes at Facebook “change everything?” - or even anything? Right after the 2016 election, I spoke to some of the folks at Facebook who were looking into the fake rumors crisis and offered what I deemed unconventional but simple suggestions that would utilize competitive processes to quickly denote falsehoods... not one of which was tried. There are simple, efficient things FB could do, to start making their system more self-correcting. They are not remotely interested in achieving that outcome, alas, so it will be up to us. (If their current ameliorations fail atrociously in 2020, you can be sure Facebook will be broken up.)

Beyond FB, no one - and I mean no one, to my knowledge - seems to grasp what's missing from the Internet ecosystem.  It is the one thing that enabled markets, democracy, science, courts and sports to function. It is right there, glaringly obvious.

For a look at how "truth" is determined in science, democracy, courts and markets, see the lead article in the American Bar Association's Journal on Dispute Resolution (Ohio State University), v.15, N.3, pp 597-618, Aug. 2000, "Disputation Arenas: Harnessing Conflict and Competition."  Now posted on my website.

== Face Recognition points the way to… Big Brother? Or else… ==

In October, Shanghai’s Hongqiao airport reportedly debuted China’s first system that allowed facial recognition for automated check-in, security clearance, and boarding. And since 2016, the Department of Homeland Security has been testing facial recognition at U.S. airports. Delta has an optional biometric system in Atlanta that uses facial recognition kiosks for check-in, baggage check, TSA identification, and boarding.

And if you howl in objection, exactly how do you foresee stopping this? Even if you pass fierce, European-style restrictions, all that Privacy Laws accomplish — according to Robert Heinlein — is to “make the spy bugs smaller.”  And smaller, faster, better, cheaper and more numerous they are getting, as predicted in Brin’s Corollary to Moore’s Law.

The reflex to solve these issues by shutting down information flows is impractical and impossible, and it runs diametrically opposite to the methods we used to get the very freedom and privacy we now fear losing. The only approach that ever worked - or can possibly work - is to start by asking
“which do I fear most?

What elites know about me?

Or what they can do to me?”

The former will never be constrained in any major way — name one time in the history of our species, when the mighty let themselves be blinded for long. But the latter — limiting what powerful men can DO to us — can and has been seriously accomplished during this enlightenment.

That is the difference between China’s implementation of these technologies and what we see in the West. And yes, it might end here tomorrow!  I am as frightened of Big Brother as you are. Probably more so. 

There is a narrow path out of this danger zone. And it does not start with futile howling “Don’t look at me!”

== On Data Privacy ==

Vint Cerf sent me a note saying the following passage reminded him of The Transparent Society:

“Worries about data privacy erupted in the spring of 1964 with the publication of “The Naked Society,” by Vance Packard, a journalist best known for his unsparing critique of modern advertising. “The Naked Society” made a comparable assessment of the marketing schemes of big corporations, noting their immense and profitable traffic in personal data about American consumers. But he trained most of his attention on the entity that was then the largest user of mainframe computing power: the United States government.”

My recent podcast on surveillance, transparency and the future of freedom is “The future of privacy policy: A Q&A with author David Brin,” interviewed on the AEI site by James Pethokoukis.

I began tracking this when I lived in Britain in the 1980s, where they led the world in introducing CCTV cameras on the street. Now: “Thousands of San Diego street lights are equipped with sensors and cameras. Here's what they record.”

== Wellsprings of freedom ==

A federal court ruled this December that secretly recording government officials, including police officers, is protected under the First Amendment, overruling a 50-year-old Massachusetts law. And if you expect me to rejoice, well, sure, yeah. This is the most important civil liberties issue of our times. For it is on the street that citizens are most likely to encounter dangerous authority and need tools of accountability. Yet, I am willing to admit a need for some discussion re: the “secretly” part. Yes, in most cases. But there may be some room for compromise. Especially since it can happen that the bully is the one holding the camera. (A truth that was interpreted all-wrong in The Circle.)

My other cavil is that the First Amendment is not the most crucial bulwark for this right to see-and-record. 

The real justification is the almost never mentioned Sixth… the sacred Sixth… that most concerns a citizen’s right to see, to access fair witnesses and to assertively get any evidence that might exculpate and prove innocence.  Why do none of the attorneys in these cases ever mention this vastly stronger argument?

Meanwhile: “Must Writers Be Moral? Their Contracts May Require It.” Seriously. We need to remember that extreme social justice warriors may be our current allies against a far-worse, worldwide mafia-oligarchic putsch… nevertheless the worst of these allies are just another kind of bully and no friends of the Enlightenment that gave us everything. Including social justice. We can agree about the direction - ever-increasing tolerance/diversity and accountability – while recognizing that sanctimony-driven bullies will be drawn to any height from which to thwart reason.

Crowd wisdom?  Someone out there explore this site claiming to be on internet censorship, and report back in comments?

== Not getting it… and getting it too much? ==

Anonymous browsing? This piece reveals how hidden your activity really is when using popular privacy tools. 

A reporter commissioned a 3D printed model of his own head to test the face unlocking systems on a range of phones — four Android models and an iPhone X: only the iPhone X defended against the attack. As far back as The Transparent Society (1998) I forecast that both the optimists and pessimists would be disappointed in face recognition and related technologies.

A disturbing article shows you how easy it is for companies to parse your activities, even when personal identifiers are stripped away, as promised by modern privacy policies. Your apps still report movements in a gross, non-ID way… and meta-analysis can swiftly correlate them, rebuilding the fact that each movement was you.

Seriously, if your enraged or anxious reaction is to demand regulations to ban such correlations, who knows? You might succeed! And in your victorious smugness you will celebrate a Potemkin triumph, an exercise in stunning delusion and futility.

I’ve said this since before The Transparent Society. We cannot base our security or safety or freedom on “policies” that aim — even with good intent! — to obscure our personal information.  

At risk - certainty - of repetition... there is a solution, when you recall that it matters far-less what elites *know* about you than what they can *do* to you. And there is a way to make them afraid of doing bad things. It is a proven way, demonstrated by 200 years of increasingly successful experiments, while “privacy via secrecy” has almost no track record of ever succeeding for long.

Want evidence?  Look at how hard the world’s elites, especially the rising oligarch-mafia, are striving to get information obscured behind veils and clouds. In such a world, they thrive. We don’t.

Saturday, December 08, 2018

We can't own information


A cogent and interesting article from Forbes - Privacy is Not a Property Right in Personal Information - examines the well-intentioned “privacy reforms” implemented to some degree in Europe and pushed in the U.S., that would grant individuals “ownership of their own data,” plus the right of portability, to choose where it is to be stored and used.

As the author – Mark McCarthy - shows, this is a well-motivated… and utterly stupid approach to addressing a very real problem. The age-old problem of asymmetry of informational power, with elites like the rich or corporations almost-inevitably absorbing every bit and byte and fact about us, to use as they wish.

Look, I absolutely share that fear! It is why I wrote The Transparent Society, because there are potential solutions, using the very same technique that has already worked increasingly well for 200 years. In contrast, the “data ownership” proponents cannot point to a single time in the history of our species when the citizens of a commonwealth commanded their elites “don’t look-at or know about me!” with even a scintilla of success. Again, that never happened. Because it cannot possibly work.

I’ve been dealing with this “don’t look!” fetish for 25 years, and they never learn. Ten years ago there were howls to banish and make illegal face recognition systems! And if that sounds quaint, well, talk of that vague “reform” is back, alas: “It’s time to regulate facial recognition and affect recognition,” says Kate Crawford, a researcher at Microsoft. And yes, every worry should be considered. Yet, talk of curbing such technologies never forces the tide to go back out. Nor will any restrictions hamper elites, in the slightest.

There is an alternative approach, the one responsible for all our freedom and progress. And it is always considered last.

In The Transparent Society I point out that all our great, positive-sum “arenas” - like markets, democracy, science and justice - thrive amid openness and light, but wither when shadows prevail. Forbidding others to know things is inherently aggressive and threatening, especially when they might view you as one of the dangerous elites they worry about.

And sure, I avow that yes, letting the mighty simply vacuum up everything about us, while they get to do mysterious things with our information, benefiting or even conspiring… heck, that’s a sure-fire path to Big Brother. So what's to be done?

There are some approaches that might work! Say, if hundreds of thousands of citizens were to pool their data rights in ways that could exert group efficiencies and group market power. This is simply an extrapolation of the greatest social innovation of the 2nd half of the 20th Century, the NGO. So we already know something like it could work.

Even more cogent is the suggestion by Jaron Lanier that we should not so much own our information -- creating a mythical and preposterous notion that you can exclude others from ever touching it -- as retain strong interests in it. A right to get feedback data on how it is being used, by whom, and to get paid micro-royalties if some corporation or elite entity benefits from using it.

But the fundamental remains the same. We are not made safe by hiding from power! It never worked and it won’t work in the future.

What gave us this window of freedom was not preventing surveillance, but insisting on sousveillance… looking back at power. Stripping the mighty naked, so we can supervise. Because it matters much less what they can know about you than what they can do to you!

And – as we have learned on the streets with our cell-cameras -- the only and best way to control what others – even the police – do to you is to let them know that even the watchmen are being watched.

== The Crux ==

How do you "own" something that -- when it (inevitably!) leaks -- can be infinitely duplicated at zero cost?

Is the word "ownership" even remotely applicable?

Even "control"?

Well, Tim Berners-Lee has some credibility... and he claims that a new online realm called "Solid" will resonate with the global community of developers, hackers, and internet activists who bristle over corporate and government control of the web. “On Solid, all the information is under (the user's) control. Every bit of data he creates or adds on Solid exists within a Solid pod–which is an acronym for personal online data store. These pods are what give Solid users control over their applications and information on the web. Anyone using the platform will get a Solid identity and Solid pod. This is how people, Berners-Lee says, will take back the power of the web from corporations.”

Um, while I’d love to be proved wrong, I remain boggled that folks believe there aren’t ten thousand ways for our information to leak through every such promise, if not through spychips in your Alexa or Charlie… or keyboard... or copy-plus-interpolation of every datum entering or leaving Solid. Go ahead and make that world! I’m concerned about the Olympian realms that elites are making for themselves. And dig it, they are likely to make much better use of secrecy protection methods than you ever will.

== News from the Transparency Front ==

Microsoft intends to develop two blockchain products designed to give consumers greater control of their personal data. One is an encrypted personal data store, or "identity hub," a combination of a user's personal devices and cloud storage; their permission would be required for third parties to access it. Also a "wallet-like app" that people could use, among other purposes, to manage these permissions to their data, including the ability to revoke them when desired. Decentralized identifiers (DIDs) do not require a central authority because they are registered on a distributed ledger. IBM, Accenture and RSA are working on similar concepts.

Fine. Go ahead and try. There may come a first time - in the history of our species - when a general approach to equalizing power via concealment will work.

Microsoft also aims at a system to allow a high volume of low-value payments, perhaps like the micropayments systems I have been pushing for almost a decade, which would then have the potential to smooth our commerce, save journalism, empower creatives, and finally end the era of domination of the Internet by advertising. Here's a good goal.

In a related development, San Diego startup LunaDNA, which aims to create a community-owned database of donated genetic/health information for medical research, has filed with securities regulators to issue shares to people who provide their data, building a credit-union-like co-op around an anonymous genomic and health database. People who donate their DNA/health data would get shares based on the value of the data, which LunaDNA calculates based on current market value. For example, a full human genome nets 300 shares. Three weeks of fitness/nutrition data gets two shares. "If LunaDNA ever makes money from fees charged to researchers who tap into the database or drug discovery royalties, shareholders would get dividends."
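Taking the reported figures at face value, the share math is simple arithmetic. A minimal sketch, using the share counts stated above; the function name and the example donor are my own illustration, not LunaDNA's actual accounting:

```python
# Share awards as reported for LunaDNA's filing.
SHARES_PER_GENOME = 300          # one full human genome
SHARES_PER_FITNESS_PERIOD = 2    # three weeks of fitness/nutrition data

def shares_earned(genomes: int = 0, fitness_periods: int = 0) -> int:
    """Total shares a donor would accrue for their contributions."""
    return genomes * SHARES_PER_GENOME + fitness_periods * SHARES_PER_FITNESS_PERIOD

# A donor contributing one genome plus roughly a year of fitness data
# (about 17 three-week periods):
print(shares_earned(genomes=1, fitness_periods=17))  # 334
```

Any dividends would then presumably be split pro rata across such share balances, which is what makes the "credit-union-like co-op" framing apt.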

Another such venture is “Hu-Manity.co,” whose app would let people specify how their medical data can and cannot be used. Pharmaceutical companies could potentially pay each user $10 a month for access to their data, Etwaru says. The drug companies would also pay Hu-manity.co for access.

Combine these developments and we move toward a world predicted slightly in Neal Stephenson's Snow Crash but more significantly in Web philosopher Jaron Lanier's notions about personal data. To date, there have been three notions about our information future.

== Three Notions ==

Here are the three most common mythologies:

(1) We are spinning into a dystopian age when the mighty elites will know everything about us and the little gal and guy will be helpless pawns. Naturally, this is the future we see depicted in a lot of sci fi films and novels because dystopia makes drama trivially easy. Besides, this is clearly where a world mafia-oligarchy wants us to go, so some paranoia is justified!

 All decent folks who want to preserve freedom and individual opportunity rightly oppose this death mode for the Enlightenment. But in opposition, we've seen some pretty simplistic notions.

(2) A wild west future of all information floating free and thus empowering the masses. Yes, at a very simplistic - and hence dumb - level, this conveys the notion of generalized accountability that I tout in The Transparent Society. Sure, I'd rather err on the side of everyone seeing! Because all our great enlightenment systems -- markets, democracy, science, justice courts and sports - all of them wither and die in fog or darkness. But the arguments in favor of transparency are more subtle than this. And yes, humans want some privacy and control. And any decent civilization will include those things.

(3) Paternalistic walls. Alas, the vast majority of smart, sincere paladins fighting for freedom and rights and against the dystopian age almost all reflexively turn to demanding laws that restrict information flows. Supposedly empowering citizens to declare "you cannot know this about me!" Enshrining "ownership of my own information." 

It all sounds so positive and freedom-y, that no one -- certainly in Europe -- ever dares to respond: "Not only can that not possibly work, at any level, but it is exactly what elites and oligarchs want most -- walls, guarded by law and the state, within which they can connive and reach out to control." 

Alas, every last error that I just described can be found in well-meaning initiatives like this proposed "Internet Bill of Rights," which would have none of the intended benefits and a whirling myriad of horrific, unintended consequences. 

I mentioned Jaron Lanier's notion that merits repetition. It was not that we should own all information about ourselves, but that we should have strong interest in our data, and get to benefit from anyone who uses it... much in the way that patent laws weren't originally meant to prevent use of inventions, but to ensure the inventors got paid a fair share, so that sharing would make sense to them.

Which brings us back around to the stunningly foolish assertion that any kind of political agitation - or even law - can possibly thwart the arrival of easy-cheap face recognition. Again, the fear of sliding into an Orwellian surveillance/control state is genuine and terrifying! Alas, it is trivialized and lobotomized by those who think they can stymie the "surveillance" part, by howling at the mighty "don't look at us!" 

It is the "control" part that can still be prevented, via the method we used with increasing effectiveness for 200 years -- answering surveillance with sousveillance. 

== An Addendum... and Alert... on "war with Iran" ==

John Bolton and Mike Pompeo have long sought war with Iran. Now, Iranian President Hassan Rouhani has repeated an earlier threat to block ships from leaving the Persian Gulf if the U.S. government continues to seek to block Iranian oil exports. Rouhani’s comments came a day after the U.S. sent an aircraft carrier to the Persian Gulf on Monday, ending the longest period the U.S. had gone without a carrier in the Gulf over the past two decades.

Examine the pieces. The US Navy has been trying to ramp down tensions in the Gulf, and especially to keep its most valuable assets out of a friction zone and potential death trap. All the more so since the Persian Gulf is a lot less important to us, now that the U.S. has achieved effective energy independence. We shouldn't turn our backs on the region. On the other hand, it is an opportunity to stop being in reflex-reactive mode, no longer letting that crazy realm control what we do.

Ah, but our professionals have (alas) insane bosses. By clamping on the Iranian economy and ordering the USS Stennis into the gulf, Pompeo, Bolton and the Bannonites are setting a stage for what the Saudis and Trump and Benjamin Netanyahu all desperately need, an international distraction from their own mounting troubles. 

A “Tonkin/Reichstag/Gleiwitz/Sarajevo/Remember-the-Maine” “Incident” (look them all up!) is something that our professionals in the military/intel/diplomatic corps have skillfully forestalled, till now. But for how much longer? 

Especially when even the Iranian mullahs would benefit from a brief, colorful, pippety-poppety “tomahawk war” that does no major harm, but gives them an excuse to clamp down on their own millions of young liberals? 

Of course, the top winner of such a dog-wag is blatantly obvious. Only one man can possibly rake in all the marbles… when V. Putin steps up to spread the Russian umbrella and “protect our neighbor.” At which point Russia gets the Persian satrapy it has sought for 300 years. Thanks to Vlad’s agents in the White House.