[M] a bright and shiny hell

Subject: [M] a bright and shiny hell
From: rsw@therandymon.com (RS Wood)
Newsgroups: misc.news.internet.discuss, dictator.general
Organization: blocknews - www.blocknews.net
Date: Oct 26 2017 02:27:49
From the «George you prescient bastard, you» department:
Title: A bright and shiny hell
Author: Charlie Stross
Date: Sun, 10 Sep 2017 09:55:24 -0400
Link: http://www.antipope.org/charlie/blog-static/2017/09/a-bright-and-shiny-hell.html

(Apologies for blogging so infrequently this month. I'm currently up to my
elbows in The Labyrinth Index, with a tight deadline to hit if the book's going
to be published next July. Blogging will continue to be infrequent, but
hopefully as provocative as usual.)

Remember Orwell's 1984 and his description of the world ahead—"if you want a
vision of the future, imagine a boot stamping on a human face, forever"?

This is the 21st century, and we can do better.

George got the telescreens and cameras and the stench of omnipresent
surveillance right, but he was writing in the age of microfilm and 3x5 index
cards. Data storage was prodigiously expensive and mass communication networks
were centralized and costly to run — it wasn't practical for amateurs to set up
a decentralized, end-to-end encrypted shadow network tunnelling over the public
phone system, or to run private anonymous blogs in the classified columns of
newspapers. He was also writing in the age of mass-mobilization of labour and
intercontinental warfare. Limned in the backdrop to 1984 is a world where atom
bombs have been used in warfare and are no longer used by the great powers, by
tacit agreement. Instead, we see soldiers and machine-guns and refugees and the
presentation of inevitable border wars and genocides between the three giant
power blocs.

Been there, done that.

What we have today is a vision of 1984 disrupted by a torrent of data storage.
Circa 1972-73, total US manufacturing volume of online computer storage — hard
drives and RAM and core memory, but not tape — amounted to some 100Gb/year.
Today, my cellphone has about double that capacity. I'm guessing that my desk
probably supports the entire planetary installed digital media volume of 1980.
(I'm looking at about 10Tb of disks ...) There's a good chance that anything
that happens in front of a camera, and anything that transits the internet,
will be preserved digitally into the indefinite future, for however long some
major state or corporate institution considers it of interest. And when I'm
talking about large-scale data retention, just to clue you in, Amazon AWS
already offers a commercial data transfer and storage service using AWS
Snowmobile[1], whereby a gigantic trailer full of storage will drive up to the
loading bay of your data center and download everything. It's currently good
for up to 100PB per Snowmobile load. (1PB is a million gigabytes; 1EB is a
billion gigabytes; ten snowmobile loads is 1EB, or about 10,000,000 1973's
worth of global hard drive manufacturing capacity). Folks, Amazon wouldn't be
offering this product if there wasn't a market for it.
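The back-of-envelope conversions above are easy to check. Here's a minimal sketch, using only the figures stated in the text (100Gb/year of 1973 manufacturing, 100PB per Snowmobile load):

```python
# Back-of-envelope check of the storage figures in the post.
# All inputs are taken from the text; this just verifies the arithmetic.
GB = 1
PB = 1_000_000 * GB   # 1PB is a million gigabytes
EB = 1_000 * PB       # 1EB is a billion gigabytes

manufacturing_1973 = 100 * GB   # total US online-storage output, circa 1972-73
snowmobile_load = 100 * PB      # capacity of one AWS Snowmobile trailer

loads_per_exabyte = EB / snowmobile_load
years_of_1973_per_exabyte = EB / manufacturing_1973

print(int(loads_per_exabyte))                  # 10 Snowmobile loads per exabyte
print(f"{years_of_1973_per_exabyte:,.0f}")     # 10,000,000 "1973s" per exabyte
```

The numbers line up: ten Snowmobile loads make an exabyte, which is ten million times the entire 1973 output.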

These heaps and drifts of retained data (and metadata) can be subjected to
analytical processes not yet invented — historic data is still useful. And some
of the potential applications of neural network driven deep learning and
machine vision are really hair-raising. We've all seen video of mass
demonstrations over the past year. A paper to be presented at the IEEE
International Conference on Computer Vision Workshops (ICCVW) introduces a
deep-learning algorithm that can identify an individual even when part of their
face is obscured. The system was able to correctly identify a person concealed
by a scarf 67 percent of the time against a "complex" background[2]. Police
already routinely record demonstrations: now they'll be able to apply offline
analytics to work out who was there and track protestors' activities in the
long term ... and coordinate with public CCTV and face recognition networks[3]
to arrest them long afterwards, if they're so inclined.

It turns out that facial recognition neural networks can be trained to
accurately recognize pain[4]! The researchers were doubtless thinking of
clinical medical applications — doctors are bad at objectively evaluating
patients' expressions of pain and patients often don't self-evaluate
effectively — but just think how much use this technology might be to a regime
bent on using torture as a tool of social repression[5] (like, oh, Egypt or
Syria today). They also appear to be better than human beings at evaluating
sexual orientation of a subject[6], which might be of interest in President
Pence's Republic of Gilead, or Chechnya, or Iran. (There's still a terrible
false positive rate, but hey, you can't build an algorithmic dictatorship
without breaking heads.)
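That "terrible false positive rate" is a base-rate problem, and a quick Bayes calculation shows why. The numbers below are purely illustrative assumptions, not figures from the cited study:

```python
# Base-rate sketch: a seemingly accurate classifier still produces mostly
# false positives when the trait it flags is rare in the screened population.
# All three input numbers are illustrative assumptions, not study figures.
sensitivity = 0.90          # P(flagged | has trait)
false_positive_rate = 0.10  # P(flagged | no trait)
base_rate = 0.05            # assumed prevalence in the screened population

# Total probability of being flagged at all
p_flagged = sensitivity * base_rate + false_positive_rate * (1 - base_rate)

# Bayes' rule: P(has trait | flagged)
precision = sensitivity * base_rate / p_flagged
print(f"{precision:.2f}")   # ~0.32: roughly two of every three flags are wrong
```

Even with 90% sensitivity and only a 10% false positive rate, most of the people an algorithmic dictatorship rounds up would be misidentified.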

(Footnote: it also turns out that neural networks and data mining in general
are really good at reinforcing the prejudices of their programmers, and
embedding them in hardware. Here's a racist hand dryer[7] — its proximity
sensor simply doesn't work on dark skin! Engineers with untested assumptions
about the human subjects of their machines can wreak havoc.)

All of this is pretty horrific — so far, so 2017 — but I'd like to throw two
more web pages in your face. Firstly, the Gerasimov Doctrine[8] which appears
to shape Russian infowar practices against the west. We've seen glaring
evidence of Russian tampering in the recent US presidential election, including
bulk buying of micro-targeted Facebook ads[9], not focussing on particular
candidates but on party-affiliated hot-button issues such as race, gay rights,
gun control, and immigration. (I'm not touching the allegations about bribery
and Trump with a barge pole — that way lies the gibbering spectre of Louise
Mensch — but the evidence for the use of borderline-illegal advertising to
energize voters and prod them in a particular direction looks overwhelming.)
Here's a translation of Gerasimov's paper, titled The Value of Science Is in the
Foresight: New Challenges Demand Rethinking the Forms and Methods of Carrying
out Combat Operations[10]. As he's the Russian army Chief of General Staff,
what he says can be taken as gospel, and he's saying things like, "the focus of
applied methods of conflict has altered in the direction of the broad use of
political, economic, informational, humanitarian, and other nonmilitary [my
emphasis] measures — applied in coordination with the protest potential of the
population". This isn't your grandpa's ministry of propaganda. Our social media
have inadvertently created a swamp of "false news" in which superficially
attractive memes outcompete the truth because humans are lousy at
distinguishing between lies which reinforce their existing prejudices and an
objective assessment of the situation. And this has created a battlefield where
indirect stealth attacks on elections have become routine to the point where
savvy campaigns pre-emptively place bait for hackers[11].

There are a couple of rays of hope, however. The United Nations Development
Program recently released a report, Journey to extremism in Africa: drivers,
incentives and the tipping point for recruitment[12] that pointed out the
deficiencies in the Emperor's wardrobe with respect to security services.
Religion and ideology are post-hoc excuses for recruitment into extremist
groups: the truth is somewhat different. "The research specifically set out to
discover what pushed a handful of individuals to join violent extremist groups,
when many others facing similar sets of circumstances did not. This specific
moment or factor is referred to as the 'tipping point'. The idea of a
transformative trigger that pushes individuals decisively from the 'at-risk'
category to actually taking the step of joining is substantiated by the Journey
to Extremism data. A striking 71 percent pointed to 'government action',
including 'killing of a family member or friend' or 'arrest of a family member
or friend', as the incident that prompted them to join. These findings throw
into stark relief the question of how counter-terrorism and wider security
functions of governments in at-risk environments conduct themselves with regard
to human rights and due process. State security-actor conduct is revealed as a
prominent accelerator of recruitment, rather than the reverse." In fact, the
best defenses against generating recruits for extremist organizations seemed to
be things like reduced social and economic exclusion (poverty), improved
education, having a family background (peer pressure), and not being on the
receiving end of violent repression. Because violence breeds more violence —
who knew? (Not the CIA and USAF with their typical "oops" response whenever a
drone blows up a wedding party they've mistaken for Al Qaida Central.)

So, let me put some stuff together.

We're living in a period where everything we do in public can be observed,
recorded, and will in future provide the grist for deductive mills deployed by
the authorities. (Hideous tools of data-driven repression are emerging almost
daily without much notice, whether through malice or because they have socially
useful applications and the developers are blind to the potential for abuse.)
Foreign state-level actors and non-state groupings (such as the new fascist
international[13] and its hive of internet-connected insurgents) are now able
to use data mining techniques to target individuals with opinions likely to
appeal to their prejudices and inflame them into activism. Democracy is
directly threatened by these techniques and may not survive in its current
form, although there are suggestions that what technology broke, technology
might help fix[14] (TLDR: blockchain-enabled e-voting, from the European
Parliament Think Tank). And there are some signs that our existing
transnational frameworks are beginning to recognize that repressive policing is
one of the worst possible shields against terrorism.

Social solidarity. Tolerance. Openness. Transparency that runs up as well as
down the personal-institutional scale. And, possibly, better tools for
authenticating public statements such as votes, tweets, and blog essays like
this one. These are what we need to cleave to if we're not going to live out
our lives in a shiny algorithmic big data hellscape.

Links:
[1]: https://aws.amazon.com/snowmobile/ (link)
[2]: https://motherboard.vice.com/en_us/article/mbby88/ai-will-soon-identify-protesters-with-their-faces-partly-concealed?utm_source=mbtwitter (link)
[3]: https://arstechnica.co.uk/tech-policy/2017/06/police-automatic-face-recognition/ (link)
[4]: https://www.researchgate.net/publication/224680068_Pain_Recognition_Using_Artificial_Neural_Network (link)
[5]: http://www.middleeasteye.net/in-depth/features/new-accounts-suggest-severe-torture-egypt-ongoing-not-decreasing-92831992 (link)
[6]: https://osf.io/zn79k/ (link)
[7]: https://www.youtube.com/watch?v=8Eo9Xdrvf-E (link)
[8]: http://www.politico.com/magazine/story/2017/09/05/gerasimov-doctrine-russia-foreign-policy-215538 (link)
[9]: https://www.nytimes.com/2017/09/06/technology/facebook-russian-political-ads.html (link)
[10]: http://usacac.army.mil/CAC2/MilitaryReview/Archives/English/MilitaryReview_20160228_art008.pdf (link)
[11]: https://www.inquisitr.com/4199432/macron-campaign-says-email-leak-was-fake-news-deliberate-move-to-sow-doubt-compares-to-u-s-election/ (link)
[12]: http://journey-to-extremism.undp.org/content/downloads/UNDP-JourneyToExtremism-report-2017-english.pdf (link)
[13]: https://balticscholars.stanford.edu/?p=951 (link)
[14]: https://bitcoinmagazine.com/articles/european-parliament-document-suggests-future-of-democracy-linked-to-blockchain-enabled-e-voting-1475701523/ (link)

