Friday, October 27, 2017

We're Building a Dystopia Just to Make People Click on Ads

We're building an artificial intelligence-powered dystopia, one click at a time, says techno-sociologist Zeynep Tufekci. In an eye-opening talk, she details how the same algorithms companies like Facebook, Google and Amazon use to get you to click on ads are also used to organize your access to political and social information. And the machines aren't even the real threat. What we need to understand is how the powerful might use AI to control us - and what we can do in response.
Outline:
  1. Terminator AIs are a distant threat and 1984 is not the correct dystopia for the 21st century.
  2. The present threat is not in what AI will do to us on its own, but in how it will be used by [the people who control AI] to manipulate us in extremely negative ways.
  3. AI is a jump in category - it's "a whole different world". It's not just 'the next thing'. 
  4. With prodigious potential come prodigious risks.
  5. We "roll our eyes" at online ads, but "persuasion architecture" works. Huge collections of data allow AIs to very accurately guess who will respond to what kinds of persuasion.
  6. The data is far too complex for humans to understand. Tufekci says "It's giant matrices with... maybe millions of rows and columns". (I'd expect significantly more than 2 dimensions).
  7. "It's like we're not programming anymore, we're growing intelligence that we don't truly understand."
  8. This only works with immense amounts of data. Deep surveillance on all of us is encouraged.
  9. Ethics in targeting is an issue. "What if the system targets people who are bipolar and about to enter the manic phase?" Such people are prone to overspending and compulsive gambling. There's no way to determine whether this is the result of AI targeting.
  10. A lot of this stuff is "off the shelf".
  11. YouTube auto-play follows increasing order of extremism. "It's like you're never hard-core enough for YouTube."
  12. With nobody minding the ethics of the store, these sites can profile people to find people susceptible to more extremism. It's cheap and easy to target any category.
  13. Trump's campaign used Facebook "dark posts" - ads visible only to the strictly targeted recipients, with privacy guaranteed - to demobilize voters.
  14. Facebook - multiple election manipulation experiments proved workable; what if they decided to support a candidate?
  15. "Little by little, public debate is becoming impossible," because the most-used systems for public debate only let you see what you already believe.
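The talk doesn't include code, but the kind of click-prediction scoring described in points 5, 6, and 15 can be sketched in miniature. Everything below is invented for illustration - the feature names, weights, and users are hypothetical, and real systems have millions of rows (users) and columns (behavioral features), which is exactly why no human can inspect them directly:

```python
# Toy sketch of a "persuasion architecture" scoring loop (purely
# illustrative; not how any real ad platform is actually built).
import math

# Hypothetical behavioral feature columns:
# [late_night_browsing, gambling_sites_visited, impulse_purchases]
users = {
    "user_a": [0.9, 0.8, 0.7],  # heavy late-night browsing, gambling interest
    "user_b": [0.1, 0.0, 0.2],  # little matching behavior
    "user_c": [0.6, 0.9, 0.9],
}

# Weights a model might "learn" for one ad campaign (made up here)
weights = [1.2, 2.5, 1.8]
bias = -2.0

def click_probability(features):
    """Logistic score: how likely this user is to respond to the ad."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1 / (1 + math.exp(-z))

# Target only the users the model judges most susceptible
scores = {u: click_probability(f) for u, f in users.items()}
targeted = [u for u, p in sorted(scores.items(), key=lambda kv: -kv[1])
            if p > 0.5]
print(targeted)  # the most susceptible users, most vulnerable first
```

The point of the sketch is Tufekci's: the same scoring loop works whether the "ad" is shoes, a gambling site aimed at someone entering a manic phase, or a demobilizing political message - the code never knows the difference.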
Conclusions:
  1. We're building this structure of surveillance authoritarianism merely to get people to click on ads, but what if it gets worse? If authoritarianism is using overt fear to terrorize, we'll be scared, but we'll know it. We'll hate it, we'll resist it.
  2. If the powers that be are quietly watching, judging and nudging, predicting and identifying "troublemakers", using personal individual weaknesses/vulnerabilities...
    ... if they're doing it at scale through our private screens so we don't know what others are seeing...
    ...That authoritarianism will envelop us like a spider's web, and we may not even know that we're in it.
  3. The structure of the architecture is the same whether you're selling shoes or politics. The algorithms don't know the difference. 
  4. Social media is great in many ways. It's not that people are maliciously and deliberately trying to wreck the world, but the structures and business models are still very dangerous. There's no simple solution.
  5. Restructuring is needed. We have to face the lack of transparency and the structural challenges of machine learning. These structures organize how we plan and function; they control what we can and can't do.
  6. We have to mobilize our tech, creativity, and politics so we can build AI that supports human goals but is constrained by human values. 
  7. We need a digital economy where our data and our attention is not for sale to the highest bidding authoritarian or demagogue!
We do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that we must face this prodigious menace open-eyed, and now.


For a more holistic view of this topic, see the Hypernormalization BBC Documentary.
Here are my notes/outline on that.
