Eaten Inside Out

 
 
[01.27.2020] Newsletter: MM

See if you can spot the "aha" moment in Yuval Noah Harari's extraordinary TED talk on the looming threat to democracy posed by artificial intelligence. It's not just in the fact that his live stage presentation in Vancouver was accomplished by means of hologram imagery transmitted from Tel Aviv (for the video click here; or see the written transcript here).

Nor is it in his astute description of fascism as the ugly stepbrother of nationalism -- in its denial of individual identity and its demand for supreme obligation to the state. Nor is it even in the way that resistance to such subordination is overcome by making it appear beautiful in the fascist mirror.

Nor is the real epiphany in the way he describes how the new technological realities -- i.e., the merger of artificial intelligence and machine learning -- may eliminate the current advantages the democratic marketplace enjoys over the ultimately distorting central-planning approach of authoritarian regimes. We've addressed that theme before (MM 09.17.18 | Artificial Intelligence).

No, what's striking here, and what forms the background of our discussion, is the looming merger of information technology with biotechnology, and how it leads to the creation of algorithms that know us better than we know ourselves. Of course, we've sensed this all along in the commercial world, where ads "somehow" instinctively reflect our personal profile. "So what?" we might say, since we know deep down that nothing is free and that whatever seems free simply means we are the product. Besides, we tell ourselves, it's just marketing, and our individual agency gives us the freedom to respond to commercial solicitations as we choose. The real insight is actually in the next step down the rabbit hole.

Such is the moment when we truly recognize how these machine-learning algorithms have insinuated themselves below the conscious level and tapped into our very subconscious -- that is what's really meant when we say these algorithms know us better than we know ourselves. Immense (and increasing) data flows emanate from all that is measurable -- what we write, what we purchase, what we search, our likes and dislikes, our connections, some of what we say, and how we interact -- and those flows can be and are being vacuumed up, triangulated, and distilled into algorithms that, again, know us right down to the emotional level. Soul hacking.
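
To make the "triangulated and distilled" step a little more concrete, here is a purely illustrative sketch in Python. Nothing in it comes from Harari or from any real platform: the signal names, the weights, and the single "susceptibility" score are all invented assumptions standing in for what an actual machine-learning pipeline would infer from vast streams of behavioral data.

```python
# Toy sketch only -- hypothetical behavioral signals collapsed into one score.
# Real systems learn such weights from billions of interactions; every feature
# name and number here is invented for illustration.
import math

# Hypothetical signals harvested from one user's measurable behavior,
# each normalized to the range 0..1.
user_signals = {
    "late_night_scrolling":  0.8,   # when we interact
    "outrage_click_rate":    0.6,   # what we click and like
    "political_searches":    0.4,   # what we search
    "purchase_impulsivity":  0.7,   # what we purchase
}

# Hand-tuned weights standing in for what a trained model would fit.
weights = {
    "late_night_scrolling":  1.2,
    "outrage_click_rate":    2.0,
    "political_searches":    0.9,
    "purchase_impulsivity":  1.5,
}

def emotional_susceptibility(signals: dict, weights: dict) -> float:
    """Collapse many measurable behaviors into one 0..1 'how easily moved' score."""
    z = sum(weights[k] * v for k, v in signals.items()) - 3.0  # bias term
    return 1.0 / (1.0 + math.exp(-z))                          # logistic squash

print(f"susceptibility score: {emotional_susceptibility(user_signals, weights):.2f}")
```

The point of the toy is only the shape of the operation: many small, individually innocuous measurements are combined into a single number about us that we never see.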

Yes, we've seen this movie before. Two years ago we discussed China's experience (MM 01.15.18 | Social Engineering (China)) and even noted then that this could come within America's embrace as well. But, at the time, we fancied ourselves and our comparatively free democracy as somehow and somewhat less vulnerable. Silly us.

As we learned from the China example, that which can be measured can be controlled. Our very autonomy becomes increasingly compromised as the algorithms learn how to tap into, feed, and thereby manipulate our emotions. And don't kid yourself: our big life decisions are made emotionally and then (maybe) justified intellectually.

May we thus remain vigilant, especially as we embark on this election cycle, about who and what controls the data, knowing how easily it can be used (weaponized) in the same way it has been applied to sell product . . . or, as power brokers recognize when they gather for a dinner meeting: if you're not at the table, you're on the menu.

Steve Smith