Are the algorithms going to lead us into war?
Greetings all,
First, on the podcast this week, my guest (John Biggs, https://twitter.com/johnbiggs, https://www.linkedin.com/in/johndbiggs/) is an entrepreneur, writer, and technologist. He was the East Coast editor at TechCrunch for fifteen years, was an editor at CoinDesk, and is currently the Editor-in-Chief at Gizmodo. In this conversation we dive into a wide range of topics, including 5G, the future of work, blockchains, recommended books, John’s entrepreneurial ventures, his new book (Get Funded), his previous books (https://johnbiggsbooks.com/), and practical advice for entrepreneurs.
Check it out: Emerging technology, startup funding, and books for entrepreneurs :: with John Biggs
FYI, you can listen to the show in your favorite podcast app (just search for “Ventures”).
Dataism, the algorithms, and the potential for war
Born in 1980, I was in the first cohort of young people to serve as guinea pigs for smartphones and social media. Even though we were in college and still appreciated the analog world we grew up in, screens and connectivity took over. We became sick. All the things I was taught growing up - avoiding vanity, pursuing humility, being kind and empathetic - were at odds with what social media was encouraging us to do.
That was my first “yellow flag.” Had I been smarter and wiser I would have stopped back then, but I got caught up in technology entrepreneurship. I was among a group of founders building apps that leveraged the Hook Model, a proven psychological framework for making users addicted to our services.
We were relatively innocent back then. After all, we didn’t yet have easily accessible AI. We were just trying to make a bit more money, impress our friends and investors, and champion stories of our users being happy and connected with the world.
But then a few services won everyone’s attention and started applying AI. The machines put the Hook Model on steroids. They knew us better than we knew ourselves, and they successfully hacked our brains to make us addicted to scrolling/tapping/swiping so we would watch and click on ads.
That was just the start.
I’m an optimist. When my colleagues and I devoured Harari’s books in the mid-2010s, we were convinced that the new “ism” - Dataism, trust in the algorithms’ handling of our data - was going to be a good thing (maps, health, relationships, work, etc.). We didn’t anticipate Brexit, Trump, or the sweeping global nationalism that the algos would encourage.
If you are not familiar: the “algos” serve up feeds customized to each individual. (Check out The Social Dilemma on Netflix.) People on the right and left of the political spectrum (or really any spectrum) are whipped into a frenzy with posts and news designed for engagement and ad-viewing/clicking. The byproduct is all sorts of fake news and echo-chamber content that is making our world more polarized and less intelligent. My guest on the podcast this week, John Biggs, has been on the front lines as a writer/editor of popular content at TechCrunch and Gizmodo for the past couple of decades, and he admitted that audiences are clearly getting less intellectual. There is little debate here.
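To make that mechanism concrete, here is a minimal, purely illustrative sketch in Python - not any platform’s actual code, and the names (Post, predicted_click_prob, rank_feed) are hypothetical - of what “optimizing a feed for engagement” amounts to: rank whatever each user is most likely to click, with nothing in the objective about accuracy, balance, or well-being.

```python
# Illustrative sketch only: an engagement-maximizing feed ranker.
# Field names are hypothetical stand-ins for the per-user models
# real platforms train on watch/click history.

from dataclasses import dataclass
from typing import List


@dataclass
class Post:
    post_id: str
    topic: str
    predicted_click_prob: float  # per-user model output: chance this user engages


def rank_feed(candidates: List[Post], top_n: int = 10) -> List[Post]:
    """Return the posts most likely to keep this user scrolling and clicking.

    Note what is *not* in the objective: truthfulness, balance, or the
    user's long-term well-being. Whatever correlates with engagement
    (often outrage and confirmation of existing views) rises to the top
    as a side effect.
    """
    return sorted(candidates, key=lambda p: p.predicted_click_prob, reverse=True)[:top_n]
```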
All of this is fun and games until people start getting hurt.
The riots and killings in Myanmar, the dramatic rise in teenage depression and suicide, the countless extremists (on all sides) who damage property and hurt people - all of this has been intensifying over the past decade. All over the world people are angry, depressed, and anxious (often rightfully so), and this is leading to outrage.
While of course I don’t believe that “the algos” are the root cause of our problems, they are certainly helping to magnify them. The machines are pouring gasoline on the fires in our hearts, and collectively we are experiencing much more harm than good.
So. Yes. Unfortunately, many of us won’t be surprised if the 2020s bring civil wars in multiple “developed” countries around the world. Many smart people are thinking about how to solve this problem, but they aren’t the ones getting media attention.
Therefore, it starts with us. I think we - as technology entrepreneurs, investors, and tech-savvy people - need to do our part to fix the problem we unwittingly helped create. Let’s build and fund services that promote multi-perspective discourse.
I think this will best be done in small(er) groups. We need a global community of “nano-influencers” who can reach their friends and encourage thoughtful communication. Because of financial forces, it will probably be impossible to completely get rid of the “mega-influencers” who capture most of our traditional and social media attention, make TV shows, get elected to office, and so on - but if a global uprising of “nano-influencers” can tip the balance a bit, that will be a good thing.
So, back to optimism. I’m trying to do my part by building Satchel, which I’ve talked about in detail here. But in general I see hope in the rise of the passion economy, i.e. people making a living by creating content for - and delivering that content directly to - a relatively small number of people. In this economy, the role of social and traditional media will, in my opinion, diminish. It won’t go away, of course, but I’m hopeful there will be plenty of opportunities for people to learn from and engage with people they know more personally (maybe even within the same physical community - wouldn’t that be wild?).
Have a great rest of your week, and stay safe out there.
~Will