Commenting on the absurdity of the social media ‘influencer’ craze, Joey Borelli said:
This is the world we live in now, folks: the Look At Me Generation, where everyone finds a way to make everything about themselves and make sure you see it — what a legacy to leave behind.
He struck a chord with me about social media and its impact on our culture. But it’s not only about influencer cults, narcissism, rampant stupidity, or egocentricity.
These traits were already around, but social media has magnified them. The outcomes include the fake news crisis, cults of personality, harassment and bullying, conspiracy theories, radicalization and polarization, teen suicide, even genocide — all enabled through social media over the past few years.
I am shocked by the glaring irresponsibility of the tech companies behind these apps and websites. Their tools make it easier to connect with friends, but they also spy on us, track us, and shove ads in our faces, while the companies sit back and let the rest of society deal with the consequences.
They employ the brightest minds to do one thing: build tools that exploit our weaknesses. And for what? Money? Power? Ad revenue?
There’s an obvious analogy to the tobacco industry, which spent decades trying to make us believe — through blatant lies, fabricated studies, and lobbying of policymakers — that smoking was actually healthy.
Big Tech tells you how important its mission is and all the good it is doing, but in the end you’re not their customer; you’re their product. What if just a portion of those bright minds actually worked on important problems? You know, battling climate change? Fighting inequality? Making the world a better place?
We should be deeply concerned about the impact of social media on society. We need to be willing to have the difficult discussions about what kind of world we want to live in.
The point is best made in this quote attributed to Marshall McLuhan:
We shape our tools and thereafter our tools shape us
We first create our tools, and then they change our society and culture. Every new medium creates new environments for people to act in, from the printing press, the bicycle, and the car to the phone, radio, and television.
The two come together; you can’t have one without the other. And that comes with a responsibility, because technology is not neutral. As a culture, we always pay the price for our technology. You can’t have the ship without the shipwreck, the car without the car accident, or social media without… Well, what?
This is not a technological problem but a problem of perspective. We embrace new technologies without ever considering how we might be changed by them. We don’t adjust for system failures; we move on. And in moving on without assessing our shortcomings, we learn nothing.
And if we don’t act on what we learn, it means we don’t care how technology changes our society — for better or for worse.
A different world is possible.
A world where we can build something better. We can decide how we become shaped by our tools. We can choose to use tools deliberately designed to shape us into better people. Imagine our apps not trying to exploit us, but instead trying to help us be better human beings, augmenting our thinking, helping build a better life for us all, and a better future for our children.
A world where we create tools that truly make things cheaper, faster, and better, without destroying the very fabric of our humanity. One where we make better choices about how technology affects society. One where we can say no when technology does not align with our values.
We first shape our tools, then those tools shape us
If we focus on how our tools shape us, we can encourage new technologies, ideas, and visions that amplify human capabilities while also producing tremendous wealth for our society.
Want more ideas like this in your inbox?
My letters are about long-lasting, sustainable change that fundamentally amplifies our human capabilities and raises our collective intelligence across generations. I’d love to have you on board.