summing up 87

summing up is a recurring series of topics & insights on how we can make sense of computers, which compose a large part of my thinking and work. drop your email in the box below to get it straight into your inbox, or find previous editions here.

How Technology Hijacks People’s Minds, by Tristan Harris

The ultimate freedom is a free mind, and we need technology to be on our team to help us live, feel, think and act freely.

We need our smartphones, notification screens and web browsers to be exoskeletons for our minds and interpersonal relationships that put our values, not our impulses, first. People’s time is valuable. And we should protect it with the same rigor as privacy and other digital rights.

the way we use, create and foster technology today will be looked back on the same way we now look back at asbestos in walls & floors or naive cigarette smoking. creating useful technology is not about creating a need in the user, but about creating things that are good for the user.

Build a Better Monster: Morality, Machine Learning, and Mass Surveillance, by Maciej Cegłowski

We built the commercial internet by mastering techniques of persuasion and surveillance that we’ve extended to billions of people, including essentially the entire population of the Western democracies. But admitting that this tool of social control might be conducive to authoritarianism is not something we’re ready to face. After all, we’re good people. We like freedom. How could we have built tools that subvert it?

We need a code of ethics for our industry, to guide our use of machine learning, and its acceptable use on human beings. Other professions all have a code of ethics. Librarians are taught to hold patron privacy sacred; doctors pledge to “first, do no harm”. Lawyers, for all the bad jokes about them, are officers of the court and hold themselves to high ethical standards.

Meanwhile, the closest we’ve come to a code of ethics is “move fast and break things”. And look how well that worked.

the tools we shape, shape us and create a new world. but technology and ethics aren't easy to separate – that new world doesn't necessarily have to be a better world for all of us. maybe just for some.

Is it really “Complex”? Or did we just make it “Complicated”? by Alan Kay

Even a relatively small clipper ship had about a hundred crew, all superbly trained whether it was light or dark. And that whole idea of doing things has been carried forward, for instance in the navy. If you take a look at a nuclear submarine or any other navy vessel, it’s very similar: a highly trained crew, about the same size as a clipper’s. But do we really need about a hundred crew? Is that really efficient?

The Airbus A380 and the biggest 747 can be flown by two people. How can that be? Well, the answer is you just can’t have a crew of about a hundred if you’re gonna be in the airplane business. But you can have a crew of about a hundred in the submarine business, whether it’s a good idea or not. So maybe these large programming crews that we have actually go back to the days of machine code, but might not have any place today.

Because today – let’s face it – we should just be programming in terms of specifications or requirements. So how many people do you actually need? What we need is the number of people it takes to actually put together a picture of what the actual goals and requirements of this system are, from the vision that led to the desire to do that system in the first place.

much of our work on technology, projects and ideas comes down to focusing on everything but the actual requirements and the original problem. after all, it doesn't matter how exceptional a map you can draw if someone asks for directions to the wrong destination.


Want more ideas like this in your inbox?

My letters are about long-lasting, sustainable change that fundamentally amplifies our human capabilities and raises our collective intelligence through generations. I'd love to have you on board.