A few months ago, Facebook wanted us to send them copies of our nudes so that
they could block those images if they were later uploaded by someone else. I don't
even want to know how they came up with such utter nonsense. But regardless of
that, it is a perfect example of what I call digital laziness. Instead of
fixing the actual, hard and sometimes messy problem, we come up with an easy way out.
I often get approached with similar requests – well, not nudes – but similar
nonsensical ideas. We want 10,000 visitors on our website! We want to sell our
high touch consulting service directly from our website! We don't want to do
anything and still be able to grow our business!
We can use websites, drip campaigns, newsletters and digital marketing
strategies to get more and better clients. But we'll fail utterly if we don't
first establish the fundamental goals we're trying to achieve. Instead, we have to see
the above as tools we can use to reach these goals and augment parts of our business.
summing up is a recurring series of interesting articles, talks and insights on
culture & technology that compose a large part of my thinking and work. Drop
your email in the box below to get it – and much more – straight in your inbox.
Our computers have lured us into a cage of our own making. We’ve reduced
ourselves to disembodied minds, strained eyes, and twitching, clicking, typing
fingertips. Gone are our arms and legs, back, torsos, feet, toes, noses,
mouths, palms, and ears. When we are doing our jobs, our vaunted knowledge
work, we are a sliver of ourselves. The rest of us hangs on uselessly until we
leave the office and go home.
Worse than pulling us away from our bodies, our devices have ripped us from
each other. Where are our eyes when we speak with our friends, walk down the
street, lay in bed, drive our cars? We know where they should be, and yet we
also know where they end up much of the time. The tiny rectangles in our
pockets have grabbed our attention almost completely.
These days almost every room is equipped with electricity, light and buttons to control it.
It's so common that we hardly trouble ourselves thinking about it. It is not a
device we carry around in our pockets, have to charge up every night and buy a
new version every two years – like a flashlight. Imagine how small and lonely
a world like this would be, where everyone carried their own personal
flashlight, seeing only one thing at a time and always having one hand busy.
Teaching machines were going to change everything. Educational television was
going to change everything. Virtual reality was going to change everything. The
Internet was going to change everything. The Macintosh computer was going to
change everything. The iPad was going to change everything. And on and on and on.
Needless to say, movies haven’t replaced textbooks. Computers and YouTube
videos haven’t replaced teachers. The Internet has not dismantled the
university or the school house. Not for lack of trying, no doubt. And it might
be the trying that we should focus on as much as the technology.
The transformational, revolutionary potential of these technologies has always
been vastly, vastly overhyped. And it isn't simply because educators
or parents are resistant to change. It's surely in part because the claims that
marketers make are often simply untrue.
The hype surrounding our technologies is sometimes so pervasive that raising
skepticism can often be seen as one's failure to recognize that the hype
is deserved. This is the game we're playing. It's no longer about the real
transformational power, about real change & potential, but mostly about
a superficial pop culture.
Think about how much has changed. Some things have changed beyond
recognition; sadly, the way we build software hasn't. There is nothing you do
day by day that wouldn't have been familiar to me 25 years ago. Yes, you're
using more powerful machines, yes you're using browsers and all this other
stuff. But the way you work day to day has not improved.
Most of software development today is based on myth, superstition or arrogance.
And this won't change until we're willing to be humble enough to admit when
we're wrong. Only then can we find out how the world actually works and do
things based on that knowledge.
This year I tested for the 2nd Dan black belt in Taekwondo, after preparing for
almost three and a half years. Black belt exams are an interesting beast. They
feature an extremely deep curriculum, but most importantly there is a minimum
time requirement for advancing from one rank to the next. The reason is a very
simple one: time is needed for techniques and the individual to mature. The
point is that it is a journey, not a result. There are no tricks, hacks or silver bullets.
Similarly, tricks, hacks or silver bullets never made a big impact for my
business. And honestly, they have never made a big impact for any of my clients
either. If there is a trick or hack it's always been crafting holistic,
consistent and high-quality experiences.
Crafting these experiences takes time. Only when all parts of a (digital)
business strategy (positioning, right channels, product/service-market-fit)
align do you get something that is better than the sum of its parts. And an
enduring process, which will bring new clients, growth and revenue for a long time to come.
Exactly 50 years ago today, a man invented the future.
If you've been following my writing, talks and ideas you've certainly heard his
name: Doug Engelbart.
On 9th December 1968, he and his team demonstrated the prototype of his vision
at the Fall Joint Computer Conference in San Francisco in front of about 1,000 people.
This demo introduced so many key concepts we still use today: the computer
mouse, windows, graphics, video conferencing, word processing, copy & paste,
hypertext, revision control, a collaborative real-time editor and much more. No
wonder it's also known as the Mother of all Demos.
What's so striking about Engelbart's demo however isn't how much has changed
since then, but how many things have stayed the same.
To celebrate this somewhat special day, I want to deviate a bit from my usual
format and highlight some of his key ideas which impress me to this day.
The Mother of all Demos,
which I alluded to earlier, is certainly one of the most important pieces of
our computer history. If you can spare some time this holiday season, I can
only recommend watching parts of this demo. It was a jaw-dropping experience for
me. And a testament to what can happen when you get a bunch of intelligent people
together and ask them to invent the future.
The ABCs of Organizational Improvement
is a framework I rely heavily on when working with clients. It depicts three
types of basic activities which should be ongoing in any healthy business:
(A) Business as usual: Processes you can find in every business, including
the core activities such as developing a product, manufacturing, marketing,
sales etc. It is all about execution and carrying out today's strategy.
(B) Improving how we do that: Thinking about how to improve the ability to
perform A. This includes training, hiring, adopting new tools & processes,
workflows or bringing in external consultants.
(C) Improving how we improve: How can we improve how we improve? How can we
get better at inventing better processes in B? It's this part most businesses
struggle with, but it's also the part that brings the most value. This kind of
meta-thinking is the shift from an incremental to an exponential improvement
and ultimately the advancement of the business as a whole.
Augmenting Human Intellect: A Conceptual Framework
lays down Engelbart's fundamental vision. In there you can find his famous
example of taping a pencil to a brick and thereby significantly slowing down
the ability to write. When you make it harder to do the lower-level parts of an
activity, it becomes almost impossible to do the higher-level parts –
like exploring ideas, structuring your thoughts or distilling something to its
essence. Our tools influence the thoughts we can
think, and bad tools interfere with thinking well.
Engelbart's vision went much further, as he intended to augment human intellect
and enable people to think in powerful new ways, to collectively solve urgent
global problems. To really understand what he means by that, you have to forget
today. You have to forget everything you know about computers. You have to
forget everything you think you know about computers. His vision is not about
computers, it's about us and the future of mankind:
Technology should not aim to replace humans, but rather amplify human capabilities.
Engelbart’s vision & philosophy continues to influence many technologists
today, myself included. I hope I was able to explain why.
Would you rather go across town on a tricycle or a bicycle?
It is clearly easier to learn to ride a tricycle, but it is nowhere near as efficient
as a bicycle. Learning to ride a bicycle is hard. But in the end it seems to be
worth it – you don't see many tricycles around these days, do you?
Similarly, is it better to use tools that are easy to learn and use? Or is it
worth putting in the time to learn and master difficult, powerful tools? On one
hand, simple tools may be easy to learn and use, but it will be hard work to
accomplish difficult tasks. On the other hand, difficult and powerful tools
call for considerably more skill, but the payoff relative to the effort is
dramatically higher. And this is a very interesting perspective: if you happen
to use the computer as a tool for your lifetime, isn't it worth investing time
to become skillful and save time in the long run?
They don't care if you have a mortgage to pay. They don't care if you have to
pay your bills. They don't care if you have to feed your family. They don't
care why the project is late, what clothes you wear, where you went to school
or your favourite dish.
The only thing clients care about are themselves and their problem.
We're only talking to them because they believe – even a little bit – that
we're able to better their situation.
Nevertheless, I see so many websites and newsletters talking about themselves,
their team, their vision, their products & services and so on. And while these
might be interesting bits here and there, they simply don't help you advance your business.
You want to make your communication "you" focused, not "I" (we, us, ...)
focused. Help your clients understand how you can help them and what expensive
problem you solve for them. Great websites say "you" – they don't say "I".
While travelling I specifically try to avoid restaurants with large menus. Why?
Because they make me miserable.
First, you spend endless time just browsing through the menu, looking for the
best dish (whatever that means anyway), comparing each and every option, and
struggling to decide. Then, just after you've found something to order, the dish
is either not available, or right after ordering the neighbouring table finds
an even more delicious meal. You wouldn't be happy with yours anyway.
Logic suggests that having options allows people to select precisely what makes
them happiest. But as research shows, abundant choice often makes for misery.
This holds true for websites as well.
People – I think – don't change that much. What changes over time are
cultural differences and values, but people have the same goals, the same
desires and the same urges.
Technology matches our desires, it doesn't make them. People haven't become
more vain because now we have cameras. Cameras have been invented and they
became popular because we've always been a bit vain, we've always wanted to see
ourselves. It's just the technology was never in place to enable that
expression of our characters before.
The more I study history the more I understand that people from different
cultures, people from different historical periods... we're not exceptional,
there's nothing exceptional about us, there's nothing exceptional about them.
The technology might be new, but the way we react to it, the way we use it, is
the same it always has been.
Whatever we think about ourselves, we aren't more intelligent than our
ancestors. Neither were they more intelligent than we are. But technology and
knowledge play their role in augmenting us – and that is what makes the difference.
When we get fluent in powerful ideas, they are like adding new brain tissue
that nature didn't give us. It's worthwhile thinking about what it means to get
fluent in something like calculus and to realize that a normal person fluent in
calculus can outthink Archimedes. If you're fluent at reading you can cover
more ground than anybody in the antiquity could in an oral culture.
So a good question for people who are dealing with computing is what if what's
important about computing is deeply hidden? I can tell you as far as this one,
most of the computing that is done in most of industry completely misses most
of what's interesting about computing. They are basically at a first level of
exposure to it and they're trying to optimize that. Think about that because
that was okay fifty years ago.
Probably the most important thing I can urge on you today is to try and
understand that computing is not exactly what you think it is. You have to
understand this. What happened when the internet got done and a few other
things back in the 70s or so was a big paradigm shift in computing and it
hasn't spilled out yet. But if you're looking ahead to the 22nd century this is
what you have to understand otherwise you're always going to be steering by
looking in the rearview mirror.
If someone today could outthink Archimedes and anyone who is literate can cover
more ground than any oral culture... What can someone do with a computer today?
The most interesting point is that it isn't as much as we think. We keep
mouthing platitudes about innovation and pretend we're much more advanced than
our ancestors. But the more you look at what computing can really be about, the
more pathetic everything we're doing right now sounds.
“Technology is changing faster than ever” – this is a related, repeated claim.
It’s a claim that seems to be based on history, one that suggests that, in the
past, technological changes were slow; now, they’re happening so fast and we’re
adopting new technologies so quickly – or so the story goes – that we can no
longer make any sense of what is happening around us, and we’re just all being
swept along in a wave of techno-inevitability.
Needless to say, I don’t think the claim is true – or at the very least, it is
a highly debatable one. Some of this, I’d argue, is simply a matter of
confusing technology consumption for technology innovation. Some of this is a
matter of confusing upgrades for breakthroughs – Apple releasing a new iPhone
every year might not be the best rationale for insisting we are experiencing
rapid technological change. Moreover, much of the pace of change can be
accounted for by the fact that many new technologies are built atop – quite
literally – pre-existing systems: railroads followed the canals; telegraphs
followed the railroads; telephones followed the telegraphs; cable television
followed the phone lines...
So why then does the history of tech matter? It matters because it helps us
think about beliefs and practices and systems and institutions and ideology. It
helps make visible, I’d hope, some of the things that time and familiarity have
made invisible. It helps us think about context. It helps us think about
continuity as much as change. And I think it helps us be more attuned to the
storytelling and the myth-making that happens so frequently in technology and education.
We're confusing technology consumption for technology innovation. Innovation
augments ourselves to do things that were previously impossible, consumption
just allows us to do more of the same. Maybe better, faster or whatever, but
still the same.
No, I don't want to sign up for your newsletter. I don't want to see your open
positions. Read your food & drinks blog. Learn about your cassolette of white
asparagus served with black garlic and saffron sauce. See photos of your latest
celebration. Know the favourite wine selection of your general manager.
All haystack and no needle.
There are only three things I want to be able to do: know what food they're
offering, get the address/location and make a reservation.
I see a similar trend on business websites. There are many and long
pages about the founders, about their vision, their history, their teams – and
of course blog posts from 2 years ago and articles nobody reads.
Regardless of the size of those websites, they're often presenting an
information underload, not an overload. The actual valuable content is hidden
somewhere in this big haystack. And of course you can guess how much effort a
prospect will put into finding those pieces. In my opinion all businesses need
these three pages:
one describing the expensive problem of the client
one describing the product/services which solve the above problem
one to get in touch with the business
That's it. Adding valuable content on top of these is of course always a good
thing, but you need to get the basics right.
If you want to get something done, the way you do it is not so much trying to
convince somebody but to create a tribe that is a conspiracy. Because
anthropologically that is what we are more than any other thing. We are tribal
beings and we tend to automatically oppose things outside of our tribe, even if
they're good ideas, because that isn't the way we think – in fact we don't
think, we're not primarily thinking animals.
The theatrical part of this is lots bigger than we think, the limitations are
much smaller than we think and the relationship we have with our heritage is
that we are much more different than we think we are. I hate to see computing
and computer science watered down to some terrible kind of engineering that the
Babylonians might have failed at.
That is pathetic! And I'm saying it in this strong way because you need to
realize that we're in the middle of a complete form of bullshit that has grown
up out of the pop culture.
We're stuck in conversations around hypes and trending technological topics. At
the same time our world gets ever more complex and throws ever more complex
problems at us. I really hope that we can grow up soon and use the power the
computer grants us to actually augment ourselves.
A recurring theme in software development is the more you dig into the research
the greater the distance is between what actual research seems to say versus
what the industry practices.
Develop a familiarity with, for example, Alan Kay’s or Douglas Engelbart’s
visions for the future of computing and you are guaranteed to become thoroughly
dissatisfied with the limitations of every modern OS. Reading up on hypertext
theory and research, especially on hypertext as a medium, is a recipe for
becoming annoyed at The Web. Catching up on usability research throughout the
years makes you want to smash your laptop against the wall in anger. And trying
to fill out forms online makes you scream ‘it doesn’t have to be this way!’ at
the top of your lungs.
That software development doesn’t deal with research or attempts to get at hard
facts is endemic to the industry.
It seems crazy to me that most other subjects look at their history, while
computing mostly ignores the past, thinking that new is always better. The
problem these days isn't how to innovate, but how to get society to adopt the
good ideas that already exist.
So far, it’s mostly shit. Most of our society simply isn’t benefiting from this
trend of software eating the world. In fact, most of them live in the very
world that software ate.
The world is not just software. The world is physics, it’s crying babies and
shit on the sidewalk, it’s opioids and ecstasy, it’s car crashes and Senate
hearings, lovers and philosophers, lost opportunities and spinning planets
around untold stars. The world is still real.
Software – data, code, algorithms, processing – software has dressed the world
in new infrastructure. But this is a conversation, not a process of digestion.
It is a conversation between the physical and the digital, a synthesis we must
master if we are to avoid terrible fates, and continue to embrace fantastic possibilities.
Only those who know nothing of the history of technology believe that a
technology is entirely neutral. It always has implications, positive and
negative. And all too often we seem to ignore the downsides, both in the
physical world we live in and in the technology itself.
When was the last time you visited a business website without having a clue
what they were offering? Unfortunately these beasts are far too common – and
they hurt business as well.
To lay it on the line: most businesses' positioning is terrible.
Websites are unforgiving in that regard. You can't control who's visiting your
website. You can't control what they read. You can't control how much they
read. You can't control how long they stay. And so on.
While you can get around this at an event or conference by talking a few
minutes more about what you and your business actually do, you don't have that
chance on the web.
The solution: A compelling positioning that helps your ideal clients understand
how you can help them and what expensive problem you solve for them.
I know it's hard work and hard choices, but in the end it's worth it – if you
want to succeed in the long run.
I don't remember the last time I've been so creeped out by a technology as I
was by Google Duplex, an artificial intelligence that can make phone calls on
your behalf, booking salon appointments or restaurant reservations and
pretending to be human.
One could probably criticise that it's unethical not to disclose that a machine
is on the other end of the line, that we exploit employees of small businesses
as involuntary robot nannies at no charge, that this technology will be
deployed and used regardless of any consequences, or that it reflects our blind
faith in technology. But it's none of that.
The reason why I am creeped out is this:
The fathers of personal computing, most importantly Doug Engelbart, dreamed
about making humans better with the help of computers, to assist rather than
replace humans, to augment humans with computers. What we increasingly see
however is a trend of using humans to augment computers.
In an industry that extols innovation over customer satisfaction, and prefers
algorithm to human judgement (forgetting that every algorithm has human bias in
its DNA), perhaps it should not surprise us that toolchains have replaced know-how.
Likewise, in a field where young straight white dudes take an overwhelming
majority of the jobs (including most of the management jobs) it’s perhaps to be
expected that web making has lately become something of a dick measuring contest.
It was not always this way, and it needn’t stay this way. If we wish to get
back to the business of quietly improving people’s lives, one thoughtful
interaction at a time, we must rid ourselves of the cult of the complex.
Admitting the problem is the first step in solving it.
Solutions to many problems seem most brilliant when they appear most obvious.
Simple even. But in many cases we throw everything we have against the wall
and see what sticks. It's on us to recognize when we forget that our job is to
solve business, client and most importantly human problems.
In every century people have thought they understood the universe at last, and
in every century they were proved to be wrong. It follows that the one thing we
can say about our modern "knowledge" is that it is wrong.
The basic trouble, you see, is that people think that "right" and "wrong" are
absolute; that everything that isn't perfectly and completely right is totally
and equally wrong. However, I don't think that's so. It seems to me that right
and wrong are fuzzy concepts.
A very interesting thought which reminds me very much of a short poem by Piet
Hein: The road to wisdom? — Well, it's plain and simple to express: Err and
err and err again but less and less and less.
Technology does not emerge from a vacuum; it is the reification of the beliefs
and desires of its creators. It is assembled from ideas and fantasies developed
through evolution and culture, pedagogy and debate, endlessly entangled and
enfolded. The belief in an objective schism between technology and the world is
nonsense, and one that has very real outcomes.
Cooperation between human and machine turns out to be a more potent strategy
than trusting to the computer alone.
This strategy of cooperation, drawing on the respective skills of human and
machine rather than pitting one against the other, may be our only hope for
surviving life among machines whose thought processes are unknowable to us.
Nonhuman intelligence is a reality—it is rapidly outstripping human performance
in many disciplines, and the results stand to be catastrophically destructive
to our working lives. These technologies are becoming ubiquitous in everyday
devices, and we do not have the option of retreating from or renouncing them.
We cannot opt out of contemporary technology any more than we can reject our
neighbors in society; we are all entangled.
As we envision, plan and build technology, a human bias will always be part of
it. We can't just pass our responsibility to technology and bury our head in
the sand. The question we have to pose ourselves is a different
one: how can we use and leverage technology as a tool, as a way to augment
ourselves to do things that were previously impossible? I think collaboration
and cooperation might be an answer.
Today, we have been building and investing so much of our time into the digital
world and we have forgotten to take a step back and take a look at the larger
picture. Not only do we waste other people's time by making them addicted to
this device world, we have also created a lot of waste in the real world. At
the same time we're drowning in piles and piles of information because we never
took the time to architect a system that enables us to navigate through them.
We're trapped in these rectangular screens and we have often forgotten how to
interact with the real world, with real humans. We have been building and
hustling - but hey, we can also slow down and rethink how we want to dwell in
both the physical world and the digital world.
At some point in the future we will leave this world and what we'll leave
behind are spaces and lifestyles that we've shaped for our grandchildren. So I
would like to invite you to think about what do we want to leave behind, as we
continue to build both digitally and physically. Can we be more intentional so
that we shape and leave behind a more humane environment?
What we use a computer for on a daily basis, is only a small part of what a
computer could offer us. Instead, most of our conversation revolves around hypes
and trending technological topics. What we desperately need is to take a step
back, and figure out ways of thinking to tackle complex problems in an ever more
complex world.
Fast learns, slow remembers. Fast proposes, slow disposes. Fast is
discontinuous, slow is continuous. Fast and small instructs slow and big by
accrued innovation and by occasional revolution. Slow and big controls small
and fast by constraint and constancy. Fast gets all our attention, slow has
all the power.
All durable dynamic systems have this sort of structure. It is what makes them
adaptable and robust.
The total effect of the pace layers is that they provide a many-leveled
corrective, stabilizing feedback throughout the system. It is precisely in the
apparent contradictions between the pace layers that civilization finds its resilience.
We're too often thinking about the superficial, the fast, the shallow. And that
is not necessarily a bad thing – but it easily becomes one if it's the only
thing we do. This concept is one of those that, once your brain has been
exposed, you start seeing everywhere.
I find it hard to communicate with a lot of technologists anymore. It’s like
trying to explain literature to someone who has never read a book. You’re asked
“So basically a book is just words someone said written down?” And you say no,
it’s more than that. But how is it more than that?
I am going to make the argument that the predominant form of the social web —
that amalgam of blogging, Twitter, Facebook, forums, Reddit, Instagram — is an
impoverished model for learning and research and that our survival as a species
depends on us getting past the sweet, salty fat of “the web as conversation”
and on to something more timeless, integrative, iterative, something less
personal and less self-assertive, something more solitary yet more connected. I
don’t expect to convince many of you, but I’ll take what I can get.
We can imagine a world that is so much better than this one. And more
importantly we can build it. But in order to do that we have to think bigger
than the next hype, the next buzzword and the next press release. We have to
seriously interrogate the assumptions that are hidden in plain sight.
Wow. After sharing and discussing close to a thousand (964 to be
precise) articles, talks, essays, videos and links, my summing up column
has reached quite a milestone.
I originally started this series a little over five years ago to keep track of
what I was reading. Little did I know then how much this effort helped me build
up a large part of my expertise, methods, strategies and way of thinking. I'm
also quite relieved that in all that time, nobody asked me about the
To celebrate this somewhat special occasion, I want to deviate a bit from the
usual format and highlight some key figures and favourite articles which
impress me to this day.
Doug Engelbart, one of the fathers of personal computing, is definitely one
of my personal heroes. He dedicated his life to the pursuit of developing
technology to augment human intellect. He didn't see this as a
technological problem though, but as a human problem, with technology falling out
as part of a solution. His methods and frameworks
are brilliant, and I rely heavily on them when working with clients.
When thinking about the future, you can't do it better than Alan Kay. Perhaps
he is one of the best known computing visionaries still around today and his
reasoning is spot on when it comes to invention, innovation and strategies for
succeeding in a digital world.
Neil Postman is one of my favourite media critics and funnily enough was
never categorically against technology. But he warned us vigorously to be
suspicious of technology. His predictions, cautions and propositions on how we
become used by technology rather than make use of technology have been
spot on so far – unfortunately.
There's often a thin line between madness and genius and Ted Nelson walks
that line confidently. The original inventor of hypertext, internet pioneer and
visionary saw the need for interconnected documents decades before the World
Wide Web was born. And even now his vision is far from being complete – luckily
the size of his ambition hasn't changed.
Bret Victor is one of the thinkers I respect most in our industry. His talks
and essays have been highly influential to me. In the spirit of Doug Engelbart,
Bret thinks deeply about how to create a new dynamic medium that shapes
computing for the 21st century and allows us to see, understand and solve
complex problems.
It's rare that I don't fall in love with talks by Maciej Cegłowski, talking
mostly on the excesses and impacts of technology on society. His style of
storytelling along with ingenious insights is just amazing.
Audrey Watters is mostly known for her prolific work on education
technology issues and tech in general. The witty way she interrogates the
stories about technology we tell ourselves – or have been told to us – is
full of deep insight.
Thanks a lot for your continued support and feedback over the last years, it is
greatly appreciated. You're very welcome to subscribe to this series and
get it directly in your inbox along with some cool stuff that you won't find
anywhere else on the site.
Lastly, if you have any feedback, critique, tips, ideas, comments or free bags
of money, I'd be very glad to hear from you. Thank you.
We have the opportunity to change our thinking and basic assumptions about the
development of computing technologies. The emphasis on enhancing security and
protecting turf often impedes our ability to solve problems collectively. If we
can re-examine those assumptions and chart a different course, we can harness
all the wonderful capability of the systems that we have today.
People often ask me how I would improve the current systems, but my response is
that we first need to look at our underlying paradigms—because we need to
co-evolve the new systems, and that requires new ways of thinking. It’s not
just a matter of “doing things differently,” but thinking differently about how
to approach the complexity of problem-solving today.
in a world where we've grown multiple orders of magnitude in our computing
capacity, where we spend millions of dollars on newer, faster tools and
technology, we put little emphasis on how we can augment human thinking and
problem solving. and as doug says, it is not about thinking differently about
these problems, it is about thinking differently about our ability to solve
these problems.
Suppose a person tells us that a particular photo is of people playing
Frisbee in the park. We then naturally assume that they can answer questions
like “what is the shape of a Frisbee?”, “roughly how far can a person throw a
Frisbee?”, “can a person eat a Frisbee?”, “roughly how many people play
Frisbee at once?”, “can a 3 month old person play Frisbee?”, “is today’s
weather suitable for playing Frisbee?”. Today’s image labelling systems that
routinely give correct labels, like “people playing Frisbee in a park” to
online photos, have no chance of answering those questions. Besides the fact
that all they can do is label more images and cannot answer questions at
all, they have no idea what a person is, that parks are usually outside, that
people have ages, that weather is anything more than how it makes a photo
look, etc., etc.
Here is what goes wrong. People hear that some robot or some AI system has
performed some task. They then take the generalization from that performance to
a general competence that a person performing that same task could be expected
to have. And they apply that generalization to the robot or AI system.
Today’s robots and AI systems are incredibly narrow in what they can do. Human
style generalizations just do not apply. People who do make these
generalizations get things very, very wrong.
we are surrounded by hysteria about artificial intelligence, mistaken
extrapolations, limited imagination and many more mistakes that distract us
from thinking productively about the future. whether or not ai succeeds in the
long term, it will nevertheless be developed and used with uncompromising
efforts – regardless of any consequences.
Unfortunately, many in the AI community greatly underestimate the depth of
interface design, often regarding it as a simple problem, mostly about making
things pretty or easy-to-use. In this view, interface design is a problem to be
handed off to others, while the hard work is to train some machine learning
model.
This view is incorrect. At its deepest, interface design means developing the
fundamental primitives human beings think and create with. This is a problem
whose intellectual genesis goes back to the inventors of the alphabet, of
cartography, and of musical notation, as well as modern giants such as
Descartes, Playfair, Feynman, Engelbart, and Kay. It is one of the hardest,
most important and most fundamental problems humanity grapples with.
the speed, performance or productivity of computers is mostly a red herring.
the main problem is how we can leverage the computer as a tool. in other
words, how can we use the computer to augment ourselves to do things that were
previously impossible?
There has been so much excitement and sense of discovery around the digital
revolution that we’re at a moment where we overestimate what can be done with
AI, certainly as it stands at the moment.
One of the most essential elements of human wisdom at its best is humility,
knowing that you don’t know everything. There’s a sense in which we haven’t
learned how to build humility into our interactions with our devices. The
computer doesn’t know what it doesn’t know, and it's willing to make
projections when it hasn’t been provided with everything that would be relevant
to those projections.
after all, computers are still tools we should take advantage of, to augment
ourselves to do things that were previously impossible, to help us make our
lives better. but all too often it seems to me that everyone is used by
computers, for purposes that seem to know no boundaries.
Drawing inspiration from architectural practice, its successes and failures, I
question the role of design in a world being eaten by software. When the
prevailing technocratic culture permits the creation of products that undermine
and exploit users, who will protect citizens within the digital spaces they now
inhabit?
We need to take it upon ourselves to be more critical and introspective. This
shouldn’t be too hard. After all, design is all about questioning what already
exists and asking how it could be improved for the better.
Perhaps we need a new set of motivational posters. Rather than move fast and
break things, perhaps slow down and ask more questions.
we need a more thoughtful, questioning approach to digital. how does a single
technology, a tool or a digital channel help us improve? the answer is out
there somewhere, but we have to stop ourselves more often to ask "why?".
The grand struggle of creativity can often be about making yourself stupid
again. It's like turning yourself into a child who views the world with
wonderment and excitement.
Creating something meaningful isn't easy, it's hard. But that's why we should
do it. If you ever find yourself being comfortable with what you're making or
creating, then you need to push yourself. Push yourself out of your comfort
zone and push yourself to the point of failure and then beyond.
When I was a kid, we would go skiing a lot. At the end of the day all the
skiers were coming to the lodge and I used to think it was the bad skiers
that were covered in snow and the good skiers that were all clean,
with no snow on them. But it turns out the exact opposite is true: it was
the good skiers that were covered in snow from pushing themselves, pushing
themselves beyond the limits and into their breaking points, getting better
and then pushing themselves harder. Creativity is the same thing. It's like
you push hard, you push until you're scared and afraid, you push until you
break, you push until you fall and then you get up and you do it again.
Creativity is really a journey. It's a wonderful journey in which you start
out as one person and end as another.
it's always a lot harder to create something meaningful than just creating
something. but that's exactly the reason why you should do it. a great talk by
one of my favourite game designers.
So much of how we build websites and software comes down to how we think. The
churn of tools, methods, and abstractions also signifies the replacement of
ideology. A person must usually think in a way similar to the people who
created the tools to successfully use them. It’s not as simple as putting down
a screwdriver and picking up a wrench. A person needs to revise their whole
frame of thinking; they must change their mind.
The new methods were invented to manage a level of complexity that is
completely foreign to me and my work. It was easy to back away from most of
this new stuff when I realized I have alternate ways of managing complexity.
Instead of changing my tools or workflow, I change my design. It’s like
designing a house so it’s easy to build, instead of setting up cranes typically
used for skyscrapers. Beyond that, fancy implementation has never moved the
needle much for my clients.
So, I thought it would be useful to remind everyone that the easiest and cheapest
strategy for dealing with complexity is not to invent something to manage it,
but to avoid the complexity altogether with a more clever plan.
a fancy implementation has never moved the needle much for my clients either.
what has, though, is building relationships and letting technology support this
process. we are an increasingly digital society, yes, but that doesn't mean we
have to let technology take over.
Human nature, for better or worse, doesn’t change much from millennia to
millennia. If you want to see the strengths that are unique and universal to
all humans, don’t look at the world-famous award-winners — look at children.
Children, even at a young age, are already proficient at: intuition, analogy,
creativity, empathy, social skills. Some may scoff at these for being “soft
skills”, but the fact that we can make an AI that plays chess but not one that
can hold a normal five-minute conversation is proof that these skills only
seem “soft” to us because evolution’s already put in the 3.5 billion years of
hard work.
So, if there’s just one idea you take away from this entire essay, let it be
Mother Nature’s most under-appreciated trick: symbiosis.
Symbiosis shows us you can have fruitful collaborations even if you have
different skills, or different goals, or are even different species.
Symbiosis shows us that the world often isn’t zero-sum — it doesn’t have to
be humans versus AI, or humans versus centaurs, or humans versus other
humans. Symbiosis is two individuals succeeding together not despite, but
because of, their differences. Symbiosis is the “+”.
zero sum games most often win our attention, but the vast majority of our
interactions are positive sum: when you share, when you buy, when you learn,
when you talk. similarly with technology and computers: we can only improve if
we use technology to augment ourselves in order to allow for new,
previously-impossible ways of thinking, of living, of being.
At the same time as we so happily create everything from artificial
intelligences to putting "smart" into absolutely bloody everything, there are
still so many actual, real problems unsolved. I do not need a single more
problem solved; every one of my actual problems has been
solved. There is not a single thing I could even dream of wanting that hasn't
been already created. Yes, I can upgrade. I can buy a slightly cooler car. I
can buy slightly better clothes. I can buy slightly faster phones. But frankly
I am just consuming myself into the grave, because I have an empty life.
In all this innovation bullshit, what has happened, is that rather than look at
true, meaningful change, we have turned innovation into one more bullshit
phrase, into one more management buzzword.
Do we actually have discussions about whether we're doing meaningful work or
just work that happens to be paid at the moment? Regardless of what company we
work in, we need to look at the products we create, the things we create, and
say "yes, this can matter". But it cannot just matter to me, it
needs to matter to someone else as well.
We have blind spots, we all have them. We all have our biases. It is perhaps
acceptable to have a bias as an individual. But when an entire community or an
entire nation has a bias, it shows we have not gone far enough.
we seem to spend so much talent, research, time, energy and money to create
things that nobody needs, just because we feel we have to innovate somehow. and
the problem isn't how to innovate or the innovation per se, but how to get
society to adopt the good ideas that already exist.
Algorithms are essentially thoughtless. They model certain decision flows, but
once you run them, no more thought occurs. To call a person “thoughtless” is
usually considered a slight, or an outright insult; and yet, we unleash so many
literally thoughtless processes on our users, on our lives, on ourselves.
this hits home on so many levels. we throw technology at people, hoping
something will stick. instead, we should use the computer and algorithms to
augment ourselves to do things that were previously impossible, to help us make
our lives better. that is the sweet spot of our technology.
The time you spend is not your own. You are, as a class of human beings,
responsible for more pure raw time, broken into more units, than almost anyone
else. You are about to spend whole decades, whole centuries, of cumulative
moments, of other people’s time. People using your systems, playing with your
toys, fiddling with your abstractions. And I want you to ask yourself when you
make things, when you prototype interactions, am I thinking about my own clock,
or the user’s? Am I going to help someone make order in his or her life?
If we are going to ask people, in the form of our products, in the form of the
things we make, to spend their heartbeats—if we are going to ask them to spend
their heartbeats on us, on our ideas, how can we be sure, far more sure than we
are now, that they spend those heartbeats wisely?
our technological capability changes much faster than our culture. we first
create our technologies and then they change our society and culture. therefore
we have a huge responsibility to look at things from the other person's point
of view – and to do what's best for them. in other words, be considerate.
Lamenting about the tech industry’s ills when self-identifying as a
technologist is a precarious construction. I care so deeply about using the
personal computer for liberation & augmentation. I’m so, so burned out by 95%
of the work happening in the tech industry. Silicon Valley mythologizes
newness, without stopping to ask “why?”. I’m still in love with technology, but
increasingly with nuance into that which is for us, and that which productizes
us.
Perhaps when Bush prophesied lightning-quick knowledge retrieval, he didn’t
intend for that knowledge to be footnoted with Outbrain adverts. Licklider’s
man-computer symbiosis would have been frustrated had it been crop-dusted
with notifications. Ted Nelson imagined many wonderfully weird futures for
the personal computer, but I don’t think gamifying meditation apps was one of
them.
every day we make a big fuss about a seemingly new hype (ai, blockchain, vr,
iot, cloud, ... what next?). as neil postman and others have cautioned us,
technologies tend to become mythic. that is, perceived as if they were
god-given, part of the natural order of things, gifts of nature, and not as
artifacts produced in a specific political and historical context. and by that
we completely fail to recognize how we can use technology to augment ourselves
to do things that were previously impossible, to help us make our lives better.
Radio brought music into hospitals and nursing homes, it eased the profound
isolation of rural life, it let people hear directly from their elected
representatives. It brought laughter and entertainment into every parlor, saved
lives at sea, gave people weather forecasts for the first time.
But radio waves are just oscillating electromagnetic fields. They really don't
care how we use them. All they want is to go places at the speed of light.
It is hard to accept that good people, working on technology that benefits so
many, with nothing but good intentions, could end up building a powerful tool
for the wicked.
But we can't afford to re-learn this lesson every time.
Technology interacts with human nature in complicated ways, and part of human
nature is to seek power over others, and manipulate them. Technology
concentrates power.
We have to assume the new technologies we invent will concentrate power, too.
There is always a gap between mass adoption and the first skillful political
use of a medium. With the Internet, we are crossing that gap right now.
only those who know nothing about technological history believe that technology
is entirely neutral. it always has a bias towards being used in certain ways
and not others. a great comparison to what we're facing now with the internet.
In psychology, the term “insight” is used to describe a recognition of one’s
own condition, such as when a person with mental illness is aware of their
illness. More broadly, it describes the ability to recognize patterns in one’s
own behavior. It’s an example of metacognition, or thinking about one’s own
thinking, and it’s something most humans are capable of but animals are not.
And I believe the best test of whether an AI is really engaging in human-level
cognition would be for it to demonstrate insight of this kind.
I used to find it odd that these hypothetical AIs were supposed to be smart
enough to solve problems that no human could, yet they were incapable of doing
something most every adult has done: taking a step back and asking whether
their current course of action is really a good idea. Then I realized that we
are already surrounded by machines that demonstrate a complete lack of insight,
we just call them corporations. Corporations don’t operate autonomously, of
course, and the humans in charge of them are presumably capable of insight, but
capitalism doesn’t reward them for using it. On the contrary, capitalism
actively erodes this capacity in people by demanding that they replace their
own judgment of what “good” means with “whatever the market decides.”
the problem is this: if you're never exposed to new ideas and contexts, if you
grow up only being shown one way of thinking about businesses & technology and
being told that there are no other ways to think about this, you grow up
thinking you know what you're doing.
When people try to explain the wastefulness of today's computing, they commonly
offer something I call "tradeoff hypothesis". According to this hypothesis, the
wastefulness of software would be compensated by flexibility, reliability,
maintainability, and perhaps most importantly, cheap programming work.
I used to believe in the tradeoff hypothesis as well. However, during recent
years, I have become increasingly convinced that the portion of true tradeoff
is quite marginal. An ever-increasing portion of the waste comes from
abstraction clutter that serves no purpose in final runtime code. Most of this
clutter could be eliminated with more thoughtful tools and methods without any
real tradeoff.
we too often seem to adjust to the limitations of technology, instead of
creating solutions for a problem with the help of technology.