We have the opportunity to change our thinking and basic assumptions about the
development of computing technologies. The emphasis on enhancing security and
protecting turf often impedes our ability to solve problems collectively. If we
can re-examine those assumptions and chart a different course, we can harness
all the wonderful capability of the systems that we have today.
People often ask me how I would improve the current systems, but my response is
that we first need to look at our underlying paradigms—because we need to
co-evolve the new systems, and that requires new ways of thinking. It’s not
just a matter of “doing things differently,” but thinking differently about how
to approach the complexity of problem-solving today.
in a world where our computing capacity has grown by multiple orders of
magnitude, where we spend millions of dollars on newer, faster tools and
technology, we put little emphasis on how we can augment human thinking and
problem solving. and as doug says, it is not about thinking differently about
these problems, it is about thinking differently about our ability to solve
these problems.
Suppose a person tells us that a particular photo is of people playing
Frisbee in the park, then we naturally assume that they can answer questions
like “what is the shape of a Frisbee?”, “roughly how far can a person throw a
Frisbee?”, “can a person eat a Frisbee?”, “roughly how many people play
Frisbee at once?”, “can a 3 month old person play Frisbee?”, “is today’s
weather suitable for playing Frisbee?”. Today’s image labelling systems, which
routinely give correct labels like “people playing Frisbee in a park” to
online photos, have no chance of answering those questions. Besides the fact
that all they can do is label more images and cannot answer questions at
all, they have no idea what a person is, that parks are usually outside, that
people have ages, that weather is anything more than how it makes a photo
look, etc., etc.
Here is what goes wrong. People hear that some robot or some AI system has
performed some task. They then take the generalization from that performance to
a general competence that a person performing that same task could be expected
to have. And they apply that generalization to the robot or AI system.
Today’s robots and AI systems are incredibly narrow in what they can do. Human
style generalizations just do not apply. People who do make these
generalizations get things very, very wrong.
we are surrounded by hysteria about artificial intelligence, mistaken
extrapolations, limited imagination and many more mistakes that distract us
from thinking productively about the future. whether or not ai succeeds in the
long term, it will nevertheless be developed and used with uncompromising
efforts – regardless of any consequences.
Unfortunately, many in the AI community greatly underestimate the depth of
interface design, often regarding it as a simple problem, mostly about making
things pretty or easy-to-use. In this view, interface design is a problem to be
handed off to others, while the hard work is to train some machine learning system.
This view is incorrect. At its deepest, interface design means developing the
fundamental primitives human beings think and create with. This is a problem
whose intellectual genesis goes back to the inventors of the alphabet, of
cartography, and of musical notation, as well as modern giants such as
Descartes, Playfair, Feynman, Engelbart, and Kay. It is one of the hardest,
most important and most fundamental problems humanity grapples with.
the speed, performance or productivity of computers are mostly red herrings.
the main problem is how we can leverage the computer as a tool. in other
words, how can we use the computer to augment ourselves to do things that were
previously impossible?
There has been so much excitement and sense of discovery around the digital
revolution that we’re at a moment where we overestimate what can be done with
AI, certainly as it stands at the moment.
One of the most essential elements of human wisdom at its best is humility,
knowing that you don’t know everything. There’s a sense in which we haven’t
learned how to build humility into our interactions with our devices. The
computer doesn’t know what it doesn’t know, and it’s willing to make
projections when it hasn’t been provided with everything that would be relevant
to those projections.
after all, computers are still tools we should take advantage of, to augment
ourselves to do things that were previously impossible, to help us make our
lives better. but all too often it seems to me that everyone is used by
computers, for purposes that seem to know no boundaries.
Drawing inspiration from architectural practice, its successes and failures, I
question the role of design in a world being eaten by software. When the
prevailing technocratic culture permits the creation of products that undermine
and exploit users, who will protect citizens within the digital spaces they now inhabit?
We need to take it upon ourselves to be more critical and introspective. This
shouldn’t be too hard. After all, design is all about questioning what already
exists and asking how it could be improved.
Perhaps we need a new set of motivational posters. Rather than move fast and
break things, perhaps slow down and ask more questions.
we need a more thoughtful, questioning approach to digital. how does a single
technology, a tool or a digital channel help us improve? the answer is out
there somewhere, but we have to stop ourselves more often to ask "why?".
The grand struggle of creativity can often be about making yourself stupid
again. It's like turning yourself into a child who views the world with
wonderment and excitement.
Creating something meaningful isn't easy, it's hard. But that's why we should
do it. If you ever find yourself being comfortable with what you're making or
creating, then you need to push yourself. Push yourself out of your comfort
zone and push yourself to the point of failure and then beyond.
When I was a kid, we would go skiing a lot. At the end of the day all the
skiers were coming to the lodge and I used to think it was the bad skiers
that were covered in snow and it was the good skiers that were all cleaned,
with no snow on them. But it turns out the exact opposite is true: it was
the good skiers that were covered in snow from pushing themselves, pushing
themselves beyond the limits and into their breaking points, getting better
and then pushing themselves harder. Creativity is the same thing. It's like
you push hard, you push until you're scared and afraid, you push until you
break, you push until you fall and then you get up and you do it again.
Creativity is really a journey. It's a wonderful journey in that you start out
as one person and end as another.
it's always a lot harder to create something meaningful than just creating
something. but that's exactly the reason why you should do it. a great talk by
one of my favourite game designers.
So much of how we build websites and software comes down to how we think. The
churn of tools, methods, and abstractions also signifies the replacement of
ideology. A person must usually think in a way similar to the people who
created the tools to successfully use them. It’s not as simple as putting down
a screwdriver and picking up a wrench. A person needs to revise their whole
frame of thinking; they must change their mind.
The new methods were invented to manage a level of complexity that is
completely foreign to me and my work. It was easy to back away from most of
this new stuff when I realized I have alternate ways of managing complexity.
Instead of changing my tools or workflow, I change my design. It’s like
designing a house so it’s easy to build, instead of setting up cranes typically
used for skyscrapers. Beyond that, fancy implementation has never moved the
needle much for my clients.
So, I thought it would be useful to remind everyone that the easiest and cheapest
strategy for dealing with complexity is not to invent something to manage it,
but to avoid the complexity altogether with a more clever plan.
a fancy implementation has never moved the needle much for my clients either.
what has, though, is building relationships and letting technology support this
process. we are an increasingly digital society, yes, but that doesn't mean we
have to let technology take over.
Human nature, for better or worse, doesn’t change much from millennia to
millennia. If you want to see the strengths that are unique and universal to
all humans, don’t look at the world-famous award-winners — look at children.
Children, even at a young age, are already proficient at: intuition, analogy,
creativity, empathy, social skills. Some may scoff at these for being “soft
skills”, but the fact that we can make an AI that plays chess but can’t hold a
normal five-minute conversation is proof that these skills only seem “soft”
to us because evolution’s already put in the 3.5 billion years of hard work.
So, if there’s just one idea you take away from this entire essay, let it be
Mother Nature’s most under-appreciated trick: symbiosis.
Symbiosis shows us you can have fruitful collaborations even if you have
different skills, or different goals, or are even different species.
Symbiosis shows us that the world often isn’t zero-sum — it doesn’t have to
be humans versus AI, or humans versus centaurs, or humans versus other
humans. Symbiosis is two individuals succeeding together not despite, but
because of, their differences. Symbiosis is the “+”.
zero sum games most often win our attention, but the vast majority of our
interactions are positive sum: when you share, when you buy, when you learn,
when you talk. similarly with technology and computers: we can only improve if
we use technology to augment ourselves in order to allow for new,
previously-impossible ways of thinking, of living, of being.
At the same time as we so happily create everything from artificial
intelligences to putting "smart" into absolutely bloody everything, there are
still so many actual, real problems unsolved. I do not need a single more
problem solved; every one of my actual problems has been solved. There is not
a single thing I could even dream of wanting that hasn't already been created.
Yes, I can upgrade. I can buy a slightly cooler car. I can buy slightly better
clothes. I can buy slightly faster phones. But frankly I am just consuming
myself into the grave, because I have an empty life.
In all this innovation bullshit, what has happened, is that rather than look at
true, meaningful change, we have turned innovation into one more bullshit
phrase, into one more management buzzword.
Do we actually have discussions about whether we're doing meaningful work or
just work that happens to be paid at the moment? Regardless of what company we
work in, we need to look at the products we create, the things we create, and
say "yes, this can matter". But it cannot just matter to me, it needs to
matter to someone else as well.
We have blind spots, we all have them. We all have our biases. It is perhaps
acceptable to have a bias as an individual. But when an entire community or an
entire nation has a bias, this shows we have not gone far enough.
we seem to spend so much talent, research, time, energy and money to create
things that nobody needs, just because we feel we have to innovate somehow. and
the problem isn't how to innovate or the innovation per se, but how to get
society to adopt the good ideas that already exist.
Algorithms are essentially thoughtless. They model certain decision flows, but
once you run them, no more thought occurs. To call a person “thoughtless” is
usually considered a slight, or an outright insult; and yet, we unleash so many
literally thoughtless processes on our users, on our lives, on ourselves.
this hits home on so many levels. we throw technology at people, hoping
something will stick. instead, we should use the computer and algorithms to
augment ourselves to do things that were previously impossible, to help us make
our lives better. that is the sweet spot of our technology.
The time you spend is not your own. You are, as a class of human beings,
responsible for more pure raw time, broken into more units, than almost anyone
else. You are about to spend whole decades, whole centuries, of cumulative
moments, of other people’s time. People using your systems, playing with your
toys, fiddling with your abstractions. And I want you to ask yourself when you
make things, when you prototype interactions, am I thinking about my own clock,
or the user’s? Am I going to help someone make order in his or her life?
If we are going to ask people, in the form of our products, in the form of the
things we make, to spend their heartbeats—if we are going to ask them to spend
their heartbeats on us, on our ideas, how can we be sure, far more sure than we
are now, that they spend those heartbeats wisely?
our technological capability changes much faster than our culture. we first
create our technologies and then they change our society and culture. therefore
we have a huge responsibility to look at things from the other person's point
of view – and to do what's best for them. in other words, be considerate.
Lamenting about the tech industry’s ills when self-identifying as a
technologist is a precarious construction. I care so deeply about using the
personal computer for liberation & augmentation. I’m so, so burned out by 95%
of the work happening in the tech industry. Silicon Valley mythologizes
newness, without stopping to ask “why?”. I’m still in love with technology, but
increasingly with nuance into that which is for us, and that which productizes us.
Perhaps when Bush prophesied lightning-quick knowledge retrieval, he didn’t
intend for that knowledge to be footnoted with Outbrain adverts. Licklider’s
man-computer symbiosis would have been frustrated had it been crop-dusted
with notifications. Ted Nelson imagined many wonderfully weird futures for
the personal computer, but I don’t think gamifying meditation apps was one of them.
every day we make a big fuss about a seemingly new hype (ai, blockchain, vr,
iot, cloud, ... what next?). as neil postman and others have cautioned us,
technologies tend to become mythic. that is, they are perceived as if they were
god-given, part of the natural order of things, gifts of nature, and not as
artifacts produced in a specific political and historical context. and in doing
so we completely fail to recognize how we can use technology to augment
ourselves to do things that were previously impossible, to help us make our
lives better.
Radio brought music into hospitals and nursing homes, it eased the profound
isolation of rural life, it let people hear directly from their elected
representatives. It brought laughter and entertainment into every parlor, saved
lives at sea, gave people weather forecasts for the first time.
But radio waves are just oscillating electromagnetic fields. They really don't
care how we use them. All they want is to go places at the speed of light.
It is hard to accept that good people, working on technology that benefits so
many, with nothing but good intentions, could end up building a powerful tool
for the wicked.
But we can't afford to re-learn this lesson every time.
Technology interacts with human nature in complicated ways, and part of human
nature is to seek power over others, and manipulate them. Technology concentrates power.
We have to assume the new technologies we invent will concentrate power, too.
There is always a gap between mass adoption and the first skillful political
use of a medium. With the Internet, we are crossing that gap right now.
only those who know nothing about technological history believe that technology
is entirely neutral. it always has a bias towards being used in certain ways
and not others. a great comparison to what we're facing now with the internet.
In psychology, the term “insight” is used to describe a recognition of one’s
own condition, such as when a person with mental illness is aware of their
illness. More broadly, it describes the ability to recognize patterns in one’s
own behavior. It’s an example of metacognition, or thinking about one’s own
thinking, and it’s something most humans are capable of but animals are not.
And I believe the best test of whether an AI is really engaging in human-level
cognition would be for it to demonstrate insight of this kind.
I used to find it odd that these hypothetical AIs were supposed to be smart
enough to solve problems that no human could, yet they were incapable of doing
something almost every adult has done: taking a step back and asking whether
their current course of action is really a good idea. Then I realized that we
are already surrounded by machines that demonstrate a complete lack of insight,
we just call them corporations. Corporations don’t operate autonomously, of
course, and the humans in charge of them are presumably capable of insight, but
capitalism doesn’t reward them for using it. On the contrary, capitalism
actively erodes this capacity in people by demanding that they replace their
own judgment of what “good” means with “whatever the market decides.”
the problem is this: if you're never exposed to new ideas and contexts, if you
grow up only being shown one way of thinking about businesses & technology and
being told that there are no other ways to think about this, you grow up
thinking you know what we're doing.
When people try to explain the wastefulness of today's computing, they commonly
offer something I call "tradeoff hypothesis". According to this hypothesis, the
wastefulness of software would be compensated by flexibility, reliability,
maintainability, and perhaps most importantly, cheap programming work.
I used to believe in the tradeoff hypothesis as well. However, during recent
years, I have become increasingly convinced that the portion of true tradeoff
is quite marginal. An ever-increasing portion of the waste comes from
abstraction clutter that serves no purpose in final runtime code. Most of this
clutter could be eliminated with more thoughtful tools and methods without any
actual tradeoff.
we too often seem to adjust to the limitations of technology, instead of
creating solutions for a problem with the help of technology.
Socrates didn't charge for "education" because when you are in business, the
"customer starts to become right". Whereas in education, the customer is
generally "not right". Marketeers are catering to what people want, educators
are trying to deal with what they think people need (and this is often not at
all what they want).
Another perspective is to note that one of the human genetic "built-ins" is
"hunting and gathering" – this requires resources to "be around", and is
essentially incremental in nature. It is not too much of an exaggeration to
point out that most businesses are very like hunting-and-gathering processes,
and think of their surroundings as resources put there by god or nature for them.
Most don't think of the resources in our centuries as actually part of a
human-made garden via inventions and cooperation, and that the garden has to be
maintained and renewed.
these thoughts are a pure gold mine. a fundamental problem for most businesses
is that one cannot innovate under business objectives and one cannot accomplish
business objectives under innovation. ideally, you need both, but not at the
same time.
When we talk about technology, we tend to talk about it as this blue sky
opportunity. It could go any direction. And I want to get serious for a moment
and tell you why it's going in a very specific direction. Because it's not
evolving randomly. There's a hidden goal driving the direction of all of the
technology we make, and that goal is the race for our attention. Because every
new site or app has to compete for one thing, which is our attention, and
there's only so much of it. And the best way to get people's attention is to
know how someone's mind works.
A simple example is YouTube. YouTube wants to maximize how much time you spend.
And so what do they do? They autoplay the next video. And let's say that works
really well. They're getting a little bit more of people's time. Well, if
you're Netflix, you look at that and say, well, that's shrinking my market
share, so I'm going to autoplay the next episode. But then if you're Facebook,
you say, that's shrinking all of my market share, so now I have to autoplay all
the videos in the newsfeed before waiting for you to click play. So the
internet is not evolving at random. The reason it feels like it's sucking us in
the way it is is because of this race for attention. We know where this is
going. Technology is not neutral, and it becomes this race to the bottom of the
brain stem of who can go lower to get it.
we seem to have the notion that technology is always good. but that is
simply not the case. every technology is always both a burden and a blessing.
not either or, but this and that.
The idea, as near as I can tell, is that the ideal computer should be like a
human being, only more obedient. Anything so insidiously appealing should
immediately give pause. Why should a computer be anything like a human being?
Are airplanes like birds, typewriters like pens, alphabets like mouths, cars
like horses? Are human interactions so free of trouble, misunderstanding, and
ambiguity that they represent a desirable computer interface goal? Further, it
takes a lot of time and attention to build and maintain a smoothly running team
of people, even a pair of people. A computer I need to talk to, give commands
to, or have a relationship with (much less be intimate with), is a computer
that is too much the center of attention.
in a world where computers increasingly become human, they inevitably will
become the center of attention. the exact opposite of what they should be:
invisible and helping to focus our attention on ourselves and the people we
live with.
Above all, be prepared for the bullshit, as AI is lazily and inaccurately
claimed by every advertising agency and app developer. Companies will make
nonsensical claims like "our unique and advanced proprietary AI system will
monitor and enhance your sleep" or "let our unique AI engine maximize the value
of your stock holdings". Yesterday they would have said "our unique and advanced
proprietary algorithms" and the day before that they would have said "our unique
and advanced proprietary code". But let's face it, they're almost always talking
about the most basic software routines. The letters A and I will become degraded
and devalued by overuse in every field in which humans work. Coffee machines,
light switches, Christmas trees will be marketed as AI proficient, AI savvy or
AI enabled. But despite this inevitable opportunistic nonsense, reality will
eventually prevail.
If we thought the Pandora's jar that ruined the utopian dream of the internet
contained nasty creatures, just wait till AI has been overrun by the malicious,
the greedy, the stupid and the maniacal. We sleepwalked into the internet age
and we're now going to sleepwalk into the age of machine intelligence and
biological enhancement. How do we make sense of so much futurology screaming in
our ears?
Perhaps the most urgent need might seem counterintuitive. While the specialist
bodies and institutions I've mentioned are necessary we need surely to redouble
our efforts to understand who we humans are before we can begin to grapple with
the nature of what machines may or may not be. So the arts and humanities
strike me as more important than ever. Because the more machines rise, the more
time we will have to be human, and to fulfill and develop, to their uttermost,
our human qualities.
an outstanding lecture exploring the impact of technology on humanity by
looking back at human history in order to understand the present and the future.
We use digital platforms because they provide us with great value. I use
Facebook to keep in touch with friends and family around the world. I've
written about how crucial social media is for social movements. I have studied
how these technologies can be used to circumvent censorship around the world.
But it's not that the people who run Facebook or Google are maliciously and
deliberately trying to make the world more polarized and encourage extremism. I
read the many well-intentioned statements that these people put out. But it's
not the intent or the statements people in technology make that matter, it's
the structures and business models they're building. And that's the core of the
problem.
So what can we do? We need to restructure the whole way our digital technology
operates. Everything from the way technology is developed to the way the
incentives, economic and otherwise, are built into the system.
We have to mobilize our technology, our creativity and yes, our
politics so that we can build artificial intelligence that supports us in our
human goals but that is also constrained by our human values. And I understand
this won't be easy. We might not even easily agree on what those terms mean.
But if we take seriously how these systems that we depend on for so much
operate, I don't see how we can postpone this conversation anymore. We need a
digital economy where our data and our attention is not for sale to the
highest-bidding authoritarian or demagogue.
no new technology has only a one-sided effect. every technology is always both a burden
and a blessing. not either or, but this and that. what bothers me is that
we seem to ignore the negative impact of new technologies, justifying this
attitude with their positive aspects.
If you're never exposed to new ideas and contexts, if you grow up only being
shown one way of thinking about the computer and being told that there are no
other ways to think about this, you grow up thinking you know what we're doing.
We have already fleshed out all the details, improved and optimized everything
a computer has to offer. We celebrate alleged innovation and then delegate
picking up the broken pieces to society, because it's not our fault – we
figured it out already.
We have to tell ourselves that we haven't the faintest idea of what we're
doing. We, as a field, haven't the faintest idea of what we're doing. And we
have to tell ourselves that everything around us was made up by people that
were no smarter than us, so we can change, influence and build things that make
a small dent in the universe.
And once we understand that, only then might we be able to do what the early
fathers of computing dreamed about: To make humans better – with the help of
computers.
the sequel to my previous talk, the lost medium, on
bullet holes in world war 2 bombers, page numbering, rotating points of view
and how we can escape the present to invent the future.