So much of how we build websites and software comes down to how we think. The
churn of tools, methods, and abstractions also signifies the replacement of
ideology. A person must usually think in a way similar to the people who
created the tools to successfully use them. It’s not as simple as putting down
a screwdriver and picking up a wrench. A person needs to revise their whole
frame of thinking; they must change their mind.
The new methods were invented to manage a level of complexity that is
completely foreign to me and my work. It was easy to back away from most of
this new stuff when I realized I have alternate ways of managing complexity.
Instead of changing my tools or workflow, I change my design. It’s like
designing a house so it’s easy to build, instead of setting up cranes typically
used for skyscrapers. Beyond that, fancy implementation has never moved the
needle much for my clients.
So, I thought it would be useful to remind everyone that the easiest and cheapest
strategy for dealing with complexity is not to invent something to manage it,
but to avoid the complexity altogether with a more clever plan.
a fancy implementation has never moved the needle much for my clients either.
what has, though, is building relationships and letting technology support this
process. we are an increasingly digital society, yes, but that doesn't mean we
have to let technology take over.
Human nature, for better or worse, doesn’t change much from millennia to
millennia. If you want to see the strengths that are unique and universal to
all humans, don’t look at the world-famous award-winners — look at children.
Children, even at a young age, are already proficient at: intuition, analogy,
creativity, empathy, social skills. Some may scoff at these for being “soft
skills”, but the fact that we can make an AI that plays chess yet not one that
can hold a normal five-minute conversation is proof that these skills only seem
“soft” to us because evolution’s already put in the 3.5 billion years of hard work.
So, if there’s just one idea you take away from this entire essay, let it be
Mother Nature’s most under-appreciated trick: symbiosis.
Symbiosis shows us you can have fruitful collaborations even if you have
different skills, or different goals, or are even different species.
Symbiosis shows us that the world often isn’t zero-sum — it doesn’t have to
be humans versus AI, or humans versus centaurs, or humans versus other
humans. Symbiosis is two individuals succeeding together not despite, but
because of, their differences. Symbiosis is the “+”.
zero-sum games most often win our attention, but the vast majority of our
interactions are positive sum: when you share, when you buy, when you learn,
when you talk. similarly with technology and computers: we can only improve if
we use technology to augment ourselves in order to allow for new,
previously-impossible ways of thinking, of living, of being.
While we so happily create everything from artificial intelligences to putting
"smart" into absolutely bloody everything, there are still so many actual, real
problems unsolved. I do not need a single more problem solved; every one of my
actual problems has been
solved. There is not a single thing I could even dream of wanting that hasn't
been already created. Yes, I can upgrade. I can buy a slightly cooler car. I
can buy slightly better clothes. I can buy slightly faster phones. But frankly
I am just consuming myself into the grave, because I have an empty life.
In all this innovation bullshit, what has happened, is that rather than look at
true, meaningful change, we have turned innovation into one more bullshit
phrase, into one more management buzzword.
Do we actually have discussions about whether we're doing meaningful work or
just work that happens to be paid at the moment? Regardless of what company we
work in, we need to look at the products we create, the things we create, and
say "yes, this can matter". But it cannot just matter to me; it needs to matter
to someone else as well.
We have blind spots, we all have them. We all have our biases. It is perhaps
acceptable to have a bias as an individual. But when an entire community or an
entire nation has a bias, this shows we have not gone far enough.
we seem to spend so much talent, research, time, energy and money to create
things that nobody needs, just because we feel we have to innovate somehow. and
the problem isn't how to innovate or the innovation per se, but how to get
society to adopt the good ideas that already exist.
Algorithms are essentially thoughtless. They model certain decision flows, but
once you run them, no more thought occurs. To call a person “thoughtless” is
usually considered a slight, or an outright insult; and yet, we unleash so many
literally thoughtless processes on our users, on our lives, on ourselves.
this hits home on so many levels. we throw technology at people, hoping
something will stick. instead, we should use the computer and algorithms to
augment ourselves to do things that were previously impossible, to help us make
our lives better. that is the sweet spot of our technology.
The time you spend is not your own. You are, as a class of human beings,
responsible for more pure raw time, broken into more units, than almost anyone
else. You are about to spend whole decades, whole centuries, of cumulative
moments, of other people’s time. People using your systems, playing with your
toys, fiddling with your abstractions. And I want you to ask yourself when you
make things, when you prototype interactions, am I thinking about my own clock,
or the user’s? Am I going to help someone make order in his or her life?
If we are going to ask people, in the form of our products, in the form of the
things we make, to spend their heartbeats—if we are going to ask them to spend
their heartbeats on us, on our ideas, how can we be sure, far more sure than we
are now, that they spend those heartbeats wisely?
our technological capability changes much faster than our culture. we first
create our technologies and then they change our society and culture. therefore
we have a huge responsibility to look at things from the other person's point
of view – and to do what's best for them. in other words, be considerate.
Lamenting about the tech industry’s ills when self-identifying as a
technologist is a precarious construction. I care so deeply about using the
personal computer for liberation & augmentation. I’m so, so burned out by 95%
of the work happening in the tech industry. Silicon Valley mythologizes
newness, without stopping to ask “why?”. I’m still in love with technology, but
increasingly with nuance into that which is for us, and that which productizes us.
Perhaps when Bush prophesied lightning-quick knowledge retrieval, he didn’t
intend for that knowledge to be footnoted with Outbrain adverts. Licklider’s
man-computer symbiosis would have been frustrated had it been crop-dusted
with notifications. Ted Nelson imagined many wonderfully weird futures for
the personal computer, but I don’t think gamifying meditation apps was one of them.
every day we make big fuss about a seemingly new hype (ai, blockchain, vr, iot,
cloud, ... what next?). as neil postman and others have cautioned us,
technologies tend to become mythic. that is, perceived as if they were god-given,
part of the natural order of things, gifts of
nature, and not as artifacts produced in a specific political and historical
context. and by that we completely fail to recognize how we can use technology
to augment ourselves to do things that were previously impossible, to help us
make our lives better.
Radio brought music into hospitals and nursing homes, it eased the profound
isolation of rural life, it let people hear directly from their elected
representatives. It brought laughter and entertainment into every parlor, saved
lives at sea, gave people weather forecasts for the first time.
But radio waves are just oscillating electromagnetic fields. They really don't
care how we use them. All they want is to go places at the speed of light.
It is hard to accept that good people, working on technology that benefits so
many, with nothing but good intentions, could end up building a powerful tool
for the wicked.
But we can't afford to re-learn this lesson every time.
Technology interacts with human nature in complicated ways, and part of human
nature is to seek power over others, and to manipulate them. We have to assume
the new technologies we invent will concentrate power, too.
There is always a gap between mass adoption and the first skillful political
use of a medium. With the Internet, we are crossing that gap right now.
only those who know nothing about technological history believe that technology
is entirely neutral. it always has a bias towards being used in certain ways
and not others. a great comparison to what we're facing now with the internet.
In psychology, the term “insight” is used to describe a recognition of one’s
own condition, such as when a person with mental illness is aware of their
illness. More broadly, it describes the ability to recognize patterns in one’s
own behavior. It’s an example of metacognition, or thinking about one’s own
thinking, and it’s something most humans are capable of but animals are not.
And I believe the best test of whether an AI is really engaging in human-level
cognition would be for it to demonstrate insight of this kind.
I used to find it odd that these hypothetical AIs were supposed to be smart
enough to solve problems that no human could, yet they were incapable of doing
something most every adult has done: taking a step back and asking whether
their current course of action is really a good idea. Then I realized that we
are already surrounded by machines that demonstrate a complete lack of insight,
we just call them corporations. Corporations don’t operate autonomously, of
course, and the humans in charge of them are presumably capable of insight, but
capitalism doesn’t reward them for using it. On the contrary, capitalism
actively erodes this capacity in people by demanding that they replace their
own judgment of what “good” means with “whatever the market decides.”
the problem is this: if you're never exposed to new ideas and contexts, if you
grow up only being shown one way of thinking about businesses & technology and
being told that there are no other ways to think about this, you grow up
thinking you know what we're doing.
When people try to explain the wastefulness of today's computing, they commonly
offer something I call the "tradeoff hypothesis". According to this hypothesis,
the wastefulness of software is compensated by flexibility, reliability,
maintainability, and perhaps most importantly, cheap programming work.
I used to believe in the tradeoff hypothesis as well. However, during recent
years, I have become increasingly convinced that the portion of true tradeoff
is quite marginal. An ever-increasing portion of the waste comes from
abstraction clutter that serves no purpose in final runtime code. Most of this
clutter could be eliminated with more thoughtful tools and methods without any
real tradeoff.
we too often seem to adjust to the limitations of technology, instead of
creating solutions for a problem with the help of technology.
Socrates didn't charge for "education" because when you are in business, the
"customer starts to become right". Whereas in education, the customer is
generally "not right". Marketeers are catering to what people want, educators
are trying to deal with what they think people need (and this is often not at
all what they want).
Another perspective is to note that one of the human genetic "built-ins" is
"hunting and gathering" – this requires resources to "be around", and is
essentially incremental in nature. It is not too much of an exaggeration to
point out that most businesses are very like hunting-and-gathering processes,
and think of their surroundings as resources put there by god or nature for them.
Most don't think of the resources in our centuries as actually part of a
human-made garden via inventions and cooperation, and that the garden has to be
maintained and renewed.
these thoughts are a pure gold mine. a fundamental problem for most businesses
is that one cannot innovate under business objectives and one cannot accomplish
business objectives under innovation. ideally, you need both, but not at the
same time.
When we talk about technology, we tend to talk about it as this blue sky
opportunity. It could go any direction. And I want to get serious for a moment
and tell you why it's going in a very specific direction. Because it's not
evolving randomly. There's a hidden goal driving the direction of all of the
technology we make, and that goal is the race for our attention. Because every
new site or app has to compete for one thing, which is our attention, and
there's only so much of it. And the best way to get people's attention is to
know how someone's mind works.
A simple example is YouTube. YouTube wants to maximize how much time you spend.
And so what do they do? They autoplay the next video. And let's say that works
really well. They're getting a little bit more of people's time. Well, if
you're Netflix, you look at that and say, well, that's shrinking my market
share, so I'm going to autoplay the next episode. But then if you're Facebook,
you say, that's shrinking all of my market share, so now I have to autoplay all
the videos in the newsfeed before waiting for you to click play. So the
internet is not evolving at random. The reason it feels like it's sucking us in
the way it does is this race for attention. We know where this is
going. Technology is not neutral, and it becomes this race to the bottom of the
brain stem of who can go lower to get it.
we seem to have the notion that technology is always good. but that is
simply not the case. every technology is always both a burden and a blessing.
not either or, but this and that.
The idea, as near as I can tell, is that the ideal computer should be like a
human being, only more obedient. Anything so insidiously appealing should
immediately give pause. Why should a computer be anything like a human being?
Are airplanes like birds, typewriters like pens, alphabets like mouths, cars
like horses? Are human interactions so free of trouble, misunderstanding, and
ambiguity that they represent a desirable computer interface goal? Further, it
takes a lot of time and attention to build and maintain a smoothly running team
of people, even a pair of people. A computer I need to talk to, give commands
to, or have a relationship with (much less be intimate with), is a computer
that is too much the center of attention.
in a world where computers increasingly become human, they inevitably will
become the center of attention. the exact opposite of what they should be:
invisible, helping us focus our attention on ourselves and the people we live with.
Above all, be prepared for the bullshit, as AI is lazily and inaccurately
claimed by every advertising agency and app developer. Companies will make
nonsensical claims like "our unique and advanced proprietary AI system will
monitor and enhance your sleep" or "let our unique AI engine maximize the value
of your stock holdings". Yesterday they would have said "our unique and advanced
proprietary algorithms" and the day before that they would have said "our unique
and advanced proprietary code". But let's face it, they're almost always talking
about the most basic software routines. The letters A and I will become degraded
and devalued by overuse in every field in which humans work. Coffee machines,
light switches, and Christmas trees will be marketed as AI proficient, AI savvy or
AI enabled. But despite this inevitable opportunistic nonsense, reality will
eventually prevail.
If we thought the Pandora's jar that ruined the utopian dream of the internet
contained nasty creatures, just wait till AI has been overrun by the malicious,
the greedy, the stupid and the maniacal. We sleepwalked into the internet age
and we're now going to sleepwalk into the age of machine intelligence and
biological enhancement. How do we make sense of so much futurology screaming in
our ears?
Perhaps the most urgent need might seem counterintuitive. While the specialist
bodies and institutions I've mentioned are necessary, we surely need to redouble
our efforts to understand who we humans are before we can begin to grapple with
the nature of what machines may or may not be. So the arts and humanities
strike me as more important than ever. Because the more machines rise, the more
time we will have to be human and to fulfill and develop to their uttermost our
most human qualities.
an outstanding lecture exploring the impact of technology on humanity by
looking back at human history in order to understand the present and the future.
We use digital platforms because they provide us with great value. I use
Facebook to keep in touch with friends and family around the world. I've
written about how crucial social media is for social movements. I have studied
how these technologies can be used to circumvent censorship around the world.
But it's not that the people who run Facebook or Google are maliciously and
deliberately trying to make the world more polarized and encourage extremism. I
read the many well-intentioned statements that these people put out. But it's
not the intent or the statements people in technology make that matter, it's
the structures and business models they're building. And that's the core of the
problem.
So what can we do? We need to restructure the whole way our digital technology
operates. Everything from the way technology is developed to the way the
incentives, economic and otherwise, are built into the system.
We have to mobilize our technology, our creativity and yes, our
politics so that we can build artificial intelligence that supports us in our
human goals but that is also constrained by our human values. And I understand
this won't be easy. We might not even easily agree on what those terms mean.
But if we take seriously how these systems that we depend on for so much
operate, I don't see how we can postpone this conversation anymore. We need a
digital economy where our data and our attention is not for sale to the
highest-bidding authoritarian or demagogue.
no new technology has only a one-sided effect. every technology is always both a burden
and a blessing. not either or, but this and that. what bothers me is that
we seem to ignore the negative impact of new technologies, justifying this
attitude with their positive aspects.
If you're never exposed to new ideas and contexts, if you grow up only being
shown one way of thinking about the computer and being told that there are no
other ways to think about this, you grow up thinking you know what we're doing.
We have already fleshed out all the details, improved and optimized everything
a computer has to offer. We celebrate alleged innovation and then delegate
picking up the broken pieces to society, because it's not our fault – we
figured it out already.
We have to tell ourselves that we haven't the faintest idea of what we're
doing. We, as a field, haven't the faintest idea of what we're doing. And we
have to tell ourselves that everything around us was made up by people that
were no smarter than us, so we can change, influence and build things that make
a small dent in the universe.
And once we understand that, only then might we be able to do what the early
fathers of computing dreamed about: to make humans better – with the help of
computers.
the sequel to my previous talk, the lost medium, on bullet holes in world war 2
bombers, page numbering, rotating points of view and how we can escape the
present to invent the future.
after the modest success of last year's talk, the lost medium, i was invited
back by the kind folks of voxxed days belgrade to delve into this topic a bit
further. vdb17 was an amazing experience – again – being one of the biggest and
most inspiring technology conferences in eastern europe, with excellent
speakers from all over the world and more than 800 attendees.
my previous talk focused a lot on the early days of personal computing, the
ingenious ideas we lost over time and the notion that we're not really thinking
about how we can use the medium computer to augment our human capabilities.
after delivering this talk however i had the feeling that i left out an
important question: what now? how can we improve?
this was the basis for my new talk, the bullet hole misconception, in which i
explore how we can escape the present to invent the future and what questions
we must ask if we are to amplify our human capabilities with computers.
feel free to share it and if you have questions, feedback or critique i'd love to hear from you!
The problem of computational analytics is not only in the semantic bias of the
data set, but also in the design of the algorithm that treats the data as
unbiased fact, and finally in the users of the computer program who believe in
its scientific objectivity.
From capturing to reading data, interpretation and hermeneutics thus creep into
all levels of analytics. Biases and discrimination are only the extreme cases
that make this mechanism most clearly visible. Interpretation thus becomes a
bug, a perceived system failure, rather than a feature or virtue. As such, it
exposes the fragility and vulnerabilities of data analytics.
The paradox of big data is that it both affirms and denies this “interpretative
nature of knowledge”. Just like the Oracle of Delphi, it is dependent on
interpretation. But unlike the oracle priests, its interpretative capability is
limited by algorithmics – so that the limitations of the tool (and, ultimately,
of using mathematics to process meaning) end up defining the limits of our
knowledge.
we're talking a lot about the advancement of computational analytics and
artificial intelligence, but little about their shortcomings and effects on
society. one of those is that for our technology to work perfectly, society has
to dumb itself down in order to level the playing field between humans and
computers. a very long read, but definitely one of the best essays i read this year.
Machines have always done things for us, and they are increasingly doing things
for us and without us. Increasingly, the human element is displaced in favor of
faster, more efficient, more durable, cheaper technology. And, increasingly,
the displaced human element is the thinking, willing, judging mind. Of course,
the party of the concerned is most likely the minority party. Advocates and
enthusiasts rejoice at the marginalization or eradication of human labor in its
physical, mental, emotional, and moral manifestations. They believe that the
elimination of all of this labor will yield freedom, prosperity, and a golden
age of leisure. Critics, meanwhile, and I count myself among them, struggle to
articulate a compelling and reasonable critique of this scramble to outsource
various dimensions of the human experience.
our reliance on machines to make decisions for us leads us to displace the most
important human elements in favor of cheaper and faster technology. in doing so,
however, we outsource meaning-making, moral judgement and feeling – which is
what a human being is – to machines.
The tech industry is no longer the passion play of a bunch of geeks trying to
do cool shit in the world. It’s now the foundation of our democracy, economy,
and information landscape.
We no longer have the luxury of only thinking about the world we want to build.
We must also strategically think about how others want to manipulate our
systems to do harm and cause chaos.
we're past the point where developing fancy new technologies is a fun project
for college kids. our technologies have real implications on the world, on our
culture and society. nevertheless we seem to lack a moral framework for how
technology is allowed to alter society.