This Is Your Brain On Technology…How You’re Getting Jacked

Conversations about technology and the way it shapes how we think about the world are commonplace, and as societies continue to develop newer, stronger, and more effective tools for living, philosophical debates are never far behind. With computers and robots, we worry about the downfall of the sentient being and the rogue A.I. that could one day revolt against us. And while that scenario is fantastical, we're already seeing drastic changes in the way we delegate tasks to nonhuman "workers." In science and medicine, developments like animal-human embryonic hybrids and genetic testing elicit conversations about eugenics, while inventions like virtual reality headsets make it easier for us to experience lifelike replications of things like space travel without ever leaving our homes. But the existential qualms about how technology is changing the way we live are not confined to a black-and-white framework of good or bad. The question of ethics is inextricably linked to many of our technological developments, and perhaps nowhere are the implications more dire than in the context of the human mind. One man is determined to show us just how insidious technology has become, and just how deeply it has burrowed into our minds.

Can technology actually hijack our minds? That question is at the forefront of an extensive Medium piece by Tristan Harris, whose job title, Design Ethicist, describes a role in which the technical and the conceptual constantly merge. "I'm an expert on how technology hijacks our psychological vulnerabilities. That's why I spent the last three years as Google's Design Ethicist caring about how to design things in a way that defends a billion people's minds from getting hijacked," Harris writes. Early on, he implores readers to ask how and when technology can "exploit our minds' weaknesses," an avenue of reflection he says he first picked up as a working magician. "Magicians start by looking for blind spots, edges, vulnerabilities and limits of people's perception, so they can influence what people do without them even realizing it," he says, suggesting that tech works in similar ways. In fact, the men and women who design the products we use – from mobile apps to the medicine we take – "play your psychological vulnerabilities (consciously and unconsciously) against you in the race to grab your attention."

Harris buttresses his statements with visual aids and examples drawn from search engines, supermarket design, Yelp, social media, and more. The first hijack he discusses, "If You Control the Menu, You Control the Choices," explores the psychology of implied freedom when, in reality, few choices exist. When using the internet to find a service, a product, or even other people, we often become distracted by what's put in front of us and get sidetracked from our initial search. For example, "[y]ou open Yelp to find nearby recommendations and see a list of bars. The group turns into a huddle of faces staring down at their phones comparing bars. They scrutinize the photos of each, comparing cocktail drinks. Is this menu still relevant to the original desire of the group? It's not that bars aren't a good choice, it's that Yelp substituted the group's original question ('where can we go to keep talking?') with a different question ('what's a bar with good photos of cocktails?') all by shaping the menu," he explains. The illusion of choice – to borrow a magician's term – is presented in a way that makes us feel empowered, and we come to treat our smartphones as the repository for everything we seek. But the reality is that what we see on our screens is a highly filtered, algorithmically ranked, and sometimes flatly inaccurate collection of data. "'What's happening in the world?' becomes a menu of news feed stories," Harris offers as an example.
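To see how "shaping the menu" works mechanically, consider a toy sketch in Python. Nothing here is Yelp's actual code; the venue data, field names, and ranking criteria are invented purely for illustration. The point is that the same data can answer two different questions depending on a sort key the user never sees:

```python
# Hypothetical venue data; fields and values are invented for illustration.
venues = [
    {"name": "Quiet Cafe",   "noise_level": 1, "cocktail_photos": 4},
    {"name": "Neon Bar",     "noise_level": 9, "cocktail_photos": 87},
    {"name": "Hotel Lounge", "noise_level": 3, "cocktail_photos": 12},
]

def rank_for_conversation(venues):
    # The menu for the group's real question: "where can we keep talking?"
    return sorted(venues, key=lambda v: v["noise_level"])

def rank_for_engagement(venues):
    # The menu the app serves: the most photogenic cocktails first.
    return sorted(venues, key=lambda v: v["cocktail_photos"], reverse=True)

print([v["name"] for v in rank_for_conversation(venues)])
# ['Quiet Cafe', 'Hotel Lounge', 'Neon Bar']
print([v["name"] for v in rank_for_engagement(venues)])
# ['Neon Bar', 'Hotel Lounge', 'Quiet Cafe']
```

Both menus feel like free choice to the person scrolling them; only the hidden ranking criterion has changed.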

"Hijack #2: Put a Slot Machine In a Billion Pockets" opens with a startling statistic. "The average person checks their phone 150 times a day. Why do we do this? Are we making 150 conscious choices?" he asks. Imagine a slot machine lighting up each time you mindlessly scan your phone's screen.

What's at play here is the idea of rewarding certain behavior. On a psychological level, many of us feel positive reinforcement when we realize we've received a text, missed a call, or been tagged in a post. It is, in many ways, the same concept that keeps gamblers feeding slot machines. As Harris writes, "If you want to maximize addictiveness, all tech designers need to do is link a user's action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward (a match, a prize!) or nothing." Tech designers know how reliable this mechanism is, given that "[s]lot machines make more money in the United States than baseball, movies, and theme parks combined." As such, our smartphones become slot machines of sorts, and with each scan of our notifications we are chasing the rewarding feeling of having earned something. Harris says this is a development the tech giants need to address, arguing that "Apple and Google have a responsibility to reduce these effects by converting intermittent variable rewards into less addictive, more predictable ones with better design. For example, they could empower people to set predictable times during the day or week for when they want to check 'slot machine' apps, and correspondingly adjust when new messages are delivered to align with those times." Here, he's taking developers to task for changing human behavior on a scale so grand that we need help changing our habits.
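The mechanism Harris describes is what behavioral psychologists call a variable-ratio reward schedule: the action is constant, the payoff is unpredictable. A minimal Python simulation of the loop makes the analogy concrete; the 30 percent reward probability and the reward names are invented for illustration, not taken from any real app:

```python
import random

def check_phone(reward_probability=0.3):
    """One 'pull of the lever': refresh the notification screen.

    Sometimes there is an enticing reward, sometimes nothing at all;
    that unpredictability is what makes the loop so compulsive.
    """
    if random.random() < reward_probability:
        return random.choice(["new like", "new message", "tagged in a photo"])
    return None  # nothing new, so we pull the lever again later

checks = 150  # Harris's figure for how often the average person checks daily
rewards = [r for r in (check_phone() for _ in range(checks)) if r]
print(f"{checks} checks, {len(rewards)} paid out ({len(rewards) / checks:.0%})")
```

Because the reward is intermittent rather than guaranteed, no single empty check teaches us to stop; the next pull might always be the one that pays.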

Harris' sprawling article also covers how the fear of missing out on something important dictates much of our behavior, leading us to remain friends with people on social media with whom we rarely, if ever, interact. Also examined closely is the need for social approval encouraged by apps like Instagram, where we often judge the success of our days by how many people interacted with our photos. In examining the concept of social reciprocity and our need to partake in tit-for-tat exchanges to feel like we're doing something, Harris writes about LinkedIn and its formula for growing its membership. YouTube becomes the example of the bottomless pit of content we can consume online without even thinking about it. Facebook's chat function is the poster child for the fact that "messages that interrupt people immediately are more persuasive at getting people to respond than messages delivered asynchronously." The social-media platform is mentioned once again in what Harris calls "Bundling Your Reasons with Their Reasons" – "when you want to look up a Facebook event happening tonight (your reason) the Facebook app doesn't allow you to access it without first landing on the news feed (their reasons), and that's on purpose. Facebook wants to convert every reason you have for using Facebook, into their reason which is to maximize the time you spend consuming things." In closing, Harris notes the difficulty of unsubscribing from things like digital newspaper subscriptions (unsubscribing from an e-mail list, for example, isn't always as easy as clicking a button; being left alone can take many steps), observing that "[b]usinesses naturally want to make the choices they want you to make easier, and the choices they don't want you to make harder."

For those who are naturally contemplative, these hijacks are likely to sound familiar – perhaps you've even noticed them happening in real time. The question is what we should do with this information. It's a tough one to answer for many reasons, particularly given the sheer amount of information involved. "Imagine whole bookshelves, seminars, workshops and trainings that teach aspiring tech entrepreneurs techniques like this. They exist," Harris claims. His closing thesis – that we should begin to think of our tech gadgets and programs as extensions of our literal selves – posits that our time is valuable and should be respected as much as other things we hold dear, like privacy.

How many times has your mind been hijacked today?