The recent Cambridge Analytica scandal has opened a Pandora’s box on privacy and has people engaging in passionate conversations about the worldwide impact social media platforms have on the quality of our lives, our thoughts, and ultimately our decisions. It casts a new light on tools such as Facebook and Twitter and has revitalized the discussion of how addicted we are to instruments that seem to manipulate us into oversharing information in a poorly protected space.
As Facebook CEO Mark Zuckerberg faces another day of testimony before Congress, and as more people debate whether to drop their accounts while waiting for the social media giant to fall under some form of government regulation, we talked to one of the founders of an initiative that proposes a set of measures to turn social media into the good guy again.
Aza Raskin, 34, spent his early career at Mozilla as head of user experience and is said to have invented the infinite scroll, a format that offers content in a continuous stream instead of multiple pages. The son of Jef Raskin, who created the Macintosh project at Apple, Aza recently co-founded the Center for Humane Technology in San Francisco, a coalition of early-stage technologists at companies such as Google, Facebook and Apple who have first-hand experience with the way addictive technology tools were designed. The center advocates applying more ethics in building technology as well as increasing awareness among those who use it.
Q: Even your organization’s name is intriguing. Why was there a need for such a center?
A: Every species, if it’s sufficiently intelligent, at some point, will reverse-engineer itself, figure out how and why it believes what it does, and our technology is starting to figure that out. [As humans], we have cognitive biases, we get addicted to things, such as Facebook or our phones, and it starts to affect the way our minds work. It was important to launch the center because right now technology is sort of targeted at human vulnerabilities instead of helping to support us.
We started by addressing tech addiction, which is really a crisis and part of this bigger issue of technology being used to manipulate large groups of people at scale.
Q: What is a “design engineer,” and are they to blame for our addiction to social media and technology?
A: Behind every screen there are generally hundreds of engineers who have touched it, trying to figure out how to make a program more sticky. And that’s because we as consumers generally don’t pay for these products, so the companies make money off of us when they harvest our data and our attention and sell that to somebody else, say, an advertiser. That’s what the life of a [design] engineer looks like. It’s [about] how we get the most data to be able to target our program to be particular for you, to get you to stick around. There are people who sit and think about what message they could send you to get you to come back, and as soon as you do, you get sucked in. And that’s by design.
Q: Are they aware of their impact?
A: I was actually the person, early on on the Web, who sort of invented and popularized the infinite scroll. At the time, I was thinking about this because it was just a better user experience. When you scroll, that already means you want more, so let’s give the user more. But I didn’t foresee that this would then be weaponized against us. Everyone uses infinite scroll now. Why? Because it’s a seduction technique that just doesn’t give your brain the chance to catch up to your impulse. You’re just scrolling on Instagram for that next little drop of dopamine.
Q: How do these tools fuel our addictions?
A: The refresh [button] acts exactly like a slot machine. It’s a random, variable reward, which behavioral psychologists know is the most addictive kind of reward. So when you pull on Twitter to refresh, there is a random chance you’re going to get something new, something that is going to engage you. And so you just keep doing it. There’s a reason why slot machines are the most addictive form of entertainment and make more money than all of Hollywood and video games combined. It’s because they’re preying on a specific human vulnerability, and that’s the job of the engineers: to find the thing, regardless of whether it’s morally good or bad, that maximizes that one number, which is how much time people spend inside of their product.
Q: Do engineers take ethics classes in school?
A: There [are such classes] now, very late and too late. Every program should have at least one class on ethics, on how the project is going to be used. It’s sort of ridiculous because in other fields — biology, chemistry, physics — we’ve had these moments when we realized there’s a lot of good that comes out of them, but they can also be used for incredible destruction.
Q: What vulnerabilities is social media capitalizing on?
A: One of them is just our fear of missing out. Nobody wants to see all of their friends having fun without them. That’s exactly what social media shows us. This is why the rate of loneliness and the rate of depression have spiked since 2007, when the iPhone came out. Another one is that humans socially conform. When you think that the people around you, your friends, believe something different, you believe that, too.
Q: In some parts of the world, Facebook is said to even target people who might suffer from mental disorders to push specific advertisements. How does this work?
A: This was in Facebook Australia. In one of their marketing pitches talking about the power of their product, they [referred] to teenagers who are depressed, [whom Facebook] can detect through signals on its platform, as being more likely to buy cosmetics. We think AI (artificial intelligence) is neutral, but this just means it has no values, which is to say that it’s injecting its valuelessness into our societies. It’s going to go figure out that these kinds of people buy more cosmetics, and it doesn’t care that it’s because they’re depressed. Or it’s going to say, “These kinds of people love conspiracy theories,” and it’s not going to ask the question, “Should we be feeding the most vulnerable people in our populations conspiracy theories?”
If you start anywhere on YouTube and you just let the recommended videos play, no matter where you start, within four or five videos you start heading into conspiracy theories. You start with Hillary, you end with Pizzagate. If you do anything in physics, very quickly you end up in flat Earth [theory]. It’s because what we’ve told the YouTube AI to do is to grab as much of our time as possible, so it starts to find everything that’s controversial.
Q: How can we protect ourselves?
A: Turn off every notification from a non-human. There’s an extension for Chrome and Firefox that lets you turn off the news feed in Facebook, turning the platform back into a utility. We [at the center] all like turning our phones to grayscale, because it just removes some of the dopamine so that your brain has time to catch up with your impulses. Color is totally a factor. You like looking at things that are bright and shiny.
Q: Are there countries where people might not be as addicted to social media?
A: There are norms and social factors to how addictive social media is, in the same way that some countries believe more in the effectiveness of pharmaceuticals. But the part of the human brain that these products are speaking to is sort of the lowest part, jacked into the base of our brain stem. These companies are racing for our attention, and the parts of the human that they are plugging into are below the level of cultural differences between countries. This is universal technology addiction; it’s true everywhere. And the important thing to know is that when you take an AI and you point it at the chessboard, the AI always wins. Right now we’ve taken AI and pointed it at the human mind.
Q: Are there countries taking more of a stance against social media addiction?
A: Europe is doing a pretty good job starting to think about privacy, through the GDPR [the EU General Data Protection Regulation]. It’s not enough, but at least it’s getting there in terms of what our data can be used for and who can use it. And it’s just one step.
Q: Should governments get involved in regulating social media?
A: In an ideal world we, as technology makers, would be doing the right thing, but I think we’ve seen that we are not effectively self-regulating. We’ve seen that we need that external oversight; otherwise the future of politics is micro-targeting, personalized persuasion and big data. And it’s too much power to just leave in the hands of one person or a couple of people, generally white men who are fairly young.
Q: We tend to talk a lot about social media when it comes to addiction, yet people somehow forget that Google was the main driver of customizing everything, targeted advertising, etc. Why is everyone so focused on social media right now?
A: This is the first time people are acutely feeling the effects of technology on them. Before, you got a sense of technology being addictive, but it was just a personal problem. I think this rock is starting to roll, and it’s going to roll faster and faster. We’re going to see it include the Googles and the other major platforms of the world.
Q: Should hardware companies also be involved? Smartphone producers?
A: They’re certainly in the best place to help solve the problem. The way they set up their environment, the space that they give apps, and the way that they allow apps to notify us mean they own the zoning laws of the “mental cities” on our phones.
Q: Who is most at risk?
A: The most vulnerable people. So people who are young, children, whose minds have not yet fully clicked and whose brains are co-evolving with the technology that we’re creating. And we don’t know the effects. But I don’t think anyone is invulnerable.
Q: Is it too late for younger generations to live without social media, or is it just assumed to be a regular part of our evolving lifestyles?
A: I keep hearing that there are groups starting to pull away from social media, but it hasn’t happened at a large scale. The sheer usefulness of our technology is such that it connects a billion people. I don’t think people can just walk away from the social impact of our technology.
Of course, social media does some good for us, but it has all of these external factors that do great harm both individually and at a societal scale. The question is: How do we redesign it so that it doesn’t take advantage of our vulnerabilities?