Is technology failing millennials?

The author of “Kids These Days: Human Capital and the Making of Millennials,” Malcolm Harris, spoke to Strelka Mag about millennials’ relationship with technology and the future of work.

The world has evolved tremendously in recent years. The application of big data, new algorithms, and cloud computing is changing the very nature of how we work, socialize, and create value. A platform economy is emerging with a handful of tech giants such as Amazon, Facebook, and Google dominating the digital realm. Artificial intelligence is lifting automation to an unprecedented level. Millennials are at the heart of this economic and technological transformation; they grew up alongside technology and created much of it. But has the generation benefited from this rapid advancement?

In his book “Kids These Days: Human Capital and the Making of Millennials,” Malcolm Harris, a writer and editor for the online magazine The New Inquiry, analyzes generational trends and labor markets in the United States. He tries to detach his own generation from negative stereotypes of being entitled, self-obsessed, and addicted to social media. Harris argues that millennials are in fact more skilled and better educated than previous generations. From the perspective of “human capital” they should indeed be very valuable. However, the job market doesn’t reward them accordingly: they carry more debt and hold fewer stable jobs than their predecessors. Far from being entitled, millennials find themselves in a position of systematic economic disadvantage. Technological acceleration is partly to blame for the erosion of their professional prospects.

Strelka Mag spoke to Harris about why people are doing more and getting less, and how technology is affecting them.

Malcolm Harris. Photo: Gleb Leonov / Strelka Institute

Harris believes automation has failed in its promise of improving the labor and welfare of workers. “Instead, we have a situation where technology has benefited the owners of technology exclusively, to the detriment of everyone else.” He uses the example of Amazon’s Mechanical Turk, a crowdsourcing marketplace that allows businesses and individuals to post short tasks and pay workers – in cash, or sometimes gift cards – to complete them. “These are small tasks that we think computers are doing, but it’s actually very poorly paid humans behind screens. How come there aren’t photos of child porn all over Facebook? That’s because there are people at computers clicking through pictures of child porn, being like ‘that’s child porn, we can’t put that on there, that’s a torture picture.’ So it’s not actually automation, it’s that labor has come to look more like automation, more like a flow of electricity,” Harris points out.

One might argue that the way millennials work is a step towards a post-labor society, where you can work remotely, be more flexible, and have a better work-life balance. All you really need to work is a laptop and a decent Wi-Fi connection. But Harris is skeptical. “It doesn’t feel like post-labor when you work from your bed. That’s going to affect your sleep and damage your being as a person. I don’t believe we’re supposed to work at all, but if we are, you’re not supposed to be creating value while you’re waking up or going to sleep at night,” he says. “This post-work has also meant the breakdown between work and life. Is it post-work if you’re going to a party at your office? Do you feel obligated to be there? The breakdown of the division between life and work hasn’t gone in the life direction; it’s gone in the work direction, on balance overall, for this generation.”

The average millennial checks his or her smartphone 43 times and spends 5.4 hours on social media each day. Beyond creating content, millennials supply the raw material that platforms turn into profit. Companies like Facebook use every bit of information that users disclose for targeted advertising. Harris argues that platforms are compromised by their commercial nature. “Facebook drives me nuts because every time they’ve got 10 bullsh*t notifications, because they know if they put that little red dot in the corner, people will interact more. They’re wasting your time on purpose because for them it’s not a waste, that’s how they make money.”

Data harvesting opens up a whole set of privacy and security issues. For Harris, concerns about data sharing started long before the Cambridge Analytica case. In 2012, Twitter released Harris’ personal data to a court after he was arrested during an Occupy Wall Street protest on the Brooklyn Bridge. “Unfortunately they will get your information and they will use it against you. If we’re going to combat that, it means not using those platforms, being very careful about how we’re using them in terms of organizing, and then perhaps coming up with our own ways to communicate.”

Malcolm Harris giving a lecture at Strelka in August. Photo: Gleb Leonov / Strelka Institute

Harris is also skeptical about artificial intelligence as a tool for directing society. He argues that “in a capitalist society it’s going to be used for the production of profit and oppression of most people.” He thinks the election of Donald Trump was an example of an AI glitch. “You had news producers and media people looking at what was doing well on Facebook, they were not deciding what content to make based on journalistic ethics. They got him elected because there’s this feedback loop that looks like an AI glitch and is like ‘do more Donald Trump stories, these are doing really well.’ They had sort of taken their human intelligence out of the equation. Instead of thinking ‘is this a good idea? Are we informing people or misleading people? Is this going to end with Donald Trump being elected president?’, they weren’t thinking, they were just like, ‘positive responses from Facebook, do it again.’”


Users, not consumers

So what will our relationship with technology become? Are we creating a future of new possibilities, or a future of undesirable consequences? Harris explains that one of his dystopian fears is the use of people as resources. “I think ‘The Matrix’ is the full end of it, where humans are batteries, you’re hooked up to a food machine, you’ve got a pod to live in, so you're in like a cocoon of full convenience and full exploitation. That is like the end of the road: the turning of people, labor, and thought into a fungible resource like electricity. Something like Uber or the Mechanical Turk or whatever that treats workers as a metered resource like water or electricity is the real danger. So I hope that won’t be what happens, but I worry that’s definitely the road we’re on.”

Setting up political alternatives is a top priority, he believes. “It means collaboration outside our current tech systems. I honestly think we need to go back to hardware, and if you don’t know where the pins on a PCB are being put, that’s not your device. That’s someone else’s.”

He sees a possible solution in early hackers’ ideals of a free and open internet. “My dad grew up in Silicon Valley in the 80s and he was a really early hacker. So then he would literally dive in dumpsters to try to find codes and passwords and stuff. You were really interacting with the technology, it wasn’t just that you were consuming it, it was something you were also helping to produce.”

Harris remembers his friend Aaron Swartz, an internet visionary and political activist who tragically took his own life in 2013 in the face of abusive prosecution. An influential coder, he co-founded Reddit and helped create Creative Commons. “He wanted to be different, to produce a different kind of technology. He really was the best of us, both as a person but also as a scholar, a thinker, and a technologist.”

Aaron Swartz. Photo courtesy Sage Ross

Aaron Swartz was a leading proponent of information freedom and helped to extend the legal framework behind the free software movement. Harris believes that continuing Swartz’s idea of a free internet is a way to reclaim technology and to eliminate the bias, discrimination, and privacy infringement imposed on individuals by data-collecting tech companies.

“He really wanted people to understand the technology they were using and be part of its production, not just its consumption. You think about people as users, not just as consumers but as people who are interacting and producing – and that’s the kind of person he was, always. If he used something, he changed it every time.”
