
Stupid Is As Stupid Does

Osita Nwanevu
11 min read
Photo of Twitter’s logo on a phone by Joshua Hoehne on Unsplash

Hey all. Let’s hop to it.

Recent Work

I have two pieces in The New Republic’s May special issue, “Democracy in Peril.” The one that went up online yesterday, “Will the Democrats Fight to Save Democracy?,” is a rundown of what happened to the voting rights and democracy reform bills, with some musings on where the reform push goes from here. The short-to-medium-term answer is nowhere. That leaves the long term:

The central challenge of the democratic reform effort is that the system’s anti-democratic and inegalitarian biases must be worked through—with both political persuasion and political organization in regions more white and more conservative than the places bearing the brunt of voter suppression—before they are overcome.
That task—a transformation of the American political landscape—wasn’t going to be accomplished over the course of a congressional term. As such, the drubbing Democrats are likely to see in November should be viewed as the beginning of a new stage in the reform push rather than its end. There’s much to do in the near term to shore up the democratic process, beginning with efforts to protect voters and the vote ahead of 2024. “You have to go to the states,” [Democracy 21’s Fred] Wertheimer told me, to fight the coming laws aimed at voter suppression and election sabotage. “Trump’s followers are out there attacking and harassing election officials; they’re trying to get into those positions themselves. That’s a battle that has to be fought in the states and in local communities.”
At the federal level, reform legislation is functionally dead until 2024 at a minimum and, given the biases of the Senate and the Electoral College, potentially for many, many years to come. But democratic reformers should continue building public support for the provisions of the For the People Act and more ambitious proposals anyway. If there’s a future for the project of American democracy, it’ll be built not by politicians in Washington hoping to save their own seats from election cycle to election cycle, but by activists capable of seeing that legislative battles are only fronts in a larger, winnable ideological war.

Politics/Culture

I’m going to hop back on Twitter briefly this week or next before going off again to continue working on my book. The first chapter’s been sent off and I feel like I’m making decent time. The Twitter breaks have been helpful productivity-wise, but I doubt I’m going to be online as constantly as I once was even when the book’s finished. I’ve been ambivalent about social media from the jump, really. I’m not into sharing the intimate details of my life with strangers; it always feels like work to put together, photograph, and post even the things I’d be interested in sharing. I’m a free rider in the system for the most part ⁠— I can read and watch the things other people put up all day ⁠— except when I’m posting about politics. I don’t know that it’s any more of an exercise in futility than it was when I started up on Twitter a decade ago, but it certainly feels like one now. Even at their substantive best, the conversations on there are predictable and repetitive; if there’s an ultimate prize for being right online that makes it all worthwhile, I don’t know anyone who’s won it. All of this fully justifies Twitter’s destruction as far as I’m concerned; I dearly hope Musk buys it and runs the place into the ground.

That still leaves me far from believing or arguing, like most pundits on the subject, that Twitter is actually destroying American life. The latest major entry in this genre comes to us from Jonathan Haidt, whose 2015 essay “The Coddling of the American Mind” kicked the panic over what’s now called “wokeism” into high gear.

This new essay’s title, “Why the Past 10 Years of American Life Have Been Uniquely Stupid,” enticed me until I found out who’d written it. For a while now, I’ve been circling the thought that things are both materially better and substantially dumber than they were 50 or 100 years ago; that feels right until you remember that the evils and injustices of the past were also, obviously, deeply stupid. We can’t laugh easily about, say, segregation. But on one level, it was, of course, laughable ⁠— the gravity of such things tends to occlude the fact that they were as much the handiwork of fools, frauds, and hustlers as any of our contemporary political inanities. This is one of the ideas I tried to get at in my review of Alan Taylor’s work on early American history.

Still, even if the last 10 years of American life haven’t really been uniquely stupid, our technological innovations have made them genuinely unique. That hardly justifies, though, Haidt’s centering Facebook and Twitter as the locus of all our problems:

By 2013, social media had become a new game, with dynamics unlike those in 2008. If you were skillful or lucky, you might create a post that would “go viral” and make you “internet famous” for a few days. If you blundered, you could find yourself buried in hateful comments. Your posts rode to fame or ignominy based on the clicks of thousands of strangers, and you in turn contributed thousands of clicks to the game.
This new game encouraged dishonesty and mob dynamics: Users were guided not just by their true preferences but by their past experiences of reward and punishment, and their prediction of how others would react to each new action. One of the engineers at Twitter who had worked on the “Retweet” button later revealed that he regretted his contribution because it had made Twitter a nastier place. As he watched Twitter mobs forming through the use of the new tool, he thought to himself, “We might have just handed a 4-year-old a loaded weapon.”
[...] Social media has both magnified and weaponized the frivolous. Is our democracy any healthier now that we’ve had Twitter brawls over Representative Alexandria Ocasio-Cortez’s “tax the rich” dress at the annual Met Gala, and Melania Trump’s dress at a 9/11 memorial event, which had stitching that kind of looked like a skyscraper? How about Senator Ted Cruz’s tweet criticizing Big Bird for tweeting about getting his COVID vaccine?
It’s not just the waste of time and scarce attention that matters; it’s the continual chipping-away of trust. An autocracy can deploy propaganda or use fear to motivate the behaviors it desires, but a democracy depends on widely internalized acceptance of the legitimacy of rules, norms, and institutions. Blind and irrevocable trust in any particular individual or organization is never warranted. But when citizens lose trust in elected leaders, health authorities, the courts, the police, universities, and the integrity of elections, then every decision becomes contested; every election becomes a life-and-death struggle to save the country from the other side. The most recent Edelman Trust Barometer (an international measure of citizens’ trust in government, business, media, and nongovernmental organizations) showed stable and competent autocracies (China and the United Arab Emirates) at the top of the list, while contentious democracies such as the United States, the United Kingdom, Spain, and South Korea scored near the bottom (albeit above Russia).
Recent academic studies suggest that social media is indeed corrosive to trust in governments, news media, and people and institutions in general. A working paper that offers the most comprehensive review of the research, led by the social scientists Philipp Lorenz-Spreen and Lisa Oswald, concludes that “the large majority of reported associations between digital media use and trust appear to be detrimental for democracy.” The literature is complex—some studies show benefits, particularly in less developed democracies—but the review found that, on balance, social media amplifies political polarization; foments populism, especially right-wing populism; and is associated with the spread of misinformation.

I’ve shared figures like these before ⁠— on Twitter, ironically ⁠— but they’re worth repeating here. Every so often, the Pew Research Center publishes data on where Americans get their news. According to surveys conducted mid-last year, a 52 percent majority of Americans still say they don’t get their news from social media even “sometimes”; that number includes over a quarter of Americans under 30. (On Twitter specifically, Pew finds that 77 percent of Americans don’t use the site.) Pew’s validated voter data suggests about a 55 percent majority of the electorate was over the age of 50 in 2020; their news surveys show that only 38 percent of Americans ages 50 to 64 and 31 percent of Americans over 65 report getting any news from social media.

None of that means that Twitter and social media haven’t been influential or that those numbers aren’t steadily rising; I’m certain they’ll continue to rise. But the central medium of influence in American politics and cultural life is still, far and away, television, which remains a news source for 68 percent of Americans and the preferred source for the older Americans who are the likeliest to engage with politics, the likeliest to vote, and the likeliest to have their votes actually matter. I’ve written about this at more length elsewhere, so I won’t go through all the points again now. To his credit, Haidt talks a little about this in the piece, although he focuses on Fox News. He has less to say about the misdeeds of the centrist press on and off screen; his rundown of the polarizing “misinformation” coarsening our politics doesn’t include episodes like The Atlantic misreporting the banh mi dust-up at Oberlin and other campus dramas.

In fact, the piece in some places amounts to a full-throated defense of the way The Atlantic and other outlets have covered cultural controversies over the last decade ⁠— one being offered as the right’s efforts to capitalize on their sensationalism intensify. Haidt does clarify, though, what he and his peers have found so threatening about the way young people in particular have been able to project and instantiate their cultural views on social media ⁠— we now have a less-hierarchical, more participatory discourse rather than a “‘mass audience’ all consuming the same content, as if they were all looking into the same gigantic mirror at the reflection of their own society.” “The norms, institutions, and forms of political participation that developed during the long era of mass communication are not going to work well,” he laments, “now that technology has made everything so much faster and more multidirectional, and when bypassing professional gatekeepers is so easy.”

But “the long era of mass communication” wasn’t very long at all. It was a mid-20th-century anomaly ⁠— then-new technologies of mass communication joined Americans across the country who’d previously had little to do with each other. Most of our pre-digital history was chaotic, fractious, and violent; any full account of our post-digital history and what social media’s done to us so far has to grapple with the fact that the internet, like television and radio before it, has also brought Americans together, albeit in different ways with different consequences. The hundreds or thousands of divisions that used to characterize American sociopolitical life have finally ⁠— with the aid of technologies that have bridged physical distances as fully as we might hope to short of teleportation ⁠— collapsed mostly into a single gulf between the reds and the blues.

In fact, one of the things that ought to disappoint us the most is that social media hasn’t fractured us enough. It is true, as Haidt says, that the web has been more homogenizing than many expected; I don’t think success on that front would mean more room for “dissent” from progressive cultural politics than the space already taken up by anti-woke writers at The New York Times, The Atlantic, The Wall Street Journal, New York Magazine, Substack, and so on. These institutions are more responsible for the deadening sameness of our discourse than any other camp; our debates are shallow partially because they insist we have them on their terms over, and over, and over again.

There are lonely exceptions at these outlets, of course. For her first piece as an opinion columnist for the Times, sociologist Tressie McMillan Cottom took a look at our discourses around shame, which she argues ⁠— as Julie Claire did for Gawker recently ⁠— we could use a lot more of:

Not everyone is a bad actor when it comes to concerns about shame. There absolutely is an online outrage machine that targets people, exploits the way internet platforms work and causes psychological terror in the process. Much has been said about Justine Sacco, arguably the first person to lose her job because of the internet outrage machine, in 2013. She told a bad joke on Twitter, boarded an international flight and was fired soon after she landed. It was harsh. Sacco took a hit professionally. But today she is working in the profession for which she trained. If she was ever truly stigmatized, her very public shaming does not appear to have erased her from society.
Meanwhile, Black professionals without a hint of shame chemtrails are stigmatized for their hair, their names, their complexions and their speech patterns. Stigma — appropriately levied or not — may cause shame, but shame is not the reason Black professionals experience discrimination in the labor market. Sacco was shamed. Black workers are stigmatized. Both of those situations may feel like a public problem, but no one can credibly argue that they involve the same stakes. When we elevate shame from psychological state to social problem, we value systems of oppression that stigmatize those with the least power.
No matter what some would say about shame, public life is more plural and diverse and democratic than it has ever been. What we took for consensus in a smaller public square was really domination of those who could not afford the price of entry by those who could. The internet has lowered the cost of participation and weakened institutions’ control over what constitutes legitimate discourse. This comes with some trade-offs that are not always worth it. But shame, alone, is not evidence of a bad trade. As a secondary emotion, it matters what accompanies the shame. If a bigger public square with more equal access is the primary condition, then shame is evidence of a democratic society operating democratically.

I’m not sure about her distinction between “shame” and “stigma,” but these are broad points I’ve tried to make myself. Beyond this specific subject, I think we’re going to see more writing of this kind in the years ahead, including from me. Something, as I’ve said before, is shifting among our better commentators and critics. If I could sum it up briefly, I’d say that efforts are being made to reapproach and construct more affirmative answers to the age-old questions ⁠— What is art? What is truth? What is justice? What is character? What is the good life? ⁠— now that the gatekeepers who long managed the debates over them have been weakened or toppled. Conservatives are not wrong to perceive that the very premises underlying those questions have been philosophically challenged over the course of the last 50 to 60 years. Those challenges were to our benefit. But they didn’t make it much easier for us to understand how, exactly, we’re supposed to live. So, we want answers again ⁠— new ones, from a different set of voices.

So far, some of the new answers being offered ⁠— like the notion that shame is valuable ⁠— actually read like new takes on some of the old ones. Here’s another that feels the same way: while it will never solve a single structural problem, it’s important for each of us to practice self-discipline, particularly with things that might easily become all-consuming. Social media is one of them. No one is actually entitled to a frictionless, pleasant time on Twitter; almost no one in America actually has to use it, professionally or otherwise. The capacity to discern what’s worth your time and what isn’t, to understand that something isn’t important or a crisis merely because it happened to pass before your glazed-over eyes ⁠— this is a good trait to develop. And things feel as stupid as they do largely because the people who need this advice the most are the least likely to take it.

Reasons to Be Cheerful

A Song

“I Against I” ⁠— Bad Brains (1986)


Don't forget to send in questions for Mail Time if you've got any: nwanevuletter@gmail.com

Bye.