How Social Media
Imperils Scholarship

It plays on our vanity and reduces research to a popularity contest

John W. Tomac for The Chronicle



June 2, 2019

If you tell me you have never Googled yourself, I will presume you are lying. Some months ago, I engaged in a periodic exercise of this new sort of self-searching, and came across, two or three pages down, a cluster of profiles for various philosophers named Justin Smith — at least one, and perhaps as many as three, of whom seemed to bear some resemblance to me.

They were part of a sort of social-media platform for professional philosophers. The site featured a questionnaire that users were invited to complete, in order to match them with other philosophers according to their positions on Newcomb’s paradox, the trolley problem, and so on. My profile(s) seemed to have been generated from publicly available information: nothing illegal about that, presumably, but it was irksome to discover that I had been signed up for membership in a “community” I had not consented to join.

The existence of this profile for me, on an ostensibly “professional” social-media platform, suggested a responsibility on my part to maintain it. I imagined uploading a picture of myself smiling confidently, chin resting on my hand like a happy version of the Thinker, or like some real-estate agent on his business card — an image of someone who is probably not happy but knows the rules of his trade.

Or perhaps, if I wished to comply more fully with the habitus of Anglophone academic philosophy, I could maintain my profile with a picture of myself kayaking, or on a climbing wall, or hiking the Rockies with a look on my face that says, if only for a fleeting moment, “Boy, do I know how to live.” And if I wished to deviate from those approved pastimes, I might select a photo of myself having a great time with my kids. Except that I do not have kids, and I know nothing of mountain sports.

I wrote to David Chalmers, who directs the site along with David Bourget, and in late March I heard back from him. My machine-generated profile had been graciously removed.


The tools and rules of social media are increasingly swallowing up scholarly work, whether we join the new platforms or not. Where once discernment, hard-won expertise, and glacial, scrupulous scholarship read by only a small cabal of peers carried real weight, the research agenda is now driven by an economy of clicks, likes, follows, and retweets. There are now strong incentives to shape one’s research program in a way that promises at least a moderate sort of virality. It is not that research on Maya burial rites or Old Turkic inscriptions is disappearing entirely, but that the pressure to find catchy yet ultimately irrelevant news hooks is ever growing, along with the pressure to present one’s findings with a bold “Listen up, folks” in a crackling, clickbaity Twitter thread.

But the real thing to fear is not just a collapse of old standards of propriety. Who cares about those? The world will very likely become richer in facts about Maya burials and Old Turkic, and that is surely good, even if the old guard finds uncouth the way these things are spoken of in social media. What is to fear, rather, is that the good people who produce these facts will be increasingly exploited and exhausted, their dopamine mechanisms will be played upon by technologies that function like drugs, they will be rewarded for their unhealthy labors according to irrelevant machine-based quantitative measures that are an impediment rather than a motor of real discovery, and all of this will occur in an increasingly precarious institutional setting.

Eventually, those who do research simply out of love of knowledge might find it preferable to present their findings anonymously and gratuitously as Wikipedia editors rather than as members of the afflicted and moribund tribe of academics. Social media is hastening the arrival of that day.


Academia is not alone in dealing with the pernicious effects of this new system. With Facebook now one of the world’s largest corporations, it is not a loose analogy to say that clicks, likes, follows, page views, and so on are at the foundation of a new global economy. Clicks have radically transformed journalism, for instance, which explains in part why so many New York Times opinion pieces now have all the tone and nuance of a tweet. Increasingly, it is as tweets that they are conceived.

The same click-swipe-and-rate economy has left everyone involved in cultural production dazed and stumbling. Journalism, art, literature, and entertainment have been engulfed by a tsunami of metrics. And dare we mention love, friendship, and political community? These, too, have been absorbed by the mania of metrics coupled with so-called gamification — a treacherous imitation of play. A flood of neurochemicals saturates our dried-out brains when a heart or a thumbs-up pops up in response to a text, or when our dating profiles get a match, or when our hasty yet emphatic political opinions or our pseudo-humble tales of small daily failures are praised and echoed back to us. The more we swipe in the right direction, or achieve whatever minor virality we can get, the more we are rewarded, and the more we hone our future swipes and tweets and posts. The first flood triggers another, and we float along blissfully metricized, trading our subjectivity for an algorithm.


It is no surprise that higher education has, like these other noble realms, been reduced to an impatient game of ratings. In defending the site in the comments section of the philosophy blog Daily Nous, Chalmers explained that the social-mediazation of philosophy was already complete; “that ship,” he wrote, “has long since sailed.” Before this site came along (among many others, no doubt, in other disciplines), there were already ResearchGate, Google Scholar, and the rest, with their onslaught of nudging and deceptive junk emails to insufficiently active users. “A prominent scholar has cited you,” the emails tease, as if “prominent scholars” were Instagram influencers, as if scholarly impact were a matter of fortuitous shout-outs. These platforms operate according to the same metrical principles and the same “You may also like …” algorithms with which Facebook has destroyed one institution after another.

The pressure to join in the metrics race comes from many sources. It bubbles up from below in the form of new norms of engagement among precariously employed junior scholars, who take to Twitter and Facebook out of recreation and desperation. It thunders down from above, in the form of diktats from anti-intellectual managers — or “administrators,” as we still call them — who know nothing of the thing they want to measure, but only that measurement is a convenient way of determining scholarly productivity, and who systematically mistake productivity for value.

The pressure comes from well above middle management, too, via large-scale, government-instituted ratings schemes, such as the College Scorecard in the United States and the REF system in Britain. Our British peers, in particular, seem unable to do so much as assign a text for supplementary reading without filling out a form attesting to the measurable pedagogical payoff for doing so.

Finally, the pressure comes laterally, from peers who meet you at conferences and tell you they’ll look you up online, and from colleagues at department meetings who brainstorm about ways to enhance departmental visibility.


All these levels come together to create a strange conspiracy of what superficially appear to be opposing forces: the cretinous populist politicians who would prefer to see humanities departments closed altogether; the underemployed young academics on social media who are the victims of these politicians’ sinister designs but whose only available outlets come with metrics of their own; and their clueless midcareer peers who believe, or pretend to believe, that affairs in the academy are continuing the same as ever.

In philosophy there are many who celebrate social-media platforms, particularly Facebook, as contributing to the flourishing of the discipline. Thus on the blog of the American Philosophical Association, Rebecca Kukla wrote in 2018 that “Facebook in particular has created a vast set of interlocking philosophical communities. Career considerations aside, this is a good thing.” But as to those career considerations, Kukla openly admits that “staying off of social media altogether can actively harm your career.” One gets a whiff here of the sort of compulsory glee expected of every human subject under a despotic regime: The regime is worthy of celebration; moreover, you must celebrate.

Some scholars have feebly registered their disquiet about what all this might mean for institutions of research and higher learning. Jerry Z. Muller has warned poignantly, in these pages and elsewhere, of “the tyranny of metrics” as a corrupting force in higher education. Even academics who enthusiastically defend Twitter as a useful supplement to a multifaceted career — “something to put on your résumé,” as high-school students might say of debate club or 4-H — sometimes let slip that they can see where this is all heading. Thus Patrick Iber wrote in Inside Higher Ed in 2016 of what tenure requirements might look like in 2025: “one book, good teaching, a record of service, and at least three viral articles. Of all the unreasonable metrics by which we are judged, imagining someone caring about how many Twitter followers we have conjures a special sort of dread.”

Academia was already thoroughly Tinderized by the vain incentives that online platforms create for academics to go in search of ever more likes and swipes. And so it was only natural, under the circumstances, that philosophy, too, should have its own social-media platform. But at a point such as this, it’s time for people like me, those of us fortunate to be far enough along in our careers, secure enough in the expectation of future paychecks, and close enough to death, to look askance at this new reality — to declare, like Bartleby the Scrivener (with his algorithmically intractable declaration of irrational defiance): “I would prefer not to.”

What are we missing out on when we delete our profiles and stop optimizing our PDFs for academic search engines? Most likely just immediate gratification, the dopamine charges that allow us, unwisely, to go on mistaking the pleasurable for the good.

Deleting these noxious profiles and refusing to play the metrics game is not self-erasure. Rather, it is a strategic bet against an unknown future. It is not that we do not wish to be read or discussed or taken seriously, but only that we hope that there may still be, even in this radically transformed world, other forms of impact than those counted in impact metrics and citation indexes. I don’t know what those other forms are, but I feel their reality, almost as one feels faith.

I am grateful that my first education was in the stacks of Butler Library, at Columbia University, at a time when computer screens still had C prompts. I am grateful that I got my first job when under-professionalized aspiring érudits like myself could still get jobs. And I am grateful that I got tenure before social media corroded the line between what goes on in committee meetings and what transpires on the internet’s various hellsites.


But woe unto those who come after me. One deliriously meta scholarly article from 2010, in the Journal of Scholarly Publishing, reminiscent of some late-late-Aristotelian commentary upon a commentary upon a commentary, declares that “Researchers should have an interest in ensuring that their articles are indexed by academic search engines such as Google Scholar, IEEE Xplore, PubMed, and […], which greatly improves their ability to make their articles available to the academic community.” The authors go on, in the strange broken English that has become a telltale sign of this new sort of human/machine hybrid writing: “Not only should authors take an interest in seeing that their articles are indexed, they also should be interesting [sic] in where the articles are displayed in the results list.” Some major publishing operations, such as the European giant Elsevier, seem to make no distinction at all between an author publishing with their press (as it is still quaintly called) and the same author’s vast duties in promoting that publication by ensuring its “discoverability” through managing keywords in an author’s portal.

I have not yet heard of tenure committees taking into consideration information about a candidate’s followings on such sites as they attempt to take the measure of his career success. But it will happen sooner or later. And sooner or later tenure candidates in Ohio will be PayPaling click factories in China to help them inflate their numbers artificially. And after that has gone on for some time, candidates will be required to submit, along with their dossiers, proof that the information has been run through some trusted anti-click-factory certification software, and the metrics have been shown to be authentic. And eventually a way will be discovered to game the certification process, too, and so on, and at each stage academics will be drawn even further away from their ostensible object of study, Old Turkic inscriptions or Elizabethan verse, the thing they once imagined, in graduate school, was worthy of a lifetime of loving dedication.

The Tinderization of scholarship is only the local inflection of a problem that now plagues nearly every sector of society. The restaurant industry has been Yelpified, and it is worrisome enough for restaurateurs and their employees to be subject to the petulance and the megrims of disgruntled customers. But the problem is deeper still. National parks can now be reviewed on Yelp, too, as if “God’s big show” — to cite John Muir, speaking of Yosemite — were a business, as if any mortal had the right to review it.

Scholarship is more like Yosemite than it is like the neighborhood Olive Garden. Of course it must be reviewed, but it must be reviewed by other scholars — a special class of mortals, as fallible as any other, but committed to treating work under review with the seriousness it deserves. The shift of so much of what academics now do to social media, or social-media-like platforms, threatens to circumvent the channels that have long ensured this seriousness.

There are arguments in favor of such circumvention in many domains of our shared social life. It gives the people a voice, it takes the power of taste making away from the snooty elites and delivers art, entertainment, and food that the great masses of men and women actually want. (Yet witness, a few years into this new cultural reality, the domination of inane action-hero movies at the box office, so formulaic and convention-ridden that, if not written by algorithm, they may as well be.) Still, scholarship is, or is supposed to be, a sort of expertise, and thus, by definition, something about which important decisions must not be made by people who lack that expertise. The arguments in favor of democratization of restaurant or movie reviewing simply cannot be extended to the work academics do.

Anyone who has spent time building up a profile on such a platform, or even just mentioning his or her work in the academic corners of Facebook or Twitter, will have noticed that some topics receive warmer responses than do others (which is to say that some topics give you a better dopamine rush than others do). Such dopamine hits, moreover, are fairly reliable indicators of reactions to come, with consequences for getting grants, tenure and promotion, and landing articles in highly selective journals. By now most people involved in decisions about grants, tenure, and article acceptance are at least somewhat aware of the rhythm and the pulse of social media. They are no longer making their decisions in an environment uncontaminated by these, even if it is still their professional responsibility to pretend that this is what they are doing.

I myself work on some topics that share a pulse with the social-media-based academic zeitgeist, as well as on some topics that do not. Some historical figures and ideas that interest me seem tailored to be praised for their concern with that barely understood and seldom examined thing called “diversity,” which is so highly valued within the bureaucratic culture of woke managerialism now predominant in the mesh between American universities and big tech.


For example, I have been working for some years on Anton Wilhelm Amo, an 18th-century African philosopher active in Germany, and I wrote a book four years ago on the history of the concept of race. When I have spoken about these research interests on social media, American academics have showered me with likes and praised me for my commitment to diversity. It has not come to this yet, but someday soon we may be expected to show the managers above us in the university hierarchy digital evidence of our commitment to that ideal. One feels already, when the likes start coming in for such expressions, that they are smoothing the paths of our future careers.

The flood of likes slows to a trickle when I post Amo’s tables of forms of syllogism but omit his name. This material is dry and boring if you have no interest in the syllogistic art, yet I am quite certain that to speak of him in this way, focusing on his work rather than his identity, is exactly what he would have wanted.

The inane fashions and unreflective group-think that reign in Anglophone academia are driven by what happens online. Some clueless, and usually older, academics still try to pretend this is all extramural. There were always inane fashions in academia, of course. Deconstruction and its related games were played for decades in American humanities departments as a result of little more than occasional on-campus visits from bloated Parisian mandarins. But back then the idle chatter that spawned new obligatory ways of speaking took place mostly in lecture halls and in seminars, intra muros. Today this chatter takes place everywhere: waiting in line at the grocery store with phone in hand; while making dinner, lying in bed, sitting on the toilet; and sometimes in classrooms, too, as students signal approval for agreed-upon norms in tweets and posts sent surreptitiously from under their desks.

This is not liberated chatter, as it has moved only from one institutional setting to another, though its new institution — social media and attendant corporate overlords — is much more diffuse than the university, and better at hiding itself. To the extent that academics conform themselves to the logic of social media, a conformation seen nakedly in such professional profile sites, they are only hastening their own obsolescence.

The old world is crumbling. Pre-internet institutions are struggling to make their presence felt however they can. Even the pope has taken to tweeting, in what may be variously interpreted as a hip renovation of his dilapidated old temple, or as a desperate bid to stay relevant in a world that equates an absence of online metrics, of clicks and likes and follows, with nonexistence itself. It is no surprise that in this strange new world, academics are behaving no differently than the Pontifex, or exposure-craving politicians, or SoundCloud rappers, or aspiring team players projecting their can-do attitudes on LinkedIn.

My discipline, philosophy, can exist only where there are free human subjects saying what they actually think. It is strictly incompatible with the algorithmization of the self. A scholar’s views on the mind-body problem or the reality of the external world cannot be captured by a questionnaire and cannot be “matched” by a machine with the views of others. There can be no Tinder for philosophy. Other disciplines might mesh more easily with the new algorithmic technologies that now shape our social life, but I doubt they can do so without extinguishing the humanistic spirit that has long animated them. The more technical domains of the STEM disciplines will continue to flourish. But humanistic inquiry cannot survive within a university swallowed up and denatured by the Gargantua of social media. 

Justin E.H. Smith is a professor of history and philosophy of science at the University of Paris Diderot, and is the author of Irrationality: A History of the Dark Side of Reason (Princeton University Press, 2019).