Vibes, Vectors, and the Biopolitics of Algorithmic Legitimation

It’s the first draft of the introduction to the central chapter of my vibes book.

For decades, various counter- and sub-cultures used the term “vibe(s)” to describe what made them different from the mainstream. From hippies and their “good vibrations” to DJs curating dancefloor vibes to the hip hop publication VIBE Magazine, a vibe was something that pulsed to the beat of a different drummer. As late as 2006, The New York Times used “vibe” to describe the ineffable New-Agey feel of Sedona, Arizona.

However, in the early 2020s, “vibes” discourse got co-opted by capitalism. As Twitter user and sound studies doctoral student @AmbreLynae put it in an August 2021 tweet, “When Black people use the word ‘vibe’ we usually talkin bout kickin back with our friends in a cool place. When [whites] use the word ‘vibe’ they finna gentrify a community.” Appropriated from its subcultural roots, “vibe” has now become the language of brands and entrepreneurs. For example, beauty company CHI has a Gen Z-targeted line of haircare products called “CHI Vibes” that categorizes products not by type (such as mousse, conditioner, etc.), but by vibes such as “wake & fake” or “know-it-all.” The Twitter bio of Geoff Lewis, co-founder of venture capital firm Bedrock, identifies him as a “vibe capitalist.”

“Vibe” is Lewis’s way of describing Bedrock’s signature investment strategy, which looks for startups that orient themselves against conventional market wisdom or “narratives.” As Bedrock’s website explains, it targets startups whose views of the market and their place in it “are either too one-of-a-kind to fit with the popular narratives of the day, or they violate what the narrative gatekeepers deem plausible or possible.” The website lists examples of such narratives, including Tinder’s rejection of the view that “dating apps are always fads” or DoorDash’s flouting of the truism “food delivery is too capital intensive” (by stealing tips from delivery workers). The idea is that startups whose narratives are dissonant with mainstream common sense have the best potential to revolutionize and perhaps dominate their sector. Regardless of this method’s success, the point is that Bedrock measures a startup’s market potential in terms of its vibe or orientation to the world–that’s what “narrative” stands for. Though they once signaled alternative ways of thinking, “vibes” are now so mainstream as to be basic–in October 2021 Gawker editor-in-chief Leah Finnegan put “vibe” on the website’s list of banned words.

At the same time that “vibe” circulated as a term venture capitalists used to appear edgy, queer institutions and influencers began to treat “vibe” as a more progressive alternative to gender, at least when it comes to identity categories. First, in late 2020 a grassroots use of the gender-vs-vibe trope emerged on anglophone social media. For example, the account for Lex, a “text-centered social app that connects queer community,” tweeted in December 2020: “‘are you a boy or a girl?’ i am a vibe.”

In June 2021, nonbinary bimbo influencer Griffin Maxwell Brooks posted a TikTok captioned “no gender just vibes” in which they say: “I’m not a man, right. But I’m also not a woman. Forget gender: I’m a vibe.” From Gen Z influencers to the brands trying to appear cool, “no gender, just vibes” has been circulating in queer online subcultures as a way to express one’s transcendence or overcoming of cisbinary gender. As Vivian Lam wrote in a November 2021 piece in Real Life magazine, internet subcultures are developing “new ways of defining and embodying gender, returning the means of construction to its users — as something less like a binary characteristic, or even an item in a list of options, and more like a ‘vibe.’” As Lam’s description suggests, vibe appears to offer a more expansive and customizable range of options than both the traditional white patriarchal gender binary and a demographic survey like the list of 50+ genders Facebook debuted in 2014. In this context, vibes’ fuzzy and amorphous character seems like a more liberating option than the strict boundaries of clear demographic categories. Like venture capitalist Lewis above, people who frame gender as a vibe understand vibes to be one’s unique and definitive orientation to the world. Vibes are less pre-existing categories and more perspectives or “narratives” that emerge from the specificities of one’s situatedness. In this sense, “no gender, just vibes” appears to be a positive move away from traditionally oppressive systems of identity categorization.

Although vibe may appear to be a more progressive alternative to traditional binary and/or demographic gender categories, the very characteristics that lend it such an appearance, its fuzzy and amorphous boundaries, are central to the methods 21st century algorithmic systems use to define the boundaries of gender categories. As John Cheney-Lippold explains, the algorithms powering recommendation systems such as search engines and targeted advertising constantly adjust or “modulate” gender categories by feeding user data back into the calculation of category boundaries. For example:

Maleness can be constantly evaluated according to feedback data, through which the definitions of maleness can shift (to include or exclude transgendered folk, for just one example) according to the logic of the algorithm. Policies that rely on gendered categorizations then can be automatically and continuously reoriented to address new populations. These algorithmic definitions then supplement existing discursive understandings of a category like gender…with statistical conclusions of what an idea like maleness actually is (Cheney-Lippold, New Algorithmic Identities, 173).

Because recommendation algorithms constantly incorporate new user data into the definition of variables like gender, this method of mathematically modeling gender categories treats them as inherently malleable, anti-essentialist, and flexible. As Cheney-Lippold’s analysis suggests, 21st century patriarchal racial capitalist methods of constructing and governing gender are evolving into something that looks more like vibes and less like the traditional essentialist gender binary. For this reason, “no gender, just vibes” is less a progressive disruption of essentialist binary gender and more an update of what Cheney-Lippold calls the “discursive understandings of a category like gender” into terms compatible with the quantitative tools states and corporations use to surveil and govern us in the 21st century.
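
To make the mechanics of this “modulation” concrete, here is a minimal sketch of the feedback loop Cheney-Lippold describes. It is my own toy illustration with invented values, not his model or any platform’s actual code: the category “male” is represented only as a moving centroid in a space of tracked behaviors, and every batch of new user data shifts that centroid, and with it the category’s boundary, without any fixed definition of maleness ever being stated.

```python
import numpy as np

# Toy sketch of category "modulation": a gendered category as a moving
# centroid in a behavioral feature space. All values and thresholds are
# invented for illustration.

rng = np.random.default_rng(0)

# Seed profile for the category "male" (arbitrary scores on three tracked behaviors).
male_centroid = np.array([0.8, 0.1, 0.5])

def assign(user_vector, centroid, threshold=0.5):
    """A user 'is' male, for targeting purposes, if they sit close enough to
    the current centroid; the definition lives in the centroid, not in any
    stated criterion."""
    return np.linalg.norm(user_vector - centroid) < threshold

def update(centroid, members, learning_rate=0.05):
    """Feed members' behavior back into the category: the centroid drifts
    toward them, so the boundary of 'maleness' moves."""
    return centroid + learning_rate * (members.mean(axis=0) - centroid)

# Each cycle: observe users, assign some to the category, then let their
# data redefine the category for the next cycle.
for week in range(3):
    users = rng.random((100, 3))
    members = np.array([u for u in users if assign(u, male_centroid)])
    if len(members) > 0:
        male_centroid = update(male_centroid, members)
    print(f"week {week}: centroid of 'male' is now {male_centroid.round(3)}")
```

The point of the sketch is only that the category has no essential content: “male” is wherever the feedback loop currently puts it, which is precisely the malleability described above.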

This evolution in the discursive and mathematical governance of gender is an example of a broader evolution in biopower from the normative and normalizing regime Foucault theorized in the 1970s into a regime where qualitative and quantitative profiles/orientations are subject to legitimation rather than disciplinary normation or statistical normalization. Cheney-Lippold does note that 21st century algorithms use a different form of biopower than the one Foucault theorized:

Surveillance practices have increasingly moved from a set of inflexible disciplinary practices that operate at the level of the individual to the statistical regulation of categorical groupings…Cybernetic categorization provides an elastic relationship to power, one that uses the capacity of suggestion to softly persuade users towards models of normalized behavior and identity through the constant redefinition of categories of identity (Cheney-Lippold 177).

According to Cheney-Lippold, the main difference between the kind of biopower exercised through the algorithms fueling 21st century tech platforms and traditional Foucaultian biopower is that instead of compelling conformity to a fixed norm, this new “soft” biopower compels perpetually reattuned adaptation to an evolving norm.

While Cheney-Lippold is correct that recommendation algorithms exercise a new form of biopower, his account doesn’t fully capture the extent to which this new form departs from the traditional Foucaultian model. For example, by describing the “discursive” dimensions of gender as “models of normalized behavior,” Cheney-Lippold makes it clear that he understands biopower to operate through qualitative/discursive norms and practices of normation targeted at groups rather than individuals. In Foucault’s account, however, disciplinary normation is targeted at individuals rather than populations, which are instead governed through practices of statistical normalization. Cheney-Lippold thus frames his departure from the Foucaultian account as a shift, driven by new practices of quantification, in the scale at which discursive norms operate.

While he is correct to highlight the way biopower weaves together qualitative and quantitative modes of governance, Cheney-Lippold doesn’t fully think through the extent to which changes on the quantitative side translate into changes on the qualitative side. In Foucault’s account, discursive norms work in concert with statistical norms (i.e., normal curves, a.k.a. “bell curves”). However, as Cheney-Lippold correctly notes, algorithms like the recommendation systems he studies in his article use different statistical models–vectors rather than normal curves. As he explains, “the move toward a vector-based category marks a separation of the category from its normative domain and a reassociation of that category according to statistical correlation” (Cheney-Lippold 170). 21st century algorithms model individuals and groups as vectors: quantities with both a magnitude and a direction, typically pictured as arrows pointing a particular way in a multidimensional feature space. As I argue in this chapter, vectors are a different kind of mathematical object than normal curves: whereas normal curves measure the relative frequency of variables in time, vectors measure the orientation of variables in space. Cheney-Lippold correctly intuits that this form of mathematical modeling manifests discursively/qualitatively as something other than a traditional disciplinary norm targeted at individual bodies. However, he frames this new discursive manifestation as a malleable norm policed at the group level:

the normalizing force of what it means to be categorized as male becomes a biopolitical activity. The category of gender in our example is a result of statistical analyses that regulate what Foucault calls the ‘conduct of conduct’, or more appropriately by discursively positioning maleness to indirectly, and largely unintentionally, control the conditions of possibilities afforded to users who are targeted as male (Foucault, 1982). (Cheney-Lippold 177)

In Cheney-Lippold’s account, vectorial categories translate into different kinds of discursive norms, ones that are malleable rather than fixed and whose “normalizing force” regulates the “possibilities” afforded to users rather than their strict conformity to a fixed standard. Whereas Cheney-Lippold thinks the discursive/qualitative dimension of the form of biopower exercised by vectorial algorithms is still a kind of norm or normativity, I argue that because vectors are mathematically different from normal curves, their discursive manifestations are not norms but orientations or, you guessed it, vibes. The kind of governance algorithms perform is not a ‘softening’ of traditional normative biopower, but an entirely new regime at both the quantitative and qualitative/discursive levels.
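
The mathematical contrast can be stated schematically. The sketch below is my own toy comparison with invented numbers, not Cheney-Lippold’s model: a normal curve asks how far a single measurement deviates from a population average, while a vector comparison asks how closely two orientations point the same way.

```python
import numpy as np
from scipy import stats

# Toy comparison of two ways of locating the same user (invented numbers).

# (1) Normative logic: score one measured trait against a bell curve.
#     The question is "how far from the average is this value?"
population_mean, population_sd = 50.0, 10.0
user_trait = 72.0
z = (user_trait - population_mean) / population_sd
rarity = 2 * (1 - stats.norm.cdf(abs(z)))  # two-tailed "how unusual?" estimate
print(f"z-score {z:.1f}: values this far from the norm occur ~{rarity:.1%} of the time")

# (2) Vectorial logic: represent the user and the category as directions in a
#     feature space and ask how closely they point the same way.
def cosine(a, b):
    """Cosine similarity: 1.0 means identical orientation, 0.0 means orthogonal."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

user_profile  = np.array([0.9, 0.2, 0.4, 0.0])  # hypothetical behavioral features
male_category = np.array([0.8, 0.1, 0.5, 0.1])  # current direction of the category
print(f"alignment with the 'male' vector: {cosine(user_profile, male_category):.2f}")
```

In the first case the user is placed on a one-dimensional scale of ab/normality; in the second they are given an orientation, and what matters is the angle between that orientation and the category’s current direction.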

Just as traditional Foucaultian biopower links quantitative statistical normalization with qualitative norms and normation, this new form of biopower weaves together the quantitative practice of modeling vectors and measuring correlations among their orientations with the qualitative practice of perceiving vibes and assessing their degree of dis/alignment. For example, tech users targeted as male are not governed through their position on a spectrum of ab/normality, but through what Cheney-Lippold calls the “control [of] the conditions of possibilities afforded” to them–i.e., through control of their orientation or vibe. The degree to which a particular user’s gender profile aligns or disaligns with the vectorial direction of then-current gender categories determines what is more readily and likely to happen through and to them–i.e., their orientation.
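
Continuing the toy example, here is one hypothetical way (invented values, not any platform’s actual pipeline) that alignment could translate into governance: the possibilities offered to a user are simply a catalog of candidate items reordered by how well each item’s vector aligns with the vector currently assigned to that user, so that poorly aligned options never appear at all.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two orientation vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical vector a platform currently assigns this user.
user_vector = np.array([0.9, 0.2, 0.4, 0.0])

# Candidate items (ads, songs, posts), each with an invented vector.
catalog = {
    "razor ad":       np.array([0.8, 0.1, 0.5, 0.1]),
    "mascara ad":     np.array([0.1, 0.9, 0.2, 0.3]),
    "protein powder": np.array([0.7, 0.0, 0.6, 0.2]),
}

# What the user is "afforded" is the catalog reordered by alignment;
# items below the (arbitrary) cutoff simply never get shown.
ranked = sorted(catalog.items(), key=lambda kv: cosine(user_vector, kv[1]), reverse=True)
for name, vec in ranked:
    score = cosine(user_vector, vec)
    status = "shown" if score > 0.6 else "withheld"
    print(f"{name}: alignment {score:.2f} -> {status}")
```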

Studying both the mathematical models and processes behind recommendation algorithms, AI and ML models, and other forms of contemporary algorithmic culture, on the one hand, and vernacular “vibes” discourses, on the other, this chapter argues that a new form of biopower has emerged in the 21st century. This new form of biopower models the world as vectors and vibes and governs those objects by measuring the degree of their legitimacy, i.e., their capacity to contribute to the patriarchal racial capitalist distribution of property and personhood. Sexuality is still central to this form of biopower, but not at the level of individual norms and normalized populations; rather, as Melinda Cooper has argued, 21st century biopolitics polices sexual legitimacy, or the capacity of an individual or group to privately assume the costs of sexual behavior.

Because vectors and vibes are both orientations, the chapter begins by developing a definition of “orientation.” Looking to the work of Sara Ahmed and Linda Alcoff, I use feminist of color phenomenology to define orientation as the contextual situatedness or horizon from which some possibilities are more immediately realizable than others. Then, using work by scholars such as Lisa Adkins, Justin Joque, and Nick Seaver, I show how the vectorial math behind 21st century algorithms is a quantitative practice of modeling and comparing orientations. Next, I turn to the qualitative side of this form of biopower. Analyzing the vernacular use of the term “vibe” in 21st century anglophone popular culture, I show how vibes discourse is also a method of modeling and comparing orientations. Then, studying how social media users use the “vibes” hashtag across platforms like Instagram, TikTok, and Twitter, I show that vernacular vibes discourse is practically analogous to the procedures algorithms use to construct vectorial profiles. To conclude, I study the use of vibe as a musical category on streaming platforms–“no genre, just vibes,” as it were–and how this practice of discursive categorization works in concert with the recommendation algorithms fueling those same platforms to reproduce pop music’s traditional patriarchal racial capitalist gender and sexual status relations through logics of il/legitimate orientations.