Listening & Compression As Metaphors For Algorithmically Curated News Feeds
This is a revised version of a post I originally published on 9/10/14. Thanks for all the feedback! I’m still not finished with this argument, so if you have further ideas, I’d love to talk.
In her post on Twitter’s proposed switch to a Facebook-style algorithmically curated feed, Zeynep Tufekci argues that “Twitter ha[s] a signal-to-noise problem…But remember, one person’s noise is another’s signal.” Important posts–signal–get buried under avalanches of unimportant ones–noise. Algorithms are supposed to filter out the noise and let a clear and accessible signal emerge. The problem, Tufekci argues, is that users and the corporations behind social media might have very different senses of what counts as “signal.” For corporate social media, signal is whatever maximizes user interaction–whatever gets us to do our “free labor” more efficiently–such as the kinds of announcements that etiquette demands we meet with a brief salutary response. As Tufekci explains, “Facebook’s algorithm pushes such posts with the phrase ‘congratulations’ in the comments to the top of your feed.”
The “signal-to-noise” metaphor is commonplace in statistics and data science; it’s even in the title of Nate Silver’s book on statistical forecasting. But in the case of algorithmically processed social media timelines, I think the audio metaphor runs deeper and further than usual. Some of the audio engineering techniques developed by the telephone, radio, and recording industries to address their signal-to-noise problems are the conceptual, and perhaps even technical, bases for social media’s solutions to its signal-to-noise problems.
As Jonathan Sterne argues, “the institutional and technical protocols of telephony also helped frame…the basic idea of information that subtends the whole swath of ‘algorithmic culture.’” One of these protocols is the practice of “eliminating redundant data in a transmission,” which Sterne calls “compression.” Compression, he explains, began as an attempt to make phone signals work more efficiently and profitably. Most people can’t hear the full spectrum of signals in a phone call, so why waste energy and bandwidth transmitting redundant data? Thus, Sterne writes,
By calculating the limits of human hearing, and of the parts of hearing most necessary for understanding speech, AT&T was able to create a system that only reproduced those parts of the signal…AT&T essentially created surplus bandwidth in its phone lines where none had existed before. It was able to monetize the gaps in human hearing and thereby use bandwidth more efficiently to generate more income.
Cutting out redundant data meant that you could cram more calls on the same phone line, thus reducing costs and optimizing profits. Instead of expanding infrastructure to accommodate more business, AT&T intensified the infrastructure it already had.
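The logic Sterne describes can be made concrete. Below is a toy sketch of my own (not anything from the post or from AT&T’s actual circuits) using mu-law companding, a real telephony technique standardized in ITU-T G.711 that is built on exactly this principle: spend quantization resolution only where hearing is sensitive, so eight bits can carry what would otherwise take many more.

```python
import math

# Mu-law companding (ITU-T G.711, North American telephony): a logarithmic
# curve that gives quiet samples fine-grained steps and loud samples coarse
# ones, "monetizing the gaps in human hearing" as Sterne puts it.
MU = 255.0

def mu_law_encode(x):
    """Compand a sample in [-1, 1] onto a logarithmic scale in [-1, 1]."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_law_decode(y):
    """Invert the companding."""
    return math.copysign(((1.0 + MU) ** abs(y) - 1.0) / MU, y)

def quantize(v, levels=256):
    """Uniform 8-bit quantization over [-1, 1]."""
    step = 2.0 / (levels - 1)
    return round(v / step) * step

# A quiet sample: companding before quantizing preserves it far better
# than spending the same 8-bit budget uniformly.
x = 0.01
companded = mu_law_decode(quantize(mu_law_encode(x)))
uniform = quantize(x)
```

The design point is the one Sterne makes about AT&T: the curve is shaped by a model of the listener, so the “surplus” bits freed up by human perceptual limits can be sold as extra capacity.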
We can understand the Facebook News Feed algorithm as a similar attempt to intensify existing infrastructure by eliminating the “redundant” data that flows between users. First, Facebook has–or is at least widely perceived to have–reached market saturation in the US and the UK. Like AT&T, then, Facebook finds intensification more feasible than expansion. So, again like AT&T, Facebook intensifies the profitability of its existing user base by “compressing” the flow of signal among them. However, instead of selecting for communicative intelligibility, Facebook selects for interactivity. In the case of telephony, “redundant data” is what ‘normal’ listeners don’t need to hear to successfully interpret speech. In the case of Facebook, “redundant data” is what free laborers don’t need to see to do their job well (and the absence of which actually makes them more efficient). As Brian Boland puts it in his 5 June 2014 post on “Facebook for Business,” Facebook thinks “there is now far more content being made than there is time to absorb it.” So the limit of “redundancy” isn’t perceptual (intelligibility) but temporal (perhaps we could say the limit is that of labor power, which is a temporal phenomenon?). Whereas AT&T used compression to remedy the fact that it sent customers more signal than they could hear, Facebook uses its News Feed algorithm to remedy the fact that it sends users more signal than they can process efficiently in a given amount of time. Also writing for Facebook for Business, Lars Backstrom explains: “in our tests, when we stop ranking and instead show posts in chronological order, the number of stories people read and the likes and comments they make decrease.” The News Feed algorithm compresses our friends’ transmissions to include only what we can process profitably.
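The ranking-versus-chronology contrast Backstrom describes can be sketched in a few lines. This is a purely hypothetical toy of mine–all names, fields, and numbers are invented, and Facebook’s actual ranking system is vastly more complex and proprietary–but it shows the shape of the substitution: recency stops being the ordering principle and becomes a mere discount on predicted engagement.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float
    predicted_engagement: float  # hypothetical modeled chance of a like/comment

def chronological(posts):
    """Newest first: the pre-algorithmic feed."""
    return sorted(posts, key=lambda p: p.age_hours)

def ranked(posts, decay=0.9):
    """Predicted engagement first; recency merely discounts the score."""
    return sorted(posts,
                  key=lambda p: p.predicted_engagement * decay ** p.age_hours,
                  reverse=True)

feed = [
    Post("acquaintance", age_hours=1.0, predicted_engagement=0.02),
    Post("close friend", age_hours=6.0, predicted_engagement=0.60),
]
```

Here `chronological(feed)` surfaces the acquaintance’s newer post, while `ranked(feed)` compresses it away in favor of the older post predicted to elicit a like or comment–“redundant data” defined by interactivity, not recency.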
Like AT&T’s compression techniques, the News Feed algorithm masks the parts of the communicative flow that aren’t necessary for successful transmission–”success” here is not intelligibility, but “engagement.” Or rather, Facebook’s algorithm compresses News Feed signals not to make them more intelligible to us, but to make us more intelligible to them.
Facebook can only “hear” its users when they click and engage. As Backstrom explains, Facebook is constantly “listening to feedback. When a user likes something, that tells News Feed that they want to see more of it; when they hide something, that tells News Feed to display less of that content in the future.” Compression helps Facebook hear us by amplifying the signal (the data stream) we send back to them. The more data we generate, the more efficiently the algorithm can filter our feeds, and in turn we generate more data, to help the algorithm filter our feeds…and so on. It’s no coincidence that Facebook uses “listening” to describe their process of collecting the feedback necessary to determine what signal is redundant. In the same way that listening tests helped twentieth-century audio researchers weed out the noise and home in on the signal, “listening” helps Facebook’s data scientists compress our News Feeds.
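The feedback loop Backstrom describes can be sketched as a simple weight update–again a hypothetical toy of mine, not Facebook’s actual model: a like nudges a content type’s weight up, a hide nudges it down, and the resulting weights drive the next round of filtering, which shapes what we can engage with next.

```python
def update_weight(weight, action, step=0.1):
    """Nudge a content-type weight toward what the user engages with."""
    if action == "like":
        return min(1.0, weight + step)
    if action == "hide":
        return max(0.0, weight - step)
    return weight  # no click, no signal: Facebook can't "hear" silence

weights = {"baby photos": 0.5, "game invites": 0.5}
for action in ("like", "like", "like"):
    weights["baby photos"] = update_weight(weights["baby photos"], action)
for action in ("hide", "hide"):
    weights["game invites"] = update_weight(weights["game invites"], action)
```

Note that the loop only updates on explicit interaction: whatever we merely read without clicking is inaudible to the system, which is precisely why the algorithm optimizes for the kinds of posts that make us click.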
AT&T made the wires more efficient; Facebook makes us more efficient. In both cases, redundant data is eliminated in order to “economiz[e] signals” (Sterne) and “calibrate [their] flow for maximum profitability” (Coté 140). Feed compression helps us perform our jobs as efficiently as possible, and thus allows Facebook to maximize its exploitation of our labor. Compression might not be the foundation of Facebook’s business model (as it was with AT&T), but it is nevertheless a necessary component of it–its business model can’t really work well (competitively, profitably, efficiently) without it.