Steven Levy breaks down the hidden innovation behind Facebook’s facelift: an algorithm that purports to figure you out from what you do, moment to moment, and uses that to craft what it shows you for maximal engagement.
One presumes that it’s crafted to maximize ad revenue both directly (by showing you ads you will click on) and indirectly (by nudging you to act in ways that give them MORE data to go on). Just like most parasites*, Facebook manipulates its host (you) into creating more favorable conditions for its own existence.
We’re seeing this all over. Facebook ads targeted at who they think you are (though this can be subverted). Google is pimping different things and showing you different results based on what you searched and clicked on yesterday. Netflix populates your suggestions list using algorithms so complex that even Netflix doesn’t fully understand them, and famously held a contest to improve them.
Even the New York Times tries to suggest articles for you based on what articles you’ve clicked on, and which ones you’ve scrolled to the bottom of. Recently they teamed up with a company called VisualDNA to gather market segment information directly; a survey about your intimate personal preferences, masquerading as a personality quiz.
(Hilarious thought: hook the old Purity Test up to their data scrapers. I’m sure there are marketers that’d kill for the resulting data.)
The algorithms used generally work along one of two axes of similarity, a bit like Lyndon Hardy’s principles of Thaumaturgy:
- If you’ve liked Bob’s posts in the past, you’ll like Bob’s posts in the future. (“once together, always together”)
- If you and Bob both liked X, and he likes Y, chances are you like Y. (“like begets like”)
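Both heuristics can be sketched as a toy collaborative filter. This is a minimal illustration with made-up users and items, not any platform’s actual algorithm: find your “taste neighbors” (people who liked something you liked, per “once together, always together”), then recommend what they liked that you haven’t seen yet (“like begets like”).

```python
from collections import defaultdict

# Hypothetical interaction log: (user, item) "like" events.
likes = [
    ("you", "X"), ("bob", "X"),   # you and Bob both liked X...
    ("bob", "Y"),                 # ...and Bob also liked Y.
    ("you", "Z"), ("ann", "Z"),
]

def recommend(target, likes):
    """Rank items the target hasn't liked yet, scored by how many
    of the target's taste neighbors liked them."""
    liked_by = defaultdict(set)    # item -> users who liked it
    user_likes = defaultdict(set)  # user -> items they liked
    for user, item in likes:
        liked_by[item].add(user)
        user_likes[user].add(item)

    # "Once together, always together": anyone who shares a liked
    # item with the target counts as a neighbor.
    neighbors = set()
    for item in user_likes[target]:
        neighbors |= liked_by[item]
    neighbors.discard(target)

    # "Like begets like": score unseen items by neighbor overlap.
    scores = {}
    for user in neighbors:
        for item in user_likes[user] - user_likes[target]:
            scores[item] = scores.get(item, 0) + 1
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("you", likes))  # Y is recommended: Bob liked X (so did you) and Y.
```

Real systems use matrix factorization and far more signals, but the logic is recognizably this: the past is projected forward, wholesale.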
In short: we know what you like. And to maximize our chances of a sale, we’re going to show you things just like it. I have heard this tune before. And it’s repugnant to me.
It’s true that we are, in general, predictable creatures. Every so often someone discovers that you can predict people’s future locations from their past locations with very high accuracy. Duh. This is because most of our life is routine.
But the interesting bits are often the non-routine moments. These are the moments that make us, well, us, and not mindless zombies.
There are a number of problems with this ‘bubble’ effect, with surrounding ourselves with only that which is blandly agreeable to us. The first is the danger of the self-reinforcing echo chamber. DuckDuckGo (which I recommend as a search engine) has a nice little explanation of this. Things like VisualDNA feed into this, as do tailored Facebook ads. Even in ads you only see yourself, or rather, the lame version of yourself that was scraped and pigeonholed by an algorithm.
The second is the death of novelty. This is more concerning. The fortuitous encounter, the accidental bounds-crossing that sparks a new connection: these things are important. It’s important for us to push our boundaries a little, to get different perspectives, or just to experience something that we might find less than perfectly comfortable. An algorithm that crafts our future experience based solely on the past is blind to this important aspect of existence.
You can argue that The Algorithm can be written to introduce a bit of randomness, a bit of novelty, a bit of crossing over every so often. Throw a curve ball in there. And perhaps it can. Perhaps we should relax and sink into our perfectly crafted stew of addictive, entertaining pablum. Enjoy its carefully spaced intermittent rewards. Lie back and suck down an even flow of ‘cafeteria food’ content—content of a quality that is just good enough that you don’t up and leave. Feel daring at encountering just a touch of spice, of things you don’t usually see. All served up by those who only have our best interests at heart, if by ‘best interests’ you mean ‘hard-earned money’ and by ‘at heart’ you mean ‘in their pocket’.
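For what it’s worth, the “curve ball” is trivially easy to build. The standard name for this is epsilon-greedy exploration; here is a minimal sketch with invented function names, purely to show how little engineering the objection amounts to:

```python
import random

def serve(ranked, catalog, epsilon=0.1, rng=random):
    """With probability epsilon, serve a random item from the full
    catalog instead of the algorithm's top-ranked pick -- the
    'curve ball'. A made-up illustration, not any platform's
    actual serving logic."""
    if catalog and rng.random() < epsilon:
        return rng.choice(sorted(catalog))  # sorted only for determinism
    return ranked[0]
```

Whether a platform *wants* to spend impressions on exploration, rather than on whatever maximizes the next click, is the actual question.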
I think you can guess how I feel about that, though.
(* e.g., for the strong of stomach, look up the pinworm and its method of transmission)