Dogs rule, cats drool – or their owners do, anyway. At least that’s the conclusion we’re drawing from a new data set released by Facebook Research.
Full disclosure: Your author is an unabashed dog person who was bitten by the last cat she tried to pet. That said, the results are pretty unambiguous.
After analyzing the aggregate, anonymized data of about 160,000 U.S. users who’ve posted photos of dogs and/or cats, Facebook found that dog-posters tend to be more extroverted, more upbeat and luckier in love than their feline-photographing friends. Meanwhile, cat people tend to be single, to express a “wider range of emotions” (including, chiefly, exhaustion and annoyance), and to harbor an unusually strong interest in fantasy, anime and science fiction.
This is fun – and funny – in the way of all big data analyses that claim to give us some profound insight into ourselves. (Cat lovers, I wouldn’t take it too seriously: There are drawbacks to this sort of analysis.) But even more interesting, for our purposes, is how Facebook arrived at these conclusions. The social network did not hire 100 people on Mechanical Turk to comb through photos, tagging dogs and cats, as similar analyses might have done in the past. Instead, the company relied on its in-house image-recognition technology – a convolutional neural network, trained on millions of images – to automatically pick out photos of pets.
This sort of technology is very much in vogue lately: Both Snapchat and Pinterest have recently deployed it to power more advanced search functions. Earlier this year, Facebook also championed the technology as part of its accessibility program. As of April, the system recognized, with a very high degree of confidence, about 100 objects – including babies, eyeglasses, beards, smiles, pizzas, ice creams, buildings, trees, cars and mountains.
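At a high level, tagging only what the system recognizes "with a very high degree of confidence" amounts to a simple thresholding step on a classifier's output. Here is a minimal, hypothetical sketch of that step; the labels, scores and cutoff are invented for illustration, not Facebook's actual pipeline:

```python
# Hypothetical post-processing for an image classifier: keep only the
# labels the model assigns with high confidence. The 0.8 cutoff and the
# example scores are made up for illustration.
CONFIDENCE_THRESHOLD = 0.8

def confident_tags(class_scores, threshold=CONFIDENCE_THRESHOLD):
    """Return the labels whose score clears the threshold, alphabetized."""
    return sorted(label for label, score in class_scores.items()
                  if score >= threshold)

# Scores a classifier might emit for one photo:
scores = {"dog": 0.97, "tree": 0.91, "cat": 0.12, "pizza": 0.03}
print(confident_tags(scores))  # ['dog', 'tree']
```

The trade-off in a cutoff like this is the usual one: raise the threshold and the system tags fewer photos but makes fewer embarrassing mistakes.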
That is an enormous win for users with disabilities, of course, who can have photos automatically described for them – and it may prove helpful in advancing search and moderation. But image recognition, and “fun” forays like this one, also serves another purpose. There is a ton of personal data locked in the 2 billion new photos that people upload to Facebook, Instagram and WhatsApp every day, and Facebook wants to get its hands on it.
Hypothetically, Facebook could use image recognition to tell which brands of clothes you wear or what kind of food you eat, even if you’ve never posted a text update on the subject. It could also tell whether you’ve recently bought a cat or dog – the better to blitz you with ads for pet food and BarkBox subscriptions.
Speaking to Digiday in April, Luis Sanz, the chief operating officer of Olapic, an image-centric marketing firm, actually used dog ownership as a sort of best-case example for thirsty advertisers.
“If many of your pictures contain dogs,” he said, “I can probably include you in an audience group that likes dogs without you having to take more actions.”
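The kind of photo-based audience building Sanz describes could be sketched roughly like this; the tag data, the threshold and the group names are all invented for illustration:

```python
from collections import Counter

def audience_groups(photo_tags, min_photos=3):
    """Assign interest groups from auto-detected photo tags.

    photo_tags: one set of detected labels per uploaded photo
    (hypothetical data). A user joins the "likes <tag>" group once
    enough of their photos contain that tag -- no text posts needed.
    """
    counts = Counter(tag for tags in photo_tags for tag in tags)
    return {f"likes {tag}" for tag, n in counts.items() if n >= min_photos}

# A user whose uploads are mostly dog photos:
uploads = [{"dog", "tree"}, {"dog"}, {"dog", "beach"}, {"cat"}]
print(audience_groups(uploads))  # {'likes dog'}
```

The point of the sketch is Sanz's: the user never has to say a word about dogs for the inference to fire.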
That “audience group” will, per Facebook’s latest release, also probably be targeted with ads for cheesy rom-coms and aging reality TV series, among untold other correlated products and services that Facebook is guarding internally. Inevitable? Creepy? All of the above?
. . . All of the above, probably.