This commentary was originally published in 2017 on Forbes.com. As of 2021, Dark Posts were still available through Facebook.
Advertising is a way for consumers to get something for a subsidized cost. This has always created problematic relationships: People got glossy magazines for a low cost because they were willing to look at ads for liquor companies and overpriced financial services. People got their local newspapers for next to nothing in exchange for allowing girdle ads to infiltrate their brains.
Social media, especially Facebook, has taken down the guard rails that used to offer us at least some protection. Advertising used to be mostly transparent. Advertisers, especially those reaching for a broad audience, could face a backlash if they stepped too far over the line. Consumers knew who they were dealing with, so they could choose not to be sold to by finance companies and buy index funds instead. Your grandmother figured out girdles were uncomfortable. At a certain point, you realized models were mostly made-up.
In addition to transparency, the other guard rail in the system was people. There were executives, ad sales people, designers and publishers, buyers and sellers, who could step in. They didn’t always, but sometimes they did. At one of my first jobs as an editor at The Central Penn Business Journal in Harrisburg, Pa., I remember an ad saleswoman coming into my office and saying, “I’m really not comfortable with this ad. This company is lying.”
In ways that we are only beginning to recognize, social media companies, perhaps especially Facebook, have removed the transparency and the breadth of our conversations, which lent a little accountability to the world of advertising. And more importantly, they have cut people out of the system almost entirely, and turned the monitoring over to algorithms. Maybe someday we’ll create morally capable algorithms, but that time is not yet.
Facebook, like many tech companies, has tried to sell itself as a do-good company – and it has connected billions of people across the world. But judging by the results, the need to scale turned into an unfettered desire for profits. Facebook earns more than $1 million for each of its 17,000 employees. The hyper-efficient business model has made Mark Zuckerberg and many other employees incredibly wealthy. CNBC reported Friday that, over the next 18 months, he will sell stock worth between $6 billion and $12.8 billion at the current share price of $170.
Dark Posts Are Not Only Used For Politics
For instance, during the election, bogus Russian accounts bought thousands of Facebook ads – so-called Dark Posts – that appeared to amplify, as Facebook said, “divisive social and political messages across the ideological spectrum—touching on topics from LGBT matters to race issues to immigration to gun rights.”
Dark Posts, which Facebook calls “unpublished posts,” are those that appear outside an account’s timeline. They’re essentially untied from a company’s main branding effort and are often used by companies to target specific groups or experiment with new messages. There are a lot of them: 18% of the posts brands use are Dark Posts, according to Socialbakers, a VC-backed social monitoring firm. Facebook defines unpublished posts as anything that originates first as an ad, as opposed to a post that would appear on an advertiser’s own page; one of the common reasons for “unpublished” posts, the company says, is so advertisers don’t clutter their own pages.
It has been easy enough for someone with bad intentions to set up a fake account, apparently, and have at it. In the case of the Russian scandal there were about 3,000 posts placed by 470 accounts and pages spending about $100,000. (In 2021, a Twitter account was used in much the same way by a Chinese government official targeting Australia. — Ed.)
In an influential op-ed in The New York Times, University of Virginia professor Siva Vaidhyanathan pointed out the dangers of Dark Posts’ use by political entities, because they allow campaigns to circumvent laws that try to ensure transparency in the political system. (There’s also a loophole in campaign finance laws that means social media is exempt from some of the rules ensuring transparency.)
“There is nothing mysterious or untoward about the system itself, as long as it’s being used for commerce instead of politics,” he wrote.
That distinction seems unlikely to hold. If Dark Posts are being used by bad actors in the political arena, they are probably being used by bad actors in commerce as well, an arena that also has laws and standards we live by. And the general lack of oversight and transparency that exists on Facebook likely exists across the other platforms it owns, including Instagram and WhatsApp.
There are only about 120 officially registered political parties on Facebook’s platform worldwide, according to Socialbakers, which is a pretty good indication that most political parties using Facebook do not register as such.
Zuckerberg envisions Facebook – not as an advertising or media company – but as a sort of public square. The company is also setting its own limits, noted Betsy Sigman, a professor at Georgetown’s McDonough School of Business: “If it’s illegal, they will step in.”
“Most ads are bought programmatically through our apps and website without the advertiser ever speaking to anyone at Facebook,” Mark Zuckerberg wrote. “That’s what happened here. But even without our employees involved in the sales, we can do better.”
But Facebook isn’t really a town square. It’s a company, and its profits come from advertising. That means Facebook’s main incentive is to help advertisers do what they’ll pay for, and to write its algorithms to help them. In an environment where technology companies operate across borders and are only lightly regulated, the only counterweight to those incentives is human conscience. Facebook has 17,000 people to serve 2 billion users. The New York Times still, despite job cuts, employs more than 3,500 people to serve its 2.3 million digital subscribers.
All it takes is a little bit of life experience to be aware of the human capacity for deception, self-deception and cruelty, and to be troubled by the fact that in ad review and content curation, it is mostly algorithms doing the work. Algorithms are not up to the task of countering evil.
“We are always iterating” is a convenient rationalization and cold comfort to Americans whose civil society has been damaged.
We spend a lot of time looking at the lessons of success from companies like Facebook. For business owners who are both Facebook’s consumers and its customers, I think Facebook’s current predicament suggests a handful of ideas:
- Check that whatever advertising your company is doing, especially Dark Posts, is being done to standards you agree with. We are getting a picture of how ephemeral and targeted some posts may be.
- If you are building a tech or a tech-enabled company, there are some lessons of failure to be drawn from this case. It’s still important to have human review of what the algorithms are doing.
- It’s a fair question to ask elected officials now: Why should we allow Facebook executives to take a large role in monitoring elections or other aspects of our civil society? Anyone who used Facebook during the elections was well aware of the garbage that was being served up; it’s just that most of us had no way to track down who was spawning it. Facebook did, but executives there chose not to look.
- And it’s fair to ask Facebook: At what point will there be a real and transparent attempt to determine how widespread the platform’s vulnerabilities are? Executive Holly Lynch asked just this question a couple of months ago.
What’s A Company’s Business Anyway?
When I was in the 9th grade, attending a small public high school in Perry, Ga., Frito-Lay opened a chip plant outside of town. It employed 300 people, which was a lot in that farming community. I played oboe in the band, which was invited to come and play at the ribbon-cutting ceremony. It was a miserably hot early fall day. Our uniforms were polyester, and the speeches by the mayor and the plant manager were excruciatingly boring.
What we all understood from that was that jobs were something to celebrate and sacrifice for. As I learned more as a business reporter, I knew “business” had two purposes: Profits, first, and jobs, second.
Silicon Valley shifted that compact, so that the formula became profits, first, and impact, second. Its denizens wanted to change the world. In the interests of rapid scale, efficiency became paramount. In the interests of efficiency, the people had to go — out of the all-important business model. Maybe that’s OK, or necessary, in some companies. But in Facebook’s line of business, it’s led to real disasters.
The ad business has always been about making a fraught deal with our own self-destructive demons. But at least we had some power in those complicated negotiations. We could see what was happening, and we were in what we all knew was a game, together.
“No surprise that as our attention has shifted to the Internet that we have sold our souls….we think for free…to Facebook and Google,” said Bill Meehan, a McKinsey director emeritus and lecturer at Stanford Graduate School of Business.
We turned over the negotiations to Facebook’s algorithms. On our behalf, Facebook has been perfectly willing to deal with any devil at all.
This story and others on Times of E are made possible by a sponsorship from the Ewing Marion Kauffman Foundation. The Ewing Marion Kauffman Foundation is a private, nonpartisan foundation that provides access to opportunities that help people achieve financial stability, upward mobility, and economic prosperity – regardless of race, gender, or geography. The Kansas City, Mo.-based foundation uses its grantmaking, research, programs, and initiatives to support the start and growth of new businesses, a more prepared workforce, and stronger communities. For more information, visit www.kauffman.org and connect with www.twitter.com/kauffmanfdn and www.facebook.com/kauffmanfdn.