A select few Facebook users are seeing a survey pop up immediately below news posts. It's the company's first publicly visible step toward addressing a growing - and global - fake news scandal.
Under selected posts on Facebook that link to news stories, Facebook now includes a "survey" of sorts containing just a single question:
"To what extent do you think that this link's title uses misleading language?" was the question in one example, while another read, "To what extent do you think this link's title withholds key details of the story?"
Users can then answer in one of five ways:
Not at all - slightly - somewhat - very much - completely.
Facebook has not spoken publicly about the new surveys, which came to light after Facebook users began posting screenshots of them online.
Only a handful of Facebook users have reported seeing the survey, suggesting Facebook is conducting a limited test at the moment.
It's unclear why the company chose articles from The Philadelphia Inquirer and a UK-based comedy outlet called Chortle - both legitimate media outlets - for its survey.
A complex problem
Facebook's survey is the first publicly visible step the company has taken since it came under pressure to combat fake news after the US presidential election on November 8.
In a Facebook post just over a week later, CEO Mark Zuckerberg wrote, "The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible." He has rejected assertions that fake news on Facebook had an impact on the US presidential election.
On December 4, a man motivated by false news fired two shots inside a Washington, D.C. restaurant that had been named in an article as the center of a child sex ring. The "leak" had supposedly come from an email sent by the campaign chief of Hillary Clinton.
According to an independent report by BuzzFeed News, fake news articles outperformed real news articles on Facebook in the run-up to the election. Today, BuzzFeed published the results of an exclusive Ipsos poll showing that 75% of American adults who see fake news believe it is true.
As seriously as Google?
Google, which has also faced criticism after the election, has since promised to restrict advertisements from its AdSense platform on sites that "misrepresent, misstate or conceal information about the publisher, the publisher's content, or [its] primary purpose." In doing so the company will cut such publishers off from a potentially critical source of on-site revenue.
Google faced online embarrassment when, after the US election, the top search result for "final election numbers" led to a fake news site boasting an incorrect answer.
University students in the US have also developed a third-party extension to Google's Chrome browser that serves as "anti-fakery" software. Called FiB, it flags news posts as either "verified" or "not verified," with the algorithm querying reputable news sites to see whether they too have published stories on a given subject.
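FiB's actual implementation has not been published, but the corroboration idea it describes - checking whether reputable outlets have covered the same story - can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the function names, the word-overlap heuristic, and the hard-coded corpus standing in for real queries to news sites.

```python
def tokenize(headline):
    """Lowercase a headline and keep only its words, stripped of punctuation."""
    return {w.strip(".,!?\"'").lower() for w in headline.split() if w.strip(".,!?\"'")}

def corroborated(headline, outlet_headlines, min_overlap=0.5):
    """Label a headline 'verified' if any reputable outlet carries a story
    whose headline shares at least min_overlap of this headline's words.
    This word-overlap heuristic is a stand-in for whatever matching
    FiB actually performs against live news sites."""
    words = tokenize(headline)
    if not words:
        return "not verified"
    for stories in outlet_headlines.values():
        for story in stories:
            if len(words & tokenize(story)) / len(words) >= min_overlap:
                return "verified"
    return "not verified"

# Hypothetical corpus standing in for queries to reputable outlets.
corpus = {
    "outlet_a": ["Final election results announced today"],
    "outlet_b": ["Prime minister resigns after referendum defeat"],
}

print(corroborated("Election results announced", corpus))      # a corroborated story
print(corroborated("Pope endorses surprise candidate", corpus))  # an uncorroborated one
```

A real tool would of course query live search APIs rather than a fixed dictionary, and would need far more robust matching than raw word overlap, which is easily fooled by paraphrase.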
The tool could be especially useful to US students in middle school, high school and college, who in a recent study by Stanford researchers showed a "stunning and dismaying consistency" in their inability to evaluate the credibility of information on Twitter. The researchers called the results a "threat to democracy."
An international problem
The fake news problem extends well beyond the US. In Italy, a recent study showed that fake news stories containing the word "referendum" outperformed real news on Facebook, Twitter, Google Plus and LinkedIn in the run-up to the country's vote. The "no" vote on Sunday led to the resignation of the country's prime minister, Matteo Renzi.
German members of parliament, meanwhile, have already been briefed on the threat posed by false news ahead of the 2017 elections. Those elections will see German Chancellor Angela Merkel run for a fourth term.
Merkel spoke of her concerns that public opinion was being "manipulated" by fake news and social bots in late November.
"In order to reach people, to inspire people, we need to deal with this phenomenon and - where necessary - regulate it," she said during a speech in the German Bundestag.
Though Facebook's first step is small, the company has said it plans to strengthen detection of fake news, make it easier to report, introduce third-party verification and warnings, and disrupt "fake news economics."
A tweet from December 3 suggests Facebook may be testing more than one survey in an attempt to assess the extent to which fake news is impacting its own reputation.