Mark Zuckerberg Has a Lot of Homework to Do

In two days on Capitol Hill, the Facebook chief promised numerous lawmakers that he would get back to them with answers to their questions. We counted up that workload.

Facebook to End News Feed Experiment in 6 Countries That Magnified Fake News

The social network is ending Explore, an experiment in countries like Bolivia and Cambodia where it had separated news and other publishers from its main site.

The Shift: On Russia, Facebook Sends a Message It Wishes It Hadn’t

From Rob Goldman’s tweets: “Most of the coverage of Russian meddling involves their attempt to effect the outcome of the 2016 US election. I have seen all of the Russian ads and I can say very definitively that swaying the election was *NOT* the main goal.”

Fact-Checking a Facebook Executive’s Comments on Russian Interference

President Trump cited tweets by Rob Goldman as proof that Russia’s disinformation campaign was about something other than giving him an election victory. Credit: Tom Brenner/The New York Times

Rob Goldman, vice president for ads at Facebook, posted an eight-part thread on Twitter late Friday about his company’s role in Russian disinformation — and quickly caused a firestorm.

In his messages, Mr. Goldman discussed the indictment of 13 Russians and three companies accused of carrying out a scheme to subvert the 2016 election. Facebook was frequently mentioned in the indictment as the main tech tool that the Russians had used to tilt the election in favor of Donald J. Trump.

Mr. Goldman defended Facebook in his tweets, saying that the Russian-bought ads on the social network were not primarily aimed at swaying the vote result. His posts went viral on Saturday when President Trump cited them as proof that Russia’s disinformation campaign was about something other than giving him an election victory.

We fact-checked Mr. Goldman’s eight tweets. Here’s what we found.

“We shared Russian ads with Congress, Mueller and the American people to help the public understand how the Russians abused our system.” Tweet #1

Partly true.

When the Russians’ use of Facebook to influence the 2016 election became public last year, the company said it was sharing the Russian-bought ads with Congress and Robert S. Mueller III, the special counsel leading the investigation.

But Facebook did not directly share the ads with the American people. Instead, the House Intelligence Committee released examples of the ads ahead of congressional hearings last November.

“I have seen all of the Russian ads and I can say very definitively that swaying the election was *NOT* the main goal.” Tweet #2

Not according to the indictment.

The grand jury indictment secured by Mr. Mueller asserts that the goal of Russian operatives was to influence the 2016 election, particularly by criticizing Hillary Clinton and supporting Mr. Trump and Bernie Sanders, Mrs. Clinton’s chief rival for the Democratic nomination.

The Russians “engaged in operations primarily intended to communicate derogatory information about Hillary Clinton, to denigrate other candidates such as Ted Cruz and Marco Rubio, and to support Bernie Sanders and then-candidate Donald Trump,” the indictment said.

Mr. Goldman later wrote in another tweet that “the Russian campaign was certainly in favor of Trump.”

“The majority of the Russian ad spend happened AFTER the election.” Tweet #3

True, but here is some context.

According to figures published by Facebook last October, 44 percent of the Russian-bought ads were displayed before the 2016 election, while 56 percent were shown afterward. Mr. Goldman asserted that those figures were not published by the “mainstream media.” In fact, many mainstream news outlets did print those numbers, including CNN, Reuters and The Wall Street Journal.

“The main goal of the Russian propaganda and misinformation effort is to divide America by using our institutions, like free speech and social media, against us.” Tweet #4

Not exactly.

The indictment does show that Russian operatives used social media — particularly Facebook — to try to sow division among Americans. But to reiterate, the indictment said that the Russians’ goal was to sway the 2016 election toward a particular outcome. That aim was pursued not just through ads, which Mr. Goldman focuses on, but through Facebook pages, groups and events.

“The single best demonstration of Russia’s true motives is the Houston anti-islamic protest. Americans were literally puppeted into the streets by trolls who organized both the sides of protest.” Tweet #5

This needs context.

The protests in Houston in May 2016 were among many rallies organized by Russian operatives through Facebook. While the Houston protest was anti-Islamic, as Mr. Goldman said, he failed to note that the goal in promoting the demonstration was to link Mrs. Clinton’s campaign with a pro-Islamic message.

According to the indictment secured by Mr. Mueller, there were many other examples of Russian operatives using Facebook and Instagram to organize pro-Trump rallies. At one protest, the Russian operatives paid for a cage to be built, in which an actress dressed as Mrs. Clinton posed in a prison uniform.

“The Russian campaign is ongoing. Just last week saw news that Russian spies attempted to sell a fake video of Trump with a hooker to the NSA.” Tweet #6

True.

American intelligence officials have said that Russia has continued to target the American public and that it is already meddling in the 2018 midterm elections. The New York Times also reported this month on an attempt by a shadowy Russian figure to sell stolen American cyberweapons, as well as compromising material on President Trump, to the United States.

“There are easy ways to fight this. Disinformation is ineffective against a well educated citizenry. Finland, Sweden and Holland have all taught digital literacy and critical thinking about misinformation to great effect.” Tweet #7

Not exactly.

While Finland, Sweden and the Netherlands have all made efforts to teach digital literacy, those countries are still grappling with how to handle misinformation. A recent survey in Finland found that 67 percent of respondents “think fake news affects Finns’ perceptions on issues ‘a lot’ or to an ‘extreme’ degree.” Officials in Sweden and the Netherlands have also recently warned that fake news poses a threat to their governments.

“We are also taking aggressive steps to prevent this sort of meddling in the future by requiring verification of political advertisers and by making all ads on the platform visible to anyone who cares to examine them.” Tweet #8

True.

After initially dismissing concerns that it influenced the 2016 election, Facebook has announced a series of moves to prevent its future misuse. One of those steps includes verifying political advertisers through a system that combines automated and human fact checkers. The company has also said it plans to use postcards sent by regular mail to verify the identities of American political advertisers. Whether these new measures will be effective is unclear.
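
The postcard step, at least, is easy to picture in code. Below is a minimal sketch of how such a mail-based check might work, assuming a one-time code, a 30-day expiry window and hypothetical function names; Facebook has not described its actual system.

```python
# Hypothetical sketch of a postcard-based advertiser verification flow.
# Facebook has not published its implementation; the code format, the
# 30-day expiry window and this workflow are illustrative assumptions.
import secrets
from datetime import datetime, timedelta

# advertiser_id -> (one-time code, expiry timestamp)
PENDING: dict[str, tuple[str, datetime]] = {}

def start_verification(advertiser_id: str, us_mailing_address: str) -> None:
    """Generate a one-time code and (in a real system) mail it on a postcard."""
    code = secrets.token_hex(4).upper()              # e.g. '9F3A1C7B'
    expiry = datetime.utcnow() + timedelta(days=30)  # assumed validity window
    PENDING[advertiser_id] = (code, expiry)
    print(f"Mailing postcard with code {code} to {us_mailing_address}")

def confirm_verification(advertiser_id: str, submitted_code: str) -> bool:
    """The advertiser types in the code printed on the postcard they received."""
    code, expiry = PENDING.get(advertiser_id, ("", datetime.min))
    return bool(code) and submitted_code == code and datetime.utcnow() < expiry
```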

Correction: 

Because of an editing error, an earlier version of this article misstated the month when protests organized by Russian operatives were held in Houston. It was May 2016, not November 2017.

Sheera Frenkel covers cybersecurity from San Francisco. Previously, she spent over a decade in the Middle East as a foreign correspondent, reporting for BuzzFeed, NPR, The Times of London and McClatchy Newspapers. @sheeraf

In Some Countries, Facebook’s Fiddling Has Magnified Fake News

The changes are being made as the company finds itself embroiled in a larger debate over its role in spreading fake news and misinformation aimed at influencing elections in the United States and other nations.

Facebook said these News Feed modifications were not identical to those introduced last fall in six countries through its Explore program, but both alterations favor posts from friends and family over professional news sites. And what happened in those countries illustrates the unintended consequences of such a change in an online service that now has a global reach of more than two billion people every month.

In Slovakia, where right-wing nationalists took nearly 10 percent of Parliament in 2016, publishers said the changes had actually helped promote fake news. With official news organizations forced to spend money to place themselves in the News Feed, it is now up to users to share information.

“People usually don’t share boring news with boring facts,” said Filip Struharik, the social media editor of Denník N, a Slovakian subscription news site that saw a 30 percent drop in Facebook engagement after the changes. Mr. Struharik, who has been cataloging the effects of Facebook Explore through a monthly tally, has noted a steady rise in engagement on sites that publish fake or sensationalist news.

A bogus news story that spread in December illustrates the problem, Mr. Struharik said. The story claimed that a Muslim man had thanked a good Samaritan for returning his lost wallet, and had warned the Samaritan of a terrorist attack that was planned at a Christmas market.

Selling newspapers in La Paz, Bolivia’s capital. The News Feed changes that Facebook has been testing in Bolivia and other countries play down nongovernmental news sources, limiting exposure to independent news reporting. Credit: Gonzalo Pardo for The New York Times

The fabricated story circulated so widely that the local police issued a statement saying it wasn’t true. But when the police went to issue the warning on Facebook, they found that the message — unlike the fake news story they meant to combat — could no longer appear on News Feed because it came from an official account.

Facebook explained its goals for the Explore program in Slovakia, Sri Lanka, Cambodia, Bolivia, Guatemala and Serbia in a blog post in October. “The goal of this test is to understand if people prefer to have separate places for personal and public content,” wrote Adam Mosseri, head of Facebook’s News Feed. “There is no current plan to roll this out beyond these test countries.”

The company did not respond to a list of questions about the Explore program, but Mr. Mosseri said in a statement on Friday that the company took its role as a “global platform for information” seriously.

“We have a responsibility to the people who read, watch and share news on Facebook, and every test is done with that responsibility in mind,” he said.

The impact of the changes to the News Feed was also felt in Cambodia. Months into the experiment (Facebook hasn’t said when it will end), Cambodians still don’t know where to find trusted, established news on Facebook, said Stuart White, editor of The Phnom Penh Post, an English-language newspaper.

Nongovernmental organizations working on issues like education and health care also complained that the changes broke down lines of communication to Cambodians in need.

Facebook has become particularly important in Cambodia. The country’s leader, Hun Sen, has cracked down on political opponents, activists and media, effectively transforming the struggling democracy into a one-party state. Journalists have been arrested, newspapers have been shut down, and Facebook has emerged as an important, more independent channel for information.

That is, if you can find that information. Mr. White recalled a conversation this month with a friend who casually observed the lack of political conversation on Facebook.

“He said he thought the government had banned politics on Facebook,” Mr. White said. “He had no idea that Facebook had created Explore or was placing news there. He’s a young, urbanite, English-speaking Cambodian. If he didn’t know about it, what do you think the effects are on other parts of the country?”

In Bolivia, the alterations to the News Feed also occurred in a country where the government and the press have found themselves at odds, with news sites like Página Siete frequently criticizing President Evo Morales, a left-wing populist who has accumulated enormous power since taking office in 2006.

“We became the only media to take on the government,” said Rodolfo Huallpa, the web editor of Página Siete. Half of the site’s traffic came from social media, with the lion’s share of that from Facebook, he said. Since Explore was introduced, overall web traffic to the site has dropped 20 percent.

The newsroom of The Daily News in Colombo, Sri Lanka. Sri Lanka is one of the countries where Facebook is testing News Feed changes amid a larger debate over its role in spreading fake news and misinformation. Credit: Kuni Takahashi for The New York Times

The loss of visitors from Facebook was readily apparent in October, and Mr. Huallpa could communicate with Facebook only through a customer service form letter. He received an automatic reply in return.

After complaints from other outlets, Facebook eventually released a statement on a blog in Spanish explaining the Explore feature and the testing being done in Bolivia and other countries. But Facebook offered no means to contact it, Mr. Huallpa said.

“We can’t talk to Zuckerberg, we can’t even talk to a customer service representative,” said Isabel Mercado, the editor of Página Siete, referring to Facebook’s chief executive, Mark Zuckerberg.

The Explore experiment has reduced traffic by 30 to 60 percent at the website of Los Tiempos, the main newspaper of Cochabamba, the country’s fourth-largest city, said Fabiola Chambi, the publication’s web editor.

Ms. Chambi, however, fears the main consequence of the Explore function will be deepening polarization in a country already divided by ideology. “It’s good to see things from your friends and your family, but there needs to be diversity of information,” she said. “The miscellany is good.”

Bolivia has also seen an increase in fake news as the established news sites are tucked behind the Explore function.

During nationwide judicial elections in December, one post widely shared on Facebook claimed to be from an election official saying votes would be valid only if an X was marked next to the candidate’s name. Another post that day said government officials had put pens with erasable ink in the voting booths.

Vladimir Tirado, a social media expert in Bolivia, said the government might simply begin paying for posts to appear on users’ News Feeds, an option that he said most newsrooms could not afford.

“Whoever has more money will appear more,” Mr. Tirado said. “In this sense, the government will certainly win.”

Ms. Chambi of Los Tiempos said her newsroom hardly had enough money to pay its journalists to report stories, let alone to distribute them as paid posts on Facebook. The situation has left her uneasy about the role that the tech giant may play in her country.

“It’s a private company — they have the right to do as they please, of course,” she said. “But the first question we asked is ‘Why Bolivia?’ And we don’t even have the possibility of asking why. Why us?”

Facebook Overhauls News Feed to Focus on What Friends and Family Share

“When people are engaging with people they’re close to, it’s more meaningful, more fulfilling,” said David Ginsberg, director of research at Facebook. “It’s good for your well-being.”

Facebook has been under fire for months over what it shows people and whether its site has negatively influenced millions of its users. The company has been dogged by questions about how its algorithms may have prioritized misleading news and misinformation in News Feeds, influencing the 2016 American presidential election as well as political discourse in many countries. Last year, Facebook disclosed that Russian agents had used the social network to spread divisive and inflammatory posts and ads to polarize the American electorate.

Those issues have landed Facebook in front of lawmakers, who grilled the company last year about its influence. Next Wednesday, Facebook is set to appear at another hearing on Capitol Hill, along with Twitter and YouTube, about the online spread of extremist propaganda.

The repercussions from Facebook’s new News Feed changes will almost certainly be far-reaching. Publishers, nonprofits, small businesses and many other groups rely on the social network to reach people, so de-emphasizing their posts will most likely hurt them. Adam Mosseri, vice president of product management at Facebook, who is responsible for running the News Feed, acknowledged that “there will be anxiety” from partners and publishers who often complain about the constant changes in what will be shown across the network.

Facebook said it would prioritize what users’ friends and family share and comment on in the News Feed while de-emphasizing content from publishers and brands.

The change may also work against Facebook’s immediate business interests. The company has long pushed users to spend more time on the social network. With different, less viral types of content surfacing more often, people could end up spending their time elsewhere. Mark Zuckerberg, Facebook’s chief executive, said that was in fact Facebook’s expectation, but that if people ended up feeling better about using the social network, the business would ultimately benefit.

Changes to Facebook’s News Feed are not new. The Silicon Valley company constantly experiments with what shows up in the News Feed, and in the past it has also said it would prioritize posts from users’ friends and family. But Thursday’s shift goes beyond previous changes by prioritizing posts that have generated substantive interactions. A long comment on a family member’s photo, for instance, might be highlighted in the News Feed above a video that has fewer comments or interactions between people.
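
Facebook has not published its ranking model, but the shift it describes can be illustrated with a toy scoring function. In the sketch below, the post fields, the weights and the friends-and-family boost are all assumptions chosen to reproduce the example above, not the company’s actual algorithm.

```python
# Toy sketch of ranking by "meaningful interactions". The fields,
# weights and friend boost are illustrative assumptions, not
# Facebook's actual (unpublished) News Feed model.
from dataclasses import dataclass

@dataclass
class Post:
    name: str
    author_is_friend: bool   # friend or family member vs. publisher/brand
    long_comments: int       # substantive comments between people
    short_reactions: int     # likes and one-word replies
    passive_views: int       # views with no interaction at all

def meaningful_interaction_score(post: Post) -> float:
    """Weight back-and-forth engagement; ignore passive consumption."""
    score = 10.0 * post.long_comments + 0.1 * post.short_reactions
    # Passive views deliberately contribute nothing to the score.
    return score * (2.0 if post.author_is_friend else 1.0)

feed = [
    Post("viral publisher video", False, long_comments=1,
         short_reactions=400, passive_views=50_000),
    Post("family photo", True, long_comments=5,
         short_reactions=12, passive_views=300),
]
ranked = sorted(feed, key=meaningful_interaction_score, reverse=True)
# The family photo (score 102.4) now outranks the viral video (50.0),
# mirroring the example in the paragraph above.
```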

Facebook has conducted research and worked with outside academics for months to examine the effects that its service has on people. The work was spurred by criticism from politicians, academics, the media and others that Facebook had not adequately considered its responsibility for what it shows its users.

After the 2016 election, for instance, Mr. Zuckerberg initially shrugged off qualms about Facebook’s effect on the outcome, even as outsiders pointed to the proliferation of fake news stories on the site that had attacked Hillary Clinton. Mr. Zuckerberg later said he had been too hasty and dismissive of the concerns. More recently, he began signaling that Facebook was rethinking what it shows people on the site.

Last week, he posted on Facebook about his goals for 2018, including “making sure that time spent on Facebook is time well spent” and adding that “this will be a serious year of self-improvement and I’m looking forward to learning from working to fix our issues together.”

On Thursday, he said many of the discussions about Facebook’s responsibilities had prompted the company “to get a better handle on some of the negative things that could happen in the system.”

“Just because a tool can be used for good and bad, that doesn’t make the tool bad — it just means you need to understand what the negative is so that you can mitigate it,” he said.

Facebook’s researchers and outside academics have particularly homed in on passively consumed content. In surveys of Facebook users, people said they felt the site had shifted too far away from content related to friends and family, especially amid a swell of posts from brands, publishers and media companies.

“This big wave of public content has really made us reflect: What are we really here to do?” Mr. Zuckerberg said. “If what we’re here to do is help people build relationships, then we need to adjust.”

Mr. Zuckerberg said he was now focusing his company around the new approach. Product managers are being asked to “facilitate the most meaningful interactions between people,” rather than the previous mandate of helping people find the most meaningful content, he said.

Mr. Zuckerberg added that his way of running Facebook had shifted since the birth of his two daughters, Maxima and August, in recent years. He said he had rethought the way he viewed his and Facebook’s legacy, even if the new approach would cost the company in the short term.

“It’s important to me that when Max and August grow up that they feel like what their father built was good for the world,” Mr. Zuckerberg said.

State of the Art: How the Internet Is Loosening Our Grip on the Truth

In a 2008 book, I argued that the internet would usher in a “post-fact” age. Eight years later, in the death throes of an election that features a candidate who once led the campaign to lie about President Obama’s birth, there is more reason to despair about truth in the online age.

Why? Because if you study the dynamics of how information moves online today, pretty much everything conspires against truth.

You’re Not Rational

The root of the problem with online news is something that initially sounds great: We have a lot more media to choose from.

In the last 20 years, the internet has overrun your morning paper and evening newscast with a smorgasbord of information sources, from well-funded online magazines to muckraking fact-checkers to the three guys in your country club whose Facebook group claims proof that Hillary Clinton and Donald J. Trump are really the same person.

A wider variety of news sources was supposed to be the bulwark of a rational age — “the marketplace of ideas,” the boosters called it.

But that’s not how any of this works. Psychologists and other social scientists have repeatedly shown that when confronted with diverse information choices, people rarely act like rational, civic-minded automatons. Instead, we are roiled by preconceptions and biases, and we usually do what feels easiest — we gorge on information that confirms our ideas, and we shun what does not.

This dynamic becomes especially problematic in a news landscape of near-infinite choice. Whether navigating Facebook, Google or The New York Times’s smartphone app, you are given ultimate control — if you see something you don’t like, you can easily tap away to something more pleasing. Then we all share what we found with our like-minded social networks, creating closed-off, shoulder-patting circles online.

That’s the theory, at least. The empirical research on so-called echo chambers is mixed. Facebook’s data scientists have run large studies on the idea and found it wanting. The social networking company says that by exposing you to more people, Facebook adds diversity to your news diet.

Others disagree. A study published last year by researchers at the IMT School for Advanced Studies Lucca, in Italy, found that homogeneous online networks help conspiracy theories persist and grow online.

“This creates an ecosystem in which the truth value of the information doesn’t matter,” said Walter Quattrociocchi, one of the study’s authors. “All that matters is whether the information fits in your narrative.”

No Power in Proof

Digital technology has blessed us with better ways to capture and disseminate news. There are cameras and audio recorders everywhere, and as soon as something happens, you can find primary proof of it online.

You would think that greater primary documentation would lead to a better cultural agreement about the “truth.” In fact, the opposite has happened.

Consider the difference in the examples of the John F. Kennedy assassination and 9/11. While you’ve probably seen only a single film clip of the scene from Dealey Plaza in 1963 when President Kennedy was shot, hundreds of television and amateur cameras were pointed at the scene on 9/11. Yet neither issue is settled for Americans; in one recent survey, about as many people said the government was concealing the truth about 9/11 as those who said the same about the Kennedy assassination.

Documentary proof seems to have lost its power. If the Kennedy conspiracies were rooted in an absence of documentary evidence, the 9/11 theories benefited from a surfeit of it. So many pictures from 9/11 flooded the internet, often without much context about what was being shown, that conspiracy theorists could pick and choose among them to show off exactly the narrative they preferred. There is also the looming specter of Photoshop: Now, because any digital image can be doctored, people can freely dismiss any bit of inconvenient documentary evidence as having been somehow altered.

This gets to the deeper problem: We all tend to filter documentary evidence through our own biases. Researchers have shown that two people with differing points of view can look at the same picture, video or document and come away with strikingly different ideas about what it shows.

That dynamic has played out repeatedly this year. Some people look at the WikiLeaks revelations about Mrs. Clinton’s campaign and see a smoking gun, while others say it’s no big deal, and that besides, it’s been doctored or stolen or taken out of context. Surveys show that people who liked Mr. Trump saw the Access Hollywood tape where he casually referenced groping women as mere “locker room talk”; those who didn’t like him considered it the worst thing in the world.

Lies as an Institution

One of the apparent advantages of online news is persistent fact-checking. Now when someone says something false, journalists can show they’re lying. And if the fact-checking sites do their jobs well, they’re likely to show up in online searches and social networks, providing a ready reference for people who want to correct the record.

But that hasn’t quite happened. Today dozens of news outlets routinely fact-check the candidates and much else online, but the endeavor has proved largely ineffective against a tide of fakery.

That’s because the lies have also become institutionalized. There are now entire sites whose only mission is to publish outrageous, completely fake news online (like real news, fake news has become a business). Partisan Facebook pages have gotten into the act; a recent BuzzFeed analysis of top political pages on Facebook showed that right-wing sites published false or misleading information 38 percent of the time, and lefty sites did so 20 percent of the time.

“Where hoaxes before were shared by your great-aunt who didn’t understand the internet, the misinformation that circulates online is now being reinforced by political campaigns, by political candidates or by amorphous groups of tweeters working around the campaigns,” said Caitlin Dewey, a reporter at The Washington Post who once wrote a column called “What Was Fake on the Internet This Week.”

Ms. Dewey’s column began in 2014, but by the end of last year, she decided to hang up her fact-checking hat because she had doubts that she was convincing anyone.

“In many ways the debunking just reinforced the sense of alienation or outrage that people feel about the topic, and ultimately you’ve done more harm than good,” she said.

Other fact-checkers are more sanguine, recognizing the limits of exposing online hoaxes, but also standing by the utility of the effort.

“There’s always more work to be done,” said Brooke Binkowski, the managing editor of Snopes.com, one of the internet’s oldest rumor-checking sites. “There’s always more. It’s Sisyphean — we’re all pushing that boulder up the hill, only to see it roll back down.”

Yeah. Though soon, I suspect, that boulder is going to squash us all.
