Author: Jonathan Shieber

Sheryl Sandberg says Facebook leadership should have spoken sooner, is open to regulation

The days of silence from Facebook’s top executives after the company banned the political consulting firm Cambridge Analytica from its platform were a mistake, Sheryl Sandberg now concedes.

In a brief interview on CNBC, Sandberg said that the decision she and company chief executive and founder Mark Zuckerberg made to wait before speaking publicly about the evolving crisis was an error.

“Sometimes we speak too slowly,” said Sandberg. “If I look back I would have had Mark and myself speak sooner.”

It was the only significant new statement from Facebook’s top leadership following the full-court press Mark Zuckerberg made yesterday.

The firestorm that erupted over Facebook’s decision to ban Cambridge Analytica — and the ensuing revelations that the data of 50 million Facebook users was accessed by the political consulting and marketing firm without their permission — has slashed Facebook’s stock price and brought calls for regulation of social media companies.

Even as $60 billion of shareholder value disappeared, Zuckerberg and Sandberg remained quiet.

The other piece of news from Sandberg’s CNBC interview was her admission that the company is “open” to government regulation. But even that formulation suggests at best a basic misunderstanding of, and at worst cynical contempt for, the role of government in protecting Facebook’s users.

Ultimately, it doesn’t matter whether Facebook is open to regulation or not. If the government and U.S. citizens want more controls, the regulations will come.

And it looks like Facebook’s proposed solution will end up costing the company a pretty penny as well, as it brings in forensic auditors to track who else might have abused the data harvesting permissions that the company had put in place in 2007 and only sunset in 2015. 

Before the policy change, companies that aggressively acquired data from Facebook would come in for meetings with the social media company and discuss how the data was being used. One company founder — who was a power user of Facebook data — said that the company’s representatives had told him “If you weren’t pushing the envelope, we wouldn’t respect you.”

Collecting user data before 2015 was actually something the company encouraged, under the banner of increased utility for Facebook users — so that calendars could bring in information about the birthdays of friends, for instance.

Indeed, the Obama campaign used Facebook data from friends in much the same way as Cambridge Analytica, albeit with a far greater degree of transparency.

The issue is that users don’t know where their data went in the years before Facebook shut the door on the collection of data from a user’s network of friends in 2015.

That’s what both Facebook and the government are trying to find out.


After selling his company to Facebook for $19B, Brian Acton joins #deleteFacebook

Brian Acton, the co-founder of messaging service WhatsApp (which Facebook bought in 2014 for $19 billion), is now joining the chorus of the #deletefacebook movement.

A tipster alerted us to the fact that Acton made the same call… on Facebook… as well.

Since the sale of WhatsApp (which has made Acton an incredibly wealthy man), Acton has been actively financing more secure (and private) messaging platforms for users.

Acton has already used some of his WhatsApp wealth to give $50 million to the Signal Foundation.

While some may say it’s hypocritical to reap billions from Facebook and then call for users to jump ship, Acton has always had a penchant for supporting privacy. Back in its earliest days, WhatsApp’s stated goal was never to make money from ads:

Why we don’t sell ads

No one wakes up excited to see more advertising, no one goes to sleep thinking about the ads they’ll see tomorrow. We know people go to sleep excited about who they chatted with that day (and disappointed about who they didn’t). We want WhatsApp to be the product that keeps you awake… and that you reach for in the morning. No one jumps up from a nap and runs to see an advertisement.

Advertising isn’t just the disruption of aesthetics, the insults to your intelligence and the interruption of your train of thought. At every company that sells ads, a significant portion of their engineering team spends their day tuning data mining, writing better code to collect all your personal data, upgrading the servers that hold all the data and making sure it’s all being logged and collated and sliced and packaged and shipped out… And at the end of the day the result of it all is a slightly different advertising banner in your browser or on your mobile screen.

Remember, when advertising is involved you the user are the product. — WhatsApp blog, June 18, 2012

It may be that this latest scandal was the straw that borked the camel’s back.

I’ve reached out to Acton for comment.

Facebook hired a forensics firm to investigate Cambridge Analytica as stock falls 7%

Hoping to tamp down the furor that erupted over reports that its user data was improperly acquired by Cambridge Analytica, Facebook has hired the digital forensics firm Stroz Friedberg to perform an audit on the political consulting and marketing firm.

In a statement, Facebook said that Cambridge Analytica has agreed to comply and give Stroz Friedberg access to its servers and systems.

Facebook has also reached out to the whistleblower, Christopher Wylie, and Aleksandr Kogan, the Cambridge University professor who developed an application that collected data that he then sold to Cambridge Analytica.

Kogan has consented to the audit, but Wylie, who has positioned himself as one of the architects of the data collection scheme before becoming a whistleblower, declined, according to Facebook.

The move comes after a brutal day for Facebook’s stock on the Nasdaq stock exchange. Facebook shares plummeted 7 percent, erasing roughly $40 billion in market capitalization amid fears that the growing scandal could lead to greater regulation of the social media juggernaut.

Indeed, both the Dow Jones Industrial Average and the Nasdaq fell sharply as worries over increased regulation of technology companies ricocheted around trading floors, forcing a sell-off.

“This is part of a comprehensive internal and external review that we are conducting to determine the accuracy of the claims that the Facebook data in question still exists. This is data Cambridge Analytica, SCL, Mr. Wylie, and Mr. Kogan certified to Facebook had been destroyed. If this data still exists, it would be a grave violation of Facebook’s policies and an unacceptable violation of trust and the commitments these groups made,” Facebook said in a statement.

However, as more than one Twitter user noted, this is an instance where they’re trying to close Pandora’s Box but the only thing that the company has left inside is… hope.

The bigger issue is that Facebook had known about the data leak as early as two years ago, but did nothing to inform its users — because the violation was not a “breach” of Facebook’s security protocols.

Facebook’s own argument for the protections it now has in place is a sign of its too-little, too-late response to a problem it created for itself with its initial policies.

“We are moving aggressively to determine the accuracy of these claims. We remain committed to vigorously enforcing our policies to protect people’s information. We also want to be clear that today when developers create apps that ask for certain information from people, we conduct a robust review to identify potential policy violations and to assess whether the app has a legitimate use for the data,” the company said in a statement. “We actually reject a significant number of apps through this process. Kogan’s app would not be permitted access to detailed friends’ data today.”

It doesn’t take a billionaire Harvard dropout genius to know that allowing third parties to access personal data without an individual’s consent is shady. And that’s what Facebook’s policies used to allow by letting Facebook “friends” basically authorize the use of a user’s personal data for them.

As we noted when the API changes first took effect in 2015:

Apps don’t have to delete data they’ve already pulled. If someone gave your data to an app, it could go on using it. However, if you request that a developer delete your data, it has to. How you submit those requests could be through a form, via email, or in other ways that vary from app to app. You can also always go to your App Privacy Settings and remove permissions for an app to pull more data about you in the future.
