Alongside a picture of his Facebook employee badge and a drawing of Justice Ruth Bader Ginsburg, Adin Rosenberg posted a lengthy note Monday explaining why he was leaving the company.
“These past years working on Messenger and Instagram have helped me grow personally and professionally, and I look back at them with many fond memories,” Rosenberg wrote in a Facebook post. “However, recently I’ve been feeling a growing sense of disillusionment.”
Rosenberg, who had been a software engineer for almost six years before leaving, is one of a now-steady trickle of Facebook employees who have left in recent months and made clear that they do not see the company as a force for good.
“As a result of the company’s obsession with its growth, so many things go wrong,” wrote Rosenberg, who did not respond to a request for comment.
Other Facebook employees who have left have offered similar sentiments. Ashok Chandwaney left Facebook last month after more than five years as an engineer working in various departments.
“It’s very clear to me, after everything that’s happened, that Facebook’s work has life and death consequences,” he said in an interview. “I did not believe there was a way while working there that I could help move the company to take more seriously some of these really critical issues.”
Chandwaney said he did not raise his concerns internally until he had given his two weeks’ notice. He said he loved the work and his colleagues but explained that he felt forced to leave because the company “is choosing to be on the wrong side of history.”
In recent months, at least four employees have quit in protest, each posting a message to their colleagues on their way out. Others who still work at the company have spoken out anonymously for fear of retaliation.
“I’m quitting because I can no longer stomach contributing to an organization that is profiting off hate in the US and globally,” Chandwaney wrote in his three-page resignation letter posted on an internal message board on the morning of Sept. 8.
The posts come as Facebook faces one of the most challenging periods in its 16-year history. The company has made a series of changes in the past four years meant to avoid a repeat of the 2016 election, in which the social network was used by Russia to spread disinformation and stoke partisan divides. While foreign interference remains a threat, the company has also been pushed to do more about domestic issues related to extremism and racism. On Friday, Facebook announced that it would ban QAnon content from all its platforms.
Facebook has faced external criticism for years, but the internal pushback from employees is a relatively recent phenomenon following CEO Mark Zuckerberg’s decision to leave up a post from President Donald Trump that was criticized as a call for violence against protesters. Many Facebook employees went public with their disagreement over the decision.
Those disagreements have also been voiced internally, sparking some reaction from Facebook. The company recently put in place new rules about how its internal message boards — a feature that allows employees to openly discuss a wide variety of topics and dates back to the early days of the company — can be used.
Two employees, who spoke to NBC News anonymously for fear of losing their jobs, said the more restrictive rules have been met with resistance and criticism. The changes ban any text from appearing in internal profile pictures, including statements like “Black Lives Matter” or “Make America Great Again.” Political and social commentary that was once allowed anywhere in the system is now only permitted in specific, moderated groups, they said.
“The response has been pretty overwhelmingly negative,” one software engineer who has worked at Facebook for five years said, calling it a “significant change to our company culture and just limiting our expression of identity and free speech internally.”
The employees both believe the changes directly contradict Zuckerberg’s public advocacy of free expression.
“I see this as Facebook acknowledging internally that the problems that it has externally [are] not actually sustainable internally,” the software engineer said by phone.
Facebook has taken more aggressive action in recent weeks in moderating some topics that it once held at arm’s length. In addition to its QAnon ban, the platform has also given Trump little leeway, removing a post from the president’s account Tuesday that compared the coronavirus to the flu.
Facebook declined to make anyone available for an interview in response to these employee allegations. A company spokesperson, Nkechi Nneji, said in an email that it continues to work toward improving its moderation.
“We’ve invested billions of dollars to keep hate off of our platform,” the spokesperson wrote. “We’ve banned over 250 white supremacist organizations and in three months, we’ve removed over 4 million posts praising or supporting hate groups including the Proud Boys. We prohibit militarized social movements from maintaining Facebook Pages, groups and Instagram accounts and we’ve removed thousands of their Pages and groups.”
The spokesperson added, “While there’s more to do, any suggestion that we aren’t taking action against hate is disproven by the progress that we’ve made.”
But those moves don’t address what employees point to as systemic problems at Facebook.
The two Facebook employees echoed what Rosenberg and Chandwaney touched on: a lack of company motivation to make hard choices that would go against Facebook’s mission statement to “bring the world closer together.”
“Facebook’s a big company with a lot of smart people working at it. If they wanted to be more diligent about these problems, they could, but they’re not,” a Facebook designer who has worked at the company for three years said.
The situation has left some employees continuing to question whether they want to work at Facebook.
“I have a daily crisis of integrity, working at Facebook,” the software engineer said. “I’m pretty much always thinking about whether I should stay or leave.”