Before the pandemic, Andrea Norrington barely checked Facebook. A lecturer in Letchworth, England, she was concerned about how the company had let misinformation on Brexit spread unchecked, and was seriously thinking about quitting altogether.
Then, at the end of March, Norrington came down with COVID-19. When she was still ill after two weeks, she started scouring the internet for information about other people who weren't getting better. That's when she found an early Facebook group for COVID long-haulers: people who still have symptoms of the disease after a month.
"When I first started there weren't too many members, just a couple thousand, but it was really helpful to know I wasn't alone," Norrington said.
The group and others like it became an important part of her daily life and recovery. Members talked to doctors, swapped details on symptoms and tracked treatments together to find out what was making things better or worse. Like many Facebook users, Norrington realised that quitting the world's largest social network isn't as easy as hitting a delete button, especially when you're part of its online communities. It's hard to persuade people to leave, to learn a new tool, and to re-create the ease of gathering such a large variety of people.
Of the 2.74 billion people around the world who check Facebook at least once a month, two-thirds use the Groups feature just as often, for everything from health issues and hobbies to political organising.
Facebook says there has been an increase in people using the Groups feature during the pandemic. With in-person social options limited to slow the spread of the coronavirus, people have turned to virtual communities such as Groups for companionship or just to pass time.
But the past few months have also seen another wave of calls to leave Facebook. People have criticised the company for not doing enough to crack down on groups spreading the unfounded QAnon conspiracy theory before it eventually said it was banning all QAnon accounts, and for not taking enough action against posts from US President Donald Trump that contained misinformation and lies before it labelled his posts that falsely declared victory after the election.
Calls to leave Facebook are usually triggered by whatever controversy lands the company in the spotlight. And there has been no shortage of controversies in the company's 16 years, from Cambridge Analytica and data privacy issues to the ways it has handled disinformation.
Because of Facebook's massive size, the effects of any exoduses have so far been negligible, like attacking a T. rex with spitballs. It's so big that even advertisers leaving doesn't make a dent. In July, more than 1,000 companies took part in a boycott over how Facebook had handled hate groups, pulling their ad dollars from the platform. It generated a large amount of press attention but did not significantly affect the company's ad revenue.
The company has pushed Groups as the future of Facebook since 2017, when users' interest in their traditional news feeds was waning. It invested heavily in the feature, made Groups a central part of advertising campaigns, and even started hosting an annual "communities summit."
"The only reason I wouldn't leave Facebook right now is because of the groups. Everything else can be replaced," said Julia Pfeil, a 23-year-old from North Florida who is in 55 Facebook groups.
She's in neighborhood groups to get local news, groups about gaming and conspiracies, one dedicated to crafting while high, and another that is only for reacting angrily to corn (it has over 150,000 members). Her favourites are a dog-spotting group and one for people who have had a specific illness that Pfeil has experienced herself. These days she estimates she spends up to two hours a day on Facebook, though she checks Twitter more often.
But over the past few weeks she has become increasingly worried about how Facebook enforces its policies, and believes it is censoring people, often without an explanation. Her main issue, she says, is chief executive Mark Zuckerberg himself: "I don't like his influence, I don't like his choices, I don't like what he says."
Some people do manage to tear themselves away from Facebook while continuing to use Instagram or WhatsApp, either unaware the apps are owned by the same company and share data, or unwilling to let them go. Others try more temporary measures to distance themselves from the company, such as disabling their accounts instead of deleting them, leaving the door open to come back. They might delete the app to limit how often they use it, or stop sharing personal information and passively scan their feed.
Without the ability to export a group's history or transfer its members to an outside service, there's no perfect alternative to Facebook's communities.
Facebook offers ways to export your own profile data so you have a copy, and even a tool for moving your photos to competing services. But there are no export settings for groups, no easy way to port them over to a new site. And even if there were, it would be a near-impossible task to convince a hundred members, let alone hundreds of thousands, to leave and learn how to navigate an entirely new app. Without a systematic way to move groups off Facebook, they are left to re-form themselves from scratch.
If they're interested, groups have options ranging from public forums to private chats: Slack, Reddit, Twitter and Discord, or private chat groups on tools such as Signal or iMessage. And Silicon Valley is creating start-ups built around groups, such as Mighty Networks, as well as social apps with a specific political bent, like Parler, which does not offer groups. But moving from one tech company to another might not be a panacea.
"In order to solve this problem, we need structural change, we need policy change," said Andrea Downing, a security researcher and privacy rights advocate who started a nonprofit for supporting online health communities called the Light Collective.
In 2018, Downing found a loophole on Facebook that allowed third parties to download member lists from closed groups using a Chrome extension. Facebook subsequently closed the loophole. As the admin of a group for people with a BRCA gene mutation, which increases the risk of developing breast and ovarian cancer, Downing wants her group and others like it to be able to back up their history, move it off the platform and delete it from Facebook. She is still on the site, even while advocating that the company be broken up and regulated.
Facebook declined to comment on its Groups settings or any plans for them in the future.
"I would love to say people should be avoiding them, but I know for that long-hauler COVID group or for someone who just got diagnosed with cancer, they're going to do what they can to find other people to direct them on that path," Downing said. "I'm not leaving until we can all leave together."
The people who stay on Facebook and work, for free, running and managing groups don't do it for the company. They're committed to the communities they've made and joined, connections that can outweigh concerns about the company.
"I don't feel some special allegiance to them, because I feel as if I've done so much work for free," said Sandy B, an attorney in Los Angeles, who spoke on the condition that only her last initial be used.
Sandy runs a handful of groups and does anti-racist education on Facebook. She puts in hours of her time trying to combat racism in public and private Facebook spaces, from flagging groups with the n-word in their name to dealing with her own Messenger inbox, where she receives threatening and racist messages. She prizes the connections she has made on the site and says it has led her to communities of people she wouldn't have met otherwise. But the work she puts in to stay on Facebook is tiring.
"It feels like being in an abusive relationship," Sandy said. "How many times do you complain about the same things, and how many articles can be written, before they make any changes?"
The Washington Post