“How to delete Facebook” has been trending, and it’s not hard to see why. Facebook has weathered years of whistleblowers and Senate hearings, but the latest leak of internal documents revealed a torrent of issues at the social media company.

“The Facebook Papers,” as the media dubbed the leak, have opened the floodgates, detailing criminal activity such as human trafficking, the promotion of violence, and the spread of misinformation.

#DeleteFacebook trended on Twitter and other social media platforms – a move some critics have been harping on about for years. But the recent news has opened the debate yet again.

Here are four reasons you may want to get rid of your Facebook profile and data.

Facebook Tracks Your Data and Location

If the app is installed on your phone, the company is “secretly” tracking a variety of sensitive data that most users are completely unaware of. According to Forbes, the app captures location data and uses your phone’s accelerometer to track your movements.

It’s possible to set location services to “never,” but the app works around this by harvesting metadata and using the accelerometer to determine where you are and for how long you’re actively using the app.

“If you don’t allow Facebook access to your location, the app can still infer your exact location only by grouping you with users matching the same vibration pattern that your phone accelerometer records,” a researcher told Forbes. “Apps can figure out the user’s heart rate, movements, and even precise location. Worse, all iOS apps can read the measurements of this sensor without permission.”
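To illustrate the researcher’s point, here is a minimal Swift sketch – not Facebook’s actual code – showing that any iOS app can read raw accelerometer data through Apple’s CoreMotion framework without triggering a permission prompt, unlike location access, which requires explicit user consent:

```swift
import CoreMotion

// Hypothetical sketch: reading the accelerometer on iOS requires no
// permission prompt, in contrast to CLLocationManager for location.
let motionManager = CMMotionManager()

if motionManager.isAccelerometerAvailable {
    motionManager.accelerometerUpdateInterval = 0.1  // sample ~10 times per second

    motionManager.startAccelerometerUpdates(to: OperationQueue.main) { data, _ in
        guard let acceleration = data?.acceleration else { return }
        // An app could log these vibration/movement readings and, as the
        // researcher describes, group users with matching patterns to infer
        // that their phones are in the same place.
        print("x: \(acceleration.x), y: \(acceleration.y), z: \(acceleration.z)")
    }
}
```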

Along with location, the app can also be given permission to see a user’s browsing history, purchase history, contact list, photos, and videos. This can be limited in the app’s settings, but some permissions are needed to share photos or add suggested friends.

Facebook Has Not Fixed Its Human Trafficking Problem

In March 2018, CNN found multiple accounts in the Middle East and North Africa promoting the sale of “domestic laborers.” The social networking company said at the time that its “policies did not acknowledge the violation” and that they were “strengthened” following the incident.

Facebook launched a “Human Exploitation Policy” in May 2019, but it did little to combat more than 100 fake Facebook and Instagram accounts that detailed the recruitment of dozens of female victims through Messenger and WhatsApp.

“While our previous efforts are a start to addressing the off-platform harm that results from domestic servitude, opportunities remain to improve prevention, detection, and enforcement,” Facebook said in an official statement in February. “We’re continuing to refine this experience to include links to helpful resources and expert organizations.”

Involvement in the 2020 Presidential Election

QAnon and other right-wing conspiracy groups flourished on Facebook leading up to the 2020 election as the platform struggled to curb the spread of false election results and “Stop the Steal” supporters.

In documents obtained by The New York Times, workers at the social media company warned higher-ups about QAnon-related content in the lead-up to the Jan. 6 insurrection, and the company failed to address their concerns.

According to the documents, Facebook said only that it would “do this better next time.” Mark Zuckerberg told Congress that he believed his company “did our part to secure the integrity of our election.”

Posts promoting violence at the Capitol were not removed until that same day, and posts that “suggested the overthrow of the government” or “voiced support for the violence” remained up while appearing at seven times their usual rate.

A week after the election, company data workers found that 10% of all political posts in the United States spread claims that the election results could not be trusted.

Inability to Stop Covid-19 Misinformation

Learning from its role in the election, Facebook became more aggressive in its fight against Covid-19 misinformation, but one of the most widespread false remedies has eluded detection: the claim that ivermectin, a horse de-wormer, can prevent the spread of the virus.

According to The New York Times, Media Matters for America, a liberal watchdog group, found 60 groups dedicated solely to the discussion of Ivermectin: how it “works” and where to get it. Some groups, which were still active, boasted tens of thousands of members.

“Facebook removes content that attempts to buy, sell, donate or ask for Ivermectin,” a company spokesman told The Verge. “We also enforce against any account or group that violates our COVID-19 and vaccine policies, including claims that Ivermectin is a guaranteed cure or guaranteed prevention, and we don’t allow ads promoting Ivermectin as a treatment for COVID-19.”

Some anti-vax groups have managed to dodge detection, however, by referring to ivermectin in the comments, which are checked less regularly than posts. Some groups have also started referring to the drug by nicknames such as “ivm.”

Despite these efforts, the spread of misinformation has been an ongoing problem for Facebook, and one that doesn’t seem to have an end in sight.