Survey finds more than half of US respondents have blocked a family member on Facebook

Are you tired of seeing your cousin’s political opinions? Or does your uncle post hateful and toxic content? Well, you may not be alone.

In a new survey, researchers found that more than half of respondents, 54.8%, admitted to blocking at least one family member on Facebook.

The survey, conducted by Time2play earlier this month, asked more than 2,000 Facebook and Instagram users in the United States whether and why they had blocked a family member on social media. 

According to the results, 46.4% of respondents cited the posting of "hateful, toxic or problematic things" as the top reason for blocking a relative. Other top reasons included sharing fake news (43.5%) and posting too much political content (41.4%).


In this photo illustration, the social networking site Facebook is displayed on a laptop screen. (Credit: Dan Kitwood/Getty Images)

The study found that only 25.8% of Instagram users admit to blocking a family member on that platform. 

As for why Instagram users block family members, the survey found they were motivated more by concealing their own content from relatives than by avoiding what those relatives were posting.

The study also found that the average age of an Instagram user who has blocked a family member was 26.1 years, compared with 32.7 years for those who have blocked a family member on Facebook.

Facebook had more than 240 million active users in the U.S. as of 2022. The company has been criticized in the past for allowing misinformation and hate speech to spread on its platform.

Last year, the Biden administration slammed Facebook for not doing enough to censor posts that contain misinformation about the COVID-19 pandemic.

Then-White House press secretary Jen Psaki said that the White House was working to flag "problematic posts" found on Facebook which spread disinformation. 

"There's about 12 people who are producing 65 percent of anti-vaccine misinformation on social media platforms. All of them remain active on Facebook, despite some even being banned on other platforms, including Facebook — ones that Facebook owns," Psaki said.

Psaki said President Joe Biden’s administration was working with medical experts and other influential figures to share scientifically sourced information on social media in order to combat false information and anti-vaccine posts. 

Last November, Twitter introduced redesigned warning labels for false and misleading tweets, intended to make them more effective and less confusing.

Twitter reported that misleading tweets that got the redesigned label, which features an orange icon and the words "stay informed," were less likely to be retweeted or liked than those with the original labels.

In a poll released last year, 95% of Americans identified misinformation as a problem when trying to access important information. About half put a great deal of blame on the U.S. government, and about three-quarters pointed to social media users and tech companies. Yet only 2 in 10 Americans said they were very concerned that they had personally spread misinformation.


In addition, about 6 in 10 respondents said they were at least somewhat concerned that their friends or family members had been part of the problem.

According to the poll, 79% of Republicans and 73% of Democrats also said social media companies had a great deal or quite a bit of responsibility for misinformation.