. . .
While Facebook has banned outright false and misleading statements about coronavirus vaccines since December, a huge realm of expression about vaccines sits in a gray area. One example is a comment from someone worried about side effects that proved more severe than expected. Such comments can foster meaningful conversation, and can even surface unknown side-effect information to health authorities, but they may also contribute to vaccine hesitancy by playing on people's fears.
The research explores how to address that tension by studying such comments, which the company's software algorithms tag "VH," for vaccine hesitancy, as well as the nature of the communities that spread them, according to the documents. Its early findings suggest that a large amount of content that breaks no rules may nonetheless be causing harm in certain communities, where it has an echo chamber effect.
Facebook's data scientists divided the company's U.S. users, groups and pages into 638 population segments to explore which kinds of communities hold vaccine-hesitant beliefs. The document did not identify how Facebook defined a segment or grouped communities, but noted that a single segment could comprise at least 3 million people.
Some of the early findings are notable: Just 10 of the 638 population segments contained 50 percent of all vaccine-hesitant content on the platform. And in the segment with the most vaccine hesitancy, just 111 users contributed half of its vaccine-hesitant content.
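To make that concentration statistic concrete, here is a minimal sketch of the calculation it implies: rank segments by the volume of VH-tagged content each produces, then count how many of the largest segments are needed to account for half the total. The per-segment counts below are invented for illustration; the documents do not disclose Facebook's actual figures, segment definitions, or tooling.

```python
import random

random.seed(0)

# Hypothetical: VH-tagged content counts for each of 638 segments,
# drawn from a heavy-tailed distribution to mimic the reported skew.
vh_counts = [int(1000 * random.paretovariate(1.2)) for _ in range(638)]

def segments_covering(counts, share=0.5):
    """Count how many of the largest segments account for `share`
    of the total content volume."""
    total = sum(counts)
    running = 0
    for i, count in enumerate(sorted(counts, reverse=True), start=1):
        running += count
        if running >= share * total:
            return i
    return len(counts)

# With a sufficiently skewed distribution, a handful of segments
# can hold half of all the content, as the research reportedly found.
print(segments_covering(vh_counts))
```

The same ranking logic, applied to individual users within a single segment, would yield the second finding: a small cluster of prolific accounts (111 users, in the segment Facebook examined) producing half of that segment's content.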