The "trust and safety winter" for misinformation ahead of the election was caused by trolls, lawsuits, and layoffs.
- Misinformation researchers face a more hostile environment this election season, as online trolls and expensive lawsuits take their toll.
- Online misinformation is more widespread than ever, and advances in artificial intelligence are making it harder to detect.
- Social media companies are restricting researchers' access to the data tools they relied on for their studies.
Nina Jankowicz's dream job has turned into a nightmare.
Over the past decade, she has focused on researching and analyzing the spread of Russian propaganda and internet conspiracy theories. In 2022, she was appointed to the newly created Disinformation Governance Board, established within the Department of Homeland Security to help coordinate the fight against online threats.
Now, amid intensifying hostility toward misinformation researchers ahead of the presidential election, Jankowicz faces an onslaught of government inquiries, lawsuits and harassment.
Jankowicz, the mother of a toddler, has grown so anxious over death threats that she recently dreamed a stranger broke into her house with a gun. She threw a punch at the intruder in the dream; in reality, her fist grazed the baby monitor on her bedside table. Jankowicz now keeps out of public view and no longer publicizes her events.
"Jankowicz stated, "I do not desire someone who intends harm to appear." He has had to adapt his behavior due to this change in his surroundings."
In previous election cycles, researchers such as Jankowicz were celebrated by lawmakers and company executives for their work exposing Russian propaganda campaigns, Covid conspiracies and false claims of voter fraud. But the 2024 election cycle has been marked by the threat of litigation from powerful figures such as Elon Musk, congressional investigations led by far-right politicians and a swelling number of online trolls.
The constant attacks and legal expenses have become an occupational hazard for researchers, said Alex Abdo, litigation director of the Knight First Amendment Institute at Columbia University. Abdo, whose institute has filed amicus briefs in several lawsuits targeting researchers, added that the "chill in the community is palpable."
Jankowicz is among more than two dozen researchers who spoke with CNBC about the changing environment and the safety concerns they now have for themselves and their loved ones. Several asked not to be named to protect their privacy and avoid drawing further attention.
Since Donald Trump's first campaign for president nearly a decade ago, conspiracy theories claiming that internet platforms try to silence conservative voices have steadily gained traction.
'Those attacks take their toll'
The chilling effect is a major concern, some researchers said, because online misinformation is more common and harder to detect than ever, especially with the rise of AI. It's akin to pulling police off the streets just as crime surges.
The field is in a "trust and safety winter," according to Jeff Hancock, president of the Stanford Internet Observatory (SIO). He has experienced it firsthand.
In 2023, conservative groups sued Stanford's SIO three times, accusing the institute's researchers of colluding with the federal government to suppress speech. Stanford spent millions defending its staff and students in court.
During that time, SIO downsized significantly.
"Hancock stated during his keynote address at the third annual Trust and Safety Research Conference in September that many people, including his organization's staff and researchers, have lost their jobs or experienced worse consequences. He emphasized the negative impact of these attacks."
SIO didn't respond to CNBC's inquiry about the reason for the job cuts.
In March, Google laid off several employees in its trust and safety research unit, including a director, just days before some of them were scheduled to speak at or attend the Stanford event, according to sources close to the layoffs who asked not to be named.
Google didn't give a reason for the cuts, saying in a statement to CNBC that as it takes on more responsibility, particularly around new products, it makes changes to teams and roles according to business needs, and that it is still growing its trust and safety team overall.
Jankowicz first felt the hostility two years ago, after her appointment to the Disinformation Governance Board in the Biden administration.
The board faced repeated attacks from conservative media and Republican lawmakers, who accused it of restricting free speech, and it was shut down after just four months in operation.
The Homeland Security Advisory Council recommended the termination of the DHS board, and the agency announced the move in August 2022 without providing a specific reason.
A House Judiciary subcommittee subpoenaed Jankowicz as part of an investigation into whether the federal government colluded with researchers to censor Americans and conservative viewpoints on social media.
"That's hard to deal with," Jankowicz said, "I'm the face of it."
Jankowicz has also been dealing with a "cyberstalker" who repeatedly posted about her and her child on the social media site X, forcing her to obtain a protective order. The ordeal has added to the more than $80,000 in legal bills she has already racked up, and she fears that the online harassment could spill over into real-world danger.
Her face was featured on the cover of a munitions handbook circulated on 4chan, and someone used AI software and a photo of her face to create deepfake pornography, placing her likeness in explicit videos.
"Jankowicz, who wrote about her experience in a 2023 story in The Atlantic with the headline, "I Shouldn't Have to Accept Being in Deepfake Porn," stated that she has been recognized on the street before."
One researcher, who asked not to be named, said she has faced more online harassment since Elon Musk's late-2022 acquisition of Twitter, now known as X.
In one direct message viewed by CNBC, a user threatened the researcher, saying they had her home address and suggesting she plan where she, her partner and their "little one" would live.
The researcher and her family moved within a week of receiving the message.
Musk's company has sued researchers and organizations that accused X of failing to curb hate speech and false information.
In November, X sued Media Matters after the group published a report showing that hateful content on the platform appeared alongside ads from major companies, which then halted their ad campaigns. X's lawyers characterized the report as "deliberately misleading."
The World Federation of Advertisers suspended the operations of the Global Alliance for Responsible Media (GARM) in August, following a lawsuit by X that accused the group of organizing an illegal ad boycott.
At the time, GARM said the allegations had drained its resources and finances and caused a distraction.
Billionaires such as Musk can use lawsuits to bog down and financially drain researchers and nonprofits, according to the Knight First Amendment Institute.
No representatives from X or the House Judiciary Committee responded to requests for comment.
Less access to tech platforms
X's actions aren't limited to litigation.
The company changed its data library usage policy last year, switching from offering it for free to charging researchers $42,000 a month for the lowest tier, which grants access to 50 million tweets.
At the time, Musk stated that the change was necessary because the "free API was being misused by bot scammers and opinion manipulators."
Kate Starbird, a professor at the University of Washington who studies misinformation on social media, said researchers relied on Twitter because it was accessible, free and a useful proxy for other platforms.
Starbird, who was subpoenaed in 2023 for a House Judiciary hearing related to her disinformation research, said that perhaps 90% of her team's efforts were focused on Twitter data because there was so much of it.
An even stricter policy takes effect Nov. 15, after the election: under X's new terms of service, users who access more than 1 million posts in a day could face a $15,000 penalty.
X Corp.'s new terms of service will stifle the research that's needed at the most critical moments, Abdo said.
It's not just X.
In August, Meta shut down CrowdTangle, a tool that researchers used to monitor misinformation and trending topics on Facebook and Instagram, and replaced it with the Meta Content Library.
A Meta spokesperson said the new research-focused tool is more comprehensive than CrowdTangle, offering access to the full archive of public content on Facebook and Instagram, and is better suited to election monitoring.
But researchers say data access from Meta, TikTok and Google-owned YouTube remains limited, making content hard to analyze and often forcing them to manually track videos, comments and hashtags.
"Our knowledge is limited to what our classifiers can discover and what is accessible to us, as stated by Rachele Gilman, director of intelligence for The Global Disinformation Index," said Rachele Gilman.
In some cases, companies are making it easier for falsehoods to spread.
In June of last year, YouTube said it would stop removing false claims about fraud in the 2020 U.S. presidential election. And ahead of the 2022 U.S. midterms, Meta introduced a policy allowing political ads to question the legitimacy of past elections.
The YouTube Researcher Program gives hundreds of academic researchers worldwide access to the company's global data API with unlimited quota per project. However, expanding researcher access to new areas of data can be challenging because of privacy concerns, the company says.
TikTok provides free access to updated tools for researchers in the U.S. and EU, and actively seeks feedback from them, according to a TikTok spokesperson.
Not giving up
Researchers are concerned about the time between Election Day and Inauguration Day, according to Katie Harbath, CEO of tech consulting firm Anchor Change.
The storming of the U.S. Capitol on Jan. 6, 2021, as Congress was certifying the election results, remains fresh in everyone's mind. Harbath, a former public policy director at Facebook, said the certification process could again be messy this time around.
"During this uncertain period, companies are contemplating how to handle content, considering options such as labeling, removal, or reducing reach, as Harbath stated."
Despite the obstacles, researchers have notched some legal victories in their fight to keep their work alive.
In March, a federal judge in California dismissed a lawsuit X had filed against the nonprofit Center for Countering Digital Hate, ruling that the suit was an attempt to silence X's critics.
Three months later, the Supreme Court sided with the White House, preserving its ability to ask social media companies to take down false information from their platforms.
Jankowicz, for her part, has refused to give up.
This year she launched the American Sunlight Project, with the goal of ensuring citizens have access to trustworthy sources to inform their daily decisions. Jankowicz told CNBC she also wants to support people in the field who have faced threats and other challenges.
"Jankowicz stated that the common factor among people is their fear of publishing research that was actively published around 2020. They are concerned about facing threats, legal issues, and the potential impact on their positions."