Digital media sources play an increasingly significant role in the news consumption of younger adults, with 53% obtaining at least some of their information from social media, according to the Pew Research Center. With fake news able to gain alarmingly fast traction through posts and shares, misinformation is becoming increasingly prevalent, and media literacy increasingly important.
A Washington Post article reported that NewsGuard, a misinformation-tracking service, found that by May of last year, the number of AI-generated websites had grown from 49 to 600, an increase of more than 1,000%. These AI-generated websites produce entirely fake content.
In a world where it is becoming increasingly difficult to verify whether something is AI-generated, it is vital to understand how misinformation spreads and how to build media literacy.
Tyrah Majors, a broadcast anchor for KOMO and adjunct professor in the Communications and Media department, worries about misinformation, specifically on social media. Majors emphasized how important it is to look at the source of posts and make sure to verify whether it’s a reliable and trustworthy news source.
“I’ve seen so many fake posts, and it’s like, why is everyone posting this? And then you can verify, oh, actually that wasn’t real, but it’s too late. Everyone has already seen the fake post. You’ve got to verify things in a thorough way because mistakes spread faster than corrections,” Majors said.
Majors explained that algorithms don’t care about fact-checking and will spread whichever post receives the most engagement. Just because a post has a lot of likes doesn’t make it accurate.
Earlier this year, Meta announced the removal of third-party fact-checkers on Instagram and Facebook, replacing them with a community notes option for users. This raises questions about whether information on Meta platforms can be trusted to reflect actual facts, and whether community notes open the door to biased censorship.
Yana Chakalo, president of Seattle U’s Technology, Media and Telecommunication Law Association, strives to teach other students how to be media literate by hosting panels and workshops with professionals. Chakalo shared that recent workshops have focused on telecommunications and media law.
“We have focused on the FCC and the FTC and how they regulate media and telecommunications in the United States, which is very interesting. It’s law, but it’s also kind of political because presidents appoint people to the FTC, the FCC. So it can get kind of dicey there,” Chakalo said.
While the importance of media literacy and recognizing misinformation is widely acknowledged in the media industry, those skills are also being taught in educational spaces.
Rio Slevin, a second-year cellular and molecular biology major, said that they have learned how to strengthen media literacy and combat misinformation in multiple classes. As a STEM student, Slevin uses certain tools specific to those courses.
“[We have] a whole lab dedicated just to learning how to identify things that might be misinformation, as well as how to do proper research to make sure you’re not having biased information,” Slevin shared.
Despite having the tools to identify misinformation, Slevin still worries about the influence that AI has in the realm of research, saying that they attempt to avoid it whenever possible.
“Sometimes I use AI kind of unknowingly because of the forced integration of AI we have with things like Google AI,” Slevin said.
Dragon Truong-Le, a fourth-year accounting and finance double major, is similarly wary when it comes to using artificial intelligence. Although he uses AI platforms such as ChatGPT as tools for brainstorming, he cautioned that everyone should verify the sources they provide, as they are often wrong.
“They might give the incorrect name of an article or mix and match articles together, and even give articles that don’t exist,” Truong-Le said.
To counteract this, Truong-Le and Slevin employ similar strategies when searching for sources: using research papers and articles that have been referenced multiple times by trustworthy people, such as professors and scientists.
Slevin and Truong-Le learned this method by participating in media literacy courses offered through Seattle University’s Lemieux Library, where librarians go into various classes to teach students how to practice media literacy and recognize misinformation.
“[I think that] learning media literacy should be mandatory and added into course material for the general entry-level courses you take in your first two years,” Truong-Le shared.
The Lemieux Library also offers research assistance, where librarians help students find reliable sources.
Anniyah Fitzhugh, a second-year strategic communications and theatre double major, said that she hasn’t had any formal media literacy training, but that she has had some projects where her professor encouraged her to look at discourse and opposing opinions on specific issues.
Fitzhugh said that in one of her classes, she studied how different social media sites promote specific agendas based on their target audiences. Fitzhugh also learned from classmates in her communications courses.
“My classmates would do presentations on ways you can improve your media literacy, like fact-checking, understanding the context of what’s being said, and recognizing the difference between actual news and someone’s opinion,” Fitzhugh said.
She also worries about misinformation in her fields of study, noting that when it comes to strategic communications, building a relationship with an audience can be challenging due to the prevalence of false information. Some consumers may lack the media literacy skills necessary to fact-check or understand the context of certain information.
“It is vital for communications experts and PR teams to be intentional about verifying facts and creating campaigns that can withstand scrutiny,” Fitzhugh shared.
As companies like Meta remove fact-checking tools and AI-generated platforms publish false articles faster than credentialed newsrooms, it’s becoming increasingly important to have the tools to sift through deceptive and unreliable information.
“At the end of the day, AI is getting information from somewhere else. You didn’t get that information,” Majors said, emphasizing that “AI should be a tool, not a crutch.”
