As Artificial Intelligence Spreads on Social Media, Users Struggle to Know What to Trust

By: Zoe Santos, arts, culture, and sports reporter

BLACKSBURG, Va. (Dec. 11, 2025) – Virginia Tech sophomore Cooper Teich is looking at an AI-generated image of an AI influencer posing with Elon Musk.

Artificial intelligence has become increasingly visible on social media, shaping what users see, share, and believe online. Once limited to photo filters and automated captions, AI now generates realistic videos, images, and digital personas that blend seamlessly into everyday feeds. As the technology spreads across platforms such as Instagram and TikTok, users are left to question what is real and how that uncertainty is reshaping online culture.

Virginia Tech sophomore Cooper Teich said artificial intelligence appears in her social media feeds multiple times a day, often without a clear indication that the content is not real. While some posts are clearly labeled or exaggerated, others resemble authentic news footage or personal content shared by real users.

BLACKSBURG, Va. (Dec. 11, 2025) – Cooper Teich, a Virginia Tech sophomore, poses for a photo.

“There are videos where something bad happens, and you don’t know if it actually happened,” Teich said. “I don’t know what to believe.”

Teich said the growing presence of AI-generated content has changed how she engages with social media. She now scrolls more cautiously, pauses more frequently, and checks comment sections for context before accepting videos at face value. What once felt like passive consumption has become an active process of verification.

She said the emotional impact of AI-generated content is often immediate, particularly when videos depict emergencies, violence, or distressing situations. Even when the content is later identified as artificial, the initial reaction remains.

“You still feel something when you see it,” Teich said. “Even if you find out it’s fake, the reaction already happened.” 

The uncertainty surrounding AI-generated content became more apparent during Thanksgiving break, when a family member showed Teich a video he believed depicted a serious car crash near his home.

“He thought it happened on his street,” Teich said. “He was really concerned and went outside to check.”

The video was entirely generated by artificial intelligence. 

Teich said moments like that illustrate how AI-generated content affects more than just younger users who are accustomed to questioning what they see online. Older adults, she said, are often more likely to accept realistic videos at face value, especially when they resemble local news footage or familiar environments.

“Imagine how often that happens when no one’s there to explain it,” she said. 

“I don’t know what to believe.”
– Cooper Teich, Virginia Tech sophomore

While younger users may be quicker to suspect a video is AI, Teich said the responsibility to interpret and verify content increasingly falls on individuals, regardless of age. That responsibility creates a culture in which uncertainty is normalized, and skepticism becomes necessary for everyday media consumption.

A screenshot from the Pew Research Center website shows differences between U.S. adults and AI experts in how they view artificial intelligence’s future impact.

Concerns about artificial intelligence extend beyond individual experiences. A 2025 Pew Research Center report found a large divide between how U.S. adults and AI experts view the technology’s future. About half of AI experts surveyed said artificial intelligence will have a positive effect on society, while only a small share of U.S. adults expressed the same optimism.

The gap suggests that while those working most closely with AI tend to see its potential benefits, the broader public remains far more cautious, reflecting a cultural disconnect between technological development and public trust as artificial intelligence becomes more visible in daily life. 

Carolyn Kogan, a Virginia Tech adjunct professor, poses for a photo. (Image courtesy of Carolyn Kogan)


Carolyn Kogan, an adjunct professor at Virginia Tech who studies online behavior and digital culture, said artificial intelligence intensifies long-standing issues on social media by increasing realism while reducing accountability.

“When accountability is lowered, people react more emotionally and question less,” Kogan said. “AI makes that problem worse because it looks real.”

Kogan said misinformation is not new to social media, but artificial intelligence allows false or misleading content to spread faster and appear more convincing than before. Visual realism, she said, increases the likelihood that users will engage with content emotionally before evaluating its accuracy.

“Images and videos carry a different kind of authority,” Kogan said. “People trust what they can see.”

She explained that social media has long encouraged users to present idealized, public-facing versions of themselves, what sociologists refer to as “front stage” behavior. Artificial intelligence, she said, accelerates that process by removing the human element entirely.

“People already curate a front-facing version of their lives online,” Kogan said. “AI takes that one step further by removing the human altogether.”

Without a real person behind the content, accountability becomes increasingly abstract. AI-generated images and videos can circulate widely without a clear creator, making it difficult to determine who is responsible when the content is misleading or harmful. 

“When accountability is lowered, people react more emotionally and question less.”
– Carolyn Kogan, adjunct professor at Virginia Tech

That lack of accountability, Kogan said, contributes to a culture where skepticism is necessary but not always practiced.

“Not everyone has the same ability or awareness to question what they’re seeing,” she said.

On platforms such as Instagram, artificial intelligence appears in both obvious and subtle ways. In addition to AI-generated videos and images, some accounts feature AI influencers: digital personas designed to look and behave like real content creators. These accounts often post lifestyle content, promote products, and interact with followers, sometimes without clear disclosure that they are not human.

While AI influencers represent only one segment of AI-driven content online, Kogan said their presence reflects a broader cultural shift in how social media operates.

“These platforms reward engagement, not authenticity,” Kogan said. “If something performs well, it gets amplified, whether it’s real or not.”

Teich said encountering AI influencers has made her more skeptical of what appears in her feed.

“You’ll see someone who looks completely real, and then you find out they don’t even exist,” she said. “It makes you stop and question everything else you’re seeing.”

She said that realization has changed how she interacts with influencers more broadly, including human creators who use heavy editing or undisclosed AI tools.

Social media companies have begun responding to growing concerns about artificial intelligence. Meta, which owns Instagram and Facebook, now requires creators to label content that has been significantly altered or generated by AI. TikTok and YouTube have introduced similar disclosure policies for realistic AI content.

The policies are intended to help users better understand what they are seeing and reduce the spread of misleading content. However, enforcement varies across platforms, and labels are not always immediately visible to viewers.

AI-generated videos and images can still circulate widely before users realize the content is artificial, particularly when posts are reposted, edited, or shared without context.

Teich said labels can be helpful, but do not fully address the problem.

“If it looks real, people are going to believe it at first,” she said.

She also questioned the ethics of allowing highly realistic AI content to circulate freely in the same spaces as authentic photos and videos.

“I don’t think it’s ethical,” Teich said. “It makes you question what social media is even supposed to be.”

Despite growing skepticism, Teich said avoiding social media altogether feels unrealistic. Like many college students, she relies on platforms such as Instagram and TikTok for communication, entertainment, and information, even as trust in what appears online continues to erode.

That reliance on social media, Kogan said, reflects a broader cultural reality. Social media platforms are deeply embedded in daily life, making disengagement difficult even for users who are aware of the risks.

“When people can’t trust what they’re seeing, it affects how they interact with content and with each other,” Kogan said. “It changes how relationships, information, and identity function online.”

Kogan said artificial intelligence forces users to confront those issues more directly, pushing questions of trust and authenticity to the forefront of digital culture.

For Teich, navigating social media now requires her to be skeptical of anything she sees. She scrolls more carefully, questions videos that provoke strong emotional reactions, and relies on external context to determine whether content is credible.

“It just makes everything feel less certain,” she said.

As artificial intelligence becomes harder to separate from reality, users are left to adapt in real time. In a digital environment where fabricated and authentic content coexist, the ability to question what appears on a screen has become an essential part of social media use and a defining feature of online culture. 

International Art Exchange Gets Its Start at Virginia Tech

By Zain Omar

The Art, Research, and Technology Exchange (ARTx) was founded in collaboration with Virginia Tech’s Institute for Creativity, Arts and Technology (ICAT) and hosts conferences with partner universities in specialized performance venues, focused on advancing technology’s role in art.

The idea for ARTx took shape when Kyle Hutchins, assistant professor of practice at Virginia Tech, performed a piece composed specifically for the Cube in the Moss Arts Center. He realized that certain works could not be duplicated or transferred to digital media because the experience of the performance is shaped by its environment.

ARTx allows professionals and students in the art space to research how advancements in technology affect the way art is shared, learned and taught. Universities and organizations have the opportunity to be awarded grants for research that keeps advancing multimedia performance spaces on their campuses. Virginia Tech was awarded the SEAD grant to fund future projects at the Cube during the spring 2024 ARTx conferences.

ARTx features guest lectures at its events, where researchers and art faculty from around the world share how technology has affected the learning and performance landscape. Music therapist Grace Carr has experienced first-hand how technology has affected the way we learn art. “As a music therapist, I have seen first-hand how teaching music and understanding it have changed because of technology. It is my opinion that technology has allowed for teaching to become much more accessible and readily available to people,” said Carr.

ARTx research focuses on advancements in technology and how they impact the way art is shared, whether in educational environments or in specialized performance venues, such as the Cube. According to Virginia Tech’s Institute for Creativity, Arts and Technology, “The initiative emphasizes collaborations with peer institutions that feature spatial audio and multimedia performance spaces and festivals.”

Advancements in technology also play a role in art classrooms. As new technology is being introduced each year, students and teachers must adapt to new ways to create.

Advancements in technology change the learning landscape for students pursuing careers in artistic fields. Former music education student and music therapist Grace Carr found that technology has enhanced the learning environment in those fields. “I would say that technology has changed the way we learn art, in that it has simply added on to what we learn. That is not to say that I didn’t learn anything the ‘old-fashioned’ way. When I was learning to transcribe music, I would first learn on paper, then on computer software. So, I would say that in learning art, technology can help us enhance what we already know,” said Carr.

With the rise of art created for specialized research environments, institutions have found that sharing their work in other settings can be difficult. “When institutions have highly specialized research spaces for art-making, sharing work with other institutions can become challenging, if not impossible,” according to ICAT. ARTx is a way for these institutions to team up and develop these works so they can be shared in other spaces.

Through ARTx, researchers are finding ways to better share works that have been composed to fit only certain environments. This research will open doors for new ideas and innovative ways that art can be created.

ICAT hosts collaborations with festivals, universities, and organizations throughout the year. ARTx has teamed up with 11 partners around the world, with locations in Canada, Ireland, California, New York, and Washington, D.C.

Virginia Tech hosts the New Music and Technology Festival every two years, where faculty and researchers who are a part of ARTx convene to share their research through lectures, performances and installations. The festival also features student works and is an environment for art and technology disciplines to learn from and collaborate with each other. According to ICAT, “the festival highlights diverse disciplines, including music, theatre, cinema, dance, visual art, creative coding, computer science, neuroscience, molecular biology, robotics, and cybersecurity.”

Events for this festival are held in specialized research and performance spaces around Virginia Tech’s campus, such as the Cube, the Sandbox, and Perform Studio.

When Virginia Tech is not hosting the organizations that are part of ARTx, faculty members are sent to conferences around the world hosted by other institutions in the art exchange. The most recent conference Virginia Tech attended was a five-day gathering at the Centre for Interdisciplinary Research in Music Media and Technology in Montreal, Canada.

Through the collaboration between ARTx and ICAT, along with the other universities and organizations that have joined the art exchange, new understandings of how technology and art intertwine will start to change the way we share and view art.