Courtney Balcombe

Staff Writer

clb6264@psu.edu

Many social media users notice that they are shown products they are already interested in. Whether it is shopping, games, or a new topic they have recently looked into, they will then see related product ads over and over. This repeated product placement is driven by an algorithm that each social media platform uses to tailor advertisements to the user’s interests.

 

According to The Washington Post, the phrase “the algorithm” has taken on sinister and even mythical overtones. It is, at its most basic level, a system that decides a post’s position on the news feed, based on predictions about each user’s preferences and tendencies. The details of its design determine what sort of content thrives on the world’s largest social networks, and what types languish. This in turn shapes the posts we all see and the ways we interact on the platform.
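At its core, that kind of ranking system boils down to scoring every candidate post with a model’s predictions about how the user will react, then sorting. The sketch below is a heavily simplified, hypothetical illustration of that idea; the signals, weights, and scoring formula are assumptions made for the sake of the example, not any platform’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    predicted_like: float      # model's guess the user will like this post
    predicted_comment: float   # model's guess the user will comment
    predicted_share: float     # model's guess the user will share
    hours_old: float           # how long ago the post was published

def score(post: Post) -> float:
    """Combine predicted reactions into one number; higher appears sooner.

    The weights are invented for illustration -- real platforms tune
    thousands of signals, not three.
    """
    engagement = (1.0 * post.predicted_like
                  + 4.0 * post.predicted_comment
                  + 6.0 * post.predicted_share)
    freshness = 1.0 / (1.0 + post.hours_old)  # newer posts decay less
    return engagement * freshness

def rank_feed(posts: list[Post]) -> list[Post]:
    # The "algorithm" at its most basic: sort every candidate post by score.
    return sorted(posts, key=score, reverse=True)
```

In a sketch like this, a post the model expects a user to comment on or share outranks a merely likable one, which is one way engagement-heavy content can come to dominate a feed.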

 

When Facebook’s news feed first launched in 2006, it consisted largely of updates from friends and family who also used the site, such as “Marina updated her profile picture” or “Lucus started a job at Walmart.”

 

Later, in 2009, Facebook’s news feed algorithm began determining the order of stories for each user, pushing the juiciest items, like the news that a friend was “no longer in a relationship,” toward the top of the user’s feed.

 

In June 2021, Instagram shared some insight into how its algorithm works. According to Instagram’s about page, “It’s hard to trust what you don’t understand. We want to do a better job of explaining how Instagram works. There are a lot of misconceptions out there, and we recognize that we can do more to help people understand what we do.”

 

Instagram first explained that it doesn’t have just one algorithm determining what its users do and don’t see, but several. When the app launched in 2010, “Instagram was a single stream of photos in chronological order. But as more people joined and more was shared, it became impossible for most people to see everything, let alone all the posts they cared about.”

 

By 2016, the company found that users were missing 70% of the posts in their feed, including almost half of the posts from their close friends.

 

“Each part of the app, Feed, Explore and Reels, uses its own algorithm tailored to how people use it,” Instagram shared. “People tend to look for their closest friends in Stories, but they want to discover something entirely new in Explore. We rank things differently in different parts of the app, based on how people use them.”
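One loose way to picture “a different algorithm for each part of the app” is that each surface weighs the same kinds of signals differently. The snippet below is purely illustrative; the surface names echo Instagram’s, but the signals and weights are assumptions, not anything the company has published.

```python
# Hypothetical per-surface weightings: Stories leans on closeness to the
# author, Explore leans on novelty, Feed sits in between. Numbers invented.
SURFACE_WEIGHTS = {
    "feed":    {"closeness": 0.5, "predicted_interest": 0.4, "novelty": 0.1},
    "stories": {"closeness": 0.8, "predicted_interest": 0.2, "novelty": 0.0},
    "explore": {"closeness": 0.0, "predicted_interest": 0.5, "novelty": 0.5},
}

def surface_score(signals: dict[str, float], surface: str) -> float:
    weights = SURFACE_WEIGHTS[surface]
    return sum(w * signals.get(name, 0.0) for name, w in weights.items())

# The same post scores very differently depending on where it could appear.
post_signals = {"closeness": 0.9, "predicted_interest": 0.3, "novelty": 0.1}
for surface in SURFACE_WEIGHTS:
    print(surface, round(surface_score(post_signals, surface), 2))
```

Under these made-up weights, a close friend’s post scores highest in Stories and lowest in Explore, which mirrors the behavior Instagram describes.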

 

According to The Economist, from 2006 until 2016, Twitter users saw only the tweets posted by the people they followed, in reverse-chronological order. After Twitter introduced an algorithmic timeline, only 1% of users remained on the old system. A later study of the algorithm examined more than 3,000 accounts belonging to 32 political parties, using data from April 2020 onward.

 

According to The Washington Post, Twitter researchers announced in October 2021, after analyzing tweets posted in 2020 by elected officials in Canada, France, Germany, Japan, Spain, Britain and the United States, that accounts belonging to political figures receive algorithmic amplification.

 

“Tweets about political content from elected officials, regardless of party or whether the party is in power, do see algorithmic amplification when compared to political content on the reverse-chronological timeline,” Dr. Rumman Chowdhury summarized the results in her Twitter thread. “Group effects did not translate to individual effects. Since party affiliation or ideology is not a factor our systems consider when recommending content, two ppl in the same political party would not necessarily see the same amplification. In 6 out of 7 countries, tweets posted by political right elected officials are algorithmically amplified more than the political left. Right-leaning news outlets (defined by 3rd parties) see greater amplification compared to left-leaning.”
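The comparison the researchers describe can be read as a simple ratio: how often a group’s tweets are seen under the algorithmic timeline versus how often they would have been seen under the plain reverse-chronological one. The snippet below is only an illustration of how such a ratio reads, not the study’s actual methodology or data.

```python
def amplification_ratio(algorithmic_impressions: float,
                        chronological_impressions: float) -> float:
    """Values above 1.0 mean the ranked timeline surfaced the content more
    often than the reverse-chronological baseline would have."""
    return algorithmic_impressions / chronological_impressions

# Invented numbers, purely to show how the ratio is interpreted.
print(amplification_ratio(1500, 1000))  # 1.5 -> amplified by the algorithm
print(amplification_ratio(900, 1000))   # 0.9 -> shown less than the baseline
```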

 

Chowdhury concluded that there is no single “master algorithm” deciding what a Twitter user sees; instead, what appears in each person’s timeline is the product of multiple algorithmic systems working together.

 

Even newer apps like TikTok, which launched in 2016, use an algorithm to keep showing users more of the content they interact with most. According to CBS News, Michael Beckerman, TikTok’s head of public policy for North America, said the company has “tools in place” to moderate how long users are on the app and what they see.

 

“TikTok is about entertainment and bringing joy,” Beckerman said. “You put a premium on authentic content, uplifting content. But like all entertainment, you want to watch with moderation, and we put tools in place, take-a-break video, screen time management, and tools for parents like family pairing to make sure that they can have conversations and do what’s right for their family and their teenagers.”

 

While social media is very popular with teens and adults alike, the algorithms are not always set up with younger teens in mind. Beckerman shared that TikTok focuses on age-appropriate experiences. However, The Wall Street Journal investigated TikTok and found that bot accounts registered as minors were shown hundreds of videos about drug use and eating disorders, along with recommendations for porn sites.

 

As social media continues to advance, platforms may need to shift their algorithms toward ones that pay more attention to users’ age groups rather than only the specific content users engage with.
