Source: United States Senator for South Dakota John Thune
U.S. Sen. John Thune (R-S.D.), ranking member of the Subcommittee on Communications, Media, and Broadband, today urged Congress to enact meaningful, bipartisan legislation to hold Big Tech companies accountable. Thune discussed the secret and potentially harmful algorithms Big Tech uses to manipulate a user’s experience. He noted that his two bipartisan bills, the Platform Accountability and Consumer Transparency (PACT) Act and the Filter Bubble Transparency Act, would help increase online transparency and accountability for consumers of all ages.
Thune’s remarks below (as prepared for delivery):
“Mr. President, social media has become a big part of a lot of Americans’ lives.
“TikTok. Twitter. Facebook. YouTube. Instagram.
“People turn to social media for connection.
“For entertainment.
“To stay on top of the news.
“For pictures of the grandkids.
“For workout routines and new recipes.
“Social media offers a lot of benefits and opportunities.
“But it’s becoming ever more clear that social media has a darker side as well.
“Social media use can have a negative effect on mental health.
“It can foster negative and divisive engagement and serve as an outlet for illegal activity, from child pornography to human trafficking.
“And it can have a particularly detrimental effect on the still-developing psyches of teenagers.
“The Wall Street Journal recently published a series of disturbing reports, based on information provided by a Facebook whistleblower, that highlighted everything from the use of Facebook for criminal activity in developing countries to the company’s own research showing the negative impact Instagram can have on teenage girls.
“And two weeks ago, the Senate Commerce Committee held a hearing where we heard firsthand from the Facebook whistleblower about the concerns that led her to come forward.
“Next week, the Commerce Committee will be holding a hearing with witnesses from Snapchat, TikTok, and YouTube, examining how these companies treat younger users.
“A recent Wall Street Journal investigation into TikTok revealed how easy it is for younger users to be bombarded with wildly inappropriate content, from videos promoting drug use to disturbing sexual content.
“Mr. President, one major problem with social media that came through once again in the recent Commerce hearing and in the Wall Street Journal’s recent revelations is social media platforms’ use of algorithms to shape users’ experience.
“Gone are the days when you logged into Facebook and simply saw, in chronological order, the content that had been posted since your previous log-in.
“Now Facebook, like other social media platforms, uses algorithms to shape your newsfeed and its suggestions for additional content, emphasizing posts the platform thinks you’ll be interested in and deemphasizing other posts.
“Now, algorithms can be useful, of course.
“If you’re looking for YouTube videos on how to build a bookshelf, you’ll probably appreciate it if YouTube suggests additional videos on how to build a bookshelf … rather than videos on how to roast a turkey or sink the perfect jump shot.
“But algorithms have a problematic aspect as well.
“For starters, many people aren’t aware of just how much their experience on these platforms is being manipulated, or of the negative emotional effects that manipulation can have.
“Disclosure on these platforms can be confusing or nonexistent, so individuals can be largely unaware that the immense amount of personal data social media platforms collect is being used to decide what posts they’re being shown, what ads they’re being offered, and more.
“Individuals end up trapped in what has been termed the ‘filter bubble’ – their own world of filtered search results and tailored content.
“This can lead to everything from political polarization – as users are presented with a narrow, one-sided view of current affairs – to addictive behavior, as the platform doubles down on troubling content users have shown an interest in.
“As the Wall Street Journal’s recent articles on Facebook and TikTok demonstrate, the filter bubble can be particularly troubling in the case of younger social media users, who may watch an inappropriate video and soon find that their feed is filled with similar material.
“In many ways, the filter bubble can end up shaping users’ choices and behavior.
“Mr. President, as a former Commerce Committee chairman and current ranking member of the Commerce Subcommittee on Communications, Media, and Broadband, I’ve been following these issues for a while and have developed two bipartisan bills – the Filter Bubble Transparency Act and the PACT Act – that I think would go a long way toward addressing the problems posed by social media platforms.
“My Filter Bubble Transparency Act, which is cosponsored by Senators Blumenthal and Blackburn, among others, would allow social media users to opt out of the ‘filter bubble.’
“In other words, it would allow them to opt out of the filtered experience tailored for them by opaque algorithms and instead see an unfiltered social media feed or search results that aren’t based on the vast amount of information the platform has on them.
“Facebook, for example, would be required to provide a clear notification to users that the content they see is being shaped by algorithms.
“And then Facebook would be required to provide users with an easily accessible option to see a chronological newsfeed, instead of a newsfeed powered by opaque algorithms emphasizing the posts Facebook wants you to see.
“My Platform Accountability and Consumer Transparency Act – the PACT Act – which I introduced with Senator Schatz, would also increase social media transparency.
“It would require sites to provide an easily digestible disclosure of their content moderation practices for users.
“And it would address censorship concerns by requiring sites to explain their decision to remove material to consumers.
“Until relatively recently, sites like Facebook and Twitter would remove a user’s post without explanation and without an appeals process.
“And even as platforms start to clean up their act with regard to transparency and due process, it’s still hard for users to get good information about how content is moderated.
“Under the PACT Act, if a site chooses to remove your post, it has to tell you why it made that decision and explain how the post violated the site’s terms of use.
“And then it has to provide a way for you to appeal that decision.
“The PACT Act would also explore the viability of a federal program for Big Tech employees to blow the whistle on wrongdoing inside the companies where they work.
“We’ve learned a lot from Frances Haugen, the Facebook whistleblower who spoke to the Commerce Committee two weeks ago.
“And I believe that we should encourage employees in the tech sector to speak up about questionable practices at Big Tech companies so that we can, among other things, ensure Americans are fully aware of how social media platforms use artificial intelligence and personal data to keep users hooked on their platforms.
“Mr. President, as I said earlier, social media offers a lot of benefits.
“But with the ever-increasing role it plays in Americans’ lives, it is essential that consumers understand exactly how social media platforms are using their information and shaping the news they see and the content they interact with.
“And I am hopeful that the recent troubling revelations about Facebook and TikTok published by the Wall Street Journal will create an impetus for bipartisan action on social media transparency.
“I’m grateful to have bipartisan cosponsors for the Filter Bubble Transparency Act and the PACT Act.
“And I look forward to working with my cosponsors to get these bills passed in the near future.
“Big Tech has operated in the dark for too long.
“It’s time to shed some light on content moderation.”