Congresswoman Kathy Castor | Kathy Castor Official Photo
WASHINGTON, D.C. – Representatives Kathy Castor (FL-14) and Debbie Dingell (MI-06) led 18 other members of the House in sending letters to social media companies requesting information regarding their platforms, algorithms, and the steps they take to mitigate harms to children online. The letters to Meta, TikTok, Snapchat, YouTube, Twitter, and Twitch come in the wake of an advisory released recently by U.S. Surgeon General Vivek Murthy underscoring concerns that companies are not responsibly mitigating the risks and subsequent harms to kids on their platforms.
“The proliferation of computers, smartphones, and other connected devices has dramatically increased underage consumption of online content, with nearly two-thirds of teenagers using social media platforms daily,” the lawmakers write. “A growing body of research has simultaneously shown evidence of negative health impacts of excessive social media use on underage users. We know that social media is contributing significantly to the ongoing youth mental health crisis, exacerbated by increased engagement on social media platforms during and following the COVID-19 pandemic, and it cannot go ignored and unaddressed.”
“Parents are rightfully worried about their children’s privacy. Social media companies have taken advantage of, and profited off, underage users through manipulative design and recommendation tools, encouragement of in-app purchases, targeted advertising, and exposure to harmful content and unhealthy behaviors,” the lawmakers continue. “At this critical point in their brain development, many kids do not fully understand how companies are collecting, processing, and using their data – nor grasp the potential consequences of how this data is used.”
“Companies have a responsibility to mitigate the fundamental risks of their platforms on underage users. These protections should be inherent to the design of platforms and must prioritize the safety, health, and privacy of children and teens,” the lawmakers conclude. “We all share a commitment to protecting kids online and recognize the need for greater transparency from social media companies on the steps they are currently taking to achieve these goals. While Congress must pass a bipartisan, comprehensive data privacy law to protect all Americans online, we need to ensure that companies are adequately and responsibly protecting children and teens in the interim.”
Specifically, the lawmakers requested responses from each of the companies to the following questions:
1. What steps does your company take to mitigate children and teens’ exposure to harmful content, online abuse, and other threats and to prioritize their online health and safety?
2. Does your company conduct transparent assessments of your platform, algorithms, products, and services to determine the potential health and behavioral impacts on children and teens?
a. How often does your company conduct such assessments?
b. When was the last such assessment conducted?
c. If you have conducted such assessments, what have those assessments found?
d. What recommendations from such assessments has your company implemented and when?
e. What recommendations from such assessments has your company failed to implement and why have you failed to implement them?
f. Does your company share data and findings from such assessments with independent researchers, health experts, or auditors?
g. If you do share such data and findings, have you received recommendations from independent researchers, health experts, or auditors that have reviewed that information?
h. Have their concerns or recommendations been used to inform design choices that prioritize the health of children and teens?
i. If you are not already doing so, do you commit to conduct transparent assessments of your platform, algorithms, products, and services to determine the potential health and behavioral impacts on children and teens, to implement recommendations that emerge from those assessments, and to share data and findings from these assessments with independent researchers, health experts, or auditors to assess the potential health impacts of your platform on underage users?
3. Does your company engage or allow independent third parties to conduct transparent assessments of your platform, algorithms, products, and services to determine the potential health and behavioral impacts on children and teens?
a. When was the last such assessment conducted?
b. If you have engaged or allowed such assessments, what have those assessments found?
c. What recommendations from such assessments has your company implemented and when?
d. What recommendations from such assessments has your company failed to implement and why have you failed to implement them?
e. Does your company share data and findings from such assessments with other independent researchers, health experts, or auditors?
f. If you do share such data and findings, have you received recommendations from independent researchers, health experts, or auditors?
g. Have their concerns or recommendations been used to inform design choices that prioritize the health of children and teens?
h. If you are not already doing so, do you commit to engage and allow transparent third-party assessments of your platform, algorithms, products, and services to determine the potential health and behavioral impacts on children and teens, to implement recommendations that emerge from those assessments, and to share data and findings from these assessments with independent researchers, health experts, or auditors to assess the potential health impacts of your platform on underage users?
4. Does your company share algorithmic data with independent researchers, health experts, or auditors?
a. If so, have their concerns or recommendations been used to inform design choices that prioritize the health of children and teens?
b. If not, do you commit to share algorithmic data with independent researchers, health experts, or auditors to assess the potential health impacts of your platform on underage users?
c. Do you commit to publicly share data and findings from these assessments to offer parents and others transparency on the potential health impacts of your platform on underage users?
5. Does your company currently have an advisory committee or office dedicated to informing the safe design of your platform for underage users?
a. If not, does your company intend to create an advisory committee on this subject, and when does your company expect to establish this committee?
6. What steps is your company taking to ensure the design and default settings of your platform prioritize the highest standards of health and safety for underage users?
7. Does your company consult with mental health and youth development experts to inform the design of your platform and potential impacts on underage users?
8. Please describe any minimum age requirements you have adopted for the use of your platform.
9. How does your company enforce any such policy?
a. Does your platform use age verification technology?
b. How does that technology work?
c. Can your platform’s algorithm determine the approximate age of a user and enforce increased protections for users identified as underage?
10. What tools and processes does your company make available for parents, children, educators, researchers and others to raise concerns or complaints regarding harmful content, online abuse, and other threats to underage users’ health and safety on your platform?
a. What is your platform’s average time to evaluate and adjudicate user requests or complaints regarding online harassment, harmful content, and other threats to underage users?
b. When evaluating these requests, does your platform provide transparent means for these individuals to understand the outcomes of adjudicated requests?
The letters are also signed by Representatives Grijalva, Clarke, Trahan, Schiff, Caraveo, Stevens, DeSaulnier, Norton, Lee, Barragan, Raskin, Balint, Schakowsky, Veasey, Auchincloss, Sherrill, Trone, and Moulton.