FOR IMMEDIATE RELEASE
September 27, 2021
CONTACT

New Future of Privacy Forum Report Delves Into the Risks and Concerns of Self-Harm Monitoring Technology in Schools, 
Offers Resources and Best Practices

The new report examines schools' increased reliance on self-harm monitoring programs intended to address student mental health, and the potential harms those programs pose to students.

WASHINGTON, D.C. – The Future of Privacy Forum (FPF) released a new report, “The Privacy and Equity Implications of Using Self-Harm Monitoring Technologies: Recommendations for Schools.” The report reviews the increased use of self-harm monitoring technologies in school districts, the privacy and legal concerns these technologies raise, the harms to students that can arise from their use, and best practices and policies for school districts that choose to adopt them. View the report here.

Following two years of uncertainty, loss, and severe disruption to the education landscape, many educators and administrators are rightly focused on supporting and providing resources for students’ mental health. “School districts should understand the risks self-harm monitoring technology can pose to students’ privacy and safety and carefully weigh those risks against any benefits,” said Amelia Vance, FPF’s Director of Youth & Education Privacy. “There is an assumption that this is a solution without drawbacks, but some students may be disincentivized from seeking help if they are being monitored.”

A few of the key concerns regarding the use of self-harm monitoring technologies highlighted in the report include:

  • Infringement of Student Privacy. While some schools deploy technology that simply emails an administrator when a student accesses inappropriate content, other schools use more intensive monitoring that creates a profile of each student’s search and web browsing activity, including any flags related to mental health keywords. This monitoring may make students feel over-surveilled and wary of using school technology to research needed resources. Privacy concerns are often exacerbated because much of the flagged content is scanned while students are at home, outside of normal school hours.

  • Unintended Harm to Students. Self-harm monitoring technologies designed to flag keywords or phrases that may indicate self-harm cannot fully grasp the context, slang, colloquialisms, and humor inherent in student conversation. Students who are mistakenly flagged by the system may be subject to unnecessary stigma, bias, and differential treatment.

  • Increased Inequity for Vulnerable Student Groups. Using self-harm monitoring systems without strong guardrails and privacy-protective policies is likely to disproportionately harm already vulnerable student groups.

    • LGBTQ+ students may find themselves inadvertently outed by such programs, possibly leading to increased bullying or an unsafe home environment.

    • Students of color and English Language Learners may be subject to unnecessary scrutiny due to mistranslation or a lack of colloquial and cultural understanding.

    • As mental health issues are often incorrectly conflated with violence, schools may more readily turn over information about flagged students to law enforcement. This can perpetuate the school-to-prison pipeline, which already disproportionately affects students from vulnerable populations.

  • Legal Implications. Many schools that implement monitoring technologies do so in an attempt to comply with the Children’s Internet Protection Act (CIPA). However, the Federal Communications Commission (FCC) has yet to publish guidance for schools on monitoring requirements. Without clarity on CIPA’s requirements, schools may unintentionally over-surveil students and over-collect sensitive, personal information about them or their families in an attempt to comply with the law. Monitoring technologies can also implicate the Americans with Disabilities Act (ADA) and Section 504 of the Rehabilitation Act with respect to privacy and discrimination related to mental health disabilities, and can raise Fourth Amendment concerns about unlawful searches and seizures.

  • Lack of Evidence. Despite their increasing adoption by schools, self-harm monitoring systems are an unproven technique for effectively identifying and assisting students who may be considering self-harm.

Despite the concerns inherent in self-harm monitoring technologies, FPF recognizes that some districts will find them to be an asset in supporting student mental health. For those districts that choose to implement such programs, FPF has compiled resources and best practices to help address these concerns and to use self-harm monitoring technologies safely and responsibly.

You can view the full FPF report here. To access all of the Future of Privacy Forum’s student privacy resources, visit www.studentprivacycompass.org. To learn more about the Future of Privacy Forum, visit www.fpf.org.

 
# # #
 
About FPF
Future of Privacy Forum (FPF) is a nonprofit organization focused on how emerging technologies affect consumer privacy. FPF is based in Washington, DC, and includes an advisory board comprising leading figures from industry, academia, law, and advocacy groups. FPF’s Youth & Education Privacy program works to protect child and student privacy while allowing for data and technology use that can help young people learn, grow, develop, and succeed. FPF works with stakeholders from practitioners to policymakers, providing technical assistance, resources, trend analysis, and training. FPF's Youth and Education Privacy team runs Student Privacy Compass, the one-stop-shop resource site on all things related to student privacy.