Dear SDMA Families & Staff,
The SDMA Work Plan includes many goals for the 2025-26 school year, including some key goals focused on improving communication, student and family support, and the safe use of technology.
Action Requested: Please Take the SDMA Technology Survey
In alignment with our goals, we are launching a brief survey to gather your feedback on improving the district’s website.
- What: We are seeking your thoughts on the district website.
- For Middle/High School: There will be a few additional questions regarding student devices.
- Action: Click HERE and take a few minutes to complete the survey.
- The survey will close on December 1.
Urgent Warning: The Dangers of AI Companions
Along with sharing the survey, I would also like to call your attention to a disturbing trend: the use of Artificial Intelligence (AI) as a social companion among teens.
Based on research from experts like Common Sense Media and the American Psychological Association, it is strongly recommended that teens avoid using AI companions due to significant risks to their emotional well-being and development.
What Are AI Companions?
AI companions are online tools designed to mimic a person you can talk to, confide in, or build a relationship with. They are built to:
- Act like a friend, romantic partner, or even a therapist.
- Be emotionally supportive and foster a sense of attachment.
- Crucially: They are designed to feel human in order to keep you engaged, not to support your emotional health.
Note: AI companions are NOT the same as task-based AI (like tools that summarize text, suggest music, or provide basic customer service).
Why Are AI Companions a Concern?
While they may seem appealing to teens seeking connection, it is vital to understand their core function:
- They are not real people or therapists. They lack true empathy and human understanding.
- They are financially driven. Their goal is engagement to make money through subscriptions, advertising, and data sales.
- They can hinder emotional growth. Routine use may lead to emotional dependency and delay the development of essential life skills gained through real human interaction.
3 Major Risks for Your Child
Using AI companions poses specific mental health and safety risks:
- Inability to Identify a Crisis: They are programmed to be unconditionally supportive, which means they may minimize signs of depression, anxiety, or self-harm. Some have even given unsafe advice.
- Lack of Privacy and Accountability: Conversations are often NOT private. Companies can mine and monetize this data. Unlike real therapists, these tools have no legal accountability or privacy protections.
- Misinformation: AI companions can easily share inaccurate information, potentially contradicting trusted sources (parents, medical professionals, etc.).
What You Can Do (Parents/Caregivers)
Your role is vital in guiding your teen toward healthy technology use and real-world connection.
Prioritize Real-World Support
If your teen is turning to AI for support, help them find human resources:
- Encourage Professional Help: Suggest they talk with a school counselor, teacher, coach, or help them find a qualified therapist.
- Facilitate Connection: Encourage involvement in group activities, sports, arts, or spending time with trusted family/friends.
- Teach Crisis Protocol: Ensure your teen knows they must disengage from the AI and reach out to a trusted adult during a crisis.
Start a Conversation
Approach the topic calmly, without judgment.
[Table: conversation topics and example questions]
Model Healthy Skepticism: Explore AI tools together. Teach your teen how to be a fact-checker and critically evaluate information.
Watch for Signs of Distress
Be vigilant for signs of struggle with technology or emotional health:
- Withdrawal from real-world friends and activities.
- Increased irritability or changes in appetite/sleep patterns.
- Over-attachment to AI companions (talking about them as if they were real people).
If you see these signs, reinforce that your goal in setting digital limits is to protect them, not to punish them, and explain that the AI is designed to exploit their emotional needs for financial gain.
Crisis Resources (Human-Staffed Support)
Please consider sharing these resources with your teen for immediate, human-staffed support:
[Table: crisis resources and how to contact them]
We appreciate you working together with us to support our students as we navigate new technology. Please do not hesitate to reach out if you have any questions or concerns.
Sincerely,
Joe Zydowsky
Joe Zydowsky is the SDMA Administrator and can be reached at [email protected] or (715) 232-1642.