

This lesson builds critical thinking and ethical awareness in students as they navigate the complex issue of child monitoring apps and AI technology. By exploring real-world scenarios and ethical dilemmas, students gain a deeper understanding of the delicate balance between privacy and safety, consent and autonomy, and the impact on family relationships. This lesson equips students with the skills to make informed decisions about technology use, promoting responsible digital citizenship and open discussions within families and society at large.

Materials Needed

Printed Simulation handouts

Time Needed

45–60 minutes


Learning Objectives

  • Students will be able to analyze the ethical dilemmas arising from the use of child monitoring apps, distinguishing between privacy concerns and safety considerations.
  • Students will be able to identify and describe the potential impact of AI-powered child monitoring apps on trust and communication within family relationships.
  • Students will be able to evaluate the role of consent and autonomy in the use of child monitoring apps.
  • Students will be able to propose ethical guidelines and solutions to balance the benefits and challenges of AI in child monitoring apps.

Key Concepts & Vocabulary

  • Geofencing: A feature in child monitoring apps that allows parents to set virtual boundaries on a map, triggering alerts when a child enters or leaves specified areas, such as home or school.
  • Data Encryption: A security measure to protect sensitive information, ensuring that data is converted into code to prevent unauthorized access.
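For classes with some programming background, the geofencing concept above can be sketched in a few lines of Python. This is an illustrative toy, not how any real monitoring app is implemented; the function names (`haversine_m`, `geofence_alert`) and all coordinates are invented for the example.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def geofence_alert(child_pos, fence_center, radius_m, was_inside):
    """Return (is_inside, alert). An alert fires only when the boundary is crossed."""
    dist = haversine_m(child_pos[0], child_pos[1], fence_center[0], fence_center[1])
    is_inside = dist <= radius_m
    if is_inside and not was_inside:
        return is_inside, "entered"
    if not is_inside and was_inside:
        return is_inside, "left"
    return is_inside, None  # no boundary crossing, so no alert
```

With, say, a 200-meter fence around a school, an alert would fire only the first time the reported position crosses the boundary in either direction, which is exactly the "entering or leaving specified areas" behavior the definition describes.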

Lesson Components

  1. Before You Watch: Connect the lesson to students’ background knowledge of AI child monitoring apps and capture their attention.
  2. Video: Show the video explaining the ethical considerations in the topic of child monitoring.
  3. Case Study: Detail a real-world scenario that relates to the issue of privacy and security in monitoring apps.
  4. Simulation: Lead students through an interactive activity exploring the possible ethical considerations.
  5. Discussion: Ask whole-class questions to reflect on the experience and consider perspectives.
  6. Assessment: Verify student understanding with an exit ticket.

Warm Up

  • Begin the lesson by asking the students if any of them have family tracking apps installed on their phones, such as “Family360” or similar apps. Encourage students to raise their hands or simply share if they are comfortable.
  • For those who indicate they have experience with such apps, ask them to briefly share their experiences. They can mention whether they find the app helpful, any concerns they or their parents have, and whether it has affected their family dynamics. Are there any positive stories where the app helped them? (Try to avoid a complaint session.)
  • For students who haven’t used such apps, ask them if they are aware of anyone in their family or friend circle who uses them, and whether they have any thoughts or questions about these apps.



Video Script for Narration

Hello Young Innovators! Today we’re discussing the ethics of gendered voices of AI assistants.
Artificial Intelligence is becoming a bigger part of our lives every day. From smartphones to smart homes, AI voice assistants are everywhere, helping us with tasks, answering our questions, and even keeping us company. But have you ever wondered why most of these voice assistants sound female?
AI voice assistants haven't always been around. In the early days of technology, computers were large, clunky machines that certainly didn’t talk. As technology evolved, so did the ability for machines to interact with us using voice – a feature that is becoming increasingly common.
Imagine asking your AI for the weather, and a deep, authoritative voice responds. Or, picture a soft, gentle voice helping you with homework. Why do these differences matter? Well, they bring us to our main topic: the ethics of gender representation in AI voice assistants. For a long time, most AI assistants like Siri or Alexa had female-sounding voices. This wasn’t just a random choice.
Research showed that people generally found female voices to be warmer and more welcoming. And people were used to hearing women’s voices from back when operators connected phone calls.
On the flip side, some people prefer to hear male voices for authoritative roles, like GPS navigation or voiceovers in documentaries. But this leads to ethical concerns. Are we reinforcing traditional stereotypes about gender roles, stereotyping men in roles of power and women in roles of service?
One method of dealing with this issue is to use gender-neutral voices. These are designed to not clearly sound male or female, aiming to represent a wider range of human experiences and identities. It's a step towards inclusivity, and an attempt to avoid the stereotypes of gender from previous generations.
When AI voice assistants reinforce gender stereotypes, they might also impact how we view gender roles in real life. But when we make these voices gender-neutral, are we erasing gender differences that are a real part of many people's identities?
Some people argue that having a range of gendered voices in AI can reflect the diversity of human experiences. Others believe that breaking away from gendered voices entirely is the key to challenging stereotypes and promoting equality. There’s no easy answer, and technology is constantly evolving to reflect our changing society.
So, what do you think? Should AI voice assistants have a gender? Or should they be gender-neutral to avoid reinforcing stereotypes? As we continue to integrate AI into our daily lives, it's important to think about how the choices we make about technology today shape our future.
Let’s discuss: How do AI assistants impact our attitudes toward gender in the real world?

Case Study

Distribute or read Case Study handout.

Summary: The fictitious “EyesOut” app is designed to allow parents to monitor their children’s real-time whereabouts, raising questions about the balance between privacy and safety, consent and autonomy, and the impact on child-parent relationships. The app’s data security measures and the risk of hacking are also key concerns. The company responds by considering customizable permissions and enhanced data security, but whether these changes fully address the challenges of creating an app that ensures safety while preserving relationships remains a question.

Student Handout

Case Study: Family Tracking

Description: A fictitious company has come out with the new “EyesOut” app, which is designed to allow parents to monitor their children’s whereabouts in real time – to “keep an eye out” for them. It provides a convenient way for parents to ensure their children’s safety and know their location at all times. Families start using this app, and it isn’t long before the company starts getting complaints for a number of reasons.

  • Privacy vs. Safety: The “EyesOut” app raises questions about the balance between a child’s right to privacy and the parent’s desire for safety. Should parents have the ability to track their child’s every move?
  • Consent and Autonomy: Do children have a say in whether they want to be tracked, or is this decision solely up to the parents? What age is considered appropriate for a child to have a say in such matters?
  • Impact on Child-Parent Relationships: How does constant tracking through “EyesOut” affect the trust and relationship between parents and their children? Does it foster open communication or lead to a lack of trust?
  • Data Security and Hacking Risk: Data collected by the “EyesOut” app is sensitive and could be valuable to criminals. How important are data security measures, such as secure storage and authentication? What are the consequences of a data breach and the impact on children’s safety?

The company takes information from the complaints and makes the following decisions as a company:

  • Customizable Permissions: “EyesOut” could offer customizable permissions, allowing children to have some control over when and how they are tracked. This way, parents can balance safety with respecting their child’s autonomy.
  • Enhanced Data Security: The developers of “EyesOut” can invest in robust data security measures, regular security audits, and encryption to protect the location data from potential breaches.


But are these two changes enough? Do they address all of the challenges in creating an app that provides safety but maintains relationships?
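For technically inclined classes, the data encryption idea in the case study can be illustrated with a toy stream cipher built only from Python’s standard library. This is a teaching sketch, not secure practice — a real app would use a vetted cryptography library rather than hand-rolled code — and every name here (`encrypt`, `decrypt`, `_keystream`, the sample key) is invented for the example.

```python
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key + nonce + counter blocks."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream; prepend the random nonce."""
    nonce = secrets.token_bytes(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """Recover the plaintext by regenerating the same keystream from the nonce."""
    nonce, body = ciphertext[:16], ciphertext[16:]
    stream = _keystream(key, nonce, len(body))
    return bytes(c ^ s for c, s in zip(body, stream))
```

The point for students: stored location data becomes unreadable gibberish without the key, so a database breach alone does not expose a child’s whereabouts — which is why the case study treats encryption as a core safeguard.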



Discussion Questions

  • How can we strike a balance between ensuring children’s safety and respecting their privacy?
  • What role should technology play in parenting?


Simulation

  1. Begin by introducing the case study scenario involving the “EyesOut” app and its ethical considerations.
  2. Explain the roles students will assume: Tech Representative, Parent, Child, and Cybersecurity Expert.
  3. Mention the objective of evaluating app features from different perspectives.
  4. Break up the class into groups of about four. Assign each student a specific role (Tech Rep, Parent, Child, or Cybersecurity Expert). If groups have more than four, include additional Parent and Child roles. If a group only has three, eliminate the Cybersecurity Expert role.
  5. Provide Simulation handout with feature descriptions to each group for the Tech Rep to share.
  6. For each app feature, have the Tech Rep in the group explain the feature, then ask for comments from the students representing the Parent, Child, and Cybersecurity Expert roles.
  7. After conversation on each feature, the Tech Rep should decide whether the feature should be added.
  8. After going through all of the features (or as many as time allows), bring the whole class back together to discuss the simulation.
  9. Go through the list of features one at a time and ask the different groups’ Tech Reps whether they would add the feature or not, and why.
  10. Finish off with whole-class discussion questions.

Student Handout

Simulation Activity

Each group’s Tech Rep should read the possible app features to the group. For each feature idea, the other team members discuss whether they are in favor of or opposed to that feature. The Tech Rep then decides whether to include that feature in the app.

Mark on your paper whether you are going to include the feature or not.


Role descriptions

Parent – Answer as if you were a parent wanting to use this app to keep your child safe

Child – Answer as if your parents required you to use this app in exchange for getting a cell phone

Cybersecurity Expert – Answer with the concern that any data could be stolen


App Features (for each, mark on your handout whether to include it)

  • Real-Time Location Tracking: “EyesOut” allows parents to track their child’s location in real time, helping ensure their safety and providing peace of mind.
  • Geofencing: With geofencing, parents can set virtual boundaries on a map, receiving alerts when their child enters or leaves predefined areas, such as school or home.
  • Social Media Monitoring: The app monitors a child’s social media activity, alerting parents to potentially harmful or inappropriate content and troubling interactions.
  • Web Browsing History Tracking: “EyesOut” keeps a record of a child’s web browsing history, enabling parents to review their online activities and ensure safe internet usage.
  • Emergency Alerts: In case of emergencies, the app can send instant alerts to parents, providing location information and ensuring a rapid response.
  • Daily Activity Reports: Parents receive daily reports summarizing their child’s online and offline activities, helping them stay informed about their child’s well-being.
  • Private Chat Monitoring: The app can monitor private chats and conversations on various messaging platforms, aiming to detect signs of cyberbullying or harmful content.
  • Data Encryption: “EyesOut” prioritizes data security by employing robust encryption methods to protect user information from unauthorized access.
  • Child Consent Settings: Children can adjust privacy settings within the app, allowing them some control over what information is shared and monitored.
  • Automatic App Blocking: To maintain a safe online environment, the app can automatically block certain apps or websites deemed unsafe, preventing access by the child.


Discussion

These questions are designed to be used in whole-class discussion. Ask the questions that relate most effectively to the lesson.

  1. If your parents made you install an app like this as a requirement for having a cell phone, how would you respond?
  2. How do you balance the safety of a child with their right to privacy in the context of child monitoring apps?
  3. Should parents have the authority to monitor their child’s social media activity and online interactions through an app? What benefits would this feature have for parents?
  4. Can daily activity reports strike a balance between keeping parents informed and respecting a child’s personal space and privacy?
  5. Is there a specific age or level of maturity at which a child should have decision-making authority over their privacy?
  6. What are the potential consequences of monitoring private chats, and how might this impact trust between parents and children?


Assessment

Exit Ticket: Provide a prompt for students to reflect on their learning, such as:

  • What is one key lesson or insight you gained from today’s discussion on child monitoring apps, and how has it influenced your perspective on the topic?
  • If you were tasked with creating an ethical guideline for the development and use of child monitoring apps, what would be your top priority or principle, and why?