Child Safety Policy
Last Updated: April 25, 2025
1. Introduction and Commitment
Laila places the highest priority on protecting children and young people from harm, exploitation, and abuse. We enforce a strict zero-tolerance policy for any content, behavior, or activity that puts minors at risk. Our commitment includes proactive monitoring, strict enforcement actions, and collaboration with child safety organizations and law enforcement. Every member of the Laila community shares the responsibility to create a safe environment where all users can engage without fear of exploitation.
This policy outlines Laila's comprehensive child protection measures. We maintain full compliance with laws regarding the prevention and reporting of child sexual abuse and exploitation (CSAE). Any violations will result in immediate and permanent account termination and reporting to relevant authorities.
2. Age Requirements and Verification
Users must be at least 18 years old to use Laila. We use advanced detection methods to analyze user content and block underage users. Age verification during registration is mandatory, and misrepresentation leads to immediate account termination. Laila may request additional verification at any time to maintain safety.
3. Prohibited Content
3.1 Child Sexual Abuse and Exploitation (CSAE)
The following activities are strictly prohibited:
- Sharing, creating, or distributing child sexual abuse or exploitation content.
- Engaging in any sexualized discussion, roleplay, or depiction involving minors.
- Displaying child nudity or sexualizing minors in any form, including artwork or animation.
3.2 Other Prohibited Content Related to Minors
- Threatening, encouraging, or engaging in harm toward minors.
- Psychological manipulation, coercion, or abuse of minors.
- Depicting dangerous behavior involving minors.
- Promoting child neglect, trafficking, or exploitation.
- Featuring minors in live streams, videos, or any other form of content.
- Creating or managing accounts on behalf of minors.
4. Reporting and Enforcement
4.1 In-App Reporting
- In-app flagging system for quick reporting of concerning content.
- Report button available on user profiles and content.
- Email reports can be sent to: lailaappstudio@gmail.com.
4.2 External Reporting Resources
Users are encouraged to report CSAE to national and international child protection organizations, including:
- INHOPE Association (Global)
- National Center for Missing & Exploited Children (NCMEC) – U.S., Canada
- International Centre for Missing & Exploited Children – Global
- Childline 1098 (India)
- Internet Hotline Center Japan (Japan)
- Korea Communications Standards Commission (South Korea)
5. Content Moderation
5.1 Review Process
Laila uses a combination of automated detection tools and human moderation to enforce child safety:
- Automated scanning of all uploaded content.
- Real-time AI-based detection of prohibited content.
- Human moderators with specialized training review flagged content.
5.2 Response Times
Reports involving minors are treated as the highest priority. Our team works 24/7, and most child safety reports are reviewed within 24 hours.
6. Prevention and Education
6.1 Policy Enforcement
- Immediate account termination for any violations.
- Evidence is preserved and provided to law enforcement when necessary.
- Records of enforcement actions are maintained for accountability.
6.2 Community Education
- A clear Child Safety Policy accessible throughout the app.
- Content guidelines presented during registration and when sharing content.
- In-app reminders about responsible usage.
6.3 Staff Training
All Laila staff undergo comprehensive child safety training during onboarding, with regular refresher training thereafter. Moderators receive specialized training to detect and respond to child exploitation and abuse.
Together, we can maintain a safe, respectful, and secure community for all.