Child Safety Standards
MatchMakers LLC · Version 1.0 · Effective April 19, 2026
Pending final legal review · Last updated April 19, 2026
1. Our Commitment
MatchMakers is an adults-only dating and matchmaking service. We do not permit minors to use our Services, and we do not permit content that sexually exploits or endangers minors in any way. This Child Safety Standards Policy describes the standards we maintain and the mechanisms we use to prevent, detect, and respond to child sexual abuse and exploitation ("CSAE") on our platform.
This policy applies globally and is published to satisfy:
- Google Play's Child Safety Standards policy for Social and Dating apps
- Apple App Store Review Guideline 1.2 (Safety — User Generated Content)
- United States federal law at 18 U.S.C. § 2258A (reporting requirements for electronic service providers)
- Our own safety mission
2. Eligibility Is Adults Only
MatchMakers is available only to individuals who are eighteen (18) years of age or older. This is enforced through:
- Registration gate. During registration, every user must submit a date of birth and attest that they are at least eighteen (18). Registrations that report a date of birth indicating the user is under eighteen (18) are blocked.
- Age-appropriate content rating. The app is rated 17+ on the Apple App Store and Mature 17+ on Google Play.
- Terms of Service. Our Terms of Service prohibit account creation by minors.
- Ongoing review. Accounts we reasonably believe to belong to minors are suspended pending age verification and, if confirmed, deleted, with CSAE content preserved and reported as described below.
3. Zero Tolerance for CSAE
MatchMakers prohibits, without exception:
- Child sexual abuse material ("CSAM"), including any visual depiction of sexually explicit conduct involving a minor, whether produced, received, possessed, solicited, or transmitted.
- Grooming, enticement, or any communication aimed at engaging a minor in sexual activity or in production of sexual material.
- Trafficking, advertising, or facilitating the commercial sexual exploitation of a minor.
- Sexualized content involving minors (including "coded" or obfuscated content that depicts or references minors sexually).
- Content that promotes, normalizes, or glorifies the sexual exploitation of minors.
- Any account operated by or on behalf of a minor.
Violation of any of the above results in immediate account termination, preservation of evidence, and a report to the National Center for Missing & Exploited Children ("NCMEC") CyberTipline and, where required, to the equivalent authority in the user's jurisdiction.
4. How We Prevent CSAE
4.1 Age Gate. See Section 2.
4.2 Profile Photo Review. New user profile photos are subject to automated screening at upload time for apparent minors, nudity, and other prohibited content. Flagged uploads are queued for human review by our Trust & Safety team before the profile is allowed to appear in matching. Profile photos that depict clothed minors (family photos, children pictured in group photos) are not permitted; users are asked to submit solo photos depicting only themselves.
4.3 Content Filters in Messaging. In-app messaging is scanned for terms, patterns, and images associated with CSAE. Accounts sending such content are immediately suspended pending review.
4.4 Behavioral Signals. Our Trust & Safety team monitors behavioral signals (rapid account creation, use of temporary emails, VPN patterns associated with prior violators, devices previously banned) to identify accounts attempting to circumvent our age gate.
4.5 User Education. New-user onboarding includes safety guidance and a link to this policy. Our Community Guidelines reinforce that the Services are for adults only.
5. How We Detect CSAE
- Automated scanning. We use industry-standard content-hashing techniques to detect known CSAM at upload, consistent with the reporting protocols of the NCMEC CyberTipline. Messages are scanned for prohibited terms and patterns.
- User reporting. Every profile and every conversation includes an in-app report button. Reports of child safety concerns are prioritized and addressed by our operations team promptly, typically within 24–48 hours. Critical concerns (imminent harm) are escalated immediately to the NCMEC CyberTipline.
- Human review. Our operations team reviews flagged content, investigates reports, and takes action.
6. How to Report a Child Safety Concern
6.1 In the App. Open any profile or message, tap the flag icon, and select "Child safety concern" or "Minor on platform." This submits the report to our priority-one queue.
6.2 By Email. Send a report to child-safety@matchmakersusa.com. Include as much detail as you can — the offending user's profile name, screenshots (if you have them), approximate date and time, and any context. We treat all such reports as urgent.
6.3 To Authorities.
- United States: report directly to the NCMEC CyberTipline at report.cybertip.org or 1-800-843-5678.
- Other jurisdictions: contact your local law enforcement or your country's equivalent of NCMEC (for example, the Internet Watch Foundation in the United Kingdom at iwf.org.uk).
Reporting to authorities does not remove content from our Services. Please also report to us so we can remove the content and take action against the account.
7. How We Respond
When we identify CSAE content or an account belonging to a minor:
- Preserve. We preserve the offending content and account data in accordance with 18 U.S.C. § 2258A and applicable evidence-preservation practices.
- Report. We report to the NCMEC CyberTipline within the time required by 18 U.S.C. § 2258A. Where the law of another jurisdiction also requires it, we report to the applicable local authorities.
- Remove. We remove the content and suspend or terminate the account.
- Ban. We take reasonable steps to prevent the user from re-registering, including device fingerprinting.
- Investigate. Where appropriate, we cooperate with law enforcement investigations through formal legal process.
8. Compliance with Applicable Laws
MatchMakers complies with:
- United States federal law — 18 U.S.C. § 2258A (CyberTipline reporting), 18 U.S.C. § 2258B (protection from civil liability for good-faith reporting), 18 U.S.C. § 2256 (definitions), and related provisions.
- State law — applicable state statutes governing online child safety, including where applicable mandatory-reporting laws.
- Google Play Child Safety Standards policy.
- Apple App Store Review Guideline 1.2.
- European Union — Regulation (EU) 2021/1232 ("CSAM Interim Regulation") and Directive 2011/93/EU, to the extent applicable.
- United Kingdom — the Online Safety Act 2023 to the extent applicable.
9. Our Team
Trust & Safety at MatchMakers is the responsibility of trained personnel operating under the direction of the Chief Executive Officer. All personnel with access to reports of CSAE sign confidentiality undertakings, receive training on applicable law and trauma-informed review practices, and have access to mental-health support resources.
10. Point of Contact
For questions about this policy, to raise a child-safety concern that cannot be submitted through the in-app report function, or for press and legal inquiries, contact us at child-safety@matchmakersusa.com.
11. Annual Review
We review this policy at least annually and after any material change in our practices, our obligations, or applicable law.
12. Acknowledgement
If you operate an account on the Services, you acknowledge that you have read this policy and you agree to comply with it. Violations result in termination and may result in referral to law enforcement.