Child Safety Standards
SpatzChat has a zero-tolerance policy toward child sexual abuse and exploitation (CSAE) and child sexual abuse material (CSAM).
SpatzChat prohibits any user from creating, uploading, sharing, requesting, promoting, facilitating, or distributing content or behaviour that sexually exploits, abuses, endangers, or targets children.
This includes, but is not limited to:
- Child sexual abuse material
- Grooming or attempted grooming of a child
- Sexual solicitation of a child
- Sextortion, sexual blackmail, or coercion involving a child
- Sharing, requesting, or encouraging sexualised content involving a child
- Any communication or behaviour that endangers children or violates child protection laws
SpatzChat is designed for workplace team communication and conflict resolution. It is not intended for children. Nevertheless, SpatzChat maintains clear standards against CSAE and CSAM.
If SpatzChat becomes aware of CSAE, CSAM, or any child safety concern, we may take appropriate action, including:
- Removing the content
- Suspending or disabling the relevant account
- Preserving relevant information where legally appropriate
- Reporting confirmed CSAM or credible child safety concerns to relevant authorities, including the National Center for Missing & Exploited Children, where required or appropriate
Users can report child safety concerns through the app’s reporting or support function.
Reports can also be sent by email to:
Child Safety Point of Contact:
Spatz Child Safety Team
login@spatz.ai
These standards apply to SpatzChat and are maintained by Pablow API Pty Ltd, trading as SpatzAI.
Last updated: 28 April 2026