Zero Tolerance Policy
Concord maintains a zero-tolerance policy regarding child sexual abuse and exploitation (CSAE). Any content, conduct, or activity that exploits or endangers minors is strictly prohibited.
Prohibited material and behaviour include, but are not limited to:
Child sexual abuse or exploitation in any form, including imagery, grooming, coercion, or sexually suggestive content involving minors.
The sexualisation of minors, whether real, fictional, or digitally generated.
Attempts to contact minors for sexual purposes or engage in inappropriate communication.
The use of Concord to create, promote, distribute, or facilitate CSAE-related content.
Impersonation of minors for grooming, deception, or exploitation.
Violations of this policy will result in immediate and permanent account suspension. Concord may report confirmed violations to relevant local and international law enforcement authorities, including the National Center for Missing & Exploited Children (NCMEC), and will cooperate fully with lawful investigations.
Moderation and Safety Controls
Concord employs a combination of automated systems and human oversight to prevent, detect, and address CSAE-related activity.
Our safety measures include:
Proactive detection and removal of prohibited content prior to publication where possible.
Continuous monitoring of user reports and behavioural signals to identify suspicious activity.
Moderation of user-generated content, including chats, posts, profiles, avatars, and media.
Temporary restriction or suspension of accounts pending investigation where safety thresholds are exceeded.
Age Restrictions and Identity Safeguards
Concord is intended for individuals aged 16 and above. Certain features are restricted to users aged 18 and above.
Users must confirm their age during registration and may be required to complete age-verification procedures. Accounts, usernames, avatars, or representations designed to depict or mimic minors in a misleading or exploitative manner are prohibited and will be removed.
Reporting and Community Safeguards
Concord is committed to empowering users to help maintain a safe environment.
Users may report suspected CSAE content or behaviour through in-app reporting tools accessible across profiles, conversations, posts, and media.
Reports may be submitted anonymously and are prioritised for review, typically within 24 hours.
Users may block other accounts at any time and manage privacy settings to control contact and visibility.
Where appropriate, confirmed CSAE content or accounts will be escalated to relevant law enforcement authorities.
Contact and Escalation
For urgent safety concerns or legal inquiries relating to child protection or exploitation, please contact safety@concord.digital. If you or someone you know may be in immediate danger, contact your local authorities.
Concord complies with lawful data disclosure requests and provides appropriate reporting channels for law enforcement agencies and recognised non-governmental organisations.