CONTENT MODERATION AND SAFETY POLICY

Effective Date: 11 February 2024

This Policy explains how Flipdare maintains a safe environment for all users.

Prohibited Content

The following content is strictly prohibited:

  1. Child sexual abuse material (CSAM)
  2. Violence, threats, or terrorism
  3. Sexual assault or exploitation
  4. Self‑harm or suicide promotion
  5. Hate speech
  6. Harassment or bullying
  7. Criminal activity
  8. Spam or malicious content
  9. Any content illegal in the user’s jurisdiction

Reporting Mechanism

Users may report any content directly within the Application.
Reports may be filed under any of the following categories:

  1. Death or self‑harm
  2. Child abuse
  3. Violence
  4. Sexual crimes
  5. Bullying
  6. Spam
  7. Criminal activity
  8. Legal issues
  9. Other serious issues
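
For illustration only, the nine report categories above could be modeled as a simple enumeration; the sketch below is hypothetical, and every name in it is invented rather than part of Flipdare's actual code:

    # Hypothetical model of the report categories listed above. Names and
    # structure are illustrative only, not Flipdare's actual code.
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from enum import Enum, auto

    class ReportCategory(Enum):
        DEATH_OR_SELF_HARM = auto()
        CHILD_ABUSE = auto()
        VIOLENCE = auto()
        SEXUAL_CRIME = auto()
        BULLYING = auto()
        SPAM = auto()
        CRIMINAL_ACTIVITY = auto()
        LEGAL_ISSUE = auto()
        OTHER_SERIOUS_ISSUE = auto()

    @dataclass
    class ContentReport:
        """A single user-submitted report against a piece of content."""
        reporter_id: str
        content_id: str
        category: ReportCategory
        submitted_at: datetime

    report = ContentReport("user-123", "post-456", ReportCategory.BULLYING,
                           datetime.now(timezone.utc))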

Automated and Manual Review

Flipdare may use both automated detection tools and manual human review to assess content.

Severe reported content is reviewed by a human moderator within 5 days.
All other reported content is reviewed by a human moderator within 31 days.
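
Purely as illustration, the sketch below applies these two deadlines in code; the severity split and all names in it are assumptions, not Flipdare's published rules:

    # Illustrative triage rule for the deadlines above: reports in severe
    # categories get a 5-day human-review deadline, all others 31 days.
    # The category split here is an assumption, not Flipdare's published rule.
    from datetime import datetime, timedelta, timezone

    SEVERE_CATEGORIES = {"child_abuse", "death_or_self_harm",
                         "sexual_crime", "violence"}

    def review_deadline(category: str, reported_at: datetime) -> datetime:
        """Return the latest time a human moderator must review the report."""
        days = 5 if category in SEVERE_CATEGORIES else 31
        return reported_at + timedelta(days=days)

    print(review_deadline("spam", datetime.now(timezone.utc)))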

Law‑Enforcement Escalation

Severe content may be forwarded to the relevant law‑enforcement authorities.

Flipdare complies with all lawful requests for information.

Reputation System

Each user has a Reputation score (0–100).

Low Reputation triggers:

  1. increased content analysis
  2. automated restrictions
  3. potential suspension
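
A minimal sketch of how a 0–100 Reputation score might drive these escalating responses is shown below; all threshold values in it are assumptions:

    # Minimal sketch of a 0-100 Reputation score driving the responses
    # above. All threshold values are hypothetical.
    def moderation_response(reputation: int) -> list[str]:
        """Map a Reputation score to the escalating actions in the Policy."""
        if not 0 <= reputation <= 100:
            raise ValueError("Reputation must be between 0 and 100")
        actions = []
        if reputation < 50:   # assumed threshold
            actions.append("increased content analysis")
        if reputation < 30:   # assumed threshold
            actions.append("automated restrictions")
        if reputation < 10:   # assumed threshold
            actions.append("potential suspension")
        return actions

    assert moderation_response(75) == []
    assert "potential suspension" in moderation_response(5)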

Automated Restrictions

Depending on severity, users may lose access to:

  1. profile editing
  2. chat
  3. Task creation
  4. Promise creation
  5. other features

Restrictions range from 1 day to permanent, depending on severity and the user's Reputation.
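
One hypothetical reading of this scale is sketched below, with restriction length growing with violation severity and shrinking with Reputation; every number in it is an assumption, not Flipdare's actual formula:

    # Hypothetical reading of the "1 day to permanent" scale: restriction
    # length grows with violation severity (1 = minor .. 5 = most severe)
    # and shrinks with Reputation. Every number here is an assumption.
    from datetime import timedelta
    from typing import Optional

    def restriction_duration(severity: int, reputation: int) -> Optional[timedelta]:
        """Return a restriction length, or None for a permanent restriction."""
        if severity >= 5 and reputation < 20:
            return None                        # permanent
        base_days = {1: 1, 2: 3, 3: 7, 4: 30, 5: 90}[severity]
        discount = reputation / 200            # higher Reputation, shorter ban
        return timedelta(days=max(1, round(base_days * (1 - discount))))

    print(restriction_duration(severity=3, reputation=80))  # 4 days (7 * 0.6)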

Dispute Resolution

Restricted users may dispute any restriction through the dispute‑resolution process.

A mediator may be requested.
The mediator's identity is kept confidential.
Final decisions are issued within 7–8 days.