• 0 Posts
  • 4 Comments
Joined 11 months ago
Cake day: August 16th, 2023

  • To expand the conversation. NOTE: I am NOT a lawyer.
    People hosting a federated instance in Australia would likely be classed as a Social Media service and be bound by the relevant safety code on the eSafety Commissioner's site here: https://www.esafety.gov.au/industry/codes/register-online-industry-codes-standards. The code is planned to take effect in December 2023, but it already serves as a guide.

    First, perform an assessment of your risk factors to determine a Tier (1, 2, or 3), which dictates your required actions. Services whose assessment falls between tiers should assume the higher risk, which means you may potentially be classed as higher risk due to the general nature of the content (it's not a club where conversation is limited to a specific topic).
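
    As an illustration of that first step only, here is a minimal sketch of a tier self-assessment in Python. The risk factors, weights, and thresholds are my own assumptions, not taken from the Code; the actual methodology is in clause 4.4.

    ```python
    # Hypothetical tier self-assessment. Factor names, weights, and
    # thresholds are illustrative assumptions, NOT taken from the Code.
    RISK_FACTORS = {
        "open_federation": 2,   # content arrives from unvetted remote instances
        "general_purpose": 2,   # not a club focused on one moderated topic
        "anonymous_signup": 1,  # no identity/age checks at registration
        "media_uploads": 1,     # service stores user-uploaded images/video
    }

    def assess_tier(factors: set[str]) -> int:
        """Return a Tier: 1 = highest risk, 3 = lowest."""
        score = sum(RISK_FACTORS.get(f, 0) for f in factors)
        if score >= 5:
            return 1
        if score >= 3:
            return 2
        # If your assessment lands between tiers, assume the
        # higher-risk (lower-numbered) tier, as noted above.
        return 3

    print(assess_tier({"open_federation", "general_purpose"}))  # -> 2
    ```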

    Minimum compliance (assuming you are classed as a Tier 3 Social Media Service) is covered by Section 7, Objective 1, Outcome 1.1 and Outcome 1.5, quoted below.

    Should you be determined to be Tier 2 or Tier 1, there is a whole raft of additional actions, including ensuring you are staffed to oversee safety (1.4), child account protections such as preventing unwanted contact (1.7), and active detection of CSAM (1.8).
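
    To give a flavour of what "active detection of CSAM" (1.8) usually means in practice, here is a hash-matching sketch. The plain SHA-256 list is a simplifying assumption on my part; real deployments use perceptual hashing through vetted programmes (e.g. NCMEC/IWF hash sharing or PhotoDNA) rather than exact digests.

    ```python
    import hashlib

    # Hypothetical: a hash list obtained through a vetted hash-sharing
    # programme. Exact SHA-256 matching is a simplification here; real
    # systems use perceptual hashes so matches survive re-encoding.
    KNOWN_BAD_SHA256: set[str] = set()

    def upload_matches_known_csam(image_bytes: bytes) -> bool:
        return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_SHA256

    # On a match: quarantine the upload and start the Outcome 1.1
    # reporting process quoted below.
    ```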


    1.1 Notifying appropriate entities about class 1A material on their services
    If a provider of a social media service:
    a) identifies CSEM and/or pro-terror materials on its service; and
    b) forms a good faith belief that the CSEM or pro-terror material is evidence of a serious and immediate threat to the life or physical health or safety of an adult or child in Australia,
    it must report such material to an appropriate entity within 24 hours or as soon as reasonably practicable.
    An appropriate entity means foreign or local law enforcement (including Australian federal or state police) or organisations acting in the public interest against child sexual abuse, such as the National Centre for Missing and Exploited Children (who may then facilitate reporting to law enforcement).
    Note: Measure 1 is intended to supplement any existing laws requiring social media service providers to report CSEM and pro-terror materials under foreign laws, e.g., to report materials to the National Centre for Missing and Exploited Children and/or under State and Territory laws that require reporting of child sexual abuse to law enforcement.
    Guidance:
    A provider should seek to make a report to an appropriate entity as soon as reasonably practicable in light of the circumstances surrounding that report, noting that the referral of materials under this measure to appropriate authorities is time critical. For example, in some circumstances a provider acting in good faith may need time to investigate the authenticity of a report, but when a report has been authenticated, an appropriate authority should be informed without delay. A provider should ensure that such report is compliant with other applicable laws such as Privacy Law.
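
    To make the sequencing in 1.1 and its guidance concrete (investigate in good faith first, then report without delay, inside the 24-hour window), here is a workflow sketch. The helper functions and the print-based "transport" are placeholders of mine, not a real reporting API.

    ```python
    from datetime import datetime, timedelta, timezone

    REPORT_WINDOW = timedelta(hours=24)

    def authenticate_report(material_id: str) -> bool:
        """Good-faith check that flagged material is genuinely CSEM or
        pro-terror content -- a placeholder for human moderator review."""
        return True  # assumption: review confirmed the report

    def report_to(entity: str, material_id: str) -> None:
        """Placeholder for the real submission channel (police contact,
        NCMEC CyberTipline, etc.); the transport here is an assumption."""
        print(f"reported {material_id} to {entity}")

    def handle_flagged_material(material_id: str, flagged_at: datetime) -> None:
        deadline = flagged_at + REPORT_WINDOW
        if not authenticate_report(material_id):
            return  # no good-faith belief formed; measure 1 not triggered
        # Once authenticated, notify an appropriate entity without delay,
        # and in any case within 24 hours or as soon as reasonably practicable.
        report_to("local law enforcement / NCMEC", material_id)
        if datetime.now(timezone.utc) > deadline:
            print("WARNING: outside the 24-hour window; report regardless.")

    handle_flagged_material("post-123", datetime.now(timezone.utc))
    ```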

    1.5 Safety by design assessments
    If a provider of a social media service:
    a) has previously done a risk assessment under this Code and implements a significant new feature that may result in the service falling within a higher risk Tier; or
    b) has not previously done a risk assessment under this Code (due to falling into a category of service that does not require a risk assessment) and subsequently implements a significant new feature that would take it outside that category and require the provider to undertake a risk assessment under this Code,
    then that provider must (re)assess its risk profile in accordance with clause 4.4 of this Code and take reasonable steps to mitigate any additional risks to Australian end-users concerning material covered by this Code that result from the new feature, subject to the limitations in section 6.1 of the Head Terms.
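
    And a sketch of the 1.5 trigger: re-run the risk assessment whenever a significant new feature could change your Tier. The list of "significant" features is my assumption, and the clause 4.4 assessment itself is abstracted behind a caller-supplied function.

    ```python
    from typing import Callable

    # Illustrative assumption: which features count as "significant".
    SIGNIFICANT_FEATURES = {"direct_messages", "image_uploads",
                            "livestreaming", "open_federation"}

    def deploy_feature(feature: str, current_tier: int,
                       reassess: Callable[[], int]) -> int:
        """Gate a feature launch on the 1.5 (re)assessment obligation."""
        if feature not in SIGNIFICANT_FEATURES:
            return current_tier
        new_tier = reassess()  # the clause 4.4 risk assessment
        if new_tier < current_tier:  # lower tier number = higher risk
            print(f"Tier {current_tier} -> {new_tier}: take reasonable steps "
                  "to mitigate the additional risks from this feature.")
        return new_tier

    # e.g. deploy_feature("image_uploads", current_tier=3, reassess=lambda: 2)
    ```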