In the middle of an ordinary-looking office lies a large room with frosted glass. It houses the eSafety Commission's investigators, who are tasked with viewing the worst aspects of humanity. Their work can involve anything from child sexual abuse to bestiality and sadism.
Sometimes, investigators will watch children grow up through abusive material.
Since Toby, the head of investigations, joined the commission, he has seen child sexual abuse material evolve into a "global catastrophe". For privacy reasons, neither he nor his team can give their last names.
"The number of images that are produced and shared every year is on the rise," he said. "But we still have just under half the world's population to come online. And we are not prepared for that eventuality.
"On the other side of the ledger, we've got some smart tools, smart people working for us and we are making an impact."
'There's so much out there to do'
The commission is the first of its type in the world. Since it was established in 2015, it has tripled in size, and its remit has expanded from protecting young Australians to ensuring the safety of all, investigating thousands of complaints and providing educational resources.
While the commission doesn’t prosecute - it leaves that to law enforcement - it does work with police and online platforms to ensure content is removed.
"The more we instil in the [technology] industry a sense that they need to harden their systems against misuse and abuse, the closer we'll get to something that looks like victory," Toby said.
The investigation unit is made up of three teams: the Cyber Report team deals with illegal and harmful content, including child sexual abuse material; the Cyberbullying and Cyber Abuse unit investigates serious cyberbullying targeting children; and the Image-Based Abuse team covers the non-consensual sharing of intimate images or videos, or threats to do so.
The investigators come from a range of backgrounds, including policing, the military and intelligence.
When a complaint is lodged, it is sent to the relevant unit and reviewed by an investigator who will seek to remove the material, often working with the country that hosts the website or the online provider.
"There's so much out there to do and we can always be doing so much more, but it is seeing the results and seeing that this site's been taken down [that makes it worthwhile]," cyber report senior investigator Kate said. "Sure, another will pop up, but they will find that and we'll get onto that."
Helping to make the online world a better place
In the last financial year, investigators handled about 14,500 reports of child sexual abuse material. This year, the numbers are expected to be worse. The pandemic has led to a steep rise in complaints, with a 123 per cent increase year-on-year in the average number of monthly reports across all areas from March to September.
The work is not without its challenges.
"We've all encountered content that has just torn the guts out of us and might have left me crying at my machine," Toby said. "Those are the moments when you have to pause and take stock of your self-care process and talk to your colleagues, which is something we absolutely encourage - it's a fundamental part of our response.
"It's important for us all to remind ourselves, and our peers as well, that what we see is a very small slice of the things that happened to children. Outside of this small slice are all the wonderful things that happen to children - the growth, the joy, the exploration, the play, the love, the nurturing and sense of security that comes from being within a loving family and community environment. That's the experience of just about all kids, what we're seeing here is a very small slice of malice."
The investigators are encouraged to take regular breaks, keep an eye on peers and speak to a psychologist every three months. For eSafety Commissioner Julie Inman Grant, ensuring the psychological wellbeing of her investigators is fundamental.
"[They] literally wake up every day and subject themselves to the absolute worst aspects of humanity," she said.
"Day in and day out, for no overt recognition but only in the knowledge that they may be able to help these victims suffer less over the long term by removing imagery of their devastating abuse.
"They do take some satisfaction out of disrupting these paedophile networks and know they are helping to make the online world a better place but few people in the world are willing to subject themselves to this kind of darkness."
Despite the thousands of videos and images investigators remove, some material will inevitably resurface. But knowing that when a child is being abused, the investigators are able to minimise the number of people who view it, keeps the team going.
"It's really impactful and distressing," manager of the cyber report team Dave said of the job.
"The volume of content that we see is extreme and may appear insurmountable to the wider community, but it's a fight that you can't give up on, and you're fighting for survivors of this abuse and looking to identify these children from these abusive situations and rescue them."
It's been a huge five years since Ms Inman Grant first stepped into the role. She's overseen sweeping changes in image-based abuse, new educational programs, new responsibilities following the Christchurch massacre and draft legislation which would see individuals and companies fined thousands of dollars if they fail to take down abusive content from the internet within 24 hours.
"It's been challenging ... there is no precedent, we're kind of creating the precedent as we go," she said.