What she has seen so far has been disturbing, she says, but she adds that “every day is disturbing”.
At the desk behind her, a colleague prepares to spend the afternoon trawling through websites known to attract paedophiles, searching for and removing links to illegal content. As they police the internet, analysts listen to BBC Radio 6 Music.
When they need a break from the relentless flow of images they are required to view, they pause and play Super Mario or ten-pin bowling on the Nintendo Wii.
Staff at the Internet Watch Foundation (IWF) have a uniquely difficult job, charged with watching, classifying and removing images of child sexual abuse from the web. They work from a secure office just outside Cambridge, where police have given special dispensation for employees to search for this material.
In the past 12 months there has been a 40% rise in reports made to the IWF of potentially illegal content. Staff link the increase to a surge of public unease about the presence of this material online.
Detectives investigating the sexually motivated murders of two young girls, Tia Sharp and April Jones, revealed that both murderers had accessed online images of child sex abuse before the killings.
The police department responsible for internet safety, the Child Sexual Exploitation and Online Protection Centre (Ceop), estimates that there are now millions of sexual abuse images in circulation on the internet.
Within the UK, responsibility for removing them has been outsourced by the big internet providers to the tiny, little-known IWF charity with a staff of fewer than 20, based in a nondescript business park.
Employees are divided into two groups: those with administrative roles, and a team of eight people who have received specialist training to equip them to watch the illegal content. The two groups sit in separate rooms, and administrative staff are not allowed to enter the hotline room, where online images are assessed, until analysts have removed illegal content from their screens.
Analysts work their way through between 100 and 200 new notifications of online child sexual abuse imagery every day, sent in by the police or by members of the public, who can make anonymous reports about illegal content via the charity’s website.
Employees will not use the word “pornography” in this context, because they believe it does not “accurately reflect the gravity of images” they deal with.
Web analysts have to enter the website address and study the content to assess whether the images or film contain inappropriate shots of children; then they have to grade the severity of the material, according to UK sentencing guidelines, on a scale of one to five (from one, erotic posing with no sexual activity, to five, sadism).
If the image is classified as illegal, and is hosted in the UK, the analyst will alert the police at Ceop and contact the hosting provider to request that it is removed. Usually this can be done within an hour.
In 1996, the UK hosted about 18% of child sexual abuse images globally; since 2003, that figure has dropped below 1%, largely thanks to the work of the IWF. If the image is hosted on a server in another country, staff contact partner helplines in that country, and pass on the request for the page to be removed.
Twice a day, the charity sends companies such as Google and BT updated lists of sites which show illegal content, and they block them.
Working for the IWF is stressful, and staff are given mandatory monthly counselling sessions and quarterly group counselling to help minimise the scarring effects of spending all day looking at images of children being abused.
Before they are employed, they go through a rigorous selection process, which uses specialist interviewing techniques (Warner interviews) to ensure that unsuitable applicants are not employed.
Peter Burness spent four years as a computer games tester (“It got very monotonous,” he says) before joining the charity this year.
“They tell you what it will entail. I am quite a calm person,” he says, but he was nevertheless shocked by the images he saw during his interview process. “One of the main reasons was how happy the children looked. It was very strange.”
The charity’s work is mainly funded by internet service providers, and it acts as a self-regulatory body. Although IT expertise helps, no particular qualifications are needed. Currently the charity employs an ex-chef, a former IT trainer, someone who was a complementary therapist and an ex-mobile phone salesperson.
“It’s about personal qualities,” explains Chris Barker, the hotline manager, who oversees the eight members of staff who work two seven-and-a-half-hour shifts, and who is responsible for making sure they are coping with the strain.
“There are different techniques for coping. For myself, I concentrate on assessing the images from a legal perspective, and on the fact that I am working on getting it removed – I view that as the motivation for what I’m doing,” he says.
Most of his team take a daily 20-minute walk by the lake just beyond the business park at lunchtime to switch off from their work.
They try to limit their own exposure to the material to the bare minimum.
“We try not to watch the video in its entirety; where possible, we just scroll through frame by frame. Unless there’s a particular reason to listen, we don’t put the headphones on. You would only listen if you were searching for clues – maybe a regional accent that could help Ceop locate the abuser.”
Staff also do victim identification work.
Many of the images that are reported are historical: pictures that they have seen repeatedly, rehosted on different sites. But when they come across previously unseen images, they search for clues about where they may have been filmed, looking at the language of the books on the shelves, the style of the plug socket, the shop fronts that may be visible through the window, the logos on the school uniform.
Barker has been working here for more than a year, but still occasionally finds the material he encounters profoundly shocking.
“It might be that you see something that is particularly violent or where there is a particularly young victim – a baby or a newborn.
The next time you see an image like that you are better able to assimilate it. The first time you see an image you are thinking about the victim, what’s going on in their lives, in their families. It is true to say that you build up resilience – if you were shocked every time you saw an image, that would be hard.
But when you become an analyst, you are making an assessment – is this a level one image? How many victims are there? What gender and race are they?” he says.
He compares himself to a member of a fire or ambulance crew. “You show up and you see something quite horrific, but you are there to do a job, you are not thinking ‘that poor person, lying on the road’, you are thinking, ‘I need a tourniquet, blood.’”
Guardian News Service