Student Surveillance Under the Spotlight: Part 2
30 Oct 2018
In the quest to keep students safe, some Australian schools are turning to sophisticated surveillance technologies to monitor student behaviour on digital devices. Companies with reassuring names like eSafe Global and Family Zone are providing the technologies, but how much do we really know about the effectiveness of their services? With potential privacy issues, and concerns the technologies may create new risks, questions are being raised: is there a better way to achieve the same outcome?
David Vaile has been considering this question over many years, and from multiple angles. He is currently Data Protection Leader at the Allens Hub for Technology, Law & Innovation at the Law Faculty of UNSW. He is Sydney Chair of the Australian Privacy Foundation. He has worked for government at the Office of the Federal Privacy Commissioner and Privacy NSW, and in a range of other fields including medical informatics, online content development, and risk management.
According to Vaile, schools that are considering student surveillance, and parents considering granting permission for their children to be monitored, should be asking tough questions of those championing the technologies.
“They should be asking, has the service been exposed to independent, expert assessment and review? If that information is not there then they should be saying sorry, we shouldn’t be going ahead. We need to know it’s going to work, to some degree,” explains Vaile.
Without that level of information, he believes there could be issues with governance.
“Most likely these services won’t have built in evaluation or risk assessment. I want testable, verifiable evidence or data. And independent or objective people doing the collection and analysis. Not the proponents, not the minister, not the head of PR.”
In this “age of surveillance capitalism,” Vaile feels the business models of some companies in this space can focus more on marketing than on product provision.
“There’s an amazing amount of spin. It means they are very often politically savvy, their messaging is very sophisticated and it’s also, potentially, very manipulative.”
“Unless they’ve got serious, external, evidence-based assessment of what the problem was in the first place, what they’re doing, and what impact they are having, it’s essentially just marketing fluff, or word salad. It’s hypnotising the chicken.”
What makes the decision-making process more difficult for school leaders and parents, according to Vaile, is the fear mongering that goes on and the reasonable concerns it exploits.
“In a sense, people’s concerns are part of the problem. In some cases those concerns are justifiable, in others not.”
He says the public can be whipped up into a moral panic and led to believe surveillance is the only solution. What ensues is some sort of security theatre, a moral gesture saying that we care and that we’re trying to do something.
Vaile sees some proponents of surveillance looking for political value in creating a state of fear. When there are already genuine grounds for that fear, surveillance becomes an easier sell. The problem is that when resources are poured into surveillance that isn’t effective, they may be displacing approaches that would work. Or worse, the surveillance might be actively counter-productive.
Weighing up the benefits and the costs
If we are to take the dramatic step of placing students under surveillance, or any group or individuals for that matter, we need to be confident the benefits are worth the costs and the risks that having such surveillance creates. We need to ask ourselves: do the ends justify the means? That’s what proportionality analysis is all about, and as far as Vaile is concerned, not enough is being done in this country to weigh up the pros and cons.
“Proportionality is a concept that appears in Australian law and case law but is not very well developed. In Europe it’s extremely well developed. It’s essentially a quantitative exercise,” he says.
“On one side you have the benefits of a program or a technology and on the other side traditionally you have a cost/benefit analysis.”
Things get more complex when risk has to be factored into the cost side of the equation, a reason many organisations shy away from it, he explains.
“Risk is much harder to put a dollar figure on. And it’s much more open-ended as to how harm might actually manifest. It’s intrinsically much harder to quantify.
“And yet, it’s not rocket science,” he says.
“A lot of entities doing cost/benefit analysis, or cost and risk against benefit analysis, have been happy not to think too much about the risks. Equally, some are happy to think only of the costs and risks that affect them.”
He believes it’s crucial they consider the potential risks and harm surveillance can cause others. The implications for privacy, for example, require rigorous analysis.
“I feel an obligation to raise the flag for privacy as a human right,” he says.
“In Australia, privacy is very weakly protected by law, as are most other things that are considered human rights or civil liberties or individual rights in other countries. They are protected in spirit rather than by litigable legislative provisions or specific protections you can rely on in front of a court. We don’t have law, particularly in relation to data protection, that individuals can use.”
The counter-example he cites is in the EU where there has been widespread objection to the practice of metadata retention.
“In Australia, they tried to slip it through with virtually no justification whatsoever. But in the European context, where they actually did the proper analysis, they looked at how well it actually works. They found that the evidence of improvement in law enforcement, in criminal prosecutions, was negligible.
“And in the US, the same happened with the Presidential Civil Liberties Oversight Board. In terms of catching terrorists they found it didn’t work enough. In both cases, they said those whole programs are invalid because they’re not proportionate. The risks, the intrusions, the problems, are much greater than the actual benefit you get out of them.”
While at UNSW’s Cyber Space Law & Policy Centre, David Vaile came across another approach to keeping children safe. At the time, he was working alongside the National Children’s & Youth Law Centre, Save the Children, and the Inspire Foundation researching internet monitoring, filtering, and censorship in relation to children. What became apparent was that by focusing on the dangers children face and technological interventions, an opportunity was being missed to develop the child.
“What these organisations were saying was what really keeps children safe is building resilience. We should be focusing on their capacity, their developmental creativity, their self-respect, and their communication skills.
“When I first heard that conversation I thought, this makes a lot of sense. It builds on a clear-eyed, non-hysterical view of what growing up in a world of trouble actually means.”
Building resilience is about developing relationships and trust, and giving children the cognitive tools to assess and deal with a range of situations.
“As they get older,” he says, “they learn that they have to work it out. But also, importantly, if something goes wrong or they’re not sure of what’s going on, they know they can turn to someone, whether it's a parent or a teacher or a responsible adult. They won’t be isolated.”
Vaile acknowledges that this approach to safety has its challenges. For one, young children have a limited cognitive capacity for understanding consequences and changing their behaviour. The approach also requires parents and schools to accept that things can and will happen to children.
“Parents can’t be there all the time,” he says. “The school can’t be there all the time. The spyware can’t necessarily check everything.”
He believes that if we relieve children of the need to develop an understanding of consequences, we run the risk of turning out children who can’t think for themselves.
“One of the ways you learn is you realise that things go wrong. You learn I hated going out with those mean kids or I broke my arm or that was a really weird experience I just had online. Trying to make sense of stuff that happens and learn the right thing from it is part of life.”
Minor negative experiences are actually useful, according to Vaile.
“They are what helps them learn consequences, what helps them develop that cognitive capacity to predict. So if you’re putting in a technological aid that’s supposed to make up for any cognitive deficiency they have, I suppose it’s an issue of whether the surveillance becomes like a crutch or a prosthetic, which means they never actually develop those skills.”

He is also concerned that parents and schools will develop an over-reliance on “the black box of surveillance” because “it makes them feel comfortable.”
“That will blind them to the real risk that they are in fact, worsening or weakening the very rapport with their children that is such a critical failsafe when things go wrong.”
Openness and trust
Strong relationships with children are crucial to keeping them safe, according to David Vaile. To enable those relationships, he believes parents and schools, who have been provoked into a state of panic by business and government, need to keep their heightened fears in check.
“When a child comes to you and says, I saw this or I said that or I was thinking of this, the important thing is you’re calm. Even though what they’ve just told you is pretty scary, you’re not going to give them a terrible emotional experience just for telling you about it. And you’re not going to overreact. So if something happens again in the future, something that might really be serious, they will come back again.”
“When you have a technocratic, external model, it can feel like being spied on,” says Vaile, “and that’s not consistent with kids trusting you.”
“The last thing you’d do would be report something to the police or the local authorities, or even to your friends. You’d be suspicious that everything was monitored and tracked.
“In one sense I’m exaggerating,” he says. “But I’m just observing that it wouldn’t be completely improbable for young people going through periods of alienation or rebellion, to feel the yoke and the oppression of being the targets of a surveillance operation.”
Those feelings can lead to surveillance resentment and inhibit the trust and openness children need to feel when approaching adults with their issues.
“Especially if the adults have hysterical overreactions to all the little things that can go on, some of which go on to be serious, most of which don’t.”
This is where the concepts of risk appetite and risk tolerance come into play, according to Vaile. He believes that in order to help kids develop resilience, “we have to have some risk appetite, we have to allow them to make mistakes.”
While acknowledging this is challenging, Vaile says accepting a certain amount of risk and minor harm now is what can help keep children safe in the long term. What’s important, he says, is how we deal with those risks, “rather than accepting the notion that some omniscient outside force can step in and solve everyone’s problems.”

“One of the reasons I’m so passionate about this perspective,” he says, “is that it has some hope of empowering parents and schools to think more soberly about things.”
About David Vaile
David Vaile is Data Protection Leader at the Allens Hub for Technology, Law & Innovation at the Law Faculty of UNSW, and Sydney Chair of the Australian Privacy Foundation. His expertise extends over education, data security and law, privacy, and organisational risk management.
About Mark Donkersley
Mark Donkersley is Managing Director of eSafe Global based in Greater Manchester, England. He also provides briefings on safety in the digital environments to HM Government Select Committees in the Commons and Lords (UK), and to the e-Safety Commissioner’s Office in Australia.