The official blog of The Social Democratic Party.

Watching Me, Watching You

Surveillance culture is far more sophisticated and far more invasive than most of us imagine. And it's only going to get worse. JOSH SIMS looks at how – and why – we need to start taking back our privacy.

By: Editor

“It’s absolutely false to think that we in democratic countries have it any different to China,” insists Frederic Lemieux. “The only difference is that China is open about what it does and we have a more layered, subtle approach. Governments say they’re not bad but the fact is that they have access to everything if they want it. Frankly it’s hard to grasp the scope of the surveillance apparatus today.”

Lemieux is a professor at Georgetown University, US, specialising in information technology, and he uses a virtual private network. He avoids Zoom and social media, has “privacy settings through the roof” and is only ‘friends’ online with people he’s met several times in person. He watches what he says in emails. He won’t wear a smartwatch. And he is not remotely paranoid.

Just look, he says, at the mobile surveillance spyware Pegasus – technically illegal in the US. And yet the FBI has just been caught out, forced to cancel its arrangement with a government contractor that used the tool on its behalf. It’s the latest of the abuses of power – and the data breaches that underscore them – that are, it seems, uncovered somewhere around the world every few months. Many more, one can only assume, are not. “So am I hopeful of some correction to this surveillance culture?” says Lemieux. “No.”

Perhaps this culture has been a long time coming. After all, the idea of systematic surveillance is not new. The Panopticon was the name given to an ideal prison devised by philosopher Jeremy Bentham in 1787. In it, every prisoner would – as an encouragement to improved behaviour – be observable without ever knowing if they were being observed. It would, as Bentham put it, create a “sense of invisible omniscience”. And, he added, more darkly: “Ideal perfection would require that each person should actually be in that predicament, during every instant of time. This being impossible, the next thing to be wished for is, that, at every instant, seeing reason to believe as much, and not being able to satisfy himself to the contrary, he should conceive himself to be so.”

In Bentham’s time, this was no more than a thought experiment. Today the situation is very different. As the tech entrepreneur Maciej Ceglowski put it to a US Senate committee in 2019, “until recently, even people living in a police state could count on the fact that the authorities didn’t have enough equipment or manpower to observe everyone, everywhere”. Now, it seems, they “enjoyed more freedom from monitoring then than we do living in a free society today.”

It’s easy to see why. The aforementioned spyware, with advanced processing power, can now collate, save and analyse truly awesome quantities of data; increasingly prevalent CCTV has morphed into often erratic facial-recognition technology and biometrics, including systems built on the unevidenced idea that people’s emotional state can be read from their physical appearance; drones have provided ‘eyes in the sky’; digital currencies – actively being promoted in many nations as a stepping stone to doing away with cash – will allow the tracking of all financial transactions; so-called ‘smart cities’ see the mass deployment of intrusive sensors to monitor citizens, supposedly with the intention of improving the urban environment; and there’s ever more wearable tech, RFID tags, GPS dots and the growing Internet of Things, all coalescing to provide anyone sufficiently well-resourced with a detailed picture of what was once considered private.

“But then we have also become largely indifferent to matters of privacy,” stresses sociologist Dr. Gary Armstrong, author of ‘The Maximum Surveillance Society’. “Generation Facebook/TikTok/Instagram have a different perception of privacy than my generation – over 60s – and think nothing of self-revelation and self-promotion. As it stands the state knows less about me than, say, supermarket chains do.”

How so? Invariably because the greatest tool in the snoop’s armoury is, as Lemieux puts it, “our own complicity”. We let Alexa listen and Ring video doorbells watch. We sign up to loyalty schemes. And, given that some 86% of the world’s population – a share that is still growing – now owns a smartphone, it seems we willingly carry the means of our own monitoring with us. Indeed, David Lyon, director of the Surveillance Studies Centre and professor of sociology and law at Queen’s University, Ontario, argues that while CCTV might remain the most powerful symbol of surveillance, to still think of it as the most powerful means of surveillance is way out of date. That is now the gadget in our own pocket – our self-imposed, frantically upgraded, style-conscious ankle monitor. He calls the result ‘dataveillance’: our supervision and assessment through a melding of state and corporate interests.

“And that’s been mutating and accelerating at a rapid rate,” he says. Lyon cites a recent case in Canada in which a user of the ordering app from the coffee chain Tim Hortons put in a ‘freedom of information’ request about its function, only to discover that, even when he thought he had disabled it, the app had continued to track him wherever he went, recording when he visited one of the company’s competitors.

What he still didn’t grasp, however, was “the other uses that data was undoubtedly put to, being sold to and among other corporations and institutions in what has become a globally-significant economic system,” says Lyon. “It’s not just about being tracked but analysed, and then treated according to the profile then created and from which all kinds of judgements are made – by employers, healthcare providers, banks, insurers, law enforcement. The thing is that most people just don’t get that this is even happening.”

Small wonder, then, that public reaction to surveillance, where it is discussed at all, is at best rather muted, not least because, as Lyon puts it, “we’ve become seduced [through our smartphones] by the idea of the world organised around our needs, living in a very consumerist society in which efficiency, convenience and comfort have been elevated into core values” – ‘luxury surveillance’, as it has been dubbed. And even if we give it some thought, the rationalisations we use to justify our acceptance of surveillance tend to be misguided, adds Juan Lindau, professor of political science at Colorado College, US, and author of ‘Surveillance and the Vanishing Individual’.

People dismiss the encroachment of surveillance because, they say, they have nothing to hide – “but it’s a bullshit notion that they wouldn’t mind if every detail of their life was out there for all to see,” Lindau notes. Or they say they’re too irrelevant to be of interest – “but if you ever do anything of even remote political consequence then you’re immediately not irrelevant to the state,” he adds. Or there’s the argument that any one personal revelation is now merely lost in a giant sea of revelations and so doesn’t matter.

“But its evil brilliance is that tech gives the veneer of distance and [us the sense of] anonymity that is entirely fictitious,” he says. “It is not impersonal. We spend our lives now interacting with machines that observe all, that never forget and never forgive, such that the delineation between our inner and outer selves is [breaking down] by stealth.”

It’s also because thinking seriously about the boundaries of surveillance is relatively new – before the seismic revelations of Edward Snowden, 10 years ago this year, much concern about surveillance was dismissed as so much conspiracy thinking, argues Professor Peter Fussey, an expert in criminology at the University of Essex. That, and because much of the surveillance apparatus is, governments so often argue, for our own safety – that’s the line Myanmar’s junta has taken in its crackdown on protests – or for more effective, worryingly ‘proactive’, increasingly militarised crime prevention.

That’s concerning when, as Armstrong argues, we’re well on our way to systems that look for the potentially suspicious or merely inappropriate. “Doing that requires a database of both known and potential offenders. And such schemes are always sold on the benefits of apprehending these known offenders,” he says. “But these schemes are expansionist and soon develop databases of ‘people of interest’ too.”

But it’s also concerning that national emergencies are used to bring in more surveillance, and that they tend to be followed by spikes in public support for its expansion. A TNS poll conducted in 2014 – thirteen years after the world-changing events of 9/11, but also not long after Snowden – found that 71% of respondents thought government should prioritise reducing the public threat “even if this erodes people’s right to privacy”.

“The idea that surveillance is for our own safety holds water, but only up to a point. Surveillance doesn’t inherently make us safer. And that’s aside from the misplaced assumption that surveillance always works, as many cases of misidentification suggest,” says Fussey, who also served as an independent human rights observer while London’s Metropolitan Police trialled facial-recognition technology from 2020.

“The problem with people being suddenly more accepting of surveillance after, say, a terrorist attack is that the powers then given [to the machinery of state] don’t tend to be rolled back later,” he adds. “And then there is the fact that if we keep creating these tools that can be used for surveillance – even if that’s not their intended use – they will be. There is simply so much evidence of their misuse.”

Furthermore, the expanding means of surveillance – from gait recognition to remote heartbeat analysis – are being developed at such a pace that campaigners and legislators can barely keep up. It says something concerning that a business as hugely powerful as Amazon has been entirely open about its ambition to create tech products with what it calls “ambient intelligence”, always there in the background, harvesting your life.

There’s mission creep to contend with as well: as if it weren’t bad enough that the state and commerce want to watch us, remote working has encouraged a culture of surveillance among employers too, with a boom in monitoring software being used to map the behaviour, mood, eye movement, location, online activity and productivity of often oblivious workers. The American attorney Zephyr Teachout has predicted the coming of “surveillance wages”, in which each worker’s pay constantly changes according to that worker’s perceived alignment with their employer’s expectations, and in which data would be used for hiring and firing decisions.

Could a new ad-free business model be devised for the web, disincentivising data collection? Could the European Union’s General Data Protection Regulation be adopted beyond its borders, even as Facebook obtusely moaned about how it and other regulations “may be costly to comply with and may delay or impede the development of new products, increase our costs, require significant management time and subject us to remedies that may harm our business”?

Is there scope for a rebalancing of the interests of what’s often referred to as the surveillance-industrial complex – which, of course, makes billions from monetising data flows, with China and the US the leading exporters of surveillance tech around the world – and the rights of the individual? Surely the transparency and accountability necessary for the relationship between state and citizen to function require it? And yet, right down to how certain parts of your smartphone’s algorithms work, all is opaque, and getting more so.

“We have to have a much clearer sense of how surveillance will be used, whether it’s legitimate and the necessary limits on its use,” implores Fussey. “We’re invited to think that the technology is just too complicated, but actually the standards we need to protect – standards in international law – are basic. The problem is who enforces those standards. We need the right policies, programmes and oversight.”

“My concern is that so much surveillance now isn’t just about watching where you go and what you do but what information you consume and what thoughts you express,” adds Lemieux. “Surveillance can now be used to gauge opinion and so influence opinion too. It’s not just about watching us through data but manipulating us through data.”

Indeed, the instruments of surveillance only look set to get more invasive, more clever, more wily and devious. The tide might be turning – Lindau argues that after a long period of being “promiscuous with sharing our information”, some of us are waking up, as the low download rates for various government-driven tracking apps during Covid might suggest, for all that the pandemic opened the doors to data collection and tracking on a scale that would have been unimaginable just a few years before. Some cities – Portland, Oregon, for example – have banned the use of facial recognition in their stores and restaurants. And there’s a growing academic interest in surveillance overreach too.

And yet the more a surveillance mindset is applied, the more ordinary it seems. “Citizens are allowing greater and greater intrusion, to the point where the distinction between public and private has really broken down at this juncture,” suggests Steven Feldstein, senior fellow at the Carnegie Endowment for International Peace and author of ‘The Rise of Digital Repression’. “The smartphone has normalised surveillance, but it’s a slippery slope. You continue to push at the boundaries and surveillance just becomes more and more acceptable. And there are no concerns about this because there is no political will [to make changes]. And there’s no political will because nobody seems to care about it. We’re seeing a greater level of omni-surveillance made possible and that needs more push-back.”

In fact, we’re moving towards TIA, or Total Information Awareness, “the goal to know everything about everyone in real time,” as Lindau explains. “And so far all that has limited that most totalitarian of ambitions has been the tools.” Small wonder that the resistance to London’s ULEZ camera network is not simply a reflection of disgruntlement at yet more monetising of motoring but an unease at being watched for yet another reason.

The really bad news? The tools are coming: the AI Global Surveillance Index suggests that at least 75 out of 176 countries globally, many of them liberal democracies, are already using artificial intelligence for automated surveillance purposes. “All considerations we have about surveillance get put on steroids with AI,” Lindau says – the French government, for example, has just passed a law allowing the use of AI in mass video surveillance at next year’s summer Olympics in Paris; and for AI to work, the data must flow. Your data. Everybody’s data. “The ease with which AI will be able to amass and process information, combined with facial recognition, well, that’s ominous,” he says.

He cites by way of example his recent experience of returning home from holiday in Norway and passing through the notoriously aggressive and prying US Immigration. He was expecting the typical barrage of questions. Instead he was just asked to look into a small camera. That was it. Lindau asked whether they wanted to gather the usual details about where he had been and for how long and why. No, they said casually, we already know that.


All Comments (1)

  • Really interesting – I’ve just been looking at your policies and was concerned about the proposal for national ID cards. It’s probably a bit out of date to worry about the introduction of ID cards but in principle it seems to be extending the monitoring potential of the state. Isn’t this an issue?


30th October 2023

Family, Community, Nation.