CIA buys into Web 2.0 monitoring firm

Wired online reports that the US Central Intelligence Agency has bought a significant stake in Visible Technologies, a market research firm that specializes in monitoring new social media: blogs, micro-blogs, forums, customer feedback sites and social networking sites (although not closed sites like Facebook – or at least that’s what they claim). This is interesting but it isn’t surprising – much of what intelligence agencies do has always been sifting through the masses of openly available information out there, what is now called open-source intelligence. But the fact is that people are putting more of themselves out there than ever before, and material that you would never have expected to be of interest to either commercial or state organisations is now there to be mined for useful data.

(Thanks, once again, to Aaron Martin for this.)

Surveillance cameras in the favelas (2)

A couple of weeks ago, I found out that the military police had installed surveillance cameras in the favela of Santa Marta, in Rio de Janeiro, Brazil, which I visited back in April. This is the first time such police cameras have been put into such informal settlements in Rio. My friend and colleague, Paola Barreto Leblanc, sent me this link to these YouTube broadcasts from a local favela TV company, in which residents discuss their (largely negative) views of the cameras.

There is also a poster that has been put up around the area produced by the Community Association and other local activist and civil society groups – see here – which reads as follows in English:

SANTA MARTA, THE MOST WATCHED PLACE IN RIO

At the end of August, the inhabitants of Santa Marta were surprised to learn from newspapers and TV that nine surveillance cameras would be installed in different areas of the favela. A fear of being misinterpreted paralysed the community.

Many of the people of the city, and some in the Morro itself, support this initiative. However, we are a pacified favela, so why do they keep treating us as dangerous?

Walls, three kinds of police, 120 soldiers, cameras – this is no exaggeration.  When will we be treated as ordinary citizens instead of being seen as suspects?

Wall: 2 million Reais. Cameras: half a million Reais. How many houses could this amount of money build? How many repairs to the water and sewage system?

The last apartments built in Santa Marta are 32 square metres. The Popular Movement for Housing [an NGO] says that the minimum size should be 42 square metres. Other initiatives have gone with 37 square metres. So why don’t we stand up and demand this minimum standard? This should be our priority!

When will the voice of the inhabitants of this community be heard?

We need collective discussion and debate.

Fear is paralysing this community and preventing criticism. But the exercise of our rights is the only guarantee of freedom.

“Peace without a voice is fear”

We want to discuss our priorities. We want to know about and be involved in the urban development project in Santa Marta.

We will only be heard and respected if we unite.

Think, talk, reflect, debate, get involved…

Surveillance image of the week 2: camera catches man stealing camera

Just how postmodern can contemporary surveillance get?

Well, after the irony of numerous recent CCTV thefts in the USA – after all, if you’re going to put lots of shiny new cameras up in public places they are bound to be a target themselves – now another layer has been added in Bakersfield, California, with a video surveillance camera thief caught on the camera system he was stealing. Of course, some thieves don’t seem to realise that the camera isn’t the place the data is stored… either that or they just aren’t put off by CCTV at all. Say it ain’t so…

A clear demonstration of the deterrent effect of video surveillance in action... not.

More military robots…

A story in the Daily Mail shows two new military robot surveillance devices developed for the UK Ministry of Defence’s Defence Equipment and Support (DE&S) organisation. The first is a throwable rolling robot equipped with multiple sensors, which can be chucked like a hand-grenade and then operated by remote control. The second is another Micro-(Unmanned) Aerial Vehicle (Micro-UAV or MAV), a tiny helicopter which carries a surveillance camera. There have been rolling surveillance robots around for a while now (like the Rotundus GroundBot from Sweden), but this toughened version seems to be unique. The helicopter MAV doesn’t seem to be particularly new; indeed it looks, at least from the pictures, pretty similar to the one controversially bought by Staffordshire police in Britain – which is made by MicroDrones of Germany.

The proliferation of such devices in both military and civil use is pretty much unchecked and unnoticed by legislators at present. Media coverage seems to be limited to ‘hey, cool!’ – and yes, they are pretty cool as pieces of technology – but being used in useful humanitarian contexts (for example, rolling robots getting pictures of a partially-collapsed building, or MAVs flying over a disaster zone) is a whole lot different from warfare, which is a whole lot different again from civilian law enforcement, commercial espionage or simple voyeuristic purposes. As surveillance devices get increasingly small, mobile and independent, we face a whole new set of problems for privacy, and despite the fact that we warned regulators about these problems back in 2006 in our Report on the Surveillance Society, little government thought seems to have been devoted to these and other new technologies of surveillance.

The use of robots in war is of course something else I have become very interested in, especially as these flying and rolling sensor-platforms are increasingly independent in their operation and, like the US Predator drones employed in Afghanistan and Pakistan or the MAARS battlefield robot made by QinetiQ / Foster-Miller, become weapons platforms too. This is an urgent but still largely unnoticed international human rights and arms control issue, and one in which the new International Committee for Robotic Arms Control (in which I am now getting involved) will hopefully play a leading role.

Pigs subvert surveillance

It is not just human beings who are subjects of surveillance. Animals are increasingly under surveillance too; indeed, there are techniques of surveillance and tracking used on animals that are designed to achieve levels of control that (for the most part) would not be tolerated for human beings. Animals are tagged, filmed, implanted, tracked, and even used and adapted for surveillance (see Amber Marks’s book, Headspace, for example) for all kinds of reasons, from the economic to the environmental. However, this great story from a BBC kids’ news programme demonstrates that some animals can ‘fight back’ in ways that are inventive and heartening.

Many farms now limit the food consumption of individual pigs through the use of electronic Radio-frequency Identification (RFID) collars and gates: once the pig has gone through the gate, the collar communicates with a computerised food distribution system, which provides the pig with what is deemed ‘enough’. When the pig has eaten and left the feed stall, it cannot get back in for more, because the system knows which collar has already been through the gate.

However, apparently pigs in several locations have independently learnt how to get round this surveillance system. Some pigs hate the collars so much that they rip them off. These pigs then don’t get to eat, of course, but other pigs have learnt that if they pick up the discarded collars they can go through the gate a second time – and they have even taught other pigs how to do this…
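The pigs’ exploit boils down to one design flaw: the system authenticates collars, not pigs. A toy sketch makes the point (the class, collar IDs and ration size here are all invented for illustration – this is not any real farm system’s code):

```python
# Toy model of an RFID-gated feeding system. The gate remembers which
# collar IDs have already fed, not which *pigs* have.
DAILY_RATION_KG = 2.5  # whatever the system deems 'enough'

class FeedGate:
    def __init__(self):
        self.fed_collars = set()  # collar IDs already through the gate today

    def request_feed(self, collar_id):
        """Dispense a ration only if this collar hasn't fed yet."""
        if collar_id is None or collar_id in self.fed_collars:
            return 0.0  # no collar, or collar already used: gate stays shut
        self.fed_collars.add(collar_id)
        return DAILY_RATION_KG

gate = FeedGate()
assert gate.request_feed("collar-07") == DAILY_RATION_KG  # first visit: fed
assert gate.request_feed("collar-07") == 0.0              # same collar: refused
assert gate.request_feed(None) == 0.0                     # collar ripped off: no food

# The exploit: a pig that picks up someone else's discarded collar
# presents a 'fresh' ID and gets a second ration.
assert gate.request_feed("collar-12") == DAILY_RATION_KG
```

The system is only as good as its proxy for identity – a lesson that applies well beyond the farmyard.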

Never mind ‘Big Brother’ and Nineteen Eighty-Four, it’s another Orwell phrase (from Animal Farm) that comes to mind here: “Four legs good, two legs bad”…!

(Thanks to Aaron Martin for this)

Big Brother Watch

I’ve just been contacted by a UK organisation calling itself ‘Big Brother Watch’, which claims to be a ‘think-tank’, asking for my help and support. Now, the UK already has Liberty, Privacy International, No2ID, No-CCTV, not to mention the Surveillance Studies Network and various other campaigns and organisations, so why this new one? Of course anyone and their dog can call themselves a ‘think-tank’, but Big Brother Watch is being more than a little disingenuous. It is basically a creation of the Taxpayers’ Alliance, which is in turn a fake ‘popular’ pressure group and a front for neoliberal economic think-tanks like the Adam Smith Institute, various large industrial interests and the most free-market wing of the Conservative Party – you can find out more about what they are really about here. Now, if those are your politics, and you are happy with who backs them, then you are most welcome to support this new creation, but they aren’t my politics and I won’t be offering any support for such front organisations.

Manchester Airport trials virtual strip-search system

Rapiscan image (BBC)

You would think that, after four years of trials at Heathrow, British airports would by now be able to work out whether or not they could – and, more importantly, should – use the various varieties of body scanner now available. However, Manchester Airport is holding yet another trial, starting now at its Terminal 2. At least it will give the public a chance to say what they think. The scans are remote – i.e. the officer observing the images is not on the airport floor – which prevents the kind of scenario we mentioned in our Report on the Surveillance Society, of lewd remarks being directed at passengers. Personally, I am rather less concerned about this rather abstract view of my body being seen briefly as I pass through an airport than I am about my financial details and personal life being traded between private companies, or about being under constant video surveillance in ordinary public space in the city. However, the images, although ghostly, are detailed enough that genitals, deformities, medical implants and so on can be seen, and if this story is to be believed, there is no provision for women’s images to be seen only by a woman and men’s only by a man. This will make it entirely unacceptable to some people, in particular members of certain religious groups. But the scans are – at least, for now – voluntary, in that passengers can refuse and have a traditional pat-down search instead.

However, this technology won’t be staying in the airports for long. I reported back in July on stories that terahertz wave scanning could soon be made to fit into portable cameras. That raises a whole different set of social, political and ethical questions…

(Thanks to Simon Reilly for sending me the link)

Automation and Imagination

Peter Tu writes over on Collective Imagination that greater automation might prevent racism and provide for greater privacy in visual surveillance, offering what he calls ‘race-blind glasses’. The argument is not a new one at all; the point about racism is almost the same proposition advanced by Professor Gary Marx in the mid-90s about the prospects for face-recognition. Unfortunately, Dr Tu does several of the usual things: first, he argues that ‘the genie is out of the bottle’ on visual surveillance, as if technological development of any kind is an unstoppable linear force that cannot be controlled by human politics; and second, he seems to think that technologies are somehow separate from the social context in which they are created and used – when of course technologies are profoundly social. Although he is more cautious than some, this still leads to the rather over-optimistic conclusion, the same one that has been advanced for over a century now, that technology will solve – or at least make up for – social problems. I’d like to think so. Unfortunately, empirical evidence suggests that the reality will not be so simple. The example Dr Tu gives on the site is a simple binary system: a monitor shows humans as white pixels on a black background, with a line representing the edge of a station platform. It doesn’t matter who the people are, or their race or intent – if they transgress the line, the alarm sounds and the situation can be dealt with. This is what Michalis Lianos refers to as an Automated Socio-Technical Environment (ASTE). Of course, such simple systems are profoundly stupid in a way that the term ‘artificial intelligence’ disguises, and the binary can hinder as much as it can help in many situations. More complex recognition systems are needed if one wants to tell one person from another or to identify ‘intent’, and it is here that all those human social problems return with a vengeance.
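The ‘profound stupidity’ of that binary system is easy to see in code. The following sketch (a toy illustration of the idea, not anyone’s real system) reduces each frame to a grid of 0s and 1s and sounds the alarm when any foreground pixel crosses the platform-edge line:

```python
# Toy version of the 'white pixels on black' ASTE described above:
# a frame is a grid of 0s (background) and 1s (people), a horizontal
# line marks the platform edge, and any pixel past it is an alarm.
PLATFORM_EDGE_ROW = 3  # rows at or beyond this index are past the line

def alarm(frame):
    """Return True if any 'person' pixel has crossed the edge line.

    Note what the system does NOT know: who the pixel is, or why it is
    there. That is both its race-blindness and its stupidity.
    """
    return any(
        pixel == 1
        for row in frame[PLATFORM_EDGE_ROW:]
        for pixel in row
    )

safe_frame = [
    [0, 1, 0],  # someone standing well back from the edge
    [0, 0, 0],
    [0, 0, 0],
    [0, 0, 0],  # edge row: empty
]
assert alarm(safe_frame) is False

crossing_frame = [
    [0, 0, 0],
    [0, 0, 0],
    [0, 0, 0],
    [1, 0, 0],  # a pixel past the line: alarm sounds
]
assert alarm(crossing_frame) is True
```

The moment you ask the system to distinguish one pixel-person from another, or to infer intent, this clean binary collapses – which is exactly where the social problems discussed here come back in.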
Research on face-recognition systems, for example, has shown that prejudices can get embedded within programs as much as priorities; in other words, the politics of identification and recognition (and all the messiness that this entails) shifts into the code, where it is almost impossible for non-programmers (and often even programmers themselves) to see it. And what better justification for the expression of racism can there be than that a suspect has been unarguably ‘recognised’ by a machine? ‘Nothing to do with me, son, the computer says you’re guilty…’ Meanwhile, the idea that ‘intent’ can be determined in any way by superficial visualisation is supported by very little convincing evidence, and yet techno-optimist (and apparently socio-pessimist) researchers push ahead with promoting the idea that computer-aided analysis of ‘microexpressions’ will help tell a terrorist from a tourist. And don’t get me started on MRI…

I hope our genuine human ‘collective imagination’ can do rather better than this.

Nations stop tracking H1N1 cases

The Associated Press is reporting that many nations, in particular the USA, have changed their surveillance methods for keeping track of Swine Flu (H1N1) and are no longer counting confirmed cases. The justification for this is that the confirmed-case count was already massively underestimating the numbers affected, and in any case it is no longer useful once the disease hits a certain proportion of the population. This may be true at the whole-population level, but the move away from counting cases means that changes in particular populations and in areas below the subnational level are less observable – a problem if the disease is affecting some groups and places more than others. It might, for example, be crucial in deciding who, and where, receives vaccinations. There is also the added complication of recession-driven budget cuts to local government surveillance. As with many kinds of caring surveillance, one key question is not whether the surveillance is perfectly accurate, but whether it is ‘good enough’ for the purpose for which it is intended – and in the case of diseases, this is sometimes a tricky thing to determine.

So are there better ways of doing it? Some private companies certainly think so. As I have reported before, Google and others reckon that online disease tracking systems will be vital in the future, so much so that Google in particular has gone rather over the top in its claims about what would happen if access to the data it used for these systems were restricted…

US Congress debates online data protection

The US House of Representatives will finally get to debate whether online advertising that tracks the browsing habits of users is a violation of privacy and needs to be controlled. A bill introduced by Rep. Rick Boucher of Virginia will propose an opt-out regime that gives users information about the uses to which their data will be put, and allows them to refuse to be enrolled. At present, many such services work entirely unannounced, placing cookies on users’ hard drives and using other tracking and datamining techniques, without any way for a user to say ‘no’. Of course, we have yet to see the results of the inevitable industry scare-stories and hard lobbying on what will be proposed, let alone passed. But the proposal itself is particularly significant because the US has so far always bowed to business interests on online privacy and data protection; if this bill is passed, it will be a sign that what Howard Rheingold long ago called the ‘electronic frontier’ might start to acquire a little more law and order in favour of ordinary people.
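What an opt-out regime would demand of an ad server is not complicated. Here is a very loose sketch, assuming a persistent opt-out cookie as the mechanism; the cookie names and the helper function are invented for illustration, and no real ad network’s code or the bill’s actual text is implied:

```python
# Sketch of opt-out-compliant tracking logic: before setting a tracking
# cookie, a compliant server checks for (and honours) an opt-out cookie.
OPT_OUT_COOKIE = "ad_opt_out"    # hypothetical persistent opt-out marker
TRACKING_COOKIE = "tracking_id"  # hypothetical behavioural-tracking cookie

def cookies_to_set(request_cookies, new_tracking_id):
    """Return the cookies a compliant server would set for this request."""
    if request_cookies.get(OPT_OUT_COOKIE) == "1":
        return {}  # user has said 'no': set no tracking cookie at all
    if TRACKING_COOKIE in request_cookies:
        return {}  # already tracked: nothing new to set
    return {TRACKING_COOKIE: new_tracking_id}

# A user who has opted out is never assigned a tracking ID...
assert cookies_to_set({"ad_opt_out": "1"}, "abc123") == {}
# ...while a new user who has not opted out is (the status quo).
assert cookies_to_set({}, "abc123") == {"tracking_id": "abc123"}
```

The point of the sketch is how small the technical change is: the political fight is over whether anyone can be compelled to add those first two lines of the check, and whether the default is opt-out (as here) or opt-in.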