The mundane costs of independent drones

Developers have aimed for quite a while to build more independently functioning surveillance drones that can fly around and recharge themselves in some way – whether solar gliders in the stratosphere or, at street level, biomimetic bird-like micro-UAVs that can ‘perch’ and draw power from electricity cables. This was one of the original aims of the DARPA call that led to the creation of that beautiful marvel of engineering / dystopian nightmare surveillance tool, the Nano Hummingbird. If you are an engineer, this is certainly convenient and probably looks a lot like a ‘free lunch’ – there is no mention of any possible costs or downsides in this piece on engineering.com. But as we all should know, there is no such thing as a free lunch.

Firstly and most importantly, there’s the question of whether societies want either identifiable or camouflaged surveillance devices flying around us at all times. A mobile surveillance device becomes even more independent, less limited by the constraints of its own construction, if it can ‘feed’ itself. And while the US Federal Aviation Administration in particular has recently put a bar on commercial drone delivery services (PDF), it certainly hasn’t prohibited other kinds of drone use, and many other national regulatory bodies have yet to decide what to do, while drone manufacturers are pushing hard for less ‘bureaucratic’ licensing and fewer controls.

The second objection is less fundamental but perhaps more effective at igniting opposition to such devices. Any single device might draw only minute amounts of power from cables, but what happens if (or when) there are thousands, even millions, of these devices – flying, crawling, creeping, rolling, slithering – all hungry for electricity? I would suggest that, just like the cumulative effect of millions of computers and mobile phones, the demand would be substantial, and unlike the claims made for smartphones, it would be additional demand rather than a replacement for less efficient devices. And this does not include the energy use of the huge server farms that provide the big data infrastructure for all of these things. So, who pays for this? Essentially we do: increased energy demand means higher bills, especially when the power is being drawn in an unaccountable way, as by a biomimetic bird on a wire. And unlike the more voluntary decision to use a phone because of its benefits to us, paying for our own surveillance in this way seems less obviously ‘for our own good’, and it certainly has the potential to incite the ire of ‘ordinary middle-class homeowners’ (that holy grail of political marketing) and not just the usual small-government libertarian right or pro-privacy and anti-surveillance left.
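To make the cumulative point concrete, here is a rough back-of-envelope sketch in Python. The per-device wattage and daily charging hours are purely illustrative assumptions (nobody publishes consumption figures for parasitic ‘perching’ drones), but even conservative guesses scale up quickly:

```python
# Back-of-envelope estimate of the aggregate power drawn by 'self-feeding' drones.
# All figures are illustrative assumptions, not measured values.

WATTS_PER_DEVICE = 5         # assumed average draw while perched and charging
HOURS_CHARGING_PER_DAY = 4   # assumed charging time per device per day

for fleet_size in (1_000, 100_000, 1_000_000):
    daily_kwh = fleet_size * WATTS_PER_DEVICE * HOURS_CHARGING_PER_DAY / 1000
    yearly_gwh = daily_kwh * 365 / 1_000_000
    print(f"{fleet_size:>9,} devices: {daily_kwh:,.0f} kWh/day, ~{yearly_gwh:.2f} GWh/year")
```

Negligible per device, then, but at the million-device scale it becomes a measurable and permanent addition to demand, which is exactly the point about who ends up paying.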

 

The Data Revolution

I’m far from the only academic studying smart cities and big data-driven urbanism. One of the people who’s most inspired my work (in many ways) over the years is Rob Kitchin – sometimes I even spell his name right! Rob has this fantastic new book, The Data Revolution, coming out in September from Sage, and very helpfully he has put the bibliography, and a lot of other stuff, online. This is the way scholarship should be. Too many of us still guard our ‘secret’ sources and keep our work-in-progress close to our chests. But if we want people to read what we do, think and take action, then more open scholarship is the way to go.

The Right to Watch?

I’ve always defended the right to photograph in public places. However, a number of cases in the last few weeks are highlighting an important new development in this area, a new front in the increasingly confusing information wars. Gary Marx always likes to say that surveillance is neither good nor bad but that intent, circumstances, and effects make it so, but a growing number of people and organizations seem to be treating surveillance – or at least watching, and certainly not all watching is surveillance – as a right which supersedes rights to privacy. We’ve seen this in the case of Google Glass – even before it was launched commercially – and more recently with the arguments over the ‘right to be forgotten’ in Europe, with personal privacy being counterposed to freedom of information and actions to protect privacy being compared to censorship. It’s all somewhat reminiscent of Dave Eggers’ novel, The Circle, in which a Facebook-Google-Apple-alike company completely turns social values around until, as one of the corporate slogans has it, “PRIVACY IS THEFT!”

The latest case is that of the use of drones / micro-UAVs / MAVs in the USA. The Federal Aviation Administration (FAA), the government body that controls US airspace, is trying to regulate the use of drones and has attempted to fine commercial operators who fly drones without its permission. The case revolves around one Raphael Pirker, who used an unlicensed drone to film a promotional video back in 2011. At the moment the FAA is appealing against a ruling by the National Transportation Safety Board (NTSB) that it could not fine Pirker because it did not have jurisdiction over small drones. Now the media has weighed in on Pirker’s side, arguing that the FAA’s stance infringes the First Amendment and creates a ‘chilling effect’ on journalism.

I’m really not sure about either argument. On the FAA side, this is as much about a bureaucracy trying to keep control of its regulatory territory as it is about the object of the regulation – the FAA does not want to be seen to be losing control just as the number of small drones is increasing massively.

On the other side, is this really about the rights of journalists? Pirker was making a commercial film, not covering a story, and the effect of the FAA’s ruling being overturned is more likely to be the opening of the door to a corporate free-for-all: an absurd PKDickian world of drones as far as the eye can see, with all the attendant crashes and legal battles. Think not? Well, back in the 1900s, people thought there would never be that many cars on the roads either… so it certainly is partly about the FAA’s mandate, i.e. air safety.

The big question here, as with Google Glass and with Search, is whether technological change makes a difference. Is a flying camera just the same as a hand-held camera? Does the greater potential for intrusion, or on the other hand the inability to know that one is being filmed, matter? Does the possibility that ‘the truth’ will be revealed justify any technological method used to obtain it? If not, which ones are acceptable, where is the line drawn, and who decides, and how? In the UK, the ‘public interest’ would be a good basis for deciding, as has been frequently alluded to in the Leveson Inquiry into phone hacking by Murdoch-owned newspapers; however, ‘public interest’ is a much vaguer term in the USA… what is certain is that conflicts around the ‘right to watch’ versus the ‘right to privacy’ and other human rights and social priorities are only going to intensify.

The computer did it…

It seems that ‘the computer did it’ is becoming as much a cliché in the early twenty-first century as ‘the butler did it’ was 100 years ago. There’s an interesting link by Cory Doctorow on bOING bOING to a blog post by one Pete Ashton about the already infamous ‘Keep Calm and Rape A Lot’ T-Shirts being sold through Amazon’s marketplace.

Computer-generated ‘Rape’ T-shirts sold via Amazon

However, the explanation given is incomplete in important ways. This is not to encourage people to attack Pete who, as his post explains, is not in any way connected to or responsible for the T-shirts or the company that produces them. But the explanation that ‘it was an algorithm that did this and the company didn’t know what was being produced until it was ordered’ is inadequate. Here’s why.

1. This was not simply a product of computer generation, nor do algorithms just spring fully formed from nature. All algorithms are written by humans (or by other programs, which are in turn produced by humans) and the use of an algorithm does not remove the need to check what the algorithm is (capable of) generating.

2. There was a specific number of verbs included in the algorithm for generating these T-shirt slogans (621 verbs, in fact). Even if they were generated by selecting all the 4 or 5 letter words in a dictionary of some sort, it’s not that hard to check a list of 621 verbs for words that will be offensive (see the sketch after this list).

3. The words following the verb were not even as random as this. In fact, they are specifically A LOT, HER, IN, IT, ME, NOT, OFF, ON, OUT and US. Several people have checked this. There are some very interesting words missing, notably HIM. This list is clearly a human selection and its choices reflect, if not deliberately misogynistic choices, then at the very least a patriarchal culture.
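To see how little effort that check would have taken, here is a minimal sketch of this kind of slogan generator. The verb list below is an abbreviated, illustrative stand-in, not the company’s actual code; the real product reportedly combined 621 verbs with the ten object words listed above:

```python
# Minimal sketch of a "KEEP CALM AND <VERB> <OBJECT>" slogan generator.
# VERBS is an abbreviated, illustrative stand-in for the reported list of 621.

VERBS = ["HUG", "KISS", "CALL", "HELP", "RAPE", "HIT"]
OBJECTS = ["A LOT", "HER", "IN", "IT", "ME", "NOT", "OFF", "ON", "OUT", "US"]

# Screening even 621 verbs against a blocklist is a one-line check,
# and one that whoever wrote the original generator never added.
BLOCKLIST = {"RAPE", "HIT"}

slogans = [
    f"KEEP CALM AND {verb} {obj}"
    for verb in VERBS
    if verb not in BLOCKLIST
    for obj in OBJECTS
]

print(f"{len(slogans)} slogans generated, e.g. {slogans[0]!r}")
```

The point is not this particular filter but that every part of the pipeline involved human choices: someone chose the verbs, someone chose the objects, and someone chose not to write the check. ‘The computer did it’ obscures all three.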

Algorithms, as cultural products, are always political. They are never neutral, even when they appear to be doing entirely unremarkable things. In most cases the politics of algorithms is entirely banal, but in some, as in this case, it becomes accidentally visible. T-shirts may be a minor issue, but what matters much more is that we do not accept ‘the computer did it’ as an infallible explanation when it comes to rather more consequential things: all the way from insurance and credit rating through police stop-and-search and no-fly lists to assassination by drone. Otherwise, before we know it, the opportunity to question the politics will be buried in code and cabling.

Drones Over America

EPIC has obtained evidence under the Freedom of Information Act from the US Department of Homeland Security that it has fitted Predator drones with domestic espionage capabilities. The document, Performance Specification for the US Customs and Border Protection Unmanned Aerial System (UAS) Version 2.4, dated March 10, 2010, includes the following technical requirements: infra-red sensors and communications, plus either synthetic aperture radar (SAR), Ground Moving Target Indicator mode (GMTI – tracking) or signals interception receivers (page 7). The UAV should:

be “capable of tracking an adult human-sized, single moving object” with sufficient accuracy “to allow target designation at the specific ranges.” (page 28)

“be able to maintain constant surveillance and track on a designation geographic point.” (page 28)

The section on ‘target marking’ is redacted in EPIC’s version; however, the CNET website managed to get hold of a non-redacted version, which says that the system “shall be capable of identifying a standing human being at night as likely armed or not,” and specifies “signals interception” technology for mobile phone frequencies as well as “direction finding” to enable the UAS to locate them.

And in case people are wondering whether this is just for border patrol, the document specifically states that it is for collection of “Intelligence Surveillance and Reconnaissance (ISR) data in support of Department of Homeland Security (DHS) and CBP missions” (page 1). I hope all you US people know exactly how you can challenge drones flying at 20,000 feet up that might be breaching your 4th Amendment rights…