The US Federal Aviation Administration (FAA) released a roadmap for the introduction of drones (UAVs) into US airspace (PDF) last week. Whether things actually unfold this way is open to question, as previous deadlines have been consistently missed. However, the FAA also released a list of requirements for meeting privacy standards (PDF).
Ryan Calo, writing in Forbes, says that the FAA’s plan for privacy from drone surveillance is ‘pretty sensible’, which it is if you consider that the FAA’s primary job is, as it repeats throughout the document in response to comments demanding greater privacy protections, “to provide the safest, most efficient aerospace system in the world.” As Marc Rotenberg has pointed out, the problem is that it wasn’t the FAA’s plan at all, but the result of external pressure from EPIC, the ACLU and others.
The right time to intervene to strengthen human rights against the threats posed by new technologies is always before those technologies are introduced. If this is not done, their widespread use can shift what people understand as ‘normal’ and reduce expectations of privacy (and other rights), and it is much more difficult to legislate retrospectively. We’ve seen this with public-space CCTV. In other words, while technologies do not determine social relations, they will interact with people, individually and in groups, in both positive and negative ways, and the job of politics and of policy is to ensure that the positive effects are maximized and the negative ones minimized*. This means federal regulation. However, given the way in which the US favours private over state intervention, it’s not surprising that this is an unpopular approach there.
The likely result of the FAA’s weak set of privacy requirements for drone operations, absent real oversight, is privacy protection as a bureaucratic box-ticking exercise.
*It should also be noted that this includes the possibility of not allowing a particular technology to be used in public space at all, if its negative effects cannot be minimized to a level that is not harmful to socially desirable goals. Not enough attention is paid to the possibility of simply saying ‘no’ to the public use of a given technology.