January 2010


SAMURAI is a “next generation” CCTV system capable of identifying and tracking individuals “acting suspiciously” in crowded public spaces. The project has received €2.5 million in EU funding under the FP7 security research programme.

Unlike its ninja namesake, SAMURAI uses computer algorithms to profile people’s behaviour. The system also claims to learn about how people “usually behave” in the environments where “smart CCTV” is deployed. As SAMURAI researchers explained to New Scientist magazine, the system “is designed to issue alerts when it detects behaviour that differs from the norm, and adjusts its reasoning based on feedback. So an operator might reassure the system that the person with a mop appearing to loiter in a busy thoroughfare is no threat. When another person with a mop exhibits similar behaviour, it will remember that this is not a situation that needs flagging up”.
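
The New Scientist description amounts to anomaly detection with an operator-feedback loop: observed behaviour is scored against a learned model of what is “normal” for that location, and alerts that the operator dismisses are folded back into the model. Below is a purely illustrative sketch of such a loop in Python. To be clear, this is not SAMURAI’s actual algorithm, which has not been published; the feature vectors, nearest-neighbour scoring and threshold are all assumptions made for the sake of the example.

```python
import numpy as np


class FeedbackAnomalyDetector:
    """Toy nearest-neighbour anomaly detector with an operator-feedback loop.

    Each observed behaviour is assumed to arrive (from upstream video
    analysis) as a numeric feature vector, e.g. dwell time, walking speed,
    trajectory irregularity. Behaviour far from every previously accepted
    example is flagged; when an operator marks an alert as harmless, that
    example joins the "normal" set so similar behaviour is not flagged again.
    """

    def __init__(self, threshold=2.0):
        self.threshold = threshold
        self.normal = []  # feature vectors the system treats as ordinary

    def score(self, features):
        """Distance to the closest known-normal example; large means unusual."""
        if not self.normal:
            return float("inf")
        return min(float(np.linalg.norm(features - n)) for n in self.normal)

    def observe(self, features):
        """Return True if the behaviour should be flagged to an operator."""
        if self.normal and self.score(features) > self.threshold:
            return True
        self.normal.append(features)  # quietly absorb ordinary behaviour
        return False

    def operator_feedback(self, features, is_threat):
        """'No threat' feedback folds the example into the normal model."""
        if not is_threat:
            self.normal.append(features)


# Hypothetical features: [dwell time (minutes), speed (m/s), path irregularity]
detector = FeedbackAnomalyDetector(threshold=2.0)
detector.observe(np.array([0.5, 1.4, 0.1]))             # commuter: seeds the model
cleaner = np.array([8.0, 0.2, 0.9])                     # loitering with a mop
if detector.observe(cleaner):                           # flagged as unusual
    detector.operator_feedback(cleaner, is_threat=False)
print(detector.observe(np.array([7.5, 0.3, 0.8])))      # False: no longer flagged
```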

Here’s the demonstration video:

The project is led by Queen Mary, University of London. Partners in the EU-funded SAMURAI consortium include BAA, the Spanish-owned British airports group; UK defence contractor Waterfall Solutions Ltd.; and Elsag Datamat, the surveillance-tech subsidiary of Italian arms giant Finmeccanica.

For more information see: SAMURAI project website and “Smart CCTV learns to spot suspicious types” (New Scientist, 15.12.2009).

You can't appeal to robots for mercy or empathy - or punish them afterwards

Two interesting articles examining the development and implementation of combat robots of various sorts were published this month. In “The age of the killer robot is no longer a sci-fi fantasy” (Independent), Johann Hari considers the growing army of 12,000 robots used by the USA in some 33,000 military operations per year and offers the following conclusion:

Imagine if the beaches at Dover and the skies over Westminster were filled with robots controlled from Tora Bora, or Beijing, and could shoot us at any time. Some would scuttle away – and many would be determined to kill “their” people in revenge. The Lebanese editor Rami Khouri says that when Lebanon was bombarded by largely unmanned Israeli drones in 2006, it only “enhanced the spirit of defiance” and made more people back Hezbollah.

Is this a rational way to harness our genius for science and spend tens of billions of pounds? The scientists who were essential to developing the nuclear bomb – including Albert Einstein, Robert Oppenheimer, and Andrei Sakharov – turned on their own creations in horror and begged for them to be outlawed. Some distinguished robotics scientists, like Illah Nourbakhsh, are getting in early, and saying the development of autonomous military robots should be outlawed now.

There are some technologies that are so abhorrent to human beings that we forbid them outright. We have banned war-lasers that permanently blind people along with poison gas. The conveyor belt dragging us ever closer to a world of robot wars can be stopped – if we choose to.

The second article, “Israeli Robots Remake Battlefield” by Charles Levinson (Wall Street Journal), can only dampen Hari’s optimism. Levinson argues that the growing Israeli army of “robotic fighting machines” offers a “window onto the potential future of warfare”, with “over 40 countries now said to have military-robotics programs”.

‘Highlights’ from the article, for want of a better term, include:

  • Among the recently deployed technologies that set Israel ahead of the curve is the Guardium unmanned ground vehicle, which now drives itself along the Gaza and Lebanese borders. The Guardium was deployed to patrol for infiltrators in the wake of the abduction of soldiers doing the same job in 2006. The Guardium, developed by G-nius Ltd., is essentially an armored off-road golf cart with a suite of optical sensors and surveillance gear. It was put into the field for the first time 10 months ago.
  • In the Gaza conflict in January 2009, Israel unveiled remote-controlled bulldozers.
  • Within the next year, Israeli engineers expect to deploy the voice-commanded, six-wheeled Rex robot, capable of carrying 550 pounds of gear alongside advancing infantry.
  • After bomb-laden fishing boats tried to take out an Israeli Navy frigate off the coast of Gaza in 2002, Rafael designed the Protector SV, an unmanned, heavily armed speedboat that today makes up a growing part of the Israeli naval fleet. The Singapore Navy has also purchased the boat and is using it in patrols in the Persian Gulf.
  • Unlike the U.S. and other militaries, where UAVs are flown by certified, costly-to-train fighter pilots, Israeli defense companies have recently built their UAVs to allow an average 18-year-old recruit with just a few months’ training to pilot them.
  • “The Israelis do it differently, not because they’re more clever than we are, but because they live in a tough neighborhood and need to respond fast to operational issues,” says Thomas Tate, a former U.S. Army lieutenant colonel who now oversees defence cooperation between the U.S. and Israel.
  • In 2009, for the first time, the U.S. Air Force trained more “pilots” for unmanned aircraft than for manned fighters and bombers.

The Civil Contingencies 2010 conference was held in London earlier this week. It featured “20 expert speakers” and “numerous carefully selected suppliers” in the business of preparing for “major disruptions”. The discussion ranged from “the current flu pandemic to severe weather, widespread flooding, the risks posed by a changing climate and malicious threats”.

The event promised “a crucial opportunity for delegates to connect with speakers, policy setters and key drivers of the government agenda”, with the exhibition area offering “unrivalled opportunities to network with over 25 suppliers, service providers and stakeholders”.

Civil Contingencies 2010 - Tackling tomorrow's threats

Article by David Cronin for Inter Press Service, published 13.11.2010, reproduced in full here:

BRUSSELS: Aid traditionally reserved for keeping victims of war and disasters alive may now be used for security-related projects such as the fingerprinting of refugees, European Union officials have decided.

Although the European Commission’s humanitarian office (ECHO) regularly publishes statements detailing how much food, medicines or blankets it gives to people in distress, it has drawn no attention to a widening in the scope of its activities in recent years. Through a partnership with the United Nations’ Refugee Agency (UNHCR), the office has been financing the development of a computer system designed to store the fingerprints and other biometric data of refugees.

An internal ECHO paper from September 2009 suggests that support for such activities is necessary as part of an “innovative” approach towards improving the response of international agencies to crises.

But civil liberties activists are perturbed that humanitarian aid is being used to extend fingerprinting, a technique universally associated with criminal investigations, to refugee management projects. “If the EU wants to finance security projects, it should be doing so from money earmarked for security projects (rather than from humanitarian aid),” Ben Hayes from the organisation Statewatch told IPS.

Through a project known as Profile, the UNHCR has registered the fingerprints of more than 2.5 million refugees in some 20 countries since 2004. This project has received some four million euros (six million dollars) from the humanitarian aid section of the EU’s budget. As well as taking fingerprints, the UNHCR has stored images of the eyes of Afghan refugees who were returning to their home country after fleeing to neighbouring Pakistan. Identity documents are issued to refugees as part of the project, in cooperation with the governments in the countries where the refugees are located.

The UNHCR is also implementing a related project known as ProGres with the software giant Microsoft. While this relies mainly on basic data such as the names and birth dates of refugees, UNHCR sources say that biometric indicators are being stored in it on a trial basis in several countries. “There is considerable thought on expanding its use,” said one source, speaking on condition of anonymity.

The UNHCR’s decision to resort to fingerprinting has been made despite previous concerns expressed by the organisation that refugees could be unfairly stigmatised if techniques associated with criminal investigations are widened to asylum and migration policies. The agency has, for example, been critical of the way the EU’s own system for fingerprinting asylum-seekers has evolved. Known as Eurodac, this system was originally confined to preventing asylum claims from being lodged in more than one EU member state, but the European Commission formally recommended last year that law enforcement agencies should have access to this database.

Gilles Van Moortel, the UNHCR’s Brussels spokesman, said that the agency has drawn up guidelines stating that police will not be able to scrutinise its fingerprinting files. “Sharing this kind of information for law enforcement purposes would not be in keeping with the spirit of our work,” he added. “Our registration of asylum-seekers and refugees is purely being done for the purpose of international protection. While we fully understand the need for security, we are against the sharing of such data with law enforcement authorities.”

Hayes from Statewatch, however, described the UNHCR’s assurance as “meaningless”, given the history of the Eurodac system. “Once these things get big, their appeal for law enforcement agencies can become huge,” he added. “It becomes very difficult to resist calls that law enforcement agencies should have access to them.”

Ross Anderson, a specialist in computer technology with Cambridge University in Britain, said that while international aid organisations have long been involved in handing out ID cards, “poorly designed systems can do great harm.” He cited the situation in Rwanda during the 1990s, where people designated as Tutsis on official documents became victims of genocide, as an example of why great care is needed when ID systems are being set up.

John Clancy, the European Commission’s spokesman on humanitarian affairs, said that supporting the fingerprinting system is “not in any way a departure from ECHO’s traditional role” of providing emergency relief. “An effective registration system is crucial for refugees because it allows them to have their status clearly established and their rights respected,” he said. “They gain access to humanitarian assistance, social services in the host country and sometimes even local employment.”

Kathrin Schick from Voluntary Organisations in Cooperation in Emergencies (VOICE), a grouping of relief agencies, said that she had no difficulty with the principle of refugee registration. “It is very often forgotten by the person on the street that humanitarian aid is not just about food and milk,” she said. “It is also about ensuring that people are protected. It is very important to stress that humanitarian aid involves both protection and assistance.”

But Simon Stocker from the anti-poverty campaign group Eurostep said that the use of humanitarian aid for security projects “could be seen as compromising.” ECHO, he noted, is officially committed to ensuring that its activities are focused purely on relieving the distress of vulnerable people and that they are independent of more strategic political considerations.

From the BBC Press Office: A new two-part series for BBC Radio 4, presented by Stephen Sackur, considers a crucial, but often hidden, revolution in the way in which wars are fought. Ever-more autonomous robotic machines, often controlled from far beyond the battlefield, are becoming steadily more popular with the military and other agencies. The pilotless drone aircraft, for example, has become key to current conflicts such as the one in Afghanistan. Whether in the air or on the ground, such machines are seen as offering huge military advantages – less exposure of soldiers to danger, as well as quicker, cheaper and more effective forms of defence and attack.

But the implications of this revolution are controversial. Are countries more likely to fight wars if their personnel are not put in danger? What happens if machines malfunction? How can autonomous machines be held accountable for their actions according to the laws of war?

Stephen questions military figures, including the most senior RAF figure responsible for strategy on pilotless aircraft, those who operate drones for the RAF, manufacturers of military robots and experts in the field. He explores what is already happening as military robotics expand – and what might happen in the future.

The first programme focuses on the huge impact of drones, used not only by the British and US in Afghanistan but also – highly controversially – by the CIA in attacks on targets in Pakistan. One former CIA employee reveals how the use of drones in so-called targeted assassinations has divided the organisation.

Episode 1 is on Monday 1 February 2010 at 20:00 on BBC Radio 4.

Presenter/Stephen Sackur, Producer/Chris Bowlby

Drone

The Guardian reports today that “Police in the UK are planning to use unmanned spy drones, controversially deployed in Afghanistan” as part of a “national strategy developed by arms manufacturer BAE Systems” and “a consortium of government agencies”. You can read the full article here.

The “national strategy” to which the Guardian refers is actually the ASTRAEA project, a £32 million ‘public-private’ partnership that has been funded as part of the UK’s National Aerospace Technology Strategy. ‘NATS’, as the strategy is known, is an industry-led government initiative adopted in 2004. By the end of 2008, the initiative had attracted some £464 million in collaborative R&D funding for 70 individual programmes.

So while none of this is exactly ‘news’, credit to the Guardian for its freedom of information request and provocative reporting. The comments on its article certainly show the strength of feeling against the use of drones/UAVs in the UK.

By way of clarification, there are actually two types of unmanned aerial vehicle (UAV): (i) the armed and unarmed ‘drone’ planes to which the Guardian report refers, and (ii) much smaller miniature spy planes. The latter are basically remote-controlled aircraft fitted with cameras and are already in use in the UK and other countries.

Annotated image labeling components of police drone

The military drones the Guardian is reporting on are currently prohibited from flying in European airspace because of well-founded concerns about potential collisions with traditional aircraft. The air traffic control community is particularly suspicious, and demands that UAVs adhere to the same safety standards as their manned counterparts – standards which some argue make UAV systems too expensive to implement. I’d be very surprised if the “sense and avoid” systems for this kind of drone are licensed in time for the 2012 Olympics, but governments and the aerospace industry are certainly throwing money at the problem and can be relied upon to lobby hard once the technology is in place.
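
For readers unfamiliar with the jargon, “sense and avoid” refers to the requirement that an unmanned aircraft detect other air traffic and keep itself clear of it without a pilot on board. As a rough illustration only, the geometric check at the heart of any such system is a closest-point-of-approach calculation between the drone and a detected aircraft; the sketch below is a toy example with assumed positions, velocities and an arbitrary separation minimum, not the logic of any certified system.

```python
import numpy as np


def closest_point_of_approach(own_pos, own_vel, other_pos, other_vel):
    """Time (s) to, and separation (m) at, closest approach, assuming straight-line flight."""
    rel_pos = np.asarray(other_pos, dtype=float) - np.asarray(own_pos, dtype=float)
    rel_vel = np.asarray(other_vel, dtype=float) - np.asarray(own_vel, dtype=float)
    speed_sq = float(rel_vel @ rel_vel)
    if speed_sq == 0.0:                      # no relative motion: separation never changes
        return 0.0, float(np.linalg.norm(rel_pos))
    t_cpa = max(0.0, -float(rel_pos @ rel_vel) / speed_sq)  # clamp: only look ahead
    miss_distance = float(np.linalg.norm(rel_pos + rel_vel * t_cpa))
    return t_cpa, miss_distance


# Toy encounter: drone flying east at 30 m/s, a converging light aircraft nearby.
t, miss = closest_point_of_approach(
    own_pos=(0, 0, 300), own_vel=(30, 0, 0),
    other_pos=(4000, 2000, 320), other_vel=(-50, -30, 0),
)
SEPARATION_MINIMUM = 500.0  # metres; an arbitrary figure for the example
if miss < SEPARATION_MINIMUM and t < 120:
    print(f"Conflict predicted in {t:.0f}s, miss distance {miss:.0f} m: manoeuvre to avoid")
```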

Dutch counter-terrorism game

On 22 December 2009 Stichting DubbelX-Alternative View in Amsterdam launched a new internet game about the EU’s Security and Defence Policy. It was developed with the support of the ‘Europe Fund’ of the Dutch Ministry of Foreign Affairs. Its objective is “to reach people who don’t read books or articles about the subject but who may be tempted to learn something about the subject through playing a game”.

The game is available on www.benjij007.nl (are you 007?). You need to understand Dutch to play it (an English translation is apparently on its way). “In the game the EU Anti-Terrorism Coordinator sends you on a mission to prevent a terrorist attack with a dirty nuclear bomb on one of Europe’s cities. While chasing the terrorists around the world you are fed with information about the EU’s security and defence policy. The game ends with a report about your qualities as an agent, invites you to deepen your knowledge and gives some suggestions to do it”.

Useful educational and counter-terrorism tool or shameless piece of state-sponsored, fear-mongering propaganda?

If only I could speak Dutch…

One of the reasons for setting up this blog is to show how the arms industry is trying to cash in on all things “security”. This includes everything from pandemics to paramedics. DITSEF is a new three-year, €2.8 million EU-funded project on “Digital and innovative technologies for security and efficiency of first responders operation” (Project Reference: 225404).

The DITSEF consortium claims that “The main problem of First Responders (FR) (fire fighters, police, etc…) in case of crisis at critical infrastructures are the loss of communication and location and the lack of information about the environment (temperature, hazardous gases, etc.)”.

The DITSEF project is led by Sagem Défense Sécurité and includes European arms giants Finmeccanica and EADS, as well as TNO, the Dutch defence research institute. The consortium does not include any “first responders”.

The Council of Europe (not to be confused with the European Union) is to discuss “Faked pandemics: a threat to health” at the next plenary session of the Parliamentary Assembly of the Council of Europe (PACE), to be held in Strasbourg from 25 to 29 January.

The PACE Social Affairs Committee has proposed the holding of an urgent debate on this subject. If the Assembly agrees when it adopts its agenda on the opening day, the debate is likely to be held on the morning of Thursday 28 January. The committee will be holding a closed hearing on the same subject on Tuesday 26 January at 8.30 am, attended by representatives of the World Health Organisation (WHO), the European pharmaceutical industry and experts on the subject.

See also: EU to probe pharma over “false pandemic” (PharmaTimes).

The recently released synopsis for the EU contract for the €3.3 million ARGUS 3D project, led by SELEX Sistemi Integrati (a Finmeccanica company), is so badly written that it is difficult to ascertain exactly what the project is about:

Objective: The project aims to improve the detection of manned and unmanned platforms by exploiting the treatment of more accurate information of cooperative as well as non-cooperative flying objects, in order to identify potentially threats. The scope will be reached by managing the 3D position data in region including extended border lines and large areas, 24 hours a day and in all weather conditions, derived from enhanced existing Primary Surveillance Radar (PSR), together whit conventional data and information coming via various passive radar technique in order to extend the airspace coverage and to enhance the target recognition capability of the surveillance systems. Thus, the security could be enhanced in large areas, at sustainable costs, by improving the recognition of non-cooperative target through more accurate information on it s characteristics and/or more accurate positioning.

The project clearly concerns the development of some kind of radar system. Perhaps they mean the protection of manned and unmanned platforms, in which case it has something to do with oil rigs and suchlike? Or maybe it’s something to do with the detection of unmanned aerial vehicles (drones) that pose some kind of threat? Whatever this project is about, it smells a whole lot like military research, in which case it clearly should not have been funded under the ESRP. Here’s a Northrop Grumman Aerospace Systems press release that may or may not be about the same kind of technology. Any information as to what this is actually all about would be gratefully received…
