Tag Archives: CIA drone war

Spreading the War Bug

Foreign Policy reported recently that key officials within the Trump administration are “pushing to broaden the war in Syria, viewing it as an opportunity to confront Iran and its proxy forces on the ground there”. The strategy was being advocated over objections from the Pentagon, but it doesn’t seem to be deterring the White House.  As the Washington Post made clear just a few days ago, Iranian and US forces have already been directly clashing in the region, and officials are busy planning the “next stage” of the Syria war once Isis is defeated – a plan that centers around directly attacking the Iranians….

Just this weekend, Politico quoted key Republican senator Tom Cotton saying: “The policy of the United States should be regime change in Iran.” The CIA has already expanded its Iranian covert operations, while the main White House liaison to intelligence agencies, Ezra Cohen-Watnick, has reportedly “told other administration officials that he wants to use American spies to help oust the Iranian government”. And US secretary of state Rex Tillerson, in little-noticed comments to Congress last week, called for “regime change” in Iran as well (albeit a “peaceful” one – whatever that means)…

The Trump administration’s plans may not stop in Syria either. Some officials have allegedly also been pushing for the Pentagon to step up its support of Saudi Arabia’s appalling war in Yemen, which has left 20 million people on the verge of starvation – all to go after Iranian-backed forces in the region as well.

All this comes as the Trump administration ramps up war across the Middle East. It is conducting drone strikes at a rate almost four times that of the Obama administration; civilian deaths from US forces in Syria have skyrocketed; special operations in Somalia have been ramping up; and the Pentagon is sending thousands more troops to Afghanistan.

Excerpt from: Trevor Timm, Donald Trump’s bloodlust for war in the Middle East risks chaos, Guardian, June 27, 2017


Drone War 2014: transparency for covert lethal action

Image: training in Djibouti (from Wikipedia)

A UN counter-terrorism expert has published the second report of his year-long investigation into drone strikes, highlighting 30 strikes where civilians are reported to have been killed.  The report, by British lawyer Ben Emmerson QC, identifies 30 attacks between 2006 and 2013 that show sufficient indications of civilian deaths to demand a ‘public explanation of the circumstances and the justification for the use of deadly force’ under international law.

Emmerson analysed 37 strikes carried out by the US, UK and Israel in Afghanistan, Pakistan, Yemen, Somalia and Gaza, to arrive at a ‘sample’ of strikes that he believes those nations have a legal duty to explain.

Britain and the US conduct strikes as part of the armed conflict in Afghanistan, and the US also conducts covert strikes in Pakistan, Yemen and Somalia.  Although Israel has never officially acknowledged using armed drones, Emmerson met with Israeli officials in the course of preparing his report and lists seven attacks in Gaza among those requiring investigation.

This report expands on an argument for the legal obligation for states to investigate and account for credible claims of civilian casualties, which Emmerson first laid out in his previous report, presented to the UN General Assembly in October 2013.

He writes: ‘in any case in which there have been, or appear to have been, civilian casualties that were not anticipated when the attack was planned, the State responsible is under an obligation to conduct a prompt, independent and impartial fact-finding inquiry and to provide a detailed public explanation of the results.’

A February 2010 attack in Afghanistan serves as a ‘benchmark’ of the kind of disclosure that should follow claims of civilian casualties. After a US drone attack on a convoy of trucks reportedly killed up to 23 civilians, the International Security Assistance Force (Isaf), which runs international operations in Afghanistan, partially declassified the findings of its internal investigation. Emmerson writes that this report strongly criticised the crew’s actions and revealed ‘a propensity to “kinetic activity” [lethal action]’.  This level of transparency is rare.

The most recent incident featured in Emmerson’s report is a December 2013 attack that hit a wedding procession near Rada’a in Yemen, killing at least 12. Multiple sources have identified numerous civilian casualties among the dead, including a Human Rights Watch investigation published last week.   Three unnamed US officials told Associated Press after the publication of Human Rights Watch’s report that an internal investigation had found only alleged militants were killed – but no results of this investigation have yet been officially released.

Information is particularly scarce for activity in Somalia, Emmerson notes. The only strike from the country in the report is the February 2012 strike that killed former British citizen Mohamed Sakr, whose case the Bureau has reported on as part of its investigation into the British government’s deprivation of citizenship.

Neither the US nor the UK routinely publish details of their drone operations. The UK states that it has killed civilians in only one incident in Afghanistan, a March 2011 strike that killed four civilians.  The US has repeatedly dismissed the Bureau’s estimate that at least 400 civilians have died in Pakistan drone strikes as ‘ludicrous’; the CIA director John Brennan has said that claims of high civilian casualties amount to ‘disinformation’.

Emmerson notes that operations that kill civilians are not necessarily illegal under international law, but states have a duty of transparency where there are credible allegations of non-combatants being harmed.  The report does not take a position on the legality of drone strikes away from the battlefield, but says there is an ‘urgent and imperative need’ for international agreement on the legal arguments advanced in favour of covert lethal action.

The US has argued that its strikes are legal on two grounds: they are legitimate acts of self-defence against an imminent threat, and they are part of an armed conflict against an enemy, al Qaeda, and its ‘associated forces’. Emmerson asks a series of questions – about the thresholds for action in self-defence, the definition of ‘imminent’ threat, al Qaeda’s current state, and more – on which he says the international community must reach consensus.  Last week the European Parliament voted 534 to 49 in favour of a motion calling on the EU to develop a ‘common position’ on drone strikes and other targeted killings.  To date, Europe has remained largely silent on the issue, but the motion expressed ‘grave concern’ over drone strikes ‘outside the international legal framework’ and called on member states not to ‘facilitate such killings by other states’.

The UK has refused to clarify whether it shares intelligence with the US that could lead to drone strikes in Pakistan; in January the Court of Appeal ruled that any attempt to force the government to disclose such information could endanger international relations. In December, Emmerson told a meeting in parliament that such intelligence-sharing is ‘inevitable’ owing to the closeness of the relationship between the US and UK. ‘It would be absurd if it were not the case,’ he added.

Alice K. Ross, UN report identifies 30 drone strikes that require ‘public explanation’, Bureau of Investigative Journalism, Mar. 1, 2014

The Hunter and Killer Algorithmic Drones

Image: Aladin drone of the German army, 2008 (from Wikipedia)

The Pentagon is discussing the possibility of replacing human drone operators with computer algorithms, especially for ‘signature strikes’ where unknown targets are killed simply because they meet certain criteria. So what characteristics define an ‘enemy combatant’ and where are they outlined in law?

Drone strikes and targeted killings have become the weapon of choice for the Obama administration in their ongoing war against terrorists. But what impact is this technology having, not only on those who are the targets (both intended and unintended), but on the way we are likely to wage war in the future?

John Sifton, advocacy director for Asia at Human Rights Watch, says that while drones are currently controlled remotely by trained military personnel, there are already fears that the roving killing machines could be automated in the future.  ‘One of the biggest concerns human rights groups have right now is the notion of a signature strike,’ he says. ‘[This is] the notion that you could make a decision about a target based on its appearance. Say—this man has a Kalashnikov, he’s walking on the side of the road, he is near a military base. He’s a combatant, let’s kill him. That decision is made by a human right now, but the notion that you could write an algorithm for that and then program it into a drone… sounds science fiction but is in fact what the Pentagon is already thinking about. There are already discussions about this, autonomous weapons systems.’ ‘That is to human rights groups the most terrifying spectre that is currently presented by the drones.’

Sarah Knuckey is the director of the Project on Extrajudicial Executions at New York University Law School and an advisor to the UN. She says the way that drones are used to conduct warfare is stretching the limits of previous international conventions and is likely to require new rules of engagement to be drawn up… The rules of warfare built up after World War II to protect civilians are already hopelessly outdated, she says. The notion of border sovereignty has already been trashed by years of drone strikes, which she estimates have targeted upwards of 3,000 individuals, with reports of between 400 and 800 civilian casualties.

Excerpt from Annabelle Quince, Future of drone strikes could see execution by algorithm, May 21, 2013

Military Robots and Automated Killing

Military robots come in an astonishing range of shapes and sizes. DelFly, a dragonfly-shaped surveillance drone built at the Delft University of Technology in the Netherlands, weighs less than a gold wedding ring, camera included. At the other end of the scale is America’s biggest and fastest drone, the $15m Avenger, the first of which recently began testing in Afghanistan. It uses a jet engine to carry up to 2.7 tonnes of bombs, sensors and other types of payload at more than 740kph (460mph).

On the ground, robots range from truck-sized to tiny. TerraMax, a robotics kit made by Oshkosh Defense, based in Wisconsin, turns military lorries or armoured vehicles into remotely controlled or autonomous machines. And smaller robotic beasties are hopping, crawling and running into action, as three models built by Boston Dynamics, a spin-out from the Massachusetts Institute of Technology (MIT), illustrate.  By jabbing the ground with a gas-powered piston, the Sand Flea can leap through a window, or onto a roof nine metres up. Gyro-stabilisers provide smooth in-air filming and landings. The 5kg robot then rolls along on wheels until another hop is needed—to jump up some stairs, perhaps, or to a rooftop across the street. Another robot, RiSE, resembles a giant cockroach and uses six legs, tipped with short, Velcro-like spikes, to climb coarse walls. Biggest of all is the LS3, a four-legged dog-like robot that uses computer vision to trot behind a human over rough terrain carrying more than 180kg of supplies. The firm says it could be deployed within three years.

Demand for land robots, also known as unmanned ground vehicles (UGVs), began to pick up a decade ago after American-led forces knocked the Taliban from power in Afghanistan. Soldiers hunting Osama bin Laden and his al-Qaeda fighters in the Hindu Kush were keen to send robot scouts into caves first. Remote-controlled ground robots then proved enormously helpful in the discovery and removal of makeshift roadside bombs in Afghanistan, Iraq, and elsewhere. Visiongain, a research firm, reckons a total of $689m will be spent on ground robots this year. The ten biggest buyers, in descending order, are America, Israel (a distant second), Britain, Germany, China, South Korea, Singapore, Australia, France and Canada.

Robots’ capabilities have steadily improved. Upload a mugshot into an SUGV, a briefcase-sized robot that runs on caterpillar tracks, and it can identify a man walking in a crowd and follow him. Its maker, iRobot, another MIT spin-out, is best known for its robot vacuum cleaners. Its latest military robot, FirstLook, is a smaller device that also runs on tracks. Equipped with four cameras, it is designed to be thrown through windows or over walls.

Another throwable reconnaissance robot, the Scout XT Throwbot made by Recon Robotics, based in Edina, Minnesota, was one of the stars of the Ground Robotics Capabilities conference held in San Diego in March. Shaped like a two-headed hammer with wheels on each head, the Scout XT has the heft of a grenade and can be thrown through glass windows. Wheel spikes provide traction on steep or rocky surfaces. In February the US Army ordered 1,100 Scout XTs for $13.9m. Another version, being developed with the US Navy, can be taken to a ship inside a small aquatic robot, and will use magnetic wheels to climb up the hull and onto the deck, says Alan Bignall, Recon’s boss.

Even more exotic designs are in development. DARPA, the research arm of America’s Department of Defence, is funding the development of small, soft robots that move like jerky slithering blobs. EATR, another DARPA project, is a foraging robot that gathers leaves and wood for fuel and then burns it to generate electricity. Researchers at Italy’s Sant’Anna School of Advanced Studies, in Pisa, have designed a snakelike aquatic robot. And a small helicopter drone called the Pelican, designed by German and American companies, could remain aloft for weeks, powered by energy from a ground-based laser….

A larger worry is that countries with high-performance military robots may be more inclined to launch attacks. Robots protect soldiers and improve their odds of success. Using drones sidesteps the tricky politics of putting boots on foreign soil. In the past eight years drone strikes by America’s Central Intelligence Agency (CIA) have killed more than 2,400 people in Pakistan, including 479 civilians, according to the Bureau of Investigative Journalism in London. Technological progress appears to have contributed to an increase in the frequency of strikes. In 2005 CIA drones struck targets in Pakistan three times; last year there were 76 strikes there. Do armed robots make killing too easy?

Not necessarily… Today’s drones, blimps, unmanned boats and reconnaissance robots collect and transmit so much data, says Ms Cummings, a robotics researcher at MIT, that Western countries now practise “warfare by committee”. Government lawyers and others in operation rooms monitor video feeds from robots to call off strikes that are illegal or would “look bad on CNN”. And unlike pilots at the scene, these remote observers are unaffected by the physical toll of flying a jet or the adrenalin rush of combat.

In March Britain’s Royal Artillery began buying robotic missiles designed by MBDA, a French company. The Fire Shadow is a “loitering munition” capable of travelling 100km, more than twice the maximum range of a traditional artillery shell. It can circle in the sky for hours, using sensors to track even a moving target. A human operator, viewing a video feed, then issues an instruction to attack, fly elsewhere to find a better target, or abort the mission by destroying itself. But bypassing the human operator to automate attacks would be, technologically, in the “realm of feasibility”, an MBDA spokesman says……

Traditional rules of engagement stipulate that a human must decide if a weapon is to be fired. But this restriction is starting to come under pressure. Already, defence planners are considering whether a drone aircraft should be able to fire a weapon based on its own analysis. In 2009 the authors of a US Air Force report suggested that humans will increasingly operate not “in the loop” but “on the loop”, monitoring armed robots rather than fully controlling them. Better artificial intelligence will eventually allow robots to “make lethal combat decisions”, they wrote, provided legal and ethical issues can be resolved…..

Pressure will grow for armies to automate their robots if only so machines can shoot before being shot, says Jürgen Altmann of the Technical University of Dortmund, in Germany, and a founder of the International Committee for Robot Arms Control, an advocacy group. Some robot weapons already operate without human operators to save precious seconds. An incoming anti-ship missile detected even a dozen miles away can be safely shot down only by a robot, says Frank Biemans, head of sensing technologies for the Goalkeeper automatic ship-defence cannons made by Thales Nederland.

Admittedly, that involves a machine destroying another machine. But as human operators struggle to assimilate the information collected by robotic sensors, decision-making by robots seems likely to increase. This might be a good thing, says Ronald Arkin, a roboticist at the Georgia Institute of Technology, who is developing “ethics software” for armed robots. By crunching data from drone sensors and military databases, it might be possible to predict, for example, that a strike from a missile could damage a nearby religious building. Clever software might be used to call off attacks as well as initiate them.

In the air, on land and at sea, military robots are proliferating. But the revolution in military robotics does have an Achilles heel, notes Emmanuel Goffi of the French air-force academy in Salon-de-Provence. As robots become more autonomous, identifying a human to hold accountable for a bloody blunder will become very difficult, he says. Should it be the robot’s programmer, designer, manufacturer or human overseer, or that overseer’s superiors? It is hard to say. The backlash from a deadly and well-publicised mistake may be the only thing that can halt the rapid march of the robots.

Robots go to war: March of the robots, Economist Technology Quarterly, June 2, 2012, at 13

See also Boston Dynamics

UK to Blame for the CIA Drone War?

A human rights group and a law firm took legal action Monday (March 12, 2012) against the British government, accusing it of passing on intelligence to assist U.S. covert drone attacks in Pakistan.  The London-based charity Reprieve and the law firm Leigh Day & Co. are filing papers to the High Court claiming that civilian staff at Britain’s electronic listening agency, GCHQ, could be liable as “secondary parties to murder” for providing “locational intelligence” to the CIA in directing its drone attack program.

The two are acting on behalf of Noor Khan, 27, a Pakistani whose father was killed by a drone strike in northwest Pakistan in March 2011 while attending a gathering of elders. More than 40 other people were killed in that attack, they said.

Reprieve, which helps death row prisoners and Guantanamo Bay inmates, urged the British government to be more transparent about its role — if any — in the drone program.  “What has the government got to hide? If they’re not supplying information as part of the CIA’s illegal drone war, why not tell us?” Reprieve director Clive Stafford Smith said.

British officials have never commented publicly on the drones. The Foreign Office and GCHQ declined comment on the legal action Monday, saying they could not speak about ongoing legal proceedings or intelligence matters.

Since 2004, CIA drones have targeted suspected militants with missile strikes in the Pakistani tribal regions, killing hundreds of people. The program is controversial because of questions about its legality, the number of civilians it has killed, and its impact on Pakistan’s sovereignty.  U.S. officials do not publicly acknowledge the covert drone program but they have said privately that the strikes harm very few innocents and are key to weakening Al Qaeda and other militant groups.

Leigh Day & Co. did not detail what evidence the firm has regarding Britain’s alleged role in the drone program, but it cited media reports that quoted an anonymous GCHQ source as saying that the assistance it gave to the U.S. authorities was in ‘strict accordance’ with the law.  The law firm disputed that, saying GCHQ staff may be guilty of war crimes by passing along detailed intelligence to a drone program that violates international humanitarian law.

UK government sued for helping US drone strikes, Associated Press, March 12, 2012