Warfare Technology News Stories
The DoD has ambitious plans for full spectrum dominance, seeking control over all potential battlespaces: land, ocean, air, outer space, and cyberspace. Artificial intelligence and other emerging technologies are being used to further these agendas, reshaping the military and geopolitical landscape in unprecedented ways.
In our news archive below, we examine how emerging warfare technology undermines national security, fuels terrorism, and causes devastating civilian casualties.
Related: Weapons of Mass Destruction, Biotech Dangers, Non-Lethal Weapons
Future wars just might revolve around insect-size spy robots. A recent digest of present-day microbots by US national security magazine The National Interest breaks down the many machines currently in development by the US military and its associates. They include sea-based microdrones, cockroach-style surveillance bots, and even cyborg insects. Arguably the most refined program to date is the RoboBee, currently being shopped by Harvard’s Wyss Institute. Originally funded by a $9.3 million grant from the National Science Foundation in 2009, the RoboBee is a bug-sized autonomous flying vehicle capable of transitioning from water to air, perching on surfaces, and avoiding collisions autonomously while flying in swarms. The RoboBee features two “wafer-thin” wings that flap some 120 times a second to achieve vertical takeoff and mid-air hovering. The US Defense Advanced Research Projects Agency (DARPA) has reportedly taken a keen interest in RoboBee prototypes, sponsoring research into microfabrication technology, presumably for quick field deployments. Other developments, like the aforementioned cyborg insect, remain in early stages. Researchers have successfully demonstrated the capabilities of these remote-control systems using a range of insect hosts, from the unicorn beetle to the humble cockroach. Underwater microrobotics are another area of interest for DARPA.
Note: Explore all news article summaries on emerging warfare technology in our comprehensive news database.
AI could mean fewer body bags on the battlefield — but that's exactly what terrifies the godfather of AI. Geoffrey Hinton, the computer scientist known as the "godfather of AI," said the rise of killer robots won't make wars safer. It will make conflicts easier to start by lowering the human and political cost of fighting. Hinton said ... that "lethal autonomous weapons, that is weapons that decide by themselves who to kill or maim, are a big advantage if a rich country wants to invade a poor country." "The thing that stops rich countries invading poor countries is their citizens coming back in body bags," he said. "If you have lethal autonomous weapons, instead of dead people coming back, you'll get dead robots coming back." That shift could embolden governments to start wars — and enrich defense contractors in the process, he said. Hinton also said AI is already reshaping the battlefield. "It's fairly clear it's already transformed warfare," he said, pointing to Ukraine as an example. "A $500 drone can now destroy a multimillion-dollar tank." Traditional hardware is beginning to look outdated, he added. "Fighter jets with people in them are a silly idea now," Hinton said. "If you can have AI in them, AIs can withstand much bigger accelerations — and you don't have to worry so much about loss of life." One Ukrainian soldier who works with drones and uncrewed systems [said] in a February report that "what we're doing in Ukraine will define warfare for the next decade."
Note: As law expert Dr. Salah Sharief put it, "The detached nature of drone warfare has anonymized and dehumanized the enemy, greatly diminishing the necessary psychological barriers of killing." For more, read our concise summaries of news articles on AI and warfare technology.
“Ice is just around the corner,” my friend said, looking up from his phone. A day earlier, I had met with foreign correspondents at the United Nations to explain the AI surveillance architecture that Immigration and Customs Enforcement (Ice) is using across the United States. The law enforcement agency uses targeting technologies which one of my past employers, Palantir Technologies, has both pioneered and proliferated. Technology like Palantir’s plays a major role in world events, from wars in Iran, Gaza and Ukraine to the detainment of immigrants and dissident students in the United States. Known as intelligence, surveillance, target acquisition and reconnaissance (Istar) systems, these tools, built by several companies, allow users to track, detain and, in the context of war, kill people at scale with the help of AI. They deliver targets to operators by combining immense amounts of publicly and privately sourced data to detect patterns, and are particularly helpful in projects of mass surveillance, forced migration and urban warfare. Also known as “AI kill chains”, they pull us all into a web of invisible tracking mechanisms that we are just beginning to comprehend, yet are starting to experience viscerally in the US as Ice wields these systems near our homes, churches, parks and schools. The dragnets powered by Istar technology trap more than migrants and combatants ... in their wake. They appear to violate first and fourth amendment rights.
Note: Read how Palantir helped the NSA and its allies spy on the entire planet. Learn more about emerging warfare technology in our comprehensive Military-Intelligence Corruption Information Center. For more, read our concise summaries of news articles on AI and Big Tech.
Local cops have gotten tens of millions of dollars’ worth of discounted military gear under a secretive federal program that is poised to grow under recent executive action. The 1122 program ... presents a danger to people facing off against militarized cops, according to Women for Weapons Trade Transparency. “All of these things combined serve as a threat to free speech, an intimidation tactic to protest,” said Lillian Mauldin, the co-founder of the nonprofit group, which produced the report released this week. The federal government’s 1033 program ... has long sent surplus gear like mine-resistant vehicles and bayonets to local police. Since 1994, however, the even more obscure 1122 program has allowed local cops to purchase everything from uniforms to riot shields at federal government rates. The program turns the feds into purchasing agents for local police. Local cops have used the program to pick up 16 Lenco BearCats, fearsome-looking armored police vehicles. Those vehicles represented 4.8 percent of the total spending identified in the ... report. Surveillance gear and software represented another 6.4 percent, and weapons or riot gear represented 5 percent. One agency bought a $428,000 Star Safire thermal imaging system, the kind used in military helicopters. The Texas Department of Public Safety’s intelligence and counterterrorism unit purchased a $1.5 million surveillance software license. Another agency bought an $89,000 covert camera system.
Note: Read more about the Pentagon's 1033 program. For more along these lines, read our concise summaries of news articles on police corruption and the erosion of civil liberties.
Department of Defense spending is increasingly going to large tech companies including Microsoft, Google parent company Alphabet, Oracle, and IBM. OpenAI recently brought former U.S. Army general and National Security Agency Director Paul M. Nakasone onto its Board of Directors. The U.S. military discreetly, yet frequently, collaborated with prominent tech companies via thousands of subcontractors throughout much of the 2010s, obfuscating the extent of the two sectors’ partnership from tech employees and the public alike. The long-term, deep-rooted relationship between the institutions, spurred by massive Cold War defense and research spending and bound ever tighter by the sectors’ revolving door, ensures that advances in the commercial tech sector benefit the defense industry’s bottom line. Military tech spending has yielded myriad landmark inventions. The internet, for example, began as an Advanced Research Projects Agency (ARPA, now known as the Defense Advanced Research Projects Agency, or DARPA) research project called ARPANET, the first network of computers. Decades later, graduate students Sergey Brin and Larry Page received funding from DARPA, the National Science Foundation, and the U.S. intelligence community-launched development program Massive Digital Data Systems to create what would become Google. Other prominent DARPA-funded inventions include Transit satellites, a precursor to GPS, and the Siri app, which, instead of being picked up by the military, was ultimately adapted to consumer ends by Apple.
Note: Watch our latest video on the militarization of Big Tech. For more, read our concise summaries of news articles on AI, warfare technology, and Big Tech.
The US military may soon have an army of faceless suicide bombers at its disposal, as an American defense contractor has revealed its newest war-fighting drone. AeroVironment unveiled the Red Dragon in a video on its YouTube page, the first in a new line of 'one-way attack drones.' This new suicide drone can reach speeds up to 100 mph and can travel nearly 250 miles. The new drone takes just 10 minutes to set up and launch and weighs just 45 pounds. Once the small tripod the Red Dragon takes off from is set up, AeroVironment said soldiers would be able to launch up to five per minute. Since the suicide robot can choose its own target in the air, the US military may soon be taking life-and-death decisions out of the hands of humans. Once airborne, its AVACORE software architecture functions as the drone's brain, managing all its systems and enabling quick customization. Red Dragon's SPOTR-Edge perception system acts like smart eyes, using AI to find and identify targets independently. Simply put, the US military will soon have swarms of bombs with brains that don't land until they've chosen a target and crashed into it. Despite Red Dragon's ability to choose a target with 'limited operator involvement,' the Department of Defense (DoD) has said it's against the military's policy to allow such a thing to happen. The DoD updated its own directives to mandate that 'autonomous and semi-autonomous weapon systems' always have the built-in ability to allow humans to control the device.
Note: Drones create more terrorists than they kill. For more, read our concise summaries of news articles on warfare technology and Big Tech.
In 2003 [Alexander Karp] – together with Peter Thiel and three others – founded a secretive tech company called Palantir. And some of the initial funding came from the investment arm of – wait for it – the CIA! The lesson that Karp and his co-author draw [in their book The Technological Republic: Hard Power, Soft Belief and the Future of the West] is that “a more intimate collaboration between the state and the technology sector, and a closer alignment of vision between the two, will be required if the United States and its allies are to maintain an advantage that will constrain our adversaries over the longer term. The preconditions for a durable peace often come only from a credible threat of war.” Or, to put it more dramatically, maybe the arrival of AI makes this our “Oppenheimer moment”. For those of us who have for decades been critical of tech companies, and who thought that the future for liberal democracy required that they be brought under democratic control, it’s an unsettling moment. If the AI technology that giant corporations largely own and control becomes an essential part of the national security apparatus, what happens to our concerns about fairness, diversity, equity and justice as these technologies are also deployed in “civilian” life? For some campaigners and critics, the reconceptualisation of AI as essential technology for national security will seem like an unmitigated disaster – Big Brother on steroids, with resistance being futile, if not criminal.
Note: Learn more about emerging warfare technology in our comprehensive Military-Intelligence Corruption Information Center. For more, read our concise summaries of news articles on AI and intelligence agency corruption.
Before signing its lucrative and controversial Project Nimbus deal with Israel, Google knew it couldn’t control what the nation and its military would do with the powerful cloud-computing technology, a confidential internal report obtained by The Intercept reveals. The report makes explicit the extent to which the tech giant understood the risk of providing state-of-the-art cloud and machine learning tools to a nation long accused of systemic human rights violations. Not only would Google be unable to fully monitor or prevent Israel from using its software to harm Palestinians, but the report also notes that the contract could obligate Google to stonewall criminal investigations by other nations into Israel’s use of its technology. And it would require close collaboration with the Israeli security establishment — including joint drills and intelligence sharing — that was unprecedented in Google’s deals with other nations. The rarely discussed question of legal culpability has grown in significance as Israel enters the third year of what has widely been acknowledged as a genocide in Gaza — with shareholders pressing the company to conduct due diligence on whether its technology contributes to human rights abuses. Google doesn’t furnish weapons to the military, but it provides computing services that allow the military to function — its ultimate function being, of course, the lethal use of those weapons. Under international law, only countries, not corporations, have binding human rights obligations.
Note: For more along these lines, read our concise summaries of news articles on AI and government corruption.
2,500 US service members from the 15th Marine Expeditionary Unit [tested] a leading AI tool the Pentagon has been funding. The generative AI tools they used were built by the defense-tech company Vannevar Labs, which in November was granted a production contract worth up to $99 million by the Pentagon’s startup-oriented Defense Innovation Unit. The company, founded in 2019 by veterans of the CIA and US intelligence community, joins the likes of Palantir, Anduril, and Scale AI as a major beneficiary of the US military’s embrace of artificial intelligence. In December, the Pentagon said it will spend $100 million in the next two years on pilots specifically for generative AI applications. In addition to Vannevar, it’s also turning to Microsoft and Palantir, which are working together on AI models that would make use of classified data. People outside the Pentagon are warning about the potential risks of this plan, including Heidy Khlaaf ... at the AI Now Institute. She says this rush to incorporate generative AI into military decision-making ignores more foundational flaws of the technology: “We’re already aware of how LLMs are highly inaccurate, especially in the context of safety-critical applications that require precision.” Khlaaf adds that even if humans are “double-checking” the work of AI, there's little reason to think they're capable of catching every mistake. “‘Human-in-the-loop’ is not always a meaningful mitigation,” she says.
Note: For more, read our concise summaries of news articles on warfare technology and Big Tech.
Alexander Balan was on a California beach when the idea for a new kind of drone came to him. This eureka moment led Balan to found Xdown, the company that’s building the P.S. Killer (PSK)—an autonomous kamikaze drone that works like a hand grenade and can be thrown like a football. The PSK is a “throw-and-forget” drone, Balan says, referencing the “fire-and-forget” missile that, once locked on to a target, can seek it on its own. Instead of depending on remote controls, the PSK will be operated by AI. Soldiers should be able to grab it, switch it on, and throw it—just like a football. The PSK can carry one or two 40 mm grenades commonly used in grenade launchers today. The grenades could be high-explosive dual purpose, designed to penetrate armor while also creating an explosive fragmentation effect against personnel. These grenades can also “airburst”—programmed to explode in the air above a target for maximum effect. Infantry, special operations, and counterterrorism units can easily store PSK drones in a field backpack and tote them around, taking one out to throw at any given time. They can also be packed by the dozen in cargo airplanes, which can fly over an area and drop swarms of them. Balan says that one Defense Department official told him “This is the most American munition I have ever seen.” The nonlethal version of the PSK [replaces] its warhead with a supply container so that it’s able to “deliver food, medical kits, or ammunition to frontline troops” (though given the 1.7-pound payload capacity, such packages would obviously be small).
Note: The US military is using Xbox controllers to operate weapons systems. The latest US Air Force recruitment tool is a video game that allows players to receive in-game medals and achievements for drone bombing Iraqis and Afghans. For more, read our concise summaries of news articles on warfare technologies and watch our latest video on the militarization of Big Tech.
Last April, in a move generating scant media attention, the Air Force announced that it had chosen two little-known drone manufacturers—Anduril Industries of Costa Mesa, California, and General Atomics of San Diego—to build prototype versions of its proposed Collaborative Combat Aircraft (CCA), a future unmanned plane intended to accompany piloted aircraft on high-risk combat missions. The Air Force expects to acquire at least 1,000 CCAs over the coming decade at around $30 million each, making this one of the Pentagon’s costliest new projects. In winning the CCA contract, Anduril and General Atomics beat out three of the country’s largest and most powerful defense contractors ... posing a severe threat to the continued dominance of the existing military-industrial complex, or MIC. The very notion of a “military-industrial complex” linking giant defense contractors to powerful figures in Congress and the military was introduced on January 17, 1961, by President Dwight D. Eisenhower in his farewell address. In 2024, just five companies—Lockheed Martin (with $64.7 billion in defense revenues), RTX (formerly Raytheon, with $40.6 billion), Northrop Grumman ($35.2 billion), General Dynamics ($33.7 billion), and Boeing ($32.7 billion)—claimed the vast bulk of Pentagon contracts. Now ... a new force—Silicon Valley startup culture—has entered the fray, and the military-industrial complex equation is suddenly changing dramatically.
Note: For more, read our concise summaries of news articles on warfare technologies and watch our latest video on the militarization of Big Tech.
In the Air Force, drone pilots did not pick the targets. That was the job of someone pilots called “the customer.” The customer might be a conventional ground force commander, the C.I.A. or a classified Special Operations strike cell. [Drone operator] Captain Larson described a mission in which the customer told him to track and kill a suspected Al Qaeda member. Then, the customer told him to use the Reaper’s high-definition camera to follow the man’s body to the cemetery and kill everyone who attended the funeral. In December 2016, the Obama administration loosened the rules. Strikes once carried out only after rigorous intelligence-gathering and approval processes were often ordered up on the fly, hitting schools, markets and large groups of women and children. Before the rules changed, [former Air Force captain James] Klein said, his squadron launched about 16 airstrikes in two years. Afterward, it conducted them almost daily. Once, Mr. Klein said, the customer pressed him to fire on two men walking by a river in Syria, saying they were carrying weapons over their shoulders. The weapons turned out to be fishing poles. Over time, Mr. Klein grew angry and depressed. Eventually, he refused to fire any more missiles. In 2020, he retired, one of many disillusioned drone operators who quietly dropped out. “We were so isolated," he said. “The biggest tell is that very few people stayed in the field. They just couldn’t take it.” Bennett Miller was an intelligence analyst, trained to study the Reaper’s video feed. In late 2019 ... his team tracked a man in Afghanistan who the customer said was a high-level Taliban financier. For a week, the crew watched the man feed his animals, eat with family in his courtyard. Then the customer ordered the crew to kill him. A week later, the Taliban financier’s name appeared again on the target list. “We got the wrong guy. I had just killed someone’s dad,” Mr. Miller said. 
“I had watched his kids pick up the body parts.” In February 2020, he ... was hospitalized, diagnosed with PTSD and medically retired. Veterans with combat-related injuries, even injuries suffered in training, get special compensation worth about $1,000 per month. Mr. Miller does not qualify, because the Department of Veterans Affairs does not consider drone missions combat. “It’s like they are saying all the people we killed somehow don’t really count,” he said. “And neither do we.”
Note: Captain Larson took his own life in 2020. Furthermore, drones create more terrorists than they kill. Read about former drone operator Brandon Bryant's emotional experience of killing a child in Afghanistan that his superiors told him was a dog. For more along these lines, explore concise summaries of revealing news articles on war.
The Defense Advanced Research Projects Agency, the Pentagon's top research arm, wants to find out if red blood cells could be modified in novel ways to protect troops. The DARPA program, called the Red Blood Cell Factory, is looking for researchers to study the insertion of "biologically active components" or "cargoes" in red blood cells. The hope is that modified cells would enhance certain biological systems, "thus allowing recipients, such as warfighters, to operate more effectively in dangerous or extreme environments." Red blood cells could act like a truck, carrying "cargo" or special protections, to all parts of the body, since they already circulate oxygen everywhere, [said] Christopher Bettinger, a professor of biomedical engineering overseeing the program. "What if we could add in additional cargo ... inside of that disc," Bettinger said, referring to the shape of red blood cells, "that could then confer these interesting benefits?" The research could impact the way troops battle diseases that reproduce in red blood cells, such as malaria, Bettinger hypothesized. "Imagine an alternative world where we have a warfighter that has a red blood cell that's accessorized with a compound that can sort of defeat malaria," Bettinger said. In 2019, the Army released a report called "Cyborg Soldier 2050," which laid out a vision of the future where troops would benefit from neural and optical enhancements, though the report acknowledged ethical and legal concerns.
Note: Read about the Pentagon's plans to use our brains as warfare, describing how the human body is war's next domain. Learn more about biotech dangers.
Militaries, law enforcement, and more around the world are increasingly turning to robot dogs — which, if we're being honest, look like something straight out of a science-fiction nightmare — for a variety of missions ranging from security patrol to combat. Robot dogs first really came on the scene in the early 2000s with Boston Dynamics' "BigDog" design. They have been used in both military and security activities. In November, for instance, it was reported that robot dogs had been added to President-elect Donald Trump's security detail and were on patrol at his home in Mar-a-Lago. Some of the remote-controlled canines are equipped with sensor systems, while others have been equipped with rifles and other weapons. One Ohio company made one with a flamethrower. Some of these designs not only look eerily similar to real dogs but also act like them, which can be unsettling. In the Ukraine war, robot dogs have seen use on the battlefield, the first known combat deployment of these machines. Built by British company Robot Alliance, the systems aren't autonomous, instead being operated by remote control. They are capable of doing many of the things other drones in Ukraine have done, including reconnaissance and attacking unsuspecting troops. The dogs have also been useful for scouting out the insides of buildings and trenches, particularly smaller areas where operators have trouble flying an aerial drone.
Note: Learn more about the troubling partnership between Big Tech and the military. For more, read our concise summaries of news articles on military corruption.
It is often said that autonomous weapons could help minimize the needless horrors of war. Their vision algorithms could be better than humans at distinguishing a schoolhouse from a weapons depot. Some ethicists have long argued that robots could even be hardwired to follow the laws of war with mathematical consistency. And yet for machines to translate these virtues into the effective protection of civilians in war zones, they must also possess a key ability: They need to be able to say no. Human control sits at the heart of governments’ pitch for responsible military AI. Giving machines the power to refuse orders would cut against that principle. Meanwhile, the same shortcomings that hinder AI’s capacity to faithfully execute a human’s orders could cause them to err when rejecting an order. Militaries will therefore need to either demonstrate that it’s possible to build ethical, responsible autonomous weapons that don’t say no, or show that they can engineer a safe and reliable right-to-refuse that’s compatible with the principle of always keeping a human “in the loop.” If they can’t do one or the other ... their promises of ethical and yet controllable killer robots should be treated with caution. The killer robots that countries are likely to use will only ever be as ethical as their imperfect human commanders. They would only promise a cleaner mode of warfare if those using them seek to hold themselves to a higher standard.
Note: Learn more about emerging warfare technology in our comprehensive Military-Intelligence Corruption Information Center. For more, read our concise summaries of news articles on AI and military corruption.
Mitigating the risk of extinction from AI should be a global priority. However, as many AI ethicists warn, this blinkered focus on the existential future threat to humanity posed by a malevolent AI ... has often served to obfuscate the myriad more immediate dangers posed by emerging AI technologies. These “lesser-order” AI risks ... include pervasive regimes of omnipresent AI surveillance and panopticon-like biometric disciplinary control; the algorithmic replication of existing racial, gender, and other systemic biases at scale ... and mass deskilling waves that upend job markets, ushering in an age monopolized by a handful of techno-oligarchs. Killer robots have become a twenty-first-century reality, from gun-toting robotic dogs to swarms of autonomous unmanned drones, changing the face of warfare from Ukraine to Gaza. Palestinian civilians have frequently spoken about the paralyzing psychological trauma of hearing the “zanzana” — the ominous, incessant, unsettling, high-pitched buzzing of drones loitering above. Over a decade ago, children in Waziristan, a region of Pakistan’s tribal belt bordering Afghanistan, experienced a similar debilitating dread of US Predator drones that manifested as a fear of blue skies. “I no longer love blue skies. In fact, I now prefer gray skies. The drones do not fly when the skies are gray,” stated thirteen-year-old Zubair in his testimony before Congress in 2013.
Note: For more along these lines, read our concise summaries of news articles on AI and military corruption.
The Pentagon is turning to a new class of weapons to fight the numerically superior [China's] People’s Liberation Army: drones, lots and lots of drones. In August 2023, the Defense Department unveiled Replicator, its initiative to field thousands of “all-domain, attritable autonomous (ADA2) systems”: Pentagon-speak for low-cost (and potentially AI-driven) machines — in the form of self-piloting ships, large robot aircraft, and swarms of smaller kamikaze drones — that they can use and lose en masse to overwhelm Chinese forces. For the last 25 years, uncrewed Predators and Reapers, piloted by military personnel on the ground, have been killing civilians across the planet. Experts worry that mass production of new low-cost, deadly drones will lead to even more civilian casualties. Advances in AI have increasingly raised the possibility of robot planes, in various nations’ arsenals, selecting their own targets. During the first 20 years of the war on terror, the U.S. conducted more than 91,000 airstrikes ... and killed up to 48,308 civilians, according to a 2021 analysis. “The Pentagon has yet to come up with a reliable way to account for past civilian harm caused by U.S. military operations,” [Columbia Law’s Priyanka Motaparthy] said. “So the question becomes, ‘With the potential rapid increase in the use of drones, what safeguards potentially fall by the wayside? How can they possibly hope to reckon with future civilian harm when the scale becomes so much larger?’”
Note: Learn more about emerging warfare technology in our comprehensive Military-Intelligence Corruption Information Center. For more, read our concise summaries of news articles on military corruption.
At the Technology Readiness Experimentation (T-REX) event in August, the US Defense Department tested an artificial intelligence-enabled autonomous robotic gun system developed by fledgling defense contractor Allen Control Systems dubbed the “Bullfrog.” Consisting of a 7.62-mm M240 machine gun mounted on a specially designed rotating turret outfitted with an electro-optical sensor, proprietary AI, and computer vision software, the Bullfrog was designed to deliver small arms fire on drone targets with far more precision than the average US service member can achieve with a standard-issue weapon. Footage of the Bullfrog in action published by ACS shows the truck-mounted system locking onto small drones and knocking them out of the sky with just a few shots. Should the Pentagon adopt the system, it would represent the first publicly known lethal autonomous weapon in the US military’s arsenal. In accordance with the Pentagon’s current policy governing lethal autonomous weapons, the Bullfrog is designed to keep a human “in the loop” in order to avoid a potential “unauthorized engagement.” In other words, the gun points at and follows targets, but does not fire until commanded to by a human operator. However, ACS officials claim that the system can operate totally autonomously should the US military require it to in the future, with sentry guns taking the entire kill chain out of the hands of service members.
Note: Learn more about emerging warfare technology in our comprehensive Military-Intelligence Corruption Information Center. For more, see concise summaries of deeply revealing news articles on AI from reliable major media sources.
On the sidelines of the International Institute for Strategic Studies’ annual Shangri-La Dialogue in June, US Indo-Pacific Command chief Navy Admiral Samuel Paparo colorfully described the US military’s contingency plan for a Chinese invasion of Taiwan as flooding the narrow Taiwan Strait between the two countries with swarms of thousands upon thousands of drones, by land, sea, and air, to delay a Chinese attack enough for the US and its allies to muster additional military assets. “I want to turn the Taiwan Strait into an unmanned hellscape using a number of classified capabilities,” Paparo said, “so that I can make their lives utterly miserable for a month, which buys me the time for the rest of everything.” China has a lot of drones and can make a lot more drones quickly, creating a likely advantage during a protracted conflict. This stands in contrast to American and Taiwanese forces, who do not have large inventories of drones. The Pentagon’s “hellscape” plan proposes that the US military make up for this growing gap by producing and deploying what amounts to a massive screen of autonomous drone swarms designed to confound enemy aircraft, provide guidance and targeting to allied missiles, knock out surface warships and landing craft, and generally create enough chaos to blunt (if not fully halt) a Chinese push across the Taiwan Strait. Planning a “hellscape” of hundreds of thousands of drones is one thing, but actually making it a reality is another.
Note: Learn more about warfare technology in our comprehensive Military-Intelligence Corruption Information Center. For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
Razish [is] a fake village built by the US Army to train its soldiers for urban warfare. It is one of a dozen pretend settlements scattered across “the Box” (as in sandbox) – a vast landscape of unforgiving desert at the Fort Irwin National Training Center (NTC), the largest such training facility in the world. Covering more than 1,200 square miles, it is a place where soldiers come to practise liberating the citizens of the imaginary oil-rich nation Atropia from occupation by the evil authoritarian state of Donovia. Fake landmines dot the valleys, fake police stations are staffed by fake police, and fake villages populated by citizens of fake nation states are invaded daily by the US military – wielding very real artillery. The NTC operates a fake cable news channel, on which officers are subjected to aggressive TV interviews, trained to win the media war as well as the physical one. Recently, the center even introduced internal social media networks, called Tweeter and Fakebook, where mock civilians spread fake news about the battles – social media being the latest weapon in the arsenal of modern war. Razish may still have a Middle Eastern look, but the actors hawking chunks of plastic meat and veg in the street market speak not English or Arabic, but Russian. This military role-playing industry has ballooned since the early 2000s, now comprising a network of 256 companies across the US, receiving more than $250m a year in government contracts. The actors are often recent refugees, having fled one real-world conflict only to enter another, simulated one.
Note: For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
Billionaire Elon Musk’s brain-computer interface (BCI) company Neuralink made headlines earlier this year for inserting its first brain implant into a human being. Such implants ... are described as “fully implantable, cosmetically invisible, and designed to let you control a computer or mobile device anywhere you go.” They can help people regain abilities lost due to aging, ailments, accidents or injuries, thus improving quality of life. Yet, great ethical concerns arise with such advancements, and the tech is already being used for questionable purposes. Some Chinese employers have started using “emotional surveillance technology” to monitor workers’ brainwaves. Governments and militaries are already ... describing the human body and brain as war’s next domain. On this new “battlefield,” an era of neuroweapons ... has begun. The Pentagon’s research arm DARPA directly or indirectly funds about half of the invasive neural interface technology companies in the US. DARPA has initiated at least 40 neurotechnology-related programs over the past 24 years. As a 2024 RAND report speculates, if BCI technologies are hacked or compromised, “a malicious adversary could potentially inject fear, confusion, or anger into [a BCI] commander’s brain and cause them to make decisions that result in serious harm.” Academic Nicholas Evans speculates, further, that neuroimplants could “control an individual’s mental functions,” perhaps to manipulate memories, emotions, or even to torture the wearer. In a [military research paper] on neurowarfare: "Microbiologists have recently discovered mind-controlling parasites that can manipulate the behavior of their hosts according to their needs by switching genes on or off. Since human behavior is at least partially influenced by their genetics, nonlethal behavior modifying genetic bioweapons that spread through a highly contagious virus could thus be, in principle, possible."
Note: The CIA once used brain surgery to make six remote controlled dogs. For more, see important information on microchip implants and CIA mind control programs from reliable major media sources.
The Palestinian population is intimately familiar with how new technological innovations are first weaponized against them, ranging from the electric fences and unmanned drones used to trap people in Gaza to the facial recognition software monitoring Palestinians in the West Bank. Groups like Amnesty International have described Israel's system as an “automated apartheid” and repeatedly highlight stories, testimonies, and reports about cyber-intelligence firms, including the infamous NSO Group (the Israeli surveillance company behind the Pegasus software), conducting field tests and experiments on Palestinians. Reports have highlighted: “Testing and deployment of AI surveillance and predictive policing systems in Palestinian territories. In the occupied West Bank, Israel increasingly utilizes facial recognition technology to monitor and regulate the movement of Palestinians. Israeli military leaders described AI as a significant force multiplier, allowing the IDF to use autonomous robotic drone swarms to gather surveillance data, identify targets, and streamline wartime logistics.” Palestinian towns and villages near Israeli settlements have been described as laboratories for security solutions companies to test their technologies on Palestinians before marketing them to places like Colombia. The Israeli government hopes to crystallize its “automated apartheid” through the tokenization and privatization of various industries and the establishment of a technocratic government in Gaza.
Note: For more along these lines, see concise summaries of deeply revealing news articles on government corruption and the disappearance of privacy from reliable major media sources.
In 2023, the US drone warfare program entered its third decade with no end in sight. Even as the 22nd anniversary of 9/11 approaches, policymakers have shown no sign of reflecting on the failures of drone warfare or how to stop it. Instead, the focus continues to be on simply shifting drone policy in minor ways within an ongoing violent system. Washington’s war on terror has inflicted disproportionate violence on communities across the globe, while using this form of asymmetrical warfare to further expand the space between the value placed on American lives and those of Muslims. Since the war on terror was launched, the London-based watchdog group Airwars has estimated that American air strikes have killed at least 22,679 civilians and possibly up to 48,308 of them. Such killings have been carried out for the most part by desensitized killers, who have been primed towards the dehumanization of the targets of those murderous machines. In the words of critic Saleh Sharief, “The detached nature of drone warfare has anonymized and dehumanized the enemy, greatly diminishing the necessary psychological barriers of killing.” While the use of drones in the war on terror began under President George W. Bush, it escalated dramatically under Obama. Then, in the Trump years, it rose yet again. Though the use of drones in Joe Biden’s first year in office was lower than Trump’s, what has remained consistent is the lack of ... accountability for the slaughter of civilians.
Note: A 2014 analysis found that attempts to kill 41 people with drones resulted in 1,147 deaths. For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
Though once confined to the realm of science fiction, the concept of supercomputers killing humans has now become a distinct possibility. In addition to developing a wide variety of "autonomous," or robotic combat devices, the major military powers are also rushing to create automated battlefield decision-making systems, or what might be called "robot generals." In wars in the not-too-distant future, such AI-powered systems could be deployed to deliver combat orders to American soldiers, dictating where, when, and how they kill enemy troops or take fire from their opponents. In its budget submission for 2023, for example, the Air Force requested $231 million to develop the Advanced Battlefield Management System (ABMS), a complex network of sensors and AI-enabled computers designed to ... provide pilots and ground forces with a menu of optimal attack options. As the technology advances, the system will be capable of sending "fire" instructions directly to "shooters," largely bypassing human control. The Air Force's ABMS is intended to ... feed into a larger network connecting all US combat forces, the Joint All-Domain Command-and-Control System (JADC2, pronounced "Jad-C-two"). "JADC2 intends to enable commanders to make better decisions by collecting data from numerous sensors, processing the data using artificial intelligence algorithms to identify targets, then recommending the optimal weapon ... to engage the target," the Congressional Research Service reported in 2022.
Note: Read about the emerging threat of killer robots on the battlefield. For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
The use of weapons-grade robots and drones in combat isn't new. But AI software is, and it's enhancing – in some cases, to the extreme – the existing hardware, which has been modernizing warfare for the better part of a decade. Now, experts say, developments in AI have pushed us to a point where global forces have no choice but to rethink military strategy from the ground up. "It's realistic to expect that AI will be piloting an F-16 and will not be that far out," Nathan Michael, Chief Technology Officer of Shield AI, a company whose mission is "building the world's best AI pilot," says. We don't truly comprehend what we're creating. There are also fears that a comfortable reliance on the technology's precision and accuracy – referred to as automation bias – may come back to haunt us, should the tech fail in a life-or-death situation. One major worry revolves around AI facial recognition software being used to enhance an autonomous robot or drone during a firefight. Right now, a human being behind the controls has to pull the proverbial trigger. Should that be taken away, civilians or allies could be mistaken for militants at the hands of a machine. And remember when the fear of our most powerful weapons being turned against us was just something you saw in futuristic action movies? With AI, that's very possible. "There is a concern over cybersecurity in AI and the ability of either foreign governments or independent actors to take over crucial elements of the military," [filmmaker Jesse Sweet] said.
Note: For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
Within ten days [of its release], the first-person military shooter video game [Call of Duty: Modern Warfare II] earned more than $1 billion in revenue. The Call of Duty franchise is an entertainment juggernaut, having sold close to half a billion games since it was launched in 2003. Its publisher, Activision Blizzard, is a giant in the industry. Details gleaned from documents obtained under the Freedom of Information Act reveal that Call of Duty is not a neutral first-person shooter, but a carefully constructed piece of military propaganda, designed to advance the interests of the U.S. national security state. Not only does Activision Blizzard work with the U.S. military to shape its products, but its leadership board is also full of former high state officials. Chief amongst these is Frances Townsend, Activision Blizzard's senior counsel. As the White House's most senior advisor on terrorism and homeland security, Townsend ... became one of the faces of the administration's War on Terror. Activision Blizzard's chief administrative officer, Brian Bulatao ... was chief operating officer for the CIA, placing him third in command of the agency. Bulatao went straight from the State Department into the highest echelons of Activision Blizzard, despite having no experience in the entertainment industry. [This] raises serious questions about privacy and state control over media. "Call of Duty ... has been flagged up for recreating real events as game missions and manipulating them for geopolitical purposes," [journalist Tom] Secker told MintPress.
Note: The latest US Air Force recruitment tool is a video game that allows players to receive in-game medals and achievements for drone bombing Iraqis and Afghans. For more on this disturbing "military-entertainment complex" trend, explore the work of investigative journalist Tom Secker, who recently produced a documentary, Theaters of War: How the Pentagon and CIA Took Hollywood, and published a new book, Superheroes, Movies and the State: How the U.S. Government Shapes Cinematic Universes.
Last week, an Israeli defense company painted a frightening picture. In a roughly two-minute video on YouTube that resembles an action movie, soldiers out on a mission are suddenly pinned down by enemy gunfire and calling for help. In response, a tiny drone zips off its mother ship to the rescue, zooming behind the enemy soldiers and killing them with ease. While the situation is fake, the drone — unveiled last week by Israel-based Elbit Systems — is not. The Lanius, which in Latin can refer to butcherbirds, represents a new generation of drone: nimble, wired with artificial intelligence, and able to scout and kill. The machine is based on racing drone design, allowing it to maneuver into tight spaces, such as alleyways and small buildings. After being sent into battle, Lanius’s algorithm can make a map of the scene and scan people, differentiating enemies from allies — feeding all that data back to soldiers who can then simply push a button to attack or kill whom they want. For weapons critics, that represents a nightmare scenario, which could alter the dynamics of war. “It’s extremely concerning,” said Catherine Connolly, an arms expert at Stop Killer Robots, an anti-weapons advocacy group. “It’s basically just allowing the machine to decide if you live or die if we remove the human control element for that.” According to the drone’s data sheet, the drone is palm-size, roughly 11 inches by 6 inches. It has a top speed of 45 miles per hour. It can fly for about seven minutes, and has the ability to carry lethal and nonlethal materials.
Note: US General Paul Selva has warned against employing killer robots in warfare for ethical reasons. For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
Looking Glass Factory, a company based in the Greenpoint neighborhood of Brooklyn, New York, revealed its latest consumer device: a slim, holographic picture frame that turns photos taken on iPhones into 3D displays. Looking Glass received $2.54 million of “technology development” funding from In-Q-Tel, the venture capital arm of the CIA, from April 2020 to March 2021 and a $50,000 Small Business Innovation Research award from the U.S. Air Force in November 2021 to “revolutionize 3D/virtual reality visualization.” Across the various branches of the military and intelligence community, contract records show a rush to jump on holographic display technology, augmented reality, and virtual reality display systems as the latest trend. Critics argue that the technology isn’t quite ready for prime time, and that the urgency to adopt it reflects the Pentagon’s penchant for high-priced, high-tech contracts based on the latest fad in warfighting. Military interest in holographic imaging, in particular, has grown rapidly in recent years. Military planners in China and the U.S. have touted holographic technology to project images “to incite fear in soldiers on a battlefield.” Other uses involve creating three-dimensional maps of villages or specific buildings and analyzing blast forensics. Palmer Luckey, who founded the technology startup Anduril Industries ... has received secretive Air Force contracts to develop next-generation artificial intelligence capabilities under the so-called Project Maven initiative.
Note: For more along these lines, see concise summaries of deeply revealing news articles on intelligence agency corruption from reliable major media sources.
According to the nonprofit organization Airwars, the U.S. has conducted more than 91,000 airstrikes in seven major conflict zones since 2001, with at least 22,000 civilians killed and potentially as many as 48,000. How does America react when it kills civilians? Just last week, we learned that the U.S. military decided that nobody will be held responsible for the August 29 drone attack in Kabul, Afghanistan, that killed 10 members of an Afghan family, including seven children. After an internal review, Secretary of Defense Lloyd Austin chose to take no action, not even a wrist slap for a single intelligence analyst, drone operator, mission commander, or general. U.S. bombings since 2014 have consistently killed civilians but ... the Pentagon has done almost nothing to discern how many were harmed or what went wrong and might be corrected. Savagery consists of more than the act of killing. It also involves a system of impunity that makes clear to the perpetrators that what they are doing is acceptable, necessary — maybe even heroic — and must not cease. To this end, the United States has developed a machinery of impunity that is arguably the most advanced in the world, implicating not only a broad swathe of military personnel but also the entirety of American society. The machinery of impunity actually has two missions: The most obvious is to excuse people who should not be excused. The other is to punish those who try to expose the machine, because it does not function well in daylight.
Note: For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
The terrorist attack on the airport in Kabul, Afghanistan’s capital ... killed more than 170 Afghan civilians and 13 U.S. soldiers. Three days later, Biden authorized a drone strike that the U.S. claimed took out a dangerous cell of ISIS fighters. Biden held up this strike, and another one a day earlier, as evidence of his commitment to take the fight to the terrorists in Afghanistan. But the Kabul strike, which targeted a white Toyota Corolla, did not kill any members of ISIS. The victims were 10 civilians, seven of them children. The driver of the car, Zemari Ahmadi, was a respected employee of a U.S. aid organization. Following a New York Times investigation that fully exposed the lie of the U.S. version of events, the Pentagon and the White House admitted that they had killed innocent civilians, calling it “a horrible tragedy of war.” This week, the Pentagon released a summary of its classified review into the attack, which it originally hailed as a “righteous strike” that had thwarted an imminent terror plot. The results were predictable. The report recommended that no personnel be held responsible for the murder of 10 civilians; there was no “criminal negligence,” as the report put it. Daniel Hale, a military veteran who pleaded guilty to disclosing classified documents that exposed lethal weaknesses in the drone program, is serving four years in prison. Hale’s documents exposed how as many as nine out of 10 victims of U.S. drone strikes in Afghanistan were not the intended targets. In Biden’s recent drone strike, 10 of 10 were innocent civilians.
Note: For more along these lines, see concise summaries of deeply revealing news articles on government corruption and war from reliable major media sources.
A bomb hit the house. [Rua Moataz] Khadr and her two daughters were able to free themselves from the rubble that had fallen on them, but her 4-year-old son, Ibrahim Ahmed Yahya, was crushed to death. He was among the 9,000 to 11,000 civilians killed during the yearlong battle for Mosul. Khadr, like most bombing victims in Iraq, has no idea which nation was responsible for the airstrike that killed her son. Was it an American aircraft, British, Dutch? “Even if I found out, what would I do?” she told The Intercept. In its final days in Afghanistan, the U.S. conducted a drone strike that killed 10 civilians in Kabul — seven of them children. Their deaths bring up a thorny question surrounding the frequent U.S. killing of civilians in the 9/11 wars: What would justice look like for the families of civilians who have been wrongfully killed? The media attention generated by the Kabul strike has prompted a rare admission of guilt from the Pentagon and may ultimately lead to monetary compensation for the survivors. But byzantine laws in the U.S. make it all but impossible for foreigners to file for compensation if a relative was killed in combat. The only hope for most survivors is a “sympathy” payment from the U.S. military that does not acknowledge responsibility for causing the deaths. But unsurprisingly, those payments are rare: None were issued in 2020. Meanwhile, U.S. allies involved in bombing campaigns usually hide behind the shield of joint operations to avoid taking responsibility for civilian deaths.
Note: For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
Rep. Ilhan Omar (D-Minn.) is asking President Biden to pardon a former Air Force intelligence analyst who exposed secrets about drone warfare in Afghanistan. Daniel Hale pleaded guilty in federal court in Alexandria to violating the Espionage Act and was sentenced in July to 45 months in prison for leaking classified documents to the Intercept. In court, Hale said he felt compelled to speak out about the immorality of the drone program after realizing he had helped kill Afghan civilians, including a small child. "Not a day goes by that I don't question the justification for my actions," he wrote to the judge. "I am grief-stricken and ashamed of myself." One document he leaked showed that during a five-month operation in Afghanistan, nearly 90 percent of the people killed were not the intended targets. "I take extremely seriously the prohibition on leaking classified information, but I believe there are several aspects of Mr. Hale's case that merit a full pardon," Omar wrote in the letter sent to Biden. "The information, while politically embarrassing to some, has shone a vital light on the legal and moral problems of the drone program and informed the public debate on an issue that has for too many years remained in the shadows." This week, Hale was awarded the Sam Adams Award for Integrity in Intelligence, given by a group of whistleblowers from the national security community. Edward Snowden received the same award in 2013.
Note: Hale's leak was the basis for an article series called The Drone Papers. A 2014 analysis found that attempts to kill 41 people with drones resulted in 1,147 deaths. For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
A soldier wears a skullcap that stimulates his brain to make him learn skills faster, or reads his thoughts as a way to control a drone. Another is plugged into a Tron-like active cyber defense system, in which she mentally teams up with computer systems to successfully multitask during complex military missions. The Pentagon is already researching these seemingly sci-fi concepts. The basics of brain-machine interfaces are being developed; just watch the videos of patients moving prosthetic limbs with their minds. The Defense Department is examining new scientific tools, like genetic engineering, brain chemistry, and shrinking robotics, for even more dramatic enhancements. But the real trick may not be granting superpowers, but rather making sure those effects are temporary. Last year, three Canadian defense researchers published a paper that explored the intersection of human enhancement and ethics. They found that the permanence of an enhancement could have impacts on troops in the field ... as well as on their return to civilian life. They also note that many soldier resilience human enhancement technologies raised health and safety questions. The Canadian researchers wrote: "Are there unknown side effects or long term effects that could lead to unanticipated health problems during deployment or after discharge? Moreover, is it ethical to force a soldier to use the technology in question, or should he/she be allowed to consent to its use? Can consent be fully free from coercion in the military?"
Note: Read an excellent article detailing the risks of biosensors implanted under the skin which have already been developed. Some smaller than a grain of rice can be injected with a needle. Watch a slick video promoting this brave new world. Learn how this is already planned for use on soldiers. For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
Ties between Silicon Valley and the Pentagon are deeper than previously known, according to thousands of previously unreported subcontracts published Wednesday. The subcontracts were obtained through open records requests by accountability nonprofit Tech Inquiry. They show that tech giants including Google, Amazon, and Microsoft have secured more than 5,000 agreements with agencies including the Department of Defense, Immigration and Customs Enforcement, the Drug Enforcement Administration, and the FBI. Tech workers in recent years have pressured their employers to drop contracts with law enforcement and the military. Google workers revolted in 2018 after Gizmodo revealed that Google was building artificial intelligence for drone targeting through a subcontract with the Pentagon. After some employees quit in protest, Google agreed not to renew the contract. Employees at Amazon and Microsoft have petitioned both companies to drop their contracts with ICE and the military. Neither company has. The newly-surfaced subcontracts ... show that the companies' connections to the Pentagon run deeper than many employees were previously aware. Tech Inquiry's research was led by Jack Poulson, a former Google researcher. "Often the high-level contract description between tech companies and the military looks very vanilla," Poulson [said]. "But only when you look at the details ... do you see the workings of how the customization from a tech company would actually be involved."
Note: For more along these lines, see concise summaries of deeply revealing news articles on corruption in government and in the corporate world from reliable major media sources.
Brandon Bryant was enlisted in the US Air Force for six years. During his time with the military, he operated Predator drones, remotely firing missiles at targets more than 7,000 miles away from the small room containing his workspace near Las Vegas, Nevada. Mr Bryant says he reached his breaking point with the US military after killing a child in Afghanistan that his superiors told him was a dog. Following that incident, Mr Bryant quit the military and began speaking out against the drone program. During his time in the Air Force, Mr Bryant estimates he contributed directly to killing 13 people himself and says his squadron fired on 1,626 targets including women and children. He says he has been left suffering from post-traumatic stress disorder. Mr Bryant said that despite his misgivings about the program, his superiors used punitive measures and mockery to keep him in line. He has said the US military is "worse than the Nazis because we should know better." Mr Bryant said he and his family have been threatened for speaking out against the drone program and that he has lost friends and been estranged from other members of his family over his whistle-blowing. Ultimately Mr Bryant wants the public to understand the dehumanizing effect of the drone program on the operators and the individuals targeted. "I would want people to know, beyond its existence, the consequences it has on us as a species to delineate our power into something so easily destructive," Mr Bryant said.
Note: Drones almost always miss their intended targets and create more terrorists than they kill. For more along these lines, see concise summaries of deeply revealing news articles on military corruption from reliable major media sources.
In September 2017, Aileen Black wrote an email to her colleagues at Google. Black, who led sales to the U.S. government, worried that details of the company's work to help the military guide lethal drones would become public through the Freedom of Information Act. "We will call tomorrow to reinforce the need to keep Google under the radar," Black wrote. According to a Pentagon memo signed last year, however, no one at Google needed worry: All 5,000 pages of documents about Google's work on the drone effort, known as Project Maven, are barred from public disclosure, because they constitute "critical infrastructure security information." The memo is part of a recent wave of federal decisions that keep sensitive documents secret on that same basis, thus allowing agencies to quickly deny document requests. In response to a Freedom of Information Act request I filed more than a year ago, seeking documents related to Project Maven's use of Google technology, the Defense Department said that it had discovered 5,000 pages of relevant material, and that every single page was exempt from disclosure. Some of the pages included trade secrets, sensitive internal deliberations, and private personal information about some individuals, the department said. Such information can be withheld under the act. But it said all of the material could be kept private under "Exemption 3" of the act, which allows the government to withhold records under a grab bag of other federal statutes.
Note: Read more about Project Maven. Google employees strongly opposed working on war technology, and circulated a petition to stop the project. For more along these lines, see concise summaries of deeply revealing news articles on corruption in government and in the corporate world.
The US Navy has announced that a new laser weapon it tested earlier this year was a success. A video of the laser weapon system (Laws), released by the Office of Naval Research, shows the laser being deployed aboard USS Ponce in September in the Persian Gulf. It shows the weapon being used against two test targets, including a speedboat which bursts into flames. Other targets were located at sea and in the air, including unmanned aerial vehicles (UAVs), or drones. Rear Adm. Matthew L. Klunder, chief of naval research, said in a statement on Wednesday that the powerful Laws system will play a vital role in the future of naval combat operations. The prototype weapon in the video cost $40 million to produce; in testing, it kept up a demanding operational pace, performed in adverse weather conditions including a sandstorm, and destroyed targets with near-instantaneous lethality. Officials claim the weapon is capable of destroying its targets with pin-point accuracy. The captain of the USS Ponce could use it against a real threat if required. Operated using a video game controller, the system hit targets mounted aboard small boats speeding towards the ship. In a separate test, the laser targeted and shot a drone out of the sky.
Note: For more along these lines, see concise summaries of deeply revealing war news articles from reliable major media sources. Then explore the excellent, reliable resources provided in our War Information Center.
Chinese researchers are working on a new handheld laser weapon capable of burning skin and clothing from up to half a mile away. Boasts about the ... laser rifle's capabilities come amid a diplomatic row between the US and China over the use of blinding lasers on military aircraft. The new laser rifle will be used by anti-terrorism squads in the Chinese Armed Police, the South China Morning Post reports, with prototypes of the device already being tested ... at the Chinese Academy of Sciences. The ZKZM assault rifle causes "instant carbonisation" of human tissue, according to the researchers behind it, and will "burn through clothes in a split second." The unnamed researcher from the Chinese Academy of Sciences added: "If the fabric is flammable, the whole person will be set on fire." While the gun is not powerful enough to kill someone, the researcher said: "The pain will be beyond endurance." Aerospace and defence giant Lockheed Martin has previously developed laser beam systems that can be attached to planes or ground-based vehicles. In May, the Pentagon made a formal complaint to the Chinese government over Chinese nationals allegedly pointing lasers at US planes in east Africa. While there has been no word on when China's new laser gun might see service, the US Navy recently announced its own $300 million research fund aimed at creating a family of laser weapons for its fleet. In 2016, the US military said it would be deploying laser weapons as early as 2023.
Note: China was reported in 2006 to be using lasers to blind American spy satellites. For more along these lines, see concise summaries of deeply revealing non-lethal weapons news articles from reliable major media sources.
Google will not seek to extend its contract next year with the Defense Department for artificial intelligence used to analyze drone video, squashing a controversial alliance that had raised alarms over the technological buildup between Silicon Valley and the military. Google ... has faced widespread public backlash and employee resignations for helping develop technological tools that could aid in warfighting. Google will soon release new company principles related to the ethical uses of AI. Thousands of Google employees wrote chief executive Sundar Pichai an open letter urging the company to cancel the contract, and many others signed a petition saying the company's assistance in developing combat-zone technology directly countered the company's famous "Don't be evil" motto. Several Google AI employees had told The Post they believed they wielded a powerful influence over the company's decision-making. The advanced technology's top researchers and developers are in heavy demand, and many had organized resistance campaigns or threatened to leave. The sudden announcement Friday was welcomed by several high-profile employees. Meredith Whittaker, an AI researcher and the founder of Google's Open Research group, tweeted Friday: "I am incredibly happy about this decision, and have a deep respect for the many people who worked and risked to make it happen. Google should not be in the business of war."
Note: Explore a treasure trove of concise summaries of incredibly inspiring news articles which will inspire you to make a difference.
Hundreds of academics have urged Google to abandon its work on a U.S. Department of Defense-led drone program codenamed Project Maven. An open letter calling for change was published Monday by the International Committee for Robot Arms Control (ICRAC). The project is formally known as the Algorithmic Warfare Cross-Functional Team. Its objective is to turn the enormous volume of data available to DoD into actionable intelligence. More than 3,000 Google staffers signed a petition in April in protest at the company's focus on warfare. "We believe that Google should not be in the business of war," it read. "Therefore we ask that Project Maven be cancelled." The ICRAC warned this week the project could potentially be mixed with general user data and exploited to aid targeted killing. Currently, its letter has nearly 500 signatures. It stated: "We are ... deeply concerned about the possible integration of Google's data on people's everyday lives with military surveillance data, and its combined application to targeted killing ... Google has moved into military work without subjecting itself to public debate or deliberation. While Google regularly decides the future of technology without democratic public engagement, its entry into military technologies casts the problems of private control of information infrastructure into high relief." Lieutenant Colonel Garry Floyd, deputy chief of the Algorithmic Warfare Cross Functional Team, said ... earlier this month that Maven was already active in five or six combat locations.
Note: You can read the full employee petition on this webpage. The New York Times also published a good article on this. For more along these lines, see concise summaries of deeply revealing news articles on corporate corruption and war.
There's something eating at Google employees. Roughly a dozen employees of the search giant have resigned in the wake of reports that the ... company is providing artificial intelligence to the Pentagon. The employees resigned because of ethical concerns over the company's work with the Defense Department that includes helping the military speed up analysis of drone footage by automatically classifying images of objects and people, Gizmodo reported. Many of the employees who quit have written accounts of their decisions to leave the company. Their stories have been gathered and shared in an internal document. Google is helping the DoD's Project Maven implement machine learning to classify images gathered by drones, according to the report. Some employees believe humans, not algorithms, should be responsible for this sensitive and potentially lethal work - and that Google shouldn't be involved in military work at all. The 12 resignations are the first known mass resignations at Google in protest against one of the company's business decisions - and they speak to the strongly felt ethical concerns of the employees who are departing. In addition to the resignations, nearly 4,000 Google employees have voiced their opposition to Project Maven in an internal petition that asks Google to immediately cancel the contract and institute a policy against taking on future military work.
Note: You can read the full employee petition on this webpage. An open letter in support of Google employees and tech workers was signed by more than 90 academics in artificial intelligence, ethics, and computer science. The New York Times also published a good article on this. For more along these lines, see concise summaries of deeply revealing news articles on corporate corruption and war.
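The workflow these articles describe (a model pre-classifies drone video frames so that human analysts review a short queue rather than the full footage) can be sketched roughly as follows. This is an illustrative assumption of how such a triage loop might look, not Project Maven's actual code; `stub_model`, the labels, and the confidence threshold are all hypothetical stand-ins for a trained classifier and its tuning.

```python
def stub_model(frame):
    """Hypothetical stand-in for a trained image classifier.

    Returns a (label, confidence) pair; a real system would run
    inference on the frame's pixel data instead of reading a hint.
    """
    return frame["truth_hint"]

def triage(frames, threshold=0.8):
    """Route frames: confident non-background detections go to a
    human review queue, echoing the 'humans, not algorithms' point
    raised by the resigning employees."""
    queue = []
    for frame in frames:
        label, confidence = stub_model(frame)
        if label != "background" and confidence >= threshold:
            queue.append((frame["id"], label, confidence))
    return queue

# A few mock frames standing in for hours of drone footage.
footage = [
    {"id": 1, "truth_hint": ("background", 0.97)},
    {"id": 2, "truth_hint": ("vehicle", 0.91)},
    {"id": 3, "truth_hint": ("person", 0.55)},  # low confidence: not queued
]
print(triage(footage))  # [(2, 'vehicle', 0.91)]
```

The design point the sketch makes concrete: the algorithm only filters; whether filtered detections become targets is a separate, human decision, which is exactly the step the petition signers argued Google should not be automating.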
Thousands of Google employees, including dozens of senior engineers, have signed a letter protesting the company's involvement in a Pentagon program that uses artificial intelligence to interpret video imagery and could be used to improve the targeting of drone strikes. The letter, which is circulating inside Google and has garnered more than 3,100 signatures, reflects a culture clash ... that is likely to intensify as cutting-edge artificial intelligence is increasingly employed for military purposes. "We believe that Google should not be in the business of war," says the letter, addressed to Sundar Pichai, the company's chief executive. It asks that Google pull out of Project Maven, a Pentagon pilot program, and announce a policy that it will not ever build warfare technology. That kind of idealistic stance ... is distinctly foreign to Washington's massive defense industry and certainly to the Pentagon, where the defense secretary, Jim Mattis, has often said a central goal is to increase the lethality of the United States military. Some of Google's top executives have significant Pentagon connections. Eric Schmidt, former executive chairman of Google and still a member of the executive board of Alphabet, Google's parent company, serves on a Pentagon advisory body, the Defense Innovation Board, as does a Google vice president, Milo Medin. Project Maven ... began last year as a pilot program to find ways to speed up the military application of the latest A.I. technology.
Note: The use of artificial intelligence technology for drone strike targeting is one of many ways warfare is being automated. Strong warnings against combining artificial intelligence with war have recently been issued by America's second-highest ranking military officer, tech mogul Elon Musk, and many of the world's most recognizable scientists. For more along these lines, see concise summaries of deeply revealing war news articles from reliable major media sources.
Drone pilots have been quitting the U.S. Air Force in record numbers. They cite a combination of low-class status in the military, overwork and psychological trauma. But a widely publicized new memoir about America's covert drone war fails to mention the "outflow increases," as one internal Air Force memo calls it. Drone Warrior: An Elite Soldier's Inside Account of the Hunt for America's Most Dangerous Enemies chronicles the nearly 10 years that Brett Velicovich, a former special operations member, spent using drones to help special forces find and track terrorists. Conveniently, it also puts a hard sell on a program whose ranks the military is struggling to keep full. The book is, at best, a tale of hyper-masculine bravado and, at worst, a piece of military propaganda designed to ease doubts about the drone program and increase recruitment. Velicovich exaggerates the accuracy of the technology, neglecting to mention how often it fails or that such failures have killed an untold number of civilians. For instance, the CIA killed 76 children and 29 adults in its attempts to take out Ayman al Zawahiri, the leader of Al Qaeda, who reportedly is still alive. The film rights to Drone Warrior were bought over a year ago, with much fanfare, by Paramount Pictures. This development is predictable. The U.S. military and Hollywood have long enjoyed a symbiotic relationship. But there is something particularly unseemly about Hollywood's enthusiasm for bringing Velicovich's version of drone warfare to the big screen.
Note: Documents obtained by a crowdfunded investigative journalism project show that US military and intelligence agencies have influenced over 1,800 movies and television shows. For more along these lines, see concise summaries of deeply revealing news articles on military corruption and the manipulation of mass media.
The world's first race for drones controlled by people's thoughts involved 16 pilots flying drones over a course of just 10 yards in an indoor basketball court. "With events like this, we're popularizing the use of BCI (brain-computer interface) instead of it being stuck in the research lab," said [PhD student] Chris Crawford. Mind-controlled technology already is helping paralyzed people move limbs or robotic prosthetics. The technology ... works by using an EEG headset that has been calibrated to identify the electrical activity associated with particular thoughts in each wearer's brain. Programmers write code to translate these ... signals into commands that computers send to the drones. Professor Juan Gilbert, whose computer science students organized the race, [wants] to know how mind-controlled devices could expand and change the way we play, work and live. But as the technology moves toward wider adoption, ethical, legal and privacy questions remain unresolved. The US Defence Department - which uses drones to kill suspected terrorists ... from vast distances - is looking for military brain-control applications. A 2014 Defence Department grant supports the Unmanned Systems Laboratory ... where researchers have developed a system enabling a single person with no prior training to fly multiple drones simultaneously through mind control.
Note: Read an article which dives deep into the use of neuroscience as a weapon. And since drone strikes almost always miss their intended targets and reportedly create more terrorists than they kill, is it really a good idea to make drone fleets easier for the military to deploy?
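The pipeline the race article describes (calibrate an EEG headset to each wearer's brain activity, then match live readings against those calibrated patterns and translate the match into a drone command) might look something like this minimal sketch. Every name, signal level, thought label, and command string here is an illustrative assumption, not any lab's actual interface; real BCI systems classify multichannel frequency-band features, not a single scalar.

```python
from statistics import mean

def calibrate(samples_by_thought):
    """Calibration phase: record the wearer's mean signal level
    for each rehearsed thought (e.g. imagining pushing a block)."""
    return {thought: mean(values) for thought, values in samples_by_thought.items()}

def classify(reading, profile):
    """Match a live reading to the nearest calibrated thought pattern."""
    return min(profile, key=lambda thought: abs(profile[thought] - reading))

# Hypothetical mapping from recognized thoughts to drone commands.
COMMANDS = {"push": "THROTTLE_UP", "relax": "HOVER"}

def translate(reading, profile):
    """Translate a classified thought into the command sent to the drone."""
    return COMMANDS[classify(reading, profile)]

# Wearer rehearses each thought while signal levels are logged.
profile = calibrate({"push": [0.82, 0.78, 0.85], "relax": [0.21, 0.25, 0.19]})
print(translate(0.80, profile))  # strong reading -> THROTTLE_UP
print(translate(0.22, profile))  # weak reading   -> HOVER
```

The per-wearer calibration step is the crux: the same raw signal level can mean different things in different brains, which is why the article stresses that the headset must be "calibrated to identify the electrical activity associated with particular thoughts in each wearer's brain."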
As the United States and its allies continue their bombing campaign against the Islamic State in Iraq and Syria, many more noncombatants are perishing than they seem prepared to admit. Airwars, the organization I lead, at present estimates that at least 1,500 civilians have been killed by the United States-led coalition. Similar or higher tallies are reported by other monitoring groups, like Iraq Body Count and the Syrian Observatory for Human Rights. But coalition officials have publicly admitted just 55 deaths. It may just be a matter of looking. "Our policy is not to go out and seek allegations of civilian casualties," a senior official from United States Central Command, or Centcom, which oversees the bombing campaign, told me recently when I asked about the discrepancy between reports of noncombatant deaths and official investigations. It took about 15 months into the war for any admission of civilian deaths in Iraq - despite thousands of airstrikes and more than 130 reported incidents. An average of 173 days still passes between a civilian casualty in Iraq or Syria and any public admission of responsibility. The Pentagon is not alone in its accounting failures. Russia still denies the more than 2,000 deaths it has most likely caused in Syria, while all 12 of the United States' coalition partners insist they have killed only bad guys. This then is a systemic problem, one that suggests militaries are at present unfit - or unwilling - to count the dead accurately from above.
Note: The above was written by Chris Woods, author of Sudden Justice: Americas Secret Drone Wars. For more along these lines, see concise summaries of deeply revealing war news articles from reliable major media sources.
As a witness to the removal of fallen U.S. troops from Afghanistan, Army Chaplain Christopher John Antal can't recall a time when that solemn ceremony was conducted without the presence of drones passing along the horizon. On April 12, Antal resigned his commission as an officer in the Army because of his conscientious objection to the United States' drone policy. In a letter addressed to ... Barack Obama, Antal wrote, "The executive branch continues to claim the right to kill anyone, anywhere on Earth, at any time, for secret reasons, based on secret evidence, in a secret process, undertaken by unidentified officials. I refuse to support this policy of unaccountable killing." In doing so, he joined other previous members of the armed forces who have addressed Obama to criticize his drone strike policy, including four former members of the Air Force who penned a letter in November of 2015 warning the president that the strikes served as a recruitment tool similar to Guantanamo Bay. Antal's resignation concluded nearly eight years of service as an Army chaplain. He publicly voiced [his concerns about the targeted killings] in a Veterans Day sermon Nov. 11, 2012, when he gave a lyrical sermon criticizing drones on his base in Afghanistan and posted it online. Antal ... was called into the office of a general who told him to take down the sermon. "He told me that my message did not support the mission," Antal said. He worried that his views about drones could land him in a military prison if he did not leave his post.
Note: Drone strikes almost always miss their intended targets and reportedly create more terrorists than they kill. Casualties of war whose identities are unknown are frequently mis-reported to be "militants". For more along these lines, see concise summaries of deeply revealing war news articles from reliable major media sources.
Toward the end of a May 27 article in The Times about President Obama's speech in which, among other things, he mentioned setting new standards for ordering drone strikes against non-Americans, there was this rather disturbing paragraph: "Even as he set new standards, a debate broke out about what they actually meant and what would actually change. For now, officials said, 'signature strikes' targeting groups of unidentified armed men presumed to be extremists will continue in the Pakistani tribal areas." As Glenn Greenwald has pointed out, those two sentences seem to contradict the entire tenor of Mr. Obama's speech, and of a letter to Congress from Attorney General Eric Holder. Both men seemed to be saying that the administration would stop using unmanned drones to kill targets merely suspected, due to their location or their actions, of a link to Al Qaeda or another terrorist organization. Those strikes have resulted in untold civilian casualties that have poisoned America's relationship with Yemen and Pakistan. Mr. Obama talked at some length about civilian casualties, and also said that the need to use drone strikes against forces that are massing to support attacks on coalition forces will disappear once American forces withdraw from Afghanistan at the end of 2014. So what to make of that paragraph in the May 27 article? I asked the White House. What I got in response was part of a background briefing given after the president's speech that repeated the language about how the need for signature strikes will fade.
Note: Drone strikes often miss their intended targets and reportedly create more terrorists than they kill. Casualties of war whose identities are unknown are frequently mis-reported to be "militants". For more along these lines, see concise summaries of deeply revealing government corruption news articles from reliable major media sources.
The US government has long maintained, reasonably enough, that a defining tactic of terrorism is to launch a follow-up attack aimed at those who go to the scene of the original attack to rescue the wounded and remove the dead. Yet ... this has become one of the favorite tactics of the very same US government. Attacking rescuers (and arguably worse, bombing funerals of America's drone victims) is now a tactic routinely used by the US in Pakistan. In February, the Bureau of Investigative Journalism documented that "the CIA's drone campaign in Pakistan has killed dozens of civilians who had gone to help rescue victims or were attending funerals." Specifically: "at least 50 civilians were killed in follow-up strikes when they had gone to help victims." The UN special rapporteur on extrajudicial killings ... Christof Heyns, said that if "there have been secondary drone strikes on rescuers who are helping (the injured) after an initial drone attack, those further attacks are a war crime." There is no doubt that there have been. The frequency with which the US uses this tactic is reflected by this December 2011 report ... on the drone killing of 16-year-old Tariq Khan and his 12-year-old cousin Waheed, just days after the older boy attended a meeting to protest US drones: "[Witnesses] did not provide pictures of the missile strike scene. Virtually none exist, since drones often target people who show up at the scene."
Note: Drone strikes almost always miss their intended targets and reportedly create more terrorists than they kill. Casualties of war whose identities are unknown are frequently mis-reported to be "militants". For more along these lines, see concise summaries of deeply revealing government corruption news articles from reliable major media sources.
In 2009, not long after his historic election and seven years after the first U.S. drone strike, President Barack Obama accepted the Nobel Peace Prize. Since then, however, deadly U.S. drone strikes have increased sharply, as have doubts about the program's reliability and effectiveness. The latest criticism comes from Drone, a new documentary about the CIA's covert drone war. To help promote the film and inveigh against the agency's drone program ... four former operators - Stephen Lewis, Michael Haas, Cian Westmoreland and Brandon Bryant - appeared at a press conference. Speaking out can lead to veiled threats and prosecution. Which is why for years Bryant was the only drone veteran who openly rebuked the drone war. But his persistence and his appearance in the film, the other three say, inspired them to come forward. On multiple occasions, the men say they complained to their superiors about their concerns to no avail. Drone strikes kill far more civilians than the government admits. These deaths, they argue, wind up helping militant groups recruit new members and hurt the U.S.'s long-term security. By distancing soldiers from the battlefield, the operators suggest the people carrying out strikes may become even more desensitized to killing than their counterparts on the front lines. On some occasions, Haas says operators referred to children as "fun-sized terrorists" or "TITS," for "terrorists in training."
Note: A human rights attorney has stated the four former Air Force drone operators-turned-whistleblowers mentioned above have had their credit cards and bank accounts frozen. How many more have not spoken out against these abuses for fear of retaliation like this? Read more about the major failings of US drone attacks. For more along these lines, see concise summaries of deeply revealing war news articles from reliable major media sources.
Retired Army Gen. Mike Flynn, a top intelligence official in the post-9/11 wars in Iraq and Afghanistan, says in a forthcoming interview ... that the drone war is creating more terrorists than it is killing. He also asserts that the U.S. invasion of Iraq helped create the Islamic State. Flynn, who in 2014 was forced out as head of the Defense Intelligence Agency, has in recent months become an outspoken critic of the Obama administration's Middle East strategy. The former three-star general ... describes the present approach of drone warfare as a failed strategy. "What we have is this continued investment in conflict," the retired general says. "The more weapons we give, the more bombs we drop, that just fuels the conflict." In 2010, [Flynn] published a controversial report on intelligence operations in Afghanistan, stating in part that the military could not answer fundamental questions about the country and its people despite nearly a decade of engagement there. Earlier this year, Flynn commended the Senate Intelligence Committee report on CIA torture, saying that torture had eroded American values and that in time, the U.S. will look back on it, and it won't be a pretty picture.
Note: Drone strikes almost always miss their intended targets. Casualties of war whose identities are unknown are frequently mis-reported to be "militants". For more along these lines, see concise summaries of deeply revealing news articles about military corruption.
The Obama administration is again allowing the CIA to use drone strikes to secretly kill people that the spy agency does not know the identities of in multiple countries - despite repeated statements to the contrary. Apparently the drone operators didn't even know at the time who they were aiming at - only that they thought the target was possibly a terrorist hideout. It's what's known as a "signature" strike. Signature strikes have led to scores of civilians being killed over the past decade, including two completely innocent hostages ... one of whom was a US citizen ... less than two months ago. It's a way of killing that's been roundly condemned by human rights organizations and that some members of Congress have tried to outlaw. Here's how the New York Times described it: "The joke was that when the CIA sees "three guys doing jumping jacks," the agency thinks it is a terrorist training camp, said one senior official. Men loading a truck with fertilizer could be bombmakers but they might also be farmers." It has become increasingly clear that the "rules" are virtually meaningless. As is typical with the US government's extrajudicial killing policy, there was no public debate about any of the changes to the supposed rules, or even announcement that they ever changed - only an unofficial leak to a journalist after the latest killing. Beyond the enormous human rights consequences related to such a dangerous policy, these types of strikes backfire on the United States, sowing hatred in the populations of bombed countries and breeding sympathy for al-Qaida where there was none before.
Note: For more along these lines, see concise summaries of deeply revealing news articles on terrorism from reliable major media sources. Then explore the excellent, reliable resources provided in our War Information Center.
The New York Times reported on Sunday that many of those in charge of the CIA's torture program (the same people whose names were explicitly redacted from the Senate's torture report in order to avert accountability) have ascended to the agency's powerful senior ranks and now run the CIA drone program. Rather than being fired and prosecuted, they have been rewarded with promotions. The longtime Counterterrorism Center chief who just stepped down, Michael D'Andrea, was previously in charge of the notorious CIA prison known as the Salt Pit, where prisoners were regularly tortured and some died. His replacement, Chris Wood, was also central to the interrogation program, according to the Times. The only reason we know D'Andrea's and Wood's names is because the New York Times executive editor Dean Baquet commendably decided to publish them. The CIA asked them not to. Adding to the disturbing nature of the CIA's ability to kill people in complete secrecy, the agency apparently now has carte blanche to conduct drone strikes on its own. President Obama doesn't individually approve them anymore; he lets the CIA unilaterally decide to kill people. The Obama administration has promised more transparency around drone strikes, yet at the same time won't even acknowledge that the controversial drone strike it's apologizing for even happened - just because such admission might force courts to hold the government accountable for its actions.
Note: For more along these lines, see concise summaries of deeply revealing news articles about corruption in government and in the intelligence community.
About once a month, staff members of the congressional intelligence committees drive across the Potomac River to C.I.A. headquarters in Langley, Va., and watch ... footage of drone strikes. The screenings have provided a veneer of congressional oversight. The C.I.A.'s killing missions are ... unlikely to change significantly despite President Obama's announcement on Thursday that a drone strike accidentally killed two innocent hostages, an American and an Italian. Michael D'Andrea ... was chief of operations during the birth of the agency's detention and interrogation program and then, as head of the C.I.A. Counterterrorism Center, became an architect of the targeted killing program. He presided over the growth of C.I.A. drone operations and hundreds of strikes. Mr. D'Andrea was a forceful advocate for the drone program. He was particularly effective in winning the support of Senator Dianne Feinstein, the California Democrat who was chairwoman of the Senate Intelligence Committee until January. The confidence Ms. Feinstein and other Democrats express about the drone program ... stands in sharp contrast to the criticism among lawmakers of the now-defunct C.I.A. program to capture and interrogate Qaeda suspects in secret prisons. When Ms. Feinstein was asked in a meeting with reporters in 2013 why she was so sure she was getting the truth about the drone program while she accused the C.I.A. of lying to her about torture, she seemed surprised. "That's a good question, actually."
Note: The CIA has been aware that drone strikes are ineffective since at least 2009. If drones help terrorists, almost always miss their intended targets, and may be used to target people in the US in the future, what are the real reasons for the US government's drone program?
They're called lethal autonomous weapons, or LAWs, and their military mission would be to seek out, identify and kill a human target independent of human control. Representatives of 60 nations ... met in Geneva during the third week of April in an attempt to define the level of artificial intelligence needed for an international definition of robotic autonomy. The Panel of Experts, under the Convention on Conventional Weapons (CCW), will meet again next year to continue the discussion. None of the industrial nations admits having a LAW, but there's really no way to confirm the nonexistence of a weapon that would be classified as secret. The U.S. Department of Defense has had a directive in place for three years that outlines the chain of command that would approve their deployment on a case-by-case basis. It's called Directive 3000.09. On April 15, the third day of the panel meeting, Secretary of the Navy Ray Mabus ... announced the creation of a new office for unmanned warfare systems. According to Stuart Russell, who addressed the panel, "Devices in the 1-gram range might be able to selectively kill a chosen human target on contact using a shaped explosive charge. I'm not sure what countermeasures one might try against a swarm of 5-gram robots. There will be ... a LAWs arms race."
Note: Current surveillance drones can be hacked, hijacked, and redirected while flying in some cases. Under human guidance, current killer drones almost always miss their intended targets. What will happen when tiny lethal flying robots begin operating without being controlled by human decision makers?
It's been over two years since President Obama promised new transparency and accountability rules when it comes to drone strikes. Virtually no progress has been made. The criteria for who gets added to the unaccountable "kill list" are still shrouded in secrecy, even when the US government is targeting its own citizens. We know because a Texas-born man named Mohanad Mahmoud Al Farekh, recently captured overseas, was arraigned in federal court this week. It turns out, as the Times reported, that in 2013 his government debated whether he should be killed by a drone strike in Pakistan. The CIA and military were reportedly pushing hard to send drones to kill Al Farekh, but the Justice Department didn't think there was enough evidence. An important new report released by the Open Society Justice Initiative this week also shows that - despite the Obama administration's internal requirements for drone strikes that supposedly require a "near certainty" that civilians won't get killed - the government quite often just disregards its own rules, which has led to the death of dozens of civilians in Yemen in the past two years. Though without Open Society's study, the public would have no clue, since the Obama administration still steadfastly refuses to officially release any information on drone strikes in Yemen. The administration has said for years it prefers capturing to killing, but the data indicates that they practice the opposite.
Note: The CIA has been aware that drone strikes are ineffective since at least 2009. If drones help terrorists, almost always miss their intended targets, and may be used to target people in the US in the future, what are the real reasons for the US government's drone program?
Drone strikes and "targeted killings" of terror targets by the United States can be counterproductive and bolster the support of extremist groups, the CIA has admitted in a secret report released by WikiLeaks. The document, by the intelligence agency's Directorate of Intelligence, said that despite the effectiveness of "high value targeting" (HVT), air strikes and special forces operations had a negative impact by boosting the popular support of terror organisations. The CIA report is dated 2009 and talks of operations conducted in countries such as Iraq, Pakistan, Somalia, Afghanistan and Yemen. Operations against terror targets "may increase support for the insurgents, particularly if these strikes enhance insurgent leaders' lore, if non-combatants are killed in the attacks, if legitimate or semi-legitimate politicians aligned with the insurgents are targeted, or if the government is already seen as overly repressive or violent," the report said. "Senior Taliban leaders' use of sanctuary in Pakistan has also complicated the HVT effort," it reveals. "Moreover, the Taliban has a high overall ability to replace lost leaders ... especially at the middle levels." It speaks of drone strikes also having limited effect in Iraq. According to the Bureau of Investigative Journalism, US drone strikes have killed between 2,400 and 3,888 people in Pakistan in the years 2004 to 2014 and between 371 and 541 people in Yemen in the years 2002 to 2014.
Note: This report proves that the CIA has been aware that drone strikes are ineffective since at least 2009. If drones help terrorists, almost always miss their intended targets, and may be used to target people in the US in the future, what are the real reasons for the US government's drone program?
A new analysis of the data available to the public about drone strikes, conducted by the human-rights group Reprieve, indicates that even when operators target specific individuals (the most focused effort of what Barack Obama calls "targeted killing") they kill vastly more people than their targets, often needing to strike multiple times. Attempts to kill 41 men resulted in the deaths of an estimated 1,147 people, as of 24 November. Reprieve [focused on] cases in which specific people were targeted by drones multiple times. Their data, shared with the Guardian, raises questions about the accuracy of US intelligence. The analysis is a partial estimate. Drone strikes ... are only as precise as the intelligence that feeds them. "There is nothing precise about intelligence that results in the deaths of 28 unknown people, including women and children, for every bad guy the US goes after," said Reprieve's Jennifer Gibson. The data cohort is only a fraction of those killed by US drones. Neither Reprieve nor the Guardian examined ... the so-called signature strikes that attack people based on a pattern of behavior considered suspicious, rather than intelligence tying their targets to terrorist activity. An analytically conservative Council on Foreign Relations tally assesses that 500 drone strikes outside of Iraq and Afghanistan have killed 3,674 people. Like all weapons, drones will inevitably miss their targets. But the secrecy surrounding them obscures how often misses occur and the reasons for them.
Note: For more along these lines, see concise summaries of deeply revealing military corruption news articles from reliable major media sources, including this NPR article that reports on the possibility of future drone strikes taking place within the US.
It has been more than two years since The New York Times revealed that Mr. Obama embraced a disputed method for counting civilian casualties of his drone strikes, which in effect counts all military-age males in a strike zone as combatants ... unless there is explicit intelligence posthumously proving them innocent. The paper noted that this counting method may partly explain the official claims of extraordinarily low collateral deaths, and even quoted CIA officials as "deeply troubled" by this decision. After the Times article, most large western media outlets continued to describe completely unknown victims of U.S. drone attacks as "militants" even though they (a) had no idea who those victims were or what they had done and (b) were well aware by that point that the term had been redefined by the Obama administration. Like the U.S. drone program itself, this deceitful media practice continues unabated. The U.S. government itself, let alone the media outlets calling them "militants," often has no idea who has been killed by drone strikes in Pakistan. The Intercept previously reported that targeting decisions can even be made on the basis of nothing more than metadata analysis and tracking of SIM cards in mobile phones. Just last month, the Bureau of Investigative Journalism documented that fewer than 4% of the people killed have been identified by available records as named members of al Qaeda.
Note: For more along these lines, see concise summaries of deeply revealing news articles about military corruption and high level manipulation of mass media from reliable sources.
The Government has secretly ramped up a controversial programme that strips people of their British citizenship on national security grounds, with two of the men subsequently killed by American drone attacks. Since 2010, the Home Secretary, Theresa May, has revoked the passports of 16 individuals, many of whom are alleged to have had links to militant or terrorist groups. Critics of the programme warn that it allows ministers to wash their hands of British nationals suspected of terrorism who could be subject to torture and illegal detention abroad. They add that it also allows those stripped of their citizenship to be killed or rendered without any onus on the British Government to intervene. At least five of those deprived of their UK nationality ... were born in Britain, and one man had lived in the country for almost 50 years. Those affected have their passports cancelled and lose their right to enter the UK, making it very difficult to appeal. The leading human rights lawyer Gareth Peirce said the present situation "smacked of mediaeval exile, just as cruel and just as arbitrary." Ian Macdonald QC, the president of the Immigration Law Practitioners Association, described the citizenship orders as "sinister. It's not open government; it's closed, and it needs to be exposed." Government officials act while people are out of the country (on two occasions while the targets were on holiday) before cancelling passports and revoking citizenships.
Note: For deeply revealing reports from reliable major media sources on crimes committed in wars of aggression, click here.
U.S. drone strikes in Pakistan have killed far more people than the United States has acknowledged, have traumatized innocent residents and largely been ineffective, according to a new study released [on September 25]. The study by Stanford Law School and New York University's School of Law calls for a re-evaluation of the practice, saying the number of "high-level" targets killed as a percentage of total casualties is extremely low -- about 2%. In contrast to more conservative U.S. statements, the Stanford/NYU report -- titled "Living Under Drones" -- offers starker figures published by The Bureau of Investigative Journalism, an independent organization based at City University in London. Based on interviews with witnesses, victims and experts, the report accuses the CIA of "double-striking" a target, moments after the initial hit, thereby killing first responders. It also highlights harm "beyond death and physical injury," publishing accounts of psychological trauma experienced by people living in Pakistan's tribal northwest region, who it says hear drones hover 24 hours a day. "Before this we were all very happy," the report quotes an anonymous resident as saying. "But after these drones attacks a lot of people are victims and have lost members of their family. A lot of them, they have mental illnesses." People have to live with the fear that a strike could come down on them at any moment of the day or night, leaving behind dead whose "bodies are shattered to pieces," and survivors who must be desperately sped to a hospital.
Note: Visit the Living Under Drones website here. For a Democracy Now! report on the results of this study click here. For more analysis click here and here.
The Sept. 11 attacks triggered a revolution in U.S. spycraft as the intelligence services shattered a longstanding taboo by launching an expansive program of targeted killings by remote control. The greatest shift both in tactics and mindset has been the embrace of the pilotless, hunter-killer aircraft known as drones. The CIA, which doesn't formally acknowledge the covert program, has killed about 2,000 militants with drones, U.S. officials say, most in the past two years as President Barack Obama's national security team aggressively expanded the program. In 2010, the number of drone strikes more than doubled, to 114, and this year, drone campaigns are expanding. The CIA now plans flights in Yemen, and the military is using drones to kill militants in Somalia. Legal challenges to the drone program have secured little traction. The main debate inside the government has been over how to execute the campaign without irreversibly damaging Pakistani cooperation. American citizens can be targets, too. Under the legal authority for the drone program, the CIA must consult the National Security Council before capturing an American posing an imminent threat, but no additional consultation is required to kill an American, a former senior intelligence official said. "The reason there hasn't been more of an outcry about it is, it's the Obama administration defending this authority," said the American Civil Liberties Union's Jameel Jaffer.
Note: For lots more on the illegal methods employed by the CIA and Pentagon in its "endless war", click here.
"Gentlemen! We have called you together to inform you that we are going to overthrow the United States government." So begins a statement being delivered by Gen. Carl W. Steiner. At least the voice sounds amazingly like him. But it is not Steiner. It is the result of voice "morphing" technology developed at the Los Alamos National Laboratory in New Mexico. Psychological operations ... PSYOPS, as the military calls it, seek to exploit human vulnerabilities in enemy governments, militaries and populations to pursue national and battlefield objectives. Covert operators kicked around the idea of creating a computer-faked videotape of Saddam Hussein crying or showing other such manly weaknesses, or in some sexually compromising situation. The nascent plan was for the tapes to be flooded into Iraq and the Arab world. The tape war never proceeded ... but the "strategic" PSYOPS scheming didn't die. What if the U.S. projected a holographic image of Allah floating over Baghdad, urging the Iraqi people and Army to rise up against Saddam? a senior Air Force officer asked in 1990. According to a military physicist given the task of looking into the hologram idea, the feasibility had been established of projecting large, three-dimensional objects that appeared to float in the air. A super-secret program was established in 1994 to pursue the very technology for PSYOPS application. The "Holographic Projector" is described in a classified Air Force document as a system to "project information power from space ... for special operations deception missions."
Note: Read declassified Air Force documents about holographic projector technology and Air Force research papers about its psychological operations implications. If the above link fails, click here. Watch this video to see how easily anyone's face can be manipulated on video to say anything in a way that appears startlingly real. Watch videos of highly sophisticated holographic projector technologies in Dubai and South Korea. For other revealing news articles on the use of these "nonlethal" weapons, see this webpage.
Important Note: Explore our full index to revealing excerpts of key major media news stories on several dozen engaging topics. And don't miss amazing excerpts from 20 of the most revealing news articles ever published.