Family members of three U.S. citizens killed last year in drone strikes in Yemen recently filed a lawsuit accusing U.S. intelligence and military officials of violating the victims’ rights under the U.S. Constitution and international law.

Prepared by the American Civil Liberties Union and the Center for Constitutional Rights, the lawsuit marks a major legal challenge to the U.S. policy of extrajudicial “targeted killings” of suspected terrorists far from traditional battlefields, such as Afghanistan.

“These killings rely on vague legal standards, a closed executive process, and evidence never presented to the courts,” according to the 17-page complaint, which noted that the practice has “resulted in the deaths of thousands of people, including many hundreds of civilian bystanders,” in Yemen, Somalia, Pakistan, Sudan, and the Philippines since 2001...
The tragedy that played out in an Aurora movie theater Friday had an uncanny parallel in a classroom learning exercise at a medical school in Parker the same day.

Rocky Vista University College of Osteopathic Medicine is in the middle of holding specialized classes in disaster life support for 150 second-year medical students. Along with responses to natural disasters such as hurricanes and floods, and to terrorist attacks, one of the scenarios being used to train the students is how to respond if a shooter fires at people in a movie theater and also uses a bomb in the attack.

"The irony is amazing, just amazing," said Rocky Vista Dean Dr. Bruce Dubin.

He said emergency specialist physicians from Parkland Hospital in Dallas as well as from several other emergency programs around the country are teaching the Advanced Disaster Life Support Training. Rocky Vista is the only medical school in the nation to make that training a part of the curriculum.

"They are trained to respond in every type of disaster," Dubin said.

The shootings in Aurora were incorporated into the teaching Friday, Dubin said.

"It made these medical students very aware that these kinds of things can happen anywhere," he said. "The events of this tragedy have helped to drive that home."
In Israel, a heated debate is underway about whether Israel’s Interior Ministry will move ahead with the creation of a governmental biometric database containing digital fingerprints and facial photographs, which would be linked to “smart” national ID cards containing microchips. At the heart of the issue is a major concern about privacy: Aggregated personal information invites security breaches, and large databases of biometric information can be honeypots of sensitive data vulnerable to exploitation.

On July 23, Israel’s High Court of Justice held a hearing on a petition filed by civil rights advocates who sought to strike down a law establishing a governmental biometric database and an associated two-year pilot program.

The law approving the database, enacted in 2009, met with public resistance until the government backed down and agreed to begin with only the pilot program. The pilot was supposed to be a test for determining whether it was actually necessary to move forward with building the biometric database, but an Interior Ministry decree that sanctioned the program did not actually contain any criteria to measure whether the program succeeded or failed.

While three justices voiced harsh criticism of the database, they didn’t move to cancel the project altogether. Instead, they determined that the pilot program description has to present clear criteria for success and failure, so that it would be conducted as a true test. The ruling requires the Interior Ministry to examine the very necessity of a central database, and to seriously weigh possible alternatives. The court also called for an independent review of the program, and preserved petitioners’ right to return and present their claims against the database and pilot program.

In the course of the hearing, several justices characterized the proposed database as a “harmful” and “extreme” measure. They have good reason to be skittish: Last fall, officials discovered that information in Israel’s primary population database had been hacked in 2006, and the personal records of some 9 million Israelis—both living and dead—were uploaded to the Internet and made freely available. The database contained substantial information including full names, identity numbers, addresses, dates of birth and death, immigration dates and familial relationships. Given this blemished track record, there is naturally a concern that a database that also contained biometric information would meet the same fate...
Finally, Osaka University’s infant robot Affetto is able to hug you like a real baby, with the added benefit of paralyzing you with terror. A year and a half ago, Affetto wasn’t much more than a face, but a video from the university’s Asada Laboratory, discovered by Plastic Pals, shows a newly limbed and skinless Affetto using his newfound freedom to swing his arms around. Things get even more unsettling around the 40-second mark, when Affetto gets draped in a grey hoodie, before finally having his face returned to him. The improvements to the robot include 20 pneumatic actuators, which allow him to freely move his arms, neck, and spine.

The project falls under the umbrella of "cognitive developmental robotics," which uses robotics to study human development — in particular, the interplay between caregivers and developing babies. The laboratory’s site explains, "interacting with the environment and people nearby is an important factor in development. In order to create the same conditions as with a real child, we’re developing a child robot that’s the same size, with a soft body, rich facial expressions, and small hands."
Researchers reporting online on July 26 in Current Biology have for the first time shown that they can control the behavior of monkeys by using pulses of blue light to very specifically activate particular brain cells. The findings represent a key advance for optogenetics, a state-of-the-art method for making causal connections between brain activity and behavior. Based on the discovery, the researchers say that similar light-based mind control could likely also be made to work in humans for therapeutic ends.

"We are the first to show that optogenetics can alter the behavior of monkeys," says Wim Vanduffel of Massachusetts General Hospital and KU Leuven Medical School. "This opens the door to use of optogenetics at a large scale in primate research and to start developing optogenetic-based therapies for humans."

In optogenetics, neurons are made to respond to light through the insertion of light-sensitive genes derived from particular microbial organisms. Earlier studies had primarily validated this method for use in invertebrates and rodents, with only a few studies showing that optogenetics can alter activity in monkey brains on a fine scale.

In the new study, the researchers focused on neurons that control particular eye movements. Using optogenetics together with functional magnetic resonance imaging (fMRI), they showed that they could use light to activate these neurons, generating brain activity and subtle changes in eye-movement behavior.

The researchers also found that optogenetic stimulation of their focal brain region produced changes in the activity of specific neural networks located at some distance from the primary site of light activation.

The findings not only pave the way for a much more detailed understanding of how different parts of the brain control behavior, but they may also have important clinical applications in treating Parkinson's disease, addiction, depression, obsessive-compulsive disorder, and other neurological conditions.

"Several neurological disorders can be attributed to the malfunctioning of specific cell types in very specific brain regions," Vanduffel says. "As already suggested by one of the leading researchers in optogenetics, Karl Deisseroth from Stanford University, it is important to identify the underlying neuronal circuits and the precise nature of the aberrations that lead to the neurological disorders and potentially to manipulate those malfunctioning circuits with high precision to restore them. The beauty of optogenetics is that, unlike any other method, one can affect the activity of very specific cell types, leaving others untouched."
As you scan the face on that giant billboard, it may just be scanning your face right back.

Increasingly sophisticated digital facial-recognition technology is opening new possibilities in business, marketing, advertising and law enforcement while exacerbating fears about the loss of privacy and the violation of civil liberties.

Businesses foresee a day when signs and billboards with face-recognition technology can instantly scan your face and track what other ads you’ve seen recently, adjust their message to your tastes and buying history and even track your birthday or recent home purchase. The FBI and other U.S. law enforcement agencies already are exploring facial-recognition tools to track suspects, quickly single out dangerous people in a crowd or compare a grainy security-camera image against a vast database in search of a match.

Many fear that future is coming too quickly, with facial-recognition technology becoming increasingly advanced, available and affordable before restrictions on its use can be put into place. Concerns have been raised on Capitol Hill in recent weeks that FBI searches using the technology could trample Fourth Amendment protections against unreasonable search and seizure, while some in the industry say excessive regulations could cripple cutting-edge technology.

“In our country, government shouldn’t be looking over your shoulder unless it has a reason,” said Jay Stanley, senior policy analyst with the American Civil Liberties Union’s speech, privacy and technology project. “They should not be collecting data on innocent subjects...”
Small surveillance drones are becoming part of police departments across America, and the FAA will soon open up the airspace for more to come. This drone invasion has already raised all kinds of privacy concerns. And if you think that's bad, across the ocean, Russia seems hell-bent on outdoing its former Cold War enemy.

Russia's leading manufacturer of unmanned aerial vehicles, Zala Aero, has provided the Russian government with more than 70 unmanned systems, each containing several aircraft. According to an article published Monday on Open Democracy Russia, the Kremlin's romance with drones started in 2006, when the Interior Ministry deployed a Zala 421-04M to monitor street protests at a G8 summit in St. Petersburg. The Russian government has also bought drones from Israel.

Vladimir Putin himself appears ready to jump on the drone bandwagon. "We need a program for unmanned aircraft. Experts say this is the most important area of development in aviation," he said in early June. "We need a range of all types, including automated strike aircraft, reconnaissance and other types." Indeed, Russia allegedly plans to spend around $13 billion on unmanned aerial vehicles through 2020.

According to Zala executive Maksim Shinkevich, almost every Interior Ministry air group has a drone these days. Their favorite? The Zala 421-08M, a 5.5-pound unmanned vehicle with a 31-inch wingspan, equipped with a camera, that can fly for 90 minutes at almost 12,000 feet. At the right angle, a drone like this can take a quality snapshot of a car's license plate. What about, say, a protester's face? "Capturing faces in any detail would however require a very heavy drone with a good camera; more precisely, with a heavy, specialized platform," Shinkevich told Open Democracy Russia.

No matter: small drones like the Zala 421-06 are perfect for monitoring dissatisfied Russians marching down the streets. "They will be used mainly to maintain public order during local demonstrations and marches, when we shall be keeping watch from the air to avoid any incidents," said Sergei Kanunnikov, the head of the air operation center in the Department of the Interior of the eastern Amur region.

Drones will also be deployed at the 2014 Winter Olympics in Sochi, a Russian city on the Black Sea. And Sochi won't be the first Olympic city to secure its skies with robots; London will do the same starting this weekend...
The day is likely coming when you could see someone on the street, aim your smartphone at her, and quickly retrieve a list of possible matches for who she is.

That’s because facial-recognition software is getting smarter the more people use it.

Consider Facebook, for example. Every time a user confirms or rejects photo-tagging suggestions the social network gets better at knowing who’s actually in photos people put up on the site...
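That confirm-or-reject loop is, in essence, online learning: each confirmed tag becomes a fresh training example for that person's face model. A toy sketch of the idea follows; the "embeddings," names, and nearest-neighbor update rule here are illustrative assumptions, not Facebook's actual system.

```python
# Toy sketch: each person is modeled as a running average of face
# "embeddings" (numeric feature vectors). Confirmed tags pull the
# average toward new examples, so later suggestions get better.
people = {}  # name -> (mean_embedding, confirmed_count)

def suggest(embedding):
    """Suggest the enrolled name whose mean embedding is closest."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(people, key=lambda name: dist(people[name][0], embedding))

def confirm_tag(name, embedding):
    """User confirmed a tag: fold this example into the person's average."""
    if name not in people:
        people[name] = (list(embedding), 1)
        return
    mean, n = people[name]
    new_mean = [(m * n + e) / (n + 1) for m, e in zip(mean, embedding)]
    people[name] = (new_mean, n + 1)

confirm_tag("alice", [0.9, 0.1])
confirm_tag("bob", [0.1, 0.9])
confirm_tag("alice", [0.8, 0.2])   # another confirmed photo of alice
print(suggest([0.85, 0.15]))       # → alice
```

Real systems use deep networks and far richer features, but the feedback principle is the same: every confirmation or rejection is labeled data that sharpens the model.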
The Santa Fe City Council is looking at plans that could install anywhere from a few dozen to nearly 200 surveillance cameras in public places around the city.

Santa Fe has had a continuing battle with burglars breaking into cars and homes. So now the city wants to bring in extra eyes, a lot of extra eyes, to help stop thieves.

Citywide surveillance cameras are nothing new. London has thousands of them.

Now it looks like Santa Feans might need to start prepping for their close-ups.

Santa Fe's Dale Ball Trails don't look anything like an urban center, but that doesn't mean it's a crime-free spot.

Plenty of parked cars with few people around is a tempting sight for crooks and explains the "High Prowl Area" sign warning people to secure their valuables.

"Shortly after they opened the trails I had my car broken into," one visitor told KRQE News 13.

"I've seen so many broken windows over here," added another.

It's not just a problem at the trailhead.

"We have trails, we have business districts, we have parks, and the police can't be everywhere all the time," Santa Fe Mayor David Coss said...
Tuesday, July 24, 2012
"If you're concerned about it, maybe there's a reason we should be flying over you, right?" That’s the callous response of one drone trade group representative when asked his opinion of those who worry about the increasing use of the unmanned aerial vehicles and the corresponding decrease in privacy and civil liberties. The man who spoke those words is Douglas McDonald, the director of special operations for Unmanned Applications Institute International and president of a North Dakota chapter of an unmanned vehicle trade group. Another North Dakotan has a different take on the use of drones in the Flickertail State.

It’s been about a year since a North Dakota man was arrested after a local SWAT team tracked him down using a Predator drone it borrowed from the Department of Homeland Security. Although the story has not been widely reported, Rodney Brossart became one of the first (if not the first) American citizens arrested by local law enforcement with the help of a federally owned surveillance drone, after holding the police at bay for over 16 hours.

Brossart’s run-in with law enforcement began after six cows found their way onto his property (about 3,000 acres near Lakota, North Dakota) and he refused to turn them over to officers. In fact, according to several sources, Brossart and a few family members ran police off his farm at the point of a gun. Naturally, police weren’t pleased with Brossart’s brand of hospitality, so they returned with a warrant, a SWAT team, and a determination to apprehend Brossart and the cows.

A standoff ensued, and the Grand Forks police SWAT team made a call to a local Air Force base where they knew a Predator drone was deployed by the DHS. About three years before the Brossart incident, the police department had signed an agreement with DHS for the use of the drone. No sooner did the call come in than the drone was airborne, and Brossart’s precise location was pinpointed with laser-guided accuracy. The machine-gun-toting SWAT officers rushed in, tased and then arrested Brossart on various charges, including terrorizing a sheriff, and the rest is history. Literally.

As the matter proceeds through the legal system, Bruce Quick, the lawyer representing Brossart, is decrying the “guerilla-like police tactics” used to track and capture his client, as well as the alleged violation of the Fourth Amendment’s protection against unwarranted searches and seizures. While the police admittedly possessed an apparently valid search warrant, Quick asserts that no such judicial go-ahead was sought or obtained for the use of the Predator to track the suspect. Therein lies the constitutional rub.

In an interview with the press, Quick claims that the police exceeded their authority in several instances, especially when they decided to go around the Fourth Amendment and illegally search Brossart’s farm. "The whole thing is full of constitutional violations," he said. Quick goes so far as to call the police’s use of the taser "tortuous" and something only slightly below "water-boarding..."
High-tech security? Forget those irksome digital eye scans. Meet the biometric shoe.

A new lab is working to perfect special shoe insoles that can help monitor access to high-security areas, like nuclear power plants or special military bases.

The concept is based on research showing that each person has unique feet and a unique way of walking. Sensors in the bio-soles check the pressure of the feet, monitor gait, and use a microcomputer to compare the patterns to a master file for that person. If the patterns match, the bio-soles go to sleep. If they don't, a wireless alarm message can go out.
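The matching logic the article describes, comparing a live gait pattern against a stored "master file," sleeping on a match and alarming on a mismatch, can be sketched roughly as follows. Note this is a toy illustration: the feature set, threshold, and function names are assumptions for demonstration, not Autonomous ID's actual design.

```python
import math

# Hypothetical "master file": a vector of enrolled gait features
# (e.g., average heel pressure, stride interval, toe-off force).
MASTER_TEMPLATE = [0.82, 0.61, 1.10]
MATCH_THRESHOLD = 0.15  # hypothetical distance cutoff for a match

def gait_distance(sample, template):
    """Euclidean distance between a live gait sample and the template."""
    return math.sqrt(sum((s - t) ** 2 for s, t in zip(sample, template)))

def check_step(sample, template=MASTER_TEMPLATE):
    """Return 'sleep' on a match (authorized wearer) or 'alarm' otherwise."""
    if gait_distance(sample, template) <= MATCH_THRESHOLD:
        return "sleep"  # patterns match: bio-sole goes back to sleep
    return "alarm"      # mismatch: send the wireless alarm message

# The article says the sole decides "within the third step":
steps = [[0.80, 0.62, 1.08], [0.83, 0.60, 1.12], [0.81, 0.61, 1.09]]
decisions = [check_step(s) for s in steps]
print(decisions)  # → ['sleep', 'sleep', 'sleep'] for the enrolled wearer
```

A real system would presumably fuse many more features and tune the threshold to hit the reported 99 percent accuracy, but the sleep-or-alarm decision loop is the core idea.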

"It's part of a shoe that you don't have to think about," said Marios Savvides, head of Carnegie Mellon University's new Pedo-Biometrics Lab, in Pittsburgh.

The lab, which has $1.5 million in startup funding, is a partnership with Autonomous ID, a Canadian company that is relocating to several U.S. cities. Todd Gray, the company president, said he saw the potential when his daughter was in a maternity ward decorated with representations of different baby feet all along a wall.

Autonomous ID has been working on prototypes since 2009, with the goal of making a relatively low cost ID system. Gray said they've already run tests on sample bio-soles, which are no thicker than a common foot pad sold in pharmacies, and achieved an accuracy rate of more than 99 percent. He said Carnegie Mellon will broaden the tests to include "a full spectrum of society: big, tall, thin, heavy, athletic, multicultural, on a diet, twins and so on."

Gray wouldn't speculate on what the system will cost or when it might reach the marketplace, but each worker at a site would have his or her own pair of bio-soles.

"Within the third step, it knows it's you, and it goes back to sleep," he said. "If I put on yours, it would know almost instantly that I'm not you..."
Drones are not simply a moral issue. Like debating the legitimacy of air strikes, ground invasions or cruise missile strikes, deliberating on the use of drones is a use-of-force question. To the extent that their use is supposed to follow the moral standards of war, the first question we have to ask ourselves is: under what conditions is their lethal use legitimate? To gain purchase on the ethical dilemmas posed by drones, one needs, first, to know the moral and historical context in which the use of drones emerged as the weapon of choice of President Obama. This is linked to a partial transition away from the Bush Doctrine.

Early in his presidential campaign in 2008, Obama stated he wanted to repudiate the "mind-set that got us into [the Iraq] war in the first place." That mind-set included Bush's willingness to snub allies, such as France and Germany, and undertake a pre-emptive war, or what is sometimes now distinguished as preventive war, against Iraq in the name of self-defence. Obama's rhetoric thus struck a more cautious tone that emphasized the importance of last resort and multilateralism.

In his Nobel Peace Prize acceptance speech in 2009, Obama referenced the importance of the just war tradition in guiding the use of force: "And over time, as codes of law sought to control violence within groups, so did philosophers and clerics and statesmen seek to regulate the destructive power of war. The concept of a 'just war' emerged, suggesting that war is justified only when certain conditions were met: if it is waged as a last resort and in self-defense; if the force used is proportional; and if, whenever possible, civilians are spared from violence." The 2010 National Security Strategy - the document that outlines the foreign policy threats facing the U.S. and how the administration plans to deal with them - echoes this cautious war philosophy. The language of pre-emptive war that predominated in Bush's National Security Strategies of 2002 and 2006 was removed, and a more cautious language that echoed the notion of last resort was employed: "While the use of force is sometimes necessary, we will exhaust other options before war whenever we can, and carefully weigh the costs and risks of inaction." The document goes on to emphasize the importance of using force in ways that "reflects our values and strengthens our legitimacy" and stresses the need for "broad international support."

The notion of last resort is important here because it suggests that Obama sees the use of force as something that ought to be avoided, if possible. This means that force should not be used unless a threat is imminent, and even in cases in which it is, all reasonable means of forestalling the threat should be tried first.

In some respects, Obama has toed the line. The Libya campaign was a multilateral effort aimed, at least initially, at protecting civilians from imminent threat of slaughter. When dealing with the looming threat of Iran, Obama has emphasized diplomatic measures designed to isolate the regime. Finally, he has shown restraint by not rushing to war to stop the bloodshed in Syria because of a lack of international support. When it comes to large-scale force, Obama has, it seems, turned the page from the Bush Doctrine.

But this does not mean he has completely rejected the idea that the U.S. could 'go it alone' and act pre-emptively. As the National Security Strategy unequivocally states: "The United States must reserve the right to act unilaterally if necessary to defend our nation and our interests, yet we will also seek to adhere to the standards that govern the use of force." This leads us to the dilemmas posed by drones.

In using drones, Obama continues to act on the Memorandum of Notification, signed by President Bush in the weeks following 9/11, which gave the CIA the right to kill members of Al-Qaeda in anticipatory self-defense virtually anywhere in the world. Obama has continued the war on Al-Qaeda, using drones to relentlessly pursue its members by denying them safe haven and killing them with targeted strikes, some of which have killed civilians. While the administration claims important successes in decimating Al-Qaeda, skeptics point to the link between purported civilian casualties and terrorist recruitment, as well as the growing presence of potentially affiliated branches in Africa and Yemen, to suggest that the war against extremism is far from being won by drones. The cause of this criticism is the impression that the Obama administration is not living up to its own values...
Wednesday, July 18, 2012
It's time to think of Google as much more than just a search engine, and that should both excite and spook you.

Search remains critical to the company's financial and technological future, but Google also is using the search business' cash to transform itself into something much broader than just a place to point your browser when asking for directions on the Internet.

What it's now becoming is an extension of your mind, an omnipresent digital assistant that figures out what you need and supplies it before you even realize you need it.

Think of Google diagnosing your daughter's illness early based on where she's been, how alert she is, and her skin's temperature, then driving your car to school to bring her home while you're at work. Or Google translating an incomprehensible emergency announcement while you're riding a train in a foreign country. Or Google steering your investment portfolio away from a Ponzi scheme.

Google, in essence, becomes a part of you. Imagine Google playing a customized audio commentary based on what you look at while on a tourist trip and then sharing photo highlights with your friends as you go. Or Google taking over your car when it concludes based on your steering response time and blink rate that you're no longer fit to drive. Or your Google glasses automatically beaming audio and video to the police when you say a phrase that indicates you're being mugged.

Exciting? I think so. But it's also, potentially, a profoundly creepy change. For a Google-augmented life, you must grant the Googlebot unprecedented privileges to monitor your personal information and behavior. What medicine do you take? What ads did you just glance at while walking by the bus stop? What's your credit card number? And as Google works to integrate social data into its services, you'll have to decide how much you'll share with your contacts' Google accounts -- and the best way to ask them to share their data with your Google account.

Where your Google comfort zone ends

It'll be foolhardy to be as cavalier with tomorrow's Google as you might be with it today. I think some of those sci-fi possibilities I just described could be real within three to five years, so now is a good time to start thinking about where your Google comfort zone ends.

Me? I'm immersed in Google services, but I worry that handy new features will arrive in a steady stream of minor changes that are all but imperceptible until one day I wake up and realize that Google has access to everything that makes me who I am.

Google Now says it needs access to my calendar? Sounds useful. My Android phone needs to turn on my phone's microphone so the Google Maps app can judge by ambient noise whether I'm indoors or outdoors? Well, that'll help me get through the airport faster. My glasses need to identify the faces of people in my company so Google can deduce who gets consigned to the Google Voice answering machine and who gets through to my phone even at 3 a.m.? Well, I sure don't want to have to set all that up manually...
The explosion in government use of unmanned aerial surveillance drones is paying dividends for law enforcement authorities, yet it is exacerbating angst over terrorist attacks from within, air traffic safety and the risk of "eyes in the sky" hovering over law-abiding citizens.

At least 106 agencies across the country have permission from the Federal Aviation Administration to operate 207 drones - numbers that are expected to increase as the FAA speeds approval of low-flying drones into the U.S. aviation system by 2015, as required by Congress.

Drones have been a game changer in combat in Iraq and Afghanistan as well as providing real-time surveillance of the U.S.-Mexico border, but their proliferation has prompted concerns mainly over national security.

Last week, a suspect in Boston agreed to plead guilty to federal charges involving a plot to fly remote controlled aircraft loaded with explosives to collapse the dome of the U.S. Capitol and attack the Pentagon.

Texas Republican U.S. Rep. Michael McCaul has summoned experts, including Texas' Montgomery County Sheriff Randy McDaniel and University of Texas-Austin engineering professor Todd Humphreys, to testify Thursday before his oversight and investigations panel within the House Committee on Homeland Security.
Americans have been protesting and getting arrested at U.S. drone bases and research institutions for years, and some members of Congress are starting to respond to the pressure.

But it's not that drones are being used to extra-judicially execute people, including Americans, in Afghanistan, Pakistan, Yemen and Somalia that has U.S. lawmakers concerned. Rather it's the possible and probable violation of Americans' privacy in the United States by unlawful drone surveillance that has caught the attention of legislators.

Rep. Jeff Landry, R-La., says "there is distrust amongst the people who have come and discussed this issue with me about our government. It's raising alarm with the American public." Based on those discussions, Landry has placed a provision in a defense spending bill that would prohibit information gathered by drones without a warrant from being used as evidence in court.

Two other legislators, Rep. Austin Scott, R-Ga., and Sen. Rand Paul, R-Ky., introduced identical bills to bar any government agency from using a drone without a warrant to "gather evidence or other information pertaining to criminal conduct or conduct in violation of a regulation."

No one in Congress, however, has introduced legislation requiring the government to provide to a neutral judge evidence of a criminal act committed by a person to be targeted for assassination by a drone, or allowing such a person the right to defend himself against the U.S. government's allegations.

Under President Obama's signature national security policy, being a young male in the tribal region of Pakistan is often sufficient evidence to warrant execution. The kill committee members from the National Security Agency, the Central Intelligence Agency and the Department of Defense act as the prosecution, judge and jury for "low-level" targets. The president, consulting his "kill list," makes the decision on "high-value" targets, including American citizens...
A silent drone flown by U.S. Special Forces could stay in the air forever, in theory, if power were being beamed to it from a laser on the ground. That exciting possibility came up during an indoor flight test showing how a laser could power a Stalker drone for 48 hours.

The electric version of Lockheed Martin's Stalker has a battery that usually lasts just two hours, but in the test, a laser power system wirelessly recharged the drone's battery in midair, keeping it aloft 24 times as long. Such a system, if proven in actual outdoor flights, could give U.S. Special Forces a steady robot friend in the sky to watch for targets or approaching enemies.

"This test is one of the final steps in bringing laser-powered flight to the field," said Tom Nugent, president of LaserMotive, which made the laser power system used in the test. "By enabling in-flight recharging, this system will ultimately extend capabilities, improve endurance and enable new missions for electric aircraft."

LaserMotive previously made its mark by winning NASA's space elevator contest — a challenge to build machines powered by laser beams that can climb a cable toward the sky.

The drone flight test took place in a wind tunnel to simulate flying conditions. By the end of the test, the Stalker's battery actually held more stored energy than it did at the beginning.

"We're pleased with the results of this test," said Tom Koonce, the Stalker program manager of Lockheed Martin Skunk Works. "Laser power holds real promise in extending the capabilities of Stalker..."
Sunday, July 15, 2012
Cellphones, e-mail, and online social networking have come to rule daily life, but Congress has done nothing to update federal privacy laws to better protect digital communication. That inattention carries a heavy price.

Striking new data from wireless carriers collected by Representative Edward Markey, a Massachusetts Democrat, and first reported last week by Eric Lichtblau of The Times, showed surging use of cellphone surveillance over the past five years by law enforcement agencies at every level and for crimes both mundane and serious.

Wireless carriers reported responding to a whopping 1.3 million demands from law enforcement agencies for subscriber information, including location data, calling records and text messages. The number of people whose information was turned over is almost certainly much higher because a single request for a cell tower “dump” could sweep in the names of thousands of people connected to a given tower at a certain time.

As cell surveillance has ballooned, federal and local officials have come to rely less on wiretapping to eavesdrop on conversations, probably because cell tracking is less time consuming and less legally difficult to manage. In most cases, law enforcement officers do not need to hear the actual conversation; what they want to know can be discerned from a suspect’s location or travel patterns. And location data can be as revealing of a cellphone owner’s associations, activities and personal tastes as listening in on a conversation, for which a warrant is mandatory.

As a result, warrants for wiretaps, which are subject to stringent legal standards used for decades, declined by 14 percent last year, to just 2,732 nationwide. The legal standards applied to cell tracking and other forms of digital monitoring are more lax and inconsistently applied, with many law enforcement agencies claiming a right to such data without having to show a compelling need or getting detailed vetting by a court.

Clearly, federal laws need to be revamped and brought into line with newer forms of surveillance. A good place to begin is the Electronic Communications Privacy Act, the main federal statute governing access to electronic information. The act has not had a significant overhaul since its passage in 1986...
We need a privacy revolution. The movement must be built as an alliance uniting people concerned about liberty from the right, the left and the center. In an era of economic austerity and depression, the only thing that will force these vital issues to the forefront is a whole lot of (organized) noise.

The government isn't waiting for our approval, either tacit or overt. The ACLU has maybe hundreds of lawyers on deck to challenge these abuses. The FBI, just to take one example from a sprawling surveillance state matrix, has 40,000 agents. The cards aren't stacked evenly. In order to make a real impact, people need to raise their voices. Civil liberties organizations cannot do it alone.

Michelle Alexander is often asked how we can push back against the drug war and its mass incarceration disaster. She says nothing less than a mass movement will get us what we need. Her lesson applies equally to the growing surveillance state. Yes, we need to pass the GPS Act. Yes, we need to update electronic communications privacy law.

But to get there, and to change the fundamental relations of power between the government and the governed, we need a mass social movement for privacy.
How many eyes in the sky are there over these United States?

At least 18 police departments, universities and other government agencies have received clearance from the federal government to send up a range of unmanned aerial vehicles, or drones, according to documents unearthed by a Freedom of Information Act request filed by the Electronic Frontier Foundation.

Among them:

The Mississippi Department of Marine Safety has a 35-ounce unmanned helicopter made of carbon fiber, equipped with still and video cameras.

The Texas Department of Public Safety, based in Austin, has its own, called the WASP, to “support critical law enforcement operations in South Texas.”

And the United States Department of Agriculture deploys a drone – named the Bat — to pick up “thermal infrared data” on experimental field sites in Georgia and Alabama.

The new documents came as part of a drone census conducted by EFF and MuckRock, a Web site that helps file and collate public information requests. The documents are important because, so far, little is known about how prevalent drones have become in domestic airspace.

The Federal Aviation Administration has issued licenses for the civilian use of drones, mainly to research and law enforcement agencies. Their use is expected to grow in the coming months.

The Obama administration earlier this year approved commercial use of drones, opening up airspace to businesses of all kinds, from those that seek aerial photographs to sell real estate to those that are keen to monitor oil spills. The new drone law will also make it easier for law enforcement agencies to obtain licenses to deploy drones of their own – and will inevitably raise questions about civil liberties and the limits of surveillance.
A debate between privacy and protection is heating up again, and Facebook is front and center.

It's no secret the stuff you post publicly online can be monitored, but your private chats, too? According to Reuters, the answer is yes.

Facebook's chief security officer admits Facebook users are being monitored for any suspected criminal activity, and it's not just the stuff you post on timelines.

The company says it monitors personal chats as well: its software scans them for certain phrases, exchanges of personal information and vulgar language.

If the software sees something suspicious, it flags the exchange; only then does a human read it. A security team then reviews the chat and contacts police if needed.
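The two-stage triage described here (automated pattern matching first, human review only for flagged messages) can be sketched in a few lines. This is an illustrative toy, not Facebook's actual classifier; the patterns and function names are invented for the example:

```python
import re

# Illustrative patterns only -- the real system's classifiers are not public.
SUSPICIOUS_PATTERNS = [
    re.compile(r"\bmeet\b.*\bafter school\b", re.IGNORECASE),
    re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),  # phone-number exchange
]

def flag_message(text: str) -> bool:
    """Return True if the message should be escalated to a human reviewer."""
    return any(p.search(text) for p in SUSPICIOUS_PATTERNS)

def triage(chat_log: list[str]) -> list[str]:
    # Only flagged messages ever reach a person, keeping review volume low
    # and limiting how much private conversation humans actually see.
    return [msg for msg in chat_log if flag_message(msg)]
```

A production system would rely on statistical classifiers tuned for a low false-positive rate rather than hand-written patterns, but the escalation structure is the same.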

Facebook says the technology has a very low false-positive rate to protect its users' privacy, but as expected there has been backlash from users. Some feel their private conversations are being violated.

But the company points to one instance where the technology helped net an alleged sexual predator: The software red-flagged a man's chat with a 13-year-old girl in South Florida.

In the conversation, authorities said he was making plans to meet with her after school. It was tagged by Facebook and shipped to police, who arrested the man.

The FBI says it's on board with this technology and hopes more online sites use it.
Memories and thoughts are private—or at least they used to be. A new company, Veritas Scientific, is developing a technology that promises to peek into a person’s brain to reveal some of their secrets. “The last realm of privacy is your mind,” says Veritas CEO Eric Elbot. “This will invade that.”

Elbot’s device belongs in a Philip K. Dick novel: It’s a futuristic motorcycle-type helmet containing metal brush sensors that will read brain activity as images of, say, bomb specs or Osama bin Laden’s face flash quickly across the inside of the visor. Scientists have shown that familiar images prompt spikes of electrical brain activity that indicate recognition. Recognition indicates memory, and memory implies knowledge. Veritas’s goal is to create an electroencephalogram (EEG) helmet with a slideshow of images that could reliably help to identify an enemy.
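The chain described above (familiar image, electrical spike, recognition) is the logic behind laboratory "guilty knowledge" tests built on event-related potentials such as the P300: average the EEG over repeated presentations of a stimulus and look for a characteristic amplitude peak. A minimal sketch of that averaging-and-threshold idea, with toy numbers and an invented threshold rather than anything from Veritas:

```python
def mean_epoch(epochs):
    """Average EEG epochs (lists of samples) time-point by time-point.

    Averaging cancels random noise but preserves any response that is
    time-locked to the stimulus, such as a recognition spike.
    """
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def shows_recognition(epochs, threshold=5.0):
    # Flag recognition if the averaged waveform ever exceeds the threshold.
    # The threshold here is arbitrary; real tests use calibrated statistics.
    return max(mean_epoch(epochs)) > threshold

# Toy data: the "familiar" trials share a stimulus-locked spike at the
# same sample position; the "novel" trials contain only noise.
familiar = [[0, 1, 9, 1, 0], [0, 2, 8, 1, 0], [1, 1, 10, 0, 0]]
novel    = [[0, 1, 2, 1, 0], [1, 0, 1, 2, 0], [0, 2, 1, 1, 1]]
```

The hard part, and the reason such devices remain contested, is not the arithmetic but whether a spike reliably means memory rather than mere familiarity or stress.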

But whose enemy? Veritas would provide the U.S. military with the device first, as a way to help soldiers distinguish friend from foe among captured people. But Elbot imagines that the brain-spying, truth-telling technology will also be useful for law enforcement, criminal trials, and corporate takeovers. Eventually, it will even make its way into cellphone apps for civilians, he says.

“Certainly it’s a potential tool for evil,” says Elbot. “If only the government has this device, it would be extremely dangerous.”

EEG experiments on mock terrorism plots have been conducted in laboratories, identifying participants and detecting criminal details. Veritas wants to put its helmets on real suspected terrorists. According to Elbot, the U.S. military used an earlier Veritas device called BrainTruth to test the thoughts of suspected Iranian agents crossing the Mexican border into the United States.

Elbot envisions a scenario in which troops in a village in Afghanistan round up all the men and put helmets on them; the soldiers would then be able to classify them as friend or foe almost instantly. Elbot hopes to have a prototype ready for the U.S. military’s war games this fall and is pursuing a military contract...
Internet users in Australia may be forced to share every aspect of their online lives with the government. If passed into law, a new security measure would require service providers to retain customers’ phone and internet data for two years.

A paper released by the Attorney-General’s Department shows that, if passed, the law would require Australians to hand over their computer passwords to authorities.

Everything from networking sites to emails would be stored, and intelligence agencies would be given increased access to social media sites like Facebook and Twitter.

The paper was written for a parliamentary joint committee which is considering ways to reform the country’s national security legislation.

Another proposal under consideration is whether to allow the country’s foreign intelligence services to monitor citizens overseas, if an officer from the Australian Security Intelligence Organization (ASIO) is not available.

Until now, ASIO has been the only agency allowed to collect data on Australian citizens.

The Federal government has defended the need for intelligence agencies to have access to internet and phone records. However, not everyone agrees with the plans laid out in the document.

“I think it's unjustified. Australians should have a right to privacy online,” Greens party senator Scott Ludlam told ABC.

If the measures pass, it will be the greatest expansion of Australia’s security laws since 2001, when the country implemented strict security measures following the 9/11 attacks...
Smoking is already banned at beaches, parks, restaurants and near buildings in Santa Monica, but Tuesday night the city council voted 4-2 to expand that prohibition, banning smoking inside the residences of all new tenants of apartments and condos – with one exception.

“It also requires existing residents to designate their units as smoking or nonsmoking, and from then on it will be prohibited to smoke in a nonsmoking unit,” said Adam Radinsky, head of the Consumer Protection Unit in Santa Monica.

The coastal city’s smoking bans date back almost two decades and Radinsky – along with other supporters of the ban – say the measures are in the interest of public health.

“People have come to testify to city council about asthma problems, people who had no prior health problems who developed health problems because their neighbors smoke,” he said.

Wayne Jackson, a smoker, says Southland residents breathe in pollution all day – pointing to a bus passing by as a “health issue.”

“I think it's a little too much control,” he said.

Jennifer Jones – who exercises in her apartment – describes smelling her neighbor's cigarette through the walls and vents.

“It’s not like a light smoke; it's really heavy,” Jones said.

Still, she has concerns with the new ordinance.

“I really don't think they should be telling people what to do if they are paying for the place they're living in,” Jones said.

Other cities have taken similar measures. Pasadena will make all apartments, condos and townhouses smoke-free by 2013 – a move Santa Monica is also looking into...
Sherry Rehman, Pakistan's ambassador to the U.S., recently made an unsurprising statement: her government has not approved any drone strikes. "It hasn't okayed any American drone strikes on its territory in exchange for Washington's apology over the Salala attacks," she said in an interview with CNN. Rehman argued that there are more effective ways to go after terrorists inside Pakistan, and that the Pakistani government officially condemns "unilateral" drone strikes on its territory.

The word "unilateral" here is important, because the Pakistani government collaborates with the U.S. on at least some drone strikes. It varies by target, but the Pakistani government is seeking greater control over target selection and intelligence gathering -- and not necessarily an end to the drone strikes. After all, the Pakistani government is fighting terrorists as well.

There is a surprisingly simple explanation for this seeming contrast between public statements by officials and what happens behind the scenes. Pakistani authorities don't mind it when U.S. drones kill off people like the TTP (Pakistani Taliban) leader Baitullah Mehsud. They do, however, mind when U.S. drone strikes happen without their consent or involvement, such as one in North Waziristan in May of this year. (There is a chance, too, that Pakistani officials protested the North Waziristan strike because that is where Jalaluddin Haqqani, a Taliban-linked insurgent commander widely believed to be supported by Pakistani intelligence, lives.)

Speaking with CNN, Rehman emphasized the problem of so-called "signature strikes," in which a drone is used to attack a group of unidentified people judged as behaving suspiciously. Like many people, she's uncomfortable with a foreign power killing her country's citizens without knowing who they are or what they're doing.

The issue of drones in Pakistan is terribly complex. Pakistanis seem, simultaneously, to love and hate them: love, because drones are responsible for killing many of the terrorists who have brutalized communities across the northwest; hate, because they kill innocent people and because it's humiliating to grant America the right to bomb your country...
A hacker group has posted online the details of 450,000 user accounts and passwords it claims to have stolen from a Yahoo server.

The passwords could pose a severe security risk to Yahoo users if they use the same password and email combination across other sites.

The information was posted by a previously unknown hacker group.

The Ars Technica technology news website reported that the group, which calls itself D33DS Company, hacked into an unidentified subdomain of Yahoo's website, from which it retrieved unencrypted account details.

A Yahoo spokesperson declined to comment.

The affected accounts appeared to belong to a voice-over-Internet-protocol, or VOIP, service called Yahoo Voices, which runs on Yahoo's instant messenger...
They are now a familiar presence in war zones, but if manufacturers have their way, the skies over civilians' heads will soon be busy with unmanned vehicles.

Drones are currently a growth industry in the aviation sector, with scores of new companies competing for a slice of the market.

And if they can clear hurdles that currently limit their deployment in friendly air space, pilotless planes of all shapes will be taking to the air on missions to watch over us.

Some of the aircraft -- from devices barely bigger than a paper plane to formidable missile-sized systems operated by five-man ground crews -- were on display this week at the UK's Farnborough Airshow.

Although the event, held on alternate years, is one of Europe's biggest marketplaces for traditional aircraft, a "drone zone" occupies a substantial slice of the exhibition space.

"There now are hundreds of companies competing for the market," said Konstantins Popkis, chief technology officer for the UAV Factory, which produces a 3.3-meter wingspan drone known as Penguin B.

"But not all of them are producing reliable systems," he added...
The Department of Homeland Security will soon be using a laser at airports that can detect everything about you from more than 160 feet away.

Gizmodo reports that a scanner able to read people at the molecular level has been invented. This laser-based scanner – which can be used from 164 feet away – could read everything from a person’s adrenaline levels, to traces of gunpowder on a person’s clothes, to illegal substances – and it can all be done without a physical search. It could also be used on multiple people at a time, eliminating random searches at airports.

The laser-based scanner is expected to be used in airports as soon as 2013, Gizmodo reports.

The scanner is called the Picosecond Programmable Laser. The device works by blasting its target with laser pulses that vibrate its molecules; the machine then reads the returning signal to determine what substances a person has been exposed to – anything from Semtex explosives to the bacon and egg sandwich they had for breakfast that morning.

The inventor of this invasive technology is Genia Photonics. Active since 2009, the company holds 30 patents on laser technology designed for scanning. In 2011, it formed a partnership with In-Q-Tel, a company chartered by the CIA and Congress to build “a bridge between the Agency and a new set of technology innovators.”

Genia Photonics wouldn’t be the only one with such technology: George Washington University developed something similar in 2008, according to Gizmodo. The Russians have also developed something akin to the Picosecond Programmable Laser. The creators of that scanner claim that “it is even able to detect traces of explosives left by fingerprints...”
Citizens can hold police accountable in the palms of their hands with "Police Tape," a smartphone application from the ACLU of New Jersey that allows people to securely and discreetly record and store interactions with police, as well as provide legal information about citizens' rights when interacting with the police...

(remember this?)
In their unending battle to deter illegal immigration, drug trafficking and terrorism, U.S. authorities already have beefed up border security with drug-sniffing dogs, aircraft and thousands more agents manning interior checkpoints.

Now, the U.S. Drug Enforcement Administration has decided it wants more, and the Justice Department agency doesn’t care whether someone has even set foot in Mexico.

Clusters of what at first appear to be surveillance cameras have begun turning up in recent months on the Southwest border, and while some of the machines are merely surveillance cameras, others are specialized recognition devices that automatically capture license-plate numbers and the geographic location of everyone who passes by, plus the date and time.

The DEA confirms that the devices have been deployed in Arizona, California, Texas and New Mexico. It has plans to introduce them farther inside the United States...
Tuesday, July 10, 2012
A baby's first step is often considered the hardest and the most significant. Humans are altricial at birth: babies are unable to walk and must "learn" to do so, often by mimicking the movements of other people.

Now engineers with the University of Arizona have developed a set of robotic legs that essentially also work by mimicking the movements of humans. Researchers Theresa Klein and M. Anthony Lewis, both of the university's Department of Electrical and Computer Engineering, describe their project in a paper published in the Journal of Neural Engineering.

The robot is able to walk -- much like a human -- by placing one foot in front of the other. While this is a step forward for the robot, is it a step forward for robotic development?

The Pentagon is considering awarding a Distinguished Warfare Medal to drone pilots who work on military bases often far removed from the battlefield.

Pentagon officials have been briefed on the medal’s “unique concept,” Charles V. Mugno, head of the Army Institute of Heraldry, told a recent meeting of the Commission of Fine Arts, according to a report in Coin World by our former colleague Bill McAllister.

Mugno said most combat decorations require “boots on the ground” in a combat zone, but he noted that “emerging technologies” such as drones and cyber combat missions are now handled by troops far removed from combat.

The Pentagon has not formally endorsed the medal, but Mugno’s institute has completed six alternate designs for commission approval.

The notion of greater recognition for drone pilots has been percolating for some time. Air Force Maj. Dave Blair, writing in the May-June issue of the Air & Space Power Journal, asked how much difference there is in terms of risk “between 10,000 feet and 10,000 miles.”

A “manned aircraft . . . that scrapes the top of a combat zone, well outside the range of any realistic threat” is deemed in “combat,” Blair writes, but a Predator firing a missile is considered “combat support.”

The proposed medal would rank between the Distinguished Flying Cross and the Soldier’s Medal for exceptional conduct outside a combat zone.
The graphic above appears in a July 4 CNN column titled "Drones decimating Taliban in Pakistan." It indicates that the Pakistan drone program overseen by Nobel Peace Prize winner Barack Obama killed 163 innocent people in 2009, 40 innocent people in 2010, 26 innocents in 2011, and zero innocent people in 2012. Is our drone-strike program really only killing bad guys now?

The casual CNN reader can be forgiven for drawing that conclusion. “Why worry about drones,” she might conclude, “if everyone dying from them is now a militant?” What the authors neglect to mention is this bit from the May 29 New York Times story that explains how the United States government -- and perhaps our allies of convenience inside Pakistan? -- defines “militant.” Per the newspaper (emphasis added), “Mr. Obama embraced a disputed method for counting civilian casualties that did little to box him in. It in effect counts all military-age males in a strike zone as combatants, according to several administration officials, unless there is explicit intelligence posthumously proving them innocent...”
Tagged photos are Facebook’s lifeblood, and it would be happy to suck them out of other apps. That’s why I suspect Facebook will resurrect Face.com’s facial recognition API, even though it just shut it down less than a month after acquiring the Israeli company.

Reopening the API will let other apps’ users easily tag their Facebook friends in photos…which can then be shared back to Facebook where they generate notifications, return visits, and engagement the social network can monetize with ads.

It’s all part of Facebook’s on-going quest to become the Omni-news feed, collecting content from everywhere for data-mining and display.

Too Many Faceless Photos

Tagged photos are extremely valuable to Facebook. They’re a strong signal for who you’re currently closest to, so they teach its EdgeRank algorithm who to show in your news feed. Tagged photos can appear on multiple people’s timelines so they generate more time-on-site from browsing. Plus, most people will return to Facebook immediately to check out a photo they’ve been tagged in. That’s why Facebook wants every face-filled photo tagged...

One last time: Please stop tagging me!!!!
Less than a month after Face.com was acquired by Facebook, the social network is shutting down the facial-recognition software company's APIs.

The software company made a splash in 2009 when it released Photo Tagger, a free third-party application for Facebook that uses facial recognition technology to automatically tag photos of people, as well as a recognition-based alert service for Facebook. In 2010, Face.com released an open API to the public that allowed third-party developers to incorporate the technology in their apps.

However, according to an e-mail reprinted by The Next Web, Face.com representatives have begun notifying developers that the APIs would be closed down within a month.

"We're excited to move forward to work with all our friends at Facebook. Part of this process includes closing down other products and services that we are no longer able to support, and this includes the Face.com developers API," reads the message.

Also, the facial recognition iPhone app Klik has been removed from the App Store. Users of the app, which allowed people to tag faces in photos using Facebook, have until July 20 to retrieve their photos.

"After this date Face.com will dispose of the data we collect in connection with your use of the KLIK app and will not be migrating data to Facebook. All your data will be deleted - no exceptions," reads an announcement on Klik's site...
In the first public accounting of its kind, cellphone carriers reported that they responded to a startling 1.3 million demands for subscriber information last year from law enforcement agencies seeking text messages, caller locations and other information in the course of investigations.

The cellphone carriers’ reports, which come in response to a Congressional inquiry, document an explosion in cellphone surveillance in the last five years, with the companies turning over records thousands of times a day in response to police emergencies, court orders, law enforcement subpoenas and other requests.

The reports also reveal a sometimes uneasy partnership with law enforcement agencies, with the carriers frequently rejecting demands that they considered legally questionable or unjustified. At least one carrier even referred some inappropriate requests to the F.B.I.

The information represents the first time data have been collected nationally on the frequency of cell surveillance by law enforcement. The volume of the requests reported by the carriers — which most likely involve several million subscribers — surprised even some officials who have closely followed the growth of cell surveillance.

“I never expected it to be this massive,” said Representative Edward J. Markey, a Massachusetts Democrat who requested the reports from nine carriers, including AT&T, Sprint, T-Mobile and Verizon, in response to an article in April in The New York Times on law enforcement’s expanded use of cell tracking. Mr. Markey, who is the co-chairman of the Bipartisan Congressional Privacy Caucus, made the carriers’ responses available to The Times.

While the cell companies did not break down the types of law enforcement agencies collecting the data, they made clear that the widened cell surveillance cut across all levels of government — from run-of-the-mill street crimes handled by local police departments to financial crimes and intelligence investigations at the state and federal levels.

AT&T alone now responds to an average of more than 700 requests a day, with about 230 of them regarded as emergencies that do not require the normal court orders and subpoenas. That is roughly triple the number it fielded in 2007, the company said. Law enforcement requests of all kinds have been rising among the other carriers as well, with annual increases of between 12 percent and 16 percent in the last five years. Sprint, which did not break down its figures in as much detail as other carriers, led all companies last year in reporting what amounted to at least 1,500 data requests on average a day...
Saturday, July 07, 2012
Today many of the pilots at Holloman never get off the ground. The base has been converted into the U.S. Air Force’s primary training center for drone operators, where pilots spend their days in sand-colored trailers near a runway from which their planes take off without them. Inside each trailer, a pilot flies his plane from a padded chair, using a joystick and throttle, as his partner, the “sensor operator,” focuses on the grainy images moving across a video screen, directing missiles to their targets with a laser.

Holloman sits on almost 60,000 acres of desert badlands, near jagged hills that are frosted with snow for several months of the year — a perfect training ground for pilots who will fly Predators and Reapers over the similarly hostile terrain of Afghanistan. When I visited the base earlier this year with a small group of reporters, we were taken into a command post where a large flat-screen television was broadcasting a video feed from a drone flying overhead. It took a few seconds to figure out exactly what we were looking at. A white S.U.V. traveling along a highway adjacent to the base came into the cross hairs in the center of the screen and was tracked as it headed south along the desert road. When the S.U.V. drove out of the picture, the drone began following another car.

“Wait, you guys practice tracking enemies by using civilian cars?” a reporter asked. One Air Force officer responded that this was only a training mission, and then the group was quickly hustled out of the room.

Though the Pentagon is increasing its fleet of drones by 30 percent and military leaders estimate that, within a year or so, the number of Air Force pilots flying unmanned planes could be higher than the number who actually leave the ground, much about how and where the U.S. government operates drones remains a secret. Even the pilots we interviewed wore black tape over their nametags. The Air Force, citing concerns for the pilots’ safety, forbids them to reveal their last names...
We've all Googled ourselves from time to time, but British Airways has crossed the creepy line for looking up its own passengers on Google Image Search.

The airline is rolling out a new program, called “Know Me,” that tries to improve passenger recognition through Google search and other methods. British Airways will create “dossiers” on passengers, and will use the profile data to offer 4,500 “personal recognition messages” by the end of the year, the London Evening Standard reports.

For instance, flight attendants may reference Google image results to greet a high-profile, first class passenger when he or she boards the plane. British Airways will also dig into its own passenger data, so if a regular customer experienced a delay on a previous flight, airline staff can offer a personal apology.

Not surprisingly, some privacy advocates are upset. “Since when has buying a flight ticket meant giving your airline permission to start hunting for information about you on the Internet?” Nick Pickles, director of Big Brother Watch, told the Standard...
On the evening of June 19, a group of researchers from the University of Texas successfully hijacked a civilian drone at the White Sands Missile Range in New Mexico during a test organized by the Department of Homeland Security.

The drone, an Adaptive Flight Hornet Mini, was hovering at around 60 feet, locked into a predetermined position guided by GPS. Then, with a device that cost around $1,000 and the help of sophisticated software that took four years to develop, the researchers sent a radio signal from a hilltop one kilometer away. In security lingo, they carried out a spoofing attack.

“We fooled the UAV (Unmanned Aerial Vehicle) into thinking that it was rising straight up,” says Todd Humphreys, assistant professor at the Radionavigation Laboratory at the University of Texas.

Deceiving the drone’s GPS receiver, they changed its perceived coordinates. To compensate, the small copter dove straight down, thinking it was returning to its programmed position. If not for a safety pilot intervening before the drone hit the ground, it would have crashed.
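The failure mode described here, where the autopilot's own position-hold loop faithfully "corrects" toward spoofed coordinates, can be illustrated with a toy proportional controller. The gains, units and step counts below are hypothetical, nothing like the Hornet Mini's real flight code:

```python
def hold_altitude(true_alt, spoof_offset, target=60.0, gain=0.5, steps=20):
    """Simulate a proportional altitude-hold loop under GPS spoofing.

    The controller only ever sees true_alt + spoof_offset. A positive
    offset makes it believe the drone is rising above the target, so it
    commands a descent even though the drone was exactly on station.
    """
    alt = true_alt
    for _ in range(steps):
        perceived = alt + spoof_offset      # spoofed measurement
        alt += gain * (target - perceived)  # "correct" toward the target
    return alt
```

With a zero offset the loop settles at the 60-foot target; with a +20 spoofed offset it settles at a true altitude of 40 feet, diving 20 feet below where it believes it is holding station. That is the essence of the White Sands demonstration: the attacker never touches the control laws, only the position estimate they trust.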

But for Humphreys, playing the part of an evil genius in a thriller movie, everything worked exactly to plan. “It was beautiful,” he tells Danger Room.

The rogue takeover exploited a vulnerability in GPS to take control of the drone. It was, by Humphreys’ accounting, the first time somebody proved a civilian drone could be hijacked. Last year, when the CIA lost a drone in Iran, there were reports indicating the Iranians might have launched a spoofing attack and tricked it into landing, but we’ll never know for sure. Also, in September 2011, North Korea reportedly forced a U.S. spy plane to land with a jamming attack.

With the planned integration of civilian drones in the American airspace, these problems might be coming to the U.S. The FAA must come up with new rules to allow for a freer use of drones in America by 2015 and, apart from worrying about possible collisions between manned and unmanned aircraft, the FAA might now have to worry about people hijacking drones with spoofing devices.

What’s worse, the experiment at White Sands shows that drone-jacking is “just the tip of the iceberg of a much bigger security issue we have in this country,” according to Logan Scott, a GPS industry consultant who has worked for defense giants like Lockheed Martin.

In other words, it’s not only about drones, it’s GPS in general that is not safe...
Pakistani officials say a U.S. drone strike has killed at least 24 suspected militants in the country's northwest.

Friday's strike took place near Miran Shah, the main town in the North Waziristan tribal region — a known hub of Taliban and al-Qaida-linked militants.

Officials told VOA that foreigners were among those killed when missiles hit a compound in the area. It was one of the deadliest reported U.S. strikes and the first such attack since Pakistan re-opened NATO supply lines into Afghanistan following a seven-month shutdown.

Pakistan closed the routes after a coalition airstrike mistakenly killed 24 Pakistani troops near the Afghan border last November.

After the cross-border attack, Pakistan's parliament reviewed the country's terms of future engagement with the United States and demanded an end to drone strikes on its territory, as well as an unconditional apology for the attack that killed Pakistani troops.

U.S. Secretary of State Hillary Clinton issued a statement on Tuesday, saying the United States “is sorry for the Pakistani military's losses.” Pakistan later reopened the supply routes.

On Friday, hundreds of Islamists in the Pakistani capital, Islamabad, and the southern port city of Karachi protested the reopening of the supply lines...
Americans are gradually becoming more concerned about the use of drones and the morality and legality of this new stealthy and lethal technology. Unmanned Aerial Vehicles, or UAVs, have changed the nature of warfare in the last ten years, and one would like to think that the American public shares Pakistan's moral outrage over their use when innocent civilians are being killed along with the so-called legitimate targets.

But it is not the deaths in Pakistan causing the outrage. Civil liberties activist groups are protesting the use of drones in seeking out crime in the U.S. homeland. As a surveillance tool, domestic drones are being used increasingly to spy on suspected drug smugglers, illegal immigrants and potential terrorists. The fear is that the next development will be arming them, but fortunately the U.S. Congress hastily passed legislation on June 15, 2012, to bar any Department of Homeland Security funding for "the purchase, operation, or maintenance of armed unmanned aerial vehicles." Armed drones are incredibly powerful and dangerous weapons, and their prospect raises troubling new questions about the potential militarization of the police and about what Americans are willing to accept as collateral damage on their own soil.

Because mistakes do happen. There are endless examples of police raiding the wrong home, shooting the wrong suspect, arresting innocent people. To give the police this dangerous new military technology is unthinkable. Civil rights activists are increasingly speaking out against the use of drones as unethical and an overreach by the Department of Homeland Security and police forces. And they argue that armed drones stretch the definition of the legitimate use of lethal force. Yet the drone attacks continue in Pakistan because, as U.S. counterterrorism adviser John Brennan stated in an April 30 speech, targeted drone strikes are "legal."

"As a matter of international law, the United States is in armed conflict with Al Qaeda, the Taliban and associated forces in response to the 9/11 attacks and we may use force consistent with our inherent right of self defense."

Brennan said that the strikes are not for vengeance but "to stop plots, prevent future attacks and save American lives."

As the number of civilian deaths increases in Pakistan, this reasoning seems not only short-sighted but morally wrong. The end does not always justify the means and the case can be made that the "war on terror" needs redefinition. "Terror" cannot be fought and stopped, but criminal acts can be addressed. As the war in Afghanistan winds down and shifts from military action to police action, perhaps the same move will happen in the FATA region on the Afghanistan-Pakistan border. If the U.S. decides that drones are inadmissible for police action within the U.S., it will be very difficult for them to make the case that they are "legal" in Pakistan.

Traditional rules of engagement stipulate that a human must decide whether a weapon is fired, and that the decision should follow the principles of military necessity, humanity, proportionality and distinction between military and civilian targets. The more sophisticated drones have the artificial intelligence to make lethal combat decisions without human intervention, and the International Committee for Robot Arms Control (ICRAC) is calling for urgent discussions to reduce the threat posed by these systems. There are increasing reports of the ease with which hackers can take control of domestic drones; the possibility of armed drones falling into the wrong hands should spur responsible international action.

Wake up America! Don't let this be our legacy!
Success in the online world seems to breed arrogance, and maybe more than a little carelessness. Today's example is a controversial new Facebook policy that was put in place abruptly and that, apparently inadvertently, corrupted contact lists on some users' smart phones. It's a disturbing event at a time when increasing numbers of Americans are worrying about loss of control, let alone privacy.

The problems began in the past few weeks, when, without notice, Facebook generated facebook.com email addresses for its users and publicly posted them with members' contact information. Critics saw it, correctly, as a ham-fisted attempt to push more users onto Facebook's own site, at the expense of other email providers. At stake are millions of dollars in advertising revenue.

That was enough of a black eye for Facebook, but the damage went further when a bug in the facebook.com program evidently caused email addresses on other platforms to be wiped from smart phones and replaced with the new address. It may have been inadvertent, but it is at least frustrating to millions of users, and it potentially caused other problems as emails and other messages were redirected or otherwise lost. That is the kind of intrusion that, repeated often enough, invites a government response.

Facebook, of course, is free to its users and, as such, users should expect that the social networking site will make changes that it believes to be in its - and, hopefully, its users' - interests. But its influence is broad and deep and it needs to do a better job of notifying users of changes. That's important, not just socially, but financially as Facebook struggles on the stock market.

Facebook is hardly alone among Internet-based or -influenced companies in making moves that users don't like. Google imposed a new privacy policy earlier this year that many people found intrusive. Several years ago, Sony caused a controversy by releasing CDs that installed cloaking software when anyone loaded them into their computers. It later issued a patch that caused its own problems. Such is the nature of the modern digital experience.

It's a new world, and everyone is still adjusting to the lure of online riches or, in the case of companies like Sony, the desire to protect their products from what they see as theft. But a better balance needs to be struck. People's smart phones and computers are theirs, not some company's to mess with as they see fit.

Online privacy is a huge issue and, for the most part, government has stayed out of it, probably wisely. But, also for the most part, privacy practices tend to benefit online companies far more than their users and, at some point, users are going to protest. When that happens, government will step in. It's predictable, but also probably avoidable, if companies like Google and Facebook will do a better job of keeping their patrons happy and informed.
