(networkworld)
Mind's Eye surveillance to watch, identify and predict human behavior from video

Carnegie Mellon researchers, taking part in DARPA's Mind's Eye program, have created visually intelligent software to recognize human activities in video and then predict what might happen next.

If a person holding a gun were to walk up to you, what might you think would happen next? Researchers from Carnegie Mellon have created intelligent software that will identify human activities in videos and then predict what might happen next. It should come as little surprise that the spookily named 'Mind's Eye' program is sponsored by DARPA's Information Innovation Office.

"A truly 'smart' camera would be able to describe with words everything it sees and reason about what it cannot see," said DARPA. Visually intelligent technology previously 'thought' in terms of nouns to describe a scene, but Carnegie Mellon researchers have made smart software that can also think in terms of action verbs. "A video shows a woman carrying a box into a building. Later, it shows her leaving the building without it. What was she doing?" asked Carnegie Mellon University.

The Mind's Eye software "will compare the video motion to actions it's already been trained to recognize (such as walk, jump, and stand) and identify patterns of actions (such as pick up and carry). The software examines these patterns to infer what the person in the video is doing. It also makes predictions about what is likely to happen next and can guess at activities that might be obscured or occur off-camera."

Carnegie Mellon University's National Robotics Engineering Center captioned an accompanying image: "The Mind's Eye program will automate video analysis - recognizing current behavior, interpolating actions that occur off-camera, and predicting future behavior..."
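The Mind's Eye software itself has not been released, but the "patterns of actions" idea quoted above can be illustrated with a toy example: given a sequence of recognized action verbs, predict the most likely next verb from transitions seen during training. The Python sketch below is a hypothetical, minimal illustration under that assumption; the action labels, training sequences and function names are all invented, and the real system reasons over raw video rather than pre-labeled verbs.

```python
# Hypothetical, minimal sketch of the prediction step described above:
# recognize a sequence of action verbs, then guess the most likely next
# verb from transitions observed in training data. Labels and sequences
# are invented; this is not the Mind's Eye software itself.
from collections import Counter, defaultdict

def train_transitions(sequences):
    """Count how often each recognized action is followed by each other action."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            counts[current][nxt] += 1
    return counts

def predict_next(counts, current_action, top_k=3):
    """Return the top_k most likely next actions after current_action."""
    following = counts.get(current_action)
    if not following:
        return []
    total = sum(following.values())
    return [(action, n / total) for action, n in following.most_common(top_k)]

if __name__ == "__main__":
    # Toy training data: sequences of action labels a recognizer might emit.
    training = [
        ["walk", "stop", "pick up", "carry", "walk"],
        ["walk", "pick up", "carry", "put down", "walk"],
        ["stand", "pick up", "carry", "walk"],
    ]
    model = train_transitions(training)
    print(predict_next(model, "pick up"))  # 'carry' is the only observed follower
    print(predict_next(model, "carry"))    # 'walk' most likely, then 'put down'
```

A real pipeline would put an activity recognizer in front of this step and use a far richer model than first-order transition counts, but the sketch shows why a stream of recognized verbs is enough to make a guess about what is likely to happen next, or about actions hidden from the camera.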
(more)
Friday, October 19, 2012
(washingtonpost)
The CIA is urging the White House to approve a significant expansion of the agency’s fleet of armed drones, a move that would extend the spy service’s decade-long transformation into a paramilitary force, U.S. officials said.

The proposal by CIA Director David H. Petraeus would bolster the agency’s ability to sustain its campaigns of lethal strikes in Pakistan and Yemen and enable it, if directed, to shift aircraft to emerging al-Qaeda threats in North Africa or other trouble spots, officials said.

If approved, the CIA could add as many as 10 drones, the officials said, to an inventory that has ranged between 30 and 35 over the past few years.

The outcome has broad implications for counterterrorism policy and whether the CIA gradually returns to being an organization focused mainly on gathering intelligence, or remains a central player in the targeted killing of terrorism suspects abroad.

In the past, officials from the Pentagon and other departments have raised concerns about the CIA’s expanding arsenal and involvement in lethal operations, but a senior Defense official said that the Pentagon had not opposed the agency’s current plan.

Officials from the White House, the CIA and the Pentagon declined to comment on the proposal. Officials who discussed it did so on the condition of anonymity, citing the sensitive nature of the subject...
(more)
(thinkprogress)
The outing of two University of Texas-Austin students to their parents as a consequence of a little-known Facebook privacy glitch has reignited longstanding concerns over the social network’s treatment of its LGBT users’ private information. According to a report in the Wall Street Journal, the two students — Bobbi Duncan and Taylor McCormick — had placed highly restrictive privacy controls on the information, but were unintentionally outed by the head of their LGBT choir when they joined its Facebook group to get access to the rehearsal schedule:

The president of the chorus, a student organization at the University of Texas campus here, had added Ms. Duncan and Mr. McCormick to the choir’s Facebook group. The president didn’t know the software would automatically tell their Facebook friends that they were now members of the chorus.

The two students were casualties of a privacy loophole on Facebook—the fact that anyone can be added to a group by a friend without their approval. As a result, the two lost control over their secrets, even though both were sophisticated users who had attempted to use Facebook’s privacy settings to shield some of their activities from their parents.
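Facebook's internal logic is of course not public, but the loophole the Journal describes can be sketched in a simplified, hypothetical model: the "joined a group" story inherits the group's visibility rather than the added member's own audience setting, and the member is never asked before being added. Everything below (the class names, settings and the add_to_group function) is invented for illustration and is not Facebook's actual code or API.

```python
# Toy model of the loophole described above: an existing member can add a
# friend to a group without that friend's approval, and the resulting
# "joined the group" story is shown according to the group's visibility,
# silently ignoring the new member's own audience setting.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    default_audience: str = "only_me"        # the member's own preference
    feed: list = field(default_factory=list)

@dataclass
class Group:
    name: str
    visibility: str = "friends_of_members"   # in this model, the group setting wins
    members: list = field(default_factory=list)

def add_to_group(group, adder, new_member, friends_of_new_member):
    """Any existing member can add a friend; the new member is never asked."""
    assert adder in group.members, "only existing members may add others"
    group.members.append(new_member)
    story = f"{new_member.name} joined {group.name}"
    # The story's audience follows the group's visibility, not
    # new_member.default_audience -- this mismatch is the loophole.
    if group.visibility == "friends_of_members":
        for friend in friends_of_new_member:
            friend.feed.append(story)

president = User("Chorus president")
student = User("Student")                    # audience preference stays "only_me"
parent = User("Parent")
chorus = Group("Chorus", members=[president])
add_to_group(chorus, president, student, friends_of_new_member=[parent])
print(parent.feed)  # ['Student joined Chorus'] despite the student's "only_me" setting
```

In this toy model, closing the gap would mean intersecting the story's audience with new_member.default_audience, or requiring the new member's confirmation before any story is published at all.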

The consequences for Ms. Duncan and Mr. McCormick were dire — the former’s father “left vitriolic messages on her phone, demanding she renounce same-sex relationships, she says, and threatening to sever family ties,” causing her to spiral into a depression (she’s thankfully improved since). The latter’s dad “didn’t talk to his son for three weeks.”

The Journal notes that Facebook is making an admirable effort to make its privacy policies clearer to LGBT users, but this isn’t the first time the company’s opaque rules have outed LGBT individuals. In 2009, Library of Congress employee Peter TerVeer was outed to his supervisor as a consequence of a Facebook policy change; he was met with a systematic pattern of discrimination that cost him his job and ultimately his home. A glitch in Facebook’s advertising programming had previously sent confidential information on users’ sexual orientation to third-party advertisers.
(more)
(24dash)
A council illegally installed surveillance cameras in a home after being told to do so by police, it has emerged.

Cambridge City Council (CCC) installed the cameras in February 2010 after a woman reported suffering domestic violence. The council has since admitted carrying out "intrusive surveillance".

But it has since emerged that it was the police who told the council to install the cameras. Cambridge police have yet to respond to 24dash regarding the installation.

Liz Bisset, CCC's director of customer and community services, told 24dash: "This was a joint operation between us and the police. It was the police who asked us to install the cameras. The cameras were set up by a number of individuals who did not realise that they were not allowed to."

She explained that the situation arose from a misunderstanding and that a system of checks and balances has since been put in place to prevent future breaches.

Nick Pickles, director of privacy and civil liberties campaign group Big Brother Watch, said: “Clearly this raises some serious questions – did the police not understand what powers the council had, or did they wrongly believe they were able to authorise the council without going through any internal approval process? It is legally possible to authorise third parties under RIPA, but if the police believed they were authorising the council then what process was followed by the police and why did the council still think they needed their own internal approval?
(more)
(boingboing)
The Snooper's Charter is Britain's pending Internet surveillance law, which requires ISPs, online services and telecoms companies to retain enormous amounts of private online transactions, and to hand them over to government and law enforcement employees without a warrant. A public campaign on the bill generated 19,000 responses, every one of which opposed the legislation: 19,000 against, 0 for. The question is, will the government (which campaigned in part on opposing similar legislation proposed by the previous Labour government) actually pay attention? Here's Glyn Moody in Computerworld:

"Got that? Out of 19,000 emails received by the Committee on the subject of the proposed Draft Communications Bill, not a single one was in favour of it, or even agreed with its premise. Has there ever been a bill so universally rejected by the public in a consultation? Clearly, it must be thrown out completely..."
(more)
(cbs)
Drones could soon operate without the help of humans.

Agence France-Presse is reporting that the Pentagon wants its drones to be more autonomous, so that they can run with little to no assistance from people.

“Before they were blind, deaf and dumb,” Mark Maybury, chief scientist for the U.S. Air Force, told AFP. “Now we’re beginning to make them to see, hear and sense.”

Ronald Arkin, a professor at the Georgia Institute of Technology, believes that drones will soon be able to kill enemies on their own.

“It is not my belief that an unmanned system will be able to be perfectly ethical in the battlefield, but I am convinced that they can perform more ethically than human soldiers are capable of,” Arkin told AFP.

Arkin added that robotic weapons should be designed as “ethical” warriors and that these types of robots could wage war in a more “humane” way.

The U.S. military says people will still control the drones from the ground even as the unmanned aircraft gain more independence.

Peter W. Singer, a senior fellow in Foreign Policy at The Brookings Institution, believes there could be legal hurdles to the use of autonomous drones.

“These responses that are driven by science, politics and battlefield necessity get you into areas where the lawyers just aren’t ready for it yet,” Singer told AFP.
(more)