Issue No. 3, May/June 2006 (vol. 21)
pp. 102-104, c3
Published by the IEEE Computer Society
ABSTRACT
In the first news story, "Intelligent Surveillance Empowers Security Analysts," Jan Krikke reports that governments and corporations worldwide are spending billions of dollars researching, developing, and deploying intelligent video surveillance systems, data mining software, biometrics systems, and Internet geolocation technology, and that growing use of these technologies has increased public scrutiny of, and resistance to, them. In the second news story, "These Computers Know How You Feel," Benjamin Alfonsi asks whether a computer can have an EQ (emotional quotient) while the debate continues over whether AI is actual intelligence or just clever programming. The first line researchers have taken in investigating computers' capacity for feeling is affective, or emotional, computing, which refers to computers sensing, reasoning about, and responding to human emotion.
Intelligent Surveillance Empowers Security Analysts
Jan Krikke
Security-related technology is a growth industry. Governments and corporations worldwide are spending billions of dollars researching, developing, and deploying intelligent video surveillance systems, data mining software, biometrics systems, and Internet geolocation technology. The technologies target terrorists, cyber criminals, identity thieves, and violators of export restrictions. Surveillance technologies are usually shrouded in secrecy (the more that's known about them, the less effective they are), but their growing use has increased public scrutiny of, and resistance to, them.
Intelligent video everywhere
Big Brother is protecting you. This appears to be the UK government's message as it prepares to deploy a massive surveillance system in June 2006. Thousands of cameras equipped with automatic number plate recognition technology will be linked to the National ANPR Data Centre. The NADC will be able to read 50 million vehicle license plates a day, according to the Anite Group (www.anite.com), which is designing the system. The system will be able to track individual vehicles in real time throughout the country. Heuristic neural network technology behind the ANPR reading engine will deliver 99.8 percent accuracy, according to Lee Hendricks, managing director of Anite's Secure Information Solutions. He attributes the high accuracy to a combination of better standards, improved cameras, and advances in the reading engine's neural network technology.
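The article doesn't describe the NADC's internals, but the real-time tracking it mentions reduces, at its core, to matching time-stamped plate reads from many cameras against vehicles of interest. The Python sketch below illustrates only that core idea; the camera IDs, plate, confidence threshold, and alerting rule are invented for illustration.

```python
# Toy sketch of matching ANPR reads against a watch list; not the NADC's design.
from collections import defaultdict
from datetime import datetime

WATCH_LIST = {"AB06 CDE"}                  # hypothetical plates of interest
sightings = defaultdict(list)              # plate -> [(camera_id, timestamp), ...]

def ingest(plate, camera_id, timestamp, confidence):
    """Handle one ANPR read; low-confidence reads are stored but not alerted on."""
    sightings[plate].append((camera_id, timestamp))
    if plate in WATCH_LIST and confidence >= 0.95:
        route = " -> ".join(cam for cam, _ in sightings[plate])
        print(f"ALERT {timestamp:%H:%M:%S}: {plate} at {camera_id} (route so far: {route})")

ingest("AB06 CDE", "M1-J14-N", datetime(2006, 6, 1, 8, 2, 11), confidence=0.99)
ingest("AB06 CDE", "M1-J15-N", datetime(2006, 6, 1, 8, 9, 47), confidence=0.98)
```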
The UK's nationwide surveillance system has generated relatively little public resistance. "The Big Brother concerns over surveillance technology are very real and very immediate," says Hendricks. "But data recorded by the system is stored temporarily on a closed and highly secure government network. Most people understand the surveillance system is not specifically deployed to deny citizens their civil liberties. It is meant to deny criminals the freedom to operate." Hendricks adds that the UK government has been "very open and honest" in its approach to growing the surveillance camera network, and that "public opinion following the July 2005 bombings in London is open to the idea of an extended surveillance network if that is the price of safety in the UK's towns and cities."
Many other companies are developing software for intelligent video surveillance, some of which can be added to existing closed-circuit TV systems. Australia-based iOmniscient (www.iomniscient.com) has developed heuristic algorithms for what it claims is the most intelligent video surveillance system in the world. The algorithms examine each pixel in the image on a statistical basis and compare it with neighboring pixels to form a picture of what's happening in a given scene. The system learns the background of a live scene in a few minutes; it triggers an alarm when an object such as a cart with luggage appears in or disappears from the scene. It also learns to recognize and ignore false alarms caused by environmental factors such as water movement and variations in lighting.
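iOmniscient's algorithms are proprietary, but the per-pixel statistical comparison it describes is broadly the same idea as classical background subtraction. The sketch below, written against OpenCV rather than anything iOmniscient ships, shows how a learned background model can flag an object that lingers in the scene; the video source, area threshold, and persistence rule are assumptions.

```python
# Minimal background-subtraction sketch: learn the scene, flag lingering objects.
import cv2

cap = cv2.VideoCapture("camera_feed.mp4")          # hypothetical recorded feed
subtractor = cv2.createBackgroundSubtractorMOG2(
    history=500, varThreshold=16, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
persistent = {}                                    # coarse location -> frames seen

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                 # per-pixel statistical comparison
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # suppress small noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 1500:              # ignore ripples, lighting flicker
            continue
        x, y, w, h = cv2.boundingRect(c)
        key = (x // 50, y // 50)                   # coarse spatial bucket
        persistent[key] = persistent.get(key, 0) + 1
        if persistent[key] == 300:                 # present roughly 10 s at 30 fps
            print(f"ALERT: stationary object near ({x}, {y})")
            # A real system would also age out buckets once the region clears.
```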
ObjectVideo (www.objectvideo.com) is one of several US companies to have expanded and commercialized technology developed through the DARPA Video Surveillance and Monitoring project (www.cs.cmu.edu/~vsam). VSAM technology is based on algorithms derived from computer vision. ObjectVideo VEW, the company's flagship product, analyzes video streams, identifying objects and classifying them as specific types (vehicles, humans, and so on).
The company has recently ported its server-based content analysis software to digital-signal processors. The chip-based software, ObjectVideo OnBoard, runs on the Texas Instruments DM64x family of DSPs. ObjectVideo OnBoard can be embedded in cameras, digital video recorders, routers, and other video-processing equipment. Original equipment manufacturers can employ its object tracking and system alerts for video management functions such as rule support for surveillance tasks. The company claims that adding intelligent video surveillance algorithms to DSPs leads to considerable cost savings. This is because the intelligence is distributed throughout security systems, reducing bandwidth requirements and lowering reliance on centralized servers.
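ObjectVideo doesn't publish its rule engine, but a "virtual tripwire" is a representative example of the kind of surveillance rule such systems support. The sketch below assumes object tracks (position plus classified type) arrive from an upstream detector; the Track structure and the rule itself are illustrative, not ObjectVideo's API.

```python
# Illustrative "virtual tripwire" rule over classified object tracks.
from dataclasses import dataclass

@dataclass
class Track:
    object_id: int
    object_type: str          # e.g. "human", "vehicle", from the upstream classifier
    prev_xy: tuple
    curr_xy: tuple

def side(p, a, b):
    """Sign of point p relative to the directed line a -> b."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def tripwire_alert(track, line_start, line_end, watched_types=("human",)):
    """Return True if a watched object crossed the tripwire this frame."""
    if track.object_type not in watched_types:
        return False
    before = side(track.prev_xy, line_start, line_end)
    after = side(track.curr_xy, line_start, line_end)
    return before * after < 0    # sign change means the track crossed the line

# Example: a person moving across a line guarding a restricted doorway.
t = Track(7, "human", prev_xy=(120, 80), curr_xy=(120, 110))
print(tripwire_alert(t, line_start=(0, 100), line_end=(300, 100)))   # True
```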
Geolocation knows where you are
Quova Inc. has developed an IP geolocation technology that determines online users' location, Internet connection, Internet routing, and network characteristics. Mike Gaynes, Quova's "Minister of Communications," says that the company collects terabytes of data on IP addresses around the globe. To determine an IP address's location, Quova uses algorithms that, according to the company's patent, resemble "artificially intelligent agents that continuously look at data and use their respective artificial intelligences to make decisions." The technology can track Web users in real time. Gaynes states that US intelligence agencies are among Quova's clients but that he's not at liberty to provide details.
Companies use Quova's technology to comply with regulations on encryption software and other products banned from export to unfriendly governments. "The Office of Foreign Assets Control requires companies to know their customers and take all possible measures to prevent export to, or trade with, banned countries or organizations," says Gaynes. "A company allowing a software download to Iran or an al-Qaeda cell could face criminal charges, fines, and sanctions."
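Quova's data collection and agent logic are proprietary, but the final lookup step in any IP geolocation service amounts to mapping an address onto the most specific network block for which attributes are known. A minimal sketch, with an invented two-entry block table standing in for Quova's terabytes of reference data:

```python
# Minimal longest-prefix lookup sketch for IP geolocation (illustrative data only).
import ipaddress

BLOCKS = {
    "203.0.112.0/20": {"country": "US", "region": "Virginia", "conn": "mixed"},
    "203.0.113.0/24": {"country": "US", "city": "Reston", "conn": "dsl"},
}

def locate(ip_str):
    """Return the attributes of the longest-prefix block containing the address."""
    ip = ipaddress.ip_address(ip_str)
    matches = [(ipaddress.ip_network(net), info) for net, info in BLOCKS.items()]
    matches = [(net, info) for net, info in matches if ip in net]
    if not matches:
        return None
    return max(matches, key=lambda m: m[0].prefixlen)[1]

print(locate("203.0.113.45"))   # the more specific /24 entry wins over the /20
```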
FINDbase (www.findbase.com), another geolocation developer, believes that current IP geolocation won't be able to keep up with the growth of IP addresses, especially when the Internet moves from IPv4 to IPv6. The company points out that IPv4 provides 4,294,967,296 unique addresses, while IPv6 offers a staggering 2^128. Its Geocate software uses an adaptive deterministic AI engine to track, learn, and predict a user's networking activities. The software is combined with the company's Geographic Network Technology, an adaptive-triangulation technique that detects time-to-distance information both to and from the end user. Like GPS, GNT uses time-latency information to determine a Web user's likely location. Geocate makes decisions based on known details such as GNT feedback, personal information, and other learned variables.
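FINDbase doesn't disclose how GNT works beyond the GPS analogy, so the sketch below shows only the generic latency-to-distance idea: convert round-trip times from landmarks at known coordinates into rough distance estimates, then pick the grid point most consistent with all of them. The landmark coordinates, RTTs, and the 100 km-per-millisecond conversion factor are all invented for illustration.

```python
# Generic latency-based positioning sketch; not FINDbase's GNT.
import math

KM_PER_MS = 100.0                      # crude RTT/2-to-distance factor (invented)
LANDMARKS = [                          # (lat, lon, measured round-trip time in ms)
    (40.71, -74.01, 12.0),             # New York
    (41.88, -87.63, 28.0),             # Chicago
    (34.05, -118.24, 62.0),            # Los Angeles
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def estimate_location():
    """Scan a coarse grid and return the point best matching all distance estimates."""
    best, best_err = None, float("inf")
    for lat10 in range(250, 500, 5):               # 25.0 to 49.5 N in 0.5-degree steps
        for lon10 in range(-1250, -650, 5):        # 125.0 W to 65.0 W
            lat, lon = lat10 / 10.0, lon10 / 10.0
            err = sum(abs(haversine_km(lat, lon, la, lo) - (rtt / 2) * KM_PER_MS)
                      for la, lo, rtt in LANDMARKS)
            if err < best_err:
                best, best_err = (lat, lon), err
    return best

print(estimate_location())             # a rough (lat, lon) estimate from the toy data
```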
Data mining for trouble
After the 9/11 tragedy, the US government initiated several surveillance programs, among them the Total Information Awareness project. TIA was to create dossiers on US citizens containing personal details ranging from credit card information and library visits to emails and overseas trips. Data mining and analysis would make the connection between seemingly unrelated activities that match the profile of a potential threat. A public outcry led to the project's demise, but the government is still working on similar research. A survey by the US General Accounting Office found that 52 government agencies operated or planned nearly 200 data mining programs in 2004, with at least 14 focusing on counterterrorism. One of the most advanced programs appears to be ADVISE (Analysis, Dissemination, Visualization, Insight, and Semantic Enhancement), a US Department of Homeland Security R&D program.
Little is publicly known about ADVISE, but a report on the DHS Workshop on Data Sciences produced by Sandia National Laboratories and Lawrence Livermore National Laboratory reveals that ADVISE is focusing on knowledge discovery in databases. It will use social-behavior analysis to obtain information from relationships among people, places, and organizations. ADVISE will be able to analyze data from multimodal sources, including voice recordings, news reports (print, TV, and Internet), images, and geospatial information. The architecture will presumably use XML and XHTML to integrate these disparate data sources.
Given ADVISE's scale and complexity, it will be a work in progress for years to come. The DHS report notes that ADVISE requires long-range R&D, including research on automatic processing of text documents, semantic-graph representation, and scalable algorithms and interfaces for information retrieval from semantic-graph data. The last item is crucial because the algorithms will be applied to graphs with billions of nodes and links.
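ADVISE's architecture is not public, but the semantic-graph representation the DHS report calls for is essentially a typed entity-relationship graph over people, places, and organizations. The sketch below uses networkx purely to illustrate that representation and one simple retrieval over it; the entities and relations are invented.

```python
# Illustrative semantic graph: entities as nodes, typed relationships as edges.
import networkx as nx

g = nx.MultiDiGraph()
g.add_node("Person:A", kind="person")
g.add_node("Org:FrontCo", kind="organization")
g.add_node("Place:PortX", kind="place")

g.add_edge("Person:A", "Org:FrontCo", relation="employed_by", source="news")
g.add_edge("Org:FrontCo", "Place:PortX", relation="ships_from", source="customs")
g.add_edge("Person:A", "Place:PortX", relation="visited", source="travel_records")

# A typical retrieval task: everything within two hops of an entity of interest.
subgraph = nx.ego_graph(g, "Person:A", radius=2)
for u, v, data in subgraph.edges(data=True):
    print(u, f"--{data['relation']}-->", v)
```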
Opponents of mass-surveillance programs question their usefulness. Jennifer Granick, executive director of the Stanford Law School Center for Internet and Society, wrote in Wired News, "There are few, if any, studies demonstrating the effectiveness of mass surveillance" (www.wired.com/news/columns/0,70035-1.html?tw=wn_story_page_next1). Geoffrey R. Stone, a University of Chicago professor of law, is concerned about governmental snooping's unintended consequences. "To get a sense of this phenomenon," he wrote in a recent blog post on huffingtonpost.com, "you need only recall what happened to people who in the 1930s joined organizations that were then perfectly lawful but that 20 years later became known as 'Communist-front' organizations. The lesson of that experience produced the stifling conformity of the '50s as individuals became afraid to do anything out of the ordinary."
Legal challenges against surveillance programs are increasing. The Center for Constitutional Rights has filed a suit over the US National Security Agency domestic-spying program and is seeking an injunction to prohibit the government from conducting communications surveillance. Corporations are also being drawn into the legal battles. The Electronic Frontier Foundation filed a class-action lawsuit against AT&T, accusing the telecom giant of violating its customers' privacy by collaborating with an NSA wiretap and data mining program. The US Congressional elections in November could be the next challenge to the government's spying programs. A Democratic majority in the US Senate could lead to more restrictions and less funding for governmental surveillance programs.
These Computers Know How You Feel
Benjamin Alfonsi
With the debate still going as to whether AI is actual intelligence or just clever programming, the question remains: Can a computer have an EQ (emotional quotient)?
"I would describe a computer as having EQ if it can demonstrate emotional sensitivity, meaning the capacity to adjust its behavior to the emotions of the user," says John D. Finan, a doctoral candidate in biomedical engineering at Duke University. His view seems to encapsulate a growing trend in AI research. After making great strides in the past several decades getting computers to "think" like humans, AI's next natural step seems to be getting computers to "feel" like humans.
The first line researchers have taken in investigating computers' capacity for feeling involves affective, or emotional, computing, which refers to computers sensing, reasoning about, and responding to human emotion.
Reflective design
"Our work in affective computing is focused less on how computers understand human emotions and more on how computers can be used to support human understanding of emotions," says Phoebe Sengers, professor of culturally embedded computing and science & technology studies at Cornell University.
Sengers is focusing on reflective design, which she says stimulates reflection on the normally unconscious nature of everyday activities and on technology's role in those activities. Her team is working on a smart-home application in which sensors detect a household's emotional climate and reflect it back to the household's members. The system uses a blackboard type of architecture, running a set of rules over daily sensor data to suggest likely emotional states. It then expresses the most salient emotions through natural language output.
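Sengers describes the architecture only at a high level, so the following is a minimal sketch of that blackboard pattern: independent rules read shared sensor summaries and post candidate emotional states, and the most salient one is phrased back to the household. The sensors, thresholds, and wording are invented.

```python
# Minimal blackboard sketch: rules post candidate emotional states with scores.
blackboard = {
    "door_slams": 4,            # counts and measurements from a day of sensor data
    "voices_raised_min": 12,
    "shared_meals": 0,
    "laughter_events": 1,
}
candidates = []                 # (emotion, salience) pairs posted by rules

def rule_tension(bb):
    if bb["door_slams"] >= 3 or bb["voices_raised_min"] > 10:
        candidates.append(("tension", bb["door_slams"] + bb["voices_raised_min"]))

def rule_disconnection(bb):
    if bb["shared_meals"] == 0 and bb["laughter_events"] < 2:
        candidates.append(("disconnection", 5))

for rule in (rule_tension, rule_disconnection):
    rule(blackboard)

if candidates:
    emotion, _ = max(candidates, key=lambda c: c[1])
    # Reflect the most salient state back to the household; a prompt, not a diagnosis.
    print(f"Today the house seemed to carry some {emotion}. Does that ring true?")
```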
According to Sengers, the smart home's computer doesn't have to get it "right," just "good enough" to be able to plausibly mirror the information back to the users.
"We are not building surveillance devices that will know your moods better than you do," she says. "We are building tools to help you better understand and communicate your own moods."
Sengers maintains that in almost all cases, a computer's ability to conceptualize emotions is extremely limited. But in reflective design, she says, this is okay because the computer simply needs to provide a user with a new perspective on his or her emotions, not a diagnosis.
Sensitive machines?
John Finan's Mood Phone analyzes acoustic information from speech to determine the speaker's mood. It then communicates this information to the listener in real time, as a light whose color and intensity are coded to the emotion indicated by the speaker's voice. His paper describing this project recently received top honors in Motorola's Motofwrd College Competition (http://promo.motorola.com/motofwrd/us/index.html). He'll be developing his idea during an internship at Motorola this summer, and says something like the Mood Phone could help people who can't process such information in the normal subconscious way, such as high-functioning autistic people or those with Asperger's syndrome.
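Finan's implementation isn't described beyond the mapping from speech acoustics to a colored light, so the sketch below substitutes two generic acoustic features (a dominant-frequency pitch estimate and frame energy) and an invented arousal-to-color rule; none of it should be read as the Mood Phone's actual signal processing.

```python
# Illustrative acoustic-features-to-light-color mapping; not Finan's implementation.
import numpy as np

def frame_features(samples, sample_rate=8000):
    """Very rough pitch (dominant frequency) and energy estimates for one frame."""
    energy = float(np.mean(samples ** 2))
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), 1.0 / sample_rate)
    pitch = float(freqs[np.argmax(spectrum[1:]) + 1])    # skip the DC bin
    return pitch, energy

def mood_color(pitch, energy):
    """Map an arousal-like cue to an RGB value: agitated -> warm, calm -> cool."""
    arousal = min(1.0, energy * 5 + pitch / 1000)        # invented scaling
    return (int(255 * arousal), 64, int(255 * (1 - arousal)))

t = np.arange(0, 0.05, 1 / 8000)                         # 50 ms frame at 8 kHz
quiet_tone = 0.1 * np.sin(2 * np.pi * 220 * t)           # a quiet, low-pitched frame
print(mood_color(*frame_features(quiet_tone)))           # leans toward the cool end
```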
At the Fraunhofer Institute for Computer Graphics, in Rostock, Germany, scientists have developed "emotion-aware" software that's embedded in a glove worn by a computer user. The glove also contains skin temperature and conductivity sensors and a receiver for heart rate data. According to principal investigator Christian Peter, a computer receiving the data wirelessly can then estimate the user's emotional state with an accuracy of up to 75 percent.
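The article gives only the glove's inputs and a headline accuracy figure, so the sketch below simply shows how such physiological readings could drive a classifier; the scikit-learn model, training samples, and labels are all invented stand-ins for the Fraunhofer software.

```python
# Toy classifier over glove-style physiological features (invented training data).
from sklearn.neighbors import KNeighborsClassifier

# Features: (skin temperature in degrees C, skin conductance in microsiemens, heart rate in bpm)
X = [
    (33.5, 2.1, 68), (33.8, 2.4, 72),    # relaxed
    (32.1, 6.8, 96), (31.8, 7.5, 104),   # frustrated
    (33.0, 5.0, 88), (33.2, 4.6, 85),    # excited
]
y = ["relaxed", "relaxed", "frustrated", "frustrated", "excited", "excited"]

model = KNeighborsClassifier(n_neighbors=3).fit(X, y)
reading = [(31.9, 7.1, 101)]             # a new wireless reading from the glove
print(model.predict(reading)[0])         # -> "frustrated" on this toy data
```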
Peter believes that the software could one day benefit anyone from first-time computer users to gamers. For example, computer users trying to deal with a new application often feel helpless and become irritated. When the system recognizes the user's anxiety or frustration, it could introduce a help application, such as a quick tour, tutorial, or FAQ. This is unlike the usual approach, where suggestions often (frustratingly) pop up when users don't need them. Plans are underway to expand the emotion-aware software to include Web cams to track subtle changes in facial expressions and microphones to detect slight inflections in voice. But how would the software get around human trickery, such as a poker face or voice impersonation?
"At the current state of technology, not one device [alone] is good enough to do the job satisfactorily in real life," says Peter. "But laboratory prototypes show quite promising results for each of them, and combinations of different devices will help to increase the overall accuracy."


To determine the wearer's emotional state, this glove, developed at the Fraunhofer Institute for Computer Graphics, incorporates skin temperature and conductivity sensors, a receiver for heart rate data, and special software.

Emotional hardware
David Skillicorn, professor of computing and head of the Smart Information Management Lab at Queen's University in Kingston, Ontario, believes that computers can determine a person's emotional state through his or her written words.
"Emotional state is actually extremely visible," maintains Skillicorn. "Psychologists have discovered that people do leak their emotional states quite strongly, and the reason that this has gone unnoticed for the most part is that we lack the 'hardware' to notice the channels by which this information leaks." Enter email.
Skillicorn says the technique is relatively simple: an algorithm counts the frequencies of certain kinds of words and uses those frequencies to rank texts from most to least truthful. When scientists applied the model to a set of Enron emails, for example, they detected not only flat-out deception (fairly obvious lying) but also forms of more socially acceptable deception (such as spin).
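Skillicorn's specific word categories and weights aren't given in the article, so this sketch uses illustrative lists drawn from the general deception-cue literature (deceptive text tends to contain fewer first-person and exclusive words and more negative-emotion and motion words); only the counting-and-ranking structure reflects the technique he describes.

```python
# Toy word-category counting and ranking; categories and weights are illustrative.
import re

CATEGORIES = {
    "first_person": {"i", "me", "my", "mine"},         # fewer in deceptive text
    "exclusive":    {"but", "except", "without", "although"},
    "negative":     {"hate", "worthless", "angry", "enemy", "loss"},
    "motion":       {"go", "went", "move", "drive", "leave"},
}
WEIGHTS = {"first_person": -1.0, "exclusive": -1.0, "negative": 1.0, "motion": 1.0}

def deception_score(text):
    words = re.findall(r"[a-z']+", text.lower())
    n = max(len(words), 1)
    score = 0.0
    for cat, vocab in CATEGORIES.items():
        rate = sum(w in vocab for w in words) / n
        score += WEIGHTS[cat] * rate
    return score

emails = {
    "msg1": "I went over my numbers but I think we are fine except for Q3.",
    "msg2": "Everyone should go forward, move the loss off the books, leave no record.",
}
# Rank from most to least "deceptive-looking" under this toy model.
for name in sorted(emails, key=lambda k: deception_score(emails[k]), reverse=True):
    print(name, round(deception_score(emails[name]), 3))
```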
So does counting the frequency of certain kinds of words and then ranking the text count as AI? "What [qualifies as] AI is often in the eye of the beholder," maintains Skillicorn. "Some people would consider replicating things that humans do to be a part of AI, no matter how it's done."
Future applications for the algorithm range from the expected (corporate security, forensics, and politics) to the unexpected—namely, counterterrorism. "We're interested in being able to tell how much of Osama bin Laden's pronouncements is stuff he believes and how much is designed to play to Arab or American audiences," says Skillicorn. "We'd also like to be able to look at the postings on jihadist Web sites [to try to distinguish between] general adolescent angst and genuine attack planning."
Skillicorn admits that validation has presented a problem. "It's hard to get reliable information about when public texts are actually deceptive and to what extent," he says. Still, he predicts these techniques will make life difficult for politicians as early as the next presidential election.