News Archive

July 2012

Gig.U Project Aims for an Ultrafast US Internet

June 2012

Bringing Location and Navigation Technology Indoors

May 2012

Plans Under Way for Roaming between Cellular and Wi-Fi Networks

Encryption System Flaw Threatens Internet Security

April 2012

For Business Intelligence, the Trend Is Location, Location, Location

Corpus Linguistics Keep Up-to-Date with Language

March 2012

Are Tomorrow's Firewalls Finally Here Today?

February 2012

Spatial Humanities Brings History to Life

December 2011

Could Hackers Take Your Car for a Ride?

November 2011

What to Do about Supercookies?

October 2011

Lights, Camera, Virtual Moviemaking

September 2011

Revolutionizing Wall Street with News Analytics

August 2011

Growing Network-Encryption Use Puts Systems at Risk

New Project Could Promote Semantic Web

July 2011

FBI Employs New Botnet Eradication Tactics

Google and Twitter "Like" Social Indexing

June 2011

Computing Commodities Market in the Cloud

May 2011

Intel Chips Step up to 3D

Apple Programming Error Raises Privacy Concerns

Thunderbolt Promises Lightning Speed

April 2011

Industrial Control Systems Face More Security Challenges

Microsoft Effort Takes Down Massive Botnet

March 2011

IP Addresses Getting Security Upgrade

February 2011

Studios Agree on DRM Infrastructure

January 2011

New Web Protocol Promises to Reduce Browser Latency

To Be or NAT to Be?

December 2010

Intel Gets inside the Helmet

Tuning Body-to-Body Networks with RF Modeling

November 2010

New Wi-Fi Spec Simplifies Connectivity

Expanded Top-Level Domains Could Spur Internet Real Estate Boom

October 2010

New Weapon in War on Botnets

September 2010

Content-Centered Internet Architecture Gets a Boost

Gesturing Going Mainstream

August 2010

Is Context-Aware Computing Ready for the Limelight?

Flexible Routing in the Cloud

Signal Congestion Rejuvenates Interest in Cell Paging-Channel Protocol

July 2010

New Protocol Improves Interaction among Networked Devices and Applications

Security for Domain Name System Takes a Big Step Forward

The ROADM to Smarter Optical Networking

Distributed Cache Goes Mainstream

June 2010

New Application Protects Mobile-Phone Passwords

WiGig Alliance Reveals Ultrafast Wireless Specification

Cognitive Radio Adds Intelligence to Wireless Technology

May 2010

New Product Uses Light Connections in Blade Server

April 2010

Browser Fingerprints Threaten Privacy

New Animation Technique Uses Motion Frequencies to Shake Trees

March 2010

Researchers Take Promising Approach to Chemical Computing

Screen-Capture Programming: What You See is What You Script

Research Project Sends Data Wirelessly at High Speeds via Light

February 2010

Faster Testing for Complex Software Systems

IEEE 802.1Qbg/h to Simplify Data Center Virtual LAN Management

Distributed Data-Analysis Approach Gains Popularity

Twitter Tweak Helps Haiti Relief Effort

January 2010

2010 Rings in Some Y2K-like Problems

Infrastructure Sensors Improve Home Monitoring

Internet Search Takes a Semantic Turn

December 2009

Phase-Change Memory Technology Moves toward Mass Production

IBM Crowdsources Translation Software

Digital Ants Promise New Security Paradigm

November 2009

Program Uses Mobile Technology to Help with Crises

More Cores Keep Power Down

White-Space Networking Goes Live

Mobile Web 2.0 Experiences Growing Pains

October 2009

More Spectrum Sought for Body Sensor Networks

Optics for Universal I/O and Speed

High-Performance Computing Adds Virtualization to the Mix

ICANN Accountability Goes Multinational

RFID Tags Chat Their Way to Energy Efficiency

September 2009

Delay-Tolerant Networks in Your Pocket

Flash Cookies Stir Privacy Concerns

Addressing the Challenge of Cloud-Computing Interoperability

Ephemeralizing the Web

August 2009

Bluetooth Speeds Up

Grids Get Closer

DCN Gets Ready for Production

The Sims Meet Science

Sexy Space Threat Comes to Mobile Phones

July 2009

WiGig Alliance Makes Push for HD Specification

New Dilemmas, Same Principles:
Changing Landscape Requires IT Ethics to Go Mainstream

Synthetic DNS Stirs Controversy:
Why Breaking Is a Good Thing

New Approach Fights Microchip Piracy

Technique Makes Strong Encryption Easier to Use

New Adobe Flash Streams Internet Directly to TVs

June 2009

Aging Satellites Spark GPS Concerns

The Changing World of Outsourcing

North American CS Enrollment Rises for First Time in Seven Years

Materials Breakthrough Could Eliminate Bootups

April 2009

Trusted Computing Shapes Self-Encrypting Drives

March 2009

Google, Publishers to Try New Advertising Methods

Siftables Offer New Interaction Model for Serious Games

Hulu Boxed In by Media Conglomerates

February 2009

Chips on Verge of Reaching 32 nm Nodes

Hathaway to Lead Cybersecurity Review

A Match Made in Heaven: Gaming Enters the Cloud

January 2009

Government Support Could Spell Big Year for Open Source

25 Reasons For Better Programming

Web Guide Turns Playstation 3 Consoles into Supercomputing Cluster

Flagbearers for Technology: Contemporary Techniques Showcase US Artifacts and European Treasures

December 2008

.Tel TLD Debuts As New Way to Network

Science Exchange

November 2008

The Future is Reconfigurable

For Business Intelligence, the Trend Is Location, Location, Location

Sixto Ortiz Jr.

Businesses are always looking for better ways to analyze the large amounts of data they receive from customers, suppliers, market analysts, and other sources.

Modern business intelligence approaches began in the late 1980s, but companies have used some type of BI since business-support systems were introduced in the 1960s.

Now, BI is beginning to include a critical new element: location. Location intelligence is the ability to organize and understand complex events and trends by studying the geographic relationships in information.

In essence, LI adds geographic, demographic, economic, and similar types of data to the financial, marketing, and other information already used in BI.

This reflects the growing development and use of affordable technology that can capture and recognize geospatial data.

However, LI still faces several key challenges.

Inside LI

Early adopters of computerized LI in the late 1980s included telecommunications, oil, and gas companies, as well as government agencies, utilities, and others for which location-based data was critical. However, the cost and complexity of adding geospatial data and functionality to existing BI applications limited its use.

Increased Availability

Technology advances have recently enabled the growing availability and affordability of geographic services such as Google Maps and Microsoft Virtual Earth.

Not long ago, only geographic information system (GIS) experts used such services. Now, though, the development of turnkey applications that automatically process tabular data into information that can be plotted on maps, coupled with sophisticated analytics software that can process such data, has brought LI within reach of many organizations.

The Technology

At its most basic, LI entails the assignment of location data and other relevant information to a database, explained Mark Smith, CEO and chief research officer at advisory services firm Ventana Research.

Accomplishing this manually could require considerable programming.

To automate the analysis of multidimensional geographic and nongeographic spatial information within a single database, LI uses technologies such as spatial online analytical processing.
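The core idea behind a spatial roll-up can be illustrated with a short sketch. The records, grid size, and cell scheme below are hypothetical stand-ins for what a spatial OLAP engine would manage internally; the point is that snapping coordinates to a grid cell turns location into just another dimension to aggregate along.

```python
from collections import defaultdict

# Hypothetical sales records for illustration: (latitude, longitude, product, revenue).
sales = [
    (40.71, -73.99, "coffee", 120.0),
    (40.75, -73.95, "coffee", 80.0),
    (34.05, -118.24, "coffee", 200.0),
    (34.06, -118.20, "tea", 50.0),
]

def grid_cell(lat, lon, size=1.0):
    """Snap a coordinate to a coarse grid cell, treating space as a dimension."""
    return (int(lat // size), int(lon // size))

def spatial_rollup(records, size=1.0):
    """Aggregate revenue along a (cell, product) dimension pair -- the kind
    of roll-up a spatial OLAP engine automates at much larger scale."""
    totals = defaultdict(float)
    for lat, lon, product, revenue in records:
        totals[(grid_cell(lat, lon, size), product)] += revenue
    return dict(totals)

print(spatial_rollup(sales))
```

With the sample data, the two New York-area coffee sales collapse into one cell total while the Los Angeles-area records form a second cell, split by product.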

In evaluating the data, LI uses standard BI tools such as predictive analytics and business-process intelligence.

After analysis, LI plots the results on a map, which requires geocoding, the extraction of geographic coordinates from textual address data.

Applying geocoding to large amounts of data requires customized programming that is beyond spatial databases' basic functionality. This programming can be difficult, time-consuming, and costly, explained Brandon Purcell, vice president of engineering at LI vendor SpatialKey.

However, LI applications include software that automates the process of translating textual address information into geographical coordinates that can then be plotted on a map.
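At its simplest, that translation step is a lookup from address text to coordinates. The sketch below uses a toy in-memory gazetteer as a stand-in for a real geocoding service (the addresses and coordinates are invented); production LI software layers fuzzy matching, address normalization, and street-segment interpolation on top of this basic idea.

```python
# Toy gazetteer standing in for a real geocoding service.
# Addresses and coordinates here are illustrative, not real data.
GAZETTEER = {
    "100 main st, springfield": (39.80, -89.64),
    "200 oak ave, springfield": (39.78, -89.65),
}

def geocode(address):
    """Translate a textual address into (lat, lon); return None on a miss."""
    return GAZETTEER.get(address.strip().lower())

print(geocode("100 Main St, Springfield"))  # (39.8, -89.64)
```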

Applications

Some companies are now using LI to locate new businesses such as gas stations and convenience stores, noted Jim Harder, a principal with the Visual Data Group, a BI vendor.

The technology lets companies analyze the local demographic, economic, and other relevant business-related information to determine whether placing a store at a particular site makes sense.

Businesses can also use LI for purposes such as market-penetration studies.

Cellular-phone companies use the technology for network planning and locating cell towers. Government agencies utilize it for urban planning. Insurance companies employ the approach for risk management. And real-estate agencies use it for site reports.

Even law-enforcement agencies are working with LI to analyze crime-related data and the geographic distribution of incidents to identify "hot spots" where they should concentrate their resources, said SpatialKey's Purcell.
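The hot-spot idea reduces to a density count over grid cells. This is a minimal sketch with made-up incident coordinates, not any agency's actual method: bin each incident into a cell and report the cells with the most incidents.

```python
from collections import Counter

def hot_spots(incidents, size=0.01, top=1):
    """Bin incident coordinates into grid cells (~1 km at this size)
    and return the densest cells with their incident counts."""
    counts = Counter((int(lat // size), int(lon // size)) for lat, lon in incidents)
    return counts.most_common(top)

# Hypothetical incident locations: three clustered points and one outlier.
incidents = [
    (41.8781, -87.6298),
    (41.8783, -87.6291),
    (41.8780, -87.6299),
    (41.9000, -87.7000),
]
print(hot_spots(incidents))
```

The three clustered points land in the same cell, so the top result carries a count of three, flagging that cell as the place to concentrate resources.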

The approach could also be used with augmented reality to provide, for example, the real-time mapping of complex location information over a display of a user's surroundings.

Challenges

LI faces several obstacles to success.

For example, noted Harder, keeping mapping information current with the addition of new streets, housing, businesses, traffic patterns, and other important factors can be challenging.

In addition, many companies may not understand LI's business value or how to best use the technology, said Purcell.

Businesses also sometimes have trouble integrating the different software they need to load, report, map, model, and process LI data along with their traditional information, added Ned Harding, chief technology officer at LI vendor Alteryx.

Down the Road

The continued integration of spatial information with other types of data could make LI even more useful, said Harding.

As users want to process increasing amounts of location-based data, he added, they will move to grid-computing approaches using large clusters, which will require a bigger IT infrastructure.

By eliminating the need for users to house huge analytical applications on their systems, cloud-based approaches will make it easier to collect, distribute, and study LI data on a large scale, said SpatialKey's Purcell. This will make the technology available even to small and medium-sized businesses, he added.

Also, said the Visual Data Group's Harder, LI tools will grow in sophistication and will feature expanded data storage and management coupled with fast and easy information access.

A key question is whether businesses will embrace LI as yet another way to study data among an already dizzying array of business-analysis tools.