Rock Stars of Big Data: See Who’s Attending

Besides the opportunity to come away with actionable insights from industry-leading speakers, one of the advantages of attending a face-to-face event like Rock Stars of Big Data is the chance to network with colleagues.

Many long-term business associations have their roots at conferences. Perhaps the person sitting next to you while waiting for a presentation to start has a similar big-data problem they’re trying to solve. Or the person in front of you in the lunch line works in the same industry. Or the group you’re chatting with at the networking reception shares common colleagues or goals.

Register for Rock Stars of Big Data and you’ll be in good company. Attendees come from technology stalwarts such as Amazon, Apple, Broadcom, Cisco, Hewlett-Packard, IBM, Intel, and Microsoft, as well as healthcare companies, local and state governments, and midsized enterprises.

And they represent an array of perspectives, from data scientists and software engineers to analysts, program managers, and executives. Below is a sample of the companies that will be attending and the roles they represent:

  • Amazon.com, Software Engineer
  • Apple, Program Manager and Data Scientist
  • Acxiom, Principal Product Executive
  • Broadcom Corp., Senior Technical Director
  • Calsoft Labs Inc., Senior Vice President
  • Cedars-Sinai Health System, CTO
  • Cisco, Chief Security Officer
  • City of Mountain View, IT Analyst
  • Dynosense, Vice President
  • EMC, Software QA Manager
  • Ericsson, Director, Research and Innovations
  • ETAP, Principal Software Engineer
  • GE Global Research, Researcher
  • Georgia Institute of Technology, Professor
  • Hewlett-Packard, Research Scientist
  • IBM Research, Research Scientist
  • IDA, Assistant Director
  • Intel, Program Manager
  • Intuit, Software Engineer
  • Kaiser Permanente, Research Scientist and Programmer
  • LG Electronics, Research Fellow
  • Mentor Graphics, Senior Software Engineer
  • Microsoft Corp., Senior Software Engineer
  • Oracle, Vice President
  • Pressure Profile Systems, Inc., CEO
  • Provide Commerce, Senior Director of IT
  • Samsung, Vice President
  • Sharp Labs, Senior Manager
  • Siemens, Senior Director, Engineering
  • TERIS, Executive Vice President, and more.

Time is running out. Register now for Rock Stars of Big Data, 29 October at the Computer History Museum in Mountain View, California, to make sure you don't miss out on the must-attend big data event of the year.

GE Software’s William Ruh to Speak on Industrial Internet

The industrial world is undergoing a seismic shift in productivity and efficiency as machines become increasingly intelligent. The resulting Industrial Internet is expected to have the same transformative effect as the consumer Internet, creating intelligence through innovative sensor technology, machine-to-machine connectivity, new automation approaches, and software that generates real-time insight.

William Ruh, Vice President of GE Software, will discuss this transformation, and how the Industrial Internet is creating business opportunities, during his presentation at Rock Stars of Big Data, set for 29 October at the Computer History Museum in Silicon Valley.

GE, a 135-year-old industrial stalwart, has been making a major push into the Industrial Internet. In June, GE announced a big data and analytics platform robust enough to manage, in the cloud, the data produced by large-scale industrial machines. The platform, supported by Hadoop-based historian data management software, is expected to help airlines, railroads, hospitals, and other industries increase productivity and reduce waste and downtime. The company also announced expanded partnerships with Accenture, Pivotal, and Amazon Web Services.

Advanced computing, analytics, low-cost sensing, and new levels of connectivity are combining to create a deeper meshing of the digital world with the world of machines, write Peter C. Evans, GE Director of Global Strategy and Analytics, and Marco Annunziata, Chief Economist and Executive Director of Global Market Insight at GE. That convergence has the potential to transform not only global industry, but also daily life and the way many of us do our jobs, Evans and Annunziata write in their vision paper, “Industrial Internet: Pushing the Boundaries of Minds and Machines.”

The volume of data in industry is expected to grow twice as fast as in other sectors over the next decade and, according to a report from The Wikibon Project, to account for $514 billion in spending by 2020. However, industry has been slower than other sectors to embrace big data because of the enormous computing power required to process such huge amounts of raw data. Still, if industry can leverage big data, the Wikibon researchers estimate it could realize $1.3 trillion in savings from improvements in efficiency and productivity, along with other associated benefits.

In this environment, said Ruh, companies need to make strategic decisions about capturing this increasingly valuable part of the value chain. For example, companies in the automotive industry are capturing the software value inherent in vehicles’ transformation into entertainment, navigation, and social systems. Companies that want to win and survive in today’s economy must ask whether it makes more sense to focus on existing products and business models or to create new integrated hardware/software solutions and services, said Ruh, who leads software services and solutions portfolio strategy, development, and operations at GE Software.

Ruh is among nearly a dozen technology leaders who will share their experiences at Rock Stars of Big Data. To register, visit http://www.computer.org/Big-Data.

Healthcare Industry Poised to Take Advantage of Big Data

With much progress made on digitizing patient records and pressure mounting to lower costs and improve efficiency and patient care, the healthcare industry this year is poised to take advantage of big data in a big way.

Leveraging big data could result in $300 billion in increased annual value for the industry, according to a 2011 McKinsey report. However, the healthcare industry still lags behind other industries when it comes to using big data to make informed decisions and improve efficiencies.

Healthcare providers face significant obstacles in implementing analytics, business intelligence tools, and data warehousing because of the diversity of health data, according to a 2013 paper by the Institute for Health Technology Transformation (IHTT) designed to help executives from hospitals, health systems, and other provider organizations understand how to use big data to reduce costs. The data is fragmented and nonstandardized, coming in a range of formats and generated by a multitude of stakeholders with differing interests, note the authors of “Transforming Health Care through Big Data.” Patient privacy is also paramount.

According to McKinsey, the healthcare big data revolution is still in its early days, and the potential for value creation has yet to be claimed. An evaluation of the marketplace found that more than 200 businesses aimed at leveraging healthcare information have been established since 2010. The McKinsey analysts also note that “stakeholders that are committed to innovation will likely be the first to reap the rewards.”

One of those innovators, Robert Mangel, Director of Service Quality Research for Kaiser Permanente, will be speaking at the IEEE Computer Society’s Rock Stars of Big Data event. Mangel will share his experience in using patient experience analytics to drive organizational decision-making. In his session, “Big Data Analytics: Linking Insights, Actions, and Outcomes to Drive Performance Improvement,” Mangel will explore these opportunities and challenges through specific examples in healthcare delivery, along with a select set of short use cases that illustrate the variety of emerging big data analytic efforts.

California-based Kaiser Permanente, with more than 9 million members, is a recognized pioneer in electronic health records. Its HealthConnect health information system, completed in March 2010 at an estimated cost of $6 billion, securely connects 9 million people to their healthcare teams, their personal health information, and the latest medical knowledge. Kaiser is estimated to have between 26.5 and 44 petabytes of patient data from its electronic health records alone, the equivalent of 4,400 Libraries of Congress, according to the IHTT.

At Kaiser, Mangel leads a national team that seeks to leverage patient experience analytics to inform strategic and operational decisions. A key focus of his work has been integrating and analyzing large, disparate data sets to create actionable insights and drive performance improvement. In healthcare, feedback loops between actions and outcomes are longer and less direct than in many other domains, requiring different “Big Analytic” approaches to clarify and leverage these linkages, said Mangel.

To register for Rock Stars of Big Data, set for 29 October at the Computer History Museum in Silicon Valley, visit http://www.computer.org/Big-Data. Team discounts are available.

Big Data Must Get Simpler to Scale

One of the most compelling aspects of the current big data gold rush is the race to develop tools to help data scientists process data faster and more easily. By 2014, according to Gartner, 30 percent of analytic applications will use predictive capabilities. Forecasting, targeting, fraud detection, customer churn, and price elasticity are some of the more useful applications.

It can currently take weeks or months to process large data sets. The dream is that new tools will shrink that processing time to days.

“Data science is too complex today. Things must get simpler or it won’t scale,” said Roger Barga, group program manager for Microsoft’s Azure Data Platform. “When I talk to data scientists, I ask them how long it takes to deal with a data set. It can take days or weeks. This will be a game changer for who can do it and how quickly they can do it.”

Barga advised those at the recent Microsoft Research Faculty Summit, which focused on big data, that if they’re building a tool to support data scientists, understanding the workflow is key. The first step in the process is defining the goal, followed by collecting and managing data, building the model, evaluating and critiquing the model, presenting the results, then deploying the model.
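
For readers who want a concrete picture of those stages, the short Python sketch below walks through them end to end on synthetic data. It is purely illustrative: the talk did not prescribe any particular tools, so the use of scikit-learn, the churn-prediction framing, and the output file name are assumptions made for the example.

# Illustrative sketch of the workflow described above, using scikit-learn and
# synthetic data (both assumptions; no specific tools were named in the talk).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
import joblib

# 1. Define the goal: here, a hypothetical customer-churn prediction task.
# 2. Collect and manage data: a synthetic stand-in for a curated data set.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# 3. Build the model.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# 4. Evaluate and critique the model on held-out data.
print(classification_report(y_test, model.predict(X_test)))

# 5. Present the results (the report above would feed a dashboard or writeup).
# 6. Deploy the model, for example by serializing it for a scoring service.
joblib.dump(model, "churn_model.joblib")

Even at this toy scale, each step leaves behind an artifact (a data set, a model, an evaluation report, a serialized file) that the kind of end-to-end tooling described below would need to track.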

“Enterprises say they need an end-to-end solution with support for collaboration, lineage tracking, archive for predictive models, and support for search and discovery,” said Barga. “There’s an incredible breadth of applications. If you can predict it, you can own it.”

There is also an accompanying explosion of data science programs. According to Barga, there are now more than 100 of them, compared with fewer than a half-dozen several years ago. Typical courses include Introduction to Data Science, Hadoop, and Building Predictive Models.

“This is where the next generation of data scientists will come from,” he said, adding that “the barrier to entry here is low if you have access to the right tools.”

Another thing that’s changed is that five years ago, typically only companies with 500 or more employees had a data scientist on staff. Now, he said, “the third or fourth hire is a data scientist.”

Companies are also trying to capitalize on the interest in big data by offering data as an additional product line. For example, he said, by adding sensors to its jet engines, Rolls-Royce can offer airlines and others information on fuel usage and other flight characteristics.

2013: Year of the Big Data Startup?

Although 2012 was billed as the Year of Big Data, 2013 seems to be turning into the Year of the Big Data Startup. A flurry of new companies are popping up to help organizations manage, maintain, and leverage the mounds of data that are accumulating. The solutions they offer are varied, ranging from analytics and visualization tools and platforms based on open source Apache Hadoop to big data search engines and software for specific industries.

Hadoop platform developer Hortonworks’ recent infusion of $70 million in new funding is a sign of investors’ strong interest in the space. Other startups with recent funding wins include predictive analytics firm Gainsight ($9 million first round), analytics provider siSense ($10 million second round), solutions provider ThinkBig ($13 million), analytics tools provider DataGravity ($30 million), and Hadoop distributor MapR ($59 million over three rounds).

So many big data startups are emerging that it’s difficult to keep track. Luckily, there’s a platform that lists and rates them. Big Data Startups, a media sponsor of Rock Stars of Big Data, includes dozens of the newest companies creating big data products and services, as well as industry news.

Venture funds specifically for big data startups are also emerging. Accel Partners recently established a second $100 million big data fund, which will support entrepreneurs who use the technology platforms built in the first wave of big data startups to create Data Driven Software (DDS) designed to help the workforce “make smarter decisions through deeper insights.” According to Accel, DDS—billed as “the last mile of big data”—will automatically harness data from a variety of sources, analyze it, and present valuable real-time insights to the business end user.

And Data Collective, another fund dedicated to big data, is currently managing 46 creative teams. Said the founders of Data Collective when they launched last summer: “We believe that big data, like the PC revolution of the 80s, the emergence of the Internet in the 90s, and Web 2.0 in the 2000s, represents a several hundred billion dollar wealth creation opportunity. The companies that solve these problems, from infrastructure providers at the bottom to corporate customers at the top of the stack, will enjoy huge competitive advantage over the next 10 years.”
