Interoperability in the Internet of Things Interviews
Guest Editors’ Introduction • Giancarlo Fortino, Maria Ganzha, Carlos Palau, and Marcin Paprzycki • December 2016
We asked some experts in the field to share their responses to a few fundamental questions about IoT interoperability. Matus Maar, Oscar Lazaro, Roberto Minerva, Sebastián Pantoja, Laurent Belmon, and Yossi Dan provide varying perspectives on the future of IoT interoperability in industry.
Matus Maar is a co-founder and partner at Talis Capital, as well as a co-founder of two venture capital-backed technology companies: Pirate Studios and Threads. He has led or closed more than 40 private-equity or venture-capital deals. Maar sits on the advisory board for INTER-IoT, an EU-funded IoT initiative. He has a degree from the University of Manchester and is passionate about world-changing technologies and futurism.
Oscar Lazaro received a PhD in electrical engineering from the University of Strathclyde, U.K., and is now a visiting professor of wireless and mobile communications at the same university. He is the managing director of Innovalia Association, a research lab founded by the Innovalia Alliance that develops technology in E2E cybersecurity, quality assurance for high-performance networked IoT/CPS critical systems, mobile 3D visualization, and big data analytics. Lazaro has more than 20 years of experience in the ICT and manufacturing fields and has served on the Future Internet Steering and Advisory Boards.
Roberto Minerva is a manager at TIMLab, a Telecom Italia research center, where he works in SDN/NFV, 5G, big data, IoT architecture, and ICT technologies for leveraging new business models. He holds a PhD in computer science and telecommunications from Telecom SudParis, France, and an MS in computer science from the University of Bari, Italy. Minerva is the chair of the IEEE IoT Initiative, an effort to aggregate a large technical community of experts and foster research and innovation in several IoT fields. He has authored several papers published in international conferences, books, and magazines.
Sebastián Pantoja is the research and development director at Televes, a global company focused on distributing telecommunication services. He has an M.S. degree in electrical engineering from the Polytechnic University of Valencia, Spain, and has had a long career as a design engineer, purchasing engineer, new product introduction manager, project coordinator, and technology director. Pantoja also works as an expert for the European Commission for IST and PIDEA programs.
Laurent Belmon manages European projects at Thales Research & Technology. He has a PhD in electronics from the University Paris-XI, France, where he focused on scientific data compression for deep-space mission telemetry issues. Belmon has worked as a research engineer in the Paris Observatory’s Space Research department and as a researcher, coordinator, and manager at Thales Communications, where his research spanned from electronic warfare to civil crisis management.
Yossi Dan is the cofounder and chief innovation officer at Challengy, a management consulting firm focused on global innovation. He helps corporations and cities in Europe and Israel meet their digital transformation challenges. Dan co-manages the accelerator of the Founder Institute in Tel Aviv, leads the Founders Café community of startup founders, and participates as a mentor or speaker in hackathons and innovation bootcamps. Contact him at firstname.lastname@example.org.
Is full-blown interoperability a necessary technical enabler for future IoT ecosystems (systems of IoT systems), or will the “world of IoT” be able to expand and progress without it in the foreseeable future?
It's a technical question, but to answer it commercially: I think it is becoming clear that some key aspects of the IoT (such as security) are hindered by a lack of interoperability.
Interoperability is definitely one of the key enablers for the development of all the ICT potential behind the industry 4.0 initiative. Industry 4.0 is basically data-driven, and it is through the efficient and intelligent sharing, processing, analysis, and exploitation of today’s data silos that industry anticipates a new industrial revolution based on cyber-physical production systems. The emergence and proliferation of a family of digital manufacturing platforms that serve industry 4.0 across the full product and manufacturing process lifecycle is definitely a strong driver for full-blown interoperability.
The IoT world is implementing two different strategies. On one side, there are very vertical (and siloed) application domains; they are not pursuing any interworking capabilities, because each application tries to become a de facto standard, the reference for a specific market or application domain. This approach is particularly effective when the IoT solution falls entirely within a single administrative domain, such as when a system is totally under the control of a single administrator. In this case, the solution provider will share or adopt a few interfaces in order to create or use an ecosystem of companies that can share technologies.
On the other side, there are large applications that need to access data and functionalities from different applications. An example could be a smart city system, in which interoperability between datasets and specific functionalities is a must. In addition, several administrative domains may need to identify and open interfaces. In this case, interoperability should be clearly identified, and well-specified interfaces, data formats, and APIs should be designed and made available.
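The well-specified data formats that Minerva describes can be illustrated with a minimal sketch. The vendor payload shapes, field names, and `Observation` type below are hypothetical, invented purely to show how a smart-city system might normalize readings from two administrative domains into one common representation:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Common, vendor-neutral representation of a sensor reading."""
    sensor_id: str
    quantity: str  # e.g., "temperature"
    value: float
    unit: str      # canonical unit, e.g., "celsius"

def from_vendor_a(payload: dict) -> Observation:
    # Hypothetical vendor A reports temperature in Fahrenheit as "temp_f".
    return Observation(
        sensor_id=payload["id"],
        quantity="temperature",
        value=(payload["temp_f"] - 32) * 5 / 9,
        unit="celsius",
    )

def from_vendor_b(payload: dict) -> Observation:
    # Hypothetical vendor B already reports Celsius, under other field names.
    return Observation(
        sensor_id=payload["device"],
        quantity="temperature",
        value=payload["celsius"],
        unit="celsius",
    )

# Once normalized, readings from both domains can feed the same analytics.
readings = [
    from_vendor_a({"id": "a-17", "temp_f": 68.0}),
    from_vendor_b({"device": "b-03", "celsius": 21.5}),
]
```

The point of the sketch is that the agreement lives in the shared `Observation` format, not in either vendor's API; each domain only has to publish (or accept) an adapter to that format.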
— Roberto Minerva
In my opinion, full-blown interoperability is completely indispensable for the progress of IoT ecosystems. Nowadays, customers and manufacturers have a huge number of technologies and solutions to choose from, and the risk of investing in a nonstandard option is too high because of its possible obsolescence. The absence of an open interoperability framework is one of the factors limiting the expansion of IoT ecosystems (and, therefore, their maximum market and technological potential). But it is possible that the IoT market will continue its current evolution for small-scale solutions; applications such as industrial IoT or even smart cities can be seen as independent IoT systems.
I perceive that the IoT will live and progress for a while with different standards that are not interoperable; interoperability does not look like an issue for today's major IoT players (such as Google and Apple) that maintain these silos. But this trend should reach a limit, because this fragmentation limits the benefits for users.
I don't see interoperability as a necessity from the business side of the equation. It is always a good top-down request (it is no accident that the EU Commission funds it, for instance), but on the business side, certain proprietary IoT features can be a big competitive added value.
We can identify five layers in an IoT ecosystem: device, networking, middleware, application services, and data and semantics. Assuming this conceptualization of the hardware/software stack, (i) do you foresee any reference standards for the IoT? If so, (ii) at which level, and (iii) who will develop them (for example, the W3C, the IEEE Standards Association, or a consortium of lead players)?
Most likely a consortium of lead commercial players will act faster than a governmental organization. The consortium will want to benefit from imposing its solution and becoming the main standard, which will allow for early mover advantage in the market.
Industry 4.0 is working at the level of reference architectures, with RAMI 4.0 (Reference Architecture Model for Industry 4.0) in Europe and the IIRA (Industrial Internet Reference Architecture) in the U.S. Both are reference architectures, though not really software-development architectures. In any case, industry 4.0 is looking at interoperability at many levels, from the evolution of OPC-UA standards to the current efforts mainly driven by IEC and ISO/TC184. It aims at breaking the silos between operation technologies (usually associated with automation and shop-floor operations) and information technologies (usually associated with resource and customer management and collaboration). Moreover, recent efforts from the W3C working group on the Web of Things (WoT), IoT lifecycle management standards by The Open Group (TOG), and standards on Object Memory Model (OMM) definition are developing the right semantics for interoperability and common vocabulary development. It is worth noting that the scope of industry 4.0 is so wide that such common vocabularies are being developed in particular domains of expertise and knowledge, such as AutomationML. Moreover, standards such as Open Services for Lifecycle Collaboration (OSLC) are standardizing methods to loosely couple such tools and data.
Interfaces and open APIs will be required at least up to the middleware layer in order to integrate products and general functionalities offered by underlying resources. Middleware APIs could already be distinctive and proprietary so as to segment the market in favor of a few products. Application development, and possibly data collection and analysis, are highly competitive tasks that will likely be confined within the realm of proprietary interfaces.
However, in order to unleash the potential of the IoT (especially for large applications), a well-designed interface at the middleware and data management level would be really effective and helpful. One consideration is that the data management level should be folded into the set of functions and interfaces accessible through middleware APIs.
— Roberto Minerva
If we follow the presented five-layer approach, I think each layer could be suitable for standardization, and we can find solutions in the state of the art for each (such as oneM2M, or middleware solutions such as FIWARE). Nevertheless, if we analyze each layer independently, we can find differences:
- Devices. Nowadays, we can find a huge number of connectivity standards for devices, such as ZigBee, Bluetooth Low Energy, Z-Wave, KNX, Wi-Fi, and 4G. It is interesting to see that each connectivity standard has its advantages and disadvantages depending on the application (for example, BLE offers good power performance in personal area networks, but 4G is more adequate for mobility applications). So, the choice of technology is completely linked to the specific application, and I think it will be impossible to find a technology that covers all the scenarios while being cost-effective. NB-IoT and 5G seem a good approach for that, but I think that fee-free technologies are more suitable in fixed scenarios. It is worth noting the role of smart gateways and smartphones as key elements in solving this application-dependent scenario and providing security and privacy.
- Networking. Despite my limited knowledge of network standards, I think that IPv6 will be a suitable approach for standardization in this layer. The sheer number of connected devices and the interoperability capabilities of IPv6 will act as enablers of this standardization.
- Middleware. In this layer, a set of competitors (such as FIWARE, SOFIA2, IBM Watson, and sensiNact) are in the market, and each is more oriented to different applications. I think the choice of middleware will be linked to the final application and to market and political factors, and it will be difficult to standardize this layer. It is important to note that a scenario in which FIWARE holds the highest market share in Europe is plausible because of the support of different actors for this solution, but I think that FIWARE will not be the only player in the market.
- Application services. Related to the previous point, I think it will be difficult to define a standard interoperability layer for application services, because of the evolution of the middleware beneath it. At this point, any possible standard would have to be linked to regulation or to commercial agreements between different middleware providers, and I think such an agreement will be difficult to reach.
- Data and semantics. In my opinion, this is one of the layers (along with the networking layer) that is most suitable for standardization. The emergence of ontologies such as the oneM2M base ontology will make it possible to define an interoperability layer at this level, and I think the benefits of this abstraction will act as an enabler of the evolution of these ontologies.
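The data-and-semantics point above can be made concrete with a small sketch. The shared vocabulary, vendor terms, and mapping below are purely illustrative (they are not drawn from oneM2M or any real ontology); the idea is simply that vendor-specific property names get translated into one agreed vocabulary, so applications can query data without knowing which vendor produced it:

```python
# Illustrative shared vocabulary and vendor-term mapping (hypothetical).
SHARED_VOCAB = {"temperature", "humidity", "power_consumption"}

VENDOR_TERM_MAP = {
    "tempC": "temperature",
    "rh_pct": "humidity",
    "watts": "power_consumption",
}

def to_shared_terms(vendor_payload: dict) -> dict:
    """Translate a vendor payload into the shared vocabulary,
    dropping fields the vocabulary does not cover."""
    out = {}
    for key, value in vendor_payload.items():
        term = VENDOR_TERM_MAP.get(key)
        if term in SHARED_VOCAB:
            out[term] = value
    return out

normalized = to_shared_terms({"tempC": 21.5, "watts": 340, "vendor_flag": 1})
# normalized -> {"temperature": 21.5, "power_consumption": 340}
```

In a real deployment the vocabulary would come from a standardized ontology rather than a hand-written dictionary, but the benefit is the same: once payloads are expressed in shared terms, cross-vendor aggregation becomes straightforward.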
Regarding the organizations that will lead this standardization, I think that institutions like IEEE or W3C will lead for the networking and data layers, while a consortium of lead players—including industry and institutions—will lead for middleware, application services, and devices.
In my opinion, reference standards should first address semantics and data exchange formats; W3C looks like a good initiative to reach such interoperability. Interoperability at the application level is also very important because it would foster third-party software development, which is a huge source of revenue, but it will definitely remain in silos of proprietary platforms.
There are two paths to interoperability when forming IoT ecosystems: standards-based and voluntary (the latter defined as the act of interconnecting, making interoperable, and integrating heterogeneous systems without using standardized protocols, methods, or approaches). What are the pros and cons of each path in the short- and long-term evolution of the IoT?
A standards-based path would be ideal, but the diversity of standards makes it impossible for now. So, picking and integrating heterogeneous systems will continue to be the common practice. The cons are weaker security for the overall ecosystem and narrower choices when building ecosystems.
The pros of de facto, voluntary standards are usually based on the adoption of particular vendor platforms or toolkits for operations. This has the clear advantage of quicker returns on investments within the platform ecosystem. But the lack of standards could limit long-term strategies in terms of incorporating a larger ecosystem of high-value stakeholders or sustaining the global operation of manufacturing activities. On the other hand, standards take longer to be developed and would slow down the adoption of particular solutions or platforms. Therefore, industry 4.0 is taking a balanced approach to these types of developments through open platform strategies. Open platforms ensure quicker return on investment, but the openness alleviates the vendor lock-in concerns for digital manufacturing strategy implementation and the future extensibility of platforms toward global standards.
The definition and implementation of standards would greatly help keep the IoT market from segmenting and would promote the creation of a large IoT ecosystem. For the time being, the voluntary and proprietary approach is predominant. This may lead to non-interoperable systems, more because of business decisions than because of technical limitations. Big players will attempt to establish their solutions as de facto standards. This may result in a very fragmented IoT ecosystem that will be difficult to recompose into a large one.
— Roberto Minerva
As I commented previously, I think the evolution of IoT ecosystems will combine these two paths, because in some layers the market will accept interoperability standards for their commercial benefits, but in other layers it would be necessary to force this process (if it is needed and valuable). In my opinion, the main pro of the voluntary path is that it does not disrupt the previous work of academia and industry, and its main con is the limited traction of the benefits of interoperability. On the other hand, the standards-based path has the con of breaking with some developed and marketed technologies (which could cause deep harm to industry and the market) and the long-term benefit of a common, standardized IoT ecosystem that enables cross-cutting applications. For these reasons, in my opinion, IoT ecosystem standardization should be a voluntary process, with specific standards for high-value scenarios (such as health systems or smart cities).
In the short term, the voluntary approach (if I understand the paradigm correctly) could enable the emergence of new services based on heterogeneous systems; stakeholders won't wait for greater interoperability before purchasing sensors, objects, and corresponding processing platforms. Therefore, the voluntary approach looks like the most pragmatic way to enable IoT adoption.
But in the long term, the voluntary approach could be very costly and may limit the development of IoT. Common standards will be needed; for example, in my domain of security and defense, service-oriented architectures (SOAs) are a promising approach.
The biggest pro that I envision is that such interoperability will provide the best frame for regulation-based benefits, mostly in two fields: business and privacy. In business, it will give small players a gentle learning curve for complying at every IoT level; then we'll see many more innovators entering the field. In privacy, it will give a clear path to follow in order to earn more trust from users.
According to McKinsey, interoperability will be the key to enabling more economic potential of the IoT: "Interoperability between IoT systems is critical. Of the total potential economic value the IoT enables, interoperability is required for 40 percent on average and for nearly 60 percent in some settings." Can you comment on this statement from your professional perspective?
I agree with the statement and the article. I also agree with the quote, "Companies that use IoT technology will play a critical role in developing the right systems and processes to maximize its value." This goes back to your second question. Hopefully governmental bodies can accelerate the process of helping commercial organizations deploy the right interoperability.
I could not agree more with this statement. Industry 4.0 will definitely break the information silos within and across factories. The issue at stake is whether interoperability technologies and standards will be in place in time to make such processes cost-effective and seamless. Failing to do so will significantly (1) limit the opportunities for collaboration across manufacturing industries to realize shorter times to market for innovative products and services, (2) stall the development of added-value digital services for the optimization of manufacturing operations (sustainability and energy efficiency), and (3) impact the ability to reach the goal of zero-defect manufacturing across connected-factory value networks. Interoperability and platforms hold the key to a significant proportion of the business value behind industrial IoT and industry 4.0 projections.
The most likely approach for the time being in the IoT is that some major players will try to position their solutions as de facto standards. This will put a few of them in a predominant position, and they will gradually build larger ecosystems. So, interoperability will be offered within a proprietary context. Most likely the approach will also address the specificity of the different IoT application domains (such as transport, automotive, logistics, and e-health). Specific solutions will emerge for the different application domains. These solutions will tend to be highly effective but strongly vertical in scope.
Large IoT implementations (those that need to integrate different application domains) will be difficult to create and implement. Horizontal platforms will lag behind specialized, more-focused ones. However, in a few years the need for them will clearly emerge, and a new standardization phase will have to cope with the fragmentation of the market. In the end, horizontal open APIs will become available.
— Roberto Minerva
From my point of view, as I commented in the first question, this statement is completely right. If we, as technological actors, want to unlock the full potential of the IoT market, we need to work on interoperability solutions that allow cross-cutting and cross-domain applications. We should be conscious that we are moving to a big data scenario, in which the aggregation of information will be key to developing disruptive solutions and applications, and for which it is almost mandatory to develop and strengthen interoperability concepts.
These numbers seem consistent, especially when you think about the potential benefit of being able to use most of the information gathered; more interoperability will enable more heterogeneous-source data analytics, fusion, and thus better decision making, which is mandatory for some domains such as predictive maintenance and health.
It is always a good thing for customers to buy something "open" or "universal." But because most private corporations invest in their core added values, the price of openness is sometimes much higher than the price of remaining closed. When we look at the biggest hardware and software companies, they always prefer to set their own platform and have others comply with it.
In your company, do you see any role or value in establishing interoperability between your IoT solutions and solutions that are being placed in the market by other IoT vendors? If you see such a role, how does your company envision achieving interoperability with artifacts developed by others? If you do not see such a role, please explain.
That would be ideal, but until it is possible, we are focusing on investing in cybersecurity solutions that try to work effectively even without interoperability.
Zero-defect manufacturing (the ability to produce without defects or scraps with a minimum energy budget) has always been and will remain the “holy grail” of manufacturing. As a group that delivers instrumentation, sensors, and software platforms to deal with quality control information, the role of interoperability for us is fundamental. Interoperability across equipment on the manufacturing shop floor is crucial to ensure “plug and produce,” or more autonomous production processes. Quality information across stakeholders in the factory and value chain has many different uses, from the most evident one (production control) to more advanced ones, such as collaborative manufacturing execution planning or prescriptive process and machine maintenance. For us, data interoperability through standardized semantic models that connect engineering (CAD, CAE) and production (MES, PLM) is fundamental to increasing the level of automation of zero-defect manufacturing strategies that can be delivered at competitive costs. Shop-floor equipment data communications and product-process data representation, as well as loosely connecting the capabilities of multiple platforms and tools across the lifecycle, are the avenues our group is investing in to achieve interoperability. We ultimately want to be able to deliver to industry 4.0 customers full traceability and visibility of the manufacturing quality of components and systems across the complete value network.
A telecom operator is used to operating with standards and to guaranteeing interoperability. For this reason, oneM2M or other standards initiatives proposed by ETSI in the context of 5G (for example, NB-IoT) will have great importance and support. However, standardization is slower than the development and implementation of proprietary solutions, and even though the standardization process has been sped up, there is still a gap. So, there is a need to create awareness of the need for an open and programmable (at large scale) IoT ecosystem.
— Roberto Minerva
As I commented before, I think that interoperability between IoT ecosystems will be needed for the evolution of the market, and it will allow the development of disruptive applications thanks to the aggregation of data gathered in different scenarios. For example, in the smart city scenario, traffic, environmental, power-consumption, water-consumption, and multimedia content-consumption data could be combined in order to find patterns of resource consumption and act on them more effectively than if we used only the power-consumption data. Besides, IoT ecosystems have room for different kinds of industries, such as electronics manufacturers, application developers, and network operators. For this reason, we have defined our position in this IoT ecosystem, and we are forming high-value alliances to complete it.
For these reasons, we are working on developing interoperability capabilities for our systems, to integrate our solutions with those of other IoT vendors and to expand the full potential of IoT ecosystems. To that end, we are participating in cooperative national and international projects whose main objective is improving interoperability between IoT ecosystems (such as ACTIVAGE). We maintain an active role in the forums and events that discuss these topics, and we would like to take part in defining these standard IoT ecosystems. Finally, our IoT products are designed and developed to expand the whole concept of IoT ecosystems, and they embody this interoperability concept themselves, as seen in our Smart Gateway.
My company is not (at least at the present time) an IoT provider, so interoperability with IoT vendors is not an issue right now. We will focus more on the security aspects (communication and data) of the IoT.