M2M standardization: “The world is no longer black and white. It is at least gray, if not a coat of many colors.”

An interview with Professor Dr. Axel Sikora, Scientific Director of the Institute of Reliable Embedded Systems and Communication Electronics (ivESK) at the University of Applied Sciences, Offenburg, Deputy Member of the Board of the Hahn-Schickard Association of Applied Research, Villingen-Schwenningen, and Board member of the M2M Alliance e.V.

Part 2:

 

Do we really have too few cross-industry standards? That is the question Professor Dr. Axel Sikora answered in the first part of our interview. His conclusion: there is already a large number of approaches and organizations developing M2M communication standards; European companies must, however, join forces more energetically so as not to lose touch with American and Asian alliances. Read Part 1 of the interview here. In this second part, he deals mainly with the technical levels of standardization and with possible risks.

 

Professor Sikora, at which levels do you see standardization as making sense?

 

Let me answer this question along the classical network layers of the ISO/OSI reference model, which extends from Layer 1, the physical transmission technology, via the network layer (Layer 3) up to the application layer (Layer 7). The application layer is where I currently see the greatest need for standardization because, pleasingly, at the network layer the Internet Protocol version 6 (IPv6) is increasingly establishing itself as the overarching protocol for end-to-end communication and is taking over the integration of the various L1/L2 protocols. At the application layer, by contrast, there is an enormous number of approaches, architectures and solutions, with the result that today’s IoT applications are characterized by countless application gateways that convert data models from one format into another. These gateways are typically very complex and difficult to manage.
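To illustrate the point about application gateways, here is a minimal, hypothetical sketch of such a conversion. The vendor payloads, field names and scaling factors are invented for illustration and are not taken from any specific standard; the point is only that every pair of data models needs its own mapping, which is what makes these gateways so numerous and hard to maintain.

```python
# Minimal, hypothetical sketch of what an application-level gateway does:
# it maps one vendor-specific data model onto another. All field names and
# scaling factors below are illustrative assumptions, not real standards.

def vendor_a_to_generic(payload: dict) -> dict:
    """Translate a (hypothetical) vendor-A sensor reading into a generic model."""
    return {
        "deviceId": payload["dev"],                   # different key names ...
        "temperatureCelsius": payload["tmp"] / 10.0,  # ... and different scaling
        "timestamp": payload["ts"],
    }

def generic_to_vendor_b(reading: dict) -> dict:
    """Translate the generic model into a (hypothetical) vendor-B format."""
    return {
        "id": reading["deviceId"],
        "temp_f": reading["temperatureCelsius"] * 9 / 5 + 32,  # unit conversion
        "time": reading["timestamp"],
    }

# Example: one reading passes through two conversions before reaching its consumer.
raw = {"dev": "sensor-42", "tmp": 215, "ts": "2016-05-01T12:00:00Z"}
print(generic_to_vendor_b(vendor_a_to_generic(raw)))
```

Every additional data model multiplies the number of such mappings, which is exactly the complexity a common application-level standard would remove.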

 
What role do open standards play when compared with proprietary standards?

 

Open standards are playing an increasingly important role; the world here is no longer black and white but at least gray, if not a coat of many colors. There are a number of proprietary standards, even though, strictly speaking, the term “proprietary standard” ought not to exist. Nevertheless, some of these standards are very successful. They include standards developed by individual manufacturers, such as the EnOcean Radio Protocol and Z-Wave for home automation or SIGFOX and LoRaWAN for long-range IoT networks, as well as protocols drawn up by cross-vendor alliances such as Bluetooth and ZigBee. Open standards such as OMS, for which several players are on board, usually have a wider scope and are therefore the more future-proof option. In addition, open standards enable smaller partners – adopters – to get involved much more simply than with proprietary standards, where the hurdles for smaller companies are often much higher.

 

Which hurdles do you mean specifically? 

 

They can be financial hurdles, but also informal ones. In this way the dominant player can try to keep capable and fast-moving competitors at bay.

 

How would the adoption of global standards influence the IT security of M2M applications? 

 

It would be highly positive on two counts. For one, it would support global end-to-end security; for another, it would help fulfill data protection requirements for globally distributed or hosted applications.

That having been said, I do not anticipate global agreements on security because the needs of companies and countries differ widely. 

I do consider it problematic when proprietary standards are not published, because openness of the security solution, combined with confidentiality of the keys and cryptographic material, remains the right approach. Truly secure solutions can only be developed in open, global competition among cryptologists. Global standardization does, however, carry a risk: if a mistake is made, systematic interventions have much further-reaching effects.

 

Do you see other critical aspects in the development of standards?

 

I consider the continued attempts to seal off markets to be dangerous. This approach was taken very often in the past: very closed solutions were adopted in an attempt to keep competitors out of home markets. But that inhibits willingness to invest. Furthermore, M2M applications must increasingly be designed for international use. For me, home automation is a sad but unfortunately textbook example of how the lack of an overall standard inhibits market development.

 

What influence does politics have on M2M standardization? 

 

The work of the Federal Office for Information Security (BSI) on the standardization of smart metering was in my view very positive and in many respects exemplary. The BSI laid down guidelines but also paid close attention to industry feedback. With a firm hand, it set in motion a development that would otherwise have been unlikely to happen in the way it did. I consider political guidelines on data protection, such as those the BSI has thankfully laid down in Germany to largely positive effect, to be very important. This is an area in which the state can, in my opinion, play an important and positive role, as long as it takes the justified requirements of industry into consideration. It has to be a public-private partnership; otherwise, regulations would be drawn up that later prove impracticable. And once the state takes a leading role in this area, good, leading companies often step up and support the process seriously and sustainably rather than merely pursuing self-interested lobbying.

 
Many thanks to Professor Sikora for the interview. If you are interested in the first part, please click here.

Tags: IoT, Machine-to-Machine, Internet of Things, Standards, Standardization, M2M
