IoT Impact on the Manufacturing Industry (Part 2)

Continuing on from my last blog post, another example of IoT use in manufacturing is asset management: distributing work orders and configurations to the tools at the different stages of production. In the other direction, calibration information can be fed back to the Enterprise Resource Planning (ERP) system and associated with the bill of materials (BOM). Big data and NoSQL technologies are enablers in this regard, as they allow the management of huge volumes of heterogeneous, multi-structured data about the production process, from the data types already discussed to images from Automated Optical Inspection (AOI) systems and other production modules. With recalls a persistent concern in global manufacturing, this traceability can be an ally in the fight to keep manufacturing costs down.

IoT can also have an impact through intelligent edge devices and their use in improving supply chain optimization and the modularity of manufacturing. Consider surface mount technology (SMT), where there are many moving parts, calibration requirements and different technologies involved in the placement and verification of board-level components. IoT sensors could be used to centralize SMT line asset management and to read calibration information via the factory WLAN. The asset management layer can then form the link between the SMT tools and the Enterprise Resource Planning (ERP) and Manufacturing Execution Systems (MES) that oversee the manufacturing process.

A further challenge facing the manufacturing industry is an ageing workforce, which means that anything that speeds up the manufacturing process is critical. Advances in mobile technology are a key enabler in getting information to the shop floor more quickly, improving response time, visibility and accessibility of operations. The recent advances in wearables will also enhance visibility on the shop floor.

Building Blocks for IoT in Manufacturing

Business owners need to look at four technology elements that provide the foundation for smart manufacturing. These include (but are not limited to):

  • Security: IT security is a major obstacle to setting up smart factories. Operations managers need to make sure that necessary safeguards are built into the solution including security procedures such as physical building security, hardware encryption and network security for data in transit. Security and networking solutions must also be engineered to withstand harsh environmental conditions, such as moisture and temperature, that aren’t present in typical networks. Identity and authentication structures will also need to be updated to support such “things” as well as people.
  • More Advanced Networking: Smarter manufacturing environments need a standardized IP-centric network that enables all the devices and sensors in a plant to communicate with enterprise business systems. Cisco research states that only 4 percent of the devices on the manufacturing floor are connected to a network. A standard IP network also makes it easier to connect and collaborate with suppliers and customers to improve supply chain visibility. Manufacturers need robust networks that can cope with the Radio Frequency (RF) challenges and harsher environmental conditions of the plant, and that offer the stability needed for transmission of alarms and real-time data processing.
  • Big Data Analytics: While manufacturers have been generating big data for many years, companies have had limited ability to store, analyze and effectively use all the data available to them, especially in real time. New big data processing tools are enabling real-time analysis of data streams, which can deliver dramatic improvements in real-time problem solving and cost avoidance (see the sketch after this list). Big data and analytics will be the foundation for areas such as forecasting, proactive maintenance and automation.
  • Engineering Software Systems: Today’s IoT data is different from the data we have used to operate our systems; it requires collecting a wide range of data from a variety of sensors. These software systems and models must translate information from the physical world into actionable insight that can be used by humans and machines. Toyota, for example, uses Rockwell’s software for real-time error correction in the plant, and has minimized rework and scrap rates in its Alabama plant, resulting in an annual cost saving of $550,000 [3].
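As a minimal illustration of the real-time analytics point above (the event fields, work-order names and threshold are assumptions for the sketch, not any vendor’s schema), a streaming job might tally AOI defect rates per work order as inspection events arrive, rather than waiting for an end-of-shift report:

```python
from collections import defaultdict

inspected = defaultdict(int)     # boards inspected per work order
defective = defaultdict(int)     # boards flagged as defective by the AOI system
alerted = set()
DEFECT_RATE_ALERT = 0.02         # assumed threshold: a 2% defect rate triggers intervention

def on_aoi_event(event: dict) -> None:
    """Called for every inspection result streamed off the SMT line."""
    wo = event["work_order"]
    inspected[wo] += 1
    if event["defect"]:
        defective[wo] += 1
    if inspected[wo] >= 50 and wo not in alerted:   # wait for a meaningful sample size
        rate = defective[wo] / inspected[wo]
        if rate > DEFECT_RATE_ALERT:
            alerted.add(wo)
            print(f"ALERT work order {wo}: defect rate {rate:.1%}")

# Simulated stream: one bad placement program affecting work order WO-1002
for i in range(200):
    on_aoi_event({"work_order": "WO-1001", "defect": False})
    on_aoi_event({"work_order": "WO-1002", "defect": i % 20 == 0})
```

The point of the sketch is simply that the problem is flagged while the shift is still running, which is where the cost avoidance comes from.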

Building blocks for end-to-end infrastructure enabling manufacturing intelligence from the factory floor to the data center (Intel) [4]
With IoT, IP networks and analytics, manufacturers can become more efficient, improve worker safety and offer exciting new business models. IoT will help manufacturers improve resource efficiency, safety and return on assets, and those that master this new dynamic will find a variety of new opportunities for revenue growth and cost savings.

References

3: How IoT will help manufacturing

http://www.industryweek.com/blog/how-will-internet-things-help-manufacturing

4: Industrial Optimization IoT (Intel)

http://www.intel.ie/content/dam/www/public/us/en/documents/white-papers/industrial-optimizing-manufacturing-with-iot-paper.pdf

IoT Impact on the Manufacturing Industry (Part 1)

“Industry 4.0” and “Smart Factory” are some of the terms used to describe the technological and social revolution that promises to change the current industrial landscape. Industry 1.0 was the invention of mechanical assistance, Industry 2.0 was mass production, pioneered by Henry Ford, Industry 3.0 brought electronics and control systems to the shop floor, and Industry 4.0 is peer-to-peer communication between products, systems and machines. IoT will clearly have a different impact depending on the application and industry; one of particular interest, given its emphasis on process, is manufacturing. Compared with realms such as retail, which deals largely in intangibles, manufacturing is about physical objects and how we bring them to the consumer in a more efficient and automated way. The manufacturing landscape is ever changing, with automation through robotics the most recent enabler.

Challenges and Possibilities of IoT and Manufacturing [1]

Gartner analyst Simon Jacobson sees five immediate challenges and possibilities posed by IoT for the manufacturing industry [1].

1. CIOs and manufacturing leads will have to move even more rapidly

Jacobson says manufacturers have moved heavily toward individualization and mass customization on the back of connected products. But to enable that, they have to maintain alignment with supply management, logistics functions and partners to make sure all service levels are maintained: “I have to have knowledge of my processes and optimization of my processes at a hyper level, not just simply understanding at week’s end or at the end of the shift where I need to make adjustments and improve,” Jacobson said.

2. Security must be reimagined

A connected enterprise means you can no longer simply secure the facility physically; you must blend the approaches of mobile and cloud-based architectures with industrial control and automation, ensuring information is properly managed. Jacobson says the challenge will be to merge the skills of engineers and process control teams with IT and, more importantly, to unify their disparate approaches to security.

3. IoT will create more visibility in process performance

There’s always been a form of automation and control in manufacturing, but implementing new business applications powered by IoT will allow you to connect devices to the factory network and know tolerances: “Being able to connect those dots and derive contexts of how processes are performing is absolutely going to be where the return on investment is coming from,” Jacobson said.

4. Predictive maintenance can generate revenue for OEMs

Asset performance management is of high value today. This is the ability to drive availability, minimize costs and reduce operational risks by capturing and analyzing data. Original Equipment Manufacturers (OEMs) have already started creating revenue by using IoT-enabled tools like predictive maintenance in order to guarantee uptime, outcomes and certain levels of performance for the customer: “When you guarantee these kinds of outcomes to the customers, you have to look at this from two different perspectives, how I monetize this but also how my customer monetizes this,” Jacobson said.

5. Production will play a new role in the manufacturing value chain

The boundaries between the physical and digital worlds are blurring. Chief Information Officers (CIOs) and manufacturing strategists can use the IoT, big data and cloud to redefine the role production plays in the manufacturing value chain. It no longer has to be restricted to being a cost center, and this has all to do with the new ability to not just accelerate but innovate on the factory floor. It’s the CIO’s challenge to keep pace with these new competitive changes.

Figure 10: Real Time Intelligence on the Shop Floor [2]
In my next blog post, I will continue this discussion on IoT and Manufacturing, giving further use cases, and outlining the building blocks for IoT in Manufacturing.

References:

1: Gartner Best Practices for IoT in Manufacturing

https://www.gartner.com/doc/2899318?ref=AnalystProfile

2: Building Blocks for a Smart Plant

http://www.mbtmag.com/articles/2014/10/manufacturing-transformations-building-blocks-future-smart-plant

Pre-Cloud Security Considerations in IoT

Introduction

Over the past decade, hybrid cloud adoption has steadily increased, with closed networks becoming less and less the option of choice. But this comes at a cost to security and trust. As we become more dependent on intelligent devices in our lives, how do we ensure the data within this web is not compromised by external threats that could endanger our personal safety?

As the adoption of IoT increases, so does the risk of hackers getting at our personal information. As Alan Webber points out on his RSA blog [6], there are three key risk areas, or bubbles, that companies need to be aware of.

1: Fully enabled Linux/Windows OS systems: This area concerns devices that are not part of a normal IT infrastructure but still run on full operating systems such as Linux or Windows. These operating systems have well-known vulnerabilities, and when such devices are deployed in the “free world” they are far less visible to IT administrators.

2: Building Management Systems (BMS): This pertains to infrastructure systems that assist in the management of buildings, such as fire detection and suppression, physical security systems and more. These are not usually classified as likely targets, yet shutting down a fire escape alarm system could enable a break-in.

3: Industry Specific Devices: This area covers devices that assist a particular industry, such as manufacturing, navigation, or supply chain management systems. For example, in the case of a supply chain management system, route and departure times for shipments can be intercepted, which could lead to a shipment being intercepted and rerouted to another geographical location.

So, how do we guard against these types of risk, and make both the devices themselves and the web of connected devices less dumb? Security must be looked at holistically, with end-to-end security systems employed to ensure system-level safety, and with device-level embedded control software ensuring data integrity from edge to cloud.

Data routing must also be taken seriously from a security standpoint. For example, smart meters generally do not push their data to a gateway continuously; they send it to a data collection hub, which then forwards it to the gateway in a single bulk packet. Whilst the gateway might have an acceptable security policy, what about the data collection hub? This raises a major challenge: how does one micro-manage all the various security systems their data might migrate across?
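One mitigation, sketched minimally below (assuming the device and the receiving service share a provisioned per-device key, which is an assumption for illustration rather than a description of any particular deployment), is for the device to sign each reading so that tampering at an intermediate collection hub is detectable downstream, whatever that hub’s own security policy.

```python
import hashlib, hmac, json

DEVICE_KEY = b"per-device-provisioned-secret"   # hypothetical key, normally held in a secure element

def sign_reading(reading: dict) -> dict:
    """On the device: attach an HMAC so downstream hops cannot alter the payload undetected."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": reading, "hmac": tag}

def verify_reading(message: dict) -> bool:
    """At the gateway or cloud: recompute the HMAC and compare in constant time."""
    payload = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["hmac"])

signed = sign_reading({"meter_id": "m-42", "kwh": 3.2, "ts": 1706000000})
assert verify_reading(signed)
```

The collection hub can still batch and forward the messages; it simply never needs to be trusted with their integrity.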

Security Design Considerations

Early-stage IoT devices unfortunately did not account for the potential loss of physical security in their design, so security officers need to be aware of where and how their security provisioning is focused.

Applying the security design within the devices themselves is not the most common approach (much as with internal storage), as the cost and capacity constraints of these devices work against it; the devices instead focus on ensuring consistency of communication and message integrity. Usually, the more complex security design is deployed up front within the web services that sit in front of, and interact with, the devices. It is predicted that as the devices themselves evolve, and nanotechnology becomes more and more of an enabler in the space, the security design will move closer to the devices, before eventually becoming embedded.

It is proposed that shared cloud-based storage will play a pivotal role in combating the data volume problem, but not without issues of its own. How do we handle identification and authentication? How do we ensure adequate data governance? Partnerships between security officers and cloud providers will be necessary to ensure these questions are answered.

Searching for the holy grail of 100% threat avoidance is futile, given the number of players in an entire IoT ecosystem. Whilst cloud service providers own their own infrastructure, it is very difficult for them to know whether the data they receive has been compromised. There are ways to reduce this risk, such as using metadata and building “smarts” into the data, based on typical known sets, as it transitions from edge to cloud. A useful analogy is a nightclub security guard checking potential clients at the door: “What’s your name?” (what type of data are you), “Where have you been tonight?” (what’s your migration path), “How many drinks have you had?” (what transactions have happened on your data).
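To make the “bouncer” analogy concrete, here is a minimal sketch of a metadata admission check at the cloud edge; the field names, allowed types and trusted hops are assumptions for illustration, not any product’s schema.

```python
EXPECTED_TYPES = {"telemetry", "calibration", "alarm"}   # assumed data-type taxonomy
TRUSTED_HOPS = {"device", "collection-hub", "gateway"}   # assumed legitimate migration path elements

def admit(message: dict) -> bool:
    """Bouncer-style check on provenance metadata before data enters the cloud."""
    meta = message.get("meta", {})
    if meta.get("data_type") not in EXPECTED_TYPES:           # "what type of data are you?"
        return False
    hops = meta.get("migration_path", [])
    if not hops or any(h not in TRUSTED_HOPS for h in hops):  # "where have you been tonight?"
        return False
    # "how many drinks have you had?" - reject if transformed more times than expected
    return len(meta.get("transforms", [])) <= 3

print(admit({"meta": {"data_type": "telemetry",
                      "migration_path": ["device", "gateway"],
                      "transforms": ["unit-conversion"]}}))   # True
```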

IoT Security and Chip Design

One area that could bring about increased data privacy is greater use of “Trusted Execution Environments” (TEEs), a secure area within the device’s main processor. A TEE ensures that independent processing of critical data can occur within the silicon itself, enabling trusted applications to run that enforce confidentiality and integrity and protect against unauthorized cloning or object impersonation by remove-and-replace. As a real-world example, a home owner tampering with their smart meter to reduce their energy bill is one scenario TEEs would help prevent.

If cloud services companies can somehow increase their influence on IoT device design (beyond the popularity of TEEs in cellular applications), then technology such as this will reduce the risk once the data reaches the cloud. Collaboration between all parties should be increased so that best practice can be established across the entire IoT landscape.

Figure 1: Generalized framework for a secure SoC [7]
References:

6: 3 Key Risk Areas of the Internet of Things (RSA blog)

https://blogs.rsa.com/3-key-risk-areas-internet-things/

7: Using virtualization to implement a scalable trusted execution environment in secure SoCs (EDN)

http://www.edn.com/design/systems-design/4402964/2/Using-virtualization-to-implement-a-scalable-trusted-execution-environment-in-secure-SoCs

IoT meets Data Intelligence: Instant Chemistry

Even in the ideal world of a perfect network topology, a web of sensors, a security profile, a suitable data center design and plenty of applications for processing and analysis, one thing is constant across all of them: the data itself. Data science is much talked about, and careers have been built on the concept. It is normally aimed at the low-hanging fruit of a data set, the things that are easily measured. Science will take you only so far; it is data intelligence that shows the true value, with the capability to predict the impact of actions and track it over time, building modelling engines to solve future problems.

The data set itself is also different for data intelligence as opposed to data science, which relies on lots and lots of data (Facebook working out the effectiveness of its changes and features, for example). It is more complex, smaller even, and can be a data set contained within a single process or building. Imagine a hospital’s set of machines producing live data for an analytics engine, which compares the live data against historical models to gauge risk to patients; that can have a real, tangible benefit to quality of life. Commonly called “operational intelligence”, the idea is to apply real-time analytics to live data with very low latency. It is all about creating the complete picture: historical data and models working with live data to provide a solution that can potentially transform all kinds of industry.

At the core of any system of this kind is decision making, and again one must strive to make this as intelligent as possible. There are two types of decision making: static and dynamic. With the assistance of mathematical models and algorithms, it becomes possible for any IoT data set to be analyzed for the further implications of alternative actions, and one would predict that the efficiency of decision making increases as a result.

At the IoT device level, there is scope to apply such a solution. Given the limited storage capacity on the devices themselves, a form of rolling deterministic algorithm could analyse a window of recent sensor readings and decide whether or not to send a particular measurement to the intelligent gateway or cloud service.
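A minimal sketch of such a rolling rule follows; the window size and the threshold (in standard deviations from the rolling mean) are assumptions chosen for illustration.

```python
from collections import deque

class RollingSendDecider:
    """Decide on-device whether a reading is worth forwarding to the gateway."""
    def __init__(self, window: int = 32, threshold: float = 2.5):
        self.readings = deque(maxlen=window)   # bounded memory for constrained devices
        self.threshold = threshold             # deviations from the rolling mean needed to send

    def should_send(self, value: float) -> bool:
        if len(self.readings) < 2:
            self.readings.append(value)
            return True                         # always send until a baseline exists
        mean = sum(self.readings) / len(self.readings)
        var = sum((x - mean) ** 2 for x in self.readings) / len(self.readings)
        std = var ** 0.5 or 1e-9
        self.readings.append(value)
        return abs(value - mean) / std > self.threshold

decider = RollingSendDecider()
for reading in [20.1, 20.2, 20.15, 20.18, 25.7]:   # only the baseline readings and the spike are sent
    if decider.should_send(reading):
        print("forward to gateway:", reading)
```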

Another proposed on-device implementation might use a deviation-from-correctness model such as the Mahalanobis-Taguchi System (MTS), an information pattern technology that has been used in a range of diagnostic applications to support quantitative decisions by constructing a multivariate measurement scale using data analytic methods. In the MTS approach, the Mahalanobis distance (MD, a multivariate measure) is used to measure the degree of abnormality of patterns, and principles of Taguchi methods are used to evaluate the accuracy of predictions based on the constructed scale. The advantage of MD is that it considers the correlations between variables, which are essential in pattern analysis. Given that it can be used on a relatively small data set, and that the model improves as the number of historical samples grows, it could be applied to the hospital diagnosis example: perhaps the clinician needs a quick on-device prediction of how close a patient’s measurements are to a sample set of recent hospital measurements.
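As a small sketch of the distance calculation itself (using numpy, with synthetic values standing in for a hospital’s historical reference measurements), the degree of abnormality of a new reading could be computed as follows.

```python
import numpy as np

# Assumed historical reference set: rows are patients, columns are measurements
# (e.g. heart rate, systolic BP, temperature) - illustrative values only.
reference = np.array([
    [72, 118, 36.6],
    [68, 121, 36.8],
    [75, 117, 36.5],
    [70, 124, 36.7],
    [74, 120, 36.6],
])

mean = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def mahalanobis(x: np.ndarray) -> float:
    """Degree of abnormality of a new reading relative to the reference group."""
    diff = x - mean
    return float(np.sqrt(diff @ cov_inv @ diff))

print(mahalanobis(np.array([71, 119, 36.6])))   # close to the group -> small distance
print(mahalanobis(np.array([95, 150, 38.9])))   # abnormal reading -> large distance
```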

Taking this one stage further, if we expanded this to multiple hospitals, could we start to think about creating linked data sets, pooled together to extract intelligence? What if a storm is coming: will it affect my town or my house? Imagine sensors on each house tracking the storm in real time, predicting its trajectory and tracking changes in direction, so that the service could communicate directly with the home owners in its path.

With the premise of open source software in mind, consider now the concept of open data sets, linked or not. Imagine I were the CEO of a major oil and gas company, eager to learn from other companies in my sector and, in return, to let them learn from us through data sets. Tagging data by type (financial, statistical, online statistical, manufacturing, sales, for example) allows a metadata search engine to be created, which can then be used to gain industry-wide insight at the click of a mouse. The tagging is critical, as the data is then not simply a format, but descriptive as well.
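A minimal sketch of that tag-driven metadata search is below; the data set names and tags are hypothetical.

```python
from collections import defaultdict

# Hypothetical shared catalogue: data-set name -> descriptive tags
catalogue = {
    "q3-well-output":       {"manufacturing", "statistical"},
    "pipeline-maintenance": {"manufacturing"},
    "quarterly-results":    {"financial"},
    "retail-fuel-sales":    {"sales", "statistical"},
}

index = defaultdict(set)
for dataset, tags in catalogue.items():
    for tag in tags:
        index[tag].add(dataset)          # invert: tag -> data sets carrying that tag

def search(*tags: str) -> set:
    """Return the data sets carrying all of the requested tags."""
    results = [index[t] for t in tags]
    return set.intersection(*results) if results else set()

print(search("statistical", "manufacturing"))   # {'q3-well-output'}
```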

Case Study: Waylay IoT and Artificial Intelligence [11]

Waylay, an online cloud-native rules engine for any OEM maker, integrator or vendor of smart connected devices, proposes a strong link between IoT and Artificial Intelligence [11].

Waylay proposes a central concept from AI, the rational agent. By definition, an agent is something that perceives its environment through sensors and acts on it via actuators. An example is a robot that uses camera and sensor technology and performs an action, i.e. “move”, depending on its immediate environment (see Figure 8 below).

To extend the role of an agent, a rational agent then does the right thing. The right thing might depend on what has happened and what is currently happening in the environment.

Figure 8: Agent and Environment Diagram for AI [11]
Typically, Waylay outlines that an agent consists of an architecture and logic. The architecture allows it to ingest sensor data, run the logic on the data and act upon the outcome.

Waylay has developed a cloud-based agent architecture that observes the environment via software-defined sensors and acts on its environment through software-defined actuators rather than physical devices. A software-defined-sensor can correspond not only to a physical sensor but can also represent social media data, location data, generic API information, etc.

Figure 9: Waylay Cloud Platform and Environment Design [11]
For the logic, Waylay has chosen graph modeling technology, namely Bayesian networks, as the core logical component. Graph modeling is a powerful technology that provides flexibility to match the environmental conditions observed in IoT. Waylay exposes the complete agent as a Representational State Transfer (REST) service, which means the agent, sensors and actuators can be controlled from the outside, and the intelligent agent can be integrated as part of a bigger solution.
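To make the agent idea concrete, here is a purely illustrative sketch (not Waylay’s actual API): software-defined sensors are just callables that produce observations, the logic is a simple rule standing in for the Bayesian-network graph model, and the actuator is another callable that acts on the outcome.

```python
import random

# Software-defined sensors: any callable returning an observation,
# whether backed by hardware, a social feed or a generic API.
def temperature_sensor() -> float:
    return 18.0 + random.random() * 10       # stand-in for a real reading

def occupancy_api() -> bool:
    return True                               # stand-in for a building-management API call

# Software-defined actuator
def hvac_actuator(command: str) -> None:
    print("HVAC ->", command)

class RationalAgent:
    """Ingest sensor data, run logic on it, act on the outcome."""
    def __init__(self, sensors, actuator):
        self.sensors, self.actuator = sensors, actuator

    def step(self):
        obs = {name: read() for name, read in self.sensors.items()}
        # Logic: a simple rule here; Waylay uses Bayesian-network graph models instead.
        if obs["occupied"] and obs["temp"] > 25.0:
            self.actuator("cool")
        elif not obs["occupied"]:
            self.actuator("standby")

agent = RationalAgent({"temp": temperature_sensor, "occupied": occupancy_api}, hvac_actuator)
agent.step()
```

Exposing such an agent over REST, as Waylay does, is then a matter of wrapping the sensors, actuators and logic behind HTTP endpoints so the agent can be driven from outside.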

In summary, Waylay has developed a real-time decision making service for IoT applications. It is based on powerful artificial intelligence technology and its API-driven architecture makes it compatible with modern SaaS development practices.

End of Case Study 

Reference:

11: Waylay: Case study AI and IoT

http://www.waylay.io/when-iot-meets-artificial-intelligence/

Why IoT needs Software Defined Networking (SDN)

Software defined networking (SDN), with its ability to intelligently route traffic and take advantage of underutilized network resources, will help absorb the data flood of IoT. Cisco has a fairly aggressive IoT strategy and places its Application Centric Infrastructure version of SDN at the centre of it, and that makes sense: software is still the main ingredient that can be used to combat network bandwidth challenges.

Lori MacVittie [8] agrees that SDN is a critical enabler, but only if SDN considers all of the network layers from 2 to 7, and not just stateless layers 2-4. “Moving packets around optimally isn’t easy in a fixed and largely manually driven network. That’s why SDN is increasingly important when data volumes increase and do so in predictable waves. SDN can provide the means to automatically shift the load either in response or, optimally, in anticipation of those peak waves.”

The network challenges in IoT do not stop at bandwidth and volumes of data. Applications will be required to deal with the peak loads of data, so services will be required in layers 4-7 that provide for scale, security and performance of those apps.

Figure 5: Stateless vs Stateful in SDN Application Services [8]

SDN has other features that will be particularly useful. Dynamic load management should allow users to monitor and orchestrate bandwidth automatically on the fly, which will be music to the ears of global IoT providers. Service chaining will enable application-specific processing steps to be applied in sequence to a client’s traffic, which should ease the management overhead of IoT services as subscriptions increase globally. One of the most interesting features of SDN is bandwidth calendaring, which allows the user to schedule the traffic an application will need at a given time; when you consider a sensor that only wants to communicate at periodic intervals, it is apparent that this will be a great asset.
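As a toy sketch of bandwidth calendaring for periodic sensor traffic (the application names, windows and rates are assumptions, not any vendor’s controller API), the controller could hold time-keyed reservations and provision accordingly.

```python
from dataclasses import dataclass

@dataclass
class Reservation:
    app: str
    start_hour: int     # hour of day the application begins transmitting
    end_hour: int
    mbps: float

# Hypothetical calendar: sensors report in bursts, backups run overnight
calendar = [
    Reservation("smart-meter-uplink", 2, 3, 50.0),
    Reservation("plant-telemetry",    6, 18, 200.0),
    Reservation("offsite-backup",     0, 4, 800.0),
]

def bandwidth_needed(hour: int) -> float:
    """What the controller should provision for a given hour of the day."""
    return sum(r.mbps for r in calendar if r.start_hour <= hour < r.end_hour)

print(bandwidth_needed(2))    # 850.0 - meter uplink and backups overlap
print(bandwidth_needed(12))   # 200.0 - only daytime telemetry
```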

But this will not happen overnight. Data center managers will have to modernize their infrastructures first. Once they do, a potential big win would be the ability to create numerous virtual and private networks on top of a single physical network. This would be a big advantage, as multiple customers could then share a single network without risk to their applications and data. However, for this to work, the entire network would need to be SDN enabled.

When one considers the concept of Network Functions Virtualization (NFV), this path can be traversed more quickly. With NFV-ready networks, carriers can create services in software rather than on dedicated hardware, essentially allowing virtualized servers to host these new services. This enables business transformation by moving away from multiple isolated networks towards an open ecosystem, a set of virtualized network functions and, most importantly, an orchestration layer. This will allow businesses to move with agility in the face of the explosion in device numbers.

Reference:

8: Dev Central: SDN and IoT article

https://devcentral.f5.com/articles/sdn-is-important-to-iot-if-it-covers-the-entire-network

Considerations of Change: An Intro to Networking in IoT

One of the major consequences of Moore’s Law for silicon is that pretty much any device can now have a reasonable level of computing power and internet connectivity. Because of this, the number of internet-enabled devices is increasing, causing a huge influx of IoT traffic, and it is predicted that WAN bandwidth will need to be increased.

When one considers the types of data that will be generated, it becomes clear that both present challenges. George Crump, an analyst with Storage Switzerland, points this out [7]: “First, there is large-file data, such as images and videos captured from smartphones and other devices. This data type is typically accessed sequentially,” explains Crump. “The second data type is very small, for example, log-file data captured from sensors. These sensors, while small in size, can create billions of files that must be accessed randomly.”

From this, it is clear that data centers will need to handle both types of data, and the storage and processing requirements that come with them.

For decades, the network was considered the plumbing of a company’s IT solutions, a somewhat dumber element of the design. With the advent of IoT, it is clear that the networking element of the IoT ecosystem is lagging slightly behind, which is a concern: IoT is very much a network-centric technology, and the network in essence forms the web by which the sensors communicate with the host and with each other. There are a number of ways for these devices to be networked. Some devices can be connected directly to the internet using standard Ethernet or WiFi, which are TCP/IP based. Other wireless technologies, not all of which carry TCP/IP natively, require some sort of intelligent gateway to convert their traffic onto standard Ethernet or WiFi. These include, but are not limited to, ZigBee, Z-Wave, Bluetooth and cellular.
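As an illustrative sketch of the gateway’s bridging role (the radio frame layout, host address and port are assumptions), the gateway might decode a raw frame from a non-IP sensor and republish it as JSON over the plant’s standard TCP/IP network.

```python
import json, socket, struct

def decode_frame(frame: bytes) -> dict:
    """Assumed radio frame layout: 2-byte node id, 2-byte centi-degrees, 1-byte battery %."""
    node, centi_deg, battery = struct.unpack(">HhB", frame)
    return {"node": node, "temp_c": centi_deg / 100.0, "battery_pct": battery}

def forward_to_ip_network(reading: dict, host: str = "192.0.2.10", port: int = 9000) -> None:
    """Bridge the reading onto the standard TCP/IP plant network as one JSON line."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall((json.dumps(reading) + "\n").encode())

frame = struct.pack(">HhB", 7, 2215, 88)        # node 7, 22.15 C, 88% battery
print(decode_frame(frame))                      # {'node': 7, 'temp_c': 22.15, 'battery_pct': 88}
# forward_to_ip_network(decode_frame(frame))    # would require a listener at 192.0.2.10:9000
```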

Evolution towards IPv6 

Due to the advancement of object gateways, the first two stages of the IoT roadmap will sit on current infrastructure and protocols. Once the volume of devices and data increases and true IoT is in motion, the IPv6 protocol will be required, offering a practically unlimited supply of IP addresses.

The main challenge is that standard IPv6 packets are too large for low-power wireless links. The 6LoWPAN standard, namely RFC 4944, reduces the packet size through a number of changes, including the compression of IP headers and the introduction of a fragmentation mechanism that enables the reassembly of IP packets that do not fit into an IEEE 802.15.4 frame. Lastly, routing protocols for lossy, low-power networks were required; new protocols were developed by the Internet Engineering Task Force (IETF) that provide basic routing in low-power lossy networks.
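A toy sketch of the fragmentation idea follows; it is not RFC 4944’s actual header format, just an illustration of splitting a packet into radio-sized fragments and reassembling them at the far side.

```python
MAX_FRAGMENT = 80   # illustrative payload budget per link-layer frame, in bytes

def fragment(packet: bytes, tag: int):
    """Split one large packet into (tag, offset, chunk) fragments."""
    return [(tag, off, packet[off:off + MAX_FRAGMENT])
            for off in range(0, len(packet), MAX_FRAGMENT)]

def reassemble(fragments):
    """Rebuild the original packet; fragments may arrive out of order."""
    ordered = sorted(fragments, key=lambda f: f[1])
    return b"".join(chunk for _, _, chunk in ordered)

packet = bytes(range(256)) * 2                     # a 512-byte stand-in for an IPv6 packet
frags = fragment(packet, tag=1)
assert reassemble(reversed(frags)) == packet       # survives reordering
print(len(frags), "fragments of at most", MAX_FRAGMENT, "bytes")
```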

In my next blog post, I will continue to write about network enablement requirements, looking at why IoT needs Software Defined Networking (SDN).

Reference:

7: Orange Business: Can your business handle IoT

http://www.orange-business.com/en/blogs/connecting-technology/data-centers-virtualisation/can-your-data-center-handle-the-internet-of-things

IoT and Classical Business Models

Many companies, especially in the Information Technology (IT) sector, are aware of the IoT explosion, and one of the biggest challenges facing any company is how to prepare for the change that will result from the increased business impact IoT will present.

With figures in the trillions quoted for the IoT market, how do companies ensure they get a slice of the pie? If they do not currently sit within the relevant market segment, analysis will be required to determine whether IoT is an opportunity or a threat to their business as a whole.

In 2014, IDC predicted [3] that IoT will actually overtake classical Information and Communication Technology (ICT) over time: it forecasts that IoT will grow 12% year on year, whilst classical ICT will grow just 4%. Figure 3 below illustrates this.

Figure 3: IDC Prediction of IoT vs ICT [3]

Considering that most businesses consistently monitor the bottom line, it is not only about the opportunities IoT will present, but also about how it will change the way we work. With limitless numbers of sensors monitoring processes, improving business energy efficiency and enabling new ways of working in teams, business will need to be more open to change and, more dauntingly, open to the elements of a “big brother” type scenario.

There are trends that are ensuring an evolution of business practice as we know it. Normally a new technology platform impacts a single strand of the business, the internet being the notable exception. IoT, however, has the potential to become an entire business ecosystem, in which creating and capturing business value will be paramount. This is not a straightforward proposition: barriers include IoT’s current early position in its lifecycle and the sheer volume and variety of devices to be considered. An ecosystem, by its nature, implies a seamless collection of micro-systems working together in a self-sustaining fashion, and what this will mean for IoT is still not clear.

Consider the classical technology adoption lifecycle. There are five types of innovation adopters, the first being the innovators themselves; the list is completed, in sequence, by early adopters, early majority, late majority and laggards. With the current immaturity of IoT, and the lack of clarity in the various emerging technologies, the challenge for business is to advance from the early adopters to the early majority, so the business needs to be able to scale. Early adopters are less fussy when it comes to product design, but as the number of adopters increases later in the lifecycle, the early majority will want polished product offerings with appropriate services.

With IoT still in its relative infancy, it is appropriate to compare it to the early stages of the Internet. When we look at the recent business ecosystems that have been spun out of the Internet for EMC, such as Pivotal Cloud Foundry, one can postulate about future ecosystem opportunities for EMC across the IoT spectrum.

Another important consideration for companies is the skill-sets and people required to drive their Big Data strategy as their IoT ecosystem grows. A key tenet of this will be the data itself. In the February IT@Cork Tech Talk by my EMC colleague Steve Todd, and more recently in his blog on data value (value was something I had never associated with data until that talk), Steve spoke about the importance for major companies of taking a more structured approach to the employees involved in data set discovery, identification and migration (the Data Architect), and of appointing a Chief Data Officer to represent the company from a data perspective. Interestingly, my own role in EMC changed last year to that of a Data Architect, so I could relate to this first hand. When faced with a business challenge in big data, five steps can be critical to success:

1: Demystify and then map the current devices, tools, processes and trajectory of data across the business unit or company (AS-IS Diagram)

2: Scour the company and external sources for any technologies that can enable a more scalable and clearer approach

3: Look to centralize data storage, to allow the company to focus on being agile and scalable, and also remove duplicate data (concept of a Business Data Lake)

4: Develop an ingestion framework to ensure the data lake has a sufficient landing platform for data (see the sketch after this list).

5: Build the analytics platform that is pointed at the centralized “Business Data Lake” to meet the existing and future needs of the business.
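As a minimal sketch of what the ingestion landing step (step 4) might look like, with a hypothetical directory layout, source names and tags rather than any specific product: each incoming data set is landed in the lake together with a small metadata sidecar carrying the descriptive tags discussed earlier, so the analytics platform in step 5 can find it.

```python
import json, shutil, time
from pathlib import Path

LAKE_ROOT = Path("/data/business-data-lake")     # hypothetical landing zone

def ingest(source_file: str, source_system: str, tags: list[str]) -> Path:
    """Land a raw file in the data lake alongside a small metadata sidecar."""
    landed_dir = LAKE_ROOT / source_system / time.strftime("%Y/%m/%d")
    landed_dir.mkdir(parents=True, exist_ok=True)
    target = landed_dir / Path(source_file).name
    shutil.copy2(source_file, target)            # keep the raw data immutable
    sidecar = {"source": source_system, "tags": tags,
               "landed_at": time.strftime("%Y-%m-%dT%H:%M:%S")}
    Path(str(target) + ".meta.json").write_text(json.dumps(sidecar))
    return target

# Example (hypothetical file and source system):
# ingest("smt_line_calibration.csv", source_system="mes", tags=["manufacturing", "calibration"])
```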

When we apply this to IoT, we start to see that every company, no matter how small, will begin to generate huge data sets, and a new skill-set will be needed at companies that never previously had it, to ensure they can gain as much insight as possible from those data sets. Sure, there are companies that can provide these solutions, but realistically the future state will surely be to have these as core skills, just as “internet skills” once appeared on resumes.

It is proposed here that key stakeholders across multinationals can overcome these challenges and design practical IoT business models if they take an ecosystem-style approach, instead of looking at the modular needs of individual business units. This will give the business a high-level perspective of where IoT can bring value to its offerings.

Reference:

3: Digital Universe Article

http://www.emc.com/leadership/digital-universe/2014iview/internet-of-things.htm

An IoT Data Flood. Are we ready? (Intro)

A flood of 50 billion pieces. That is the predicted number of internet-enabled devices that will span our globe by 2020 to create the expanding Internet of Things (IoT). And it is a conservative estimate when you consider other enabling technologies, namely Near Field Communication (NFC) and Radio Frequency Identification (RFID). The speed of internet connectivity is likely to hyperscale in the future, making Moore’s Law, which we successfully navigated, seem tortoise-like.

So what will all this mean? Crop fields will be smart. Crime will diminish. Stagnant business models will become fluid. Heck, we may even predict the weather. But one thing will remain: the need to collect, store and analyze all this data. The coming years will see dramatic and disruptive innovation in the classical data center model as we know it. It will not be practical for all these remotely distributed devices to transfer their data to centralized data centers, and while data center consolidation has accelerated in recent years, that trend does not fit well with IoT. It is proposed in this article that a single person’s life and home of tomorrow will generate more data than the industrial plant of today.

Whilst it would be difficult to estimate accurately the impact of the Internet of Things (IoT) over the coming decade, one thing is apparent: it is going to be a game changer. With the growing number of devices, the ability to connect, communicate with and remotely manage these automated devices is becoming an enabler, from the parking lot to the factory floor to the homes we live in.

Figure 1: Explosion Potential of IoT [1]

A critical enabler for IoT longer term is the concept of smart cities, where both human centric wearables and machine sensors will work together to make the cities of tomorrow more efficient, secure and safe. By 2050, it is predicted that two thirds of the world population will live in cities. This migration naturally represents great challenges especially in healthcare, security and energy use.

Sogeti [2], a global collection of over 120 technologists, makes an excellent association between smart cities, SMACT (Social, Mobile, Analytics, Cloud and Things) and the concept of a platform. The City as a Platform is twofold: it is the infrastructural capacity plus the human dimension, the empowerment of behavior via data and applications. It shows that the digital architecture of a city is beginning to look like a platform with various abstraction layers that support one another. There are 11 scenarios in which a city can become smarter: waste, healthcare, grids, retailing, supply chains, tourism, e-government, smart meters, food, traffic and logistics management.

Figure 2 : Smart City as a Platform Illustration [2]

In Figure 2 above, the top layer shows the activities of everyday life, with citizens, students, consumers and commuters. Below this is an abstraction layer containing technology such as Application Programming Interfaces (APIs). Streets become smart if we can link camera systems with facial recognition technology. As you traverse the layers, you will notice the common elements of any platform, with communication and collaboration between the layers. These are already in action: apps like Hailo/Uber and Airbnb show that smartness in applications can make cities more efficient in regard to transport and space respectively.

Many people own internet connected devices, such as their smart phone, laptop or smart TV, but this is the beginning of an age where technical advances and cost reductions mean elements such as baby monitors, fridges, temperature sensors, in-home heating and lighting will all be connected. The list of devices is growing all the time. But if we stop and think about what these devices mean for the classical data center model, it soon becomes apparent that this deluge or flood of data will impact data storage, processing and analytic platforms that we use today.

The strain is evident already, and that is with the devices that we control (laptops and phones generating data by surfing the net, for example), still just a couple of devices per person. Imagine if that number increases to over 50, and imagine the data load and bandwidth implications when those devices are sending data regularly. Then consider an entire city of people with the same level of internet-connected devices, leading to billions of devices generating vast quantities of data which must be processed and stored. Understanding this impact is important if you are to ensure that your infrastructure is correctly designed to support the IoT strategy your organization will need to remain competitive in the coming decades.

In my next blog post, I will explore the impact of IoT on Classical Business Models. Stay tuned!

References:

1: Connectivist Chart on IoT Growth

http://www.theconnectivist.com/2014/05/infographic-the-growth-of-the-internet-of-things/

2: Sogeti Labs: City as a Platform Article

http://labs.sogeti.com/internet-things-cities-platform/