it@Cork European Technology Summit 2015 – a WOW event!

I wanted to change direction slightly and give an update on an event I had the privilege of being involved with this week, the it@Cork European Technology Summit. The event was held at Cork City Hall on Wednesday May 5th, with a full-day technology summit followed by a black tie dinner and 3D wearables fashion show.

An epic journey over the past few months, with way more ups than downs resulted in…

1 Day – 4 Sections – 20 Speakers – 4 Chair Speakers – 400+ Day Attendees – #1 Trending on Twitter – 9 Amazing Artisan Food Stalls – Lots of Sponsors – 200+ Night Attendees – 2 Fashion Designers – 1 Model Agency – 10 Models – 2 Fire Dancers – 4 3D Printed Bow Ties!

So how did I arrive there? Last year, Gillian Bergin from EMC asked me to get involved with it@Cork as part of the Tech Talk committee. I’m delighted she did, as over the past few months I got to partake in and help organise some excellent tech talks from a variety of people, including my fellow technologist, Mr Steve Todd of EMC. The tech talk series is just one of many successful strands of it@Cork, holding six high-end, rockstar speakers/panels per year. The series is fully booked until 2016, but if you are a “rockstar” speaker interested in speaking, please contact us directly. From this, James O’Connell of VMware, who passed the Tech Talk committee chair over to Barry O’Connell, took on the chair of the Summit organising committee. James, together with myself and Paddy O’Connell of Berkley Group (known collectively now as the Macroom or Muskerry Mafia 🙂 ), assisted Sarah Walsh of it@Cork in organising the day summit. The night summit was excellently organised by Marnie O’Leary Daly of VMware.

The event was kicked off by James, and then Ronan Murphy, chairman of the board of it@Cork and CEO of Smarttech, gave an address in which he spoke about how Cork needs a cluster manager to help drive more employment in the region. More from Ronan here in the Examiner. Donal Cahalane, from Teamwork.com, gave an insightful talk on how he sees the industry progressing, with some excellent advice for everyone from startups through to multinationals.


The four sections throughout the day offered a balanced mix of raw technology (Cloud: Challenge the Fear; Internet of Everything) along with Digital Marketing and a Tech Talent/Diversity panel. I found this to work quite well, as it ensured the audience got a variety of speakers.

The cloud session on “challenging the fear” was an excellent one to start with, as it had a mix of SMEs from companies such as Kingspan (John Shaw), Trend Micro (Simon Walsh) and Barricade (David Coallier), but also had representation from the legal profession in the form of Michael Valley (Barrister) and Noel Doherty (Solicitor), who spoke at length on cloud governance. This session was chaired by Anton Savage of The Communications Clinic, who hosted a panel discussion with all five presenters at the end.


All of the sections were split by networking opportunities in the exhibition halls, where companies from the region presented their organisations, and some even demonstrated their wares. The atmosphere was great, with lots of chatter, tweeting and drinking of coffee! 😀


The second section was a panel session on Tech Talent, with Paddy O’Connell from Berkley in the chair, and the facilitators were Meghan M Biro, founder and CEO of TalentCulture, and Kevin Grossman, who co-founded and co-hosts the TalentCulture #TChat Show with Meghan. They later presented their #TChat show live from the Clarion Hotel Cork. It was awesome!

Such variety (no pun intended!) in the panel, with David Parry Jones, VP UKI at VMware, and Noelle Burke, Head of HR at Microsoft Ireland, representing industry, Michael Loftus, Head of the Faculty of Engineering and Science at CIT, representing academia, and the hugely impressive student Ciara Judge, one of the Kinsale winners of the 2013 Google Science Award. Everyone inspired in their own way, and the dynamic at lunchtime was one of motivation, hope and leadership.


Having started my own personal digital marketing brand last year, and learning by making mistakes, I was exceptionally excited by our third section – Digital Marketing. Again, Anton did an incredible job of asking the right questions, and the audience listened effortlessly. To listen to experts such as Meghan, Antonio Santos, Niall Harbison and Raluca Saceanu was a privilege, and I also got the opportunity to speak with them directly (as did many others). This was true of all the speakers throughout the day. I believe a huge number of people got lots of what I call “advice snippets” that they can take away and use to grow their own brand.


The last session was on an area close to my heart, the Internet of Everything (IoE), and I had the privilege of chairing the session. We had speakers from Climote (Derek Roddy), my future employer Tyco (Craig Trivelpiece), Salesforce (Carl Dempsey), Dell (Marc Flanagan) and Xanadu (David Mills). All these companies are at different stages of their IoE journey, but the message was consistent: IoE is going to make a huge impact on our smart futures. I really liked how Craig spoke of “if you want to improve something, measure it” and how Tyco are looking at predictive maintenance and pushing intelligence/insight back out to the devices. Derek showed how Climote is changing how we live, and David did the same in relation to sport. Marc gave an excellent account of Dell’s practical approach to IoT, showing the capabilities needed for IoE projects. Carl got me really excited about Salesforce’s plans in the IoE space. The session closed out the event well, and the numbers in attendance stayed consistent.

Having attended a huge number of tech events over the years, it was great to see, once again, year-on-year growth of Munster’s premier technology summit. The atmosphere was electric all day, both locally and on Twitter. The tweet wall was a big success, and we expect that next year’s event will be bigger and better again.


The black tie dinner was also a huge success, with the Millennium Hall in City Hall packed to capacity. Marnie O’Leary Daly, along with Emer from Lockdown model agency, put on an amazing dinner (superb catering by Brooks) and fashion show, with 3D wearables fashion provided by Aoibheann Daly from LoveandRobots and Rachael Garrett from Limerick School of Art and Design (@LSAD). Special mention to FabLab also for helping Rachael get her garments ready. It really was a spectacular evening. The Clarion Hotel was also hugely supportive of the night element. (Photos to follow!) Emer will also blog on the night event fashion soon and do a much better job than me!

it@Cork European Technology Summit 2016. Watch this space.

If you are interested in getting involved in 2016, please contact Sarah Walsh at it@Cork.

Case Study: IoT Technology Platform – ThingWorx [10]

In my previous blog, I mentioned some platform design considerations at the outset. In this blog, I discuss one such platform that has gained significant traction in the industry in recent times.

About ThingWorx [10]

ThingWorx is one of the first software platforms designed to build and run the applications of the connected IoT world. ThingWorx reduces the cost, time, and risk required to build innovative Machine-to-Machine (M2M) and Internet of Things (IoT) applications.

The ThingWorx platform provides a complete application design, runtime, and intelligence environment with the following features:

  • Modern and Complete Platform
  • Mashup People, Systems & Machines
  • Deploy 10X Faster with Model-based Development
  • Deploy How You Like
  • Evolve & Grow Your Application Over Time

What ThingWorx did that was really clever was to create a modelling environment, based on a graph database, that keeps track of thousands of devices communicating with other devices and applications.
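As a rough illustration of that idea (not ThingWorx’s actual API, just a generic sketch of modelling devices and their communication paths as a graph, with invented names), the following Python snippet registers a handful of hypothetical “things” and queries who talks to whom:

```python
# Minimal sketch of a "graph of things": nodes are devices/applications,
# edges are communication paths. Names and properties are illustrative only.
from collections import defaultdict

class ThingGraph:
    def __init__(self):
        self.properties = {}                 # thing name -> metadata
        self.edges = defaultdict(set)        # thing name -> things it talks to

    def add_thing(self, name, **metadata):
        self.properties[name] = metadata

    def connect(self, source, target):
        self.edges[source].add(target)

    def talks_to(self, name):
        return sorted(self.edges[name])

if __name__ == "__main__":
    graph = ThingGraph()
    graph.add_thing("pump-42", kind="device", site="cork")
    graph.add_thing("edge-gateway-1", kind="gateway", site="cork")
    graph.add_thing("maintenance-app", kind="application")
    graph.connect("pump-42", "edge-gateway-1")
    graph.connect("edge-gateway-1", "maintenance-app")
    print(graph.talks_to("edge-gateway-1"))   # ['maintenance-app']
```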

“There’s nothing new about gathering and using data to make something better. What is new, and complex, is getting these things that are now web-enabled to take better advantage of the IoT. This requires application developers to rethink how they collect, analyze, manipulate and interact with information,” said Russ Fadel, CEO, ThingWorx [9]. “ThingWorx is the first software platform on the market designed to build and run applications in the connected IoT world and offers a fully integrated and pre-architected solution that covers connectivity, event processing, analytics, storage and presentation of any kind of M2M and IoT data. Our goal is to provide customers with instant insight into collected data from these smart, connected things so they can be proactive and address issues before they happen in a smarter way than previously able.” [10]

Figure 7: ThingWorx Architecture [10]

Features [10]

ThingWorx Composer™

ThingWorx Composer is an end-to-end application modeling environment designed to help you easily build the unique applications of today’s connected world. Composer makes it easy to model the Things, Business Logic, Visualization, Data Storage, Collaboration, and Security required for a connected application.

Codeless Mashup Builder

ThingWorx’s “drag and drop” Mashup Builder empowers developers and business users to rapidly create rich, interactive applications, real-time dashboards, collaborative workspaces, and mobile interfaces without the need for coding. This next-generation application builder reduces development time and produces high-quality, scalable connected applications, allowing companies to accelerate the pace at which they deliver value-add solutions and gain market share against new and existing competitors.

Event-Driven Execution and “3D” Storage

ThingWorx’s event-driven execution engine and 3-Dimensional storage allow companies to make business sense of the massive amounts of data from their people, systems, and connected “Things” – making the data useful and actionable. The platform supports scale requirements for millions of devices, and provides the connectivity, storage, analysis, execution, and collaboration capabilities required for applications in today’s connected world. It also features a data collection engine that provides unified, semantic storage for time-series, structured, and social data at rates 10X faster than traditional relational databases.
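To make the event-driven idea concrete, here is a minimal, generic sketch (my own illustration, not ThingWorx internals) of an execution engine that appends each reading to a time-series log and immediately fires any rules subscribed to that property; the thing name and alert threshold are made up:

```python
# Generic event-driven sketch: readings are appended to a time-series store
# and any handlers subscribed to that property are executed immediately.
import time
from collections import defaultdict

class EventEngine:
    def __init__(self):
        self.timeseries = defaultdict(list)   # (thing, property) -> [(ts, value)]
        self.handlers = defaultdict(list)     # (thing, property) -> [callables]

    def subscribe(self, thing, prop, handler):
        self.handlers[(thing, prop)].append(handler)

    def ingest(self, thing, prop, value):
        self.timeseries[(thing, prop)].append((time.time(), value))
        for handler in self.handlers[(thing, prop)]:
            handler(thing, prop, value)

def high_temp_alert(thing, prop, value):
    if value > 80.0:                          # illustrative threshold
        print(f"ALERT: {thing}.{prop} = {value}")

if __name__ == "__main__":
    engine = EventEngine()
    engine.subscribe("chiller-7", "temperature", high_temp_alert)
    for reading in (72.5, 79.9, 84.2):
        engine.ingest("chiller-7", "temperature", reading)
```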

Search-based Intelligence

ThingWorx SQUEAL™ (Search, Query, and Analysis) brings Search to the world of connected devices and distributed data. With SQUEAL’s interactive search capabilities, users can now correlate data that delivers answers to key business questions. Pertinent and related collaboration data, line-of-business system records, and equipment data get returned in a single search, speeding problem resolution and enabling innovation.

Collaboration

ThingWorx dynamically and virtually brings together people, systems, and connected equipment, and utilizes live collaboration sessions that help individuals or teams solve problems faster. The ThingWorx data store becomes the basis of context-aware collaboration and interaction among the system’s users, further enhancing its value. Additionally, the tribal knowledge exposed during the process is automatically captured and indexed for use in future troubleshooting activities.

End of Case Study

References 

10: ThingWorx: About ThingWorx

http://www.thingworx.com/

Platform Architecture Pre-Considerations for IoT

Apart from the sheer volume of data generated by IoT devices, there is also a huge number of different data customer requirements, both known and unknown, that will need to be considered. In this regard, the platform technology will need to be agile enough to meet this variation. How will it scale both horizontally and vertically to ensure sustainability? I started to think of profiling requirements, and looking to give a personality to each IoT customer type, so that the platform can morph and adjust itself based not only on the inputs (data type, frequency, format, lifetime), but also on the outputs it needs to provide.
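A minimal sketch of what such a profile might look like (the field names, tiers and threshold are my own assumptions, purely illustrative of the inputs-and-outputs idea):

```python
# Illustrative customer/device profile: inputs on one side, required outputs
# on the other, with a trivial rule that picks a processing tier.
from dataclasses import dataclass, field

@dataclass
class IoTProfile:
    customer_type: str            # e.g. "smart-home", "industrial"
    data_type: str                # e.g. "time-series", "image"
    frequency_hz: float           # how often readings arrive
    data_format: str              # e.g. "json", "binary"
    lifetime_days: int            # how long the data must be retained
    outputs: list = field(default_factory=list)  # e.g. ["dashboard", "alerts"]

    def processing_tier(self) -> str:
        # Assumption: frequent data that drives alerts needs the real-time tier.
        if self.frequency_hz >= 1.0 and "alerts" in self.outputs:
            return "real-time"
        return "batch"

profile = IoTProfile("industrial", "time-series", 10.0, "json", 365,
                     outputs=["alerts", "dashboard"])
print(profile.processing_tier())   # real-time
```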

Data latency will also be a requirement that any platform will need to firstly understand, and then address, depending on the application and customer requirements. In an interesting discussion today in Silicon Valley with Jeff Davis (my original hiring manager in EMC, and now senior director of the xGMO group looking at operations cloud, analytics and infrastructure services), he mentioned that, having worked at a previous company in the sensor business, latency represented a huge challenge, especially when the amount of data grew exponentially. We chatted more and more about how today’s consumer wants their device/technology interactions to be instant. How long will people be willing to wait for smart light bulbs or switches? What if my devices are distributed? More importantly, Jeff outlined a key question: “How much are consumers willing to pay for the added services provided by adding ‘smarts’ to standard everyday sensors?” This is an “understand the market” question, and should be a consideration for anyone looking at building an IoT platform.

When one starts to consider that most applications in the IoT space might require more than one industry working together, cross-collaboration is key to making it work. Consider some of the taxi apps in use currently: the taxi company provides the car locations, the application needs to offer information on locations, banking is used to pay for the journey from your account, and perhaps an advertisement is shown on your receipt. If a suitable arrangement is not formed between the various IT companies involved, it becomes too easy for the “blame game” to ruin the user’s experience of the application when something goes wrong.

Central to satisfying both the varying requirements of the customers and latency management will be the concept of a customer or business data lake, powered by Hadoop or Spark technology, which will form the primary storage and processing layer in the data center. There is also an option to look at tiering to help address the variation in requirements for the platform, with the possibility of sending the “big hitting data”, which brings the most value in close to real time, to an in-memory database to provide fast, cached, insightful analytics (a minimal tiering sketch follows the list below). In a later blog post, I will elaborate greatly on this paragraph, so stay tuned. If the same dataset can be used by multiple applications, in a multi-tenant schema, then there will be clear orchestration challenges in ensuring that this data can be processed in real time. Other features of any data architecture for IoT could also include:

  • Multiple Data Format Support
  • Real Time Processing
  • High Volume Data Transfer
  • Geographically Agnostic
  • Data Lake Archival and Snipping
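As promised above, here is a minimal tiering sketch. My assumptions: each reading carries a “value” score that decides the path, and the in-memory store and data-lake queue are simple stand-ins for something like an in-memory database and a Hadoop/Spark ingestion pipeline.

```python
# Illustrative two-tier router: high-value, near-real-time readings go to a
# hot in-memory store; everything is queued for batch loading into the lake.
from collections import deque

HOT_STORE = {}            # stand-in for an in-memory database / cache
LAKE_QUEUE = deque()      # stand-in for the data lake ingestion pipeline

def route(reading: dict, value_threshold: float = 0.8) -> str:
    """Route a single reading based on an assumed 'value' score in [0, 1]."""
    LAKE_QUEUE.append(reading)                      # everything lands in the lake
    if reading.get("value_score", 0.0) >= value_threshold:
        HOT_STORE[reading["sensor_id"]] = reading   # latest hot reading per sensor
        return "hot+lake"
    return "lake"

print(route({"sensor_id": "meter-1", "kwh": 0.42, "value_score": 0.9}))  # hot+lake
print(route({"sensor_id": "meter-2", "kwh": 0.11, "value_score": 0.2}))  # lake
```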

As with all technology, IoT will evolve, which means that we will build on top of previous technologies, and new technologies will add to the ecosystem. The enterprise data warehouse will continue to play an important role, but a series of technology platforms will be necessary. While numerous platforms have been and will be created, one such platform, ThingWorx, is the subject of a case study in my next blog.

IoT Impact on the Manufacturing Industry (Part 2)

Continuing on from my last blog post, another example of IoT use in manufacturing would be for asset management to distribute work orders and configurations to the tools at the different stages of production. And vice versa: calibration information can be fed back to the Enterprise Resource Planning (ERP) system to associate it with the bill of materials (BOM). Big data and NoSQL technologies are enablers in this regard, as they allow for the management of huge volumes of heterogeneous, multi-structured data about the production process, from the data types discussed to images from AOI (Automated Optical Inspection) systems and other production modules. With recalls a concern in global manufacturing, this can be an ally in the fight to keep manufacturing costs down.
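A quick sketch of why a document-style (NoSQL) model suits this: each production record below mixes calibration values, work-order identifiers and an AOI image reference in one flexible document, keyed by serial number. All field names and values are hypothetical.

```python
# Illustrative heterogeneous production records, as they might be stored in a
# document database; here a plain dict keyed by unit serial number.
production_records = {
    "SN-000123": {
        "work_order": "WO-7781",
        "bom_revision": "B.3",
        "calibration": {"placement_offset_um": 12.4, "nozzle_pressure_kpa": 55.0},
        "aoi_image": "s3://factory-aoi/SN-000123/top.png",   # hypothetical path
        "test_results": [{"station": "ICT", "passed": True}],
    },
}

def recall_candidates(records, bom_revision):
    """Find serial numbers built against a suspect BOM revision."""
    return [sn for sn, rec in records.items()
            if rec.get("bom_revision") == bom_revision]

print(recall_candidates(production_records, "B.3"))   # ['SN-000123']
```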

Another area where IoT can have an impact is in intelligent edge devices and their use in improving supply chain optimization and the modularity of manufacturing. Consider surface mount technology (SMT), where there are so many moving parts, calibration steps and types of technology used in the placement and verification of board-level components. IoT sensors could be utilized to centralize SMT line asset management and to read calibration information via the factory WLAN. Asset management can form the link between the SMT tools and the ERP (Enterprise Resource Planning) and MES (Manufacturing Execution Systems) that oversee the manufacturing process.

A challenge that presents itself to the manufacturing industry is the ageing workforce, which means that anything that speeds up the manufacturing process is critical. The advancement of mobile technology is a key enabler in ensuring that passing information to the shop floor becomes quicker, improving the response time, visibility, and accessibility of operations. Recent advances in wearables will also enhance visibility on the shop floor.

Building Blocks for IoT in Manufacturing

Business owners need to look at four technology elements that provide the foundation for smart manufacturing. These include (but are not limited to):

  • Security: IT security is a major obstacle to setting up smart factories. Operations managers need to make sure that necessary safeguards are built into the solution including security procedures such as physical building security, hardware encryption and network security for data in transit. Security and networking solutions must also be engineered to withstand harsh environmental conditions, such as moisture and temperature, that aren’t present in typical networks. Identity and authentication structures will also need to be updated to support such “things” as well as people.
  • More Advanced Networking: Smarter manufacturing environments need a standardized IP-centric network that will enable all the devices/sensors in a plant to communicate to enterprise business systems. Cisco research states that only 4 percent of the devices on the manufacturing floor are connected to a network. A standard IP network also makes it easier to connect and collaborate with suppliers and customers to improve supply chain visibility. Manufacturers need robust networks that can cope with Radio Frequency (RF) challenges in the plant, harsher environmental conditions and need stability for transmission of alarms and real-time data processing.
  • Big Data Analytics: While manufacturers have been generating big data for many years, companies have had limited ability to store, analyze and effectively use all the data that was available to them, especially in real time. New big data processing tools are enabling real-time data stream analysis that can provide dramatic improvements in real-time problem solving and cost avoidance (see the sketch after this list). Big data and analytics will be the foundation for areas such as forecasting, proactive maintenance and automation.
  • Engineering Software Systems: Today’s IoT data is different from the data we use to operate our systems. It requires collecting a wide range of data from a variety of sensors. These software systems and models must translate information from the physical world into actionable insight that can be used by humans and machines. Toyota is using Rockwell’s software for real-time error correction in the plant, and has minimized rework and scrap rates in its Alabama plant, resulting in an annual cost saving of $550,000 [3].
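As referenced in the Big Data Analytics item above, here is a minimal sketch of real-time stream analysis for proactive maintenance: a rolling average over vibration readings with an alert threshold. The sensor values, window size and threshold are invented for illustration.

```python
# Rolling-average monitor over a live stream of readings; raises an alert when
# the recent average drifts past a maintenance threshold.
from collections import deque

class ProactiveMonitor:
    def __init__(self, window: int = 20, threshold: float = 7.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, reading: float) -> bool:
        """Add a reading; return True if the rolling average breaches the threshold."""
        self.window.append(reading)
        return sum(self.window) / len(self.window) > self.threshold

monitor = ProactiveMonitor(window=5, threshold=7.0)
for vibration_mm_s in [5.1, 5.3, 6.8, 7.9, 8.4, 8.8]:   # hypothetical vibration data
    if monitor.observe(vibration_mm_s):
        print(f"Schedule maintenance: rolling average high at {vibration_mm_s}")
```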

Building blocks for end-to-end infrastructure enabling manufacturing intelligence from the factory floor to the data center (Intel) [4]
With IoT, IP networks and analytics, manufacturers can become more efficient, improve worker safety and offer exciting new business models. IoT will help manufacturers improve resource efficiency, safety and return on assets. Manufacturers that master this new dynamic will have a variety of new opportunities for revenue growth and cost savings.

References

3: How IoT will help manufacturing

http://www.industryweek.com/blog/how-will-internet-things-help-manufacturing

4: Industrial Optimization IoT (Intel)

http://www.intel.ie/content/dam/www/public/us/en/documents/white-papers/industrial-optimizing-manufacturing-with-iot-paper.pdf

IoT Impact on the Manufacturing Industry (Part 1)

“Industry 4.0” and “Smart Factory” are some of the terms used to describe the technological and social revolution that promises to change the current industrial landscape. Industry 1.0 was the invention of mechanical assistance, Industry 2.0 was mass production, pioneered by Henry Ford, Industry 3.0 brought electronics and control systems to the shop floor, and Industry 4.0 is peer-to-peer communication between products, systems and machines. It is clear that IoT will have a different impact depending on the application and/or industry; one that is of particular interest, given the emphasis on process, is manufacturing. Compared to other realms such as retail, with its intangible ways, manufacturing is about physical objects and how we can bring them to the consumer in a more efficient and automated way. The manufacturing landscape is ever-changing, with automation through robotics the most recent enabler.

Challenges and Possibilities of IoT and Manufacturing [1]

Gartner analyst Simon Jacobson sees five immediate challenges and possibilities posed by IoT for the manufacturing industry [1].

1. CIOs and manufacturing leads will have to move even more rapidly

Jacobson says manufacturers have moved heavily toward individualization and mass customization as part of the luxury of connected products. But in order to enable that, you have to maintain alignment with supply management, logistics functions and partners to make sure all service levels are maintained: “I have to have knowledge of my processes and optimization of my processes at a hyper level, not just simply understanding at week’s end or at the end of the shift where I need to make adjustments and improve,” Jacobson said.

2. Security must be reimagined

A connected enterprise means that you can no longer simply physically secure the facility but should blend approaches of mobile and cloud-based architectures with industrial, control and automation, ensuring information is being managed. Jacobson says the challenge will be to merge the skills of engineers and process control teams with IT and more importantly, unify their disparate approaches to security.

3. IoT will create more visibility in process performance

There’s always been a form of automation and control in manufacturing, but implementing new business applications powered by IoT will allow you to connect devices to the factory network and know tolerances: “Being able to connect those dots and derive contexts of how processes are performing is absolutely going to be where the return on investment is coming from,” Jacobson said.

4. Predictive maintenance can generate revenue for OEMs

Asset performance management is of high value today. This is the ability to drive availability, minimize costs and reduce operational risks by capturing and analyzing data. Original Equipment Manufacturers (OEMs) have already started creating revenue by using IoT-enabled tools like predictive maintenance in order to guarantee uptime, outcomes and certain levels of performance for the customer: “When you guarantee these kinds of outcomes to the customers, you have to look at this from two different perspectives, how I monetize this but also how my customer monetizes this,” Jacobson said.

5. Production will play a new role in the manufacturing value chain

The boundaries between the physical and digital worlds are blurring. Chief Information Officers (CIOs) and manufacturing strategists can use the IoT, big data and cloud to redefine the role production plays in the manufacturing value chain. It no longer has to be restricted to being a cost center, and this has all to do with the new ability to not just accelerate but innovate on the factory floor. It’s the CIO’s challenge to keep pace with these new competitive changes.

Figure 10: Real Time Intelligence on the Shop Floor [2]
In my next blog post, I will continue this discussion on IoT and Manufacturing, giving further use cases, and outlining the building blocks for IoT in Manufacturing.

References:

1: Gartner Best Practices for IoT in Manufacturing

https://www.gartner.com/doc/2899318?ref=AnalystProfile

2: Building Blocks for a Smart Plant

http://www.mbtmag.com/articles/2014/10/manufacturing-transformations-building-blocks-future-smart-plant

Pre-Cloud Security Considerations in IoT

Introduction

Over the past decade, hybrid cloud adoption has steadily increased, with closed networks becoming less often the option of choice. But this comes at a cost to security and trust metrics. As we become more dependent on intelligent devices in our lives, how do we ensure the data within this web is not compromised by external threats that could endanger our personal safety?

As the adoption of IoT increases, so does the risk of hackers getting at our personal information. As Alan Webber points out on his RSA blog [6], there are three key risk areas or bubbles that companies need to be aware of.

1: Fully enabled Linux/Windows OS systems: This area concerns itself with devices that are not part of a normal IT infrastructure, but still run full operating systems such as Linux or Windows. As everyone knows, these operating systems had vulnerabilities long before IoT, and when they are deployed in the “free world” they are not as visible to IT admins.

2: Building Management Systems (BMS): This pertains to infrastructure systems that assist in the management of buildings, such as fire detection, suppression, physical security systems and more. These are not usually classified as threat targets, yet shutting down a fire escape alarm system could lead to a break-in scenario.

3: Industry Specific Devices: This area covers devices that assist a particular industry, such as manufacturing, navigation, or supply chain management systems. For example, in the case of a supply chain management system, route and departure times for shipments can be intercepted, which could lead to shipment intercept and reroute to another geographical location.

So, how do we guard against these types of risks, and make the devices themselves, and the web of connected devices, less dumb? Security must be looked at holistically to begin with, with end-to-end security systems being employed to ensure system-level safety, and with work on device-level embedded control software to ensure data integrity from edge to cloud.

Data routing must also be taken seriously from a security standpoint. For example, smart meters generally do not push their data to a gateway continuously, but send it to a data collection hub, before sending it in a single bulk packet to the gateway. Whilst the gateway might have an acceptable security policy, what about the data collection hub? This raises a major challenge: how does one micro-manage all the various security systems their data might migrate across?
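To illustrate the hop being described (meter, then collection hub, then gateway), here is a small sketch of a hub that buffers readings and forwards them as one bulk packet. The signing step is a placeholder to show where the hub’s own security policy would have to apply; the names and the HMAC key are hypothetical.

```python
# Sketch of a data collection hub: buffer meter readings, then forward a single
# signed bulk packet to the gateway. The signature marks where the hub's own
# security policy must be enforced, not just the gateway's.
import hashlib, hmac, json

HUB_KEY = b"hypothetical-shared-secret"      # placeholder key material

class CollectionHub:
    def __init__(self, batch_size: int = 3):
        self.buffer = []
        self.batch_size = batch_size

    def receive(self, reading: dict):
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            return self.flush()
        return None

    def flush(self) -> dict:
        payload = json.dumps(self.buffer).encode()
        packet = {
            "readings": self.buffer,
            "signature": hmac.new(HUB_KEY, payload, hashlib.sha256).hexdigest(),
        }
        self.buffer = []
        return packet                          # would be sent on to the gateway

hub = CollectionHub()
for kwh in (0.21, 0.19, 0.25):
    bulk = hub.receive({"meter": "meter-7", "kwh": kwh})
if bulk:
    print("forwarding", len(bulk["readings"]), "readings to gateway")
```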

Security Design Considerations

Early-stage IoT devices unfortunately did not account for the potential loss of physical security in their design, so it is necessary for security officers to be aware of the focus and location of their security provisioning.

Applying security design on the devices themselves is not the most utilized method (similar to internal storage), as the cost and capacity constraints of these devices work against it. The devices instead look to ensure consistency of communication and message integrity. Usually, one would deploy the more complex security design up front within the web services that sit in front of, and interact with, the devices. It is predicted that as the devices themselves evolve, and nanotechnology becomes more and more of an enabler in the space, the security design will move closer to the devices, before eventually becoming embedded.

It is proposed that shared cloud-based storage will play a pivotal role in combating the data volume problem, but not without its issues. How do we handle identification and authentication? How do we ensure adequate data governance? Partnerships will be necessary between security officers and cloud providers to ensure these questions are answered.

Searching for the holy grail of 100% threat avoidance is impossible, given the number of players in an entire IoT ecosystem. Whilst cloud service providers own their own infrastructure, it is very difficult for them to know that the data received has not been compromised. There are ways to reduce this risk, by using metadata and building “smarts” into the data, from typical known sets, as it transitions from edge to cloud. An approach equivalent to a nightclub security guard checking potential patrons at the door is a useful analogy: “What’s your name?” (what type of data are you), “Where have you been tonight?” (what is your migration path), “How many drinks have you had?” (what transactions happened on your data).
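A toy version of that “bouncer” check, purely as a sketch of the idea: inspect a payload’s metadata (type, migration path, transaction history) against an expected policy before admitting it to cloud storage. The policy values and field names are invented for illustration.

```python
# "Nightclub bouncer" metadata check: admit data to the cloud tier only if its
# declared type, migration path and transaction history match expectations.
EXPECTED_POLICY = {
    "allowed_types": {"telemetry", "calibration"},
    "allowed_hops": {"device", "collection-hub", "gateway"},
    "max_transactions": 5,
}

def admit(payload: dict, policy: dict = EXPECTED_POLICY) -> bool:
    if payload.get("data_type") not in policy["allowed_types"]:
        return False                                   # "what type of data are you?"
    if not set(payload.get("migration_path", [])) <= policy["allowed_hops"]:
        return False                                   # "where have you been tonight?"
    if len(payload.get("transactions", [])) > policy["max_transactions"]:
        return False                                   # "how many drinks have you had?"
    return True

suspect = {"data_type": "telemetry",
           "migration_path": ["device", "unknown-proxy", "gateway"],
           "transactions": []}
print(admit(suspect))   # False: it passed through an unexpected hop
```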

IoT Security and Chip Design

One area that could bring about increased data privacy is the increased usage of “Trusted Execution Environments”, or TEEs, which are secure areas in the main processor of the device. A TEE ensures that independent processing can occur on critical data within the silicon itself. This enables trusted applications to run to enforce confidentiality and integrity, and to protect against unauthorized cloning or object impersonation by remove-and-replace. Taking a real-world example, a homeowner tampering with their smart meter to reduce their energy bill is one scenario that TEEs would prevent.

If cloud services companies can somehow increase their influence on IoT device design (beyond the popularity of TEEs in cellular applications), then utilizing technology such as this will ensure less risk once the data reaches the cloud. Collaboration efforts should be increased between all parties to ensure best practice across the entire IoT landscape can be established.

Figure 1: Generalized framework for a secure SoC [7]
References:

6: RSA: Risks of IoT

https://blogs.rsa.com/3-key-risk-areas-internet-things/

7: EDN: SoC TEE

http://www.edn.com/design/systems-design/4402964/2/Using-virtualization-to-implement-a-scalable-trusted-execution-environment-in-secure-SoCs

IoT meets Data Intelligence: Instant Chemistry

Even in the ideal world of a perfect network topology, a web of sensors, a security profile, a suitable data center design, and lots of applications for processing and analyzing, one thing is constant across all of these: the data itself. Data science is well talked about, and careers have been built on the concept. It is normally aimed at the low-hanging fruit of a data set, the things that are easily measured. Science will take you so far, but it is data intelligence that will show the true value, with the capability to predict the impact of actions and track this over time, building modelling engines to solve future problems.

Even the data set is different for data intelligence as opposed to data science, which relies on lots and lots of data sets (Facebook working out the effectiveness of their changes/features, for example). It is more complex, smaller even, and can be a data set contained in a single process or building. Imagine a hospital’s set of machines producing live data to an analytics engine, and using historical models to compare against live data to gauge risk to patients. It can have real, tangible benefit to quality of life. Commonly called “Operational Intelligence”, the idea is to apply real-time analytics to live data with very low latency. It’s all about creating that complete picture: historical data and models working with live data to provide a solution that can potentially transform all kinds of industry.

At the core of any system of this kind is decision making. Again, one must strive to make this as intelligent as possible. There are two types of decision making: the first is static decision making and the second is dynamic decision making. With the assistance of mathematical models and algorithms, it will be possible for any IoT data set to analyze the further implications of alternative actions. As such, one would predict that the efficiency of decision making would be increased.

At the IoT device level, there is scope to apply such a solution. Given the limited storage capacity on the devices themselves, a form of rolling deterministic algorithm could analyse a window of sensor readings and produce an output of whether or not to send a particular measurement to the intelligent gateway or cloud service.
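A minimal sketch of such an on-device filter, assuming a fixed-size rolling window and a simple deviation rule (both the window length and the tolerance are arbitrary choices for illustration):

```python
# Rolling send/suppress filter: transmit a reading only when it deviates
# sufficiently from the recent rolling average, to save bandwidth and storage.
from collections import deque

class RollingSendFilter:
    def __init__(self, window: int = 16, tolerance: float = 0.5):
        self.window = deque(maxlen=window)
        self.tolerance = tolerance

    def should_send(self, reading: float) -> bool:
        if not self.window:
            decision = True                      # always send the first reading
        else:
            avg = sum(self.window) / len(self.window)
            decision = abs(reading - avg) > self.tolerance
        self.window.append(reading)
        return decision

sensor_filter = RollingSendFilter(window=8, tolerance=0.5)
for temperature in [21.0, 21.1, 21.0, 23.2, 23.3, 21.1]:
    if sensor_filter.should_send(temperature):
        print("send to gateway:", temperature)
```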

Another proposed on-device implementation might be to use a deviation-from-correctness model such as the Mahalanobis-Taguchi Method (MTS), an information pattern technology that has been used in different diagnostic applications to help make quantitative decisions by constructing a multivariate measurement scale using data analytic methods. In the MTS approach, the Mahalanobis distance (MD, a multivariate measure) is used to measure the degree of abnormality of patterns, and principles of Taguchi methods are used to evaluate the accuracy of predictions based on the scale constructed. The advantage of MD is that it considers correlations between the variables, which are essential in pattern analysis. Given that it can be used on a relatively small data set (the greater the number of historical samples, the better the reference model to compare against), it could be utilized in the hospital diagnosis example. Perhaps the clinician might need a quick on-device prediction of how close a patient’s measurements are to a sample set of recent hospital measurements?
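As a sketch of the distance part only (not the full Taguchi optimisation), the snippet below builds a reference mean and covariance from historical “normal” samples and scores a new measurement by its Mahalanobis distance; the sample data and the abnormality threshold are made up.

```python
# Mahalanobis distance of a new multivariate measurement against a reference
# set of "normal" historical samples (rows = samples, columns = variables).
import numpy as np

reference = np.array([      # hypothetical historical readings: [heart_rate, temp_C]
    [72, 36.7], [75, 36.8], [70, 36.6], [68, 36.7], [74, 36.9], [71, 36.8],
])

mean = reference.mean(axis=0)
cov = np.cov(reference, rowvar=False)        # variables are columns
cov_inv = np.linalg.inv(cov)

def mahalanobis(x: np.ndarray) -> float:
    delta = x - mean
    return float(np.sqrt(delta @ cov_inv @ delta))

new_patient = np.array([95, 38.2])           # hypothetical new measurement
distance = mahalanobis(new_patient)
print(f"MD = {distance:.2f}", "abnormal" if distance > 3.0 else "normal")
```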

Taking this one stage further, if we expanded this to multiple hospitals, could we start to think about creating linked data sets that would be pooled together to extract intelligence? What if a weather storm is coming? Will it affect my town or house? Imagine if we could have sensors on each house, tracking the storm in real time, trying to predict the trajectory and track direction changes, with the service then communicating directly with the homeowners in its path.

With the premise of open source software in mind, consider now the concept of open data sets, linked or not. Imagine I was the CEO of a major company in oil and gas, and I was eager to learn from other companies in my sector, and in turn allow them to learn from us through data sets. By tagging data by type (financial, statistical, online statistical, manufacturing, sales, for example), a metadata search engine can be created, which can then be used to gain industry-wide insight at the click of a mouse. The tagging is critical, as the data is then not simply a format, but descriptive also.
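A trivial sketch of that tag-driven metadata search, using an inverted index from tag to data set identifier (the tags and data set names are invented):

```python
# Inverted index from tag -> data set IDs, supporting simple multi-tag search.
from collections import defaultdict

class MetadataIndex:
    def __init__(self):
        self.by_tag = defaultdict(set)

    def register(self, dataset_id: str, tags: list):
        for tag in tags:
            self.by_tag[tag].add(dataset_id)

    def search(self, *tags: str) -> set:
        """Return data sets carrying all of the requested tags."""
        sets = [self.by_tag[t] for t in tags]
        return set.intersection(*sets) if sets else set()

index = MetadataIndex()
index.register("wellhead-output-2015", ["manufacturing", "statistical"])
index.register("quarterly-sales", ["financial", "sales"])
print(index.search("manufacturing", "statistical"))   # {'wellhead-output-2015'}
```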

Case Study: Waylay IoT and Artificial Intelligence [11]

Waylay, an online cloud-native rules engine for any OEM maker, integrator or vendor of smart connected devices, proposes a strong link [11] between IoT and Artificial Intelligence.

Waylay proposes a central concept from AI, called the rational agent. By definition, an agent is something that perceives its environment through sensors and acts upon it via actuators. An example of this is a robot that utilizes camera and sensor technology and performs an action (e.g. “move”) depending on its immediate environment (see Figure 8).

To extend the role of an agent, a rational agent then does the right thing. The right thing might depend on what has happened and what is currently happening in the environment.
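A bare-bones perceive-decide-act loop captures the idea; here the “software-defined” sensor and actuator are plain Python callables standing in for API calls (this is my own illustration, not Waylay’s engine, and the comfort threshold is invented):

```python
# Minimal rational-agent loop: perceive via a software-defined sensor, apply
# simple logic, act via a software-defined actuator.
import random
from typing import Callable

def run_agent(sensor: Callable[[], float],
              actuator: Callable[[str], None],
              steps: int = 5,
              comfort_c: float = 20.0) -> None:
    for _ in range(steps):
        observation = sensor()                       # perceive the environment
        action = "heat_on" if observation < comfort_c else "heat_off"
        actuator(action)                             # act on the environment

def fake_temperature_sensor() -> float:              # stand-in for a real API call
    return random.uniform(15.0, 25.0)

def fake_heater_actuator(action: str) -> None:       # stand-in for a real actuator
    print("actuator command:", action)

run_agent(fake_temperature_sensor, fake_heater_actuator)
```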

Figure 8: Agent and Environment Diagram for AI [11]
Typically, Waylay outlines that an agent consists of an architecture and logic. The architecture allows it to ingest sensor data, run the logic on the data and act upon the outcome.

Waylay has developed a cloud-based agent architecture that observes the environment via software-defined sensors and acts on its environment through software-defined actuators rather than physical devices. A software-defined sensor can correspond not only to a physical sensor but can also represent social media data, location data, generic API information, etc.

Figure 9: Waylay Cloud Platform and Environment Design [11]
For the logic, Waylay has chosen graph modeling technology, namely Bayesian networks, as the core logical component. Graph modeling is a powerful technology that provides flexibility to match the environmental conditions observed in IoT. Waylay exposes the complete agent as a Representational State Transfer (REST) service, which means the agent, sensors and actuators can be controlled from the outside, and the intelligent agent can be integrated as part of a bigger solution.

In summary, Waylay has developed a real-time decision making service for IoT applications. It is based on powerful artificial intelligence technology and its API-driven architecture makes it compatible with modern SaaS development practices.

End of Case Study 

Reference:

11: Waylay: Case study AI and IoT

http://www.waylay.io/when-iot-meets-artificial-intelligence/

Why IoT needs Software Defined Networking (SDN)

Software defined networking (SDN), with its ability to intelligently route traffic and take advantage of underutilized network resources, will help manage the data flood of IoT. Cisco has a pretty aggressive IoT strategy, and they place their application-centric infrastructure version of SDN at the centre of it. And it makes sense: software is still the main ingredient that can be used to combat network bandwidth challenges.

Lori MacVittie [8] agrees that SDN is a critical enabler, but only if SDN considers all of the network layers from 2 to 7, and not just stateless layers 2-4. “Moving packets around optimally isn’t easy in a fixed and largely manually driven network. That’s why SDN is increasingly important when data volumes increase and do so in predictable waves. SDN can provide the means to automatically shift the load either in response or, optimally, in anticipation of those peak waves.”

The network challenges in IoT do not stop at bandwidth and volumes of data. Applications will be required to deal with the peak loads of data, so services will be required in layers 4-7 that provide for scale, security and performance of those apps.

Figure 5: Stateless vs Stateful in SDN Application Services [8]

SDN has features that will be particularly useful here. Dynamic load management should allow users to monitor and orchestrate bandwidth automatically on the fly, which will be music to the ears of global IoT providers. Service chaining will enable application-specific processing procedures to be applied, in sequence, to a client’s job. This should ease management overhead in IoT services as subscriptions increase globally. One of the coolest features of SDN is bandwidth calendaring, which will allow the user to schedule the traffic an application will need at a given time; when you think of a sensor only wanting to communicate at periodic times, it is apparent that this will be a great asset.
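To make bandwidth calendaring a little more concrete, here is a toy scheduler that reserves bandwidth for an application in fixed hourly windows and answers how much is allocated at a given hour. Real SDN controllers expose far richer policies, so treat this purely as an illustration; the application name and figures are invented.

```python
# Toy bandwidth calendar: reserve megabits-per-second for an application during
# fixed hourly windows, then look up the allocation for a given hour.
from collections import defaultdict

class BandwidthCalendar:
    def __init__(self):
        self.reservations = defaultdict(dict)   # hour (0-23) -> {app: mbps}

    def reserve(self, app: str, start_hour: int, end_hour: int, mbps: float):
        for hour in range(start_hour, end_hour):
            self.reservations[hour][app] = mbps

    def allocation(self, app: str, hour: int) -> float:
        return self.reservations[hour].get(app, 0.0)

calendar = BandwidthCalendar()
# Hypothetical sensor fleet uploads nightly between 02:00 and 04:00.
calendar.reserve("sensor-fleet-upload", start_hour=2, end_hour=4, mbps=500.0)
print(calendar.allocation("sensor-fleet-upload", 3))   # 500.0
print(calendar.allocation("sensor-fleet-upload", 9))   # 0.0
```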

But this cannot happen overnight. Data center managers will have to modernize their infrastructures. Once they do, a potential big win would be the ability to create numerous virtual and private networks on top of a single physical network. This would be a big advantage, as multiple customers could then share a single network without risk to their applications and data. However, for this to work, one would need the entire network to be SDN-enabled.

When one considers the concept of Network Functions Virtualization (NFV), this path can be traversed more quickly. With NFV-ready networks, carriers can create services in software rather than dedicated hardware, essentially allowing virtualized servers to deliver these new services. This enables business transformation by moving away from multiple isolated networks towards an open ecosystem, a set of virtualized network functions and, most importantly, an orchestration layer. This will allow businesses to accelerate with agility in the face of the explosion in device quantity.

Reference:

8: Dev Central: SDN and IoT article

https://devcentral.f5.com/articles/sdn-is-important-to-iot-if-it-covers-the-entire-network

Considerations of Change: An Intro to Networking in IoT

One of the major consequences of Moore’s Law for silicon is that pretty much any device can now have a reasonable level of computing power and internet connectivity. Because of this, the number of internet-enabled devices is increasing, causing a huge influx of IoT traffic; it is predicted that WAN bandwidth will need to be increased.

When one considers the types of data that will be generated, it becomes clear that both present challenges. George Crump, an analyst with Storage Switzerland, points this out [7]. “First, there is large-file data, such as images and videos captured from smartphones and other devices. This data type is typically accessed sequentially,” explains Crump. “The second data type is very small, for example, log-file data captured from sensors. These sensors, while small in size, can create billions of files that must be accessed randomly.”

From this, it is clear that data centers will need to handle both types of data, and the storage and processing requirements that come with them.

For decades, the network was considered to be the plumbing of a company’s IT solutions, and was treated as a somewhat dumber element of the design. With the advent of IoT, it is clear that the networking element of the IoT ecosystem is lagging slightly behind, which is a concern, as IoT is very much a network-centric technology and in essence forms the web by which the sensors communicate with the host and with each other. There are a number of ways for these devices to be networked. Some devices can be directly connected to the internet utilizing standard Ethernet or WiFi, which are TCP/IP based. There are other wireless technologies, some of which are dependent on TCP/IP, but all require some sort of intelligent gateway to convert their network into standard Ethernet or WiFi. These include, but are not limited to, ZigBee, Z-Wave, Bluetooth, and Cellular.

Evolution towards IPv6 

Due to the advancement of object gateways, the first two stages of the IoT roadmap will sit on current infrastructure and protocols. Once the volume of devices and data increases and true IoT is in motion, the IPv6 protocol will be required, which offers a vastly larger address space.

The main challenge to overcome is packet size: standard IPv6 packets are too large for constrained, low-power radio links. The 6LoWPAN standard, namely RFC 4944, addresses this with a number of changes, including the compression of IP headers and the introduction of a fragmentation mechanism that enables reassembly of IPv6 packets that do not fit into a single IEEE 802.15.4 frame. Lastly, routing protocols for lossy, low-power networks were required; new protocols were developed by the Internet Engineering Task Force (IETF) that provide basic routing in low-power, lossy networks.
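Purely to illustrate the fragmentation-and-reassembly concept (this is not the RFC 4944 wire format, just a toy model of splitting a payload across small link-layer frames and putting it back together):

```python
# Conceptual fragmentation/reassembly: split a payload into fragments that fit
# a small frame, tag each with (datagram tag, offset), then reassemble.
def fragment(payload: bytes, tag: int, max_fragment: int = 80):
    return [
        {"tag": tag, "offset": i, "data": payload[i:i + max_fragment]}
        for i in range(0, len(payload), max_fragment)
    ]

def reassemble(fragments) -> bytes:
    ordered = sorted(fragments, key=lambda f: f["offset"])
    return b"".join(f["data"] for f in ordered)

packet = b"x" * 300                       # a payload too big for one small frame
frags = fragment(packet, tag=7)
print(len(frags), "fragments")            # 4 fragments of <= 80 bytes each
assert reassemble(frags) == packet        # arrives intact after reassembly
```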

In my next blog post, I will continue to write about network enablement requirements, looking at why IoT needs Software Defined Networking (SDN).

Reference:

7: Orange Business: Can your business handle IoT

http://www.orange-business.com/en/blogs/connecting-technology/data-centers-virtualisation/can-your-data-center-handle-the-internet-of-things

IoT and Classical Business Models

Many companies, especially in the Information Technology (IT) sector, are aware of the IoT explosion. One of the biggest challenges facing any company is how to prepare for the change that will result from the increased business impact that IoT will present.

With figures in the trillions in terms of the market for IoT, how do companies ensure they can get a slice of the pie? If they currently do not sit within the relevant market segment, analysis will be required to determine if it can be an opportunity or a threat to their business as a whole.

IDC in 2014 [3] predicted that IoT will actually overtake classical Information and Communication Technology (ICT) over time. It predicts IoT will grow 12% year on year, whilst classical ICT will grow just 4%. Figure 3 below illustrates this.

Figure 3: IDC Prediction of IoT vs ICT [3]

Considering that most businesses consistently monitor the bottom line, it is not only the opportunities that IoT will present that matter, but how it will impact the way we work. With limitless numbers of sensors monitoring processes, improving business energy efficiency and enabling new ways of working in teams, businesses will need to be more open to change and, more dauntingly, open to the elements of a “big brother” type scenario.

There are trends that are ensuring an evolution of business practice as we know it. Normally, new technology platforms impact a single strand, with the exception of the internet. But IoT has the potential to become an entire business ecosystem, where creating and capturing business value will be paramount. However, this is not a straightforward proposition. Barriers include the current early position of IoT in its lifecycle, and the sheer volume and types of devices to be considered. An ecosystem, by nature, would indicate a seamless set of micro-systems working together in a self-sustaining fashion. Trying to estimate what this will mean for IoT is still not clear.

Consider the classical technology adoption lifecycle. There are five types of innovation adopters, the first being the innovators themselves. The list is completed, in sequence, by early adopters, early majority, late majority and laggards. With the current immaturity of IoT, and the lack of clarity in the various emerging technologies, the challenge for business is to try to advance from the early adopters to the early majority, so the business needs to be able to scale. The early adopters are less fussy when it comes to product design, but as the number of adopters increases later in the lifecycle, the early majority will want polished product offerings with appropriate services.

With IoT still in its relative infancy, it is appropriate to compare it to the early stages of the Internet. When we look at the recent business ecosystems that have been spun out of the Internet for EMC, such as Pivotal Cloud Foundry, one can postulate about future ecosystem opportunities for EMC from the IoT spectrum.

Another important consideration for companies is the skill-sets and people that are required to drive their Big Data strategy as a result of their growing IoT ecosystem. A key tenet of this will be the data itself. In the February it@Cork Tech Talk by my EMC colleague Steve Todd, and even more recently in his blog on data value (value was something I had never associated with data until this talk), Steve spoke about the importance for major companies of beginning to take a more structured approach to the employees involved in data set discovery, identification and migration (the Data Architect), and also of appointing a Chief Data Officer to represent the company from a data perspective. Interestingly, my role in EMC changed last year to that of a Data Architect, so I could relate to this first-hand. When faced with a business challenge in big data, five steps that can be critical to success are as follows.

1: Demystify and then map the current devices, tools, processes and trajectory of data across the business unit or company (AS-IS Diagram)

2: Scour the company and external sources for any technologies that can enable a more scalable and clearer approach

3: Look to centralize data storage, to allow the company to focus on being agile and scalable, and also remove duplicate data (concept of a Business Data Lake)

4: Develop an ingestion framework to ensure the data lake has a sufficient landing platform for data.

5: Build the analytics platform that is pointed at the centralized “Business Data Lake” to meet existing and future needs of the business.

When we apply this to IoT, we start to see that every company, no matter how small, will begin to generate huge data sets, and there will be a new skill-set needed at companies that never required it previously, to ensure they can gain as much insight as possible from the data sets. Sure, there are companies that can provide these solutions, but realistically, the future state will surely be to have these as core skills, just as “internet skills” once appeared on resumes?!

It is proposed here that key stakeholders across multinationals can overcome these challenges and design practical IoT business models if they consider an ecosystem-style approach, instead of looking at the modular needs of individual business units. This will allow the business to get a high-level perspective of where IoT can bring value to their business offerings.

Reference:

3: Digital Universe Article

http://www.emc.com/leadership/digital-universe/2014iview/internet-of-things.htm