Even in the ideal world of a perfect network topology, a web of sensors, a sound security profile, a suitable data center design, and a wealth of applications for processing and analysis, one thing remains constant across all of it: the data itself. Data science is much talked about, and careers have been built on the concept, but it is normally aimed at the low-hanging fruit of a data set, the things that are easily measured. Science will take you only so far; it is data intelligence that reveals the true value, with the capability to predict the impact of actions, track that impact over time, and build modelling engines to solve future problems.
The data set itself also differs between data intelligence and data science: data science relies on lots and lots of data (Facebook working out the effectiveness of its changes and features, for example), while a data intelligence set is more complex, often smaller, and can be contained within a single process or building. Imagine a hospital's machines producing live data for an analytics engine that compares it against historical models to gauge the risk to patients; this can have a real, tangible benefit to quality of life. Commonly called "Operational Intelligence", the idea is to apply real-time analytics to live data with very low latency. It is all about creating a complete picture: historical data and models working with live data to provide a solution that could transform all kinds of industry.
At the core of any system of this kind is decision making, and again one must strive to make it as intelligent as possible. There are two types of decision making: static and dynamic. With the assistance of mathematical models and algorithms, any IoT data set can be analyzed for the further implications of alternative actions, and the efficiency of decision making should increase as a result.
There is scope to apply such a solution at the IoT device level. Given the limited storage capacity on the devices themselves, one approach is a rolling deterministic algorithm that analyzes a window of recent sensor readings and outputs a decision on whether or not to send a particular measurement to the intelligent gateway or cloud service.
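As a minimal sketch of such a rolling filter, assuming nothing about any particular device SDK, the Python class below keeps a fixed-size window of recent readings and forwards a measurement only when it deviates notably from the rolling mean. The class name, window size, and threshold are illustrative choices, not part of any standard.

```python
from collections import deque
import statistics

class RollingSendFilter:
    """Decide on-device whether a reading is worth forwarding.

    Only a fixed-size window of recent readings is kept, so memory
    use stays constant on a storage-constrained sensor node.
    """

    def __init__(self, window_size=32, threshold_sigmas=2.0):
        self.window = deque(maxlen=window_size)  # old readings auto-evicted
        self.threshold_sigmas = threshold_sigmas

    def should_send(self, reading):
        if len(self.window) < 2:
            # Not enough history to judge; forward everything at first.
            self.window.append(reading)
            return True
        mean = statistics.mean(self.window)
        stdev = statistics.stdev(self.window)
        self.window.append(reading)
        # Forward only readings that deviate notably from recent history.
        return abs(reading - mean) > self.threshold_sigmas * stdev

# Hypothetical device loop (read_sensor and publish_to_gateway are
# stand-ins for whatever the real firmware provides):
# filt = RollingSendFilter()
# for value in read_sensor():
#     if filt.should_send(value):
#         publish_to_gateway(value)
```

The filter is deterministic for a given stream of readings, and its memory footprint is bounded by the window size, which suits the constrained devices described above.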
Another proposed on-device implementation is a model that scores deviation from a reference of normal observations, such as the Mahalanobis-Taguchi System (MTS), an information pattern technology that has been used in various diagnostic applications to support quantitative decisions by constructing a multivariate measurement scale using data-analytic methods. In the MTS approach, the Mahalanobis distance (MD, a multivariate measure) gauges the degree of abnormality of a pattern, and principles of the Taguchi methods are used to evaluate the accuracy of predictions made against the constructed scale. The advantage of MD is that it takes account of the correlations between variables, which are essential in pattern analysis. Because it can be used on a relatively small data set (the more historical samples, the stronger the reference model), it could be applied to the hospital example: a clinician might need a quick on-device indication of how close a patient's measurements lie to a reference set of recent hospital measurements.
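To make the distance at the heart of MTS concrete, the sketch below computes the Mahalanobis distance of a new patient reading against a small reference set of historical measurements. The reference values, variable choices, and any abnormality threshold are invented for illustration; a real deployment would build the scale and threshold from the Taguchi analysis described above.

```python
import numpy as np

def mahalanobis_distance(x, reference):
    """Distance of observation x from the reference set, accounting
    for correlations between variables via the inverse covariance."""
    mu = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
    diff = x - mu
    return float(np.sqrt(diff @ cov_inv @ diff))

# Hypothetical reference set: rows are past patients, columns are
# (heart rate bpm, systolic blood pressure mmHg, temperature degC).
reference = np.array([
    [72, 118, 36.7],
    [68, 121, 36.9],
    [75, 115, 36.6],
    [70, 119, 36.8],
    [74, 122, 37.0],
])

new_patient = np.array([95, 140, 38.2])
md = mahalanobis_distance(new_patient, reference)
print(f"MD = {md:.2f}")  # a larger MD means a more abnormal pattern
```

Mathematically this computes MD(x) = sqrt((x - μ)ᵀ Σ⁻¹ (x - μ)), where μ and Σ are the mean and covariance of the reference set; the correlation information carried in Σ⁻¹ is what distinguishes MD from a plain Euclidean distance.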
Taking this one stage further, if we expanded the model to multiple hospitals, could we start to create linked data sets, pooled together to extract intelligence? What if a storm is coming: will it affect my town or my house? Imagine sensors on each house tracking the storm in real time, predicting its trajectory and tracking changes in direction, so that the service could communicate directly with the home owners in its path.
With the premise of open-source software in mind, consider now the concept of open data sets, linked or otherwise. Imagine I were the CEO of a major oil and gas company, eager to learn from other companies in my sector and, in return, willing to let them learn from us through shared data sets. Tagging data by type (financial, statistical, online statistical, manufacturing, sales, for example) allows a metadata search engine to be created, which can then be used to gain industry-wide insight at the click of a mouse. The tagging is critical: the data is then not simply a format, but descriptive as well.
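A toy sketch of that idea follows: each published data set carries descriptive tags, and a simple index answers tag queries across companies. The record fields, tag vocabulary, company names, and URLs are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class DataSetRecord:
    name: str        # human-readable title
    owner: str       # publishing company
    uri: str         # where the data set lives
    tags: frozenset  # descriptive type tags, e.g. "financial"

class MetadataIndex:
    """Toy metadata search engine over tagged data sets."""

    def __init__(self):
        self.records = []

    def register(self, record):
        self.records.append(record)

    def search(self, *tags):
        """Return every data set carrying all of the requested tags."""
        wanted = set(tags)
        return [r for r in self.records if wanted <= r.tags]

index = MetadataIndex()
index.register(DataSetRecord("Q3 well output", "AcmeOil",
                             "https://data.example.com/acme/q3",
                             frozenset({"manufacturing", "statistical"})))
index.register(DataSetRecord("2016 refinery sales", "PetroCo",
                             "https://data.example.com/petro/sales",
                             frozenset({"sales", "financial"})))

for record in index.search("statistical"):
    print(record.name, "->", record.uri)
```

A production system would add access control and a richer tag ontology, but the principle is the same: the descriptive metadata, not the raw format, is what makes cross-company search possible.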
Case Study: Waylay, IoT and Artificial Intelligence [11]
Waylay, a cloud-native online rules engine for any OEM, integrator, or vendor of smart connected devices, proposes a strong link [11] between IoT and Artificial Intelligence.
Waylay proposes a central concept from AI: the rational agent. By definition, an agent is something that perceives its environment through sensors and acts on it via actuators. An example is a robot that uses camera and sensor technology and performs an action, e.g. "move", depending on its immediate environment (see Figure 8).
Extending this role, a rational agent is an agent that does the right thing, where the right thing may depend on what has happened and what is currently happening in the environment.
Waylay has developed a cloud-based agent architecture that observes the environment via software-defined sensors and acts on its environment through software-defined actuators rather than physical devices. A software-defined sensor can correspond not only to a physical sensor but can also represent social media data, location data, generic API information, and so on.
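The sketch below illustrates the sense-decide-act pattern the case study describes, with software-defined sensors and actuators as plain callables. It is not Waylay's actual API; every name in it is hypothetical.

```python
import random  # stands in for real data sources in this sketch

# Software-defined sensors: any callable returning an observation,
# whether it wraps a physical device, a social feed, or a web API.
def temperature_sensor():
    return random.uniform(15.0, 35.0)         # pretend device read

def weather_api_sensor():
    return random.choice(["clear", "storm"])  # pretend API call

# Software-defined actuator: a callable that acts on the environment.
def send_alert(message):
    print(f"ALERT: {message}")

class RationalAgent:
    """Perceive via sensors, decide via rules, act via actuators."""

    def __init__(self):
        self.rules = []  # (condition over percepts, action) pairs

    def add_rule(self, condition, action):
        self.rules.append((condition, action))

    def step(self, sensors):
        percepts = {name: read() for name, read in sensors.items()}
        for condition, action in self.rules:
            if condition(percepts):  # "do the right thing" for this state
                action(percepts)

agent = RationalAgent()
agent.add_rule(
    lambda p: p["temp"] > 30 and p["weather"] == "storm",
    lambda p: send_alert(f"High temperature {p['temp']:.1f} C with a storm coming"),
)
agent.step({"temp": temperature_sensor, "weather": weather_api_sensor})
```

Because both sensors and actuators are just functions here, swapping a physical thermometer for a social media feed changes nothing in the agent itself, which is the point of the software-defined abstraction.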
In summary, Waylay has developed a real-time decision-making service for IoT applications. It is based on powerful artificial intelligence technology, and its API-driven architecture makes it compatible with modern SaaS development practices.
End of Case Study
Reference:
[11] Waylay: Case study, AI and IoT. http://www.waylay.io/when-iot-meets-artificial-intelligence/