EnterConf Belfast – Day 1

First, the quote of the day: “We all have to avoid software that epically sucks”.

Me at the Insight Stage!

Today I attended day one of EnterConf in Belfast which, for those who don’t know it, is a spin-off conference from Web Summit focused on the enterprise side of our tech world. On arrival, I must admit I was really proud of the EnterConf team for their choice of venue. It has a lot of history associated with it, sitting in the heart of the Titanic Quarter where the Titanic was built, and that ship, for its time, was an “Enterprise ship”! The setting created a chilled-out atmosphere, a nice contrast to Web Summit, which will be held again in November. The day was full of detailed, focused meetups and conversations, and did a great job of showing a different experience of what a conference can provide. Kudos.
There were two stages, named Center and Insights, with startup exhibits and food and coffee stands to ensure everyone was nicely refreshed throughout the day. Whilst I won’t cover all of the talks, I have picked out a few to show the kinds of topics being discussed.

The first talk I’ll mention, entitled “Processing Open Data”, was by Lukas Biewald of CrowdFlower, who spoke extensively on their efforts to clean up open data and on elements of data moderation. It really resonated with me, as I have been interested in and developing data-cleansing frameworks over the past number of years, and I always struggle with the data pollution that skews our insight. Quote from Lukas: “If you want to improve your algorithm, just add more data”. Lukas is in action below.

Lukas Biewald of CrowdFlower

Stephen McKeown from Analytics Engines and Amir Orad from Sisense also took part in a panel on “Democratising Data”, which focused on how enabling companies of all sizes to speed up their analytics creates a more level playing field for startups competing with enterprises. Quote from this section: “Bring data into your company’s DNA”.

Stephen McKeown and Amir Orad

There were a few familiar faces present, with my former EMC colleague and mentor Steve Todd amongst the speakers, presenting on “Economic Value of Data” (check out Steve’s blog here for more fascinating content on this topic). Steve spoke on the Center stage, and it was great to see this topic present, as it really stood out as a conversation we should all be having. Steve gave a similar talk in Cork for an it@Cork event we organised in February, and it was great to see the advancement in his research in this area. Steve spoke on “Valuation Business Processes” and the categories within that: M&A, Asset Valuation, Data Monetisation, Data Sale and Data Insurance. I won’t spoil the rest, as I am sure Steve will blog on this soon.

Steve Todd speaking on Economic Value of Data

Also on the Center stage, in one of the talks to close out the evening, Barak Regev, Head of Google Cloud Platform EMEA, spoke on “Architecting the Cloud”. It was great to get an update on their vision, and Barak showed Google’s ambition to “Build What’s Next”.

Barak Regev from Google – Build What’s Next

And to end on a great quote from James Petter, VP EMEA for Pure Storage: “Security should be like an onion; it should be layered, and you can’t reach the center without breaching a layer”.

The day brought many epic conversations across more than 10 different nationalities, including a walk back to the city with the visionary Teemu Arina. His talk on biohacking was incredibly insightful. It spoke to the challenge of humans tracking their lives through the quantified self. Teemu took me through his ideas on how humans can do a better job of hacking their bodies for information, and of using that information to improve quality of life. Teemu’s book is here!

So now, it’s off to the night dinner, to drink a beer or two and to build a few more contacts! In the morning, it looks like a few good talks on machine intelligence will set the tone for another awesome day!

Why IoT practitioners need to “Wide Lens” the concept of a Data Lake

As we transition towards the vast quantity of devices that will be internet-enabled by 2020 (experts estimate anything from 50 to 200 billion), it seems that the current cloud architectures being proposed are somewhat short on the features required to meet customers’ data requirements in 2020.

I won’t dive deeply into describing the technology stack of a Data Lake in this post (Ben Greene from Analytics Engines in Belfast, whom I visit on Wednesday en route to EnterConf, does a nice job of that in his blog here). As a quick side step: looking at the Analytics Engines website, I saw that customer choice and ease of use are among the architectural pillars of their AE Big Data Analytics Software Stack. Quick to deploy, modular, configurable, with lots of optional high-performance appliances. It’s neat to say the least, and I am looking forward to seeing more.

The concept of a Data Lake has a big reputation in current tech chatter, and rightly so. It has huge advantages in enterprise architecture scenarios. Consider the use case of a multinational company with 30,000+ employees, countless geographically spread locations and multiple business functions. So where is all the data? That is normally a challenging question, with multiple databases, repositories and, more recently, Hadoop-enabled technologies storing the company’s data. This is the very reason why a business data lake (BDL) is a huge advantage to the corporation. If a company has a Data Architect at its disposal, it can develop a BDL architecture (such as shown below, ref – Pivotal) that acts as a landing zone for all its enterprise data. This makes a huge amount of sense. Imagine being the CEO of that company: as we see changes in the Data Protection Act(s) over the next decade, the company can take the right steps towards managing, scaling and, most importantly, protecting its data sets. All of this leads to a more effective data governance strategy.
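To make the landing-zone idea a little more concrete, here is a minimal sketch (all names and fields are hypothetical, not tied to any vendor’s API) of a catalog that registers each dataset landing in the lake along with the governance metadata a Data Architect would care about:

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    source: str          # originating system, e.g. "CRM", "ERP", "Hadoop"
    owner: str           # accountable business function, for governance
    retention_days: int  # how long policy allows us to keep the data

class LandingZone:
    """Toy catalog for a business data lake landing zone."""
    def __init__(self):
        self._catalog = {}

    def ingest(self, ds):
        self._catalog[ds.name] = ds  # register the dataset and its metadata

    def owned_by(self, owner):
        return sorted(d.name for d in self._catalog.values() if d.owner == owner)

lake = LandingZone()
lake.ingest(Dataset("sales_2015", "CRM", "Sales", 365))
lake.ingest(Dataset("payroll", "ERP", "HR", 2555))
print(lake.owned_by("Sales"))  # ['sales_2015']
```

The point of the sketch is that once every dataset lands through one door with its owner and retention attached, the governance questions (“where is all the data?”, “who is accountable?”) become simple queries.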

Pivotal business data lake architecture (ref: Pivotal)

Now shift focus to 2020 (or even before?), and let’s take a look at the customer landscape. The customers that will require what the concept of a BDL now provides will need far more choice, and won’t necessarily be willing to pay huge sums for that service. Whilst there is some customer choice today, such as Pivotal Cloud Foundry, Amazon Web Services, Google Cloud and Windows Azure, even these services are targeted at customers from startup scale upwards in the business maturity life cycle. The vast majority of cloud services customers in the future will be everyone around us, the homes we live in and beyond. The requirement to store data in a far-distant data center might not be as critical for them. It is expected they will need far more choice.

Consider the case of building-monitoring data, which could be useful to a wider audience in a secure linked open data sets (LODs) topology. For example, a smart grid provider might be interested in energy data from all the buildings, and might try to suggest optimal profiles for them to reduce impact on the grid. Perhaps the provider might even be willing to pay for that data? This is where data valuation discussions come into play, which are outside the scope of this blog. But the building itself, or its tenants, might not need to store all their humidity and temperature data, for example. They might want some quick insight up front, and then choose to bin that data (based on some simple protocol describing the data usage) in their home.
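The “quick insight up front, then bin the data” protocol can be sketched in a few lines. This is only an illustration under assumed names; the real protocol would be whatever the data-usage agreement specifies:

```python
from statistics import mean

def summarise_and_bin(readings, keep_raw=False):
    """Derive quick insight at the edge, then discard the raw samples.

    readings: list of (sensor, value) tuples from the last window.
    The summary is kept; raw data is binned unless the (hypothetical)
    usage protocol says otherwise.
    """
    by_sensor = {}
    for sensor, value in readings:
        by_sensor.setdefault(sensor, []).append(value)
    insight = {s: round(mean(vs), 2) for s, vs in by_sensor.items()}
    raw = readings if keep_raw else []  # "bin" the raw data
    return insight, raw

window = [("temp", 20.5), ("temp", 21.1), ("humidity", 54.0)]
insight, raw = summarise_and_bin(window)
print(insight)  # {'temp': 20.8, 'humidity': 54.0}
```

The household keeps a tiny summary it can act on (or sell into an LOD topology), while avoiding the cost of shipping and storing every sample in a far-away data center.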

Whilst a BDL is built on the premise of “store everything”, and that premise will bring value for organisations monitoring consumers of their resources, individual consumers might not be willing to pay for it.

To close, the key enablers of these concepts are real-time edge analytics and increased data architecture choice. And this is beginning to happen. Cisco have introduced edge analytics services into their routers, which is a valid approach to ensuring that the consumer has choice. They are taking the right approach, as there are even different services for different verticals (Retail, IT, Mobility).

In my next blog, edge analytics will be the focus area, where we will dive deeper into the question: “Where do we put our compute?”

Why Ireland needs to use Technology and IoT more to help its Homeless



21% rise in homeless sleeping rough in Dublin

15% rise in homeless in Cork

Rise also in Limerick, Galway and Waterford

Over 1,000 children are now homeless in Ireland.

Startling figures, especially the last one. I cannot begin to comprehend what it is like to be a parent who must tell their children that they don’t have a place to go at night.

I, like many others, have been in cities in other countries and seen that this is not simply an Irish challenge. So I will start with those examples, to address the global scenarios, and then look at how our psychology must change locally. I remember vividly two occasions abroad when a homeless person made a big impact on my life.

The first time was in 2005, in Auckland, New Zealand. We were staying in a hostel whilst backpacking, and we weren’t having a particularly good day; the weather wasn’t great, and one or two things went badly. But as we strolled back to the hostel, we noticed an elderly homeless guy in a doorway right beside us. The “bad” weather had turned into a storm, with an incredible amount of flash flooding. I felt awful. And everyone has been there, where a reality check brings us back to earth. I asked the hostel could the guy get a room. They said no, as he must have an address. I offered to pay for his room; still no. Yikes. So I decided all I could do was give him some money. But then I thought, why not do that, and have a conversation? We automatically assume money is the only thing they need. I went out and sat close to him. Whilst chatting, I learned part of his story. One of the first things he said was that there were people worse off than him, and that he didn’t drink or smoke. And that he hated the rain! He thanked me for the $20, and also for the conversation. We helped each other.

The next story is of when I received incredible kindness from a homeless guy on my very first night in New York. Woohoo, I’m in America, let’s go for beers! Oops. I ended up feeling a little worse for wear outside a club. On my own. Minus my phone. In a laneway in a bad part of town. A big guy stopped. Uh-oh. But this guy asked me was I OK. I told him I was from Ireland, and that I had lost my phone. He told me, “man, I don’t even have a phone”. He then walked me out of the laneway and hailed me a cab. I gave him a nice tip, and the cynics out there will say he was looking for that. But he didn’t know me. Humanity exists.

And now to Ireland. I really want to stress that I am not some saint. This is more to raise awareness of how technology can potentially help. I have contributed to Cork Simon Community at various points in my life, and if I have some change, I do give it to the needy. But herein lies the first challenge: a lot of people have less and less cash on them. And even if we do, people wonder: if I give this person money, what will they spend it on? Money doesn’t always help, as the upper echelons of Irish society have also seen.

I now want to talk about some potential ways for technology to help with this challenge.

The term Smart Cities has been bandied about in relation to the Internet of Everything, where we will use technology to improve people’s lives. Yet I have not seen much presented that will help the homeless. Imagine if we could use cost-effective smart devices, worn by homeless volunteers, to identify the paths they take and where they sleep, so that soup runs can be more efficient and beds can be found? I think it is one area that must at least be explored. They are doing this in Odense, Denmark. Check it out here.

I also believe that doorways could be fitted with load sensors to gauge how many are occupied in our cities. This data could be used to identify commonly used spots, and even to predict where homeless people are on particular nights. That, coupled with temperature sensors, could have saved Jonathan Corrie’s life last December.

The last idea I’ll propose here is to modify the many parking meters in our cities to allow them to produce vouchers based on use in a particular day. The more the meters are used in the day session (busier cities should correlate with more people in need), the more food/supply vouchers the meters print out in the evening when homeless people enter a code that is texted to them. If they don’t have a phone, their date of birth, registered in advance, can be entered instead.
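As a back-of-the-envelope sketch of that rule (the threshold, rate and identifiers here are entirely made up for illustration):

```python
def evening_vouchers(day_uses, threshold=100, rate=0.05):
    """Hypothetical rule: more day-session meter uses (a busier city,
    so likely more people in need) fund more evening vouchers."""
    return int(day_uses * rate) if day_uses >= threshold else 0

# Redemption: a code texted to the person, or a pre-registered date of birth.
issued = {"4821", "1990-03-14"}

def can_redeem(entry):
    return entry in issued

print(evening_vouchers(240))     # 12
print(can_redeem("1990-03-14"))  # True
```

The exact economics would need real usage data, but the mechanism itself is simple enough to run on the meter’s existing hardware.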

I came across a startup on a recent trip to the United States, and I was incredibly impressed. It is called HandUp. The premise is that homeless people can set up an online profile through the organisation and crowdfund to reach their goals. They never receive direct cash; instead it is used to buy supplies, food, and sometimes tools to go back to work. So instead of writing their story on cardboard, they get help to set up a profile, and then hand out business cards for their site so that people can log on and donate. It is only based in San Francisco for now, but I have contacted them to hear their plans for a global roll-out (and how).

Technology multinationals benefit greatly through our tax system by positioning themselves in Ireland. And it’s great for our economy, through jobs. I have seen the kindness first-hand by working in these companies; they create lots of great lives for people. I wonder about a 1% challenge in the tech sector, where people could volunteer to donate 1% of their annual wage before tax (hence effectively 0.5% from us and 0.5% from government) to a particular social challenge. This could change annually: the homeless, the elderly. I think this sort of crowdfunding, spread thinly, could make a huge impact. I won’t do the exact maths, but 100,000 employees at an average salary of €40,000 equates to €40,000,000!

A story of caution on the wrong ways to use technology: BBH Labs tried a social experiment using homeless people as wifi hotspots! You can read more here. Brain fry springs to mind.

The work done by organisations like Simon and Focus Ireland (and others) is incredible. I sometimes wonder where we would be if they weren’t so active. I personally believe that the technology community can play a role in assisting the fight. I also think the government gets bad press, and whilst it is not completely innocent, neither are we. Dublin Simon Community submitted an application for new accommodation last year. The result? 33 objections from the public. Not the government, but us.

“Part of the problem is we have a lack of activism… We have a lack of people who are willing to step forward and be part of the solution” – Michael Esswein

 

it@Cork European Technology Summit 2015 – a WOW event!

I wanted to change direction slightly and give an update on an event I had the privilege of being involved with this week, the it@Cork European Technology Summit. The event was held at Cork City Hall on Wednesday May 5th, with a full-day technology summit followed by a black tie dinner and 3D wearables fashion show.

An epic journey over the past few months, with way more ups than downs, resulted in…

1 Day – 4 Sections – 20 speakers – 4 Chair Speakers – 400+ Day attendees – #1 Trending on Twitter – 9 Amazing artisan food stalls – Lots of Sponsors – 200+ Night Attendees – 2 Fashion Designers – 1 Model Agency – 10 Models – 2 Fire Dancers – 4 3D printed bow ties !

So how did I arrive here? Last year, Gillian Bergin from EMC asked me to get involved with it@Cork as part of the Tech Talk committee. I’m delighted she did, as over the past few months I got to partake in and help organise some excellent tech talks from a variety of people, including my fellow technologist, Mr Steve Todd of EMC. The tech talk series is just one of many successful strands of it@Cork, holding six high-end, rockstar speakers/panels per year. The series is full up until 2016, but if you are a “rockstar” speaker interested in speaking, please contact us directly. James O’Connell of VMware, who passed the tech talk committee chair over to Barry O’Connell, took on the chair of the Summit organising committee. James, together with myself and Paddy O’Connell of Berkley Group (known collectively now as the Macroom or Muskerry Mafia 🙂 ), assisted Sarah Walsh of it@Cork in organising the day summit. The night summit was excellently organised by Marnie O’Leary Daly of VMware.

The event was kicked off by James, and then Ronan Murphy, chairman of the board of it@Cork and CEO of Smarttech, gave an address on how Cork needs a cluster manager to help drive more employment in the region. More from Ronan here in the Examiner. Donal Cahalane from Teamwork.com gave an insightful talk on how he sees the industry progressing, with some excellent advice for everyone from startups through to multinationals.


The four sections throughout the day offered a balanced mix of raw technology (Cloud – challenging the fear, Internet of Everything) along with Digital Marketing and a Tech Talent/Diversity panel. I found this worked quite well, as it ensured the audience got a variety of speakers.

The cloud session on “challenging the fear” was an excellent one to start with, as it had a mix of SMEs from companies such as Kingspan (John Shaw), Trend Micro (Simon Walsh) and Barricade (David Coallier), but also representation from the legal profession in the form of Michael Valley (barrister) and Noel Doherty (solicitor), who spoke at length on cloud governance. This session was chaired by Anton Savage of The Communications Clinic, who hosted a panel discussion with all five presenters at the end.


All of the sections were split by networking opportunities in the exhibition halls, where companies from the region presented their organisations, and some even demonstrated their wares. The atmosphere was great, with lots of chatter, tweeting and drinking of coffee! 😀


The second section was a panel session on Tech Talent, chaired by Paddy O’Connell from Berkley, with facilitators Meghan M Biro, founder and CEO of TalentCulture, and Kevin Grossman, who co-founded and co-hosts the TalentCulture #TChat show with Meghan. They later presented their #TChat show live from the Clarion hotel Cork. It was awesome!

Such variety (no pun intended!) in the panel, with David Parry Jones, VP UKI at VMware, and Noelle Burke, Head of HR at Microsoft Ireland, representing industry; Michael Loftus, Head of the Faculty of Engineering and Science at CIT, representing academia; and the hugely impressive student Ciara Judge, one of the Kinsale winners of the 2013 Google Science Fair. Everyone inspired in their own way, and the dynamic at lunchtime was one of motivation, hope and leadership.


Having started my own personal digital marketing brand last year, learning by making mistakes, I was exceptionally excited by our third section, Digital Marketing. Again, Anton did an incredible job of asking the right questions, and effortless listening followed. To listen to experts such as Meghan, Antonio Santos, Niall Harbison and Raluca Saceanu was a privilege, and I also got the opportunity to speak with them directly (as did many others). This was true of all the speakers throughout the day. I believe a huge number of people got lots of what I call “advice snippets” that they can take away to grow their own brand.


The last session was on an area close to my heart, the Internet of Everything (IoE), and I had the privilege of chairing it. We had speakers from Climote (Derek Roddy), my future employer Tyco (Craig Trivelpiece), Salesforce (Carl Dempsey), Dell (Marc Flanagan) and Xanadu (David Mills). All these companies are at different stages of their IoE journey, but the message was consistent: IoE is going to make a huge impact on our smart futures. I really liked how Craig spoke of “if you want to improve something, measure it”, and how Tyco are looking at predictive maintenance and pushing intelligence/insight back out to the devices. Derek showed how Climote is changing how we live, and David did the same in relation to sport. Marc gave an excellent account of Dell’s practical approach to IoT, showing the capabilities needed for IoE projects. Carl got me really excited about Salesforce’s plans in the IoE space. The session closed out the event well, and the numbers in attendance stayed consistent.

Having attended a huge number of tech events over the years, it was great to see, once again, year-on-year growth of Munster’s premier technology summit. The atmosphere was electric all day, both locally and on Twitter. The tweet wall was a big success, and we expect that next year’s event will be bigger and better again.


The black tie dinner was also a huge success, with the Millennium Hall in City Hall packed to capacity. Marnie O’Leary Daly, along with Emer from Lockdown model agency, put on an amazing dinner (superb catering by Brooks) and fashion show, with 3D wearables fashion provided by Aoibheann Daly from LoveandRobots and Rachael Garrett from Limerick School of Art and Design (@LSAD). Special mention to FabLab too for helping Rachael get her garments ready. It really was a spectacular evening. The Clarion hotel was also hugely supportive of the night element. (Photos to follow!) Emer will also blog on the night’s fashion soon and do a much better job than me!

it@Cork European Technology Summit 2016. Watch this space.

If you are interested in getting involved in 2016, please contact Sarah Walsh at it@Cork.

Case Study: IoT Technology Platform – ThingWorx [10]

In my previous blog, I mentioned some platform design considerations at the outset. In this blog, I discuss one such Platform that has gained significant traction in the industry in recent times.

About ThingWorx [10]

ThingWorx is one of the first software platforms designed to build and run the applications of the connected IoT world. ThingWorx reduces the cost, time, and risk required to build innovative Machine-to-Machine (M2M) and Internet of Things (IoT) applications.

The ThingWorx platform provides a complete application design, runtime, and intelligence environment with the below features:

  • Modern and Complete Platform
  • Mashup People, Systems & Machines
  • Deploy 10X Faster with Model-based Development
  • Deploy How You Like
  • Evolve & Grow Your Application Over Time

What ThingWorx did that was really clever was to create a modelling environment based on a graph database that keeps track of thousands of devices communicating with other devices and applications.

“There’s nothing new about gathering and using data to make something better. What is new, and complex, is getting these things that are now web-enabled to take better advantage of the IoT. This requires application developers to rethink how they collect, analyze, manipulate and interact with information,” said Russ Fadel, CEO, ThingWorx [9]. “ThingWorx is the first software platform on the market designed to build and run applications in the connected IoT world and offers a fully integrated and pre-architected solution that covers connectivity, event processing, analytics, storage and presentation of any kind of M2M and IoT data. Our goal is to provide customers with instant insight into collected data from these smart, connected things so they can be proactive and address issues before they happen in a smarter way than previously able.” [10]

Figure 7: ThingWorx Architecture [10]

Features [10]

ThingWorx Composer™

ThingWorx Composer is an end-to-end application modeling environment designed to help you easily build the unique applications of today’s connected world. Composer makes it easy to model the Things, Business Logic, Visualization, Data Storage, Collaboration, and Security required for a connected application.

Codeless Mashup Builder

ThingWorx’s “drag and drop” Mashup Builder empowers developers and business users to rapidly create rich, interactive applications, real-time dashboards, collaborative workspaces, and mobile interfaces without the need for coding. This next-generation application builder reduces development time and produces high-quality, scalable connected applications, allowing companies to accelerate the pace at which they deliver value-add solutions, resulting in greater market share against new and existing competitors.

Event-Driven Execution and “3D” Storage

ThingWorx’s event-driven execution engine and 3-Dimensional storage allow companies to make business sense of the massive amounts of data from their people, systems, and connected “Things”, making the data useful and actionable. The platform supports scale requirements for millions of devices, and provides the connectivity, storage, analysis, execution, and collaboration capabilities required for applications in today’s connected world. It also features a data collection engine that provides unified, semantic storage for time-series, structured, and social data at rates 10X faster than traditional RDBs.

Search-based Intelligence

ThingWorx SQUEAL™ (Search, Query, and Analysis) brings Search to the world of connected devices and distributed data. With SQUEAL’s interactive search capabilities, users can now correlate data that delivers answers to key business questions. Pertinent and related collaboration data, line-of-business system records, and equipment data get returned in a single search, speeding problem resolution and enabling innovation.

Collaboration

ThingWorx dynamically and virtually brings together people, systems, and connected equipment, and utilizes live collaboration sessions that help individuals or teams solve problems faster. The ThingWorx data store becomes the basis of context aware collaboration and interaction among the systems users, further enhancing its value. Additionally, the tribal knowledge exposed during the process is automatically captured and indexed for use in future troubleshooting activities.

End of Case Study

References 

[10] ThingWorx: About ThingWorx, http://www.thingworx.com/

Platform Architecture Pre Considerations for IoT

Apart from the sheer volume of data generated by IoT devices, there is also a huge number of different data customers’ requirements, both known and unknown, that will need to be considered. In this regard, the platform technology will need to be agile enough to meet this variation. How will it scale both horizontally and vertically to ensure sustainability? I started to think of profiling requirements, and of giving a personality to each IoT customer type, so that the platform can morph and adjust itself based not only on the inputs (data type, frequency, format, lifetime), but also on the outputs it needs to provide.
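One way to picture such a customer “personality” is as a small profile the platform can inspect when deciding how to serve a tenant. This is a minimal sketch under assumed names (the fields and the placement rule are illustrative, not a real product’s schema):

```python
from dataclasses import dataclass

@dataclass
class CustomerProfile:
    """Hypothetical 'personality' for an IoT platform tenant."""
    data_type: str       # e.g. "time-series"
    frequency_hz: float  # expected sample rate
    data_format: str     # e.g. "JSON", "binary"
    lifetime_days: int   # how long the data must be retained
    outputs: tuple       # required outputs, e.g. ("dashboard", "alerts")

def placement(profile):
    # Toy rule: short-lived, high-frequency data is served from an edge
    # cache; longer-lived data lands in the central data lake.
    if profile.lifetime_days <= 7 and profile.frequency_hz >= 1.0:
        return "edge-cache"
    return "data-lake"

home = CustomerProfile("time-series", 1.0, "JSON", 2, ("alerts",))
archive = CustomerProfile("time-series", 0.01, "JSON", 3650, ("reports",))
print(placement(home), placement(archive))  # edge-cache data-lake
```

The value of the profile is that the platform can morph per tenant from declared inputs and outputs, rather than hard-coding one behaviour for everyone.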

Data latency will also be a requirement that any platform will need to firstly understand, and then address, depending on the application and customer requirements. In an interesting discussion today in Silicon Valley with Jeff Davis (my original hiring manager in EMC, and now senior director of the xGMO group looking at operations cloud, analytics and infrastructure services), he mentioned that at a previous company in the sensor business, latency represented a huge challenge, especially as the amount of data grew exponentially. We chatted about how the consumer of today wants their interactions with devices and technology to be instant. How long will people be willing to wait for smart light bulbs and switches? What if my devices are distributed? More importantly, Jeff outlined a key question: “How much are consumers willing to pay for the added services provided by adding ‘smarts’ to standard everyday sensors?” This is an “understand the market” question, and should be a consideration for anyone looking at building an IoT platform.

When one considers that most applications in the IoT space might require more than one industry working together, cross-collaboration is key to making it all work. Consider some of the taxi apps in use currently: the taxi company provides the car locations, the application offers information on those locations, banking is used to pay for the journey from your account, and perhaps an advertisement is shown on your receipt. If a suitable arrangement is not formed between the various IT companies involved, it becomes too easy for the “blame game” to ruin the user’s experience of the application when something goes wrong.

Central to satisfying both the varying requirements of customers and latency management will be the concept of a customer or business data lake, powered by Hadoop or Spark technology, which will form the primary storage and processing in the data center. There is also the option of tiering to help address the variation in requirements for the platform, with the possibility of sending the “big hitting” data, which brings the most value in close to real time, to an in-memory database, providing a fast cache for insightful analytics. In a later blog post I will elaborate greatly on this paragraph, so stay tuned. If the same dataset can be used by multiple applications, in a multi-tenant schema, then there will be clear orchestration challenges in ensuring that the data can be processed in real time. Other features of any data architecture for IoT could also include:

  • Multiple Data Format Support
  • Real Time Processing
  • High Volume Data Transfer
  • Geographically Agnostic
  • Data Lake Archival and Snipping
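The tiering idea above can be sketched as a tiny ingest path: every record is stored in the lake, and the “big hitting” metrics are additionally kept in an in-memory cache for near-real-time analytics. All names here are illustrative only:

```python
# Toy "store everything, cache the big hitters" ingest path.
lake = []        # stands in for Hadoop/Spark-backed storage
hot_cache = {}   # stands in for an in-memory database
HOT_METRICS = {"power_kw", "alarm"}

def ingest(record):
    lake.append(record)  # the data lake stores everything
    if record["metric"] in HOT_METRICS:
        hot_cache[record["metric"]] = record  # latest value, fast access

ingest({"metric": "power_kw", "value": 3.2})
ingest({"metric": "humidity", "value": 54})
print(len(lake), sorted(hot_cache))  # 2 ['power_kw']
```

The design choice is that the cache never becomes the system of record; it only accelerates the queries that must be answered in close to real time.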

As with all technology, IoT will evolve, which means that we will build on top of previous technologies, and new technologies will add to the ecosystem. The enterprise data warehouse will continue to play an important role, but a series of technology platforms will be necessary. While numerous platforms have been and will be created, one such platform, ThingWorx, is the subject of a case study in my next blog.

IoT Impact on the Manufacturing Industry (Part 2)

Continuing on from my last blog post, another example of IoT use in manufacturing would be for asset management to distribute work orders and configurations to the tools at the different stages of production. And vice versa: calibration information can be fed back to the Enterprise Resource Planning (ERP) system to associate it with the bill of materials (BOM). Big data and NoSQL technologies are enablers in this regard, as they allow for the management of huge volumes of heterogeneous, multi-structured data about the production process, from the data types discussed through to images from AOI (Automated Optical Inspection) systems and other production modules. With recalls a concern in global manufacturing, this can be an ally in the fight to keep costs down.
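The calibration feedback loop can be pictured as a small message keyed by work order, so the ERP can tie tool data back to the BOM. The field names and identifiers here are hypothetical, purely to illustrate the linkage:

```python
# Hypothetical feedback message: a tool reports calibration data keyed by
# work order, so the ERP can associate it with the bill of materials.
def calibration_event(work_order, tool_id, bom_id, offsets_um):
    return {
        "work_order": work_order,
        "tool": tool_id,
        "bom": bom_id,             # link back to the BOM in the ERP
        "offsets_um": offsets_um,  # placement offsets in micrometres
    }

evt = calibration_event("WO-1042", "SMT-07", "BOM-88A", {"x": 12, "y": -4})
print(evt["bom"])  # BOM-88A
```

With every calibration event carrying the BOM reference, a recall investigation can trace from affected parts straight to the tool settings in force when they were placed.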

Another area where IoT can have an impact is in intelligent edge devices and their use in improving supply chain optimization and the modularity of manufacturing. Consider surface mount technology (SMT), where there are so many moving parts, calibrations and types of technology used in the placement and verification of board-level components. IoT sensors could be utilized to centralize SMT line asset management and to read calibration information via the factory WLAN. The asset management can form the link between the SMT tools and the ERP and MES (Manufacturing Execution Systems) that oversee the manufacturing process.

A challenge that presents itself to the manufacturing industry is the ageing workforce, which means that anything that speeds up the manufacturing process is critical. The advancement in mobile technology is a key enabler in ensuring that passing information to the shop floor becomes quicker, improving the response time, visibility, and accessibility of operations. The recent advancement of wearables will also have an impact, through enhanced visibility on the shop floor.

Building Blocks for IoT in Manufacturing

Business owners need to look at four technology elements that provide the foundation for smart manufacturing. These include (but are not limited to):

  • Security: IT security is a major obstacle to setting up smart factories. Operations managers need to make sure that necessary safeguards are built into the solution including security procedures such as physical building security, hardware encryption and network security for data in transit. Security and networking solutions must also be engineered to withstand harsh environmental conditions, such as moisture and temperature, that aren’t present in typical networks. Identity and authentication structures will also need to be updated to support such “things” as well as people.
  • More Advanced Networking: Smarter manufacturing environments need a standardized IP-centric network that will enable all the devices/sensors in a plant to communicate to enterprise business systems. Cisco research states that only 4 percent of the devices on the manufacturing floor are connected to a network. A standard IP network also makes it easier to connect and collaborate with suppliers and customers to improve supply chain visibility. Manufacturers need robust networks that can cope with Radio Frequency (RF) challenges in the plant, harsher environmental conditions and need stability for transmission of alarms and real-time data processing.
  • Big Data Analytics: While manufacturers have been generating big data for numerous years, companies have had limited ability to store, analyze and effectively use all the data that was available to them, especially in real time. New big data processing tools are enabling real-time data stream analysis that can provide dramatic improvements in real time problem solving and cost avoidance. Big data and analytics will be the foundation for areas such as forecasting, proactive maintenance and automation.
  • Engineering Software Systems: Today’s IoT data is different from the data we use to operate our systems. It requires collecting a wide range of data from a variety of sensors. These software systems and models must translate information from the physical world into actionable insight that can be used by humans and machines. Toyota is using Rockwell’s software for real-time error corrections in the plant, and has minimized rework and scrap rates in its Alabama plant, resulting in an annual cost saving of $550,000 [3].
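The big data analytics building block above talks about real-time data stream analysis for problem solving and proactive maintenance. A hedged sketch of that idea: flag a machine reading as anomalous when it drifts more than k standard deviations from a rolling baseline. The class name, window size, and threshold are illustrative assumptions, not a product API:

```python
# Rolling-window anomaly detection on a live sensor stream (stdlib only).
from collections import deque
from statistics import mean, stdev

class RollingAnomalyDetector:
    def __init__(self, window=20, k=3.0):
        self.readings = deque(maxlen=window)  # recent baseline readings
        self.k = k                            # how many sigmas count as anomalous

    def observe(self, value):
        """Return True if value is anomalous relative to the rolling window."""
        if len(self.readings) >= 2:
            mu, sigma = mean(self.readings), stdev(self.readings)
            anomalous = sigma > 0 and abs(value - mu) > self.k * sigma
        else:
            anomalous = False  # not enough history yet
        self.readings.append(value)
        return anomalous

det = RollingAnomalyDetector(window=10, k=3.0)
# A steady bearing temperature around 50 degrees...
normal = [det.observe(v) for v in [50.0, 50.2, 49.9, 50.1, 50.0, 49.8, 50.1, 50.0]]
# ...then a sudden spike that should trigger a maintenance alert.
spike = det.observe(75.0)
```

In a plant, the alert path from such a detector would feed the MES or a maintenance ticketing system rather than a boolean, but the decision logic is the same shape.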

Building blocks for end-to-end infrastructure enabling manufacturing intelligence from the factory floor to the data-center (Intel) [4]
With IoT, IP networks and analytics, manufacturers can become more efficient, improve worker safety and offer new exciting business models. IoT will help manufacturers improve resource efficiency, safety and return on assets. Manufacturers that master this new dynamic will have a variety of new opportunities for revenue growth and cost savings.

References

3: How IoT will help manufacturing

http://www.industryweek.com/blog/how-will-internet-things-help-manufacturing

4: Industrial Optimization IoT (Intel)

http://www.intel.ie/content/dam/www/public/us/en/documents/white-papers/industrial-optimizing-manufacturing-with-iot-paper.pdf

IoT Impact on the Manufacturing Industry (Part 1)

“Industry 4.0” and “Smart Factory” are some of the terms used to describe the technological and social revolution that promises to change the current industrial landscape. Industry 1.0 was the invention of mechanical assistance; Industry 2.0 was mass production, pioneered by Henry Ford; Industry 3.0 brought electronics and control systems to the shop floor; and Industry 4.0 is peer-to-peer communication between products, systems and machines. IoT will clearly have a different impact depending on the application and industry, and one of particular interest, given its emphasis on process, is manufacturing. Compared to more intangible realms such as retail, manufacturing is about physical objects and how we can bring them to the consumer in a more efficient and automated way. The manufacturing landscape is ever changing, with automation through robotics the most recent enabler.

Challenges and Possibilities of IoT and Manufacturing

Gartner analyst Simon Jacobson sees five immediate challenges and possibilities posed by IoT for the manufacturing industry [1].

1. CIOs and manufacturing leads will have to move even more rapidly

Jacobson says manufacturers have moved heavily toward individualization and mass customization as part of the luxury of connected products. But in order to enable that, you have to maintain alignment with supply management, logistics functions and partners to make sure all service levels are maintained: “I have to have knowledge of my processes and optimization of my processes at a hyper level, not just simply understanding at week’s end or at the end of the shift where I need to make adjustments and improve,” Jacobson said.

2. Security must be reimagined

A connected enterprise means you can no longer simply physically secure the facility; approaches from mobile and cloud-based architectures must be blended with industrial control and automation, ensuring information is properly managed. Jacobson says the challenge will be to merge the skills of engineers and process control teams with IT and, more importantly, to unify their disparate approaches to security.

3. IoT will create more visibility in process performance

There has always been a form of automation and control in manufacturing, but implementing new business applications powered by IoT will allow you to connect devices to the factory network and know their tolerances: “Being able to connect those dots and derive contexts of how processes are performing is absolutely going to be where the return on investment is coming from,” Jacobson said.

4. Predictive maintenance can generate revenue for OEMs

Asset performance management is of high value today. This is the ability to drive availability, minimize costs and reduce operational risks by capturing and analyzing data. Original Equipment Manufacturers (OEMs) have already started creating revenue by using IoT-enabled tools like predictive maintenance in order to guarantee uptime, outcomes and certain levels of performance for the customer: “When you guarantee these kinds of outcomes to the customers, you have to look at this from two different perspectives, how I monetize this but also how my customer monetizes this,” Jacobson said.

5. Production will play a new role in the manufacturing value chain

The boundaries between the physical and digital worlds are blurring. Chief Information Officers (CIOs) and manufacturing strategists can use the IoT, big data and cloud to redefine the role production plays in the manufacturing value chain. It no longer has to be restricted to being a cost center, and this has all to do with the new ability to not just accelerate but innovate on the factory floor. It’s the CIO’s challenge to keep pace with these new competitive changes.

Figure 10: Real Time Intelligence on the Shop Floor [2]
In my next blog post, I will continue this discussion on IoT and Manufacturing, giving further use cases, and outlining the building blocks for IoT in Manufacturing.

References:

1: Gartner Best Practices for IoT in Manufacturing

https://www.gartner.com/doc/2899318?ref=AnalystProfile

2: Building Blocks for a Smart Plant

http://www.mbtmag.com/articles/2014/10/manufacturing-transformations-building-blocks-future-smart-plant

Pre-Cloud Security Considerations in IoT

Introduction

Over the past decade, hybrid cloud adoption has steadily increased, with closed networks less and less the option of choice. But this comes at a cost to security and trust. As we become more dependent on intelligent devices in our lives, how do we ensure that the data within this web is not compromised by external threats that could endanger our personal safety?

As the adoption of IoT increases, so does the risk of hackers getting at our personal information. As Alan Webber points out on his RSA blog [6], there are three key risk areas, or bubbles, that companies need to be aware of.

1: Fully enabled Linux/Windows OS systems: This area concerns devices that are not part of a normal IT infrastructure but still run full operating systems, such as Linux or Windows. These operating systems have well-known vulnerabilities, and when the devices are deployed in the “free world”, they are far less visible to IT admins.

2: Building Management Systems (BMS): This pertains to infrastructure systems that assist in the management of buildings, such as fire detection, suppression, physical security systems and more. These are not usually seen as under threat, yet shutting down a fire escape alarm system could set up a break-in scenario.

3: Industry Specific Devices: This area covers devices that assist a particular industry, such as manufacturing, navigation, or supply chain management systems. For example, in the case of a supply chain management system, route and departure times for shipments can be intercepted, which could lead to a shipment being intercepted and rerouted to another geographical location.

So how do we guard against these types of risks, and make both the devices themselves and the web of connected devices less dumb? Security must be looked at holistically to begin with, with end-to-end security systems employed to ensure system-level safety, and device-level embedded control software used to ensure data integrity from edge to cloud.

Data routing must also be taken seriously from a security standpoint. For example, smart meters generally do not push their data to a gateway continuously, but send it to a data collection hub, which then forwards it in a single bulk packet to the gateway. Whilst the gateway might have an acceptable security policy, what about the data collection hub? This raises a major challenge: how does one micro-manage all the various security systems their data might migrate across?
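The smart-meter pattern above can be sketched in a few lines: readings accumulate at a collection hub, which forwards one bulk packet to the gateway. Because the hub is a potential weak point, the sketch attaches an integrity tag (HMAC) at the edge so the gateway can detect tampering in transit. The class names and the shared-key scheme are assumptions made for the illustration:

```python
# Edge -> collection hub -> gateway, with an HMAC guarding the bulk packet.
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key-not-for-production"  # illustrative; real keys need provisioning

def tag(payload: bytes) -> str:
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

class CollectionHub:
    def __init__(self, batch_size=3):
        self.buffer = []
        self.batch_size = batch_size
        self.sent = []  # stands in for the uplink to the gateway

    def ingest(self, reading):
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            packet = json.dumps(self.buffer).encode()
            self.sent.append({"packet": packet, "hmac": tag(packet)})
            self.buffer = []

hub = CollectionHub(batch_size=3)
for kwh in [1.2, 0.9, 1.4]:
    hub.ingest({"meter": "M-42", "kwh": kwh})

bulk = hub.sent[0]
# The gateway can verify the packet was not altered at the hub:
verified = hmac.compare_digest(bulk["hmac"], tag(bulk["packet"]))
```

The open question from the text remains: a tag only helps if key management spans every hop, which is precisely the micro-management problem.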

Security Design Considerations

Early-stage IoT devices were unfortunately often designed without physical security in mind, so security officers need to be conscious of where their security provisioning is focused and located.

Applying the security design at the device level is not the most common approach (as with on-device storage), because the cost and limited capacity of these devices work against it; the devices themselves focus on consistency of communication and message integrity. Usually, the more complex security design is deployed up front within the web services that sit in front of, and interact with, the devices. The prediction is that as the devices evolve, and nanotechnology becomes more and more of an enabler in this space, the security design will move closer to the devices, before eventually becoming embedded.

It is proposed that shared cloud-based storage will play a pivotal role in combating the data volume problem, but not without its issues. How do we handle identification and authentication? How do we ensure adequate data governance? Partnerships will be necessary between security officers and cloud providers to ensure these questions are answered.

Searching for the holy grail of 100% threat avoidance is futile, given the number of players in an entire IoT ecosystem. Whilst cloud service providers own their own infrastructure, it is very difficult for them to know whether the data they receive has been compromised. One way to reduce this risk is to use metadata to build “smarts” into the data, drawn from typical known sets, as it transitions from edge to cloud. A useful analogy is a nightclub security guard checking potential clients at the door: “What’s your name?” (what type of data are you), “Where have you been tonight?” (what’s your migration path), “How many drinks have you had?” (what transactions happened on your data).
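A playful rendering of that nightclub check as code: before admitting a packet to the cloud tier, inspect its metadata for what type it is, which hops it migrated across, and how many transformations touched it. The field names and accepted sets are entirely hypothetical:

```python
# Metadata "bouncer" at the cloud boundary: admit or reject based on provenance.

KNOWN_TYPES = {"telemetry", "calibration", "audit"}
TRUSTED_HOPS = {"edge-01", "hub-a", "gateway-eu"}
MAX_TRANSFORMS = 5

def admit(packet):
    """Return (admitted, reason) for a packet arriving at the cloud boundary."""
    if packet.get("data_type") not in KNOWN_TYPES:
        return False, "unknown data type"          # "what's your name?"
    if not set(packet.get("migration_path", [])) <= TRUSTED_HOPS:
        return False, "untrusted hop in path"      # "where have you been tonight?"
    if packet.get("transform_count", 0) > MAX_TRANSFORMS:
        return False, "too many transformations"   # "how many drinks have you had?"
    return True, "ok"

good = admit({"data_type": "telemetry",
              "migration_path": ["edge-01", "hub-a"],
              "transform_count": 2})
bad = admit({"data_type": "telemetry",
             "migration_path": ["edge-01", "rogue-hub"],
             "transform_count": 2})
```

The check is only as strong as the integrity of the metadata itself, which loops back to the end-to-end security argument above.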

IoT Security and Chip Design

One area that could bring about increased data privacy is the wider use of “Trusted Execution Environments” (TEEs): a secure area in the main processor of the device. A TEE ensures that independent processing can occur on critical data within the silicon itself, enabling trusted applications to run that enforce confidentiality and integrity, and protecting against unauthorized cloning or object impersonation by remove-and-replace. As a real-world example, a home owner tampering with their smart meter to reduce their energy bill is one scenario a TEE would prevent.

If cloud services companies can somehow increase their influence on IoT device design (beyond the cellular applications where TEEs are already popular), then utilizing technology such as this will ensure less risk once the data reaches the cloud. Collaboration between all parties should be increased so that best practice can be established across the entire IoT landscape.

Figure 1. Generalized framework for a secure SoC [7]
References:

6 RSA RISKS of IOT

https://blogs.rsa.com/3-key-risk-areas-internet-things/

7: EDN SOC TE

http://www.edn.com/design/systems-design/4402964/2/Using-virtualization-to-implement-a-scalable-trusted-execution-environment-in-secure-SoCs

IoT meets Data Intelligence: Instant Chemistry

Even in the ideal world of a perfect network topology, a web of sensors, a security profile, a suitable data center design, and lots of applications for processing and analyzing, one thing is constant across all of these: the data itself. Data science is much talked about, and careers have been built on the concept. It is normally aimed at the low-hanging fruit of a data set, the things that are easily measured. Science will take you so far, but it is data intelligence that shows the true value, with the capability to predict the impact of actions, track it over time, and build modelling engines to solve future problems.

The data set for data intelligence also differs from that of data science, which relies on lots and lots of large data sets (Facebook working out the effectiveness of its changes and features, for example). It is more complex, smaller even, and can be a data set contained in a single process or building. Imagine a hospital’s set of machines producing live data to an analytics engine, with historical models compared against the live data to gauge risk to patients. It can have a real, tangible benefit to quality of life. Commonly called “Operational Intelligence”, the idea is to apply real-time analytics to live data with very low latency. It’s all about creating a complete picture: historical data and models working with live data to provide a solution that can potentially transform all kinds of industry.

At the core of any system of this kind is decision making, and again one must strive to make it as intelligent as possible. There are two types: static decision making and dynamic decision making. With the assistance of mathematical models and algorithms, any IoT data set can be analyzed for the further implications of alternative actions, and one would predict that the efficiency of decision making increases as a result.

There is scope to apply such a solution at the IoT device level. Given the limited storage capacity on the devices themselves, one option is a rolling deterministic algorithm that analyses a set of sensor readings and outputs a decision on whether or not to send a particular measurement to the intelligent gateway or cloud service.
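One possible shape for that rolling deterministic algorithm: keep a small window of recent readings and forward a value to the gateway only when it moves meaningfully from the running average. The window size and threshold here are illustrative assumptions:

```python
# On-device send/no-send filter over a rolling window of sensor readings.
from collections import deque

class SendFilter:
    def __init__(self, window=8, threshold=0.5):
        self.window = deque(maxlen=window)  # bounded memory suits a constrained device
        self.threshold = threshold

    def should_send(self, value):
        """Decide whether this reading is worth transmitting upstream."""
        if not self.window:
            decision = True  # always send the first reading
        else:
            baseline = sum(self.window) / len(self.window)
            decision = abs(value - baseline) >= self.threshold
        self.window.append(value)
        return decision

f = SendFilter(window=4, threshold=0.5)
# A room temperature that holds steady, jumps, then settles again:
decisions = [f.should_send(v) for v in [21.0, 21.1, 21.0, 23.0, 21.05]]
```

Only the first reading and the jump get transmitted; the steady readings are suppressed, which is the bandwidth and storage saving the text is after.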

Another proposed on-device implementation is a deviation-from-correctness model such as the Mahalanobis-Taguchi System (MTS), an information pattern technology that has been used in a range of diagnostic applications to support quantitative decisions by constructing a multivariate measurement scale using data analytic methods. In the MTS approach, the Mahalanobis distance (MD, a multivariate measure) measures the degree of abnormality of patterns, and principles of Taguchi methods evaluate the accuracy of predictions based on the constructed scale. The advantage of MD is that it considers the correlations between variables, which are essential in pattern analysis. Given that it works on a relatively small data set (the more historical samples, the better the model to compare against), it could be utilized in the hospital diagnosis example: perhaps the clinician needs a quick on-device prediction of how close a patient’s measurement is to a sample set of recent hospital measurements.
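A minimal, stdlib-only illustration of the Mahalanobis distance at the heart of MTS: score a new observation against a small "healthy" reference set of two correlated variables, with the 2x2 covariance matrix inverted analytically. The clinical numbers are invented for the sketch:

```python
# Mahalanobis distance of a 2-variable observation from a reference set.

def mahalanobis_2d(x, reference):
    """Mahalanobis distance of point x from a list of (v1, v2) samples."""
    n = len(reference)
    mx = sum(p[0] for p in reference) / n
    my = sum(p[1] for p in reference) / n
    # Sample covariance matrix [[sxx, sxy], [sxy, syy]]
    sxx = sum((p[0] - mx) ** 2 for p in reference) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in reference) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in reference) / (n - 1)
    det = sxx * syy - sxy * sxy
    dx, dy = x[0] - mx, x[1] - my
    # Quadratic form d^T S^-1 d, using the analytic inverse of a 2x2 matrix
    d2 = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
    return d2 ** 0.5

# Hypothetical reference set: (heart rate, systolic BP) from recent patients.
healthy = [(70, 118), (72, 121), (68, 115), (75, 125), (71, 120)]
typical = mahalanobis_2d((71, 119), healthy)
abnormal = mahalanobis_2d((95, 160), healthy)
```

Because the distance accounts for the correlation between the two variables, a reading that is unusual *jointly* scores far higher than one that merely sits near the edge of either variable's range, which is the property the text highlights.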

Taking this one stage further, if we expanded this to multiple hospitals, could we start to create linked data sets, pooled together to extract intelligence? What if a storm is coming: will it affect my town or house? Imagine sensors on each house tracking the storm in real time, trying to predict its trajectory and track direction changes, with the service then communicating directly with the home owners in its path.

With the premise of open source software established, consider now the concept of open data sets, linked or not. Imagine I was the CEO of a major company in oil and gas, eager to learn from other companies in my sector, and in return willing to let them learn from us through data sets. Tagging data by type (financial, statistical, online statistical, manufacturing, sales, for example) allows a metadata search engine to be created, which can then be used to gain industry-wide insight at the click of a mouse. The tagging is critical: the data is then not simply a format, but descriptive too.

Case Study: Waylay, IoT and Artificial Intelligence [11]

Waylay, an online cloud-native rules engine for any OEM maker, integrator or vendor of smart connected devices, proposes a strong link between IoT and Artificial Intelligence [11].

Waylay proposes a central concept for AI, called the rational agent. By definition, an agent is something that perceives its environment through sensors and acts on it via actuators. An example is a robot that uses camera and sensor technology and performs an action, i.e. “move”, depending on its immediate environment (see Figure 8).

To extend the role of an agent, a rational agent then does the right thing. The right thing might depend on what has happened and what is currently happening in the environment.

Figure 8: Agent and Environment Diagram for AI [11]
Typically, Waylay outlines that an agent consists of an architecture and logic. The architecture allows it to ingest sensor data, run the logic on the data and act upon the outcome.

Waylay has developed a cloud-based agent architecture that observes the environment via software-defined sensors and acts on its environment through software-defined actuators rather than physical devices. A software-defined-sensor can correspond not only to a physical sensor but can also represent social media data, location data, generic API information, etc.

Figure 9: Waylay Cloud Platform and Environment Design [11]
For the logic, Waylay has chosen graph modeling technology, namely Bayesian networks, as the core logical component. Graph modeling is a powerful technology that provides flexibility to match the environmental conditions observed in IoT. Waylay exposes the complete agent as a Representational State Transfer (REST) service, which means the agent, sensors and actuators can be controlled from the outside, and the intelligent agent can be integrated as part of a bigger solution.
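An illustrative perceive-decide-act loop in the spirit of the architecture described above: software-defined sensors feed a logic step, whose outcome drives software-defined actuators. This is a generic sketch under my own naming, not Waylay's actual API, and the simple rule stands in for their Bayesian-network logic:

```python
# One cycle of a rational agent over software-defined sensors and actuators.

def run_agent(sensors, logic, actuators):
    """Perceive via sensors, decide via logic, act via the matching actuator."""
    observations = {name: read() for name, read in sensors.items()}
    decision = logic(observations)
    if decision in actuators:
        return actuators[decision](observations)
    return None

# Software-defined sensors: could wrap a physical probe, a social feed, any API.
sensors = {
    "temperature": lambda: 31.5,
    "occupancy": lambda: True,
}

def logic(obs):
    # Stand-in for graph/Bayesian reasoning: a single crisp rule.
    return "cool" if obs["occupancy"] and obs["temperature"] > 25 else "idle"

# Software-defined actuators: here they just report the action taken.
actuators = {
    "cool": lambda obs: f"AC on (room at {obs['temperature']}C)",
    "idle": lambda obs: "no action",
}

result = run_agent(sensors, logic, actuators)
```

Exposing `run_agent`, the sensor map, and the actuator map behind a REST service would mirror the composability the case study describes, where the agent becomes part of a bigger solution.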

In summary, Waylay has developed a real-time decision making service for IoT applications. It is based on powerful artificial intelligence technology and its API-driven architecture makes it compatible with modern SaaS development practices.

End of Case Study 

Reference:

11: Waylay: Case study AI and IoT

http://www.waylay.io/when-iot-meets-artificial-intelligence/