Releasing Software Developer Superpowers

This article is aimed at anyone looking to gain an edge in building or advancing a software development team in the digital age. The concepts can, to some extent, be applied outside software development. Open to discussion – views are my own.

UX is not just for Customers

User Experience is an ever-growing component of product development, with user-centric design paradigms ensuring that personalisation and consumer/market fit are achieved. From a development team’s point of view, applying some of those user experience concepts to how the team itself works can bring operational efficiency and accelerate product development. For example, what is the experience of each of the developer personas on your team? How do their days translate into user stories? Could interviewing the development community lead to better features for your development culture?

Build Products not Technology

Super important. With developers, there is sometimes an over-emphasis on building features, often for features’ sake. Keeping the lens on the value or “job to be done” for the customer at all times ensures you are building what your customer truly needs. To do this, select and track a set of metrics that measure value for the product, and keep your product development tightly coupled to your customer experience development.

Leverage PaaS to deliver SaaS

This sounds catchy, but it’s becoming the norm. Five years ago, it took a developer a week of development time to do what can now be done in Amazon Web Services or Azure in minutes. This has led to a paradigm shift, where you begin to look at the various platforms and tools available to enable developers to deliver great products to customers. Of course, there will always be custom-developed applications, but you can help your developers by giving them the right toolkit. There is no point reinventing the wheel when off-the-shelf open source components are sitting there, right? Products like Docker and Spring, and concepts like DevOps, are bringing huge value to organisations, enabling the delivery of software and microservices at enhanced speed. That said, the balance between buying off the shelf and building custom is a careful decision at both product and strategic levels.
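
To make the “minutes, not weeks” point concrete, here is a minimal sketch (assuming the boto3 AWS SDK is installed and credentials are already configured; the bucket name is hypothetical) that provisions durable object storage in a single call:

    import boto3

    # Provision durable, replicated object storage for a product in one call,
    # rather than building and operating a storage service by hand.
    s3 = boto3.client("s3", region_name="eu-west-1")
    s3.create_bucket(
        Bucket="my-product-artifacts-demo",  # hypothetical bucket name
        CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
    )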

“The role of a developer is evolving into one like a top chef, where all the ingredients and tools are available; it’s just about getting the recipe right to deliver beautiful products to your customer.”

Create Lean Ninjas!


Evolving the cultural mindset of developers and the organisation toward agile development is super important. Having a critical mass of development resources, plus defined agile processes to deliver business success, can reshape your organisation into one where value creation happens rapidly. However, it’s important to perform ethnographic studies of the organisation to assess its culture. This can help you decide which agile frameworks and practices (kanban, scrum, XP, etc.) will work best to evolve the development life cycle.

Implement the 10% rule

This could be slightly controversial, and it can be hard to do. Developers should aim to spend 10% of their time looking at the new: new technologies, development practices, company direction, conferences, training. Otherwise you will end up with a siloed, mis-skilled pool of superheroes with their powers bottled.

However, with lean ninjas and effective company-wide agile processes, resources and time can be closely aligned to specific projects, avoiding the injection of randomness into the development lifecycle. Developers need time to immerse and focus. If you can’t give them that, or you continuously distract them with mistimed requests, they will leave. If you can enable them, 10% is achievable.

Risk Awareness


We are seeing an evolution in threats to enterprises all over the world, and in a software-driven and software-defined world, getting developers to build security into their designs before products hit the market helps protect companies. Moons ago, everything sat on premises; the demands of consumers mean a myriad of cloud-deployed services are now adding to a complex global technology footprint. If developers know the risk landscape of where they deploy, they can act accordingly. Naturally, lining them up with business leaders on compliance and security also helps on the educational pathway.

Business and Technology Convergence

We are beginning to see not only an evolution in development practices but also a new type of convergence (brought about by lean, agile and other methods), where business roles and technology roles are merging. Business analysts and UX people are being positioned directly into development teams to represent the customer and change the mindset. Technology roles are being positioned directly into business services teams such as HR and finance. This is impacting culture, whereby the savviness in both directions needs to be embraced and developed.


Growth Mindset

Mindset has been mentioned a lot in this article. That’s because it’s hugely important. Having the right culture and mindset can make all the difference to team success. As Carol Dweck explains in her book “Mindset”, mindsets can broadly be categorised into two types: growth and fixed. This applies in all walks of life, but for team building it can be critical.

In a fixed mindset, students believe their basic abilities, their intelligence, their talents, are just fixed traits. They have a certain amount and that’s that, and then their goal becomes to look smart all the time and never look dumb. In a growth mindset, students understand that their talents and abilities can be developed through effort, good teaching and persistence. They don’t necessarily think everyone’s the same or that anyone can be Einstein, but they believe everyone can get smarter if they work at it.

Creating a team where being on a growth curve is the norm and failures are seen as learning can enable a brilliant culture. As Michelangelo said, “I am still learning”. This matters especially as we evolve toward six generations of developers working together. How do we ensure we are creating and mentoring the next set of leaders, from interns through to experienced people?

Check out a TED talk from Carol here – link.

And most importantly … HAVE FUN!

5 Technology trends to consider right now

As we are a month into the second half of 2015, I thought it would be a good time to look at some of the technology trends in motion that will have more influence as we enter 2016.

1: SMAC becomes SMACT

Social, Mobile, Analytics and Cloud (SMAC) has existed for a number of years in enterprise applications. The Internet of Things (IoT) has accelerated as a technology enabler, and hence will begin to be added to SMAC to create SMACT. I introduced this concept in one of my first posts here. The two need each other to succeed and progress: as more and more IoT devices come online, SMAC demand will increase, and IoT will add value to SMAC by spawning new technology directions that can utilize SMAC. The A in SMAC will be affected more than the others, as the new data sets being generated, and the open data sets available for multi-tenancy, will drive new requirements for on-demand insights in real time.

2: Co-Creation

A key tenet of open innovation (mentioned in a previous blog here) is co-creation. As companies take a more outside-in approach to discovering their next business direction, co-creation will be a huge part of it. Whilst the chatter around it is only slowly increasing, co-creation will be a key enabler in the coming years. Industry partners, vendors and consumers will create ecosystems that drive new business models by utilizing analytics and understanding customers at heightened levels. We have seen how disruptive Netflix, Uber and Bitcoin have been in the past few years, and it is expected that co-creation will drive further disruption, in different directions and at increased velocity. Ikea’s home tour is a good example of a company listening to its consumers to understand why they were doing up their homes.

3: Technology and Business Strategy Leadership positions collide

It is expected that the lines between technology and classical business positions will blur, resulting in a series of new positions to drive next-generation technology direction. Technology and business executives will need to be proficient in both areas, and understanding the dependencies of the decisions made in either will be crucial. The rise of roles such as Chief Data Architect (CDA), Chief Digital Officer (CDO) and Chief Governance Officer (CGO) has meant that boardrooms have an increased percentage of technology executives. It is predicted that an organisation’s Chief Technology Officer (CTO) will create a series of direct reports in the areas of data intelligence, data monetisation, futurism and collaboration strategy. These roles will be necessary to assist the CTO in managing digital disruption.

4: Data Monetization

This is a hot topic right now, and one of the pioneers driving a lot of new research in the area is Steve Todd from EMC, along with Dr. Jim Short of the San Diego Supercomputer Center. Whilst you can read extensively on this topic at Steve’s blog, I’ll outline some of the considerations that are prominent for your business. The first is the monetisation of your current and future data assets. Data is the new oil, a form of currency that can drive business metamorphosis, but it can also be of use to others, and so it becomes a saleable asset. We have seen first hand major companies looking to acquire other companies not only for their technology but for their data too (example). Imagine if your store had a considerable data set; I would expect major retailers such as Amazon to be interested in buying it from you to understand street shopper trends. Another aspect to consider is valuing data at all stages of your company’s cycle, from inception through beta to growth. An accurate snapshot of your data assets can increase the valuation of your organisation, and is especially useful in an acquisition. From an internal company data perspective, a key pillar of your data monetization strategy is the architecture on which your data resides, as numerous data silos across an organisation are generally very difficult to even analyse for valuation. The concept of a business data lake can bring huge advantage here.

5: Search will involve more than Google

Currently, a large proportion of search involves online search for information that resides on servers. However, with the increased influence of IoT and the connected world, it is expected that more than the cloud will be searchable. Billions of edge devices should enter the fray, if data and security policies continue to be challenged into being more open. Connected cars, homes and mobile devices could widen the net for any search query. We are seeing the emergence of alpha startups indicating this trend, such as Thingful and Shodan, which act as search engines for the Internet of Things.
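
As a rough sketch of what searching the connected world looks like in practice, here is how a query against Shodan’s Python library might run (this assumes the shodan package is installed; the API key and query are placeholders):

    import shodan

    api = shodan.Shodan("YOUR_API_KEY")  # placeholder key

    # Search for internet-connected devices rather than web pages.
    results = api.search("webcam country:IE")
    print("Devices found:", results["total"])
    for match in results["matches"][:5]:
        print(match["ip_str"], match.get("org", "unknown org"))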

Why IoT practitioners need to “Wide Lens” the concept of a Data Lake

As we transition towards the vast number of devices that will be internet-enabled by 2020 (anything from 50 to 200 billion, experts estimate), it seems that the cloud architectures currently being proposed fall somewhat short on the features required to meet customers’ data requirements in 2020.

I won’t dive deeply into the technology stack of a Data Lake in this post (Ben Greene from Analytics Engines in Belfast, who I visit on Wednesday en route to EnterConf, does a nice job of that in his blog here). As a quick side step: looking at the Analytics Engines website, I saw that customer choice and ease of use are among the architectural pillars of their AE Big Data Analytics Software Stack. Quick to deploy, modular and configurable, with lots of optional high performance appliances. It’s neat to say the least, and I am looking forward to seeing more.

The concept of a Data Lake has a big reputation in current tech chatter, and rightly so: it has huge advantages in enterprise architecture scenarios. Consider the use case of a multinational company with 30,000+ employees, countless geographically spread locations and multiple business functions. Where is all its data? That is normally a challenging question, with multiple databases, repositories and, more recently, Hadoop-enabled technologies storing the company’s data. This is the very reason a business data lake (BDL) is a huge advantage to the corporation. If a company has a Data Architect at its disposal, it can develop a BDL architecture (such as the one shown below, ref: Pivotal) to act as a landing zone for all its enterprise data. This makes a huge amount of sense. Imagine being the CEO of that company: as we see changes to the Data Protection Act(s) over the next decade, the company can take the right steps towards managing, scaling and, most importantly, protecting its data sets. All of this leads to a more effective data governance strategy.

Figure: Pivotal Data Lake reference architecture

Now shift focus to 2020 (or even before). Let’s take a look at the customer landscape. The customers who will require what the concept of a BDL now provides will need far more choice, and won’t necessarily be willing to pay huge sums for that service. Whilst there is some customer choice today, such as Pivotal Cloud Foundry, Amazon Web Services, Google Cloud and Windows Azure, even these services target a consumer base from startup upwards in the business maturity life cycle. The vast majority of cloud services customers in the future will be everyone around us, the homes we live in and beyond, and the requirement to store data in a far-distant data center might not be as critical for them. It is expected they will need far more choice.

Consider the case of building monitoring data, which could be useful to a wider audience as part of a secure linked open data sets (LODs) topology. For example, a smart grid provider might be interested in energy data from all the buildings on the grid, so as to suggest optimal profiles that reduce impact on the grid. Perhaps the provider might even be willing to pay for that data? This is where data valuation discussions come into play, which is outside the scope of this blog. But the building itself, or its tenants, might not need to store all their humidity and temperature data, for example. They might want some quick insight up front, and then choose to bin that data (based on some simple protocol describing the data usage).
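
A toy sketch of such a protocol might look like the following, where a quick insight is always published but the raw readings are only retained for data types someone is willing to pay for (the retention table and field names are invented for illustration):

    from statistics import mean

    # Which raw data types are worth storing, per some agreed usage protocol.
    RETENTION = {"temperature": False, "humidity": False, "energy": True}

    def process(kind, readings):
        # Quick insight up front: a summary is always kept and shared.
        insight = {"kind": kind, "avg": mean(readings), "n": len(readings)}
        # Raw readings are binned unless the protocol marks them as valuable.
        raw = readings if RETENTION.get(kind) else None
        return insight, raw

    insight, raw = process("humidity", [61.0, 63.5, 62.2])
    print(insight, "raw kept:", raw is not None)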

Whilst a BDL is built on the premise of “store everything”, and that will bring value to the organisations monitoring consumers of their resources, individual consumers might not be willing to pay for it.

To close, the key enablers of these concepts are real-time edge analytics and increased data architecture choice. This is beginning to happen. Cisco have introduced edge analytics services into their routers, which is a valid approach to ensuring the consumer has choice, and they are taking the right approach, with different services for different verticals (Retail, IT, Mobility).

In my next blog, edge analytics will be the focus area, where we will dive deeper into the question: “where do we put our compute?”

it@Cork European Technology Summit 2015 – a WOW event!

I wanted to change direction slightly and give an update on an event I had the privilege of being involved with this week: the it@Cork European Technology Summit. The event was held at Cork City Hall on Wednesday May 5th, with a full-day technology summit followed by a black tie dinner and 3D wearables fashion show.

An epic journey over the past few months, with way more ups than downs, resulted in…

1 Day – 4 Sections – 20 Speakers – 4 Chair Speakers – 400+ Day Attendees – #1 Trending on Twitter – 9 Amazing Artisan Food Stalls – Lots of Sponsors – 200+ Night Attendees – 2 Fashion Designers – 1 Model Agency – 10 Models – 2 Fire Dancers – 4 3D Printed Bow Ties!

So how did I arrive there? Last year, Gillian Bergin from EMC asked me to get involved with it@Cork as part of the Tech Talk committee. I’m delighted she did, as over the past few months I got to partake in and help organise some excellent tech talks from a variety of people, including my fellow technologist, Mr Steve Todd of EMC. The tech talk series is just one of many successful strands of it@Cork, holding six high-end, rockstar speakers/panels per year. The series is fully booked until 2016, but if you are a “rockstar” speaker interested in speaking, please contact us directly. From there, James O’Connell of VMware, who passed the tech talk committee chair over to Barry O’Connell, took on the chair of the summit organising committee. James, together with myself and Paddy O’Connell of Berkley Group (known collectively now as the Macroom or Muskerry Mafia 🙂), assisted Sarah Walsh of it@Cork in organising the day summit. The night summit was excellently organised by Marnie O’Leary Daly of VMware.

The event was kicked off by James, followed by an address from Ronan Murphy, chairman of the board of it@Cork and CEO of Smarttech, who spoke about how Cork needs a cluster manager to help drive more employment in the region. More from Ronan here, via the Examiner. Donal Cahalane, from Teamwork.com, gave an insightful talk on how he sees the industry progressing, with excellent advice for everyone from startups through to multinationals.


The four sections throughout the day offered a balanced mix of raw technology (Cloud: challenge the fear; the Internet of Everything) along with digital marketing and a tech talent/diversity panel. I found this worked quite well, as it ensured the audience got a variety of speakers.

The cloud session on “challenging the fear” was an excellent one to start with. It had a mix of SMEs from companies such as Kingspan (John Shaw), Trend Micro (Simon Walsh) and Barricade (David Coallier), but also representation from the legal profession in the form of Michael Valley, Barrister, and Noel Doherty, Solicitor, who spoke at length on cloud governance. The session was chaired by Anton Savage of The Communications Clinic, who hosted a panel discussion with all five presenters at the end.


All of the sections were split by networking opportunities in the exhibition halls, where companies from the region presented their organisations, and some even demonstrated their wares. The atmosphere was great, with lots of chatter, tweeting and drinking of coffee! 😀


The second section was a panel on tech talent, chaired by Paddy O’Connell from Berkley, with facilitators Meghan M Biro, founder and CEO of TalentCulture, and Kevin Grossman, who co-founded and co-hosts the TalentCulture #TChat show with Meghan. They later presented their #TChat show live from the Clarion Hotel Cork. It was awesome!

There was such variety (no pun intended!) on the panel: David Parry Jones, VP UKI VMware, and Noelle Burke, Head of HR Microsoft Ireland, representing industry; Michael Loftus, Head of the Faculty of Engineering and Science at CIT, representing academia; and the hugely impressive student Ciara Judge, one of the Kinsale winners of the 2013 Google Science Award. Everyone inspired in their own way, and the dynamic at lunchtime was one of motivation, hope and leadership.


Having started my own personal digital marketing brand last year, and learned by making mistakes, I was exceptionally excited by our third section: digital marketing. Again, Anton did an incredible job of asking the right questions, and effortless listening followed. To hear experts such as Meghan, Antonio Santos, Niall Harbison and Raluca Saceanu was a privilege, and I also got the opportunity to speak with them directly (as did many others). This was true of all the speakers throughout the day. I believe a huge number of people took away “advice snippets”, as I call them, that they can use to grow their own brands.


The last session was on an area close to my heart, the Internet of Everything (IoE), and I had the privilege of chairing it. We had speakers from Climote (Derek Roddy), my future employer Tyco (Craig Trivelpiece), Salesforce (Carl Dempsey), Dell (Marc Flanagan) and Xanadu (David Mills). These companies are all at different stages of their IoE journeys, but the message was consistent: IoE is going to make a huge impact on our smart futures. I really liked how Craig spoke of “if you want to improve something, measure it”, and how Tyco are looking at predictive maintenance and pushing intelligence/insight back out to the devices. Derek showed how Climote is changing how we live; David did the same in relation to sport. Marc gave an excellent account of Dell’s practical approach to IoT, showing the capabilities needed for IoE projects. Carl got me really excited about Salesforce’s plans in the IoE space. The session closed out the event well, and attendance stayed consistent to the end.

Having attended a huge number of tech events over the years, it was great to see year-on-year growth in Munster’s premier technology summit. The atmosphere was electric all day, both locally and on Twitter. The tweet wall was a big success, and we expect that next year’s event will be bigger and better again.


The black tie dinner was also a huge success, with the Millennium Hall in City Hall packed to capacity. Marnie O’Leary Daly, along with Emer from Lockdown model agency, put on an amazing dinner (superb catering by Brooks) and fashion show, with 3D wearables fashion provided by Aoibheann Daly from LoveandRobots and Rachael Garrett from Limerick School of Art and Design (@LSAD). Special mention to FabLab for helping Rachael get her garments ready. It really was a spectacular evening. The Clarion Hotel was also hugely supportive of the night element. (Photos to follow!) Emer will blog on the night’s fashion soon and do a much better job than me!

it@Cork European Technology Summit 2016. Watch this space.

If you are interested in getting involved in 2016, please contact Sarah Walsh at it@Cork.

Case Study: IoT Technology Platform – ThingWorx [10]

In my previous blog, I mentioned some platform design considerations at the outset. In this blog, I discuss one such platform, which has gained significant traction in the industry in recent times.

About ThingWorx [10]

ThingWorx is one of the first software platforms designed to build and run the applications of the connected IoT world. ThingWorx reduces the cost, time, and risk required to build innovative Machine-to-Machine (M2M) and Internet of Things (IoT) applications.

The ThingWorx platform provides a complete application design, runtime, and intelligence environment with the following features:

  • Modern and Complete Platform
  • Mashup People, Systems & Machines
  • Deploy 10X Faster with Model-based Development
  • Deploy How You Like
  • Evolve & Grow Your Application Over Time

What ThingWorx did that was really clever was to create a modelling environment based on a graph database that keeps track of thousands of devices communicating with other devices and applications.

“There’s nothing new about gathering and using data to make something better. What is new, and complex, is getting these things that are now web-enabled to take better advantage of the IoT. This requires application developers to rethink how they collect, analyze, manipulate and interact with information,” said Russ Fadel, CEO, ThingWorx [9]. “ThingWorx is the first software platform on the market designed to build and run applications in the connected IoT world and offers a fully integrated and pre-architected solution that covers connectivity, event processing, analytics, storage and presentation of any kind of M2M and IoT data. Our goal is to provide customers with instant insight into collected data from these smart, connected things so they can be proactive and address issues before they happen in a smarter way than previously able.” [10]

Figure 7: ThingWorx Architecture [10]

Features [10]

ThingWorx Composer™

ThingWorx Composer is an end-to-end application modeling environment designed to help you easily build the unique applications of today’s connected world. Composer makes it easy to model the Things, Business Logic, Visualization, Data Storage, Collaboration, and Security required for a connected application.

Codeless Mashup Builder

ThingWorx’s “drag and drop” Mashup Builder empowers developers and business users to rapidly create rich, interactive applications, real-time dashboards, collaborative workspaces, and mobile interfaces without the need for coding. This next-generation application builder reduces development time and produces high-quality, scalable connected applications, allowing companies to accelerate the pace at which they deliver value-add solutions and gain market share against new and existing competitors.

Event-Driven Execution and “3D” Storage

ThingWorx’s event-driven execution engine and 3-dimensional storage allow companies to make business sense of the massive amounts of data from their people, systems, and connected “things”, making the data useful and actionable. The platform supports scale requirements for millions of devices, and provides the connectivity, storage, analysis, execution, and collaboration capabilities required for applications in today’s connected world. It also features a data collection engine that provides unified, semantic storage for time-series, structured, and social data at rates 10X faster than traditional RDBs.

Search-based Intelligence

ThingWorx SQUEAL™ (Search, Query, and Analysis) brings Search to the world of connected devices and distributed data. With SQUEAL’s interactive search capabilities, users can now correlate data that delivers answers to key business questions. Pertinent and related collaboration data, line-of-business system records, and equipment data get returned in a single search, speeding problem resolution and enabling innovation.

Collaboration

ThingWorx dynamically and virtually brings together people, systems, and connected equipment, and utilizes live collaboration sessions that help individuals or teams solve problems faster. The ThingWorx data store becomes the basis of context-aware collaboration and interaction among the system’s users, further enhancing its value. Additionally, the tribal knowledge exposed during the process is automatically captured and indexed for use in future troubleshooting activities.

End of Case Study

References 

10: ThingWorx: About ThingWorx

http://www.thingworx.com/

Platform Architecture Pre-Considerations for IoT

Apart from the sheer volume of data generated by IoT devices, there are also a huge number of different data customer requirements, both known and unknown, that need to be considered. The platform technology will need to be agile enough to meet this variation. How will it scale both horizontally and vertically to ensure sustainability? I started to think about profiling requirements, and about giving a personality to each IoT customer type, so that the platform can morph and adjust itself based not only on the inputs (data type, frequency, format, lifetime) but also on the outputs it needs to provide.
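
One way to capture that “personality” is a simple profile the platform could inspect when deciding how to provision itself; the field names below are assumptions for illustration only:

    from dataclasses import dataclass

    @dataclass
    class IoTCustomerProfile:
        data_type: str        # e.g. "time-series", "image", "event"
        frequency_hz: float   # expected ingest rate
        data_format: str      # e.g. "json", "protobuf"
        lifetime_days: int    # how long the data stays relevant
        outputs: tuple        # e.g. ("realtime_dashboard", "batch_report")

    profile = IoTCustomerProfile("time-series", 10.0, "json", 30,
                                 ("realtime_dashboard",))

    # A crude placement heuristic the platform might apply to this profile.
    needs_realtime_tier = ("realtime_dashboard" in profile.outputs
                           and profile.frequency_hz >= 1.0)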

Data latency is another requirement that any platform will need to first understand and then address, depending on the application and customer requirements. In an interesting discussion today in Silicon Valley with Jeff Davis (my original hiring manager in EMC, and now senior director of the xGMO group looking after operations cloud, analytics and infrastructure services), he mentioned that, having worked at a sensor company previously, latency represented a huge challenge, especially when the amount of data grew exponentially. We chatted at length about how today’s consumer wants their device and technology interactions to be instant. How long will people be willing to wait for smart light bulbs and switches? What if my devices are distributed? Most importantly, Jeff outlined a key question: “How much are consumers willing to pay for the added services provided by adding smarts to standard everyday sensors?” This is an “understand the market” question, and should be a consideration for anyone looking at building an IoT platform.

When one considers that most applications in the IoT space might require more than one industry working together, cross-industry collaboration is key to making it all work. Consider some of the taxi apps in use today: the taxi company provides the car locations, the application offers information on locations, banking is used to pay from your account, and perhaps an advertisement is shown on your receipt. If a suitable arrangement is not formed between the various IT companies involved, it becomes too easy for the “blame game” to ruin the user’s experience of the application when something goes wrong.

Central to satisfying both the varying requirements of customers and latency management will be the concept of a customer or business data lake, powered by Hadoop or Spark technology, which will form the primary storage and processing in the data center. There is also the option of tiering to help address the variation in requirements, with the possibility of sending the “big hitting” data, the data that brings the most value in close to real time, to an in-memory database that provides a fast cache for insightful analytics (see the sketch after the list below). In a later blog post I will elaborate greatly on this paragraph, so stay tuned. If the same dataset can be used by multiple applications in a multi-tenant schema, there will be clear orchestration challenges in ensuring that the data can be processed in real time. Other features of any data architecture for IoT could also include:

  • Multiple Data Format Support
  • Real Time Processing
  • High Volume Data Transfer
  • Geographically Agnostic
  • Data Lake Archival and Snipping
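
As a sketch of the tiering idea above, the routine below lands every record in the lake but also copies “big hitting” records to an in-memory tier; the value score, its threshold, and the plain dict/list stand-ins for an in-memory cache and a data lake are all assumptions:

    def route(record, cache, lake):
        lake.append(record)                     # data lake: store everything
        if record.get("value_score", 0) > 0.8:  # assumed hot-data criterion
            cache[record["id"]] = record        # in-memory tier: low latency

    cache, lake = {}, []
    route({"id": "m1", "value_score": 0.93, "temp_c": 71}, cache, lake)
    route({"id": "m2", "value_score": 0.12, "temp_c": 40}, cache, lake)
    print(len(lake), "in lake;", len(cache), "in fast cache")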

As with all technology, IoT will evolve: we will build on top of previous technologies, and new technologies will add to the ecosystem. The enterprise data warehouse will continue to play an important role, but a series of technology platforms will be necessary. While numerous platforms have been and will be created, one such platform, ThingWorx, is the subject of the case study in my next blog.

IoT Impact on the Manufacturing Industry (Part 2)

Continuing on from my last blog post, another example of IoT use in manufacturing is asset management distributing work orders and configurations to the tools at the different stages of production. And vice versa: calibration information can be fed back to the Enterprise Resource Planning (ERP) system and associated with the bill of materials (BOM). Big data and NoSQL technology are enablers here, as they allow for the management of huge volumes of heterogeneous, multi-structured data about the production process, from the data types already discussed to images from AOI (Automated Optical Inspection) systems and other production modules. With recalls a concern in global manufacturing, this can be an ally in the fight to keep manufacturing costs down.

Another area where IoT can have an impact is intelligent edge devices and their use in improving supply chain optimization and the modularity of manufacturing. Consider surface mount technology (SMT), where there are so many moving parts, calibration requirements and types of technology used in the placement and verification of board-level components. IoT sensors could be utilized to centralize SMT line asset management and to read calibration information via the factory WLAN. The asset management system can form the link between the SMT tools and the ERP and MES (Manufacturing Execution Systems) that oversee the manufacturing process.

A challenge facing the manufacturing industry is its ageing workforce, which means that anything that speeds up the manufacturing process is critical. Advances in mobile technology are a key enabler in making the passing of information to the shop floor quicker, improving the response time, visibility and accessibility of operations. The recent advancement of wearables will also have an impact on visibility on the shop floor.

Building Blocks for IoT in Manufacturing

Business owners need to look at four technology elements that provide the foundation for smart manufacturing. These include (but are not limited to):

  • Security: IT security is a major obstacle to setting up smart factories. Operations managers need to make sure that the necessary safeguards are built into the solution, including security procedures such as physical building security, hardware encryption and network security for data in transit. Security and networking solutions must also be engineered to withstand harsh environmental conditions, such as moisture and temperature, that aren’t present in typical networks. Identity and authentication structures will also need to be updated to support such “things” as well as people.
  • More Advanced Networking: Smarter manufacturing environments need a standardized IP-centric network that enables all the devices/sensors in a plant to communicate with enterprise business systems. Cisco research states that only 4 percent of the devices on the manufacturing floor are connected to a network. A standard IP network also makes it easier to connect and collaborate with suppliers and customers to improve supply chain visibility. Manufacturers need robust networks that can cope with radio frequency (RF) challenges in the plant and harsher environmental conditions, and that offer stability for the transmission of alarms and real-time data processing.
  • Big Data Analytics: While manufacturers have been generating big data for numerous years, companies have had limited ability to store, analyze and effectively use all the data available to them, especially in real time. New big data processing tools are enabling real-time data stream analysis that can provide dramatic improvements in real-time problem solving and cost avoidance (a small sketch of such stream analysis follows this list). Big data and analytics will be the foundation for areas such as forecasting, proactive maintenance and automation.
  • Engineering Software Systems: Today’s IoT data is different from the data we use to operate our systems. It requires collecting a wide range of data from a variety of sensors. These software systems and models must translate information from the physical world into actionable insight that can be used by humans and machines. Toyota is using Rockwell’s software for real-time error correction in the plant, and has minimized rework and scrap rates in its Alabama plant, resulting in an annual cost saving of $550,000 [3].
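
The real-time stream analysis mentioned in the big data analytics item above can be sketched as a rolling statistical check over a sensor feed; the window size and three-sigma threshold are illustrative assumptions, not a recommendation:

    from collections import deque
    from statistics import mean, stdev

    window = deque(maxlen=50)  # rolling window of recent vibration readings

    def check(reading):
        """Flag a reading that deviates sharply from recent behaviour."""
        alert = False
        if len(window) >= 10:  # wait for a minimal history
            mu, sigma = mean(window), stdev(window)
            alert = sigma > 0 and abs(reading - mu) > 3 * sigma
        window.append(reading)
        return alert

    for r in [1.0, 1.1, 0.9, 1.0, 1.2, 1.0, 0.8, 1.1, 1.0, 0.9, 9.5]:
        if check(r):
            print("possible maintenance issue at reading", r)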

Building blocks for end-to-end infrastructure enabling manufacturing intelligence from the factory floor to the data center (Intel) [4]
With IoT, IP networks and analytics, manufacturers can become more efficient, improve worker safety and offer exciting new business models. IoT will help manufacturers improve resource efficiency, safety and return on assets. Manufacturers that master this new dynamic will have a variety of new opportunities for revenue growth and cost savings.

References

3: How IoT will help manufacturing

http://www.industryweek.com/blog/how-will-internet-things-help-manufacturing

4: Industrial Optimization IoT (Intel)

http://www.intel.ie/content/dam/www/public/us/en/documents/white-papers/industrial-optimizing-manufacturing-with-iot-paper.pdf

IoT Impact on the Manufacturing Industry (Part 1)

“Industry 4.0” and “Smart Factory” are some of the terms used to describe the technological and social revolution that promises to change the current industrial landscape. Industry 1.0 was the invention of mechanical assistance; Industry 2.0 was mass production, pioneered by Henry Ford; Industry 3.0 brought electronics and control systems to the shop floor; and Industry 4.0 is peer-to-peer communication between products, systems and machines. IoT will clearly have a different impact depending on the application and industry; one of particular interest, given its emphasis on process, is manufacturing. Compared to other realms such as retail and its intangible ways, manufacturing is about physical objects and how we can bring them to the consumer in a more efficient and automated way. The manufacturing landscape is ever-changing, with automation through robotics the most recent enabler.

Challenges and Possibilities of IoT and Manufacturing [1]

Gartner analyst Simon Jacobson sees five immediate challenges and possibilities posed by IoT for the manufacturing industry [1].

1. CIOs and manufacturing leads will have to move even more rapidly

Jacobson says manufacturers have moved heavily toward individualization and mass customization as part of the luxury of connected products. But in order to enable that, you have to maintain alignment with supply management, logistics functions and partners to make sure all service levels are maintained: “I have to have knowledge of my processes and optimization of my processes at a hyper level, not just simply understanding at week’s end or at the end of the shift where I need to make adjustments and improve,” Jacobson said.

2. Security must be reimagined

A connected enterprise means that you can no longer simply physically secure the facility; you must blend the approaches of mobile and cloud-based architectures with industrial control and automation, ensuring information is being managed. Jacobson says the challenge will be to merge the skills of engineers and process control teams with those of IT and, more importantly, to unify their disparate approaches to security.

3. IoT will create more visibility in process performance

There has always been a form of automation and control in manufacturing, but implementing new business applications powered by IoT will allow you to connect devices to the factory network and know their tolerances: “Being able to connect those dots and derive contexts of how processes are performing is absolutely going to be where the return on investment is coming from,” Jacobson said.

4. Predictive maintenance can generate revenue for OEMs

Asset performance management is of high value today. This is the ability to drive availability, minimize costs and reduce operational risks by capturing and analyzing data. Original Equipment Manufacturers (OEMs) have already started creating revenue by using IoT-enabled tools like predictive maintenance in order to guarantee uptime, outcomes and certain levels of performance for the customer: “When you guarantee these kinds of outcomes to the customers, you have to look at this from two different perspectives, how I monetize this but also how my customer monetizes this,” Jacobson said.

5. Production will play a new role in the manufacturing value chain

The boundaries between the physical and digital worlds are blurring. Chief Information Officers (CIOs) and manufacturing strategists can use the IoT, big data and cloud to redefine the role production plays in the manufacturing value chain. It no longer has to be restricted to being a cost center, and this has all to do with the new ability to not just accelerate but innovate on the factory floor. It’s the CIO’s challenge to keep pace with these new competitive changes.

Figure 10: Real Time Intelligence on the Shop Floor [2]
In my next blog post, I will continue this discussion on IoT and Manufacturing, giving further use cases, and outlining the building blocks for IoT in Manufacturing.

References:

1: Gartner Best Practices for IoT in Manufacturing

https://www.gartner.com/doc/2899318?ref=AnalystProfile

2: Building Blocks for a Smart Plant

http://www.mbtmag.com/articles/2014/10/manufacturing-transformations-building-blocks-future-smart-plant

Pre-Cloud Security Considerations in IoT

Introduction

Over the past decade, hybrid cloud adoption has steadily increased, with closed networks becoming less the option of choice. But this comes at a cost to security and trust metrics. As we become more dependent on intelligent devices in our lives, how do we ensure the data within this web is not compromised by external threats that could endanger our personal safety?

As the adoption of IoT increases, so does the risk of hackers getting at our personal information. As Alan Webber points out on his RSA blog [6], there are three key risk areas, or bubbles, that companies need to be aware of.

1: Fully enabled Linux/Windows OS systems: This area concerns devices that are not part of a normal IT infrastructure but still run on full operating systems such as Linux or Windows. As everyone knows, these operating systems had vulnerabilities long before IoT, and when the devices are deployed in the “free world”, they are not as visible to IT admins.

2: Building Management Systems (BMS): This pertains to infrastructure systems that assist in the management of buildings, such as fire detection and suppression, physical security systems and more. These are not usually classified as threatened, yet shutting down a fire escape alarm system could enable a break-in.

3: Industry Specific Devices: This area covers devices that assist a particular industry, such as manufacturing, navigation, or supply chain management systems. In the case of a supply chain management system, for example, route and departure times for shipments can be intercepted, which could lead to a shipment being intercepted and rerouted to another geographical location.

So how do we guard against these types of risk, and make the devices themselves, and the web of connected devices, less dumb? Security must be looked at holistically, with end-to-end security systems employed to ensure system-level safety, and with device-level embedded control software to ensure data integrity from edge to cloud.
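
As a minimal sketch of edge-to-cloud message integrity, a device and gateway sharing a provisioned key can authenticate each payload with an HMAC (Python standard library only; the key handling here is deliberately simplified):

    import hashlib, hmac, json

    SECRET = b"device-provisioned-key"  # assumed shared at provisioning time

    def sign(payload: dict) -> dict:
        body = json.dumps(payload, sort_keys=True).encode()
        return {"body": payload,
                "mac": hmac.new(SECRET, body, hashlib.sha256).hexdigest()}

    def verify(msg: dict) -> bool:
        body = json.dumps(msg["body"], sort_keys=True).encode()
        expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, msg["mac"])

    msg = sign({"meter_id": "m-42", "kwh": 3.2})
    print(verify(msg))        # True
    msg["body"]["kwh"] = 0.1  # tampering in transit...
    print(verify(msg))        # ...is detected: False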

Data routing must also be taken seriously from a security standpoint. For example, smart meters generally do not push their data to a gateway continuously; they send it to a data collection hub, which forwards it in a single bulk packet to the gateway. Whilst the gateway might have an acceptable security policy, what about the data collection hub? This raises a major challenge: how does one micro-manage all the various security systems their data might migrate across?

Security Design Considerations

Early-stage IoT devices unfortunately allowed for the potential loss of physical security in their design, so it is necessary for security officers to be aware of the focus and location of their security provisioning.

Applying security design at the device level is not the most utilized method (similar to internal storage), as the cost and capacity constraints of these devices work against it; the devices themselves look only to ensure consistency of communication and message integrity. Usually, one deploys the more complex security design up front, within the web services that sit in front of and interact with the devices. It is predicted that as the devices evolve, and nanotechnology becomes more and more of an enabler in the space, the security design will move closer to the devices, before eventually becoming embedded.

It is proposed that shared cloud-based storage will play a pivotal role in combating the data volume problem, but not without its issues. How do we handle identification and authentication? How do we ensure adequate data governance? Partnerships will be necessary between security officers and cloud providers to ensure these questions are answered.

Searching for the holy grail of 100% threat avoidance is impossible, given the number of players in an entire IoT ecosystem. Whilst cloud service providers own their own infrastructure, it is very difficult for them to know whether the data they receive has been compromised. There are ways to reduce this risk, such as using metadata and building “smarts” into the data, from typical known sets, as it transitions from edge to cloud. A useful analogy is a nightclub security guard checking potential clients at the door: “What’s your name?” (what type of data are you), “Where have you been tonight?” (what’s your migration path), “How many drinks have you had?” (what transactions happened on your data).
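
That bouncer’s checklist could be sketched as a simple metadata gate in front of the cloud store; the field names and allow-lists below are invented for illustration:

    KNOWN_TYPES = {"telemetry", "diagnostic"}
    TRUSTED_HOPS = {"device", "collection-hub", "gateway"}

    def admit(meta: dict) -> bool:
        return (meta.get("data_type") in KNOWN_TYPES           # what are you?
                and set(meta.get("path", [])) <= TRUSTED_HOPS  # where have you been?
                and meta.get("transform_count", 0) <= 3)       # what happened to you?

    print(admit({"data_type": "telemetry",
                 "path": ["device", "collection-hub", "gateway"],
                 "transform_count": 1}))  # True: admitted to the cloud store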

IoT Security and Chip Design

One area that could bring about increased data privacy is greater usage of “Trusted Execution Environments” (TEEs): a TEE is a secure area in the main processor of the device, ensuring that independent processing can occur on critical data within the silicon itself. This enables trusted applications to run that enforce confidentiality and integrity, and protects against unauthorized cloning or object impersonation by remove-and-replace. As a real-world example, a home owner tampering with their smart meter to reduce their energy bill is one scenario that would be prevented by a TEE.

If cloud services companies can somehow increase their influence on IoT device design (beyond the popularity of TEEs in cellular applications), then utilizing technology such as this will ensure less risk once the data reaches the cloud. Collaboration between all parties should be increased to establish best practice across the entire IoT landscape.

Figure 1: Generalized framework for a secure SoC [7]
References:

6: RSA: 3 Key Risk Areas of the Internet of Things

https://blogs.rsa.com/3-key-risk-areas-internet-things/

7: EDN: Using virtualization to implement a scalable trusted execution environment in secure SoCs

http://www.edn.com/design/systems-design/4402964/2/Using-virtualization-to-implement-a-scalable-trusted-execution-environment-in-secure-SoCs

IoT meets Data Intelligence: Instant Chemistry

Even in the ideal world of a perfect network topology, a web of sensors, a security profile, a suitable data center design, and lots of applications for processing and analyzing, one thing is constant across all of these: the data itself. Data science is well talked about, and careers have been built on the concept. It is normally aimed at the low-hanging fruit of a data set, the things that are easily measured. Science will take you so far, but it is data intelligence that shows the true value, with the capability to predict the impact of actions and track it over time, building modelling engines to solve future problems.

Even the data set is different for data intelligence as opposed to data science, which relies on lots and lots of data sets (think of Facebook working out the effectiveness of its changes and features). A data intelligence set is more complex, smaller even, and can be contained in a single process or building. Imagine a hospital’s set of machines producing live data to an analytics engine, which compares the live data against historical models to gauge risk to patients. That can have a real, tangible benefit to quality of life. Commonly called “operational intelligence”, the idea is to apply real-time analytics to live data with very low latency. It’s all about creating that complete picture: historical data and models working with live data to provide a solution that can potentially transform all kinds of industry.

At the core of any system of this kind is decision making, and again one must strive to make this as intelligent as possible. There are two types of decision making: stagnant and dynamic. With the assistance of mathematical models and algorithms, it becomes possible for any IoT data set to be analysed for the further implications of alternative actions. As such, one would predict that the efficiency of decision making would increase.

At the IoT device level, there is scope to apply such a solution. Given the limited storage capacity on the devices themselves, one option is a form of rolling deterministic algorithm that analyses a set of sensor readings and produces a single output: whether or not to send a particular measurement on to the intelligent gateway or cloud service.
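
A sketch of that rolling filter, with an assumed window size and tolerance, might be as simple as:

    from collections import deque
    from statistics import mean

    recent = deque(maxlen=20)  # small rolling window, cheap enough for a device

    def should_send(reading, tolerance=0.5):
        # Send only if the reading deviates meaningfully from recent history.
        decision = not recent or abs(reading - mean(recent)) > tolerance
        recent.append(reading)
        return decision

    for r in [21.0, 21.1, 21.0, 23.9]:
        print(r, "->", "send" if should_send(r) else "drop")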

Another proposed on-device implementation might use a deviation-from-correctness model such as the Mahalanobis-Taguchi System (MTS), an information pattern technology that has been used in various diagnostic applications to help make quantitative decisions, by constructing a multivariate measurement scale using data analytic methods. In the MTS approach, Mahalanobis distance (MD, a multivariate measure) is used to measure the degree of abnormality of patterns, and principles of Taguchi methods are used to evaluate the accuracy of predictions based on the constructed scale. The advantage of MD is that it considers correlations between the variables, which are essential in pattern analysis. Given that it can be used on a relatively small data set (the greater the number of historical samples, the better the model to compare against), it could be utilized in the hospital diagnosis example. Perhaps the clinician needs a quick on-device prediction of how close a patient’s measurements are to a sample set of recent hospital measurements?
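
The Mahalanobis distance step of that approach can be sketched with numpy alone; the “healthy” reference sample below (temperature, heart rate, systolic pressure) is fabricated purely to show the calculation:

    import numpy as np

    # Fabricated historical reference set: one row per patient measurement.
    healthy = np.array([[36.6, 72, 120], [36.8, 70, 118], [36.5, 75, 122],
                        [36.7, 68, 119], [36.9, 71, 121], [36.6, 74, 117]])
    mu = healthy.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(healthy, rowvar=False))

    def mahalanobis(x):
        # The distance accounts for correlations between the variables.
        d = x - mu
        return float(np.sqrt(d @ cov_inv @ d))

    print(mahalanobis(np.array([36.7, 71, 120])))  # close to the sample set
    print(mahalanobis(np.array([38.9, 110, 90])))  # far larger: abnormal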

Taking this one stage further, if we expanded this to multiple hospitals, we could start to think about creating linked data sets, pooled together to extract intelligence. What if a storm is coming? Will it affect my town or house? Imagine if we had sensors on each house, tracking the storm in real time, trying to predict its trajectory and track direction changes, with the service communicating directly with the home owners in its path.

With the premise of open source software in mind, consider now the concept of open data sets, linked or not. Imagine I was the CEO of a major oil and gas company, eager to learn from other companies in my sector, and in return willing to let them learn from us through our data sets. Tagging data by type (financial, statistical, online statistical, manufacturing, sales, for example) allows a metadata search engine to be created, which can then be used to gain industry-wide insight at the click of a mouse. The tagging is critical, as the data is then not simply a format, but descriptive also.
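
A first cut of that metadata search engine is just an inverted index from tag to data set; the tags and data set names here are invented:

    from collections import defaultdict

    index = defaultdict(set)

    def register(dataset, tags):
        # Tagging makes the data descriptive, not simply a format.
        for tag in tags:
            index[tag].add(dataset)

    register("well_output_q3", {"manufacturing", "statistical"})
    register("fuel_sales_2015", {"sales", "financial"})
    register("rig_maintenance", {"manufacturing", "financial"})

    # Industry-wide insight "at the click of a mouse": look up by tag.
    print(index["financial"])  # {'fuel_sales_2015', 'rig_maintenance'}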

Case Study: Waylay IoT and Artificial Intelligence [11]

Waylay, an online cloud-native rules engine for any OEM maker, integrator or vendor of smart connected devices, proposes a strong link [11] between IoT and Artificial Intelligence.

Waylay proposes a central concept from AI, called the rational agent. By definition, an agent is something that perceives its environment through sensors and acts upon that environment via actuators. An example is a robot that uses camera and sensor technology and performs an action, e.g. “move”, depending on its immediate environment (see Figure 8).

Extending the role of an agent, a rational agent is one that then does the right thing, where the right thing may depend on what has happened and on what is currently happening in the environment.

Figure 8: Agent and Environment Diagram for AI [11]
Waylay outlines that an agent typically consists of an architecture and logic: the architecture allows it to ingest sensor data, run the logic on that data and act upon the outcome.
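
In that spirit, a rational agent’s sense-decide-act cycle can be sketched in a few lines, with software-defined sensors and actuators as plain callables; this illustrates the concept only and is not Waylay’s actual engine:

    def run_agent(sensors, logic, actuators, env):
        # Perceive the environment through (software-defined) sensors...
        percepts = {name: read(env) for name, read in sensors.items()}
        # ...run the logic, then act on the environment via actuators.
        for action in logic(percepts):
            actuators[action](env)

    sensors = {"temp": lambda env: env["temp"]}
    actuators = {"cool": lambda env: env.update(temp=env["temp"] - 1)}
    logic = lambda p: ["cool"] if p["temp"] > 25 else []  # "do the right thing"

    env = {"temp": 27}
    run_agent(sensors, logic, actuators, env)
    print(env)  # {'temp': 26}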

Waylay has developed a cloud-based agent architecture that observes the environment via software-defined sensors and acts on its environment through software-defined actuators rather than physical devices. A software-defined sensor can correspond not only to a physical sensor but can also represent social media data, location data, generic API information, etc.

Figure 9: Waylay Cloud Platform and Environment Design [11]
For the logic, Waylay has chosen graph modeling technology, namely Bayesian networks, as the core logical component. Graph modeling is a powerful technology that provides flexibility to match the environmental conditions observed in IoT. Waylay exposes the complete agent as a Representational State Transfer (REST) service, which means the agent, sensors and actuators can be controlled from the outside, and the intelligent agent can be integrated as part of a bigger solution.

In summary, Waylay has developed a real-time decision making service for IoT applications. It is based on powerful artificial intelligence technology and its API-driven architecture makes it compatible with modern SaaS development practices.

End of Case Study 

Reference:

11: Waylay: Case study AI and IoT

http://www.waylay.io/when-iot-meets-artificial-intelligence/