Releasing Software Developer Superpowers

This article is aimed at anyone looking to gain an edge in building or advancing a software development team in the digital age. The concepts can also be applied, to some degree, outside software development. Open to discussion – views are my own.

UX is not just for Customers

User Experience is an ever-growing component of product development, with user-centric design paradigms helping to ensure personalisation and consumer/market fit. From a development team's point of view, applying some of those user experience concepts to how the team itself works can bring operational efficiency and accelerate product development. For example, what is the experience like for each of the developer personas on your team? How do their days translate into user stories? Could interviewing your development community lead to better features for your development culture?

Build Products not Technology

Super important. Sometimes with developers there is an over-emphasis on building features, a lot of the time for features' sake. Keeping the lens at all times on the value, or "job to be done", for the customer ensures you are building what is truly needed. To do this, select and leverage a series of metrics to measure value for that product, and keep your product development in series with, and tightly coupled to, your customer experience development.

Leverage PaaS to deliver SaaS

This sounds catchy, but it's becoming the norm. Five years ago, it took a developer a week of development time to do what you can now do in Amazon Web Services or Azure in minutes. This has led to a paradigm shift, where you begin to look at the various platforms and tools available to enable developers to deliver great products to customers. Of course, there will always be custom-developed apps, but you can help your developers by getting them the right toolkit. There is no point reinventing the wheel when OTS open source components are sitting there, right? Products like Docker and Spring and concepts like DevOps are bringing huge value to organisations, enabling the delivery of software and microservices at enhanced speed. That said, the balance between buying OTS and building custom is a careful decision at product and strategic levels.
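To make the microservices point concrete, here is a minimal sketch of how little custom code a small service needs once the heavy lifting is left to off-the-shelf components. It uses Python and Flask purely for illustration (the endpoints and data are hypothetical, not a prescription), and a service like this is exactly the kind of thing Docker can then package and ship in minutes:

    # Minimal sketch of a microservice built from OTS components (Flask).
    # Endpoints and data are hypothetical; pip install flask to run.
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/health")
    def health():
        # Simple health check, handy for DevOps-style automated monitoring.
        return jsonify(status="ok")

    @app.route("/orders/<order_id>")
    def get_order(order_id):
        # In a real service this would call a datastore; hard-coded for illustration.
        return jsonify(order_id=order_id, status="processing")

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)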

“The role of a developer is evolving to one like a top chef, where all the ingredients and tools are available; it's just about getting the recipe right to deliver beautiful products to your customer.”

Create Lean Ninjas!


Evolving the cultural mindset of developers and the organisation toward agile development is super important. Having a critical mass of development resources, plus defined agile processes to deliver business success, can really reshape your organisation into one where rapid value creation can take place. However, it's important to perform ethnographic studies on the organisation to assess the culture. This can help decide which agile frameworks and practices (Kanban, Scrum, XP, etc.) will work best to evolve the development life cycle.

Implement the 10% rule

This could be slightly controversial, and it can be hard to do. Developers should aim to spend 10% of their time looking at the new: new technologies, development practices, company direction, conferences, training. Otherwise you will end up with a siloed, mis-skilled pool of superheroes with their powers bottled.

However, with lean ninjas and effective agile company-wide processes, resources and time can be closely aligned to specific projects, avoiding injecting randomness into the development lifecycle. Developers need time to immerse and focus. If you can't do that for them, or continuously distract them with mistimed requests, they will leave. If you can enable them, 10% is achievable.

Risk Awareness


We are seeing an evolution in threats to enterprises all over the world, and in a software-driven and software-defined world, getting developers to build security into their design practices before products hit the market can help protect companies. Moons ago, everything sat on-prem. The demands of consumers mean a myriad of cloud-deployed services are adding to a complex technology footprint globally. If developers know the risk landscape of where they deploy, they can act accordingly. Naturally, lining them up with business leaders on compliance and security can also help on the educational pathway.

Business and Technology Convergence

We are beginning to see not only an evolution in development practices – we are also seeing a new type of convergence (brought about by lean, agile and other methods) where business roles and technology roles are merging. We are beginning to see business analysts and UX people positioned directly in development teams to represent the customer and change the mindset. We are seeing technology roles positioned directly in business services teams such as HR and finance. This is impacting culture, whereby the savviness in both directions needs to be embraced and developed.


Growth Mindset

We have mentioned mindset a lot in this article. That's because it's hugely important. Having the right culture and mindset can make all the difference to team success. As Carol Dweck explains in her book "Mindset", mindsets can be broadly categorised into two: growth and fixed. This applies in all walks of life, but for team building it can be critical. As Dweck describes it:

“In a fixed mindset students believe their basic abilities, their intelligence, their talents, are just fixed traits. They have a certain amount and that’s that, and then their goal becomes to look smart all the time and never look dumb. In a growth mindset students understand that their talents and abilities can be developed through effort, good teaching and persistence. They don’t necessarily think everyone’s the same or anyone can be Einstein, but they believe everyone can get smarter if they work at it.”

Creating a team where everyone is on a growth curve, and where failures are seen as learning, can really enable a brilliant culture. As Michelangelo said, “I am still learning”. This matters especially as we evolve towards six generations of developers working side by side. How do we ensure we are creating and mentoring the next set of leaders, from interns through to experienced people?

Check out a TED talk from Carol here – link.

And most importantly … HAVE FUN!

Distributed Analytics in IoT – Why Positioning is Key


The current global focus on the “Internet of Things (IoT)” has highlighted the extreme importance of sensor-based, intelligent and ubiquitous systems in improving, and introducing increased efficiency into, our lives. There is a natural challenge in this, as the load on our networks and cloud infrastructures from a data perspective continues to increase. Velocity, variety and volume are attributes to consider when designing your IoT solution, and then it is necessary to decide where the execution of analytical algorithms on those data sets should be placed.

Apart from classical data centers, there is huge potential in looking at the various compute sources across the IoT landscape. We live in a world where compute is at every juncture, from our mobile phones to our sensor devices, gateways and cars. Leveraging this normally idle compute is important in meeting the data analytics requirements of IoT, and future research will attempt to address these challenges. There are three main classical architecture principles that can be applied to analytics: (1) centralized, (2) decentralized and (3) distributed.

The first, centralized, is the best known and understood today, and it is a pretty simple concept: centralized compute across clusters of physical nodes is the landing zone (ingestion point) for data coming from multiple locations, so the data sits in one place for analytics. By contrast, a decentralized architecture utilizes multiple large distributed clusters, hierarchically arranged in a tree-like topology. Consider the analogy of a tree, where the leaves, being close to the data sources, can process the data earlier or distribute it more efficiently for analysis. Some form of grouping can be applied to this, for example per geographical location, or some hierarchy set up to distribute the jobs.

Lastly, in a distributed architecture, which is the most suitable for devices in IoT, the compute is everywhere. Generally speaking, the further you move from centralized, the smaller the compute becomes, right down to the silicon on the devices themselves. Therefore, it should be possible to push analytics tasks closer to the device. In that way, these analytics jobs can act as a sort of data filter and decision maker, determining whether quick insight can be gained from smaller data sets at the edge or beyond, and whether to push the data to the cloud or discard it. Naturally, with this type of architecture there are more constraints and requirements around effective network management, security and monitoring, not only of the devices but of the traffic itself. It makes more sense to bring the computation power to the data, rather than the data to a centralized processing location.
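As a rough illustration of that filtering idea, here is a minimal sketch of an edge job that decides whether a reading is interesting enough to push to the cloud or can be discarded locally. The threshold, window size and simulated temperature sensor are all assumptions for illustration, not a prescription:

    # Edge-side filter: forward only statistically interesting readings to the cloud.
    import random
    import statistics

    PUSH_THRESHOLD = 3.0  # hypothetical z-score above which a reading is forwarded

    def should_push_to_cloud(history, reading, threshold=PUSH_THRESHOLD):
        """Decide at the edge whether a new reading is worth sending upstream."""
        if len(history) < 10:
            return True  # not enough local history yet, forward everything
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1e-9
        return abs(reading - mean) / stdev > threshold

    history = []
    for _ in range(100):
        reading = random.gauss(21.0, 0.5)  # simulated temperature sensor
        if should_push_to_cloud(history, reading):
            print(f"push to cloud: {reading:.2f}")
        # otherwise the reading is discarded (or aggregated locally for quick insight)
        history.append(reading)
        history = history[-50:]  # keep a bounded local window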

There is a direct relationship between the smartness of the devices and the selection and effectiveness of these three outlined architectures. As our silicon gets smarter, more powerful and more efficient, more and more compute will become available, which should result in less strain on the cloud. As we distribute the compute, it should also mean more resilience in our solutions, as there is no single point of failure.

In summary, “intelligent infrastructures” now form the crux of the IoT paradigm. This means there will be more choice for IoT practitioners in deciding where to place their analytics jobs, so that they best utilize the compute that is available and control latency for faster response, meeting the real-time requirements of the business metamorphosis that is ongoing.

Why IoT practitioners need to “Wide Lens” the concept of a Data Lake

As we transition towards the vast quantity of devices that will be internet-enabled by 2020 (experts estimate anything from 50 to 200 billion), it seems that the current cloud architectures being proposed are somewhat short on the features required to meet customers' data requirements in 2020.

I won't dive deeply into describing the technology stack of a Data Lake in this post (Ben Greene from Analytics Engines in Belfast, who I visit on Wednesday en route to Enter Conf, does a nice job of that in his blog here). As a quick side step: looking at the Analytics Engines website, I saw that customer choice and ease of use are among the architecture pillars of their AE Big Data Analytics Software Stack – quick to deploy, modular and configurable, with lots of optional high-performance appliances. It's neat to say the least, and I am looking forward to seeing more.

The concept of a Data Lake has a big reputation in current tech chatter, and rightly so. It has huge advantages in enterprise architecture scenarios. Consider the use case of a multinational company with 30,000+ employees, countless geographically spread locations and multiple business functions. So where is all the data? That is normally a challenging question, with multiple databases, repositories and, more recently, Hadoop-enabled technologies storing the company's data. This is the very reason why a business data lake (BDL) is a huge advantage to the corporation. If a company has a Data Architect at its disposal, it can develop a BDL architecture (such as the one shown below, ref – Pivotal) that acts as a landing zone for all of its enterprise data. This makes a huge amount of sense. Imagine being the CEO of that company: as we see changes to the Data Protection Act(s) over the next decade, a BDL lets the company take the right steps towards managing, scaling and, most importantly, protecting its data sets. All of this leads to a more effective data governance strategy.

[Figure: Business Data Lake reference architecture (Pivotal)]

Now shift focus to 2020 (or even before?) and let's take a look at the customer landscape. The customers that will require what the concept of a BDL now provides will need far more choice, and won't necessarily be willing to pay huge sums for that service. Whilst there is some customer choice today, such as Pivotal Cloud Foundry, Amazon Web Services, Google Cloud and Windows Azure, even these services are targeted at a consumer base from startup upwards in the business maturity life cycle. The vast majority of cloud services customers in the future will be everyone around us, the homes we live in and beyond, and the requirement to store data in a far-distant data center might not be as critical for them. It is expected they will need far more choice.

Consider the case of building monitoring data, which could be useful to a wider audience in a secure linked open data sets (LODs) topology. For example, a smart grid provider might be interested in energy data from all the buildings in an area, in order to suggest optimal profiles for them and reduce the impact on the grid. Perhaps the provider might even be willing to pay for that data? This is where data valuation discussions come into play, which are outside the scope of this blog. But the building itself, or its tenants, might not need to store all of their humidity and temperature data, for example. They might want some quick insight up front, and then choose to bin that data (based on some simple protocol describing the data usage) in their home, for example.

Whilst a BDL is built on the premise of “store everything”, and that will bring value for the organisations monitoring consumers of their resources, individual consumers might not be willing to pay for this.

To close, the key enablers of these concepts are real-time edge analytics and increased data architecture choice. And this is beginning to happen: Cisco have introduced edge analytics services into their routers, which is a valid approach to ensuring that the consumer has choice. They are taking the right approach, as there are even different services for different verticals (Retail, IT, Mobility).

In my next blog, edge analytics will be the focus area, where we will dive deeper into the question: “Where do we put our compute?”

it@Cork European Technology Summit 2015 – a WOW event!

I wanted to change direction slightly and give an update on an event I had the privilege of being involved with this week, the it@Cork European Technology Summit. The event was held at Cork City Hall on Wednesday May 5th, with a full-day technology summit followed by a black-tie dinner and a 3D wearables fashion show.

An epic journey over the past few months, with way more ups than downs resulted in…

1 Day – 4 Sections – 20 Speakers – 4 Chair Speakers – 400+ Day Attendees – #1 Trending on Twitter – 9 Amazing Artisan Food Stalls – Lots of Sponsors – 200+ Night Attendees – 2 Fashion Designers – 1 Model Agency – 10 Models – 2 Fire Dancers – 4 3D-Printed Bow Ties!

So how did I arrive there? Last year, Gillian Bergin from EMC asked me to get involved with it@Cork as part of the Tech Talk committee. I'm delighted she did, as over the past few months I got to take part in and help organise some excellent tech talks from a variety of people, including my fellow technologist, Mr Steve Todd of EMC. The tech talk series is just one of many successful strands of it@Cork, hosting six high-end, rockstar speakers/panels per year. The series is fully booked until 2016, but if you are a “rockstar” speaker interested in taking part, please contact us directly. From this, James O'Connell of VMWare, who passed the tech talk committee chair over to Barry O'Connell, took on the chair of the Summit organising committee. James, together with myself and Paddy O'Connell of Berkley Group (known collectively now as the Macroom or Muskerry Mafia 🙂 ), assisted Sarah Walsh of it@Cork in organising the day summit. The night summit was excellently organised by Marnie O'Leary Daly of VMWare.

The event was kicked off by James, and then Ronan Murphy, chairman of the board of it@Cork and CEO of Smarttech, gave an address about how Cork needs a cluster manager to help drive more employment in the region. More from Ronan here by the Examiner. Donal Cahalane, from Teamwork.com, gave an insightful talk on how he sees the industry progressing, with some excellent advice for everyone from startups through to multinationals.


The four sections throughout the day offered a balanced mix of raw technology (Cloud – challenge the fear, Internet of Everything) along with digital marketing and a tech talent/diversity panel. I found this worked quite well, as it ensured the audience got a variety of speakers.

The cloud session on “challenging the fear” was an excellent one to start with, as it had a mix of SMEs (subject matter experts) from companies such as Kingspan (John Shaw), Trend Micro (Simon Walsh) and Barricade (David Coallier), but also had representation from the legal profession in the form of Michael Valley (barrister) and Noel Doherty (solicitor), who spoke at length on cloud governance. This session was chaired by Anton Savage of The Communications Clinic, who hosted a panel discussion with all five presenters at the end.


All of the sections were split by networking opportunities in the exhibition halls, where companies from the region presented their organisations, and some even demonstrated their wares. The atmosphere was great, with lots of chatter, tweeting and drinking of coffee! 😀


The second section was a panel session on Tech Talent, with Paddy O'Connell from Berkley in the chair, and the facilitators were Meghan M Biro, founder and CEO of TalentCulture, and Kevin Grossman, who co-founded and co-hosts the TalentCulture #TChat show with Meghan. They later presented their #TChat show live from the Clarion hotel in Cork. It was awesome!

There was such variety (no pun intended!) on the panel, with David Parry Jones (VP UKI, VMWare) and Noelle Burke (Head of HR, Microsoft Ireland) representing industry, Michael Loftus (Head of the Faculty of Engineering and Science, CIT) representing academia, and the hugely impressive student Ciara Judge, one of the Kinsale winners of the 2013 Google Science Award. Everyone inspired in their own way, and the dynamic at lunchtime was one of motivation, hope and leadership.


Having started my own personal digital marketing brand last year, and learning by making mistakes, I was exceptionally excited about our third section – Digital Marketing. Again, Anton did an incredible job of asking the right questions, and effortless listening followed. To listen to experts such as Meghan, Antonio Santos, Niall Harbison and Raluca Saceanu was a privilege, and I also got the opportunity to speak with them directly (as did many others). This was true of all the speakers throughout the day. I believe a huge number of people picked up lots of what I call “advice snippets” that they can take away to grow their own brands.


The last session was on an area close to my heart, the Internet of Everything (IoE), and I had the privilege of chairing it. We had speakers from Climote (Derek Roddy), my future employer Tyco (Craig Trivelpiece), Salesforce (Carl Dempsey), Dell (Marc Flanagan) and Xanadu (David Mills). All these companies are at different stages on their IoE journey, but the message was consistent: IoE is going to make a huge impact on our smart futures. I really liked how Craig spoke of “if you want to improve something, measure it” and how Tyco are looking at predictive maintenance and pushing intelligence/insight back out to the devices. Derek showed how Climote is changing how we live, and David did the same in relation to sport. Marc gave an excellent account of Dell's practical approach to IoT, showing the capabilities needed for IoE projects. Carl got me really excited about Salesforce's plans in the IoE space. The session closed out the event well, and the numbers in attendance stayed consistent.

Having attended a huge number of tech events over the years, it was great to see, once again, year-on-year growth of Munster's premier technology summit. The atmosphere was electric all day, both locally and on Twitter. The tweet wall was a big success, and we expect that next year's event will be bigger and better again.


The black-tie dinner was also a huge success, with the Millennium Hall in City Hall packed to capacity. Marnie O'Leary Daly, along with Emer from Lockdown model agency, put on an amazing dinner (superb catering by Brooks) and fashion show, with 3D wearables fashion provided by Aoibheann Daly from LoveandRobots and Rachael Garrett from Limerick School of Art and Design (@LSAD). Special mention to FabLab also for helping Rachael get her garments ready. It really was a spectacular evening. The Clarion hotel was also hugely supportive of the night element. (Photos to follow!) Emer will also blog about the night's fashion soon and do a much better job of it than me!

It@Cork European Technology Summit 2016. Watch this space. 

If you are interested in getting involved in 2016, please contact Sarah Walsh at it@Cork.

IoT Impact on the Manufacturing Industry (Part 2)

Continuing on from my last blog post, another example of IoT use in manufacturing would be for asset management to distribute work orders and configurations to the tools at the different stages of production. And vice versa: calibration information can be fed back to the Enterprise Resource Planning (ERP) system and associated with the bill of materials (BOM). Big data and NoSQL technologies are enablers in this regard, as they allow for the management of huge volumes of heterogeneous, multi-structured data about the production process, from the data types discussed through to images from AOI (Automated Optical Inspection) systems and other production modules. With recalls a concern in global manufacturing, this can be an ally in the fight to keep costs down.
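To illustrate the kind of heterogeneous, multi-structured records involved, here is a minimal sketch that wraps calibration data and an AOI result in a common, schema-flexible envelope. The station names, fields and output file are hypothetical; in practice a NoSQL/document store (MongoDB, for example) would hold these rather than a local file:

    # Schema-flexible production records, written as JSON lines for illustration.
    import json
    from datetime import datetime, timezone

    def make_production_record(station, unit_serial, bom_id, payload):
        """Wrap heterogeneous station data in a common envelope linked to the BOM."""
        return {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "station": station,        # e.g. "SMT-line-2" or "AOI-1" (hypothetical)
            "unit_serial": unit_serial,
            "bom_id": bom_id,          # association back to the bill of materials
            "payload": payload,        # shape varies per station type
        }

    records = [
        make_production_record("SMT-line-2", "SN-00017", "BOM-774",
                               {"type": "calibration", "nozzle_offset_um": 12.5}),
        make_production_record("AOI-1", "SN-00017", "BOM-774",
                               {"type": "inspection", "image_ref": "aoi/SN-00017.png",
                                "defects_found": 0}),
    ]

    with open("production_records.jsonl", "w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")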

Another area where IoT can have an impact is in intelligent edge devices and their use in improving supply chain optimization and the modularity of manufacturing. Consider surface mount technology (SMT), where there are so many moving parts, calibration steps and types of technology used in the placement and verification of board-level components. IoT sensors could be utilized to centralize SMT line asset management and to read calibration information via the factory WLAN. The asset management layer can form the link between the SMT tools and the ERP (Enterprise Resource Planning) and MES (Manufacturing Execution Systems) that oversee the manufacturing process.
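As a sketch of how calibration information might travel from an SMT tool over the factory WLAN to the asset management layer, the following publishes a reading over MQTT using the paho-mqtt client. The broker address, topic naming and payload are assumptions for illustration only:

    # Publish a calibration reading to an asset-management MQTT topic.
    # Requires the paho-mqtt package (pip install paho-mqtt); broker/topic are hypothetical.
    import json
    import paho.mqtt.client as mqtt

    BROKER_HOST = "factory-broker.local"       # hypothetical broker on the factory WLAN
    TOPIC = "smt/line2/placer1/calibration"    # hypothetical topic naming scheme

    calibration = {
        "tool_id": "placer1",
        "nozzle_offset_um": 12.5,
        "last_calibrated": "2015-05-20T08:30:00Z",
    }

    client = mqtt.Client()
    client.connect(BROKER_HOST, 1883)
    client.publish(TOPIC, json.dumps(calibration), qos=1)
    client.disconnect()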

A challenge that presents itself to the manufacturing industry is the ageing workforce, which means that anything that speeds up the manufacturing process is critical. Advances in mobile technology are a key enabler in ensuring that passing information to the shop floor becomes quicker, improving the response time, visibility and accessibility of operations. Recent advances in wearables will also enhance visibility on the shop floor.

Building Blocks for IoT in Manufacturing

Business owners need to look at four technology elements that provide the foundation for smart manufacturing. These include (but are not limited to):

  • Security: IT security is a major obstacle to setting up smart factories. Operations managers need to make sure that necessary safeguards are built into the solution including security procedures such as physical building security, hardware encryption and network security for data in transit. Security and networking solutions must also be engineered to withstand harsh environmental conditions, such as moisture and temperature, that aren’t present in typical networks. Identity and authentication structures will also need to be updated to support such “things” as well as people.
  • More Advanced Networking: Smarter manufacturing environments need a standardized IP-centric network that will enable all the devices/sensors in a plant to communicate with enterprise business systems. Cisco research states that only 4 percent of the devices on the manufacturing floor are connected to a network. A standard IP network also makes it easier to connect and collaborate with suppliers and customers to improve supply chain visibility. Manufacturers need robust networks that can cope with Radio Frequency (RF) challenges in the plant and harsher environmental conditions, and that provide stability for the transmission of alarms and real-time data processing.
  • Big Data Analytics: While manufacturers have been generating big data for numerous years, companies have had limited ability to store, analyze and effectively use all the data that was available to them, especially in real time. New big data processing tools are enabling real-time data stream analysis that can provide dramatic improvements in real-time problem solving and cost avoidance. Big data and analytics will be the foundation for areas such as forecasting, proactive maintenance and automation (a minimal sketch of the proactive maintenance idea follows this list).
  • Engineering Software Systems: Today’s IoT data is different from the data we use to operate our systems. It requires collecting a wide range of data from a variety of sensors. These software systems and models must translate information from the physical world into actionable insight that can be used by humans and machines. Toyota is using Rockwell’s software for real-time error corrections in the plant. Toyota has minimized rework and scrap rates in its Alabama plant, which has resulted in an annual cost saving of $550,000 [3].
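To make the proactive maintenance point above concrete, here is a minimal sketch that extrapolates a streaming wear metric to estimate when a tool will cross its maintenance limit, so a work order can be raised before a failure. The sensor values, window size and limit are hypothetical:

    # Predictive maintenance sketch: extrapolate a wear trend to its limit.
    from collections import deque

    WEAR_LIMIT = 100.0          # hypothetical maintenance threshold
    window = deque(maxlen=20)   # recent (hour, wear) samples

    def hours_until_limit(samples, limit=WEAR_LIMIT):
        """Fit a least-squares line to recent samples and extrapolate to the limit."""
        if len(samples) < 2:
            return None
        xs = [s[0] for s in samples]
        ys = [s[1] for s in samples]
        n = len(samples)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        denom = sum((x - mean_x) ** 2 for x in xs)
        if denom == 0:
            return None
        slope = sum((x - mean_x) * (y - mean_y) for x, y in samples) / denom
        if slope <= 0:
            return None  # no upward wear trend detected
        intercept = mean_y - slope * mean_x
        return (limit - intercept) / slope - xs[-1]

    # Simulated hourly wear readings drifting upward (hypothetical values).
    for hour, wear in enumerate([60, 61, 63, 64, 66, 68, 71, 73, 76, 80]):
        window.append((hour, wear))
        eta = hours_until_limit(window)
        if eta is not None and eta < 24:
            print(f"hour {hour}: limit expected in ~{eta:.1f} h -> schedule maintenance")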

Building blocks for end-to-end infrastructure enabling manufacturing intelligence from the factory floor to the data center (Intel) [4]
With IoT, IP networks and analytics, manufacturers can become more efficient, improve worker safety and offer new exciting business models. IoT will help manufacturers improve resource efficiency, safety and return on assets. Manufacturers that master this new dynamic will have a variety of new opportunities for revenue growth and cost savings.

References

[3] How IoT will help manufacturing – http://www.industryweek.com/blog/how-will-internet-things-help-manufacturing

[4] Industrial Optimization IoT (Intel) – http://www.intel.ie/content/dam/www/public/us/en/documents/white-papers/industrial-optimizing-manufacturing-with-iot-paper.pdf

IoT and Classical Business Models

Many companies, especially in the Information Technology (IT) sector, are aware of the IoT explosion. One of the biggest challenges facing any company is how to prepare for the change that will result from the increased business impact that IoT will present.

With figures in the trillions in terms of the market for IoT, how do companies ensure they can get a slice of the pie? If they currently do not sit within the relevant market segment, analysis will be required to determine if it can be an opportunity or a threat to their business as a whole.

IDC predicted in 2014 [3] that IoT will actually overtake the classical Information and Communications Technology (ICT) market over time. It predicts IoT will grow 12% year on year, whilst classical ICT will grow just 4%. Figure 3 below illustrates this.
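To see what those two growth rates mean when compounded, here is a trivial illustration. The starting market sizes are purely hypothetical; only the 12% and 4% rates come from the IDC figures above:

    # Compounding comparison: when does 12% annual growth overtake 4%?
    iot, ict = 1.0, 4.0   # hypothetical starting market sizes (arbitrary units)
    year = 2014
    while iot < ict:
        iot *= 1.12
        ict *= 1.04
        year += 1
    print(f"With these assumed starting sizes, IoT spend passes classical ICT around {year}.")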

Figure 3: IDC Prediction of IoT vs ICT [3]

Considering that most businesses consistently monitor the bottom line, it is not only about the opportunities IoT will present, but about how it will change the way we work. With limitless numbers of sensors monitoring processes, improving business energy efficiency and enabling new ways of working in teams, business will need to be more open to change and, more dauntingly, open to the elements of a “big brother” type scenario.

There are trends that are ensuring an evolution of business practice as we know it. Normally, a new technology platform impacts a single strand, with the exception of the internet. But IoT has the potential to become an entire business ecosystem, where creating and capturing business value will be paramount. However, this is not a straightforward proposition. Barriers include the current early position of IoT in its lifecycle, and the sheer volume and variety of devices to be considered. An ecosystem, by nature, would suggest a seamless collection of micro-systems working together in a self-sustaining fashion; what this will mean for IoT is still not clear.

Consider the classical technology adoption lifecycle. There are five types of innovation adopters, the first being the innovators themselves. The list is completed, in sequence, by early adopters, early majority, late majority and laggards. With the current immaturity of IoT, and the lack of clarity around the various emerging technologies, the challenge for business is to try to advance the early adopters to an early majority, so the business needs to be able to scale. The early adopters are less fussy when it comes to product design, but once the number of adopters increases later in the life cycle, the early majority will want polished product offerings with appropriate services.

With IoT still in its relative infancy, it is appropriate to compare it to the early stages of the Internet. When we look at the recent business ecosystems that have been spun out of the Internet for EMC, such as Pivotal Cloud Foundry, one can postulate about future ecosystem opportunities for EMC across the IoT spectrum.

Another important consideration for companies is the skill sets and people that are required to drive their big data strategy as a result of their growing IoT ecosystem. A key tenet of this will be the data itself. In the February it@Cork Tech Talk by my EMC colleague Steve Todd, and even more recently in his blog on data value (value was something I had never associated with data until this talk), Steve spoke about the importance for major companies of taking a more structured approach to the employees involved in data set discovery, identification and migration (the Data Architect), and also of having a Chief Data Officer to represent the company from a data perspective. Interestingly, my role in EMC changed last year to that of a Data Architect, so I can relate to this first hand. When faced with a business challenge in big data, five steps that can be critical to success are as follows:

1: Demystify and then map the current devices, tools, processes and trajectory of data across the business unit or company (AS-IS Diagram)

2: Scour the company and external sources for any technologies that can enable a more scalable and clearer approach

3: Look to centralize data storage, to allow the company to focus on being agile and scalable, and also remove duplicate data (concept of a Business Data Lake)

4: Develop an ingestion framework to ensure the data lake has a sufficient landing platform for data (a minimal sketch of this step follows step 5).

5: Build the analytics platform, pointed at the centralized “Business Data Lake”, to meet the existing and future needs of the business.
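As flagged in step 4, here is a minimal sketch of an ingestion step for the landing zone. The directory layout, manifest fields and source system name are all assumptions for illustration:

    # Land a raw source file into a date-partitioned landing zone with a manifest.
    import hashlib
    import json
    import shutil
    from datetime import datetime, timezone
    from pathlib import Path

    LAKE_ROOT = Path("business_data_lake")   # hypothetical landing zone root

    def ingest(source_file: str, source_system: str) -> Path:
        src = Path(source_file)
        today = datetime.now(timezone.utc).strftime("%Y/%m/%d")
        target_dir = LAKE_ROOT / "raw" / source_system / today
        target_dir.mkdir(parents=True, exist_ok=True)

        target = target_dir / src.name
        shutil.copy2(src, target)

        manifest = {
            "source_system": source_system,
            "original_path": str(src),
            "landed_path": str(target),
            "landed_at": datetime.now(timezone.utc).isoformat(),
            "sha256": hashlib.sha256(target.read_bytes()).hexdigest(),
            "size_bytes": target.stat().st_size,
        }
        manifest_path = target.with_suffix(target.suffix + ".manifest.json")
        manifest_path.write_text(json.dumps(manifest, indent=2))
        return target

    # Example usage (assumes a local CSV exported from a hypothetical ERP system):
    # ingest("erp_orders_2015-05-20.csv", source_system="erp")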

When we apply this to IoT, we start to see that every company, no matter how small, will begin to generate huge data sets, and there will be a new skill set needed at companies that never previously had one, to ensure they can gain as much insight as possible from those data sets. Sure, there are companies that can provide these solutions, but realistically, the future state will surely be to have these as core skills, just as “internet skills” once appeared on resumes?!

It is proposed here that key stakeholders across multinationals can overcome these challenges and design practical IoT business models if they take an ecosystem-style approach, instead of looking at the modular needs of individual business units. This will allow the business to get a high-level perspective of where IoT can bring value to their business offerings.

Reference:

[3] Digital Universe Article – http://www.emc.com/leadership/digital-universe/2014iview/internet-of-things.htm