Releasing Software Developer Superpowers

This article is aimed at anyone looking to gain an edge in building or advancing a software development team in the digital age. The concepts can, to some extent, be applied outside software development too. Open to discussion – views are my own.

UX is not just for Customers

User experience is an ever-growing component of product development, with user-centric design paradigms ensuring personalisation and consumer/market fit. From a development team's perspective, applying some of those same user experience concepts to how the team works can bring operational efficiency and accelerate product development. For example, what is the experience like for each of the developer personas in your team? How do their days translate to user stories? Can interviewing the development community lead to better features for your development culture?

Build Products not Technology

This is super important. With developers, there is sometimes an over-emphasis on building features, often for their own sake. Keeping the lens on the value or "job to be done" for the customer at all times ensures you are building what your customer truly needs. To do this, select a set of metrics to measure the value of the product, and keep your product development tightly coupled to your customer experience development.

Leverage PaaS to deliver SaaS

This sounds catchy, but it is becoming the norm. Five years ago, it took a developer a week of development time to do what can now be done in Amazon Web Services or Azure in minutes. This has led to a paradigm shift, where you begin to look at the various platforms and tools available to enable developers to deliver great products to customers. Of course, there will always be custom-developed apps, but you can help your developers by getting them the right toolkit. There is no point reinventing the wheel when off-the-shelf open source components are sitting there, right? Products like Docker and Spring, and concepts like DevOps, are bringing huge value to organisations, enabling the delivery of software and microservices at enhanced speed. That said, the balance between buying off the shelf and building custom is a careful decision at both product and strategic levels.

“The role of a developer is evolving to one like a top chef, where all the ingredients and tools are available, its just getting the recipe right to deliver beautiful products to your customer.”

Create Lean Ninjas!


Evolving the cultural mindset of developers and the organisation toward agile development is super important. Having a critical mass of development resources, plus defined agile processes to deliver business success, can reshape your organisation into one where rapid value creation takes place. However, it is important to perform ethnographic studies of the organisation to assess its culture. This can help decide which agile frameworks and practices (kanban, scrum, XP, etc.) will work best to evolve the development life cycle.

Implement the 10% rule

This could be slightly controversial, and it can be hard to do. Developers should aim to spend 10% of their time looking at the new: new technologies, development practices, company direction, conferences, training. Otherwise you will have a siloed, mis-skilled pool of superheroes with their powers bottled.

However, with lean ninjas and effective company-wide agile processes, resources and time can be closely aligned to specific projects, avoiding randomness injected into the development lifecycle. Developers need time to immerse and focus. If you can't give them that, or you continuously distract them with mistimed requests, they will leave. If you can enable them, 10% is achievable.

Risk Awareness


We are seeing an evolution in threats to enterprises all over the world, and in a software-driven and software-defined world, getting developers to build security into their designs before products hit the market can help protect companies. Moons ago, everything sat on-prem. Consumer demand now means a myriad of cloud-deployed services are adding to a complex global technology footprint. If developers know the risk landscape of where they deploy, they can act accordingly. Naturally, lining them up with business leaders on compliance and security also helps the educational pathway.

Business and Technology Convergence

We are seeing not only an evolution in development practices but also a new type of convergence (brought about by lean, agile and other methods), where business roles and technology roles are merging. Business analysts and UX people are being positioned directly into development teams to represent the customer and change the mindset. Technology roles are being positioned directly into business services teams like HR and finance. This is impacting culture, whereby the savviness in both directions needs to be embraced and developed.


Growth Mindset

Mindset is mentioned a lot in this article. That is because it is hugely important. Having the right culture and mindset can make all the difference to team success. As Carol Dweck explains in her book "Mindset", mindsets can broadly be categorised into two types: growth and fixed. This applies in all walks of life, but for team building it can be critical.

In a fixed mindset students believe their basic abilities, their intelligence, their talents, are just fixed traits. They have a certain amount and that’s that, and then their goal becomes to look smart all the time and never look dumb. In a growth mindset students understand that their talents and abilities can be developed through effort, good teaching and persistence. They don’t necessarily think everyone’s the same or anyone can be Einstein, but they believe everyone can get smarter if they work at it.

Creating a team where being on a growth curve is the norm and failures are seen as learning can really enable a brilliant culture. As Michelangelo said, "I am still learning." This matters especially as we evolve toward six generations of developers working together. How do we ensure we are creating and mentoring the next set of leaders, from interns through to experienced people?

Check out Carol Dweck's TED talk here – link.

And most importantly … HAVE FUN!

Augmented and Virtual Reality: Now more about improving User Experience

"We cannot eat popcorn wearing a virtual reality (VR) headset." – Zaid Mahomedy, ImmersiveAuthority.com

In 1995, the cringeworthy Johnny Mnemonic was released, in which the protagonist uses a VR headset and gesture-monitoring gloves to control the "future internet". Even though the movie is over 20 years old, it is only in the past few years that we have seen commercially ready Virtual Reality (VR) and Augmented Reality (AR) technologies hit the market.

If you watch this clip, you will hopefully notice two things. The first is that the technology is clunky. The second is that the predicted user experience (UX) is rich (for the decade of the movie's production): information is available at speed, the gloves are accurate, and the path and usability are seamless. When he moves his hands, the VR responds instantaneously. It assists him at every turn. Yet twenty years later, we have not quite reached this. Why? Because the focus needs to shift from technology to other aspects to enable this industry to flourish.

1: Technology Moves Aside for User Experience

Much of technology companies' effort in this space over the past two years has focused on determining whether they can squeeze enough compute power onto a pair of glasses, whether the battery will last a decent amount of time, and whether heat emissions will be low enough not to inconvenience the user. Whilst there are still optimizations to be performed, the core technology has at least been proven, along with some clever innovations that leverage smartphones to save on hardware investment.

In the coming years we will see more of these companies focusing on the user experience we have with these technologies: ensuring the interfaces, gesture recognition and motion recognition are close to perfect is high on companies' to-do lists. The hardware road-map will ensure devices are lighter, more robust and, frankly, sexier to wear. Before we discuss other aspects of how improved UX will be the focus of the coming years, this is not to say that technology won't help. For example, the evolution of flexible compute paradigms, specifically in nanotechnology, will assist in building compute into glasses, instead of adding it retrospectively.

2: Difference in Psychologies

Apart from the technology of VR and AR being quite different under the hood, the psychology of how they are used also differs. With AR, we are injecting a digital layer between us and the physical world. With VR, we are immersing ourselves into a digital world. These are very different experiences, and user experience design must have this difference at its core. We must ensure the layer we design for AR takes characteristics from both our physical environment and our own personas. With VR, it is much more important to ensure the person feels comfortable and safe in that world.

3: Interfaces to VR/AR UX

The UX design of AR and VR technologies and applications will require careful management of multiple input styles. Using wearables, voice recognition, AR and AI, we will start to see seamless blending of how technology interacts with us across various senses. Touch devices are still being used, but they will make way for voice recognition, movement tracking and even brain waves as the means to control these smart devices. Communication will be much faster and more intimate, forcing designers to completely rethink how we interact with these devices.

4: The Role of AI in UX

The UX of these devices will also require more human-like interactions, to build trust between the devices and their users in an organic manner. We are seeing this with voice control technology like Siri and Google Home, but today they merely understand our voice and reply with a set of stock responses. Soon they will learn to evolve their speech.

Artificial intelligence will take hold of the user experience, analysing reactions to different experiences and making changes in real time based on those assessments. UX will become a much more intuitive and personalized experience in the coming years.

5: Convergence of VR and AR Standards

Already we are seeing a myriad of startups evolving in the space: some focusing on content development in software, some on the hardware itself, and some brave enough to offer both. We also have the larger companies creating divisions to provide offerings in this space. Choice is great, but when it becomes painful trying on your fourteenth pair of glasses at your average conference, it is not. Observing how companies are beginning to partner up to offer solutions (a trend extremely common in the IoT industry), one sees a small step towards some form of standardization. Excessive choice can also be bad from a UX perspective, as such fragmentation in initial design makes it harder for app designers to get it right on the hardware.

6: Realistic Market Sensing

At some point, we have to get away from the "toys" feel of these devices. We put them on for ten minutes in an airport or at an event to get a wow from them. Whilst the applications in the gaming industry are plain to see, companies are beginning to focus on where else the market will be. Certain devices have flopped in the past two years, and you would wonder why, with such strong brands behind them. The first reason was awful UX. The second was that the market just was not ready, with a distinct lack of content to make the devices in any way useful. Just because a few of these devices fail does not mean the movement stops. (The infographic below is sourced from washington.edu.)

Consumer and industrial applications have very different requirements from a market perspective: content choice and look-and-feel are very important for consumer markets, while system performance and governance sit higher in industrial use cases. With the costs of adding these technologies to industrial environments under the microscope, companies must focus strongly on measuring and building return on investment (ROI) models.

7: Protecting the User and the Experience

With these technologies predicted to get even closer to us than headsets (smart contact lenses, for example – link here), it is quite important that UX designers intrinsically build comfort and safety into any application. Too many times we have seen people fall over something whilst wearing a headset (more so with VR technologies). And that is just the physical safety. As the threshold between physical and augmented worlds gets ever thinner (mixed reality), we want to avoid a scenario of interface overkill.

Whilst the past few years may suggest that these technologies are fads, the reality is far from it. They will become part of our social fabric as a new form of mobile technology. The user's experience with these technologies will be the critical enabler of their success and adoption rate.

Designing for AR and VR requires a better understanding of the user's needs and context of use. It is about building connections between the physical and digital worlds, requiring an interdisciplinary effort across service design, interaction design and industrial design.

Pre Cloud Security Considerations in IoT

Introduction

Over the past decade, hybrid cloud adoption has steadily increased, with closed networks becoming less and less the option of choice. But this comes at a cost to security and trust. As we become more dependent on intelligent devices in our lives, how do we ensure the data within this web of devices is not compromised by external threats that could endanger our personal safety?

As the adoption of IoT increases, so does the risk of hackers getting at our personal information. As Alan Webber points out on his RSA blog [6], there are three key risk areas, or bubbles, that companies need to be aware of.

1: Fully enabled Linux/Windows OS systems: This area concerns devices that are not part of a normal IT infrastructure but still run full operating systems, such as Linux or Windows. As everyone knows, these operating systems had vulnerabilities long before IoT, and when the devices are deployed in the "free world" they are far less visible to IT admins.

2: Building Management Systems (BMS): This pertains to infrastructure systems that assist in the management of buildings, such as fire detection and suppression, physical security systems and more. These are not usually classified as under threat, yet shutting down a fire-escape alarm system could enable a break-in.

3: Industry-Specific Devices: This area covers devices that assist a particular industry, such as manufacturing, navigation or supply chain management systems. In the case of a supply chain management system, for example, route and departure times for shipments could be intercepted, allowing a shipment to be seized and rerouted to another geographical location.

So, how do we guard against these types of risks, and make both the devices themselves and the web of connected devices less dumb? Security must be looked at holistically, with end-to-end security systems employed to ensure system-level safety, and device-level embedded control software used to ensure data integrity from edge to cloud.

Data routing must also be taken seriously from a security standpoint. For example, smart meters generally do not push their data to a gateway continuously; they send it to a data collection hub, which forwards it in a single bulk packet to the gateway. Whilst the gateway might have an acceptable security policy, what about the data collection hub? This raises a major challenge: how does one micro-manage all the various security systems one's data might migrate across?
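One mitigation is to make integrity independent of the route: the device signs each reading before it leaves the edge, so the gateway can verify it no matter how many intermediate hubs the packet crossed. The sketch below shows the idea with an HMAC over a canonical JSON payload; the field names and the pre-shared key are illustrative assumptions, not a reference to any specific smart meter protocol, and a real deployment would provision a unique key per device.

```python
import hashlib
import hmac
import json

# Illustrative pre-shared key; in practice this would be provisioned
# securely per device, never hard-coded.
DEVICE_KEY = b"per-device-secret"

def sign_reading(reading: dict) -> dict:
    """On the device: attach an HMAC-SHA256 tag so downstream hops
    cannot alter the data unnoticed. sort_keys gives a canonical form."""
    body = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return {"body": reading, "tag": tag}

def verify_reading(packet: dict) -> bool:
    """At the gateway: recompute the tag and compare in constant time."""
    body = json.dumps(packet["body"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, packet["tag"])

packet = sign_reading({"meter_id": "m-42", "kwh": 3.7})
print(verify_reading(packet))      # untouched packet verifies
packet["body"]["kwh"] = 0.1        # a compromised hub changes the value
print(verify_reading(packet))      # verification now fails
```

The point of the sketch is that the collection hub never needs to be trusted with the key: it can batch and forward packets, but any modification is detected at the gateway.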

Security Design Considerations

Early-stage IoT devices were unfortunately designed with little regard for physical security, so it is necessary for security officers to be aware of the focus and location of their security provisioning.

Applying the security design on the devices themselves is not the most common approach (as with on-device storage), because the cost and limited capacity of these devices works against it. Instead, the devices look after consistency of communication and message integrity, while the more complex security design is usually deployed up front, in the web services that sit in front of and interact with the devices. It is predicted that as the devices evolve, and nanotechnology becomes more of an enabler in the space, the security design will move closer to the devices, before eventually becoming embedded.

It is proposed that shared cloud-based storage will play a pivotal role in combating the data volume problem, but not without its issues. How do we handle identification and authentication? How do we ensure adequate data governance? Partnerships will be necessary between security officers and cloud providers to ensure these questions are answered.

Searching for the holy grail of 100% threat avoidance is impossible, given the number of players in an entire IoT ecosystem. Whilst cloud service providers own their own infrastructure, it is very difficult for them to know whether the data they receive has been compromised. There are ways to reduce this risk, such as using metadata and building "smarts" into the data, from typical known sets, as it transitions from edge to cloud. A useful analogy is a nightclub security guard checking potential clients at the door: "What's your name?" (what type of data are you), "Where have you been tonight?" (what's your migration path), "How many drinks have you had?" (what transactions happened on your data).
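As a rough sketch of that nightclub-guard analogy, the cloud side could attach provenance metadata to each packet at the edge and refuse admission when the packet's "story" does not check out. All the field names and admission rules below are made up for illustration; a real system would pair checks like these with cryptographic integrity protection rather than rely on metadata alone.

```python
# "Nightclub guard" admission check: inspect a packet's provenance
# metadata before letting it into cloud storage. The metadata schema
# (kind / path / mutations) is purely illustrative.
TRUSTED_HOPS = {"meter", "district-hub", "gateway"}

def admit(packet: dict) -> bool:
    meta = packet.get("meta", {})
    # What type of data are you?
    if meta.get("kind") not in {"telemetry", "event"}:
        return False
    # Where have you been tonight? Every hop must be a known, trusted stage.
    path = meta.get("path", [])
    if not path or any(hop not in TRUSTED_HOPS for hop in path):
        return False
    # What transactions happened on your data? Reject anything modified in transit.
    if meta.get("mutations", 0) > 0:
        return False
    return True

good = {"meta": {"kind": "telemetry", "path": ["meter", "gateway"], "mutations": 0}}
bad = {"meta": {"kind": "telemetry", "path": ["meter", "unknown-relay"], "mutations": 0}}
print(admit(good), admit(bad))
```

The design choice here is that the bouncer only needs a policy, not a full view of every intermediate system: packets that cannot account for their own journey simply do not get in.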

IoT Security and Chip Design

One area that could bring about increased data privacy is wider use of "Trusted Execution Environments" (TEEs): a secure area in the device's main processor. A TEE allows independent processing of critical data within the silicon itself, enabling trusted applications to enforce confidentiality and integrity and to protect against unauthorized cloning or object impersonation by remove-and-replace. As a real-world example, a TEE would help prevent a home owner tampering with their smart meter to reduce their energy bill.

If cloud services companies can somehow increase their influence on IoT device design (beyond the existing popularity of TEEs in cellular applications), then utilizing technology such as this will mean less risk by the time the data reaches the cloud. Collaboration should be increased between all parties so that best practice across the entire IoT landscape can be established.

Figure 1. Generalized framework for a secure SoC [7]
References:

[6] RSA, "3 Key Risk Areas for the Internet of Things": https://blogs.rsa.com/3-key-risk-areas-internet-things/

[7] EDN, "Using virtualization to implement a scalable trusted execution environment in secure SoCs": http://www.edn.com/design/systems-design/4402964/2/Using-virtualization-to-implement-a-scalable-trusted-execution-environment-in-secure-SoCs