Releasing Software Developer Superpowers

This article is aimed at anyone looking to gain an edge in building or advancing a software development team in the digital age. The concepts can be applied outside of software development to some degree. Open to discussion – views are my own.

UX is not just for Customers

User Experience is an ever-growing component of product development, with user-centric design paradigms ensuring that personalisation and consumer/market fit are achieved. From a development team's point of view, leveraging some of those same user experience concepts in how the team works can bring operational efficiency and accelerate product development. For example, what is the experience like for each of the developer personas on your team? How do their days translate into user stories? Could interviewing your development community lead to better features for your development culture?

Build Products not Technology

Super important. With developers, there is sometimes an over-emphasis on building features, often for features' sake. Keeping the lens at all times on the value, or “job to be done”, that a product delivers for the customer ensures you are building what your customer truly needs. To do this, select and track a set of metrics that measure value for that product, and keep your product development tightly coupled to your customer experience development.

Leverage PaaS to deliver SaaS

This sounds catchy, but it's becoming the norm. Five years ago it took a developer a week of development time to do what Amazon Web Services or Azure can do now in minutes. This has led to a paradigm shift, where you begin to look at the platforms and tools available to enable developers to deliver great products to customers. Of course, there will always be custom-developed apps, but you can help your developers by getting them the right toolkit. There is no point reinventing the wheel when off-the-shelf (OTS) open source components are sitting there, right? Products like Docker and Spring, and concepts like DevOps, are bringing huge value to organisations, enabling the delivery of software and microservices at enhanced speed. That said, the balance between buying OTS and building custom is a careful decision at both product and strategic levels.
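
As a toy illustration of that speed, here is a minimal sketch using boto3, the AWS SDK for Python, to launch a server in a few lines. The AMI ID, region and instance type are placeholder assumptions for illustration, not recommendations:

    # A minimal sketch: launching a server with boto3 (the AWS SDK for Python).
    # The AMI ID and region are hypothetical placeholders -- substitute your own.
    import boto3

    ec2 = boto3.client("ec2", region_name="eu-west-1")

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
    )
    print("Launched instance:", response["Instances"][0]["InstanceId"])

What once meant a week of racking, patching and configuring is now a single API call plus some configuration.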

“The role of a developer is evolving to one like a top chef, where all the ingredients and tools are available, and it's just about getting the recipe right to deliver beautiful products to your customer.”

Create Lean Ninjas!


Evolving the cultural mindset of developers and the organisation toward agile development is super important. A critical mass of development resources, plus defined agile processes for delivering business success, can reshape your organisation into one where value creation happens rapidly. However, it's important to perform ethnographic studies of the organisation to assess its culture. This can help you decide which agile frameworks and practices (kanban, scrum, XP etc.) will work best to evolve the development life cycle.

Implement the 10% rule

This could be slightly controversial, and it can be hard to do. Developers should aim to spend 10% of their time looking at the new: new technologies, development practices, company direction, conferences, training. Otherwise you will end up with a siloed, mis-skilled pool of superheroes with their powers bottled.

With lean ninjas and effective company-wide agile processes, however, resources and time can be closely aligned to specific projects, avoiding injected randomness in the development lifecycle. Developers need time to immerse and focus. If you can't give them that, or you continuously distract them with mistimed requests, they will leave. If you can enable them, 10% is achievable.

Risk Awareness


We are seeing an evolution in threats to enterprises all over the world, and in a software-driven and software-defined world, getting developers to bake security into their design practices before products hit the market can help protect companies. Moons ago, everything sat on-prem. Consumer demand now means a myriad of cloud-deployed services are adding to a complex global technology footprint. If developers know the risk landscape and metrics for wherever they deploy, they can act accordingly. Naturally, lining them up with business leaders on compliance and security can also help on the educational pathway.

Business and Technology Convergence

We are beginning to see not only an evolution in development practices, but also a new type of convergence (brought about by lean, agile and other methods) where business roles and technology roles are merging. Business analysts and UX people are being positioned directly into development teams to represent the customer and change the mindset. Technology roles are being positioned directly into business services teams like HR and finance. This is impacting culture, whereby savviness in both directions needs to be embraced and developed.


Growth Mindset

Mindset has come up a lot in this article. That's because it's hugely important. Having the right culture and mindset can make all the difference in team success. As Carol Dweck explains in her book “Mindset”, you can broadly categorise mindsets into two types – growth and fixed. This applies in all walks of life, but for team building it can be critical.

In a fixed mindset, students believe their basic abilities – their intelligence, their talents – are just fixed traits. They have a certain amount and that's that, and their goal becomes to look smart all the time and never look dumb. In a growth mindset, students understand that their talents and abilities can be developed through effort, good teaching and persistence. They don't necessarily think everyone's the same or that anyone can be Einstein, but they believe everyone can get smarter if they work at it.

Creating a team where everyone is on a growth curve, and where failures are treated as learning, can really enable a brilliant culture. As Michelangelo said, “I am still learning.” This matters all the more as we evolve towards six generations of developers: how do we ensure we are creating and mentoring the next set of leaders, from interns through to experienced people?

Check out a TED talk from Carol here – link.

And most importantly … HAVE FUN!

Numenta and MemComputing: Perfect AI Synergy


Let's look at two forces of attraction happening in the technology space, specifically around creating truly artificially intelligent systems by utilizing advances in both software and hardware technologies.

For years, even decades, we have chased it. AI has been at the top of every research interest list, and while there have been advances, the pertinent challenge has been timing: hardware electronics advanced in the '70s and '80s while software design lagged behind, and then software advanced incredibly in the past decade. So now, in July 2015, we reach a key point of intersection of two “brain-based technologies”, which could be built together in a way that may lead to “true AI”.

At no other point in history have we had both hardware and software technologies that can “learn” like we can, whose design is based on how our mind functions.

Numenta

First, let's look at Numenta. Having had the pleasure of reading Jeff Hawkins' excellent book “On Intelligence”, I have started to look at the open source AI algorithms they provide (GitHub here). In a journey that started nine years ago, when Jeff Hawkins and Donna Dubinsky founded Numenta, the plan was to create software modeled on the way the human brain processes information. Whilst it's been a long journey, the California-based startup has made accelerated progress lately.


Hawkins, the creator of the original Palm Pilot, is the brain expert and co-author of the 2004 book “On Intelligence.” Dubinsky and Hawkins met during their time building Handspring, and they pulled together again in 2005 with researcher Dileep George to start Numenta. The company is dedicated to reproducing the processing power of the human brain, and it shipped its first product, Grok, earlier this year to detect odd patterns in information technology systems. Those anomalies may signal a problem in a computer server, and detecting such problems early could save time, money or both. (Think power efficiency in servers.)

You might think, hmm, that's not a particularly grand first application for algorithms based on the mind, but pattern recognition is what we actually started doing as Neanderthals. First it was objects, then patterns of events, and so on. Numenta is built on Hawkins' theory of Hierarchical Temporal Memory (HTM): the brain has layers of memory that store data in time sequences, which explains why we easily remember the words and music of a song. (Try it in your head: start a song in the middle, or the alphabet from the middle. It takes a second longer to get going.) HTM became the formulation for Numenta's code base, the Cortical Learning Algorithm (CLA), which in turn forms the basis of applications such as Grok.
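
To make the sequence-memory idea concrete, here is a deliberately tiny Python toy of my own – an illustration of forward-chained sequence recall, not Numenta's CLA. It learns which element follows which, so replaying forward is trivial, while there is no reverse index to let you jump in anywhere:

    # A toy first-order sequence memory -- an illustration of the "memory of
    # time sequences" idea, NOT Numenta's Cortical Learning Algorithm (CLA).

    def learn_transitions(sequence):
        """Store which element follows which, like memorised song lyrics."""
        return {a: b for a, b in zip(sequence, sequence[1:])}

    def replay(transitions, start, length):
        """Replay the sequence forward from a given element."""
        out, current = [start], start
        for _ in range(length - 1):
            current = transitions.get(current)
            if current is None:
                break
            out.append(current)
        return out

    memory = learn_transitions(list("ABCDEFGHIJ"))
    print(replay(memory, "A", 5))  # ['A', 'B', 'C', 'D', 'E']
    print(replay(memory, "F", 3))  # ['F', 'G', 'H'] -- but you must first
    # land on "F", and the links only run forwards, hence the extra beat.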

Still with me? Great. So that's the software, designed and built on the layers of the cortex of our brains. Now let's look at the hardware side.

Memcomputing

After reading this article in Scientific American recently, at the same time as reading Hawkins' book, I really began to see how these two technologies could meet somewhere: silicon up, algorithms down.


A new computer prototype called a “memcomputer” works by mimicking the human brain, and could one day perform notoriously complex tasks like code breaking, scientists say. These brain-inspired computing devices could also help neuroscientists better understand the workings of the human brain.

In a conventional microchip, the processor, which executes computations, and the memory, which stores data, are separate entities, so data must constantly shuttle back and forth between them. This constant transfer consumes energy and time, limiting the performance of standard computers.

In contrast, Massimiliano Di Ventra, a theoretical physicist at the University of California, San Diego, and his colleagues are building “memcomputers”, made up of “memprocessors” that can both store and process data. This setup mimics the neurons of the human brain, each of which serves as both processor and memory.

I won't go into the specifics of how the building blocks are designed, but they are based on the three basic components of electronics – capacitors, resistors and inductors – or, more aptly, memcapacitors, memristors and meminductors. The paper describing this is here.
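
As a rough conceptual sketch of computing in memory – my own toy, not Di Ventra's actual design, which uses analogue circuit elements – imagine cells that each hold a value and update it in place from their neighbours, with no separate processor to ship data to:

    # A toy illustration of processing *in* memory: every cell stores state
    # and computes its next state locally from its neighbours. Conceptual
    # sketch only -- real memprocessors are analogue circuit elements.

    def step(cells):
        """Each memory cell updates itself from its neighbours, in place."""
        snapshot = cells[:]  # read phase
        for i in range(len(cells)):
            left = snapshot[i - 1] if i > 0 else 0
            right = snapshot[i + 1] if i < len(cells) - 1 else 0
            cells[i] = (snapshot[i] + left + right) % 10  # compute and store

    cells = [1, 0, 0, 0, 0]
    for _ in range(3):
        step(cells)
        print(cells)  # state evolves where it is stored, no data shuttling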

Di Ventra and his associates have built a prototype from standard microelectronics. The scientists investigated a class of problems known as NP-complete. With this type of problem, a person may be able to quickly confirm whether a given solution works, but cannot quickly find the best solution. One example of such a conundrum is the “travelling salesman problem”: given a list of cities, find the shortest route that visits every city exactly once and returns to the starting city. Finding the best solution is a brute-force exercise.
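
To see why brute force blows up, here is a short Python sketch of the travelling salesman problem on four cities; the distance matrix is invented purely for illustration. With n cities there are (n-1)! routes to try, so each added city multiplies the work:

    # Brute-force travelling salesman: try every route, keep the shortest.
    # The distance matrix is invented purely for illustration.
    from itertools import permutations

    dist = [
        [0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0],
    ]
    cities = range(1, len(dist))  # fix city 0 as the start and end point

    best_route, best_cost = None, float("inf")
    for perm in permutations(cities):  # (n-1)! candidate routes
        route = (0, *perm, 0)
        cost = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if cost < best_cost:
            best_route, best_cost = route, cost

    print(best_route, best_cost)  # (0, 1, 3, 2, 0) with cost 23

Four cities means checking 6 routes; forty cities means roughly 10^46, which is why easy-to-verify, hard-to-find problems defeat conventional machines.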

The memprocessors in a memcomputer can work together to find every possible solution to such problems. “If we work with this paradigm shift in computation, those problems that are notoriously difficult to solve with current computers can be solved more efficiently with memcomputers,” Di Ventra said. In addition, memcomputers could tackle problems that scientists are exploring with quantum computers, such as code breaking.

Imagine running software that is designed based on our minds, on hardware that is designed on our minds. Yikes!

In a future blog, I will discuss what this means in the context of the internet of things.


IoT and Governance. It's a game of RISK

Due to the sheer volume of devices, the data volumes, and the security and networking topologies that result from IoT, it is natural for there to be a lot of questions and legal challenges around governance and privacy. How do I know my data is secure? Where is my data stored? If I lose a device, what happens to data in flight?

The National Fraud Intelligence Bureau has said that 70% of the 230,845 frauds recorded in 2013/2014 included a cyber element, compared to 40% five years ago. This indicates that we aren't doing a very good job of protecting existing internet-enabled devices, so why should we be adding more? If we internet-enable our light bulbs and heating systems to control them from our mobile phones (Nest's acquisition by Google being a good example), can those devices be hacked to tunnel into our mobile phone data?

It is not only the individual consumer who needs to be aware of privacy and governance. Businesses too, when they adopt IoT, must dedicate resources to the legal requirements and implications of IoT enablement. A key aspect of this is ensuring their internal teams are aligned on IoT, and more specifically on security, data protection and privacy.

More and more, governments and regulatory bodies have IoT in their remit. This includes the EU Commission, which published a report recommending that IoT should be designed from the beginning to meet suitable governance requirements and rights, including the right of deletion, data portability and privacy.

The draft Data Protection Regulation addresses some of these issues through measures including:

  • Privacy by design and default – to ensure that the default position is the least possible accessibility of personal data
  • Consent
  • Profiling – clearer guidelines on when data collected to build a person’s profile can be used lawfully, for example to analyse or predict a particular factor such as a person’s preferences, reliability, location or health
  • Privacy policies
  • Enforcement and sanctions – violations of data privacy obligations could result in fines of up to 5% of annual worldwide turnover or €100m, whichever is greater

Unfortunately, the first point above, privacy by design, is normally an afterthought. Whilst not a requirement of the Data Protection Act, it makes the compliance exercise much smoother, and taking such an approach brings advantages in building trust and minimizing risk.
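
As a hedged sketch of what privacy by default can look like in code – my own toy example, not a legal template – personal fields are withheld unless the user has explicitly opted in, so the least-accessible position is the starting point:

    # A toy sketch of privacy by default: personal data stays hidden unless
    # the data subject explicitly opts in. Field names are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class UserRecord:
        user_id: str
        email: str
        location: str
        consented_fields: set = field(default_factory=set)  # default: share nothing

        def export(self):
            """Expose only what the user has consented to share."""
            record = {"user_id": self.user_id}
            if "email" in self.consented_fields:
                record["email"] = self.email
            if "location" in self.consented_fields:
                record["location"] = self.location
            return record

    user = UserRecord("u-42", "ada@example.com", "Dublin")
    print(user.export())                  # {'user_id': 'u-42'} -- nothing leaks
    user.consented_fields.add("email")
    print(user.export())                  # email shared, location still private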

IoT presents a number of challenges that must be addressed by European privacy regulators as it evolves, and scrutiny of these challenges is predicted to increase as device numbers grow.

Some of the challenges include:

  • Lack of control over the data trajectory path
  • Lack of user awareness of a device's capabilities
  • Risks associated with processing data beyond its original scope, especially with advances in predictive and analytic engines
  • Lack of anonymity for users
  • Everyday devices that previously posed no threat becoming live threat vectors

As these challenges show, there are characteristics in common – control, security and visibility – which make the governance of IoT a bigger challenge than expected.

Finally, governance in IoT is expected to follow other technologies. Up to now, the software industry has not had a single set of standards for the complete service portfolio (including cloud), although governments are addressing this. Geographically, different regulations are commonplace across jurisdictions in IT, so IoT is predicted to follow suit.

Why IoT needs Software Defined Networking (SDN)

Software-defined networking (SDN), with its ability to intelligently route traffic and take advantage of underutilized network resources, will help stem the data flood of IoT. Cisco has a pretty aggressive IoT strategy, and it places its Application Centric Infrastructure version of SDN at the centre of it. And that makes sense: software is still the main ingredient for combating network bandwidth challenges.

Lori MacVittie [8] agrees that SDN is a critical enabler, but only if it considers all of the network layers from 2 to 7, and not just stateless layers 2–4. “Moving packets around optimally isn't easy in a fixed and largely manually driven network. That's why SDN is increasingly important when data volumes increase and do so in predictable waves. SDN can provide the means to automatically shift the load either in response or, optimally, in anticipation of those peak waves.”

The network challenges in IoT do not stop at bandwidth and data volumes. Applications will be required to deal with peak loads of data, so services will be needed in layers 4–7 that provide for the scale, security and performance of those apps.

Figure 5: Stateless vs Stateful in SDN Application Services [8]

SDN has other features that will be particularly useful. Dynamic load management should allow users to monitor and orchestrate bandwidth automatically on the fly, which will be music to the ears of global IoT providers. Service chaining will apply application-specific processing procedures, in sequence, to a client's job; this should ease management overhead in IoT services as subscriptions increase globally. One of the coolest features of SDN is bandwidth calendaring, which allows the user to schedule the traffic an application will need at a given time – and when you think of a sensor that only wants to communicate at periodic intervals, it is apparent that this will be a great asset, as the sketch below illustrates.
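
To make bandwidth calendaring concrete, here is a small Python sketch of my own – a toy, not any real SDN controller's API – where a calendar of time windows maps to bandwidth reservations, so a sensor that only talks at known intervals gets capacity exactly when it needs it:

    # A toy bandwidth calendar: reserve capacity for known time windows.
    # An illustrative sketch, not a real SDN controller's API.
    from datetime import time

    # (start, end, Mbps) reservations for a periodic sensor -- invented values.
    calendar = [
        (time(2, 0), time(2, 15), 100),   # nightly bulk upload window
        (time(14, 0), time(14, 5), 20),   # midday heartbeat burst
    ]

    DEFAULT_MBPS = 1  # trickle allocation outside scheduled windows

    def allocated_bandwidth(now):
        """Return the bandwidth reserved for the given time of day."""
        for start, end, mbps in calendar:
            if start <= now < end:
                return mbps
        return DEFAULT_MBPS

    print(allocated_bandwidth(time(2, 5)))   # 100 -- inside the bulk window
    print(allocated_bandwidth(time(9, 30)))  # 1   -- default trickle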

But this cannot happen overnight. Data center managers will have to modernize their infrastructures. Once they do, a potential big win is the ability to create numerous virtual and private networks on top of a single physical network. This would be a big advantage, as multiple customers could then share a single network without risk to their applications and data. However, for this to work, the entire network needs to be SDN-enabled.

When one considers Network Functions Virtualization (NFV), this path can be traversed more quickly. With NFV-ready networks, carriers can create services in software rather than dedicated hardware, essentially allowing virtualized servers to host these new services. This enables business transformation: instead of multiple isolated networks, one works with an open ecosystem, a set of virtualized network functions and, most importantly, an orchestration layer. This will allow businesses to move with agility in the face of the explosion in device numbers.

Reference:

[8] DevCentral: “SDN and IoT” – https://devcentral.f5.com/articles/sdn-is-important-to-iot-if-it-covers-the-entire-network