Article on evolving software development with a focus on climate change. Image: Unsplash https://unsplash.com/photos/RcVzx4burTY
Article with some predicted trends for the next decade. Image: Unsplash https://unsplash.com/photos/xU5Mqq0Chck
Article about how we can evolve our education systems to ensure we consider the societal impact of AI, now and in the future. Image: Unsplash
Article on evolving DevOps to DevEx – 5 minute read. Image: Unsplash
I've been asked to post a video of one of my implants in action – the one in my right hand is a low-frequency RFID implant, similar to a standard access card in the workplace.
I made the decision to do this as I have always been fascinated by the evolution of technology, especially by how fast it has moved since I was a kid in the 80s. Back then, technology was something that lived in the workplace (the first computers). Later, we got computers in the home, but we still needed to put in some effort to “connect” to the internet.
As it evolved, it seemed to be getting ever closer to us via wearables, so I wanted to get implants to feel what that would be like, both physically and psychologically. So far so good – I don't even feel them. It's great to hear and see people's reactions, good and bad. Technology will start to enter the body in various ways, so I wanted to push the boundaries. This is the start of my journey.
I also have a high-frequency RFID & NFC implant in my left hand which unlocks my phone and stores my business card – this one is very programmable and can be used to store medical data, among other things.
Video below, with some media about the event.
This article is aimed at anyone looking to gain an edge in building or advancing their software development team in the digital age. The concepts can be applied outside software development at some level. Open to discussion – views are my own.
User experience is an ever-growing component of product development, with user-centric design paradigms created to ensure that personalisation and consumer/market fit are achieved. From a development team view, leveraging some of these user experience concepts in how the team works can achieve operational efficiency and accelerate product development. For example, what is the experience like for each of the developer personas in your team? How do their days translate to user stories? Can interviewing the development community lead to better features for your development culture?
Super important. With developers, there is sometimes an over-emphasis on building features, often for features' sake. Keeping the lens at all times on the value or “job to be done” for the customer in the delivery of a product can ensure you are building what your customer truly needs. To do this, select and leverage a series of metrics to measure value for that product, and keep your product development in series with, and tightly coupled to, your customer experience development.
This sounds catchy, but it's becoming the norm. Five years ago it took a developer a week of development time to do what you can now do in Amazon Web Services or Azure in minutes. This has led to a paradigm shift, where you begin to look at the various platforms and tools available to enable developers to deliver great products to customers. Of course, there will always be custom-developed apps, but you can help your developers by getting them the right toolkit. There is no point reinventing the wheel when off-the-shelf (OTS) open source components are sitting there, right? Products like Docker and Spring and concepts like DevOps are bringing huge value to organisations, enabling the delivery of software or microservices at enhanced speed. That said, the balance between buying OTS and building custom is a careful decision at product and strategic levels.
“The role of a developer is evolving into one like a top chef, where all the ingredients and tools are available – it's just about getting the recipe right to deliver beautiful products to your customer.”
Evolving the cultural mindset of developers and the organisation toward agile development is super important. Having a critical mass of development resources, plus defined agile processes to deliver business success, can reshape your organisation into one where value creation happens rapidly. However, it's important to perform ethnographic studies on the organisation to assess the culture. This can help decide which agile frameworks and practices (kanban, scrum, XP etc.) will work best to evolve the development life cycle.
This could be slightly controversial, and can be hard to do. Developers should aim to spend 10% of their time looking at the new: new technologies, development practices, company direction, conferences, training. Otherwise you will have a siloed, mis-skilled pool of superheroes with their powers bottled.
However, with lean ninjas and effective agile company-wide processes, resources and time can be closely aligned to exact projects, avoiding injecting randomness into the development lifecycle. Developers need time to immerse and focus. If you can't do that for them, or continuously distract them with mistimed requests, they will leave. If you can enable them, 10% is achievable.
We are seeing an evolution in threats to enterprises all over the world, and in a software-driven and software-defined world, getting developers to adopt security-inherent design practices before products hit the market can help protect companies. Moons ago, everything sat on-prem. The demands of consumers mean a myriad of cloud-deployed services are adding to a complex technology footprint globally. If developers know the risk landscape metrics for where they deploy, they can act accordingly. Naturally, lining them up with business leaders on compliance and security can also help on the educational pathway.
We are seeing not only an evolution in development practices – we are also seeing a new type of convergence (brought about by lean, agile and other methods) where business roles and technology roles are merging. We are beginning to see business analysts and UX people positioned directly in development teams to represent the customer and change the mindset. We are seeing technology roles positioned directly in business services teams like HR and finance. This is impacting culture, whereby the savviness in both directions needs to be embraced and developed.
We have mentioned mindset a lot in this article. That's because it's hugely important. Having the right culture and mindset can make all the difference in team success. As Carol Dweck explains in her book “Mindset”, you can broadly categorise mindsets into two – growth and fixed. This applies in all walks of life, but for team building it can be critical.
“In a fixed mindset students believe their basic abilities, their intelligence, their talents, are just fixed traits. They have a certain amount and that’s that, and then their goal becomes to look smart all the time and never look dumb. In a growth mindset students understand that their talents and abilities can be developed through effort, good teaching and persistence. They don’t necessarily think everyone’s the same or anyone can be Einstein, but they believe everyone can get smarter if they work at it.”
Creating a team where being on a growth curve is valued and failures are seen as learning can enable a brilliant culture. As Michelangelo said, “I am still learning”. This matters especially as we evolve toward six generations of developers working together. How do we ensure we are creating and mentoring the next set of leaders, from interns through to experienced people?
Check out a TED talk from Carol here – link.
Whilst this blog focuses on the characteristics that millennials (and others) working in the technology sector will need in the coming years, it could apply to other industries too. Some of these features are already in play; however, they are not currently seen as a full set. Happy reading.
Internal company perspectives will no longer be enough. With the lines between various technologies and industries blurring, truly understanding the trends needed to keep up will require both the person and the company to forge strong, active links with external companies, startups and universities. The tools required to achieve this will also evolve, with social media collaboration tools coming into the mainstream, along with the continued influence of smart devices and wearables for managing our workload.
A positive from active collaboration is the potential to see other technology methods across various industries. This wide-lens approach will provide plenty of food for thought when technologists look to solve their immediate challenges. A good example is applying classical file compression algorithms to bioinformatics problems in genome sequence analysis for disease susceptibility patterns. There will be huge advances on the adage “think outside the box”, where people will build algorithms to find the best-fit algorithms for a related challenge. Seriously.
Personal brand is going to continue to grow in influence for future technologists. There are a few aspects of brand to consider. First, your internal brand within your immediate company – how your colleagues view you, and how you ensure you remain visible in the right areas within your company. Next, your external brand – how you are viewed in immediately applicable technology areas, both geographically and in parallel companies. Lastly, the social brand of a technologist will require a suitable online social media strategy, to complement the first two and ensure that you are visible in areas that may well blur into yours in the coming years.
For years, technologists had an interesting reputation! Most people believed they sat in dark rooms, writing code and building circuits, with “geek” and “nerd” aimed in their general direction. Not so now. It's now “cool” to be in technology, given that the technology we work on impacts everyone's lives. We can see it, feel it, touch it. It's real. And thus, the impression that technologists can make in various circles has increased. We are now in boardrooms (see my previous blog on trends), becoming online influencers; some are even achieving celebrity status (Elon Musk). There is also going to be a continued increase in the number of generations we work with, which will mean more youthful employees having to lead older generations.
OK, this may sound controversial. But think about it. As the lines between companies blur, with collaboration having a magnetic effect in pulling them exceptionally close to one another, it is predicted here that employees may begin to work across different organisations, with companies contracting their core employees into partner companies that need a particular skill set for a fixed period of time. It may go a step further, with employees interviewing companies rather than the other way round. The shift in power will happen, and employees will maintain their time bandwidth per week/year and give their services on a consultancy basis to multiple companies. There is also a trend suggesting the “one company employee” is a thing of the past, with employees freer to move quickly between jobs.
The next generation of technologists will have independence in their DNA, and will possess the soft skills required to self-manage their time and tasks. Point 5 above will demand this, but that is not to say upper management will not be required. What is being said is that hierarchical org charts will be a thing of the past, with flat structures working best in evolving technology companies.
The walls of companies will be well and truly knocked down, with advances in technology ensuring that “work from anywhere” is a distinct reality. Augmented reality will play a part in this, with renderings of colleagues solving the lack of contact and visibility that currently exists. Enabled by technology, an entirely new work environment is on the horizon. According to Wakefield Research, 91% of C-level executives and IT decision-makers believe that today's teenagers will be working in roles that do not exist today, and 72% agree that the traditional office as we know it will become obsolete within four years. Think about it. How is the generation in school now communicating? They were born into technology.
With online education companies such as Coursera becoming hugely disruptive in the education sector, it is predicted that the classical degree – postgrad – work (with training) model will change greatly. Numerous people have been quoted as saying “I don't use a huge amount of my primary degree”. This will mean that certain individuals (think of the 16-year-old kid who became a millionaire) will be hired sooner by companies, and will then receive their education incrementally throughout their careers. This is quite common in Japanese companies, with kids being given apprenticeships at 16 and mixing college with work over the next six years. Now if only the employment laws would catch up! Whilst incremental training is happening now within companies, colleges and companies don't recognize it as a sum of its parts.
Currently, having external commitments in technology areas, such as startup involvement, is seen as a bad thing by most companies. There are trends to suggest that companies are actively opening the door to employees who use their spare time to engage in other opportunities. And rightly so. The skill set that can be gained from contributing in different company and academic structures is incredibly valuable, and there is the added bonus for the company of having a viewpoint into more early-stage alpha and beta companies.
Yep. It's happening. And we don't even know it. The way education is being delivered these days demands huge levels of multitasking. The ability to respond to several different stimuli at the same time is called continuous partial attention. We used to teach in a way that demanded a tremendous amount of memorization, but now it's more about cognitive agility and multitasking. The part of the brain involved in memory, the hippocampus, is a little different from the multitasking part at the front of the brain. Technologies such as augmented reality and virtual reality will assist us in managing more information in real time.
We have so much choice now on the technology campus. To every Splunk there is a Hunk. Hadoop was barely alive when Spark came along. Java now has over 50 different varieties. Argh! Do we need to be expert in all of them? No, but we need to be able to switch between them seamlessly – or at least know what gets used where to meet the challenge we are working on.
“Computers are incredibly fast, accurate, and stupid; humans are incredibly slow, inaccurate and brilliant; together they are powerful beyond imagination.”
Connection. An ancient word. The meaning is “a relationship in which a person or thing is linked or associated with something or someone else.”
We use it in every walk of life. When we were kids, we formed relationship bonds with our parents, siblings and friends. We learned about the physical and emotional worlds by forming connections using our minds. The connection between cause and effect: falling off a wall was connected to a resulting injury; hugging a person caused an increase in happiness and connection with that person. Back then, we were the masters of what connections happened. They were with people and the world around us, and so were bound to our psychology.
We are now in a period of seismic shift in one of the chief currencies of our everyday world – connection. In the big data world, there are three parameters used to describe data – variety, volume and velocity (the 3 V's). We can also apply these to how connected we are. With the internet of screens that has emerged in the last decade, a proliferation of apps means we are becoming more connected through technology. But are we? The first wave has seen us build digital connections and a mechanism to stay informed of other people's lives, offering people more options, convenience and, to an extent, comfort. But what cost is associated with this accelerated connectivity, and do we understand it?
Thus far, the proliferation of connected devices and apps has evolved in ways that replicate what would be considered left-brain operations (logical, analytical, sequential): the number of connections we have, scrolling endlessly on “walls” for updates. We are just entering a wave of innovation that will require these devices to replicate right-brain operations such as intuition, emotion and empathy. We use the left side of our brain to perform the control aspects of our lives, whilst the right side is used to connect, or join the dots between, disparate societal occurrences.
There is an important evolution in our mindset that needs to occur. We have blindly accepted technology based on trend without truly understanding why it is required. We need an appreciation of why we should develop more human-like relationships with these connected devices, with relationship characteristics such as trust and emotion as high priorities. There is huge excitement about how humans and technology will work together in the future. It can be said this excitement is born from realizing that, as we evolve, our understanding of what it means to be human outweighs anything that technology alone can deliver. People have always been at the core of innovation, and this has led to an evolution in how improved our lives are. Now, however, the evolution of our minds is not only dependent on genetics and learnings in the natural and emotional worlds; we also have accelerating inputs from the technology age. Children are seeing technology enter their world at a much younger age than previous generations – what effect is this having on how they evolve and pass the genetic baton to future generations?
Now, as we move from the Internet of Screens to the Internet of Things era, it is important that we work on developing our understanding of, and psychology toward, this technology. Not only will there be devices in our possession, we will wear them directly and they will be added to our very genetic makeup (read more here from Elon Musk on Neuralink). It will not stop there. Our ecosystems will become more digitized, with smart sensors in everything from household appliances to objects we encounter and use in our daily lives.
As the volume of these devices hits the trillions, it is much more important that we evolve our Experience of Things rather than simply build a huge mesh of connected devices in our everyday world. We are seeing this accelerate. Empathy is starting to play a larger role – Alexa speaking to us creates a more human-like interaction, and artificial intelligence and machine learning technologies are being utilized to teach computers how to learn for themselves (a basic of human evolution) instead of us teaching them. Other technologies are evolving too, such as augmented reality in the classical visual sense that we see a lot of these days in the upcoming curve of technology adoption. But we will soon see other forms of augmentation – touch, hearing, taste – leading to a reduction in dependence on screens and an increase in the experience of using the technology. Once the technology begins to exhibit more human-like emotions, it becomes a study of how these devices are positioned in society (check this article from Bill Gates on robots and taxes).
As a race, we need to understand what we expect from technology and what we also expect from each other. These expectations will be critical in ensuring that the evolution of technology and humans together can indeed be optimized.
We cannot eat popcorn wearing a virtual reality (VR) headset – Zaid Mahomedy : ImmersiveAuthority.com
In 1995, the cringeworthy Johnny Mnemonic was released, in which the title character used a VR headset and gesture-monitoring gloves to control the “future internet”. Even though this movie is over 20 years old, it is only in the past few years that we have seen commercially ready Virtual Reality (VR) and Augmented Reality (AR) technologies hit the market.
If you watch this clip, you will hopefully notice two aspects. The first is that the technology is clunky. The second is that the predicted user experience (UX) he has is rich (for the decade of the movie's production): information is available at speed, the gloves are accurate, and the path and usability are seamless. When he moves his hands, the VR3 responds instantaneously. It assists him at every turn. Yet twenty years later, we have not quite reached this. Why? Because the focus needs to shift from technology to other aspects to enable this industry to flourish.
A large amount of technology companies' efforts in this space over the past two years has been focused on determining whether they can squeeze enough compute power onto a pair of glasses. Other questions to be answered were whether the battery would last a decent amount of time and whether heat emissions would be low enough not to inconvenience the user. Whilst there are still optimizations to be performed, the core of the technology has at least been proven, along with some clever innovations around leveraging smartphones to save on hardware investments.
In the coming years we will see more of these companies focusing on the user experience we have with these technologies – ensuring the interfaces and the gesture and motion recognition are close to perfect is high on companies' to-do lists. The hardware roadmap will ensure the devices are lighter, more robust and, frankly, sexier to wear. Before we discuss other aspects of how improved UX will be the focus of the coming years, that is not to say that technology won't help here. For example, the evolution of flexible compute paradigms, specifically in the nanotechnology area, will assist in building compute into glasses, instead of adding compute retrospectively.
Apart from the technology of VR and AR being quite different under the hood, the psychology of how they are used is different too. With AR, we are injecting a digital layer between us and the physical world. With VR, we are immersing ourselves in a digital world. These are very different experiences, and user experience design must have this difference at its core. We must ensure that the layer we design for AR takes characteristics from both our physical environment and our own personas. With VR, it is much more important to ensure the person feels comfortable and safe in that world.
The UX design of AR and VR technologies and applications will require careful management of multiple input styles. Using wearables, voice recognition, AR and AI, we will start to see seamless blending and integration in how technology interacts with us across various senses. Touch devices are still in use, but they will move aside as voice recognition, movement tracking and even brain waves come to be used to control these smart devices. The communication will be much faster and more intimate, and will force designers to completely rethink how we interact with these devices.
The UX of these devices will also require more human-like interactions, to build trust between the devices and their users in an organic manner. We are seeing this with voice control technology like Siri and Google Home, but for now they merely understand our voice and offer some sample responses. Soon they will learn to evolve their speech.
Artificial intelligence will take hold of the user experience, analyzing users' reactions to different experiences and then making changes in real time based on those assessments. UX will become a much more intuitive and personalized experience in the coming years.
Already we are seeing a myriad of startups evolving in the space, some focusing on content development in software, some on the actual hardware itself. Some are brave enough to offer both. We also have the larger companies creating divisions to provide offerings in this space. Choice is great – but when it becomes painful trying on your fourteenth pair of glasses at your average conference, it is not. When one takes time to observe how companies are beginning to partner up to offer solutions (a trend extremely common in the IoT industry), it is a small step towards some form of standardization. Excessive choice can be bad from a UX perspective, as such fragmentation in initial design makes it harder for app designers to get it right on the hardware.
At some point, we have to get away from the “toy” feel of these devices. We put them on for ten minutes in an airport or at an event to get a wow from them. Whilst the applications in the gaming industry are there to be seen, companies are beginning to focus on where else the market will be. Certain devices have flopped in the past two years, and you would wonder why, with such strong brands behind them. The first reason was awful UX. The second was that the market just was not ready, with a distinct lack of content to make them in any way useful. Just because a few of these devices fail doesn't mean the movement stops. (Below infographic source is washington.edu)
Consumer and industrial applications have very different requirements from a market perspective, with content choice and look and feel very important for consumer markets, while system performance and governance sit higher in industrial use cases. With the costs of adding these technologies to industrial environments under the microscope, companies must focus strongly on measuring and building the return on investment (ROI) models.
With these technologies predicted to get even closer than headsets (smart contact lenses, for example – link here), it is quite important that UX designers intrinsically build comfort and safety into any application. Too many times we have seen people fall over something whilst wearing a headset (more so with VR technologies). And that's just the physical safety. As the threshold between physical and augmented worlds gets closer and closer (mixed reality), we want to avoid a scenario of interface overkill.
Whilst the past few years may suggest that these technologies are fads, the reality is far from it. They will become part of our social fabric as a new form of mobile technology. Ensuring a good user experience with these technologies will be the critical enabler of their success and adoption rate.
Designing for AR and VR entails a better understanding of a user's needs when it comes to context of use. It's about building connections between the physical and digital worlds, requiring an interdisciplinary effort of service design, interaction design and industrial design.
Article aimed at anyone (technical or non-technical) who wants to understand the steps in machine learning at a high level. Readable in five minutes over coffee. I think.
Today we live in a world of a seemingly infinite number of connected devices, in both personal and commercial environments. The currency associated with these devices is data, which whizzes around in near real time and is stored locally and in cloud environments. The types of data vary greatly, with text, audio, video and numerical data just a sample of the data modalities generated.
As this data is a currency, there is value associated with it – but how do we extract that value? A high-growth area called data science is used to extract value and insight from this data. The recipe has numerous ingredients, with data mining, data optimization, statistics and machine learning key to generating any successful flavor. And like any good recipe, you need a good chef. These chefs in data terms are called data scientists, who use a wide variety of tools to glean insight from the data and deliver impact for your business. The data-sets themselves can be either univariate (a single variable or feature) or multivariate (multiple variables or features). A person's age would be an example of univariate data, whereas multivariate data would expand a person's feature set to include age, weight and waist size, for example.
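To make the univariate/multivariate distinction concrete, here is a minimal sketch in Python (all values invented for illustration):

```python
# Univariate data-set: a single feature per observation (ages in years)
ages = [23, 35, 41, 29, 52]

# Multivariate data-set: several features per observation
# (age, weight in kg, waist in cm) - purely illustrative values
people = [
    {"age": 23, "weight": 70.0, "waist": 81},
    {"age": 35, "weight": 82.5, "waist": 92},
    {"age": 41, "weight": 77.0, "waist": 88},
]

# A univariate summary statistic: the mean age
mean_age = sum(ages) / len(ages)
print(mean_age)  # 36.0
```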
Machine learning (ML) is born out of the perspective that instead of telling computers how to perform every task, perhaps we can teach them to learn for themselves. Examples range from predicting the sale price of your house based on a set of features (square feet, number of bedrooms, area), to determining whether an image shows a dog or a cat, to classifying the sentiment of a set of restaurant reviews as positive or negative. There are a host of applications across many industries, some of which are shown below (source: Forbes).
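As a hedged sketch of the house-price example (the numbers here are entirely made up), a one-feature linear model can be "learned" with ordinary least squares in plain Python:

```python
# Toy training set of (square feet, sale price) pairs - illustrative only
data = [(1000, 200_000), (1500, 280_000), (2000, 370_000), (2500, 450_000)]

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n

# Ordinary least squares fit for: price = slope * sq_ft + intercept
slope = (sum((x - mean_x) * (y - mean_y) for x, y in data)
         / sum((x - mean_x) ** 2 for x, _ in data))
intercept = mean_y - slope * mean_x

def predict(sq_ft):
    """Predict a sale price for an unseen house."""
    return slope * sq_ft + intercept

print(round(predict(1800)))  # 333400
```

Real projects would use a library such as scikit-learn rather than hand-rolling the fit, but the idea is the same: the model's parameters are learned from data rather than programmed by hand.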
Before the magic is induced from the algorithms, perhaps the most important step in any machine learning problem is the upfront data transformation and mining, towards optimization. Optimization is required because most of the algorithms that “learn” are sensitive to what they receive as input, which can greatly impact the accuracy of the model you build. It also ensures you have a thorough understanding of your data-set and the challenge you are trying to solve. Some of the data transformation and mining techniques include record linkage, feature derivation, outlier detection, missing value management and vector representation. All this is sometimes called “exploratory data analysis”.
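Here is a tiny illustration of two of those steps, missing value management and outlier detection, using invented sensor readings and only the standard library:

```python
import statistics

# Raw readings with a missing value (None) and one obvious outlier - invented data
raw = [12.1, 11.8, None, 12.4, 95.0, 11.9, 12.2]

# Missing value management: impute with the median of the observed values
observed = [v for v in raw if v is not None]
filled = [statistics.median(observed) if v is None else v for v in raw]

# Outlier detection: flag anything more than two standard deviations from the mean
mean = statistics.mean(filled)
stdev = statistics.stdev(filled)
outliers = [v for v in filled if abs(v - mean) > 2 * stdev]
print(outliers)  # [95.0]
```

Whether you then drop, cap or investigate the flagged reading depends on the problem; the point is that the "learning" algorithm never sees the raw, messy input directly.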
Once the data is presented in the right manner, there are a number of machine learning techniques one can apply. They are broken into supervised and unsupervised techniques: supervised learning takes a labelled input data-set to train your model on, while in unsupervised learning no labels are provided. Unsupervised techniques include learning vector quantization and clustering. Supervised techniques include nearest neighbours and decision trees. Another technique is reinforcement learning, a type of algorithm that allows software agents and/or machines to automatically determine the ideal behavior within a specific context, to maximize performance.
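As a minimal sketch of one supervised technique mentioned above, here is a k-nearest-neighbours classifier in plain Python (toy data, Euclidean distance; real work would use a library implementation):

```python
from collections import Counter

# Labelled training points: (feature vector, class) - toy data only
train = [((1.0, 1.1), "cat"), ((1.2, 0.9), "cat"),
         ((4.0, 4.2), "dog"), ((4.1, 3.9), "dog")]

def knn_predict(point, k=3):
    """Classify `point` by majority vote among its k nearest training points."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda item: dist(item[0], point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(knn_predict((1.1, 1.0)))  # cat
```

Note how the labels ("cat", "dog") are supplied up front: that is exactly what makes the technique supervised.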
Verifying your model is also an important step, and we often use confusion matrices to do this. This involves building a table of four results – true positives, true negatives, false positives and false negatives. A set of test data is applied to the classifier and the results are analysed to assess performance. Sometimes, the result of the model is still questionable. When this happens, machine learning has an answer in the form of ensemble methods, where essentially you build a series of models from which your final prediction is derived. Examples here include boosting and bagging on the training data: bagging splits the training data into multiple input sets, while boosting works by building a series of increasingly complex models.
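A quick sketch of how those four confusion-matrix cells are counted from a classifier's predictions (the labels below are invented for illustration):

```python
# Ground-truth labels vs. a classifier's predictions on held-out test data
actual    = [1, 1, 1, 0, 0, 0, 1, 0]
predicted = [1, 1, 0, 0, 0, 1, 1, 0]

pairs = list(zip(actual, predicted))
tp = sum(1 for a, p in pairs if a == 1 and p == 1)  # true positives
tn = sum(1 for a, p in pairs if a == 0 and p == 0)  # true negatives
fp = sum(1 for a, p in pairs if a == 0 and p == 1)  # false positives
fn = sum(1 for a, p in pairs if a == 1 and p == 0)  # false negatives

accuracy = (tp + tn) / len(pairs)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
print(tp, tn, fp, fn, accuracy)  # 3 3 1 1 0.75
```

From these four cells you can derive most common performance metrics; which one matters depends on the cost of each kind of error in your problem.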
There are complementary techniques used in any successful machine learning project – these include data management and visualization, and software languages such as Python and Java have a variety of libraries that can be used for your projects.
Taking a step further from machine learning, you are into a complementary area called artificial intelligence (AI), which leans more on methods such as neural networks and natural language processing that look to mimic the operation of the human brain. This shows how human-centric design in technology is evolving, and how much excitement there is for how humans and technology will work together in the future. It can be said this excitement is born from realizing that, as we evolve, our understanding of what it means to be human outweighs anything that technology alone can deliver. People have always been at the core of innovation, and this has led to an evolution in how improved our lives are.