Our Evolution towards Ubiquitous Intelligence

“The brain is the organ of destiny. It holds within its humming mechanism secrets that will determine the future of the human race”

– Wilder Penfield

Over the past few months, I have become hugely interested in understanding where the relationship between the human race and our technology landscape currently sits, and how it will evolve in the coming years. My thinking balances convenience against security: I can see how technology can make our lives ever more convenient, but I believe there need to be sound security economics around how it is managed.

Our Brain’s Genetic Evolution

The most complex component of our genetic makeup is our brain, and if we look back over the past million years or so, our brains have, for the most part, increased in size. However, that is not the case for the past 20,000 years: our brains have shrunk in that time frame. The reason why? It’s simple: our biology is focused on survival, not intelligence. Larger brains were necessary to allow us to learn to use language, tools and all of the innovations that allowed our species to prosper. But now that we have become civilized, certain aspects of intelligence are less necessary.

Something else has contributed to this: technology, which has allowed us to fast-forward evolution. Initially we could not win a fight with a deer, so we created spears and rifles. Transporting goods at high speed was impossible until we invented the wheel. We could not fly, so we invented airplanes. The list goes on. It is expected that our brains will shrink more in the next 1,000 years than in the last 20,000 due to the acceleration of technology. Now we are at the tail end of the internet revolution, and one must ask: how will the IoT revolution affect our brain’s genetic makeup? The natural answer is: hugely. Just ten years ago we had to remember far more “data” to perform our everyday jobs adequately. Now it is all a mouse click away.

Design will rule the relationship

Before this occurs, how humans and machines relate must evolve significantly. Humans are becoming more insecure about how technology is taking over, and rightly so. Technology 100 years ago was much more complementary, with humans maintaining their sense of agency: the feeling that you have control over your own actions, and through them have an impact on your environment. Humans controlled machines, and held the upper hand in 99% of situations. That percentage is dwindling as the artificial intelligence and compute power available to technology increases. Now technology corrects our spelling as we type, can tell us where the nearest car parking space is, and can even drive our cars.

But will everyone trust self-driving cars? The answer is no (link here). At least not initially, as the vehicles will be largely alien to us. Humans work and communicate incredibly well with each other because we have so much in common. We trust a taxi driver to get us to our destination. Why not a machine? The reason is that machines don’t feel human. Yes, they are smart, analytical and can apply reasoning to avoid a traffic collision. But these are what we would associate with left-brain activities. For true mass adoption, what’s missing is representation from the right-brain components: emotion, meaning, empathy.

We need to design in features which make machines more human-like. Recent studies have shown that people who had ridden in a self-driving car with a name and a voice were less inclined to blame the vehicle for an accident. The same reason explains why some people grow attached to Apple’s Siri. An experiment: try asking Siri “Can I change your name?”. Her response, “But everyone else calls me Siri”, indicates her emotional attachment to people. The key to building the relationship is how we design machines so that they are easier to work with, seamlessly integrated into our lives and, most importantly, trusted.

Societal Impact – Ubiquitous Intelligence

There are about 7 billion human brains on earth. With the advent of the Internet of Things (IoT), it is predicted that somewhere between 20 and 50 billion devices will be all around us by 2020. We can call these digital brains. Back in 1995, when I wanted to use a computer’s digital brain, I had to be on the top floor of my parents’ house, or at my father’s business premises. Now, as I type, I can count 28 devices around me that have some form of compute within them. Naturally, the level of sophistication and the usage patterns of that compute vary, which is quite similar to the human race: we have multiple types of intelligence, within humans and in technology. Just as we have job titles, machines also satisfy a job, whether it be to tell us when our fridge is empty (simple) or to interpret our shopping habits online through sentiment analysis (more complex). Throughout our evolution, technology has both replaced jobs and created new ones. So it shouldn’t be surprising that computers are taking over what we have come to regard as high-level human tasks. We did not evolve to optimize, but to survive and, perhaps most of all, to collaborate with others to ensure our survival. We are, after all, creatures of biology, not silicon.

As MIT’s Sandy Pentland put it, “We teach people that everything that matters happens between your ears, when in fact it actually happens between people.” So technology doesn’t mitigate the need for human skills, but it will change which skills are most highly valued. (source Forbes)

The neural network within our minds is edging ever closer to the internet network of things. It is only a matter of time before they are exclusively connected: mind and machine. In our natural world, a large amount of the data being generated is static, modular or even transactional by nature: an item of sale, an aspect of sentiment from an online browser, an environmental statistic. With the advent of IoT, you will begin to see more streaming and dynamic data types, which will need more dynamic compute paradigms (I recently posted an article on this in the Dzone.com IoT guide for 2016 – link here). This dynamism is something we as humans do incredibly well: change, react, act. Progress in areas like Machine Learning and Cognitive Computing is ensuring technology can do the same, so mind and machine can work more effectively together.
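As a toy illustration of what a more dynamic compute paradigm looks like, here is a minimal sketch in Python; the sensor feed, smoothing factor and threshold are all invented assumptions. The point is that the code reacts to each reading as it arrives, rather than batch-processing a static table after the fact.

```python
# A minimal sketch of stream-oriented compute, assuming a generic numeric
# sensor feed; the smoothing factor and threshold are invented.

def stream_monitor(readings, alpha=0.3, threshold=3.0):
    """Maintain an online moving average and flag sudden jumps mid-stream."""
    ewma = None
    for value in readings:
        if ewma is not None and abs(value - ewma) > threshold:
            yield ("anomaly", value, round(ewma, 2))   # react the moment it happens
        ewma = value if ewma is None else (1 - alpha) * ewma + alpha * value

# Usage: any iterable works, including a live socket or message queue.
for event in stream_monitor([20.1, 20.3, 20.2, 27.9, 20.4]):
    print(event)   # ('anomaly', 27.9, 20.17)
```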

Technologies make it possible to augment human performance in physical, emotional and cognitive areas. The main benefit for any business in augmenting humans with technology is to create a more capable workforce. We are about to live in a world of ubiquitous intelligence, so we must ensure the psychology of human-to-machine interaction is sustainable. As machines become more integrated in society, they will move beyond simply making our lives more convenient. Our agency will be affected in ways that would have seemed incomprehensible a decade ago. Whilst it may begin with “us” and “them”, our trust and acceptance will evolve the relationship into one which will seem normal to children of future generations.

Recent Talk: Humans, Data Science and IoT – Perfect Synergy

A quick post to link you to a talk I gave recently at the Tech Connect Live Conference at the RDS, and at the Big Data Breakouts Conference in Belfast (hosted by Analytics Engines).

You can watch the talk here, posted during the week: Humans, Data Science and IoT – Perfect Synergy. At 16 minutes, it’s not too long!

I am so excited by how technology and humans will evolve together, so this talk was more to set the scene of how both have evolved. The next talk in the series will focus on the future evolution of technology and the human race, showcasing the interdependence between the two, and how technologies such as machine intelligence and context-aware computing will evolve to bridge the gap and integrate machines deeper into our everyday lives.

Man vs Machine: Why the competition?

With the continued evolution of industries such as Data Science and the Internet of Things, there is a mix of excitement and fear amongst the populace: excitement for what they can do for our lives and businesses, but fear of what they will mean for humanity.

The fears are normally sourced from the media, or from some childhood movies we watched where rather large robots take over planet earth. “Will the robots take our jobs?” “There will be nothing left for us to do with the evolution of the computer.”

In reality, a synergy between humans and technology can lead to better all-round solutions than either in isolation. This is something that is rarely considered in current engineering circles. With so much technology choice, why would we need us stupid humans?

A brief story to set the tone

As far back as 2007, I fielded questions like this as part of the day job. Increased automation in manufacturing is a natural spawning ground for questions of this nature. An example was a computer vision application that I built for a Masters dissertation whilst working for Alps Electric (one of the coolest companies in the world). It inspected the graphics on buttons for correctness, both in finish and symbol. Naturally, this was a task historically done by humans. We were using classification techniques on the images, and the receiver operating characteristic (ROC) curves showed that the classifier was right 93% of the time, which was a pretty good first-pass result. Please note that this was a time when data science was called “doing your job”.

We wanted to achieve 100%, so in order to improve the algorithm, we decided to use the main source of intelligence in the room: the humans! By presenting the failures to the operator on the production line, asking a simple binary yes/no question (“Is this a genuine failure?”), and saving their response along with the original image, we were able to get the classifier to close to 99% accuracy.

This proved something that I always felt was the case: humans and machines can work in tandem, as opposed to viewing it as a competition. With the rapid advancement of technology, along with the obsession with using technology to optimize our lives, I pose a question: have we forgotten how these can complement each other? If a data science/machine learning application reaches an accuracy level of, say, 70%, we try to squeeze extra accuracy out of it through “fine tuning” the algorithm. Perhaps we could instead present the results to a human for final classification, as in the sketch below.
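This is a minimal human-in-the-loop sketch, not the original Alps Electric system: the `model` object and its scikit-learn-style `predict_proba` method are assumptions, as is the 0.95 confidence cut-off.

```python
# Hedged sketch only: `model` and its scikit-learn-style predict_proba()
# are assumed. Low-confidence predictions are routed to an operator,
# whose yes/no answer becomes a fresh training label.

review_queue = []   # (image, score) pairs awaiting a human decision
new_labels = []     # operator-confirmed labels for the next retraining run

def classify(image, model, confident=0.95):
    proba = model.predict_proba([image])[0]
    label, score = proba.argmax(), proba.max()
    if score >= confident:
        return label                      # the machine handles the easy cases
    review_queue.append((image, score))   # the human handles the hard ones
    return None

def record_operator_answer(image, is_genuine_failure):
    """Bank the operator's binary answer so the classifier improves over time."""
    new_labels.append((image, int(is_genuine_failure)))
```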

Bring it all together

Last April, I tried to draw out how I saw Data Science, IoT and Intelligence (both computing and human) fitting together, which is shown below. It is an evolutionary map of sorts: we have always had the verticals and data modalities (data types), and we began our data journey by building simple data processing/mining applications (either manually or by using algorithms). Many of the current challenges in data science can be solved by this tier. However, we are seeing an increasing requirement to bring in machine learning applications to solve more advanced challenges. This is a natural evolution towards artificial intelligence, or deep learning. If we look down the map in a holistic sense, this is where the top class really comes to the fore.

Humans are, by our very nature, truly intelligent, with an intelligence that evolves as we do, but we are NOT very good at mass processing. Computers, on the other hand, are not so intelligent to begin with, but are incredibly good at mass processing. A natural hybrid would combine true intelligence and mass processing, and that should be the aim for modern Artificial Intelligence companies and enthusiasts.

[Image: evolutionary map of Data Science, IoT and Intelligence]

Now, I am not saying that all applications in IoT and Data Science can be solved like this. Of course there will be exceptions. But there are some real, tangible benefits to this approach. Consider the area of street crime. Imagine every camera in a city feeding video into a central location, with a human asked to monitor it. In reality, this is already happening individually per building, park and mall, where security guards monitor areas in real time. With advancements in video analytics, it was feared that technology would replace these humans. But that is not the case. What has happened is that, as more devices/cameras hit our streets, it becomes impossible to monitor everything. By using advanced video analytics/machine learning capability to flag the anomalies to security staff, they can monitor a bigger space.
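A rough sketch of that flag-don’t-replace pattern follows; `score_frame` stands in for whatever video analytics model is deployed, and the alert threshold is an invented assumption.

```python
import itertools
from queue import PriorityQueue

guard_queue = PriorityQueue()   # most suspicious frames surface first
_tiebreak = itertools.count()   # stops the queue comparing raw frames on ties

def triage(camera_id, frame, score_frame, alert_level=0.8):
    """Only frames the model cannot dismiss ever reach the guard's screen."""
    score = score_frame(frame)  # 0.0 = routine, 1.0 = highly suspicious
    if score >= alert_level:
        # negate the score so higher-risk frames pop off the queue first
        guard_queue.put((-score, next(_tiebreak), camera_id, frame))
```

One guard can then cover hundreds of feeds: the machine watches everything, the human judges only what the machine cannot dismiss.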

Thankfully, one of the high-growth areas in technology is Human Machine Interfaces (HMIs), and there are some really good examples of how humans and computers can work together. Daqri‘s smart helmet is one such product: the world’s first wearable HMI. Their mandate is to use technology to improve and optimize how we work, by integrating compute, sensors and computer vision technology into a well designed helmet. Work, in the Future.


As we enter the next phase of digital transformation, ask yourself: how can humans improve and complement the work of technology in your application?

Just how “Data Intelligent” is your company?

Whilst terms like “Big Data”, “Data Analytics”, “Business Intelligence” and “Data Science” have seemingly been around for many years, not a lot of companies have really understood the boundaries between them, and the interrelationships that lead their efforts in data to genuine business impact.

Business impact is the key end goal of any investment in data initiatives in your company. Whilst data exploration is always a useful exercise, if it does not lead to benefit for either your internal organisation or your customers, then it can be a waste of company resources.


Although the specific approach to the application of analytics – whether through BI, Data Science, or application building – may vary according to an enterprise’s needs, it is important to note the broad applicability of BI. Its capabilities are constantly expanding to include greater access to more forms of data in intuitive, interactive ways that favor non-technical users. Consequently, the business can do more with the data accessed through these tools, in less time than it used to, which makes applying discovery-based BI an excellent starting point for the deployment of analytics. A nice approach outlined by Michael Li of LinkedIn here shows an EOI model for driving business value.

[Image: Michael Li’s EOI model]

According to Gartner: “By 2015, ‘smart data discovery,’ which includes natural-language query and search, automated, prescriptive advanced analytics and interactive data discovery capabilities, will be the most in-demand BI platform user experience paradigm, enabling mainstream business consumers to get insights (such as clusters, segments, predictions, outliers and anomalies) from data.”

Data Transformation is key

Companies around the globe normally have these questions to answer: just where is all my data? What format is it in? Can I use it? A large part of the challenge in maximizing the business impact from your data is understanding what I like to call your “Data Atlas”. And it is normally a journey. The larger the company, the greater the size of this challenge. Multinationals, for example, having been in existence for a long period of time, have been accumulating data for longer, and it is common for them to have multiple data centers, hosting strategies, database types, data types, data formats, and ways the data is actually used. It can be difficult for these companies to get their data into the formats required for the latest data software platforms. This can be a time-consuming exercise, which can consume resources before any business impact is seen.
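As a toy illustration, a first pass at a Data Atlas can be as simple as walking the file shares and tallying what lives where. The root path below is an invented assumption, and a real estate also spans databases and data centres.

```python
import os
from collections import Counter

def build_data_atlas(root="/mnt/shared-data"):
    """Walk a file share and tally formats: a crude first page of a Data Atlas."""
    atlas = Counter()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower() or "<no extension>"
            atlas[ext] += 1
    return atlas   # e.g. Counter({'.csv': 1200, '.xml': 450, '.bak': 90})
```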

Looking at the industry, one company doing wonders in solving this type of challenge is Analytics Engines, based out of Belfast. Their “Fetch Do Act” methodology offers a click-to-deploy, end-to-end big data analytics platform that enables rapid transformation of your data into business insights within a single environment. Check it out here. The major advantage of this approach is that it accelerates your data transformation, so you can focus more of your time on the “Act” element. Remember: Big Data is just a tool.

Defining Data Science?

Explore. Hypothesize. Test. Repeat.

That’s what scientists do. We explore the world around us, come up with hypotheses that generalize our observations, and then test those hypotheses through controlled experiments. The positive and negative outcomes of those experiments advance our understanding of reality. Now one of the best definitions for Data Science I have come across is described by DATAVERSITY™ as:

“Data Science combines the allure of Big Data, the fascination of Unstructured Data, the precision of advanced mathematics and statistics, the innovation of social media, the creativity of storytelling, the investigation and inquiry of forensics, and the ability to use all of those skills together while still being able to demonstrate the results to non-technical audiences.”

Just as in any other science, everything you do with a sample, whether it be biological, chemical or physical, is considered science: up-front analysis, sampling, applying statistics, interpreting and securing the end results.

Beware the Hype

Industry indicates that the hype curve of analytics has peaked, but as it settles, terms like machine learning and predictive analytics are coming up the hype curve, and will have a huge role to play in the coming years. But ensure you only adopt them when your use cases require them. See past the buzz and ensure your strategy takes on board industry trends but remains somewhat unique to the personality of your company. Stay focused, and keep simplicity at the forefront of your mind. It is also becoming easier to outsource and partner on some of these advanced methods; typing “machine learning platform” into Google will give numerous results (here).

Customer Centric Analytics

Exploration and experimentation are an important part of your data journey. The key is not to let them become all you do, and to understand the difference between insight and impact. Insight does not result in improvement unless you can translate it into business impact. The “data to action” loop below does a nice job of visualizing the difference between data-to-insight and insight-to-action.

[Image: the “data to action” loop]

Know your customer. Every data custodian has one. The IT Manager’s customer is the Data Architect, whose customer is the Data Scientist. They in turn must ensure they meet the requirements of the business sponsor, and having a use case to solve or a KPI to meet will help you to build comprehensive return on investment (ROI) statements, and ensure quicker acceptance of the importance of analytics to your company’s business future.

 

CIT Guest Lecture on IoT

I’d like to point you to a recent lecture I did for the Cork Institute of Technology (CIT) on the Internet of Things. You can see it here.

In the lecture I cover the following:

-> Introducing the Internet of Things
-> Understanding IoT
-> How to Design for IoT
-> Data Governance
-> What is your IoT Identity?
-> How to build your IoT ideas – Lean Startup
-> IoT Use Cases
-> Tech & Innovation trends

Happy watching!

IoT’s predicted impact on TV, Digital Media and Advertising in our homes

First, let’s look at some history. Up to the advancement of the internet, the TV was the smartest device in our homes, and took a lot of our “downtime” attention. Thus, the advertising strategy for media and TV companies was centralized around it. Once the computer took over, TV started to surrender some of its in-home attention index. The industry didn’t react, as it didn’t see the computer as enough of a threat: it offered something different to the TV, so why bother? Once broadband rates increased, however, and mobile technology became more and more prevalent, people began to drift away and watch programs on more devices. Hence the invention of smart TVs. But is it too late? TV companies have also reacted by offering players on mobile and tablet, and that has helped to stem the flow.

“The smart home of 2020 will produce more data and be smarter than an average sized industrial plant today.”

Are digital media, advertising and TV companies seeing the bigger picture, though? The number of connected devices in our homes is increasing all the time, and the technology is becoming more advanced. With technology like augmented reality, and specifically in-home offerings such as the Microsoft HoloLens, TV and streaming services can now be rendered on any flat surface in the home. The natural reaction is to assume this is a threat to TV. But it can be an enabler. If TV can now be anywhere, on any surface, the advertising and broadcasting can be smarter and more reactive than ever before.


With the number of smart services in our homes increasing, long gone are the days when broadband as a service and TV as a service were separate industries. Many service providers now offer both as one, and providers are advised to really look at the expansion of the types of services that IoT will bring, such as home security, home safety, home automation, energy management or e-health, and to see if they will gain growing traction with consumers as the market matures and we reach mass-market penetration. Industry research consensus shows that such advanced services could bring extra monthly ARPU in the range of 10 to 35 euros, for a penetration of 5 to 30 percent of the subscriber base in advanced markets. This is a significant opportunity for service providers – potentially tens of millions of euros in additional revenues in the future for a 1M-subscriber operator – despite the regional volatility factors built into these estimates. People seem to want more choice; however, utilization rates are still not as high as you might think for streaming services.

I want to talk about what I call secondary virtual advertising, which I will define here in relation to the mass adoption of augmented reality. It is a step beyond virtual advertising, which inserts adverts over panels at, for example, sports events. The primary focus for people in augmented TV and streaming is on the screen; we hate to be interrupted by adverts. What if we could render adverts in free space that are not as “intrusive”, and essentially float close to our devices and environments? Advertise coffee when you approach your smart kettle. Advertise food when you approach your smart fridge. Get the message? This is secondary virtual advertising: a softer form of advertising that can exist in unison with TV and streaming feeds.
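A deliberately tiny sketch of the idea, with invented device names and ad pairings:

```python
# Invented device names and ad pairings, purely to illustrate the concept.
AD_CATALOGUE = {
    "smart_kettle": "coffee brand",
    "smart_fridge": "grocery delivery",
    "smart_tv": None,   # the primary screen stays uninterrupted by design
}

def secondary_ad(nearby_device):
    """Return a soft advert to float near the device, or None to show nothing."""
    return AD_CATALOGUE.get(nearby_device)

print(secondary_ad("smart_kettle"))   # -> 'coffee brand'
```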


One aspect of IoT that can be an enabler or opportunity for TV companies is people’s online personality, and what they surf. If I surf a lot of sports or nature topics, TV companies should look to promote their own programs in that space for me to watch. This is similar to what is happening in the retail industry when we shop online. On the flip side, our online behavior can be leveraged by TV companies to figure out what types of programs they should make: crowd-sourced topic identification for broadcasting, so to speak.
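As a toy sketch of that crowd-sourcing idea (the topic-to-genre mapping is invented), the first pass could be as simple as tallying what viewers browse:

```python
from collections import Counter

# Invented topic-to-genre mapping, purely illustrative.
TOPIC_TO_GENRE = {
    "sports": "live sport",
    "nature": "wildlife documentary",
    "cooking": "food show",
}

def rank_genres(browsing_topics):
    """Tally a crowd's browsing topics into a ranked list of programme genres."""
    tallies = Counter(TOPIC_TO_GENRE[t] for t in browsing_topics
                      if t in TOPIC_TO_GENRE)
    return tallies.most_common()

print(rank_genres(["sports", "nature", "sports", "cooking", "sports"]))
# -> [('live sport', 3), ('wildlife documentary', 1), ('food show', 1)]
```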

A quote from Stephen White, president at Gracenote, the TV and music metadata company:

“The evolution of the TV will be towards all-purpose connectivity, with the ability to extend or push services to tablets and smartphones and soon other appliances, even cars. The TV offers video but can be so much more.”

One final statement: TV broadcasting and advertising companies need to understand their own position in the smart media home of the future, and accept that while they won’t be central to it, they can be a key element or service within it. Heck, the physical TV will most likely disappear.

Closing off Web Summit 2015 – Day 2/3

And so it ends. The Web Summit on Irish shores finished on Thursday (for now), and I must admit there was an atmosphere of “what if” and a touch of sombreness. But we cannot allow this to affect our perspective on the impact this conference has had on Ireland’s tech landscape. Paddy Cosgrave has built a conference from 400 attendees to 35,000. Let’s put that into context with regard to people’s perspective of Ireland as a tech hub. With over 100 countries represented, and the tech world itself being quite a small one, the voices of the 35,000 will translate into millions. And I am certain the conversation will be about Web Summit, the friendliness of the services and the vibrant Night Summit, and not the number and cost of hotels, the government, and the traffic.

Wednesday was a great day and one of the best I have had at a Web Summit event. It started on the Machine Stage, where a panel including Nell Watson from Singularity spoke on how machines and humans will coexist and complement each other in our smart future. I liked how Nell spoke about the seamless integration of machines, and how the governance of same will be a key piece of the puzzle.

Nell Watson from Singularity speaks on Machine Stage
Next up on the Machine Stage was another panel, which included Dr. Joe Salvo from GE and Dr. Said Tabet from EMC. The panel was expertly hosted by Ed Walsh, director of technology vision for EMC. Whilst interviewing the guys, Ed brought out not only the technology vision required for IoT, but also the collaboration that can be enabled by consortiums like the IIC, which Dr. Salvo and Dr. Tabet have been so instrumental in building.

Ed Walsh hosting a panel session on the industrial internet
As already mentioned, the proliferation of virtual reality was evident, and I got a demo of Amazon’s Audible technology. It was quite neat!


Friday was a more relaxed day, with numbers down a little, but this allowed for a different kind of networking experience. It was a day to chat with as many startups as possible, and to catch some great talks. One that stood out was on Centre Stage, where a panel (including Christine Herron from Intel Capital, Albert Wenger from Union Square Ventures, and Mood Rowghani from KPCB) was hosted by Charlie Wells of the Wall Street Journal. The topic discussed: tomorrow’s tech landscape. Growth, or just a bubble?

 

Panel Discussion on Future of Tech
Well, it looks like what Nell mentioned above is already happening, judging from who I bumped into!

And so, we are off to Lisbon. Whilst I believe there will be challenges there also, it is Cosgrave’s personality that will shine through. An excellent data science company, CI Labs, spawned out of the Web Summit, and whilst there is that data science feel to a lot of the Web Summit, it is this personality of Cosgrave and his team that still makes the event stand above many.

Web Summit 2015 – Day 1 

Just catching my breath after day 1 of Web Summit. Wow! How can it get any better? And it doesn’t feel like day 1, as the networking and tech talk began yesterday. I shared a coffee with Ed Walsh from EMC after registration, and an excellent conversation set the scene for the Web Summit. I’m really looking forward to the EMC/GE panel event tomorrow.
The big thing for me is to compare this with last year’s event, and see how the tech landscape has shifted. We are seeing a lot around the usual topics: analytics, mobile, social. The big addition this year is that it seems more focused per vertical, with medical and retail featuring prominently. I believe the key drivers are simply around the concept of “process and life optimisation”. The Internet of Things (IoT) is everywhere: in talks, conversation and startups. Virtual and augmented reality are also hot topics this year.

With 2,000 startups, there are multiples of everything, so trying to pick out the cream can be a challenge. The best advice I could give to startups is to have three key differentiators ready, along with three of their biggest clients. Selling is a key component of technology, and it can be overlooked.

I attended some really cool talks today, with the future-of-healthcare talk on the Centre Stage generating a nice buzz, because it applies to everyone. Amit Singh from Google also spoke very well on how machine learning will be an enabler. Rob Mee, CEO of Pivotal, held a captive audience with some fascinating insight into the enterprise software analytics they provide. A thought leader with a unique vision.

Ramji Srinivasen of Counsyl speaking at panel event on future of healthcare

The atmosphere does indicate the size of the tech landscape here in Ireland. Even while having coffee yesterday, I bumped into Fintan and Adrian from Firmwave, who bring really cool technology around firmware and software design for IoT. Check those guys out tomorrow. It was also great to catch up with the team from Analytics Engines in Belfast: a company with great momentum, drive and a suite of analytics services across numerous verticals, now including manufacturing and life sciences.

One key element you see is the huge advantage in how small our island is, as collaboration occurs naturally, and that is going to be crucial to technology enablement in this country. Everyone has a role to play, from the kids to the elder generation. CoderDojo again has a strong presence, and we can learn from the kids’ creativity and enthusiasm. Never lose it. Again I met lots of bright, talented people in tech at Web Summit, none more so than Andrea Graham from EMC, who despite her age acts and converses like someone far more senior, and has even built her first company. People like Andrea have a big role to play in Ireland’s technology future.

It was also great to see such cultural diversity, with over 100 countries represented. I am fascinated by the tech landscape and trends in Africa. With such huge populations and without a widespread number of tech solutions, it seems like the perfect landscape to act as a testbed for new technology. This year there are a number of alpha startups from Africa on display. Awesome to see.
An interesting dynamic is also seeing how startups have progressed in a year; it was good to see the likes of Algolia progress so well.

Me and Alexander Farrell from Aveeza chilling near main stage!

So now it’s onto day 2! Expect another update tomorrow. 
Finally – Go team Tyco! 

Graham Baitson, Glenn Fitzpatrick and myself modelling the Tyco t shirts!

10 Perspectives on “All Things Data”

Switching focus back to a series of technical blog posts: over the next 5/6 posts (there may be some Web Summit updates intertwined!) I aim to demystify “all things data”, including reporting, analytics, data science and business intelligence, the key differences and dependencies between these terms, and an introduction to where machine learning fits into your company’s data model. Governance, security and data management will also be covered.

To begin, a short post with 10 perspectives that will get you thinking. (hopefully!)

1: Big Data is just a tool.

2: Analytics is utilized by Data Science and Business Intelligence

3: Data is never clean. You will spend more of your time cleaning and preparing data (up to 90%) than anything else.

4: 90% of tasks do not require deep machine learning

5: More data beats a cleverer algorithm

6: Data Science + Decision Science + Analytics = Business Impact

7: You should embrace the Bayesian approach (a minimal sketch follows this list)

8: Academia and Business are two different worlds – know this.

9: Presentation/Visualisation is key (know your audience)

10: There is no fully automated Data Science. You need to get your hands dirty.
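On perspective 7, here is a minimal worked example of Bayesian updating; the prior and the daily counts are invented. A Beta prior over a conversion rate is shifted by each batch of observed successes and failures.

```python
# Perspective 7 in practice: Beta-Binomial updating with invented numbers.
alpha, beta = 2, 2            # prior belief: conversion rate around 50%

for successes, failures in [(12, 38), (9, 41)]:   # two (toy) days of data
    alpha += successes        # every observed success shifts the belief up
    beta += failures          # every observed failure shifts it down

print(f"posterior mean: {alpha / (alpha + beta):.3f}")   # -> 0.221
```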

The 2020 Digital Employee: 10 Characteristics

Whilst this post focuses on the characteristics that millennials (and others) working in the technology sector will need in the coming years, it could be applied to other industries also. Some of these features are already in play; however, they are not currently seen as a full set. Happy reading.

1: Collaborator

Internal company perspectives will no longer be enough. With the lines between various technologies and industries blurring, truly understanding the necessary trends will require both the person and the company to forge strong, active links with external companies, startups and universities. The tools required to achieve this will also evolve, with social media collaboration tools coming into the mainstream, along with the continued influence of smart devices and wearables for managing our workload.

2: Applies Relational Technology

A positive from active collaboration is the potential to see other technology methods across various industries. The wide-lens approach will provide plenty of food for thought when the technologist looks to solve their immediate challenges. A good example of this is applying classical file compression algorithms to bioinformatics problems in genome sequence analysis for disease susceptibility patterns (a toy sketch is below). There will be huge advances on the adage “think outside the box”, where people will build algorithms to find best-fit algorithms to solve a related challenge. Seriously.
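As a hedged sketch of that cross-domain idea: this is a generic normalized compression distance, not any specific published pipeline, and the sequences are toy data.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: ~0 = near-identical, ~1 = unrelated."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"ACGTACGTACGTTTGACA" * 10
b = b"ACGTACGTACGATTGACA" * 10   # near-identical variant of a
c = b"TTTTGGGGCCCCAAAAGT" * 10   # unrelated sequence
print(ncd(a, b), ncd(a, c))      # the variant scores far closer to a
```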

3: Brand.me

Personal brand is going to continue to grow in influence for future technologists. There are a few aspects of brand to consider. First, your internal brand within your immediate company: how your colleagues view you, and how you ensure you remain visible in the right areas within your company. Next, your external brand: how you are viewed in immediately applicable technology areas, both geographically and in parallel companies. Lastly, the social brand of a technologist will require a suitable online social media strategy, to complement the first two and ensure that you are visible in areas that may very well blur into yours in the coming years.

4: People Person/ Personality

For years, technologists had an interesting reputation! Most people believed them to sit in dark rooms, writing code and building circuits, with “geek” and “nerd” aimed in their general direction. Not so now. It’s now “cool” to be in technology, given that the technology we work on is impacting everyone’s lives. We can see it, feel it, touch it. It’s real. And thus, the impression that technologists can make in various circles has increased. We are now in boardrooms (see my previous blog on trends), becoming online influencers; some are even achieving celebrity status (Elon Musk). Also, there is going to be a continued increase in the number of generations we will have to work with, which will mean more youthful employees having to lead the ageing generations.

5: Employee Skills as a Service

OK, this may sound controversial. But think about it. As the lines between companies blur, with collaboration having a magnetic effect in pulling them exceptionally close to one another, it is predicted here that employees may begin to work across different organisations, with companies contracting their core employees into partner companies that need a particular skill set for a fixed period of time. It may go a step further, with employees interviewing companies rather than the other way round. The shift in power will happen, and employees will maintain their time bandwidth per week/year and give their services on a consultancy basis to multiple companies. There is also a trend suggesting the “one company employee” is a thing of the past, with employees freer to move quickly between jobs.

6: Self Managing

The next generation of technologists will have independence in their DNA, and will possess the soft skills required to self-manage their time and tasks. Point 5 above will demand this, but that is not to say upper management will not be required. What is being said is that hierarchical org charts will be a thing of the past, with flat structures working best in evolving technology companies.

7: Mobility

The walls of companies will be well and truly knocked down, with advances in technology ensuring that “work from anywhere” is a distinct reality. Augmented reality will play a part in this, with renderings of colleagues solving the lack of contact/visibility challenge that currently exists. Enabled by technology, an entirely new work environment is on the horizon. According to Wakefield Research, 91 percent of C-level executives and IT decision-makers believe that today’s teenagers will be working in roles that do not exist today, and 72 percent agree that the traditional office as we know it will become obsolete within four years. Think about it: how is the generation in school now communicating? They were born into technology.

8: Educational Diversity

With online education companies such as Coursera becoming hugely disruptive in the education sector, it is predicted that the classical Degree – Postgrad – Work (with training) model will change greatly. Numerous people have been quoted as saying “I don’t use a huge amount of my primary degree”. This will mean that certain individuals (think of the 16-year-old kid who became a millionaire) will be hired earlier by companies, and will then receive their education incrementally throughout their career. This is quite common in Japanese companies, with kids given apprenticeships at 16 and mixing college with work over the next six years. Now if only the employment laws would catch up! Whilst incremental training is happening now within companies, colleges and companies don’t yet recognize it as a sum of the parts.

9: Startup(s) as a Hobby

Currently, having external commitments in technology areas, such as startup involvement, is seen as a bad thing by most companies. But there are trends to suggest that companies are actively opening the door to employees who use their spare time to engage in other opportunities. And rightly so. The skill set that can be gained from contributing to different company and academic structures is incredibly valuable, and there is the added bonus for the company of having a viewpoint into more early-stage alpha and beta companies.

10: Mass Parallel Processors of Information

Yep. It’s happening. And we don’t even know it. The way education is being delivered these days demands huge levels of multitasking. The ability to respond to several different stimuli at the same time is called continuous partial attention. We used to teach in a way that demanded a tremendous amount of memorization, but now it’s more about cognitive agility and multitasking. The part of the brain involved in memory, the hippocampus, is a little different from the multitasking part at the front of the brain.

We see it currently in technology. For every Splunk there is a Hunk. Hadoop was barely alive when Spark came along. Java now has over 50 different varieties. Argh! Do we need to be expert in all of them? No, but we need to be able to switch between them seamlessly, or at least know what gets used where to meet the challenge we are working on.