The 2020 Digital Employee: 10 Characteristics

Whilst this blog focuses on the characteristics that millennials (and others) working in the technology sector will need in the coming years, it could be applied to other industries too. Some of these traits are already in play; however, they are not currently seen as a full set. Happy reading.

1: Collaborator

Internal company perspectives will no longer be enough. With the lines between various technologies and industries blurring, truly understanding the trends needed to keep up will require both the person and the company to forge strong, active links with external companies, startups and universities. The tools required to achieve this will also evolve, with social media collaboration tools coming into the mainstream, along with the continued influence of smart devices and wearables for managing our workload.

2: Applies Relational Technology

A positive from active collaboration is the potential to see other technology methods across various industries. This wide-lens approach will provide plenty of food for thought when the technologist looks to solve their immediate challenges. A good example of this is applying classical file compression algorithms to bioinformatics problems in genome sequence analysis for disease susceptibility patterns. There will be huge advances on the adage "Think outside the box", where people will build algorithms to find best-fit algorithms to solve a related challenge. Seriously.
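The compression-to-genomics idea can actually be sketched in a few lines using the normalized compression distance (NCD), a parameter-free similarity measure built on any off-the-shelf compressor. The sequences below are invented fragments purely for illustration, not real genomic data:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: smaller means more similar.
    A compressor that exploits shared structure makes compress(x + y)
    barely larger than compress(x) when x and y are alike."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy "genome" fragments (made up for illustration).
seq_a = b"ACGTACGTACGTACGTACGT" * 20
seq_b = b"ACGTACGTACGTTCGTACGT" * 20   # near-identical variant of seq_a
seq_c = b"GGCCTTAAGGCCTTAAGGCA" * 20   # unrelated pattern

# The near-identical variant scores as closer than the unrelated one.
assert ncd(seq_a, seq_b) < ncd(seq_a, seq_c)
```

The same trick has been used in the literature to cluster sequences without any hand-built feature engineering, which is exactly the kind of cross-industry borrowing described above.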

3: Brand.me

Personal brand is going to continue to grow in influence for future technologists. There are a few aspects of brand to consider. First, your internal brand within your immediate company: how your colleagues view you, and how you ensure you remain visible in the right areas within your company. Next, your external brand is how you are viewed in immediately applicable technology areas, both geographically and in parallel companies. Lastly, the social brand of a technologist will require a suitable online social media strategy, to complement the first two and ensure that you are visible in areas that may very well blur into yours in the coming years.

4: People Person/ Personality

For years, technologists had an interesting reputation! Most people believed them to sit in dark rooms, writing code and building circuits, with "geek" and "nerd" aimed in their general direction. Not so now. It's now "cool" to be in technology, given that the technology we work on is impacting everyone's lives. We can see it, feel it, touch it. It's real. And thus, the impression that technologists can make in various circles has increased. We are now in boardrooms (see my previous blog on trends), becoming online influencers, and some are even attaining celebrity status (Elon Musk). There is also going to be a continued increase in the number of generations working side by side, which will mean more youthful employees leading older generations.

5: Employee Skills as a Service

OK, this may sound controversial. But think about it. As the lines between companies blur, with collaboration having a magnetic effect in pulling them exceptionally close to one another, it is predicted here that employees may begin to work across different organisations, with companies contracting their core employees into partner companies that need a particular skill set for a fixed period of time. It may even go a step further, with employees interviewing companies rather than the other way round. The shift in power will happen: employees will maintain their time bandwidth per week/year and give their services on a consultancy basis to multiple companies. There is also a trend suggesting the "one company employee" is a thing of the past, with employees freer to move more quickly between jobs.

6: Self Managing

The next generation of technologists will have independence in their DNA, and will possess the soft skills required to self-manage their time and tasks. Point 5 above will demand this, but this is not to say that upper management will not be required. What is being said is that hierarchical org charts will be a thing of the past, with flat structures working best in evolving technology companies.

7: Mobility

The walls of companies will be well and truly knocked down, with advances in technology ensuring that "work from anywhere" is a distinct reality. Augmented reality will play a part in this, with renderings of colleagues solving the lack of contact/visibility challenge that currently exists. Enabled by technology, an entirely new work environment is on the horizon. According to Wakefield Research, 91 percent of C-level executives and IT decision-makers believe that today's teenagers will be working in roles that do not exist today, and 72 percent agree that the traditional office as we know it will become obsolete within four years. Think about it. How is the generation in school now communicating? They were born into technology.

8: Educational Diversity

With online education companies such as Coursera becoming hugely disruptive in the education sector, it is predicted that the classical degree – postgrad – work (with training) model will change greatly. Numerous people have been quoted as saying "I don't use a huge amount of my primary degree". This will mean that certain individuals (think of the 16-year-old kid who became a millionaire) will be hired sooner by companies, and then incrementally receive their education throughout their time at the company. This is quite common in Japanese companies, with kids being given apprenticeships at 16 and mixing college with work over the next six years. Now if only the employment laws would catch up! Whilst incremental training is happening now within companies, colleges and companies don't recognize it as a sum of the parts.

9: Startup(s) as a Hobby

Currently, having external commitments in technology areas, such as startup involvement, is seen as a bad thing by most companies. There are trends to suggest that companies are actively opening the door to employees who use their spare time to engage in other opportunities. And rightly so. The skill set that can be gained from contributing in different company and academic structures is incredibly valuable, and there is the added bonus for the company that it gains a viewpoint into more early-stage alpha and beta companies.

10: Mass Parallel Processors of Information

Yep. It's happening. And we don't even know it. The way education is being delivered these days demands huge levels of multitasking. The ability to respond to several different stimuli at the same time is called continuous partial attention. We used to teach in a way that demanded a tremendous amount of memorization, but now it's more about cognitive agility and multitasking. The part of the brain involved in memory, the hippocampus, is quite separate from the multitasking part at the front of the brain.

We see it currently in technology. To every Splunk there is a Hunk. Hadoop was barely alive when Spark came along. Java now has over 50 different varieties. Argh! Do we need to be experts in all of them? No, but we need to be able to switch between them seamlessly, or at least know what gets used where to meet the challenge we are working on.

 

5 Technology trends to consider right now

As we are a month into the second half of 2015, I thought it would be a good time to look at some of the technology trends that are in motion and will have more of an influence as we enter 2016.

1: SMAC becomes SMACT

Social, Mobile, Analytics and Cloud (SMAC) has existed for a number of years in enterprise applications. The Internet of Things (IoT) has accelerated as an enabler in technology, and hence will begin to be added to SMAC to create SMACT. I introduced this concept in one of my first posts here. And they need each other to succeed and progress. As more and more IoT devices come online, SMAC demand will increase. And IoT will add value to SMAC, as it will spawn new technology directions that can utilize SMAC. The A in SMAC will be affected more than the others: with new data sets being generated, open data sets available for data multi-tenancy will drive new requirements for on-demand insights in real time.

2: Co-Creation

A key tenet of open innovation (which was mentioned in a previous blog here) is co-creation. As companies take a more outside-in approach to discovering their next business direction, co-creation will be a huge part of this. Whilst it's slowly increasing in chatter, co-creation will be a key enabler in the coming years. Industry partners, vendors and consumers will create ecosystems that will drive new business models by utilizing analytics and understanding customers at heightened levels. We have seen how disruptive Netflix, Uber and Bitcoin have been in the past few years, and it is expected that co-creation will drive further disruption, but in different directions and at increased velocity. Ikea's home tour is a good example of listening to consumers to understand why they were doing up their homes.

3: Technology and Business Strategy Leadership positions collide

It is expected that there will be a blurring of the lines between technology and classical business positions in companies, and this will result in a series of new positions to drive next-generation technology direction. We are seeing that technology and business executives need to be proficient in both areas, and understanding the dependencies of decisions made in either will be crucial. The rise of roles such as Chief Data Architect (CDA), Chief Digital Officer (CDO) and Chief Governance Officer (CGO) has meant that boardrooms have an increased percentage of technology executives. It is predicted that an organisation's Chief Technology Officer (CTO) will create a series of direct reports in the areas of data intelligence, data monetisation, futurism and collaboration strategy. These roles will be necessary to assist the CTO in managing digital disruption.

4: Data Monetization

This is a hot topic right now, and one of the pioneers driving a lot of new research in this area is Steve Todd from EMC, along with Dr. Jim Short of the San Diego Supercomputer Center. Whilst you can read extensively on this topic at Steve's blog, I'll outline some of the considerations that are prominent for your business. The first is the idea of monetisation of your current and future data assets. Data is the new oil: a form of currency that can be used to drive business metamorphosis, but also something that can be of use to others. It then becomes a saleable asset. We have seen first-hand that major companies are looking to acquire companies not only for their technology, but for their data also (example). Imagine if your store had a considerable data set; I expect major retailers such as Amazon would be interested in buying that data set from you to understand street-shopper trends. Another aspect to consider is valuing data at all stages of your company's life cycle, from inception through beta to growth. An accurate snapshot of your data assets can increase the valuation of your organisation, and is especially useful in acquisition. From an internal company data perspective, a key pillar of your data monetization strategy is the architecture on which your data resides, as numerous data silos across your organisation are generally very difficult to even analyse for valuation. The concept of a business data lake can bring huge advantage here.

5: Search will involve more than Google

Currently, a large proportion of search involves online search for information that resides on servers. However, with the increased influence of IoT and the connected world, it is expected that more than the cloud will be searchable. The billions of edge devices should enter the fray, if data and security policies continue to be challenged into being more open. Connected cars, homes and mobile devices could widen the net for any search query. We are seeing the emergence of alpha startups indicating this trend, such as Thingful and Shodan, which act as search engines for the Internet of Things.

Numenta and MemComputing: Perfect AI Synergy


Let’s look at two forces of attraction that are happening in the technology space, specifically looking at creating true artificial intelligent systems, utilizing advances in both software and hardware technologies.

For years, even decades, we have chased it. AI has been at the top of any research interest group's list, and while there have been some advances, the pertinent challenge has been that when advances in hardware electronics occurred in the 70s and 80s, software design was lagging behind. Then software advanced incredibly in the past decade. So now, in July 2015, we reach a key point of intersection of two "brain-based technologies", which could be built together in a way that may lead to "true AI".

At no other point in history have we had both hardware and software technologies that can “learn” like we can, whose design is based on how our mind functions.

Numenta

First, let’s look at Numenta. Apart from having the pleasure of reading Jeff Hawkins excellent book “On Intelligence”, I have started to look at all the open source AI algorithms ( github here) that they provide. In a journey that start nine years ago, when Jeff Hawkins and Donna Dubinsky started Numenta, the plan was to create software that was modeled on the way our human brain processes information. Whilst its been a long journey, the California based startup have made accelerated progress lately.


Hawkins, the creator of the original Palm Pilot, is the brain expert and co-author of the 2004 book "On Intelligence." Dubinsky and Hawkins met during their time building Handspring, and they pulled together again in 2005 with researcher Dileep George to start Numenta. The company is dedicated to reproducing the processing power of the human brain, and it shipped its first product, Grok, earlier this year to detect odd patterns in information technology systems. Those anomalies may signal a problem in a computer server, and detecting the problems early could save time, money or both. (Think power efficiency in servers.)

You might think, hmm, that's not anything great for a first application of algorithms based on the mind, but it's what we actually started doing as early humans: pattern recognition. First it was objects, then it was patterns of events, and so on. Numenta is built on Hawkins' theory of Hierarchical Temporal Memory (HTM), which holds that the brain has layers of memory that store data in time sequences, which explains why we easily remember the words and music of a song. (Try this in your head: try starting a song in the middle, or the alphabet. It takes a second longer to start it.) HTM became the foundation for Numenta's code base, called the Cortical Learning Algorithm (CLA), which in turn forms the basis of applications such as Grok.
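The song intuition can be illustrated with a toy transition table. To be clear, this is emphatically not Numenta's CLA, just a hedged sketch of why sequence recall flows forwards easily but stalls when you jump in at an arbitrary midpoint without the preceding context:

```python
class ToySequenceMemory:
    """Toy sequence memory: learns which token follows each
    (previous, current) pair. Recall works when fed in learned
    order, but the wrong context yields nothing."""

    def __init__(self):
        self.transitions = {}  # (prev, cur) -> next

    def learn(self, sequence):
        for i in range(len(sequence) - 2):
            self.transitions[(sequence[i], sequence[i + 1])] = sequence[i + 2]

    def predict(self, prev, cur):
        return self.transitions.get((prev, cur))

memory = ToySequenceMemory()
memory.learn(list("ABCDEFG"))            # like learning a song in order
assert memory.predict("A", "B") == "C"   # easy: context flows forwards
assert memory.predict("D", "C") is None  # reversed context: no recall
```

Real HTM adds sparse distributed representations, learning on streaming data and much more, but the dependence of recall on temporal context is the common thread.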

Still with me? Great. So that's the software, designed and built on the layers of the cortex of our brains. Now let's look at the hardware side.

 

Memcomputing

After reading this article in Scientific American recently, at the same time as reading Hawkins' book, I really began to see how these two technologies could meet somewhere: silicon up, algorithms down.


A new computer prototype called a “memcomputer” works by mimicking the human brain, and could one day perform notoriously complex tasks like breaking codes, scientists say. These new, brain-inspired computing devices also could help neuroscientists better understand the workings of the human brain, researchers say.

In a conventional microchip, the processor, which executes computations, and the memory, which stores data, are separate entities, so data must constantly shuttle between the two. This transfer of data between the processor and the memory consumes energy and time, thus limiting the performance of standard computers.

In contrast, Massimiliano Di Ventra, a theoretical physicist at the University of California, San Diego, and his colleagues are building “memcomputers,” made up of “memprocessors,” that can actually store and process data. This setup mimics the neurons that make up the human brain, with each neuron serving as both the processor and the memory.

I won't go into the specifics of the building blocks or how they are designed, but it's based on the three basic components of electronics – capacitors, resistors and inductors – or, more aptly here, memcapacitors, memristors and meminductors. The paper describing this is here.

Di Ventra and his associates have built a prototype from standard microelectronics. The scientists investigated a class of problems known as NP-complete. With this type of problem, a person may be able to quickly confirm whether any given solution works, but cannot quickly find the best solution. One example of such a conundrum is the "traveling salesman problem", in which someone is given a list of cities and asked to find the shortest route that visits every city exactly once and returns to the starting city. Finding the best solution is a brute-force exercise.
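A brute-force traveling salesman search is easy to write, and it makes the explosion obvious: n cities mean (n-1)! candidate tours to check. The small distance matrix below is invented purely for illustration:

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exhaustively check every tour that starts and ends at city 0."""
    n = len(dist)
    best_tour, best_len = None, float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

# Symmetric 4-city toy distance matrix (made up for illustration).
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
tour, length = tsp_brute_force(dist)
assert length == 18  # shortest tour: 0 -> 1 -> 3 -> 2 -> 0
```

Four cities mean only six tours; fifty cities would mean roughly 6×10^62, which is exactly the class of search a massively parallel memcomputer is being pitched at.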

The memprocessors in a memcomputer can work together to find every possible solution to such problems. "If we work with this paradigm shift in computation, those problems that are notoriously difficult to solve with current computers can be solved more efficiently with memcomputers," Di Ventra said. In addition, memcomputers could tackle problems that scientists are exploring with quantum computers, such as code breaking.

Imagine running software that is designed based on our minds, on hardware that is designed on our minds. Yikes!

In a future blog, I will discuss what this means in the context of the internet of things.


 

 

Ideate! Innovation strategy in your company


“Innovation is hard. The larger your business, the harder it gets.”

Is the above statement true? Maybe. More importantly, does it have to be true? No, it doesn't.

One of the most common mistakes large companies make with respect to innovation strategy is assuming that because they are "big", they need a "big" innovation model to succeed. Consider the life cycle of a company. First, in startup mode, it is agile by default. Startups don't look past two or three months, with many focused on bringing new features to their product so they can secure the next round of funding. Once they grow and mature, longer-term goals become a priority, financial pressure eases, and innovation slows down. The longer it slows, the more investment is needed to get it back.

Whilst having a large business means sharing knowledge becomes more of a challenge, there are advantages to size and scale. For one, diversity is a critical component of innovation, and the larger the company, generally the more diverse it is; it's how you identify and partner with that diverse thinking that can differentiate your innovation strategy. There are a number of approaches that can be used to enable a large organisation to reach its true innovative capacity. Some of these are introduced below.

Disruptive Innovation Task Teams

A concept to consider when maintaining your innovation strategy is placing disruptive innovation task teams into your existing business model, whereby they act with limited budget and resources, look no further than two or three months out, and are agile by nature. They also keep a keen eye not on what the customer wants now, but on what they will need in the future (though not too far out: 6-12 months). Planting a seed team like this can result in cross-pollination with other product teams around it, and this will increase your company's overall ideation. A key component of these teams is to ensure diversity is present, with both seasoned campaigners who have the business history and new-generation employees who can bring outside thinking to the table.

Agile. Rinse. Repeat


One of the first aspects critical to success is to introduce agile as a mantra. This isn't easy in large companies, which is why an upfront investigation must be performed to assess how you inject it. Unfortunately, agile has been targeted mainly at software development teams, but it can have uses in other business teams, such as finance and human resources. However, it must be adapted to suit. No agile paradigm fits all, and within certain types of projects, if agile is not adjusted, success becomes difficult (read Ken Collier's book on Agile Analytics for any data intelligence readers).

Think Lean

Think fast, learn fast, fail fast. The concept of the Lean Startup is now well known, and has huge advantages. Lean Startup is a method for developing businesses and products first proposed in 2011 by Eric Ries. Whether in your large organisation or in the startup space, most new ideas and concepts fail. The odds are not with you: as research by Harvard Business School's Shikhar Ghosh shows, 75% of all startups fail. A key component of the lean startup philosophy is to favor experimentation over elaborate planning, customer feedback over intuition, and incremental design over big design planning meetings. The aim is to build a minimum viable product, and iterate and pivot on that product. For more on this, check out the website.


Innovation Identity

Do you know your company's innovation identity? Innovation identity is the intersection between your company's technology, your innovation teams, the market(s), and the other departments of the company. Two main innovation models seem to emerge:

  1. The thriving innovation model means the innovation culture is the cornerstone of the corporate company; the company develops interactions both across internal departments and with external resources to complete its innovations. Cisco, Sanofi, 3M, Renault, and the open source way of working are champions of this model.
  2. The dedicated entity model involves the creation of an autonomous unit pursuing new and uncertain activity lines; Lockheed with its Skunk Works is the classic example. Google sits between the two: innovation is in its core DNA, which links it to the first model; conversely, as it enables small teams to investigate disruptive innovation in a flexible framework, it is really close to this model.
Open Innovation

The identity of innovation is gradually shaped by multiple interactions between different levels of a company and external groups and organisations. And not just any type or size of external company: the people you collaborate with must be suited to the market entry you are trying to achieve. Although partners may exist within the market you are trying to enter, sometimes it is key to identify the technology required to enter that market, and even look at university collaboration to fulfill the technology requirements for market penetration.

When defining our innovation mantra and strategy, a look at the ten facets defined by Jeffrey Phillips can also help in positioning where you want to be:

  • open vs closed innovation;
  • skunk works vs broadly participative;
  • suggestive vs directed;
  • incremental vs disruptive (also stretching innovation vs "all included" innovation vs disruptive innovation);
  • centralized vs decentralized;
  • product / service / operations / business model;
  • funding;
  • wisdom of crowd vs defined criteria and experts.
In closing..

Creating and maintaining the innovation strategy at your company is both a challenge and an evolution, not only of your company but also of the internal personality and dynamic of the individuals who contribute to it. The direction you take, and how you make the journey, is down to you.

Distributed Analytics in IoT – Why Positioning is Key


The current global focus on the Internet of Things (IoT) has highlighted the extreme importance of sensor-based, intelligent and ubiquitous systems in improving our lives and introducing increased efficiency into them. There is a natural challenge in this, as the load on our networks and cloud infrastructures from a data perspective continues to increase. Velocity, variety and volume are attributes to consider when designing your IoT solution, and it is then necessary to decide where and when the execution of analytical algorithms on the data sets should be placed.

Apart from classical data centers, there is huge potential in looking at the various compute sources across the IoT landscape. We live in a world where compute is at every juncture, from our mobile phones to our sensor devices, gateways and cars. Leveraging this normally idle compute is important in meeting the data analytics requirements of IoT, and future research will have to consider these challenges. There are three main classical architecture principles that can be applied to analytics: 1: centralized, 2: decentralized and 3: distributed.

The first, centralized, is the best known and understood today. It is a pretty simple concept: centralized compute across clusters of physical nodes is the landing zone (ingestion) for data coming from multiple locations, so the data is in one place for analytics. By contrast, a decentralized architecture utilizes multiple large distributed clusters located hierarchically in a tree-like architecture. Consider the analogy where the leaves, being close to the sources, can compute on the data earlier or distribute it more efficiently to perform the analysis. Some form of grouping can be applied to this, for example per geographical location, or a hierarchy set up to distribute the jobs.

Lastly, in a distributed architecture, which is the most suitable for devices in IoT, the compute is everywhere. Generally speaking, the further from centralized, the more the size of the compute decreases, right down to the silicon on the devices themselves. Therefore, it should be possible to push analytics tasks closer to the device. In that way, these analytics jobs can act as a sort of data filter and decision maker, determining whether quick insight can be gained from smaller data sets at the edge or beyond, and whether to push the data to the cloud or discard it. Naturally, with this type of architecture there are more constraints and requirements for effective network management, security and monitoring of not only the devices but the traffic itself. It makes more sense to bring the computation power to the data, rather than the data to a centralized processing location.
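As a minimal sketch of that filter-and-decide role at the edge (the threshold, field names and readings are illustrative assumptions, not any particular product's API): the device summarises readings locally and forwards only the anomalies upstream.

```python
# Hedged sketch: an edge node computes cheap local insight and
# only pushes the anomalous subset of readings to the cloud.
ANOMALY_THRESHOLD = 80.0  # e.g. temperature in Celsius (assumed)

def edge_filter(readings):
    """Return (summary, anomalies): a local summary for quick insight,
    plus the small subset of data worth sending upstream."""
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings) if readings else 0.0,
    }
    anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
    return summary, anomalies

readings = [21.5, 22.0, 85.3, 21.8, 90.1]
summary, to_cloud = edge_filter(readings)
assert summary["count"] == 5
assert to_cloud == [85.3, 90.1]  # only the anomalies leave the device
```

Here five raw readings become one tiny summary plus two anomalies, which is the bandwidth and latency saving the distributed architecture is after.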

There is a direct relationship between the smartness of the devices and the selection and effectiveness of these three outlined architectures. As our silicon gets smarter, more powerful and more efficient, more and more compute will become available, which should result in less strain on the cloud. And as we distribute the compute, it should mean more resilience in our solutions, as there is no single point of failure.

In summary, "intelligent infrastructures" now form the crux of the IoT paradigm. This means there will be more choice for IoT practitioners in determining where they place their analytics jobs, ensuring they best utilize the compute that is available and control latency for faster response, to meet the real-time requirements of the business metamorphosis that is ongoing.

Nell, Google and a Half Pipe! EnterConf Belfast – Day 2

Quote of the day: "Counterfeiting is an insidious problem in life sciences; our network tenant cloud can help stop it" – Shabbir Dahod, TraceLink, Inc.

As EnterConf entered its second day, I continually saw the benefit of having more detailed discussions with people in the Enterprise sector. Even during the night events (the speaker dinner in the Harbour Commissioners Office, a great venue, followed by a few sociables in the Dirty Onion Bar), I kept monitoring the dynamics taking place. The networking normally began with two people, but the circles kept growing, joining to form what I like to call "RoundStandUps". These were normally not short conversations, and collaboration was inherent in the voices and chatter. There was also a deep and satisfying undertone: an energy to keep "building great" in Ireland.

Check out the half pipe! Hope it's at Web Summit! 🙂

Half Pipe at EnterConf

Kicking us off on the Centre Stage was none other than the inspirational futurist Nell Watson from Singularity University, who is also the CEO of Poikos, the smartphone 3D body measurement company. She talked about virtual employees, and how we will replicate the human mind through AI within 20 years (and run business through AI). I liked how Nell bridged the machine and human interdependencies. It was an insightful talk, and having spent the past year looking at machine intelligence (from both a hardware and a software implementation perspective), I am seeing more and more futurists thinking like this.

Nell Watson, CEO of Poikos on Centre Stage

A few talks focused on our evolving workplace. David Hale from Gigwalk spoke on the Insight Stage on "Deploying Technology to Power Mobile Field Teams and Maximise Work Efficiency". He spoke on how mobile tools for consumer brands and retailers are being used to more effectively manage field teams, gather in-store data and direct resources to improve retail execution ROI. David also spoke about how our employees are changing, and how companies have to empower the "Millennial Employee", whose requirements include flexibility and a social, online mindset.

David Hale, from Gigwalk on the Insight Stage

Shabbir Dahod of TraceLink, Inc spoke on the Centre Stage on "Delivering the Internet of Things (IoT) to the Enterprise", and I found it one of the highlight talks of the summit. Shabbir spoke about how TraceLink runs the world's largest track-and-trace network for connecting the Life Sciences supply chain and eliminating counterfeit drugs from the global marketplace, using their Life Sciences Cloud, configured in a network tenant architecture.

Shabbir Dahod – TraceLink, Inc

Thomas Davies, Head of Enterprise for Google, drew a huge level of engagement from the crowd with his talk on the next stage of collaboration. Thomas traced the evolution of how we collaborate: even since the early 1980s the structures were quite rigid, and they had not changed much until a few years ago. But now customer and employee expectations have changed. They are fast, 24/7, global and personalised. He discussed how employees and organisations are more efficient when they collaborate. "We shape our tools, and then our tools shape us" – Marshall McLuhan.

Thomas Davies (Google) in exuberant form on Centre Stage

One last talk I'll cover is on a topic that is somewhat under the covers of Enterprise IT, and I am glad that Engin Akyol of Distil Networks talked on "Dark Cloud: Cloud Providers as a Platform for Bot Attacks". Engin first spoke about good bots, which do serve a purpose for major cloud providers. But this talk focused on bad bots, which slow down application performance and skew analytics. As the volume of cloud platforms continues to scale, it becomes easier to set up bot networks that can pilfer content from websites or launch other malicious attacks.

Engin Akyol of Distil Networks

So, I'll sign off from EnterConf 2015, and on to Web Summit in November, with many events, collaborations and new experiences in between. As a two-day conference, perhaps I built fewer contacts than I expected to, but the ones I did make are more meaningful, and EnterConf gives its attendees an environment to do that. I also sat in on round tables on big data and security, which gave yet another dynamic. It really is a conference experience I will be returning to. Special mention to all the organisers, volunteers and the inspiring venue. Goodbye Belfast, hello Dublin!

Oh, I almost forgot, I really hope Krem Coffee are at Web Summit, awesome coffee!

EnterConf Belfast - Day 1

Firstly, to the quote of the day “We all have to avoid software that epically sucks”.

Me at the Insight Stage!

Today I attended day one of EnterConf in Belfast, which, for those who don't know it, is a spin-off conference from Web Summit focused on the Enterprise aspect of our tech world. On initial entry, I must admit I was really proud of the EnterConf team for choosing the venue. It has a lot of history associated with it, being in the heart of the Titanic Quarter where the Titanic was built, and for its time, that was an "Enterprise ship"! This created a chilled-out atmosphere which was a nice differential from Web Summit, to be held again in November. It was full of detailed and focused meetups and conversations, and did a great job of giving a different experience of what a conference can provide. Kudos.
There were two stages, named Center and Insights, with startup exhibits, food and coffee stands to ensure everyone was nicely refreshed throughout the day. Whilst I won't cover all the talks, I have picked out a few to show the types of topics being discussed.

The first one I'll mention was by Lukas Biewald of CrowdFlower, entitled "Processing Open Data". He spoke extensively on their efforts to clean up data, and also on elements of data moderation. It really resonated with me, as I have been interested in and developing data-cleansing frameworks over the past number of years, and always struggle with the data pollution that skews our insight. Quote from Lukas: "If you want to improve your algorithm, just add more data". Lukas is in action below.

Lukas Biewald of CrowdFlower
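The kind of data pollution I mean is mundane but corrosive: duplicates, stray whitespace and missing fields that quietly skew counts. A minimal cleanse pass might look like this (toy records and field names are my own, not CrowdFlower's):

```python
# Toy records showing common pollution: duplicates, inconsistent casing,
# stray whitespace, and missing values.
records = [
    {"name": " Alice ", "email": "alice@example.com"},
    {"name": "Alice", "email": "alice@example.com"},   # duplicate after normalisation
    {"name": "Bob", "email": None},                    # incomplete record
    {"name": "carol", "email": "CAROL@EXAMPLE.COM"},
]

def cleanse(rows):
    seen, out = set(), []
    for row in rows:
        if not row.get("email"):           # drop records missing a key field
            continue
        name = row["name"].strip().title() # normalise whitespace and casing
        email = row["email"].strip().lower()
        if email in seen:                  # deduplicate on normalised email
            continue
        seen.add(email)
        out.append({"name": name, "email": email})
    return out

clean_rows = cleanse(records)  # two clean, unique records survive
```

Four raw records collapse to two trustworthy ones; it's exactly this shrinkage that unfiltered analytics never accounts for.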

Stephen McKeown from AnalyticsEngines and Amir Orad from Sisense also sat on a panel on "Democratising Data", which focused on how enabling companies of all sizes to speed up their analytics creates a more level playing field for startups competing with the Enterprise. Quote from this section: "Bring data into your company's DNA".

Stephen McKeown and Amir Orad

There were a few familiar faces present, with my former EMC colleague and mentor Steve Todd amongst the speakers, on "Economic Value of Data" (check out Steve's blog here for more fascinating content on this topic). Steve spoke on the Center stage, and it was great to see this topic present, as it really stood out as a conversation we all should be having. Steve gave a similar talk in Cork for an it@Cork event we organised in February, and it was great to see the advancement in Steve's research in this area. Steve spoke on "Valuation Business Processes", with the categories within that being M&A, Asset Valuation, Data Monetisation, Data Sale and Data Insurance. I won't spoil the rest, as I am sure Steve will blog on this soon.

Steve Todd speaking on Economic Value of Data

Also on Center Stage, in one of the talks to close out the evening, Barak Regev, Head of Google Cloud Platform – EMEA, spoke on "Architecting the Cloud". It was great to get an update on their vision, and Barak showed Google's ambition to "Build What's Next".

Barak Regev from Google – Build What's Next

And to end on a great quote from James Petter, VP EMEA for Pure Storage: "Security should be like an onion, it should be layered, and you can't reach the center without breaching a layer".

The day brought many epic conversations with people of over 10 different nationalities, including a walk back to the city with the visionary Teemu Arina. His talk on biohacking was incredibly insightful. It spoke to the challenge of humans tracking their lives through self-quantification. Teemu took me through his ideas for how humans can do a better job of hacking their bodies for information and using that to improve life quality. Teemu's book is here!

So now, it's off to the night's dinner, to drink a beer or two and build a few more contacts! In the morning, it looks like a few good talks on Machine Intelligence will start the trend for another awesome day!

Why IoT practitioners need to “Wide Lens” the concept of a Data Lake

As we transition towards the vast quantity of devices that will be internet enabled by 2020 (experts estimate anything from 50 to 200 billion), it seems that the cloud architectures currently being proposed are somewhat short on the features required to meet customers' data requirements in 2020.

I won't dive hugely into describing the technology stack of a Data Lake in this post (Ben Greene from Analytics Engines in Belfast, who I visit on Wednesday en route to EnterConf, does a nice job of that in his blog here). A quick side step: looking at the Analytics Engines website, I saw that customer choice and ease of use are some of their architecture pillars when providing their AE Big Data Analytics Software Stack. Quick to deploy, modular, configurable, with lots of optional high-performance appliances. It's neat to say the least, and I am looking forward to seeing more.

The concept of a Data Lake has a large reputation in current tech chatter, and rightly so. It has huge advantages in enterprise architecture scenarios. Consider the use case of a multinational company with 30,000+ employees, countless geographically spread locations and multiple business functions. So where is all the data? It's normally a challenging question, with multiple databases, repositories and, more recently, Hadoop-enabled technologies storing the company's data. This is the very reason why a business data lake (BDL) is a huge advantage to the corporation. If a company has a Data Architect at its disposal, it can develop a BDL architecture (such as shown below, ref – Pivotal) that acts as a landing zone for all its enterprise data. This makes a huge amount of sense. Imagine being the CEO of that company: as we see changes in the Data Protection Act(s) over the next decade, a company can take the right steps towards managing, scaling and, most importantly, protecting its data sets. All of this leads to a more effective data governance strategy.

Pivotal-Data-Lake
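The "landing zone" idea is easier to see in miniature. A sketch of the principle follows, under my own assumptions: every dataset lands raw, tagged with just enough metadata (source, region, timestamp) to answer governance questions later. The field names and sources are illustrative, not Pivotal's schema.

```python
import json
import time

# Minimal landing-zone catalogue: data is stored as-is, schema applied on read.
catalogue = []

def land(source_system, region, payload):
    """Land one raw dataset with governance metadata attached."""
    entry = {
        "source": source_system,
        "region": region,
        "landed_at": time.time(),
        "raw": json.dumps(payload),  # untouched payload, serialised verbatim
    }
    catalogue.append(entry)
    return entry

land("crm-emea", "eu-west", {"customer": 42, "spend": 99.5})
land("erp-apac", "ap-south", {"order": 7})

# A simple governance question the metadata makes answerable:
# where is all our EU-resident data?
eu_entries = [e for e in catalogue if e["region"].startswith("eu")]
```

The point is that the Data Protection question ("where is all the data?") becomes a query over the catalogue rather than an archaeology exercise across thirty systems.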

Now shift focus to 2020 (or even before?), and let's take a look at the customer landscape. The customers that will require what the concept of a BDL now provides will need far more choice, and won't necessarily be willing to pay huge sums for that service. Whilst there is some customer choice today, such as Pivotal Cloud Foundry, Amazon Web Services, Google Cloud and Windows Azure, even these services are targeted at a consumer base from startup scale upwards in the business maturity life cycle. The vast majority of cloud services customers in the future will be everyone around us, the homes we live in and beyond. The requirement to store data in a far-distant data center might not be as critical for them. It is expected they will need far more choice.

Consider the case of building-monitoring data, which could be useful to a wider audience in a secure linked open data sets (LODs) topology. For example, a smart grid provider might be interested in energy data from all the buildings, and in suggesting optimal profiles for them to reduce impact on the grid. Perhaps the provider might even be willing to pay for that data? This is where data valuation discussions come into play, which are outside the scope of this blog. But the building itself, or its tenants, might not need to store all their humidity and temperature data, for example. They might want some quick insight up front, and then choose to bin that data (based on some simple protocol describing the data usage).
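The "quick insight up front, then bin the data" idea can be sketched in a few lines. Here an hour of per-minute temperature readings is reduced to a compact summary at the edge, and the raw samples are discarded locally rather than shipped to a distant data center. The values and summary fields are illustrative assumptions.

```python
from statistics import mean

# One hour of per-minute temperature readings captured at the edge device.
raw_readings = [20.0 + 0.01 * i for i in range(60)]

# Extract the insight worth keeping: a compact summary a few bytes long.
summary = {
    "min": min(raw_readings),
    "max": max(raw_readings),
    "mean": round(mean(raw_readings), 2),
    "samples": len(raw_readings),
}

# "Bin" the raw data once the insight is extracted; only `summary`
# would ever leave the building.
raw_readings.clear()
```

Sixty readings collapse into four numbers, which is the whole economic argument: the tenant pays to store the summary, not the stream.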

Whilst a BDL is built on the premise of "Store Everything", and that will bring value for organisations monitoring the consumers of their resources, individual consumers might not be willing to pay for it.

To close, the key enablers of these concepts are real-time edge analytics and increased data architecture choice. And this is beginning to happen. Cisco have introduced edge analytics services into their routers, which is a valid approach to ensuring that the consumer has choice. And they are taking the right approach, as there are even different services for different verticals (Retail, IT, Mobility).

In my next blog, Edge Analytics will be the focus area, where we will dive deeper into the question: "Where do we put our compute?"

IoT and Governance. It's a game of RISK

Due to the sheer volume of devices, data volume, security and networking topologies that result from IoT, it is natural for there to be a lot of questions and legal challenges around governance and privacy. How do I know my data is secure? Where is my data stored? If I lose a device, what happens to data in flight?

The National Fraud Intelligence Bureau has said that 70% of the 230,845 frauds recorded in 2013/2014 included a cyber-element, compared to 40% five years ago. This would indicate that we aren't doing a very good job of protecting existing internet-enabled devices, so why should we be adding more? If we internet-enable our light bulbs and heating systems (Nest being acquired by Google is a good example) to control them from our mobile phones, can the devices be hacked to tunnel into our mobile phone data?

It is not only the individual consumer that needs to be aware of privacy and governance. Businesses too, when they adopt IoT, must dedicate resources to the legal requirements and implications of IoT enablement. A key aspect of this will be ensuring their internal teams are aligned in relation to IoT, and more specifically security, data protection and privacy.

More and more, governments and regulatory bodies have IoT in their remit. This includes the EU Commission, which published a report recommending that IoT should be designed from the beginning to meet suitable governance requirements and rights, including the right of deletion, data portability and privacy.

The draft Data Protection Regulation addresses some of these measures including:

  • Privacy by design and default – to ensure that the default position is the least possible accessibility of personal data
  • Consent
  • Profiling – clearer guidelines on when data collected to build a person’s profile can be used lawfully, for example to analyse or predict a particular factor such as a person’s preferences, reliability, location or health
  • Privacy policies
  • Enforcement and sanctions – violations of data privacy obligations could result in fines of up to 5% of annual worldwide turnover or €100m, whichever is greater

The first point above, privacy by design, is unfortunately normally an afterthought. Whilst not a requirement of the Data Protection Act, it makes the compliance exercise much smoother. Taking such an approach brings advantages in building trust and minimising risk.

IoT presents a number of challenges that must be addressed by European privacy regulators as IoT evolves. It is predicted that the scrutiny on these challenges will increase as the device number increases.

Some of the challenges include:

  • Lack of control over the data trajectory path
  • The lack of awareness by the user of the devices capabilities
  • Risk associated with processing data beyond its original scope, especially with advances in predictive and analytic engines
  • Lack of anonymity for users
  • Non-threatening everyday devices becoming open to threat

As can be seen, the challenges above share characteristics such as control, security and visibility, which makes governance of IoT a bigger challenge than expected.

Finally, governance in IoT is expected to follow other technologies. Up to now, the software industry has not had a single set of standards for the complete service portfolio (including cloud), although governments are addressing this. From the geographical standpoint, different regulations are commonplace for different jurisdictions in IT, so IoT is predicted to follow suit.

Why Ireland needs to use Technology and IoT more to help their Homeless



21% rise in homeless sleeping rough in Dublin

15% rise in homeless in Cork

Rise also in Limerick, Galway and Waterford

Over 1,000 children are now homeless in Ireland.

Startling figures. Especially the last one. I cannot try to comprehend what it is like to be a parent, who must tell their children that they don’t have a place to go at night.

I, like many others, have been in other countries' cities and seen that this is not simply an Irish challenge. So I will start by giving those examples, to address the global scenario and how our psychology must change locally. I remember vividly two occasions abroad when a homeless person made a big impact on my life.

The first time was in 2005, I was in Auckland, New Zealand. We were staying in a hostel whilst backpacking. We weren’t having a particularly good day. The weather wasn’t great, and one or two things went badly. But as we strolled back to the hostel, we noticed a homeless elderly guy in a doorway right beside us. The “bad” weather had turned into a storm, with an incredible amount of flash flooding. I felt awful. And everyone has been there, where a reality check ensures that we come back to earth. I asked the hostel could the guy get a room. They said no, as he must have an address. I offered to pay for his room, still no. Yikes. So I decided all I could do was give him some money. But then I thought, why not do that, and have a conversation. I think we automatically think only money is what they need. I went out, and sat close to him. Whilst chatting, I learned part of his story, and one of the first things he said was that there were worse off people than him, and he didn’t drink, or smoke. And that he hated the rain! He thanked me for the $20, and also the conversation. We helped each other.

The next story is when I received incredible kindness from a homeless guy on my very first night in New York. Woohoo, I'm in America, let's go for beers! Oooops. Ended up feeling a little worse for wear outside a club. On my own. Minus my phone. In a laneway in a bad part of town. A big guy stopped. Uh-oh. But this guy asked me was I ok. I told him I was from Ireland, and that I had lost my phone. He told me "man, I don't even have a phone". He then walked me out of the laneway and hailed me a cab. I gave him a nice tip, and the cynic out there will say he was looking for that. But he didn't know me. Humanity exists.

And now to Ireland. I really want to stress that I am not some saint. This is more to raise awareness and explore how technology can potentially help. I have contributed to the Cork Simon Community at various points in my life, and if I have some change, I do give it to those in need. But herein lies the first challenge: a lot of people carry less and less cash. And even if we do, people wonder: if I give this person money, what will they spend it on? Money doesn't always help, as the upper class of Irish society has also seen.

From a technology perspective, I want to talk about some potential ways technology can help with this challenge.

The term Smart Cities has been bandied about in relation to the Internet of Everything, where we will use technology to improve people's lives. Yet I have not seen much presented that will help the homeless. Imagine if we could use cost-effective smart devices, worn by homeless volunteers, to identify the paths they take and where they sleep, so that soup runs can be more efficient and beds can be found? I think it is one area that must at least be explored. They are doing this in Odense, Denmark. Check it out here.

I also believe that doorways in our cities could be fitted with load sensors to gauge how many are occupied. This data could be used to identify commonly used places, and even predict where homeless people are on particular nights. That, coupled with temperature sensors, could have saved Jonathan Corrie's life last December.

The last idea I'll propose here is to modify the many parking meters in our cities to allow them to produce vouchers based on use in a particular day. The more the meters are used in the day session (busier cities should correlate with more people in need), the more food/supply vouchers the meters print out in the evening when homeless people enter a code that is texted to them. If they don't have a phone, their date of birth would be registered in advance and entered instead.

I came across a startup on a recent trip to the United States, and I was incredibly impressed. It is called HandUp. The whole premise is that homeless people can set up an online profile through the organisation and crowdfund to reach their goals. They never receive direct cash; instead it is used to buy supplies, food, and sometimes tools to go back to work. So instead of writing their story on cardboard, they get help to set up a profile, and then hand out business cards for their site, so that people can log on and donate. It is only based in San Francisco for now, but I have contacted them to hear their plans for global roll-out (and how).

Technology multinationals benefit greatly through our tax system by positioning themselves in Ireland. And it's great for our economy, through jobs. I have seen the kindness first hand by working in these companies. They create lots of great lives for people. I wonder about a 1% challenge in the tech sector, where people could volunteer to donate 1% of their annual wage before tax (so 0.5% from us and 0.5% matched by government) to a particular social challenge, which could change annually: the homeless, the elderly. I think this sort of crowdfunding, spread thin, could make a huge impact. I won't do the exact maths, but 1% from 100,000 employees at an average salary of €40,000 equates to €40,000,000!

A story of caution on the wrong ways to use technology: BBH Labs tried a social experiment using homeless people as Wi-Fi hotspots! You can read more here. Brain fry springs to mind.

The work done by organisations like Simon and Focus Ireland (and others) is incredible. I sometimes try to imagine where we would be if they weren't so active. I personally believe that the technology community can play a role in assisting the fight. I also think the government gets bad press, and whilst not completely innocent, neither are we. Dublin Simon Community submitted an application for new accommodation last year. The result? 33 objections from the public. Not the government, but us.

“Part of the problem is we have a lack of activism.. We have a lack of people who are willing to step forward and be part of the solution” – Michael Esswein