Ideate! Innovation strategy in your company


“Innovation is hard. The larger your business, the harder it gets.”

Is the above statement true? Maybe. More importantly, does it have to be true? No, it doesn't.

One of the most common mistakes large companies make with respect to innovation strategy is assuming that because they are "big", they need a "big" innovation model to succeed. Consider the life cycle of a company. In startup mode, it is agile by default: it doesn't look out past two to three months, often focusing on bringing new features to its product so it can secure the next round of funding. Once the company grows and matures, longer-term goals become a priority and financial pressure eases, and innovation slows down. The longer it slows, the more investment is needed to get it back.

Whilst having a large business means sharing knowledge becomes more of a challenge, there are advantages to size and scale. For one, diversity is a critical component of innovation, and the larger the company, generally the more diverse it is; it's how you identify and harness that diverse thinking that can differentiate your innovation strategy. A number of approaches can be used to enable a large organisation to reach its true innovative capacity. Some of these are introduced below.

Disruptive Innovation Task Teams

A concept to consider when maintaining your innovation strategy is placing disruptive innovation task teams into your existing business model, whereby they act with limited budget and resources, look no further than two to three months out, and are agile by nature. They also keep a keen eye not on what the customer wants now, but on what they will need in the future (though not too far out: six to twelve months). Planting a seed team like this can result in cross-pollination with other product teams around it, increasing your company's overall ideation. A key component of these teams is diversity, with both seasoned campaigners who have the business history and new-generation employees who can bring outside thinking to the table.

Agile. Rinse. Repeat


One of the first aspects critical to success is to introduce agile as a mantra. This isn't easy in large companies, which is why an upfront investigation must be performed to assess how you inject it. To date, agile has been targeted mainly at software development teams, but it can have uses in other business functions, such as finance and human resources. However, it must be adapted to suit: no agile paradigm fits all. Within certain types of projects, if agile is not adjusted, success becomes difficult (read Ken Collier's book on Agile Analytics for any data intelligence readers).

Think Lean

Think fast, learn fast, fail fast. The concept of Lean Startup, first proposed by Eric Ries in 2011, is now well known and has huge advantages. Whether in a large organisation or in the startup space, most new ideas and concepts fail. The odds are not with you: as research by Harvard Business School's Shikhar Ghosh shows, 75% of all start-ups fail. A key component of the lean startup philosophy is to favour experimentation over elaborate planning, customer feedback over intuition, and incremental design over big design planning meetings. The aim is to build a minimum viable product, and iterate and pivot on that product. For more on this, check out the website.
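
The build-measure-learn loop at the heart of this philosophy can be sketched in a few lines of code. The metric, threshold and function names below are my own assumptions for illustration, not part of Ries's method:

```python
# Hypothetical sketch of one turn of the build-measure-learn loop.
# "retention" as the learning metric and the 0.4 threshold are
# illustrative assumptions, not prescribed by Lean Startup.

def build(features):
    """Ship a minimum viable product containing only these features."""
    return {"features": features}

def measure(feedback):
    """Reduce raw customer feedback to a single learning metric."""
    return sum(feedback) / len(feedback)

def learn(metric, threshold=0.4):
    """Decide whether to persevere (iterate) or pivot (change course)."""
    return "persevere" if metric >= threshold else "pivot"

mvp = build(["signup", "core workflow"])
retention = measure([0.2, 0.5, 0.6])  # made-up per-cohort retention
decision = learn(retention)
print(decision)  # "persevere" (retention is about 0.43)
```

The point of each pass through the loop is that the persevere-or-pivot decision is driven by a measured metric, not by an upfront plan.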


Innovation Identity

Do you know your company's innovation identity? Innovation identity is the intersection between your company's technology, your innovation teams, the market(s), and other departments of the company. Two main innovation models seem to emerge:

  1. The thriving innovation model means the innovation culture is the cornerstone of the corporate company; the company develops interactions both across internal departments and with external resources to complete its innovations. Cisco, Sanofi, 3M, Renault, and the open source way of working are champions of this model.
  2. The dedicated entity model involves the creation of an autonomous unit pursuing new and uncertain activity lines; Lockheed's Skunk Works is the classic example. Google sits between the two: innovation is in its core DNA, which links it to the first model; conversely, because it enables small teams to investigate disruptive innovation in a flexible framework, it is also very close to this one.

Open Innovation

The identity of innovation is gradually shaped by interactions between different levels of a company and external groups and organisations. And not just any type or size of external company: the partners you collaborate with must be suited to the market entry you are trying to achieve. Although suitable partners may exist within the market you are targeting, sometimes it is key to identify the technology required to enter that market, and even to look at university collaboration to fulfill the technology requirements for market penetration.

When defining your innovation mantra and strategy, a look at the ten facets defined by Jeffrey Phillips can also help in positioning where you want to be:

  • open vs closed innovation;
  • skunk works vs broadly participative;
  • suggestive vs directed;
  • incremental vs disruptive (also stretching innovation vs "all included" innovation vs disruptive innovation);
  • centralized vs decentralized;
  • product / service / operations / business model;
  • funding;
  • wisdom of the crowd vs defined criteria and experts.

In closing…

Creating and maintaining the innovation strategy at your company is both a challenge and an evolution of not only your company, but also the internal personality and dynamic of the individuals who contribute to it. The direction you take and how you make the journey is down to you.

Distributed Analytics in IoT – Why Positioning is Key


The current global focus on the "Internet of Things (IoT)" has highlighted the extreme importance of sensor-based, intelligent, ubiquitous systems in introducing increased efficiency into our lives. There is a natural challenge in this, as the load on our networks and cloud infrastructures continues to increase from a data perspective. Velocity, variety and volume are attributes to consider when designing your IoT solution, and it is then necessary to decide where and when the execution of analytical algorithms on the data sets should take place.

Apart from classical data centers, there is huge potential in the various compute sources across the IoT landscape. We live in a world where compute is at every juncture, from our mobile phones, to our sensor devices and gateways, to our cars. Leveraging this normally idle compute is important in meeting the data analytics requirements of IoT, and future research will need to consider these challenges. There are three main classical architecture principles that can be applied to analytics: centralized, decentralized, and distributed.

The first, centralized, is the best known and understood today, and it is a simple concept: centralized compute across clusters of physical nodes acts as the landing zone (ingestion point) for data coming from multiple locations, so the data sits in one place for analytics. By contrast, a decentralized architecture uses multiple large clusters located hierarchically in a tree-like structure. In that analogy, the leaves are close to the sources and can compute on the data earlier, or distribute it more efficiently for analysis. Some form of grouping can be applied, for example per geographical location, or a hierarchy set up to distribute the jobs.

Lastly, in a distributed architecture, which is the most suitable for devices in IoT, the compute is everywhere. Generally speaking, the further you move from centralized, the smaller the compute becomes, right down to the silicon on the devices themselves. It should therefore be possible to push analytics tasks closer to the device. In that way, these analytics jobs can act as a sort of data filter and decision maker, determining whether quick insight can be gleaned from smaller data sets at the edge or beyond, and whether to push the data to the cloud or discard it. Naturally, this type of architecture brings more constraints and requirements for effective network management, security and monitoring of not only the devices, but the traffic itself. It makes more sense to bring the computation power to the data, rather than the data to a centralized processing location.
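
As a rough sketch of that filter-and-decide role at the edge (the z-score rule and the 1.5 threshold below are my own illustrative assumptions, not a reference design):

```python
# Edge-side filtering sketch: compute quick insight locally and
# forward only anomalous readings to the cloud. The z-score test
# and its threshold are illustrative assumptions.

from statistics import mean, stdev

def edge_filter(window, z_threshold=1.5):
    """Return (local summary, readings to forward to the cloud)."""
    mu, sigma = mean(window), stdev(window)
    # Readings far from the local mean are pushed upstream...
    forward = [x for x in window if sigma and abs(x - mu) / sigma > z_threshold]
    # ...while only a compact summary is kept (or reported) at the edge.
    summary = {"mean": mu, "count": len(window)}
    return summary, forward

window = [21.0, 21.2, 20.9, 21.1, 35.5]  # one anomalous temperature
summary, to_cloud = edge_filter(window)
print(to_cloud)  # only the outlier travels upstream: [35.5]
```

Everything else in the window is summarised and can be discarded locally, which is exactly the traffic reduction a distributed architecture is after.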

There is a direct relationship between the smartness of the devices and the selection and effectiveness of these three architectures. As our silicon gets smarter, more powerful and more efficient, more and more compute will become available, which should result in less strain on the cloud. As we distribute the compute, our solutions should also become more resilient, as there is no single point of failure.

In summary, "intelligent infrastructures" now form the crux of the IoT paradigm. This means IoT practitioners will have more choice in where they place their analytics jobs, ensuring they best utilize the available compute and control latency for faster response, to meet the real-time requirements of the business metamorphosis that is ongoing.

Nell, Google and a Half Pipe! EnterConf Belfast – Day 2

Quote of the day. “Counterfeiting is an insidious problem in life sciences, our network tenant cloud can help stop it” – Shabbir Dahod – TraceLink, Inc

As EnterConf entered its second day, I continued to see the benefit of having more detailed discussions with people in the Enterprise sector. Even during the night events (the speaker dinner in the Harbour Commissioners Office, a great venue, followed by a few sociables in the Dirty Onion Bar), I kept monitoring the dynamics taking place. The networking normally began with two people, but the circles kept growing, joining to form what I like to call "RoundStandUps". These were normally not short conversations, and collaboration was inherent in the voices and chatter. There was also a deep and satisfying undertone: an energy to keep "building great" in Ireland.

Check out the Half Pipe! Hope it's at Web Summit! 🙂

Half Pipe at EnterConf

Kicking us off on Centre Stage was none other than the inspirational futurist Nell Watson from Singularity University, who is also the CEO of Poikos, the smartphone 3D body measurement company. She talked about virtual employees, how we will replicate the human mind through AI in 20 years (and run business through AI). I liked how Nell bridged the machine and human inter-dependencies.  It was an insightful talk, and having spent the past year looking at machine intelligence (from both a hardware and software implementation perspective), I am seeing more and more futurists thinking like this.

Nell Watson, CEO of Poikos on Centre Stage

A few talks focused on our evolving workplace. David Hale from Gigwalk spoke on the Insight Stage on "Deploying Technology to Power Mobile Field Teams and Maximise Work Efficiency". He spoke on how mobile tools for consumer brands and retailers are being used to more effectively manage field teams, gather in-store data and direct resources to improve retail execution ROI. David also spoke about how our employees are changing, and how companies have to empower the "Millennial Employee", whose requirements include flexibility and a social, online mindset.

David Hale, from Gigwalk on the Insight Stage

Shabbir Dahod of TraceLink, Inc spoke on the Centre Stage on "Delivering the Internet of Things (IoT) to the Enterprise", and I found it one of the highlight talks of the summit. Shabbir spoke about how TraceLink runs the world's largest track-and-trace network for connecting the Life Sciences supply chain and eliminating counterfeit drugs from the global marketplace, using their Life Sciences Cloud, configured in a network tenant architecture.

Shabbir Dahod – TraceLink, Inc

Thomas Davies, Head of Enterprise for Google, drew a huge level of engagement from the crowd with his talk on the next stage of collaboration. Thomas traced the evolution of how we collaborate: since the early 1980s the structures have been quite rigid, and did not change much until a few years ago. Now, customer and employee expectations have changed; they are fast, 24/7, global and personalised. He discussed how employees and organisations are more efficient when they collaborate. "We shape our tools, and then our tools shape us" – Marshall McLuhan.

Thomas Davies (Google) in exuberant form on Centre Stage

The last talk I'll cover is on a topic that sits somewhat under the covers of Enterprise IT, and I am glad that Engin Akyol of Distil Networks spoke on "Dark Cloud: Cloud Providers as a Platform for Bot Attacks". Engin first spoke about good bots, which do serve a purpose for major cloud providers. But this talk focused on bad bots, which slow down application performance and skew analytics. As the volume of cloud platforms continues to scale, it becomes easier to set up bot networks which can pilfer content from websites or launch other malicious attacks.

Engin Akyol of Distil Networks

So, I'll sign off from EnterConf 2015, and on to Web Summit in November, with many events, collaborations and new experiences in between. As a two-day conference, perhaps I built fewer contacts than I expected to, but the ones I did make are more meaningful, and EnterConf gives its attendees an environment to do that. I also sat in on round-tables on big data and security, which added yet another dynamic. It really is a conference experience I will be returning to. Special mention to all the organisers, volunteers and the inspiring venue. Goodbye Belfast, hello Dublin!

Oh, I almost forgot, I really hope Krem Coffee are at Web Summit, awesome coffee!

EnterConf Belfast- Day 1

Firstly, the quote of the day: "We all have to avoid software that epically sucks".

Me at the Insight Stage!

Today I attended day one of EnterConf in Belfast which, for those who don't know it, is a spin-off conference from Web Summit focused on the Enterprise aspect of our tech world. On initial entry, I must admit I was really proud of the EnterConf team for choosing the venue. It has a lot of history associated with it, being in the heart of the Titanic Quarter where the Titanic, for its time an "Enterprise ship", was built! This created a chilled-out atmosphere, a nice contrast to the Web Summit to be held again in November. It was full of detailed and focused meetups and conversations, and did a great job of giving a different experience of what a conference can provide. Kudos.

There were two stages, named Center and Insights, with startup exhibits, food and coffee stands to ensure everyone was nicely refreshed throughout the day. Whilst I won't cover all the talks, I have picked out a few to show the types of topics being discussed.

The first one I'll mention was "Processing Open Data" by Lukas Biewald of CrowdFlower, who spoke extensively on their efforts to clean up data, also looking into elements of data moderation. It really resonated with me, as I have been interested in and developing data-cleansing frameworks over the past number of years, and always struggle with the data pollution that skews our insight. Quote from Lukas: "If you want to improve your algorithm, just add more data". Lukas is in action below.

Lukas Biewald of Crowdflower

Stephen McKeown from AnalyticsEngines and Amir Orad from Sisense also took part in a panel on "Democratising Data", which focused on how enabling companies of all sizes to speed up their analytics creates a more level playing field for startups competing with the Enterprise. Quote from this section: "Bring data into your company's DNA".

Stephen McKeown and Amir Orad

There were a few familiar faces present, with my former EMC colleague and mentor Steve Todd amongst the speakers, on "Economic Value of Data" (check out Steve's blog here for more fascinating content on this topic). Steve spoke on the Center Stage, and it was great to see this topic present, as it really stood out as a conversation we all should be having. Steve gave a similar talk in Cork for an it@Cork event we organised in February, and it was great to see the advancement in his research in this area. Steve spoke on "Valuation Business Processes", with categories within that being M&A, Asset Valuation, Data Monetisation, Data Sale and Data Insurance. I won't spoil the rest, as I am sure Steve will blog on this soon.

Steve Todd speaking on Economic Value of Data

Also on Center Stage, in one of the talks to close out the evening, Barak Regev, Head of Google Cloud Platform – EMEA, spoke on "Architecting the Cloud". It was great to get an update on their vision, and Barak showed Google's ambition to "Build What's Next".

Barak Regev from Google – Build What's Next

And to end on a great quote from James Petter, VP EMEA for Pure Storage – "Security should be like an onion: it should be layered, and you can't reach the center without breaching a layer".

The day brought many epic conversations with people from over 10 different nationalities, including a walk back to the city with the visionary Teemu Arina. His talk on biohacking was incredibly insightful, speaking to the challenge of humans tracking their lives through self-quantification. Teemu took me through his ideas on how humans can do a better job of hacking their bodies for information and using it to improve quality of life. Teemu's book is here!

So now, it's off to the night dinner, to drink a beer or two and build a few more contacts! In the morning, it looks like a few good talks on Machine Intelligence will start the trend for another awesome day!

Why IoT practitioners need to “Wide Lens” the concept of a Data Lake

As we transition towards the vast quantity of devices that will be internet-enabled by 2020 (anything from 50 to 200 billion, experts estimate), it seems that the cloud architectures currently being proposed are somewhat short on the features required to meet customers' data requirements in 2020.

I won't dive hugely into describing the technology stack of a Data Lake in this post (Ben Greene from Analytics Engines in Belfast, who I visit on Wednesday en route to EnterConf, does a nice job of that in his blog). As a quick side step: looking at the Analytics Engines website, I saw that customer choice and ease of use are some of their architecture pillars in providing their AE Big Data Analytics Software Stack. Quick to deploy, modular, configurable, with lots of optional high performance appliances. It's neat to say the least, and I am looking forward to seeing more.

The concept of a Data Lake has a large reputation in current tech chatter, and rightly so: it has huge advantages in enterprise architecture scenarios. Consider the use case of a multinational company with 30,000+ employees, countless geographically spread locations and multiple business functions. Where is all the data? That is normally a challenging question, with multiple databases, repositories and, more recently, Hadoop-enabled technologies storing the company's data. This is the very reason a business data lake (BDL) is a huge advantage to the corporation. If a company has a Data Architect at its disposal, it can develop a BDL architecture (such as shown below, ref – Pivotal) to act as a landing zone for all its enterprise data. This makes a huge amount of sense. Imagine being the CEO of that company: as we see changes in the Data Protection Act(s) over the next decade, the company can take the right steps towards managing, scaling and, most importantly, protecting its data sets. All of this leads to a more effective data governance strategy.
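
A minimal sketch of the landing-zone idea, assuming a per-source, per-day path convention (the layout is my own illustration, not Pivotal's reference architecture):

```python
# Landing-zone sketch: raw data lands untouched, partitioned by
# source system and ingestion date, before any processing happens.
# The path convention here is an assumption for illustration.

from pathlib import Path
from datetime import date

def land(base, source, payload):
    """Write a raw payload into the landing zone for later processing."""
    target = Path(base) / "landing" / source / date.today().isoformat()
    target.mkdir(parents=True, exist_ok=True)
    out = target / "part-0000.raw"
    out.write_bytes(payload)
    return out

path = land("/tmp/bdl", "crm_eu", b'{"customer": 42}')
print(path)  # e.g. /tmp/bdl/landing/crm_eu/<today>/part-0000.raw
```

Because nothing is transformed on the way in, every downstream team starts from the same governed copy of the data.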

Pivotal-Data-Lake

Now shift focus to 2020 (or even before?), and let's take a look at the customer landscape. The customers who will require what the concept of a BDL now provides will need far more choice, and won't necessarily be willing to pay huge sums for that service. Whilst there is some customer choice today, such as Pivotal Cloud Foundry, Amazon Web Services, Google Cloud and Windows Azure, even these services are targeted at a consumer base from startup upwards in the business maturity life cycle. The vast majority of cloud services customers in the future will be everyone around us, the homes we live in and beyond, and the requirement to store data in a far-distant data center might not be as critical for them. It is expected they will need far more choice.

Consider the case of building-monitoring data, which could be useful to a wider audience in a secure linked open data sets (LODs) topology. For example, a smart grid provider might be interested in energy data from all the buildings, to suggest optimal profiles for them that reduce impact on the grid. Perhaps the provider might even be willing to pay for that data? This is where data valuation discussions come into play, which are outside the scope of this blog. But the building itself, or its tenants, might not need to store all their humidity and temperature data, for example. They might want some quick insight up front, and then choose to bin that data (based on some simple protocol describing the data usage) in their home.
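
That "simple protocol describing the data usage" could be as small as a lookup table. The tags, actions and policy below are hypothetical, purely to make the idea concrete:

```python
# Hypothetical usage protocol: each reading type maps to an action.
# Tag names and the policy itself are assumptions for illustration.

RETENTION_POLICY = {
    "energy":      "share",    # valuable to the smart grid provider (LODs)
    "humidity":    "discard",  # quick local insight only, then bin
    "temperature": "discard",
}

def route_reading(reading):
    """Decide what the building should do with one sensor reading."""
    return RETENTION_POLICY.get(reading["type"], "keep-local")

print(route_reading({"type": "energy", "value": 4.2}))     # share
print(route_reading({"type": "humidity", "value": 0.61}))  # discard
```

The default of keeping unknown readings local is deliberately conservative; a real policy would also carry retention periods and consent flags.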

Whilst a BDL is built on the premise of "store everything", and that will bring value for organisations monitoring consumers of their resources, individual consumers might not be willing to pay for it.

To close, the key enablers for these concepts are real-time edge analytics and increased data architecture choice. And this is beginning to happen: Cisco have introduced edge analytics services into their routers, a valid approach to ensuring that the consumer has choice. They are taking the right approach, as there are even different services for different verticals (Retail, IT, Mobility).

In my next blog, edge analytics will be the focus area, where we will dive deeper into the question: "Where do we put our compute?"

IoT and Governance. It's a game of RISK

Due to the sheer volume of devices, data volume, security and networking topologies that result from IoT, it is natural for there to be a lot of questions and legal challenges around governance and privacy. How do I know my data is secure? Where is my data stored? If I lose a device, what happens to data in flight?

The National Fraud Intelligence Bureau has said that 70% of the 230,845 frauds recorded in 2013/2014 included a cyber element, compared to 40% five years ago. This would indicate that we aren't doing a very good job of protecting existing internet-enabled devices, so why should we be adding more? If we internet-enable our light bulbs and heating systems (Nest being acquired by Google is a good example) to control them from our mobile phones, can the devices be hacked to tunnel into our mobile phone data?

It is not only the individual consumer that needs to be aware of privacy and governance. Businesses too, when they adopt IoT, must dedicate resources to the legal requirements and implications of IoT enablement. A key aspect of this will be to ensure their internal teams are aligned in relation to IoT, and more specifically security, data protection and privacy.

More and more, governments and regulatory bodies have IoT in their remit. This includes the EU Commission, which published a report recommending that IoT should be designed from the beginning to meet suitable governance requirements and rights, including the right of deletion, data portability and privacy.

The draft Data Protection Regulation addresses some of these measures including:

  • Privacy by design and default – to ensure that the default position is the least possible accessibility of personal data
  • Consent
  • Profiling – clearer guidelines on when data collected to build a person’s profile can be used lawfully, for example to analyse or predict a particular factor such as a person’s preferences, reliability, location or health
  • Privacy policies
  • Enforcement and sanctions – violations of data privacy obligations could result in fines of up to 5% of annual worldwide turnover or €100m, whichever is greater

The first point above, privacy by design, is unfortunately normally an afterthought. Whilst not a requirement of the Data Protection Act, it makes the compliance exercise much smoother, and taking such an approach brings advantages in building trust and minimising risk.

IoT presents a number of challenges that must be addressed by European privacy regulators as IoT evolves. It is predicted that the scrutiny on these challenges will increase as the device number increases.

Some of the challenges include:

  • Lack of control over the data trajectory path
  • Lack of awareness by users of a device's capabilities
  • Risk associated with processing data beyond its original scope, especially with advances in predictive and analytic engines
  • Lack of anonymity for users
  • Non-threatening everyday devices becoming threat vectors

As can be seen from these challenges above, there are characteristics in common, such as control, security and visibility which makes governance of IoT a bigger challenge than expected.

Finally, governance in IoT is expected to follow other technologies. Up to now, the software industry has not had a single standard for the complete service portfolio (including cloud), although governments are addressing this. From a geographical standpoint, different regulations are commonplace for different jurisdictions in IT, so IoT is predicted to follow suit.

Why Ireland needs to use Technology and IoT more to help their Homeless



21% rise in homeless sleeping rough in Dublin

15% rise in homeless in Cork

Rise also in Limerick, Galway and Waterford

Over 1,000 children are now homeless in Ireland.

Startling figures. Especially the last one. I cannot begin to comprehend what it must be like to be a parent who has to tell their children that they don't have a place to go at night.

I, like many others, have been in other countries' cities and seen that this is not simply an Irish challenge. So I will start by giving those examples, to address the global scenarios and how our psychology must change locally. I remember vividly two occasions abroad when a homeless person made a big impact on my life.

The first time was in 2005, I was in Auckland, New Zealand. We were staying in a hostel whilst backpacking. We weren’t having a particularly good day. The weather wasn’t great, and one or two things went badly. But as we strolled back to the hostel, we noticed a homeless elderly guy in a doorway right beside us. The “bad” weather had turned into a storm, with an incredible amount of flash flooding. I felt awful. And everyone has been there, where a reality check ensures that we come back to earth. I asked the hostel could the guy get a room. They said no, as he must have an address. I offered to pay for his room, still no. Yikes. So I decided all I could do was give him some money. But then I thought, why not do that, and have a conversation. I think we automatically think only money is what they need. I went out, and sat close to him. Whilst chatting, I learned part of his story, and one of the first things he said was that there were worse off people than him, and he didn’t drink, or smoke. And that he hated the rain! He thanked me for the $20, and also the conversation. We helped each other.

The next story is of when I received incredible kindness from a homeless guy on my very first night in New York. Woohoo, I'm in America, let's go for beers! Oops. I ended up feeling a little worse for wear outside a club. On my own. Minus my phone. In a laneway in a bad part of town. A big guy stopped. Uh-oh. But this guy asked me was I ok. I told him I was from Ireland, and that I had lost my phone. He told me, "man, I don't even have a phone". He then walked me out of the laneway and hailed me a cab. I gave him a nice tip, and the cynics out there will say he was looking for that. But he didn't know me. Humanity exists.

And now to Ireland. I really want to stress that I am not some saint. This is more to raise awareness of how technology can potentially help. I have contributed to Cork Simon Community at various points in my life, and if I have some change, I do give it to the needy. But herein lies the first challenge: a lot of people carry less and less cash. And even if we do, people wonder: if I give this person money, what will they spend it on? Money doesn't always help, as the upper class of Irish society has also seen.

From a technology perspective, I want to talk about some potential ways for technology to help on this challenge.

The term Smart Cities has been bandied about in relation to the Internet of Everything, where we will use technology to improve people's lives. Yet I have not seen much presented that will help the homeless. Imagine if we could use cost-effective smart devices, worn by homeless volunteers, to identify the paths they take and where they sleep, so that soup runs can be more efficient and beds can be found? I think it is one area that must at least be explored. They are doing this in Odense, Denmark. Check it out here.

I also believe that doorways could be fitted with load sensors to gauge how many are occupied in our cities. This data could be used to predict common places used, and even predict on particular nights where homeless people are. That coupled with temperature sensors could have saved Jonathan Corrie’s life last December.

The last idea I'll propose here is to modify the many parking meters in our cities to allow them to produce vouchers based on use in a particular day. The more the meters are used in the day session (busier cities should correlate with more needy people), the more food/supply vouchers the meters print out in the evening when homeless people enter a code that is texted to them. If they don't have a phone, their date of birth would be previously registered and entered instead.

I came across a startup on a recent trip to the United States, and I was incredibly impressed. It is called HandUp. The whole premise is that homeless people can set up an online profile through the organisation and crowdfund to reach their goals. They never receive direct cash; instead it is used to buy supplies, food, and sometimes tools to go back to work. So instead of writing their story on cardboard, they get help to set up a profile, and then hand out business cards for their site, so that people can log on and donate. It is only based in San Francisco for now, but I have contacted them to hear their plans for global roll-out (and how).

Technology multinationals benefit greatly through our tax system by positioning themselves in Ireland. And it's great for our economy, through jobs. I have seen the kindness first-hand by working in these companies; they create great lives for lots of people. I wonder about a 1% challenge in the tech sector, where people could volunteer (before tax) to donate 1% of their annual wage (hence 0.5% from us and 0.5% from government) to a particular social challenge, which could change annually: the homeless, the elderly. I think this sort of crowdfunding, spread thin across many contributors, could make a huge impact. The maths is simple: 100,000 employees donating 1% of an average salary of €40,000 equates to €40,000,000!

A story of caution on the wrong ways to use technology: BBH Labs tried a social experiment using homeless people as wifi hotspots. You can read more here. Brain fry springs to mind.

The work done by organisations like Simon and Focus Ireland (and others) is incredible. I sometimes try to imagine where we would be if they weren’t so active. I personally believe that the technology community can play a role in assisting the fight. I also think the government gets bad press, and whilst it is not completely innocent, neither are we. Dublin Simon Community submitted an application for new accommodation last year. The result? 33 objections from the public. Not the government, but us.

“Part of the problem is we have a lack of activism. We have a lack of people who are willing to step forward and be part of the solution” – Michael Esswein


it@Cork European Technology Summit 2015 – a WOW event!

I wanted to change direction slightly and give an update on an event I had the privilege of being involved with this week: the it@Cork European Technology Summit. The event was held at Cork City Hall on Wednesday, May 5th, with a full-day technology summit followed by a black tie dinner and a 3D wearables fashion show.

An epic journey over the past few months, with way more ups than downs, resulted in…

1 Day – 4 Sections – 20 Speakers – 4 Chair Speakers – 400+ Day Attendees – #1 Trending on Twitter – 9 Amazing Artisan Food Stalls – Lots of Sponsors – 200+ Night Attendees – 2 Fashion Designers – 1 Model Agency – 10 Models – 2 Fire Dancers – 4 3D-Printed Bow Ties!

So how did I arrive there? Last year, Gillian Bergin from EMC asked me to get involved with it@Cork as part of the Tech Talk committee. I’m delighted she did, as over the past few months I got to partake in and help organise some excellent tech talks from a variety of people, including my fellow technologist, Mr Steve Todd of EMC. The tech talk series is just one of many successful strands of it@Cork, hosting six high-end, rockstar speakers/panels per year. The series is fully booked until 2016, but if you are a “rockstar” speaker interested in presenting, please contact us directly. James O’Connell of VMWare, who passed the tech talk committee chair over to Barry O’Connell, took on the chair of the Summit organising committee. James, together with myself and Paddy O’Connell of Berkley Group (known collectively now as the Macroom or Muskerry Mafia 🙂 ), assisted Sarah Walsh of it@Cork in organising the day summit. The night summit was excellently organised by Marnie O’Leary Daly of VMWare.

The event was kicked off by James, and then Ronan Murphy, chairman of the board of it@Cork and CEO of Smarttech, gave an address on how Cork needs a cluster manager to help drive more employment in the region. More from Ronan in the Examiner here. Donal Cahalane, from Teamwork.com, gave an insightful talk on how he sees the industry progressing, with excellent advice for everyone from startups through to multinationals.


The four sections throughout the day offered a balanced mix of raw technology (Cloud: Challenge the Fear; Internet of Everything) alongside Digital Marketing and a Tech Talent/Diversity panel. I found this to work quite well, as it ensured the audience got a variety of speakers.

The cloud session on “challenging the fear” was an excellent one to start with, as it had a mix of SMEs from companies such as Kingspan (John Shaw), Trend Micro (Simon Walsh) and Barricade (David Coallier), but also representation from the legal profession in the form of Michael Valley (barrister) and Noel Doherty (solicitor), who spoke at length on cloud governance. The session was chaired by Anton Savage of The Communications Clinic, who hosted a panel discussion with all five presenters at the end.


All of the sections were split by networking opportunities in the exhibition halls, where companies from the region presented their organisations, and some even demonstrated their wares. The atmosphere was great, with lots of chatter, tweeting and drinking of coffee! 😀


The second section was a panel session on Tech Talent, chaired by Paddy O’Connell from Berkley, and facilitated by Meghan M Biro, founder and CEO of TalentCulture, and Kevin Grossman, who co-founded and co-hosts the TalentCulture #TChat show with Meghan. They later presented their #TChat show live from the Clarion hotel Cork. It was awesome!

Such variety (no pun intended!) in the panel, with David Parry Jones (VP UKI, VMWare) and Noelle Burke (Head of HR, Microsoft Ireland) representing industry, Michael Loftus (Head of the Faculty of Engineering and Science, CIT) representing academia, and the hugely impressive student Ciara Judge, one of the Kinsale winners of the 2013 Google Science Award. Everyone inspired in their own way, and the dynamic at lunchtime was one of motivation, hope and leadership.


Having started my own personal digital marketing brand last year, and learned by making mistakes, I was exceptionally excited by our third section: Digital Marketing. Again, Anton did an incredible job of asking the right questions, and the audience listened effortlessly. To hear experts such as Meghan, Antonio Santos, Niall Harbison and Raluca Saceanu was a privilege, and I also got the opportunity to speak with them directly (as did many others). This was true of all the speakers throughout the day. I believe a huge number of people took away what I call “advice snippets” that they can use to grow their own brands.


The last session was on an area close to my heart, the Internet of Everything (IoE), and I had the privilege of chairing it. We had speakers from Climote (Derek Roddy), my future employer Tyco (Craig Trivelpiece), Salesforce (Carl Dempsey), Dell (Marc Flanagan) and Xanadu (David Mills). All these companies are at different stages of their IoE journey, but the message was consistent: IoE is going to make a huge impact on our smart futures. I really liked how Craig spoke of “if you want to improve something, measure it”, and how Tyco are looking at predictive maintenance and pushing intelligence/insight back out to the devices. Derek showed how Climote is changing how we live, and David did the same in relation to sport. Marc gave an excellent account of Dell’s practical approach to IoT, showing the capabilities needed for IoE projects. Carl got me really excited about Salesforce’s plans in the IoE space. The session closed out the event well, and the numbers in attendance stayed consistent.

Having attended a huge number of tech events over the years, it was great to see, once again, year-on-year growth of Munster’s premier technology summit. The atmosphere was electric all day, both locally and on Twitter. The tweet wall was a big success, and we expect next year’s event to be bigger and better again.


The black tie dinner was also a huge success, with the Millennium Hall in City Hall packed to capacity. Marnie O’Leary Daly, along with Emer from Lockdown model agency, put on an amazing dinner (superb catering by Brooks) and fashion show, with 3D wearables fashion provided by Aoibheann Daly from LoveandRobots and Rachael Garrett from Limerick School of Art and Design (@LSAD). Special mention to FabLab for helping Rachael get her garments ready. It really was a spectacular evening. The Clarion hotel was also hugely supportive of the night element. (Photos to follow!) Emer will blog on the fashion side of the night soon, and do a much better job than me!

It@Cork European Technology Summit 2016. Watch this space. 

If you are interested in getting involved in 2016, please contact Sarah Walsh at it@Cork.

Case Study: IoT Technology Platform – ThingWorx [10]

In my previous blog, I mentioned some platform design considerations at the outset. In this blog, I discuss one such platform that has gained significant traction in the industry in recent times.

About ThingWorx [10]

ThingWorx is one of the first software platforms designed to build and run the applications of the connected IoT world. ThingWorx reduces the cost, time, and risk required to build innovative Machine-to-Machine (M2M) and Internet of Things (IoT) applications.

The ThingWorx platform provides a complete application design, runtime, and intelligence environment with the following features:

  • Modern and Complete Platform
  • Mashup People, Systems & Machines
  • Deploy 10X Faster with Model-based Development
  • Deploy How You Like
  • Evolve & Grow Your Application Over Time

What ThingWorx did that was really clever was to create a modelling environment based on a graph database that keeps track of thousands of devices communicating with other devices and applications.

“There’s nothing new about gathering and using data to make something better. What is new, and complex, is getting these things that are now web-enabled to take better advantage of the IoT. This requires application developers to rethink how they collect, analyze, manipulate and interact with information,” said Russ Fadel, CEO, ThingWorx [9]. “ThingWorx is the first software platform on the market designed to build and run applications in the connected IoT world and offers a fully integrated and pre-architected solution that covers connectivity, event processing, analytics, storage and presentation of any kind of M2M and IoT data. Our goal is to provide customers with instant insight into collected data from these smart, connected things so they can be proactive and address issues before they happen in a smarter way than previously able.” [10]

Figure 7: ThingWorx Architecture [10]
Figure 7: ThingWorx Architecture [10]

Features [10]

ThingWorx Composer™

ThingWorx Composer is an end-to-end application modeling environment designed to help you easily build the unique applications of today’s connected world. Composer makes it easy to model the Things, Business Logic, Visualization, Data Storage, Collaboration, and Security required for a connected application.

Codeless Mashup Builder

ThingWorx’s “drag and drop” Mashup Builder empowers developers and business users to rapidly create rich, interactive applications, real-time dashboards, collaborative workspaces, and mobile interfaces without the need for coding. This next-generation application builder reduces development time and produces high-quality, scalable connected applications, which allows companies to accelerate the pace at which they deliver value-add solutions, resulting in greater market share against new and existing competitors.

Event-Driven Execution and “3D” Storage

ThingWorx’s event-driven execution engine and 3-dimensional storage allow companies to make business sense of the massive amounts of data from their people, systems, and connected “Things” – making the data useful and actionable. The platform supports scale requirements for millions of devices, and provides the connectivity, storage, analysis, execution, and collaboration capabilities required for applications in today’s connected world. It also features a data collection engine that provides unified, semantic storage for time-series, structured, and social data at rates 10X faster than traditional RDBs.
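To give a feel for what “event-driven execution” means in practice, here is a toy sketch of business rules firing as readings arrive, rather than by batch polling. This illustrates the general pattern only; the class, event names, and threshold are invented for illustration and are not ThingWorx’s actual engine or API.

```python
# Toy event-driven execution: handlers subscribe to an event stream and
# business rules fire per reading, instead of scanning stored data later.
from collections import defaultdict

class EventBus:
    """Route each published reading to every handler subscribed to it."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event, handler):
        self._handlers[event].append(handler)

    def publish(self, event, payload):
        for handler in self._handlers[event]:
            handler(payload)

bus = EventBus()
alerts = []

def on_temperature(reading):
    # The business rule is evaluated the moment the event arrives.
    if reading["value"] > 80:
        alerts.append(reading["thing"])

bus.subscribe("temperature", on_temperature)
for reading in [{"thing": "pump-1", "value": 72},
                {"thing": "pump-2", "value": 91}]:
    bus.publish("temperature", reading)
print(alerts)  # ['pump-2']
```

The win over polling a traditional RDB is latency: the rule runs once per event at arrival time, so an overheating pump is flagged immediately instead of at the next scan.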

Search-based Intelligence

ThingWorx SQUEAL™ (Search, Query, and Analysis) brings Search to the world of connected devices and distributed data. With SQUEAL’s interactive search capabilities, users can now correlate data that delivers answers to key business questions. Pertinent and related collaboration data, line-of-business system records, and equipment data get returned in a single search, speeding problem resolution and enabling innovation.

Collaboration

ThingWorx dynamically and virtually brings together people, systems, and connected equipment, and utilises live collaboration sessions that help individuals or teams solve problems faster. The ThingWorx data store becomes the basis of context-aware collaboration and interaction among the system’s users, further enhancing its value. Additionally, the tribal knowledge exposed during the process is automatically captured and indexed for use in future troubleshooting activities.

End of Case Study

References

[10] ThingWorx: About ThingWorx. http://www.thingworx.com/

Platform Architecture Pre-Considerations for IoT

Apart from the sheer volume of data generated by IoT devices, there is also a huge number of different data customer requirements, both known and unknown, that will need to be considered. The platform technology will need to be agile enough to meet this variation. How will it scale both horizontally and vertically to ensure sustainability? I started to think about profiling requirements and giving a personality to each IoT customer type, so that the platform can morph and adjust itself based not only on the inputs (data type, frequency, format, lifetime), but also on the outputs it needs to provide.
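One way to sketch this “customer personality” idea is a profile record capturing the input characteristics just listed, which the platform could inspect to choose a processing path. The field names, threshold, and path labels below are my own illustrative assumptions, not an existing schema:

```python
# Hypothetical profile describing an IoT customer type: the inputs a
# device produces (data type, frequency, format, lifetime) plus the
# outputs required, used here to pick a processing path.
from dataclasses import dataclass

@dataclass
class CustomerProfile:
    data_type: str        # e.g. "time-series", "event", "blob"
    frequency_hz: float   # how often readings arrive
    data_format: str      # e.g. "json", "binary"
    lifetime_days: int    # how long the data stays relevant
    needs_realtime: bool  # does this customer need low-latency output?

def processing_path(profile: CustomerProfile) -> str:
    # Illustrative rule: frequent or latency-sensitive data is streamed,
    # everything else can be handled in periodic batches.
    if profile.needs_realtime or profile.frequency_hz > 1.0:
        return "stream"
    return "batch"

sensor = CustomerProfile("time-series", 10.0, "json", 30, True)
meter = CustomerProfile("time-series", 0.001, "json", 365, False)
print(processing_path(sensor), processing_path(meter))  # stream batch
```

The point is that once both inputs and required outputs are captured per customer type, the platform can adjust its behaviour by inspecting the profile rather than being hard-wired per application.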

Data latency will also be a requirement that any platform will need to first understand and then address, depending on the application and customer requirements. In an interesting discussion today in Silicon Valley with Jeff Davis (my original hiring manager in EMC, and now senior director of the xGMO group, looking at operations cloud, analytics and infrastructure services), he mentioned that at a previous company in the sensor business, latency represented a huge challenge, especially as the amount of data grew exponentially. We chatted at length about how the consumer of today wants their interactions with devices and technology to be instant. How long will people be willing to wait for smart light bulbs and switches? What if my devices are distributed? More importantly, Jeff outlined a key question: “How much is the consumer willing to pay for the added services provided by adding ‘smarts’ to standard everyday sensors?” This is an “understand the market” question, and should be a consideration for anyone looking at building an IoT platform.

When one considers that most applications in the IoT space might require more than one industry working together, cross-collaboration is key to making it work. Consider some of the taxi apps currently in use: the taxi company provides the car locations, the application offers location information, banking is used to pay from your account, and perhaps an advertisement is shown on your receipt. If a suitable arrangement is not formed between the various IT companies involved, it becomes too easy for the “blame game” to ruin the user’s experience of the application when something goes wrong.

Central to satisfying both the varying requirements of customers and latency management will be the concept of a customer or business data lake, powered by Hadoop or Spark technology, which will form the primary storage and processing layer in the data center. There is also the option of tiering to help address the variation in requirements, with the possibility of sending the “big hitting” data, which brings the most value in close to real time, to an in-memory database that provides fast, cached, insightful analytics. In a later blog post, I will elaborate greatly on this paragraph, so stay tuned. If the same dataset is used by multiple applications in a multi-tenant schema, there will be clear orchestration challenges in ensuring that the data can be processed in real time. Other features of any data architecture for IoT could also include:

  • Multiple Data Format Support
  • Real Time Processing
  • High Volume Data Transfer
  • Geographically Agnostic
  • Data Lake Archival and Snipping
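The tiering idea above can be sketched in a few lines: every record lands in the data lake for archival, while only the high-value “big hitting” records are also routed to a fast in-memory tier for near-real-time analytics. The tier names, the value score, and the threshold are illustrative assumptions, not a specific Hadoop or Spark API:

```python
# Minimal hot/cold tiering sketch: archive everything, cache the
# high-value records for low-latency analytics. The lists stand in for
# a real data lake and an in-memory database.
hot_tier: list = []   # stand-in for an in-memory store
data_lake: list = []  # stand-in for Hadoop/Spark-backed storage

def ingest(record: dict, value_score: float, hot_threshold: float = 0.8):
    """Land every record in the lake; promote high-value ones to hot tier."""
    data_lake.append(record)
    if value_score >= hot_threshold:
        hot_tier.append(record)

ingest({"sensor": "s1", "reading": 21.4}, value_score=0.95)
ingest({"sensor": "s2", "reading": 19.8}, value_score=0.30)
print(len(hot_tier), len(data_lake))  # 1 2
```

The interesting design question, which I will return to, is who assigns the value score: the device, the ingestion pipeline, or a downstream model.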

As with all technology, IoT will evolve, which means we will build on top of previous technologies and new technologies will add to the ecosystem. The enterprise data warehouse will continue to play an important role, but a series of technology platforms will be necessary. While numerous platforms have been and will be created, one such platform, ThingWorx, is the subject of a case study in my next blog.