(Republished with permission from Aruba, a Hewlett Packard Enterprise company. You can view the original blog here.)
We live, it seems, in an age of perpetual uncertainty, where even the ‘what ifs’ are in a constant state of flux. Seismic shifts have accelerated digital transformation, and we are now harnessing the power of digital in ways previously unimagined. Artificial Intelligence (AI) and Machine Learning (ML) are being used to analyse everything from vaccine efficacy to job applications. But as with any analysis, the insights gleaned are only as good as the source data that drives them.
For real-world examples of this, you only need to look at recent concerns that COVID vaccine efficacy findings are based on sample sizes that are not large enough, or the ongoing controversy over the use of algorithms to scan job applications and the risk that they bake in bias because they are fed unintentionally biased data. What both examples make clear is that when it comes to AI, it’s not only about crunching the data, but about having the depth of data necessary to get the right results.
Dealing with the data deluge
As the world around us has shifted into the digital sphere over the last twelve months, it has quickly become apparent just how much data this is going to produce and how reliant on it we all are. From developing contact tracing apps to reinventing the omni-channel retail experience, data – and our ability to capture, analyse and act on it – will be fundamental to our recovery from this time of uncertainty. And this puts AI and ML front and centre.
The bottom line is, we simply cannot keep up with all this data without these technologies.
It’s just not humanly possible to gather intel from numerous disparate sources, crunch it and immediately offer up real-time, actionable insights and solutions unless we have the right tools in place to support fact-finding and problem-solving. But what if the intelligence behind these sources is not robust enough to deliver these insights?
The accelerated and widespread adoption of AI/ML – to get to answers faster and to spot patterns or anomalies in data that human beings are likely to miss – enables much-needed efficiency and agility for businesses. However, the danger is that incomplete, unintentionally biased or ‘bad’ data will get you to the wrong answer quicker, or flag something that isn’t really there. This is why the size and scope of the data lake is crucial, and why it is so important to ask questions about it as part of any AI sales conversation you have.
It’s all well and good to bring in AI solutions under the guise of making things simpler and easier, but the reality is that bad AI solutions could leave your business in a much worse position than the one you started in. The difference between a data lake that provides clean, relevant data and one that looks great in a demo but can’t stand up to a real-life scenario is having a robust network of sensors and devices in place. So do your due diligence to make sure you are not setting yourself up for failure.
Super skilling your IT department
While the applications of AI are endless, one of its most effective uses within any large organization today is helping to manage IT networks. According to ZK Research, the average network engineer in 2018 was already spending 10 hours a week finding and fixing Wi-Fi problems – and that was before the pandemic scattered workforces across cities, countries and continents. A year into remote working, the elevated stress that unnecessary downtime, outages or slowdowns cause for both employees and IT managers leaves little room for network errors or products that do not work.
The goal of AI/ML in this situation is to make the lives of IT engineers easier by automating fixes for simpler problems, or by helping them get to the root of more complex problems faster, freeing up their time for other tasks. Used in the right ways, AI/ML can help them solve network issues – from wherever they are – by identifying the problem, presenting the evidence and finding a resolution, all in a fraction of the time it would take to do manually.
In a WFH world, downtime is not an option, and using AI/ML to identify and flag problems before they become issues can save time and money. But beyond fixing immediate problems, these technologies are also becoming increasingly necessary to spur the innovation that organizations so desperately need to adapt to a rapidly changing world. In short, the need for smarter networks is becoming mission critical, and most companies will have only one chance to get it right – which is why it’s worth spending the time and asking the right questions to choose solutions that actually deliver on their promises.