Three sectors ripe for AI and HPC disruption

20 January 2020
AI is perhaps the single biggest driver of adoption of High-Performance Computing - but what are the disruptions?

The use of Artificial Intelligence (AI) is growing at an astounding rate. According to market research firm Tractica, the global AI market is expected to grow from approximately $9.5 billion last year to $118.6 billion by 2025. Markets&Markets, another research company, sees even more growth, at a CAGR of 36% between 2018 and 2025, reaching $190 billion by the end of the period. In a comprehensive survey of the market, MMC Ventures found that: “Large companies are adopting AI at a rapidly accelerating rate. Just 4% of enterprises had adopted AI 12 months ago (Gartner). Today, 14% of enterprises have deployed AI. A further 23% intend to deploy AI within the next 12 months.”
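As a rough sanity check on those forecasts (an illustration only; the 2018 baseline and the seven-year horizon for the Tractica figures are assumptions on my part), the growth rate implied by the two Tractica numbers quoted above can be worked out directly:

```python
# Illustrative back-of-the-envelope check on the forecast quoted above.
# Assumes Tractica's ~$9.5bn baseline refers to 2018 and the forecast horizon is 2025.
start_value = 9.5      # global AI market, $bn (Tractica baseline)
end_value = 118.6      # forecast global AI market, $bn (Tractica, 2025)
years = 2025 - 2018    # assumed forecast horizon in years

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 43% per year
```

On those assumptions the implied compound annual growth rate is in the low forties per cent, broadly in line with the 36% CAGR that Markets&Markets forecasts.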

The widespread use of AI has consequences for jobs, time to market, revenues and costs, as well as underpinning significant shifts in the core business of many organisations. It also has profound implications for data centres and their operators, as we illustrated in our own ‘Perfect Storm’ white paper. AI is perhaps the single biggest driver of adoption of High-Performance Computing (HPC): the high-density hardware systems needed to run AI software. As discussed in a previous blog, few organisations will have the skills, facilities and resources to build and maintain their own HPC equipment. Many will turn to colocation providers to host and maintain these systems.

‘Early Majority’ driving next wave of growth
Currently, AI is associated with tech leaders and with specific segments of the automotive and financial services sectors. However, I believe the next wave of AI deployments will come from the healthcare, education and pharmaceutical industries. Each of these is ripe for AI disruption.

The potential for collection, analysis and application of data in healthcare is huge. The proliferation of wearable devices, both consumer-selected and practitioner-prescribed, plus the ability of smartphones to collect, store and share every aspect of our daily lives, is creating an ocean of data points. Connecting, analysing and finding patterns in this data is the key to better and faster diagnoses, more personalised ‘wellness’ programmes and a more proactive, preventative approach to healthcare.

AI is already helping doctors to spot cancer sooner, and Alexa’s AI-driven natural language processing can answer medical enquiries. But this is the tip of the iceberg. AI software leveraging HPC systems could analyse billions of data points before providing your healthcare professional with tailored insights, allowing highly effective, personalised treatments and care.

The pharmaceutical industry is already showing significant interest in AI. The California Biomedical Research Association estimates it costs about $1 billion and takes 12 years to develop a new drug – and just 2% of developments lead to medicines approved for human use. Much of this cost derives from human-driven trial-and-error processes that painstakingly narrow down active ingredients and their outcomes. AI technology not only massively accelerates this process, but can also spot new, unlooked-for correlations. Using AI, drugs can be developed faster – but to do so, massive amounts of data need to be analysed.

Education is often overlooked as an opportunity for AI. The sector has seen significant disruption as the internet overturned centuries of traditional teaching methods, but AI offers the potential of truly individualised learning. A whole class may be following the same course, but individual students will be dynamically set tasks that are appropriate to their ability. No one is allowed to coast, and no one is left struggling. Companies such as Content Technologies are already using AI to create dynamic textbooks and courseware for individual students.

As data and insights develop, it will be possible to better predict outcomes and to advise individuals on which courses, subjects and areas of development are most likely to be effective and advantageous.

Common issues
How and where you house the HPC resources needed to run AI will have a big impact on what it costs. There will be trade-offs and decisions to make between speed and cost, and between compute and storage, as well as all-round security, privacy and data sovereignty issues.

In most of the scenarios I’ve sketched here, latency, the time it takes data to move from a data centre to a user’s device, is less of an issue than in other areas. For an autonomous vehicle or a high-frequency trading platform, delays of a millisecond can have disastrous consequences. Delays of one or two seconds would not have a huge impact on the quality of a GP’s advice to a patient, or a student’s progress through coursework. The ‘heavy lifting’ of AI processing through billions of data points can be done wherever it is most cost-effective. Locating HPC resources in data centres built where land and power are cheaper makes sense. Many are looking to the Nordic region, where the combination of a cool climate, abundant renewable power and stable, pro-business regimes provides the foundation for economically as well as environmentally sustainable AI investments.

As the importance and cost of storing and processing ever-increasing volumes of AI data rise, can businesses afford not to look at the significant advantages that locating in the Nordics offers?

For the healthcare and education sectors in particular, data privacy is crucial. Many governments have mandated that data about patients, and about children in particular, must remain in their jurisdiction. Although the ultimate diagnosis, or the educational attainment of an individual, is clearly personal data, much of the raw information used to deliver those outputs is not. Millions of test scores, symptoms, drug reactions and countless other variables are simply numbers containing no personally identifiable information. These can be safely processed at the most cost-effective and convenient location before being reunited with personal data at the point of delivery.

Own or rent
The critical nature of data, and the commercial impact of the insights derived from it, may sway some to locate HPC capacity on-premises. However, this is a major capital investment and an ongoing operational cost that could hurt the commercial viability of AI. At the other end of the spectrum, public cloud-based AI may be sufficient for small-scale, ad-hoc projects, but it is neither robust nor scalable enough for mission-critical workloads. For the majority, locating specialist HPC equipment in a bespoke colocation site offers the best of both worlds.

Data centre operators have the expertise and experience to create the bespoke facilities needed to support HPC equipment. They already have sufficient power and cooling density, plus the specialist staff to ensure optimum operation. The leading data centre designs are not only highly sustainable but also flexible, providing bespoke solutions for HPC, cloud and enterprise server functions, all interconnected as hybrid solutions. Managing these resources as operational expenditure, rather than as high and risky capex, makes AI investments much easier to swallow.

AI will have a significant impact on virtually every business – but it can put significant strain on existing IT resources and on finances. Working with an expert partner can help achieve the right balance of risk and reward, investment and return from AI.

Article by Tim Bawtree - VP of International Sales

Read our other blogs on HPC / Speed to Market / Edge
