AI - why inferencing matters as much as LLMs

David Hurtado, Managing Director of Quetta Data Centers, discusses the impact of AI on the data centre market, explaining that today's focus on LLMs and training will soon be joined by AI inferencing, where low latency at the edge will be crucial to delivering an optimised application experience. David also discusses the need for liquid cooling in high-density data centre environments and outlines how Quetta is developing an AI data centre portfolio for the Iberian Peninsula.