AI capabilities are growing faster than hardware: Can decentralisation close the gap?

21.08.2024


AI capabilities have surged dramatically over the past two years, with large language models (LLMs) like ChatGPT, DALL-E, and Midjourney becoming widely used tools. Today, generative AI programs are doing everything from responding to emails and writing marketing content to composing music and generating images from simple prompts.

What’s even more striking is the speed at which both individuals and businesses are adopting AI technology. A recent McKinsey survey found that the percentage of companies using generative AI in at least one business function doubled in a year, rising to 65% from 33% at the start of 2023.

However, as with many technological advances, this emerging field is not without its challenges. Training and operating AI systems is resource-intensive, and currently, large tech companies hold a significant advantage, which could lead to the centralization of AI.

The World Economic Forum highlights the increasing demand for AI computing power, noting that the required computational resources are growing at an annual rate of 26% to 36%.

A study by Epoch AI supports this trend, projecting that the cost to train or operate AI programs will soon reach billions of dollars.

“The expense of the largest AI training sessions has been doubling or tripling annually since 2016, putting billion-dollar price tags within reach by 2027, if not sooner,” stated Epoch AI researcher Ben Cottier.

In my view, we’re already seeing this unfold. Last year, Microsoft invested $10 billion in OpenAI, and recent reports indicate that the two companies are planning to build a data center housing a supercomputer powered by millions of specialized chips. The projected cost? A staggering $100 billion, ten times the initial investment.

Microsoft isn’t alone in this AI arms race. Other tech giants like Google and Nvidia are also heavily investing in AI research and development.

While the results of these investments might justify the spending, it’s hard to overlook the fact that AI development is currently dominated by a few large companies. Only these major players have the resources to fund AI projects on the scale of tens or hundreds of billions of dollars.

This raises the question: How can we avoid the pitfalls seen in Web2 innovations, where a small number of companies control the majority of technological progress?

James Landay, Stanford’s HAI Vice Director and Faculty Director of Research, has weighed in on this issue. Landay suggests that the race for GPU resources and the focus by big tech companies on using their AI computing power internally will drive the demand for more affordable hardware solutions.

In China, the government is already stepping in to support AI startups in response to the chip wars with the U.S., which have restricted Chinese companies’ access to essential chips. Earlier this year, local governments in China introduced subsidies, offering AI startups computing vouchers ranging from $140,000 to $280,000, aiming to lower the costs associated with computing power.

Given the current landscape of AI computing, one theme is constant: the centralization of the industry. Major tech companies control the bulk of the computing power and AI programs, and despite ongoing changes, that status quo has barely shifted.

However, there is a glimmer of hope that this time things could shift for the better, thanks to decentralized computing infrastructures like the Qubic Layer 1 blockchain. This L1 blockchain employs an advanced mining mechanism called useful Proof-of-Work (uPoW). Unlike Bitcoin’s traditional PoW, which expends energy solely to secure the network, Qubic’s uPoW directs that computational power toward productive AI tasks, such as training neural networks.

In simpler terms, Qubic is decentralizing the sourcing of AI computational power by moving away from the current model, where innovators are dependent on the hardware they own or rent from big tech companies. Instead, this L1 taps into a network of miners, potentially numbering in the tens of thousands, to provide computational power.
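The idea can be pictured with a minimal sketch. This is a hypothetical simulation, not Qubic’s actual protocol: the task, miner, and verification functions below are illustrative stand-ins showing how a network could credit miners for verifiable AI-style compute instead of hash puzzles.

```python
import random

def make_task(rng):
    """A toy 'training step': a dot product over random weights and inputs."""
    weights = [rng.uniform(-1, 1) for _ in range(4)]
    inputs = [rng.uniform(-1, 1) for _ in range(4)]
    return {"weights": weights, "inputs": inputs}

def solve(task):
    """The useful work a miner performs."""
    return sum(w * x for w, x in zip(task["weights"], task["inputs"]))

def verify(task, claimed, tol=1e-9):
    """The network re-checks the submitted result before crediting the miner."""
    return abs(solve(task) - claimed) < tol

rng = random.Random(42)
ledger = {}  # miner id -> credited work units
for i in range(10):
    miner = f"miner-{i % 3}"       # tasks rotate across a pool of miners
    task = make_task(rng)
    result = solve(task)           # the miner does the useful work
    if verify(task, result):       # the network verifies, then credits
        ledger[miner] = ledger.get(miner, 0) + 1

print(ledger)
```

The key property is that the expended compute produces a useful artifact (here, the solved tasks) while verification remains cheap for the rest of the network, which is what lets rewards flow to a broad pool of miners rather than a single data-center owner.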

While this approach is more technically complex than relying on big tech to handle the backend, it is more economical. More importantly, it ensures that AI innovation is driven by a broader range of stakeholders, rather than the current scenario where a few players dominate the industry.

And what happens if these companies fail? Worse still, these tech giants have a poor track record as stewards of significant technological advances.

Today, many people are concerned about data privacy violations and other related issues, such as societal manipulation. Decentralized AI innovations could make it easier to monitor developments and lower the barrier to entry.

AI innovations are still in their early stages, but the challenge of accessing computational power remains a major obstacle. On top of that, big tech currently controls most of the resources, which poses a significant challenge to the pace of innovation. There’s also the risk that these same companies could gain even more control over our data—the digital gold of our time.

However, with the rise of decentralized infrastructures, the AI ecosystem could see reduced computational costs and diminished control by big tech over one of the most valuable technologies of the 21st century.
