
Microsoft developing its own AI chip Athena to reduce Machine Learning costs — DataSagar

Welcome back to yet another blog post on DataSagar! This post is a continuation of my blog series on Artificial Intelligence. As a technology educator and e-business consultant, I aim to provide insightful articles and helpful tips to keep you up to date with the latest developments in the world of tech. In this article, we’ll be discussing Microsoft’s latest project: developing its own artificial intelligence (AI) chip, codenamed “Athena.”

Microsoft is reportedly working on a new chip project called Athena: an in-house AI processor intended to power the technology behind AI chatbots like ChatGPT and to train the large language models that generative AI requires. These models are trained on massive volumes of data, from which they learn to spot patterns and imitate human speech. Microsoft expects Athena to outperform the chips it presently purchases from other vendors, saving time and money on its expensive AI research and development.

The market for these chips is presently dominated by Nvidia, but Microsoft aims to change that with Athena. In development since 2019, the chip is now being tested by a small group of Microsoft and OpenAI staff members. Athena is reportedly being built by a team of more than 300 employees, and Microsoft anticipates making it available to Azure customers by the end of the year.

According to Dylan Patel, principal analyst at SemiAnalysis, “Microsoft wants to use large language models in all of their applications, including Bing, Microsoft 365, and GitHub.” The company could save tens of billions of dollars as a result.

Following ChatGPT’s success, Microsoft is accelerating the rollout of Athena. In an effort to compete with Google and capitalize on its partnership with OpenAI, Microsoft earlier this year released Bing AI, its own AI-powered search engine.

In addition to Athena, Microsoft’s upcoming Surface PCs will ship with an NPU (neural processing unit) to accelerate on-device artificial intelligence and machine learning (ML). Microsoft currently uses chipsets from AMD, Intel, and Qualcomm, all of which are adding NPUs to their processors.
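To make the on-device idea concrete, here is a minimal Python sketch, assuming the onnxruntime package is installed and a hypothetical local model.onnx file exists, showing how an application can check which execution providers (including hardware-accelerated ones such as DirectML or Qualcomm’s QNN) are available and pick one with a CPU fallback:

```python
# Minimal sketch: list the ONNX Runtime execution providers available on this
# machine and prefer a hardware-accelerated one, falling back to the CPU.
# "model.onnx" is a hypothetical local model file used only for illustration.
import onnxruntime as ort

available = ort.get_available_providers()
print("Available execution providers:", available)

# Prefer accelerated providers if present; keep CPU as the final fallback.
preferred = [p for p in ("QNNExecutionProvider",   # Qualcomm NPUs
                         "DmlExecutionProvider",   # DirectML (Windows GPU/NPU)
                         "CUDAExecutionProvider")  # Nvidia GPUs
             if p in available]

session = ort.InferenceSession("model.onnx",
                               providers=preferred + ["CPUExecutionProvider"])
print("Session is using:", session.get_providers())
```

Which providers show up depends entirely on the installed ONNX Runtime build and the drivers on the machine; on an NPU-equipped Surface-class device the accelerated providers are what make on-device ML practical.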

It is clear that Microsoft is investing heavily in AI, and with the creation of Athena it aspires to dominate the AI chip industry. Although Nvidia presently rules the market, it remains to be seen how Microsoft’s chip will perform and whether it can challenge the leading chip maker. Either way, AI, and technology as a whole, is in an exciting phase right now, and we can’t wait to see what the future brings.

The idea of developing AI chips in-house is not new; companies like Amazon, Google, and Facebook already build their own. It’s a growing trend as tech giants grasp AI’s potential to transform the market. By developing their own chips, these businesses can cut the time and cost of their expensive AI efforts.

Additionally, the pandemic accelerated technology adoption and raised demand for AI-powered products and services. In light of this, Microsoft’s investment in AI chips may prove to be a big step forward for the business, allowing it to expand its product offerings and capabilities in the AI sector.

You should know that ChatGPT and other large language models need high-performance AI hardware to run their computations efficiently. The size and complexity of the model determine how much processing power and memory are needed.
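As a rough illustration of how model size alone drives hardware requirements, the sketch below (a rule-of-thumb estimate, not a measured figure) computes how much memory is needed just to hold a model’s weights at 16-bit precision; real workloads also need memory for activations and, during training, gradients and optimizer state.

```python
# Rough rule of thumb: memory for model weights ≈ parameters × bytes per parameter.
# Assumes 16-bit (2-byte) weights; training typically needs several times more.
BYTES_PER_PARAM = 2  # fp16/bf16

def weight_memory_gb(num_parameters: int) -> float:
    """Approximate memory (in GB) needed just to store the model weights."""
    return num_parameters * BYTES_PER_PARAM / 1e9

for name, params in [("1B-parameter model", 1e9),
                     ("13B-parameter model", 13e9),
                     ("GPT-3 (175B parameters)", 175e9)]:
    print(f"{name}: ~{weight_memory_gb(int(params)):,.0f} GB of weights")

# GPT-3's 175 billion parameters alone take roughly 350 GB at 16-bit precision,
# far more than a single consumer GPU holds, which is why clusters or
# specialized accelerators are required.
```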

Modern language-generation models like ChatGPT, which are built on the GPT-3 architecture and trained on enormous amounts of data, demand a great deal of computing power. Exactly how much ChatGPT consumes depends on the workload, but it typically needs powerful GPU clusters or specialized AI chips.

For instance, a supercomputer with approximately 3,000 graphics processing units (GPUs) and 192 server-class CPUs was reportedly used to train the largest version of GPT-3, which has 175 billion parameters. Although ChatGPT is a more compact model than GPT-3, it still needs a great deal of computing power to run properly.
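To put those numbers in perspective, a widely used back-of-the-envelope formula estimates training compute for a dense transformer as roughly 6 × parameters × training tokens. The sketch below applies it to GPT-3’s published figures of 175 billion parameters and about 300 billion training tokens; the 1 PFLOP/s sustained-throughput figure is an assumption for illustration only.

```python
# Back-of-the-envelope training-compute estimate for a dense transformer:
#   total FLOPs ≈ 6 × (number of parameters) × (number of training tokens)
# Figures for GPT-3: 175B parameters, ~300B training tokens.
PARAMS = 175e9
TOKENS = 300e9

total_flops = 6 * PARAMS * TOKENS
print(f"Estimated training compute: {total_flops:.2e} FLOPs")  # ~3.15e+23

# How long would that take at a hypothetical sustained 1 PFLOP/s
# (e.g. a GPU cluster running at partial utilization)?
effective_flops_per_sec = 1e15
days = total_flops / effective_flops_per_sec / 86400
print(f"At 1 PFLOP/s sustained: ~{days:,.0f} days of training")  # ~3,600+ days
```

That works out to several thousand petaFLOP/s-days of compute, in line with the figure reported in the GPT-3 paper, which is exactly why hyperscalers find dedicated clusters and custom accelerators like Athena so attractive.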

In any case, Microsoft’s creation of its own AI chip is a positive step forward for the field of artificial intelligence. Microsoft expects that Athena will outperform the chips it presently purchases from other vendors, saving time and resources on its expensive AI research and development. AI is undoubtedly a defining technology of the future, and with this investment Microsoft is establishing itself as a significant player in the AI chip industry. It will be interesting to see how Athena performs on the market and how it challenges Nvidia’s dominance.

DataSagar (http://www.DataSagar.com)
The author of this blog post is a technology fellow, IT entrepreneur, and educator in Kathmandu, Nepal. With his keen interest in Data Science and Business Intelligence, he occasionally writes on a variety of topics on the DataSagar blog.