Microsoft's AI Chip Strategy Reduces Costs and Nvidia Dependence
Microsoft’s own AI chip, codenamed Project Athena, could help cut its ballooning AI costs by up to a third. With ChatGPT costing OpenAI an estimated $700,000 per day to run, computing expenses are top of mind for many AI companies.
A Microsoft Azure data center. Photo credit: Microsoft.
Microsoft is readying its custom AI chip, codenamed Athena, to reduce reliance on Nvidia and cut machine learning costs.
As platforms like ChatGPT cost more than $700,000 per day to run, reducing costs and securing additional computing resources is critical.
Microsoft’s own chips could enable it to pursue an increasingly aggressive AI strategy as Google plays catch-up.
April 18, 2023
As the competition for AI supremacy intensifies, Microsoft is reportedly developing its own AI chips to reduce dependence on Nvidia's products, aiming for a full-scale launch by 2024.
Nvidia currently produces the A100 and H100 GPUs, which are extensively utilized for training AI models. These GPUs are purchased in bulk by major cloud companies, including Microsoft’s own Azure division, and then rented out to AI customers. With demand high and capacity limited, others in the AI space are scrambling to acquire their own chips for their data centers to ensure capacity.
The explosion of AI products has created a sharp demand for computing resources. Industry analysts estimate that OpenAI may require an additional 30,000 A100 GPUs to support ChatGPT's commercialization efforts this year. Recent reports show that Twitter CEO Elon Musk purchased 10,000 GPUs for his own generative AI project. Nvidia’s H100 GPU retails for over $30,000 per unit, but due to high demand, they are selling for over $40,000 each on eBay.
Microsoft's chip initiative, codenamed Project Athena, is reportedly progressing rapidly, with prototypes already available to a select group of employees for testing. If successful, Microsoft aims to implement Athena broadly within the company by early next year. By developing its own AI chip, Microsoft would join the ranks of Google, Meta, Amazon, and Apple as one of the few companies creating custom chips for AI development.
Dylan Patel, Chief Analyst at research firm SemiAnalysis, estimates that ChatGPT currently costs $700,000 per day to operate, or 36 cents per query. Microsoft's custom chips could potentially reduce operating costs by a third or more, while also supporting the continued large-scale deployment of AI-powered features in Bing, Office, and GitHub products. Microsoft is currently pursuing an aggressive AI strategy, which has prompted Google to declare a "code red" as they scramble to release their own AI products.
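A quick back-of-the-envelope calculation shows what those figures imply. This sketch takes the SemiAnalysis estimates quoted above ($700,000 per day, 36 cents per query) and the article's hypothetical one-third savings at face value; all numbers are estimates, not confirmed costs.

```python
# Back-of-the-envelope check of the cost figures cited above.
# Inputs are estimates from the article, not confirmed figures.
DAILY_COST = 700_000       # USD/day to run ChatGPT (SemiAnalysis estimate)
COST_PER_QUERY = 0.36      # USD per query (SemiAnalysis estimate)
SAVINGS_FRACTION = 1 / 3   # hypothetical reduction from custom chips

queries_per_day = DAILY_COST / COST_PER_QUERY
daily_savings = DAILY_COST * SAVINGS_FRACTION
annual_savings = daily_savings * 365

print(f"Implied queries/day: {queries_per_day:,.0f}")  # ~1.9 million
print(f"Daily savings:       ${daily_savings:,.0f}")   # ~$233,333
print(f"Annual savings:      ${annual_savings:,.0f}")  # ~$85 million
```

At those rates, even a one-third reduction would save Microsoft on the order of $85 million per year on ChatGPT-scale workloads alone, before counting Bing, Office, and GitHub features.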
Industry insiders don't expect Microsoft to completely replace its Nvidia GPUs, but rather to diminish its overall dependence. If successful, Project Athena could provide Microsoft with additional leverage in future negotiations with Nvidia.
Initiated in 2019, Project Athena has reached a crucial milestone at an opportune time for Microsoft. As various groups vie for access to computing power, Microsoft has had to ration resources for its internal users. External customers are also encountering difficulties in securing sufficient cloud computing resources for their AI projects.
And as Microsoft seeks to reclaim market share on the search engine side, it’s no doubt mindful that operating costs could explode: AI-powered search is significantly more expensive than traditional search engine queries. “From now on, the [gross margin] of search is going to drop forever,” CEO Satya Nadella told the Financial Times.