
Elon Musk Sounds The Alarm: Is Microsoft Building A Real-Life Skynet?


Elon Musk has sounded the alarm on Microsoft and its artificial intelligence plans.

Microsoft’s new Maia A.I. chip is set to be the cornerstone of the company’s A.I. infrastructure, which aims to bring artificial general intelligence to “every facet” of our lives.

Artificial general intelligence (AGI) differs from the isolated A.I. applications and language models we have all seen and heard about in that it represents general-purpose intelligence rather than a single narrow capability.

ChatGPT and Midjourney are cool, novel applications, but they only function within a specific domain. An AGI is more like the science fiction computers we have seen in the Terminator franchise.

Of course, lovers of liberty have never been fans of Microsoft or its founder Bill Gates, who has openly expressed misanthropy and zeal for ‘depopulation,’ so Microsoft building the AGI infrastructure of the future doesn’t exactly excite the liberty crowd.

Elon Musk, never one to want less liberty, took the chance to call out Microsoft’s purported AGI goals:

Scott Guthrie, Executive Vice President of Microsoft’s Cloud + AI Group, recently stated:

“Microsoft has reimagined our infrastructure with an end-to-end systems approach to meet our customers’ unique AI and cloud needs.

With the launch of our new AI Accelerator, Azure Maia, and cloud native CPU, Azure Cobalt, alongside our continued partnerships with silicon providers, we can now provide even more choice and performance.”

Microsoft recently announced:

Today at Microsoft Ignite the company unveiled two custom-designed chips and integrated systems that resulted from that journey: the Microsoft Azure Maia AI Accelerator, optimized for artificial intelligence (AI) tasks and generative AI, and the Microsoft Azure Cobalt CPU, an Arm-based processor tailored to run general purpose compute workloads on the Microsoft Cloud.

The chips represent a last puzzle piece for Microsoft to deliver infrastructure systems – which include everything from silicon choices, software and servers to racks and cooling systems – that have been designed from top to bottom and can be optimized with internal and customer workloads in mind.

The chips will start to roll out early next year to Microsoft’s datacenters, initially powering the company’s services such as Microsoft Copilot or Azure OpenAI Service.

Chairman and CEO of Microsoft Satya Nadella likewise stated:

“We remain committed to our partnership with OpenAI and have confidence in our product roadmap, our ability to continue to innovate with everything we announced at Microsoft Ignite, and in continuing to support our customers and partners.

We look forward to getting to know Emmett Shear and OAI’s new leadership team and working with them. And we’re extremely excited to share the news that Sam Altman and Greg Brockman, together with colleagues, will be joining Microsoft to lead a new advanced AI research team.

We look forward to moving quickly to provide them with the resources needed for their success.”

Cointelegraph added:

The AI company OpenAI, which is backed by Microsoft, is said to have provided feedback on the new Maia 100 AI Accelerator and how its own workloads run on top of the new infrastructure.

Sam Altman, CEO of OpenAI, said that these new chips will help make their AI models more “capable” and “cheaper” for users.
