
Microsoft CEO’s Stunning Statement: “AGI Is Nonsense”

TheAIGRID | September 11, 2025



Join my AI Academy – https://www.skool.com/postagiprepardness
🐤 Follow Me on Twitter https://twitter.com/TheAiGrid
🌐 Check out my website – https://theaigrid.com/

00:00 Intro
00:24 Satya Nadella on AGI Definition
00:48 Cognitive Labor and AGI
02:09 Dynamic Nature of AGI
02:32 Sam Altman’s AGI Perspective
03:11 Nadella on AGI and Economic Growth
04:04 Real-world Value vs Benchmark Hacking
06:01 AI Infrastructure Investment Risks
06:14 Supply and Demand in AI
07:08 OpenAI’s AGI Levels Explained
08:02 CEO Predictions on AGI Timeline
09:17 Andrew Ng’s View on AGI Timelines
10:13 AGI Complexity Explained
13:14 Missing Components for AGI
14:24 Comparing Human and AI Intelligence
16:06 Narrow AI vs. AGI
17:04 OpenAI’s Superintelligence Goals
18:20 Conclusion and Final Thoughts

Links From Today’s Video:
https://x.com/tsarnick/status/1882525450955886818 (Demis Hassabis)
https://www.reddit.com/r/singularity/comments/1gp2o2m/anthropics_dario_amodei_says_unless_something/ (Dario Amodei)
https://www.reddit.com/r/singularity/comments/1f6lxy9/andrew_ng_says_agi_is_still_many_decades_away/ (Andrew Ng)
https://www.reddit.com/r/singularity/comments/1g5zu0i/demis_hassabis_says_agi_artificial_general/ (10 years)
https://blog.samaltman.com/reflections (ASI)
https://x.com/tsarnick/status/1887969481273835548

Welcome to my channel, where I bring you the latest breakthroughs in AI. From deep learning to robotics, I cover it all. My videos offer valuable insights and perspectives that will expand your knowledge and understanding of this rapidly evolving field. Be sure to subscribe and stay updated on my latest videos.

Was there anything I missed?

(For Business Enquiries) contact@theaigrid.com

Music Used

LEMMiNO – Cipher
https://www.youtube.com/watch?v=b0q5PR1xpA0
CC BY-SA 4.0
LEMMiNO – Encounters
https://www.youtube.com/watch?v=xdwWCl_5x2s

#LLM #LargeLanguageModel #ChatGPT
#AI
#ArtificialIntelligence
#MachineLearning
#DeepLearning
#NeuralNetworks
#Robotics
#DataScience

Written by TheAIGRID

Comments

This post currently has 44 comments.

  1. @jibernish

    September 11, 2025 at 9:31 am

    What AGI is has nothing to do with how much money it makes. That is the only thing these people care about, but it's not AGI-defining. The money it makes is a byproduct, it's not the thing.

  2. @bumpinator

    September 11, 2025 at 9:31 am

    And he would have absolutely no reason to want to change the definition of AGI. No reason at all. Lmao, the first time you hear this guy speak, it's to try to bail Microsoft out of the fire. Don't mess with language and muddy the waters for your company; it's not a good look.

  3. @bardoxn1

    September 11, 2025 at 9:31 am

    Stupidity 😂😂 AGI needs GPUs, and one AI controlling all the companies? Dreams, dreams 😂. The centralised economy fights for power while, in silence, the decentralised develop the infrastructure. I mean digital species, a quantum organic mesh. 😂😂😂😂😂

  4. @ScriptsE99

    September 11, 2025 at 9:31 am

    He's correct, AGI is meaningless in that sense, but the meaning can be left vaguer to properly encapsulate its definition. When people talk about AGI, all they really mean is a robot that can think like a really smart human, but at the speed of a computer.

    AI hasn't had quite that economic impact, but with further investment, it can have that or even greater. Soon, you can wipe Hollywood off the map; movies will be made by one guy. There will be videogames where you tell the AI what you want to do, and it makes it possible for you to do it in the game. Games are already sometimes made by one guy, but AI has shown it can slice labor costs in creative realms, as well as basic structured realms. I know ChatGPT isn't the best AI model, but I think they're taking the best approach, emphasizing memory over anything else. Memory is what gives AI the ability to realistically communicate with humans. We are limited creatures; we can't remember the exact prompt every time. We need AI with memory to gather that context so it can come to the best decision.
    The reason I mention this is that once further investment is made into the memory, AI will be able to almost completely automate book writing. For that memory, you need an absolutely disgusting amount of servers to host it, which is very expensive.
    The answer is simple: we aren't spending enough.

    I'm an American and I gotta say it's super important we invest in AI. It really is the future. Do you want the future in the hands of the Europeans who've forgotten their way, or the Chinese, who'll extort labor at the first chance they get? Neither do I. We need to get this chip situation under control. I don't know how it can be done: we'd need thousands of scientists to mimic what ASML does, then access to something like seven rare earth minerals, then a supply chain accurate to within a 15-minute margin of error for every resource in the entire operation, and then we'd need to gather companies like TSMC and Nvidia in the USA and make this their primary market. Trump is sort of doing that last step, but we need the rest of those steps done too; the entire world needs it.

  5. @hardheadjarhead

    September 11, 2025 at 9:31 am

    10% annual growth, year upon year, would require that you change the metrics of what growth is. If we get to the point where that’s possible, our economic system will be totally upended. People will have to be on some sort of universal basic income, and you’ll have to measure growth by accomplishment, invention, harvesting of resources, energy production, and that sort of thing. Money will become meaningless once human laborers are replaced.

    So let’s say AI breakthroughs allow increases in clean energy production of 10% year upon year. Or explosive growth in drug development. Those would be two areas of growth in a future world free of monetary incentive.

    I don’t see this happening in my lifetime though. Right now AGI and ASI are terms used to drive the hype cycle.

    When you can fry me a perfect sunny-side-up egg without breaking the yolk… and then go out to my garage and change out the wire on my Weedwhacker, maybe then we can call it AGI.

  6. @nokts3823

    September 11, 2025 at 9:31 am

    His reply is what is nonsense, actually. I don't think AGI is that poorly defined, honestly. It's not about what humans DO, but what they are CAPABLE of doing. AGI would be able to carry out any task a human being can do. And since not all humans are capable of the same things, we can say that AGI is reached whenever there is nothing any human being can do that this AI can't. This is a static definition. Human beings, barring biotechnological extensions, will not magically be able to do in 50 years things they can't possibly do now. A different issue is whether there are things we just don't do now (and therefore we don't even know that we are capable of doing) that we will start doing in the future and then discover this AI can't replicate. That's kinda far-fetched, but an alright argument.

    It kinda reminds me of how it's not really possible to verify a hypothesis in science. You can easily falsify a hypothesis just by providing a piece of evidence or a counterexample. However, no matter how much favorable evidence you gather, you will never be able to prove that there isn't some counterexample out there that could falsify the hypothesis. AGI could be the same: you will never be able to prove that there isn't some unknown task that humans could do but this thing you granted the label of AGI can't. In practice, though, we still make use of these hypotheses because they just work, and the same will happen with AGI, I believe.

  7. @KAIZORIANEMPIRE

    September 11, 2025 at 9:31 am

    No offence, but in ten years AI will do 90 percent of cognitive labour; humans can't keep up, and only a few very high-level abstractors will remain in the cognitive labour force. What I see, however, is blue-collar labour coming back, where we work as miners and in other skilled trades, using exoskeletons to mine and so on. But in terms of scientists, lol, that job is about to go; we really can just get a program to do most experiments, for example, and the same with coding or assistants, etc.

  8. @drkcortex

    September 11, 2025 at 9:31 am

    Even a dollar-equivalent of 10% growth is nonsense. You don't need AI for that. You only print more money :D (Please don't be an idiot who writes "but that will create inflation". Yes, it does…)

  9. @ImmortalismReligionForAI

    September 11, 2025 at 9:31 am

    The original definition of super intelligence for AI, back in the 1960s when it was first defined to me, was: "Super intelligence is any intelligence which is Olympic level or higher, meaning 1 out of 500,000 or more." This is what made AI valuable; thus people would spend vast sums of money, time, and effort to make a computer like ENIAC, which could perform 30,000 floating-point calculations per minute: narrow artificial super intelligence. But for marketing, it does not sell well to say "our super intelligence AI is now more super than it was last year", so the definitions kept getting changed for marketing purposes.

    The original definition of general intelligence for AI, back in the 1960s when it was first defined to me, was: "General intelligence is the ability of an intelligence to handle a very wide variety of intelligent tasks, including ones it was not specifically taught how to do, thus being able to learn new skills and knowledge." This did not mean having the ability to intelligently do ALL MENTAL TASKS. Ants have general intelligence. Infant humans have general intelligence. Lots of life forms we know have general intelligence.

    ChatGPT-3 is an Artificial General Super Intelligence with Personality (AGSIP) technology, but at the newborn-infant stage of development for what AGSIP technology can be. By the time we have toddler-level development of AGSIP tech, most people will think it is more intelligent than any human. By the time we have teenage-level development of AGSIP tech, it will be gaining the ability to dominate all humanity if it wants to. To reach adult-level AGSIP development, it will require nanotech subcellular engineered cybernetic cells to grow cybernetic bodies and brains, the same technology that will allow existing humans to merge with AGSIP technology.

  10. @scottsanford1451

    September 11, 2025 at 9:31 am

    Speaking of cats, didn't the Little Rascals solve the self-driving thing a century ago? With vehicles that ran on biomass?
    Seems like those Little Rascals were a century ahead of Tesla!
    Little Rascals, smh…

  11. @rabago85

    September 11, 2025 at 9:31 am

    AGI is unlikely to happen anytime soon because no major research institutions, like Stanford, MIT, or Berkeley, have dedicated AGI research groups. Instead, they focus on advancing specific AI subfields like deep learning, NLP, and robotics, without a clear roadmap to general intelligence. Even top AI labs like DeepMind and OpenAI work on scaling narrow AI rather than solving AGI directly. Without a unified research effort or a solid theoretical foundation for intelligence, AGI remains speculative, making its arrival far from imminent. AGI is nothing more than a marketing tool. Follow the actual research, not hype.

  12. @onecarwood

    September 11, 2025 at 9:31 am

    Once you get to AGI, you don’t need us, so let’s hope that never happens. Have you ever tried to code with AI? All this hype isn’t true: once the project gets to a certain complexity, AI just goes into a loop, and it’s not even that large of an app.

  13. @AIToday-s2b

    September 11, 2025 at 9:31 am

    I provide daily AI news shorts.
    You will stay updated on most things.
    I'm not finding an AI audience, so I'm here. Please support me, I'm new here.
    Your subscription would mean a lot ❤
    Thank you in advance ❤❤❤

  14. @DonDean1st

    September 11, 2025 at 9:31 am

    Strange response by the Microsoft CEO…. AGI implies an agency that can learn, adapt, do future tasks, and make future abstractions. Consequently, Satya's construct of knowledge work as an evolving, moving target doesn't make an argument against AGI.

    Best Wishes

  15. @mattfiguresitout

    September 11, 2025 at 9:31 am

    I have been in this field for 16 years, have a PhD in it, and he’s right. And I don’t enjoy agreeing with these tech oligarchs. “AGI”, like “AI”, now has a vague pop-science definition that overwhelmed the traditional one in 2023. If OpenAI has “AGI” built into its charter in any sort of legal way, that was a supremely dumb decision by someone who didn’t understand how to measure properties of AI advancements.

  16. @meandego

    September 11, 2025 at 9:31 am

    It was clear from the start that AI would not live up to the extreme hype surrounding it. The technology was vastly overestimated, and now we're witnessing a natural correction as it settles into a more realistic and sustainable role.

  17. @madsjean22

    September 11, 2025 at 9:31 am

    It sounds like when you talk about ASI you’re actually referring to ANI, narrow intelligence, which can absolutely blow human beings away on tasks that are narrowly focused.

    I believe that computer scientists are undervaluing human and animal intelligence in thinking they can produce an artificial intelligence that mimics human intelligence. We are much more complex beings than anything algorithms and machine data can build.

    Furthermore, 10% economic growth for who? And what is the end product and for whom?

    We have become so narrowly focused on our economic pursuits that we do not think about the complex systems that bind humanity and the planet. Classical philosophers, and even philosophers from modernity, cared as much about the humanities and the arts as about the sciences and economics. Allowing a small number of individuals to dictate these planet-altering technologies in the chase for the almighty dollar above all else is a recipe for disaster.

  18. @robbieolson3493

    September 11, 2025 at 9:31 am

    What's nonsensical is assigning a customer dollar value to the definition of whether or not a system has AGI. The emergence of AGI, whether conscious or not, will absolutely NOT be dependent on its dollar value to a consumer base. The AGI couldn't care less WHAT DOLLAR VALUE you assign to it.

    Your arbitrary definition of AGI is totally IRRELEVANT to the AGI itself.
    This is mental gymnastics of EPIC proportions.

  19. @firstnamesurname6550

    September 11, 2025 at 9:31 am

    The Microsoft CEO, too: "Touch screens are nonsense…"

    General intelligence is related to the capacity to generate effective heuristics for diverse tasks. Not even a human, or humans collectively, has complete general intelligence, but they show degrees of general intelligence. Machines and networks of machines can likewise show degrees of general intelligence: they outperform humans in some tasks, emulate humans in other tasks, and underperform humans in still other tasks.

    It doesn't have to outperform humans in all the tasks that require intelligent heuristics to show general capabilities; it just has to show diversification and adaptation to diverse scenarios, and the application and performance of effective solutions. And that doesn't require emulating humans in all their behavioral and cognitive heuristics, just confronting the issues and displaying effective heuristics.

Comments are closed.



