Yes I think there is a bubble. I think AI may become extremely impressive but still be limited.
A lot of tasks require context to produce a correct solution. AI can whip up algorithms in a vacuum, but it doesn't know about the custom data format you have to import. It doesn't understand how to map that to the target schema and will blindly import in a way that is subtly wrong. It doesn't have the context provided by chit-chat and vague statements made in an email.
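To make that concrete, here's a minimal sketch of the kind of subtly wrong import I mean (the format and field names are made up for illustration):

    // Hypothetical import: the vendor's file uses DD/MM/YYYY dates, but
    // nothing in the data says so -- you only know from an old email thread.
    interface TargetRecord {
      customerId: string;
      orderDate: Date;
    }

    function importRow(row: { id: string; date: string }): TargetRecord {
      const [a, b, year] = row.date.split("/").map(Number);
      // An AI working in a vacuum plausibly assumes the more common
      // MM/DD/YYYY, so "04/05/2025" imports as April 5 instead of May 4.
      // Nothing throws; the rows just silently carry the wrong dates.
      return {
        customerId: row.id,
        orderDate: new Date(year, a - 1, b), // correct would be new Date(year, b - 1, a)
      };
    }

Every row loads, the dashboard renders, and the bug only surfaces weeks later when someone notices the monthly totals look off.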
No matter how impressively AI works in a vacuum, the context-heavy problems are going to be an issue. Someone mentioned "last mile"; that's where things fall apart. Same with self-driving: it is impressive at first, until a road is blocked and there are some poorly marked detour signs routing you through non-standard paths.
Same with a plumber. You might make a robot that can fix many plumbing problems, but there are always custom nooks and crannies you have to contort yourself into, and a pipe you have to saw off in a very specific way that only tons of context would let you even know where to begin.
So until AI can consume context the way humans can, it's going to be limited to "autocompletion on steroids". Which is valuable, but not the end of human developers. Only time will tell.
Of course it's a bubble. The question is, who will survive it popping?
Just because it's a bubble doesn't mean there's no there there at all. It means there are too many players trying to be among the handful of survivors, and most of them will fall by the wayside. The winners of the dotcom boom are among the largest corporations on the planet now.
Now... I'm somewhat skeptical of the AI technologies. I strongly suspect that they will not lead to AGI, and that the utility of even the best "AI"s will be limited. So I don't think there's an Amazon forthcoming. But there could well be some "unicorns" (billion-dollar corporations), and that will make the winners quite comfortable.
Plus, of course, I could well be wrong. Maybe it's gonna be AGI, and that would be the biggest thing ever. Like, a quadrillion-dollar company. I don't think so, but aren't you at least a little tempted to buy a ticket to a 16-digit lottery?
AI is actually more revolutionary than people believe: it can do repetitive human labour 10-100 times faster and without the societal issues, so in business terms it makes companies more productive, allowing the economy to grow.
But instead of natural progression, forces in the industry have been hyping up AI for their own benefit. That will of course have consequences in the long term. I don't think there will be a burst in the next 5 years, but I'd say the chances increase sharply over the next two years, if certain conditions are met.
Disclosure: I'm not an economist; this observation is based on my understanding, as a developer and founder, of how AI is propagating.
Does it matter?
Personally I love it when I see people dismiss AI as a “stochastic parrot” or whatever other snarky phrase is currently popular on HN.
I just keep getting tons of value out of AI and am more productive than I ever have been in my life. If the competition wants to shoot themselves in the foot, I’m not going to wrestle the gun out of their hands.
Does this value translate to anything real, like tons of money or tons of free time?
Not sure how to answer that because I’ve always had enough money and enough free time since I was 25 or so. I’m 40ish now and that’s not because I earn a ton but because I don’t need a ton and don’t dream about having a ton of money.
I’ve been using ChatGPT for about the past 18 months to write scripts in minutes that would’ve taken me hours before (my role is not programming, I just learned how to do it because it was interesting and helpful).
I try to set aside most Friday mornings (7-9AM) to automate tasks and work on scripting projects for the agency I work at. After 9AM I stop working unless I want to do something else. Without ChatGPT, I’d be getting less done even if I worked until 12 or 1PM.
Is that a ton? It doesn’t feel like it to me but I work 30ish hours most weeks.
In my opinion, the real value is that in the past, I was limited by my programming skill but am now limited by my creativity and knowledge of how our business works.
It’s fun and interesting and I get more time where I can think more about the underlying business and spend almost no time learning how to do something technically.
I’ve made quite a few things that have cut hours or days off of different projects or processes done by other team members I work with. I’m sure they don’t mind having less to do.
I think about hiring someone to take this kind of work on for us and am sure I wouldn’t hire a trained programmer to do it. It just needs to be someone smart and curious regardless of background.
Thanks for the answer, this makes a lot of sense. I'm approaching AI from the "deeper" dev perspective, and it doesn't shine as much there. I spend basically the same amount of time but get more annoyed. Although for getting into the unknown it's the best (mostly because search sucks like hell, but still).
That said, in my experience the layer of people like you (if I understand correctly) who can code but aren't full-time repeatable-tasks developers is pretty thin. Maybe AI can add more people to it? One of my professional goals has always been bringing more humans into basic ad-hoc programming, but the "industry" was running away from that like hell. You are a perfect model of my idea; glad you finally found the way. https://news.ycombinator.com/item?id=42810175
From the article in that link:
> It’s highly unlikely that software developers are going away any time soon. The job is definitely going to change, but I think there are going to be even more opportunities for software developers to make a comfortable living making cool stuff.
And then what you said:
> I think this effect will be even greater this time (last time being higher-level “slow” languages like python and js), because AI will allow for a new wave of developers who won’t care about the “right” code that much and will perceive it as a disposable resource rather than a form of art
I agree.
The problem up until now is that Learn to Code has always been a massive chasm to cross for most people. Doing anything, let alone anything useful, felt like it was miles away for most people just getting started.
I’m at a digital marketing agency and recently started our new hires on an AI 101 course I made that introduces them to how we use AI and various scripting/automation tools, and just challenges them to make something useful and show it to me.
We’re not really using ChatGPT to summarize an email or something like that. I mean being able to chain together Google Apps Script, the Asana API, Zapier, and something else to automate and simplify one of the repetitive tasks you do every day, and using ChatGPT to generate the code for each step of the chain (a rough sketch of one such step is below).
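For example, here's a rough sketch of what one of those ChatGPT-generated steps tends to look like: an Apps Script function that creates an Asana task over the REST API. The token and project GID are placeholders, and the exact shape varies per task:

    // Placeholder credentials -- in practice these live in Script Properties.
    const ASANA_TOKEN = "YOUR_PERSONAL_ACCESS_TOKEN";
    const PROJECT_GID = "1234567890";

    // Create an Asana task; a Zapier webhook or a time-driven
    // trigger calls this as one link in the chain.
    function createAsanaTask(taskName: string) {
      const response = UrlFetchApp.fetch("https://app.asana.com/api/1.0/tasks", {
        method: "post",
        contentType: "application/json",
        headers: { Authorization: "Bearer " + ASANA_TOKEN },
        payload: JSON.stringify({ data: { name: taskName, projects: [PROJECT_GID] } }),
      });
      Logger.log(response.getContentText());
    }

None of the new hires could write that from scratch, but they can read it, paste it in, and wire it to the next step.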
It’s been interesting to see how nearly all of them just go with it and don’t have any doubt or hesitation. But I think if you gave those same people a “learn to code” course, they’d think, “This isn’t something I’ll be good at.”
Ask HN: Could AI be a dot com sized bubble?
159 points|jameslk|8 months ago|130 comments
https://news.ycombinator.com/item?id=40739431
Ask HN: Is commoditization of AI finally going to burst the AI bubble/hype?
16 points|behnamoh|7 months ago|13 comments
https://news.ycombinator.com/item?id=41134422
Ask HN: When will the AI bubble burst?
14 points|roschdal|10 months ago|25 comments
https://news.ycombinator.com/item?id=40259289
Ask HN: Are we in an AI / ML bubble?
10 points|orbOfOrthanc|5 years ago|8 comments
https://news.ycombinator.com/item?id=21737972
Right, but it’s Feb 2025.
Fair enough, AI moves fast, but little has changed besides DeepSeek and the reasoning models. AI agents hit an inflection point around November, but people were already betting on them anyway.
I don't consider GPT-4.5 or Claude 3.7 major enough, and based on the versioning, their creators don't consider them major either.
TL;DR Yes, because the current AI has the same last-mile problem as the tech around self-driving cars.
We'll see if there is more financial loss due to bad AI, similar to self-driving fatalities. It's a calculated risk.
We've seen companies held liable for AI chatbot statements and lawyers penalized for confabulated case law.