And this is why AI coding will eventually degrade into a mess. Enjoy it while it lasts.
AI eats up the users who care about $company, the maker of a library; the library degrades because nobody is paying; $company goes insolvent; the library goes unmaintained and eventually defunct; and AI still tries to use it.
Vibe coding with libraries is a fad that is destined to die.
Vibe coding your own libraries will result in million line codebases nobody understands.
Nothing about either is sustainable, it’s all optics and optics will come crashing down eventually.
AI is destined to destroy the software industry, but not itself.
Software does not decay by itself (it's literally the whole point of using digital media over analog). Libraries do not "degrade". "Bit rot" is an illusion, a fictitious force like centrifugal force in Newtonian dynamics, representing changes that happen not to a program, but to everything else around it.
The current degree of churn in the webshit ecosystem (whose anti-patterns are increasingly seeping into and infecting other software ecosystems) is not a natural state of things. Reducing churn won't kill existing software - on the contrary, it'll just let it continue to work without changes.
You’re mostly right, libraries thrive by adapting to their surroundings. Mostly.
But after just months of being unmaintained, even the best libraries start to rot away due to bugs and vulnerabilities going unfixed. Users, AI included, will start applying workarounds and mitigations, and the rot spreads to the applications (or libraries) they maintain.
Unmaintained software is entropy, and entropy is infectious. Eventually, entire ecosystems will succumb to it, even if some life forms continue living in the hazardous wasteland.
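To make the "workarounds" point concrete, a hypothetical sketch (the unmaintained library and its bug are invented for illustration): once a fix stops landing upstream, every downstream caller grows a shim, and the shim outlives the bug.

```python
# Hypothetical: suppose unmaintained `oldlib.load()` crashes on blank
# lines and was never fixed. Callers pre-filter input instead -- a
# mitigation that spreads the rot into every application that uses it.

def parse_config(text):
    # Workaround: strip blank lines before parsing, because the
    # (imaginary) upstream parser chokes on them and nobody fixes it.
    lines = [ln for ln in text.splitlines() if ln.strip()]
    return dict(ln.split("=", 1) for ln in lines)

print(parse_config("a=1\n\nb=2"))  # → {'a': '1', 'b': '2'}
```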
I struggle to fully grasp everything you postulate. Please help me understand.
Your original point was that libraries do not need companies behind them. From what you have written here, one reason for that is that (web) libraries mostly create churn by introducing constant changes. What I think you conclude from that is that those libraries aren't necessary, and that "freezing" everything would do no harm to the state of web development, and would do good by reducing the churn of constantly updating to the newest state.
What I struggle to understand is (1) how does AI fit into this? And (2) why do you think there is so much development happening in that space, creating all the churn you mention? At this point in time, all of this development is still mostly done by humans who are likely paid for what they do. Who pays them, and why?
Bit rot isn’t some mystical decay; it’s dependency drift: APIs change, platforms evolve, security assumptions expire, build chains break. Software survives because people continuously adapt it to a moving substrate.
Reducing churn is good. Pretending maintenance disappears is fantasy. Software doesn’t decay in isolation, it decays relative to everything it depends on. And it sounds like you don’t know anything about Newtonian dynamics either.
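What "reducing churn" looks like in practice is pinning: a sketch of a fully pinned dependency file, assuming a Python project installed with pip (the packages and exact versions are just examples), so the build stays reproducible while the ecosystem moves around it.

```text
# requirements.txt -- every dependency pinned to an exact version,
# so upstream releases cannot silently change what gets built
requests==2.31.0
urllib3==2.0.7
```

Pinning doesn't stop the substrate from moving, but it makes the drift explicit: upgrades become deliberate decisions instead of background noise.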
That is one take, certainly possible and certainly bleak, but I think people create libraries for different reasons.
There are people who will use AI (out of their own pocket, for trivial costs) to build a library and maintain it simply out of passion, ego, and perhaps a bit of technical clout.
That's the same with OSS libraries in general. Some are maintained at cost, others are run like a business where the founders try to break even.
It's just interesting because most of the talk is programmers worrying about AI taking their jobs by replacing them, not taking their jobs by taking revenue away from the business.
Reminds me of the problem with Google and their rich results, which wiped out (and continue to wipe out) blogs that rely on people actually visiting their sites vs. getting the information they seek without leaving Google.
I expect a lot of business disruption because of AI. Agree it's not the same as employee replacement, but it adds to the sort of fog of war around what effect AI is really having.
Anything open source will be turned against its authors and against ICs.
We thought it would give us freedom, but all of the advantage will accrue to the hyperscalers.
If we don't build open source infra that is owned by everyone, we'll be owned by industrial giants and left with a thin crust that is barely ours. (This seems like such a far-fetched "Kumbaya, My Lord" type of wishful thinking, that it's a joke that I'm even suggesting this is possible.)
Tech is about to cease being ours.
I really like AI models, but I hate monopolies. Especially ones that treat us like cattle and depopulate the last vestiges of ownership and public commons.
It's a real shame no one warned us this would happen when a bunch of corporatists and opportunists wrested the term "open source" from the advocates of true freedom in the late '90s.
Also the FSF squandered its opportunity being RMS’ hobby / support organization and skipped a lot of important discussions, even before the skeevy behavior they’d been ignoring came to light. I used to donate in the 90s but … really feels like that was just flushing cash.
ChatGPT came into the picture long after the open source issues we’re talking about were apparent. AI companies are making it even worse but solid advocacy in the 2010s or 2000s would’ve been helpful.
I'm just not sure how to connect this rhetoric to the facts of the source link, where a hobbyist attempted to extend some source-available code to support a new technology, and the CEO of the for-profit company who owns the license said he's not allowed to for business reasons.
You can be and I am sympathetic towards the CEO! I wouldn't accept a PR for cannibalize_my_revenue.txt either. But if we insist on analyzing the issue according to the categories you're describing, it seems undeniable that the CEO is a corporatist, and that he put an unfree license on his repository to stop people from freely modifying or redistributing it.
There were more or less two original spheres of OSS. There were the academics, who were too "pure" and holier-than-thou for everyone else, and then there was commercial FOSS, which open-sourced because something had already reached its reasonable lifetime potential and it was cool to give away the plans to a cult classic and let it live on in some other mostly permanent, mostly finished form. When OSS becomes a mindless pattern, an absolute prerequisite to investment, and/or ceases to be released without regret, resentment, and/or strings attached, then it's not cool anymore and becomes toxic.
There's no such thing. Even if on paper "everyone" has an ownership share, in practice it's going to be a relatively small number of people who actually exercise all the functions of ownership. The idea that "everyone" can somehow collectively "own" anything is a pipe dream. Ownership in practice is control--whoever controls it owns it. "Everyone" can't control anything.
> I really like AI models, but I hate monopolies. Especially ones that treat us like cattle and depopulate the last vestiges of ownership and public commons.
I would dispute whether the tech giants are "monopolies", since there's still competition between them, but that's a minor point. I agree with you that they treat individual coders like cattle--but that's because they can: because, from their standpoint, individual coders are commodities. And if automated tools, including AI models, are cheaper commodities that, from their standpoint, can do the same job, that's what they'll use. And if the end result is that whatever they're selling as end products becomes cheaper for the same functionality, then economically speaking, that's an improvement--we as coders might not like it, but we as customers are better off because things we want are cheaper.
So I'm not sure it's a consistent position to "really like AI models" but also not want the tech giants to treat you like cattle. The two things go together.
> we as customers are better off because things we want are cheaper
Why privilege that side of the equation over "we as workers"? Being a customer isn't all there is to life. I happen to spend quite a bit more time working than shopping.
It's not a matter of "privilege". It's simple economics: if the same functionality can be provided more cheaply, that's a gain to everyone. The gain to customers is the most obvious gain, and it's what I focused on in my previous post--but it's also a gain to producers, because it frees up resources to produce other things of value. But the producers have to be willing to change how they make use of resources in order to take advantage of those opportunities.
> I happen to spend quite a bit more time working than shopping.
Then you should be a lot more worried about AI providing the same functionality you were providing as a coder, but more cheaply--because that makes you, or at least you as a coder providing that functionality, a commodity that's no longer worth its cost. So if you want to avoid being commoditized and treated like cattle, you have to change what you produce to something that AI can't do more cheaply than you can.
Stop enabling corporations' theft and exploitation.
Don't FOSS by default; unionize, embrace solidarity, and form worker-owned co-ops that aren't run by craven/unrealistic/non-business founders if you want any sort of stability.
IMO, the only ethical and legal way to build LLMs on the entire output of all human creativity, that still respects rights and won't lead to feudalism, is conforming to the actual legal requirements of fair use that are being ignored.
According to fair use doctrine, research models would be okay. Models used in education would be okay. Models used for public betterment by the government would be okay, etc.
Pie in the sky version would be that models, their output and the infrastructure they run on would be held in a public trust for everyone's benefit. They wouldn't exist without consuming all of the public's intellectual and creative labor and property, therefore they should belong to the public, for the public.
> Tech is about to cease being ours.
On the hardware side, it's bad, as well. Remote attestation is here, and the frog is just about boiled when it comes to the idea of a somewhat open and compatible PC as the platform for general computing.
It was kinda cool while it lasted, and I'm glad I got to see the early internet, but it wasn't worth signing my great-grandchildren away to be peasants or to belong to some rich kid's harem.
It does give us freedom. In fact, it arguably gives more people freedom, as non-programmers can now create simple tools to help themselves. I really don't see any way that it reduces our freedom.
They commoditized the complement to their hardware/infra, that being software. Good for them; the value of tech will shift to whatever remains relatively scarce.
Sucks that any time you ask AI to generate a site for you, Tailwind will have an impact on it.