I spent ~$1500 (and a bit more than two months) building a compiler that translates TypeScript to native binaries. I think I'll be able to do a Show HN later this month. It's the best $1500 I've ever spent on anything related to software development or computers.
You're not addressing the idea in the post, but the person. What the author achieved personally is irrelevant. What's important is that a very large and important industry is being completely transformed.
To add: if I had done it without LLMs it would have taken me a year, and the result would have been less complete.
> Is a really weird way to say that you built a native compiler for typescript.
Well, the TypeScript team is rewriting their compiler in Go (landing in v7), and some people call that the native compiler. I think my statement is clearer? But then English is not my first language, so there's that.
Well, TBH maybe this is the wrong question to ask for some, but I have been wondering: where did those 250 billion tokens go? What tools/products/services came out of them?
This has been my biggest question through this whole AI craze. If AI is making everyone X% more productive, where is the proof? Shouldn't we expect to see new startups now able to compete with large enterprises? Shouldn't we be seeing new amazing apps? New features being added? Shouldn't bugs be a thing of the past? Shouldn't uptime be a solved problem by now?
I look around and everything seems to be... the same? Apart from the availability of these AI tools, what has meaningfully changed since 2020?
AI coding improved a lot over 2025. In early 2025 LLMs still struggled with counting; now they're capable of tool calling, so they can just use a calculator. Frankly, I'd say AI coding may as well not have existed before mid-2025. The output wasn't really that good. Sure, you could generate code, but you couldn't rely on a coding agent to make a 2-line edit to a 1,000-line file.
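Roughly, "tool calling" here means the model emits a structured request naming a tool and its arguments, and the host program executes it and feeds the result back. A minimal generic sketch (no particular vendor's API assumed):

    // Generic sketch of a tool-calling round trip, not any specific LLM API.
    type ToolCall = { name: string; args: Record<string, number> };

    const tools: Record<string, (args: Record<string, number>) => number> = {
      // The "calculator" the model can delegate arithmetic to.
      add: ({ a, b }) => a + b,
      multiply: ({ a, b }) => a * b,
    };

    function runToolCall(call: ToolCall): number {
      const tool = tools[call.name];
      if (!tool) throw new Error("unknown tool: " + call.name);
      return tool(call.args);
    }

    // If the model replies with {"name":"multiply","args":{"a":1200,"b":10}},
    // the host runs it and sends the numeric result back as the next message.
    console.log(runToolCall({ name: "multiply", args: { a: 1200, b: 10 } })); // 12000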
I don't doubt that they have improved a lot this year, but the same claims were being made last year as well. And the year before that. I still haven't seen anything that proves to me that people are truly that much more productive. They certainly _feel_ more productive, though.
Hell, the GP spent more than $50,000 this year on API calls alone and the results are... what again? Where is the innovation? Where are the tools that wouldn't have been possible to build pre-ChatGPT?
I'm constantly reminded of the Feynman quote: "The first principle is that you must not fool yourself, and you are the easiest person to fool."
"The University of Rhode Island based its report on its estimates that producing a medium-length, 1,000-token GPT-5 response can consume up to 40 watt-hours (Wh) of electricity, with an average just over 18.35 Wh, up from 2.12 Wh for GPT-4. This was higher than all other tested models, except for OpenAI's o3 (25.35 Wh) and Deepseek's R1 (20.90 Wh)."
These numbers don't pass a sanity check for me. With 4x300 W cards you can get a 1K-token DeepSeek R1 output in about 10 seconds. That's just 3.3 Wh, right? And that's before you even consider batching.
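Spelled out, with the same assumed figures (4x300 W cards, ~10 s of wall-clock time for a ~1K-token output):

    // Back-of-the-envelope energy for one ~1K-token output.
    const cards = 4;
    const wattsPerCard = 300;   // assumed board power per card
    const seconds = 10;         // assumed time for ~1K tokens
    const wattHours = (cards * wattsPerCard * seconds) / 3600;
    console.log(wattHours.toFixed(2)); // 3.33 Wh, before any batching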
That's very dishonest. Daily drinking-water intake is only a fraction of how people actually use water! Producing 1 kg of beef requires 15,000 litres, and if you put it that way (which is much more honest) it's not that bad. If you also took into account the other ways people use water, it'd be even less shocking.
I don't know about the other stats, which is why I won't comment on them. But it doesn't matter: your water-use stats are still manipulative and make your point much weaker.
Adjusted beef consumption: 4.5 million litres of water is enough to produce ~300 kg of beef. The US (the highest per-capita beef consumer) eats 23.3 kg per person per year, so that water could feed ~13 Americans (~30 Brits, ~43 Japanese) yummy delicious grass-fed beef for a year!
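The arithmetic behind that, as a rough sketch (the 15,000 L/kg figure is from the comment above; the UK and Japan per-capita numbers are the approximate ones implied by the "~30 Brits, ~43 Japanese" figures):

    // How many people's annual beef the same water "buys".
    const waterLitres = 4_500_000;
    const litresPerKgBeef = 15_000;               // figure cited above
    const kgBeef = waterLitres / litresPerKgBeef; // 300 kg

    // Approximate per-capita beef consumption, kg per person per year.
    const perCapita: Record<string, number> = { US: 23.3, UK: 10, Japan: 7 };
    for (const [country, kg] of Object.entries(perCapita)) {
      console.log(country, Math.round(kgBeef / kg)); // US ~13, UK ~30, Japan ~43
    }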
These numbers seem off. A single head of cattle may yield 300 kg of edible beef, with a hanging weight of roughly twice that. In what world does raising a single bovine consume 4.5 million liters of water?
Neither the cow nor the cow's feed retains much water; the water is merely delayed a little on its journey to the local watershed, and in vast parts of the US local rainfall is adequate for this purpose (pumped irrigation isn't required for the crops, and cattle can drink from a pond). Even if a cow drinks pumped well water, the majority of its nourishment will itself have been sustained by local natural rainfall.
A datacenter's use of water over any timescale can hardly be compared with a cow's.
> Neither the cow nor the cow's food retains much water
Isn't that true for datacenters too? The water they use doesn't disappear; one could even argue that cows permanently capture more water than datacenters do.
It's also more than hiring someone overseas, especially for just a few months. Honestly, it's more than most interns are paid for 3 months outside FAANG (considering housing is paid for there, etc.).
1. You put a lot of time into an intern or a junior too.
2. I didn't say 'paid', I said 'fully loaded [total] cost'. Their total cost goes far beyond the mere salary: the search process (all the interviews for all candidates), onboarding, HR, taxes, etc.
1. Idk, I didn't have to. I managed an intern a few months back and he just did everything we had planned out and written down, then started making his own additions on top.
2. Yeah I mentioned that also.
3. It's still more expensive than hiring a contractor, especially abroad, even all in.
Quick math on the environmental impact of this assuming 18.35Wh/1000 tokens:
Total energy: 4.73 GWh, equivalent of powering 450 average US homes annually
Carbon footprint: ~1,822 metric tons of CO2, equivalent of driving 4.56 million miles in a gas-powered car
Water consumption: 4.5 million litres, equal to the recommended daily water intake of 4,000 people for a full year
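For what it's worth, here's a sketch of the conversions those lines imply. The 4.73 GWh total is from above; the per-home, grid-carbon, per-mile, and water-per-kWh factors are round numbers I'm assuming, not figures from the thread:

    // Assumed round conversion factors: ~10,500 kWh per average US home per
    // year, ~385 g CO2 per kWh of US grid power, ~400 g CO2 per mile for a
    // gas car, ~0.95 L of water per kWh, 3 L/day recommended intake.
    const totalKWh = 4.73e9 / 1000;               // ~4.73 GWh stated above

    const homes = totalKWh / 10_500;              // ~450 US homes for a year
    const co2Tonnes = (totalKWh * 385) / 1e6;     // ~1,820 metric tons of CO2
    const carMiles = (co2Tonnes * 1e6) / 400;     // ~4.55 million gas-car miles
    const waterLitres = totalKWh * 0.95;          // ~4.5 million litres
    const people = waterLitres / (3 * 365);       // ~4,100 people's yearly intake

    console.log({ homes, co2Tonnes, carMiles, waterLitres, people });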
Yet they're on twitter bragging...
https://x.com/steipete/status/2004675874499842535