Hacker News

Yes, if Commodore wanted to compete, the Amiga chips needed to keep up with Moore's Law: every 18 to 24 months, processing and display power should have doubled or better. By 1988, the Amiga 500 and 2000 should have shipped with something like AGA.

The article correctly points out that the 68k chips fell behind the x86 series. Yes, chunky pixels are useful, and the display hardware should have supported them. But if a game requires the CPU to draw each pixel one by one, the hardware has already failed. Instead, specialized processors should do that work. In fact, that's what happened in PCs: doing all of the drawing with the CPU lasted only about five to seven years, and then everyone had GPUs.
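To make the chunky-vs-planar point concrete: the Amiga stored each bit of a pixel's color index in a separate bitplane, so software that rendered chunky (one byte per pixel, as Doom-style engines did) had to do a per-pixel, per-plane bit shuffle before display. A minimal sketch of that conversion (illustrative Python, not Amiga code; the function name and layout are my own):

```python
def chunky_to_planar(chunky, width, height, depth):
    """Convert a chunky 8-bit-per-pixel buffer into `depth` bitplanes.

    Each plane holds one bit of every pixel's color index, packed
    8 pixels per byte, MSB first -- the Amiga's native layout.
    """
    planes = [bytearray(width * height // 8) for _ in range(depth)]
    for i, pix in enumerate(chunky):
        byte_index = i // 8
        bit = 7 - (i % 8)          # leftmost pixel goes in the high bit
        for p in range(depth):
            if (pix >> p) & 1:     # extract bit p of the color index
                planes[p][byte_index] |= 1 << bit
    return planes

# Tiny 8x1 example: pixels with color indices 1, 0, 3, 0, ...
planes = chunky_to_planar(bytes([1, 0, 3, 0, 0, 0, 0, 0]), 8, 1, 2)
```

Note the inner loop runs once per pixel per bitplane; at 5 or 8 planes that multiplies the per-pixel cost, which is exactly the work a chunky display mode (or dedicated conversion hardware) would have eliminated.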



Yes! What killed the Amiga was that Commodore effectively stopped R&D on the chips in the mid-1980s, and by the time they restarted, they had already lost too much time.

(It's possible to argue Steve Jobs was right when he dismissed the Amiga as too much hardware: he knew it would be difficult to keep evolving such an architecture. It's also possible he was wrong, because he didn't account for Commodore's chip design and manufacturing capabilities.)

In any case, by 1992 there were Macs capable of 24-bit color, and the 68040 could certainly push enough pixels to run Doom, Marathon, or Duke 3D without hardware acceleration.
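The "pushing enough pixels" claim is easy to sanity-check with rough arithmetic. Assuming a Doom-style framebuffer (320x200, 8 bits per pixel, 35 fps cap; the frame rate is the commonly cited Doom tick rate), the required memory bandwidth is modest:

```python
# Back-of-the-envelope framebuffer bandwidth (assumed Doom-style mode).
width, height, fps = 320, 200, 35
bytes_per_frame = width * height       # 1 byte per pixel at 8 bpp
mb_per_sec = bytes_per_frame * fps / 1e6

print(bytes_per_frame)  # 64000 bytes per frame
print(mb_per_sec)       # 2.24 MB/s to refresh every frame
```

A couple of megabytes per second is well within what an early-'90s 68040 could write to a chunky framebuffer; the rendering math, not the pixel copying, was the hard part.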



