
Classic CPUs have never held a candle to GPUs on highly repetitive math calculations, and AI this year has shown the same gap. In other words, it isn't just graphics... https://www.spiceworks.com/it-security/identity-access-manag...
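
For a sense of why, here's a minimal CUDA sketch (illustrative only; the array size, names, and scale factor are made up, not from any cited workload). Each GPU thread handles one multiply, so a single kernel launch spreads the "repetitive math" across thousands of threads at once, where a CPU core would walk the same elements in a loop.

    // Illustrative sketch of a trivially data-parallel, repetitive math task.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Each GPU thread scales one element; many thousands run concurrently.
    __global__ void scale(const float *in, float *out, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) out[i] = in[i] * factor;
    }

    int main() {
        const int n = 1 << 20;              // ~1M elements (arbitrary)
        size_t bytes = n * sizeof(float);

        float *h_in = (float *)malloc(bytes), *h_out = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) h_in[i] = (float)i;

        // A single CPU core would do the same work one element at a time:
        // for (int i = 0; i < n; ++i) h_out[i] = h_in[i] * 2.0f;

        float *d_in, *d_out;
        cudaMalloc(&d_in, bytes);
        cudaMalloc(&d_out, bytes);
        cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

        // 256 threads per block; enough blocks to cover all n elements.
        scale<<<(n + 255) / 256, 256>>>(d_in, d_out, 2.0f, n);
        cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);

        printf("out[123] = %f\n", h_out[123]);

        cudaFree(d_in); cudaFree(d_out);
        free(h_in); free(h_out);
        return 0;
    }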


I assume there is some reason why the past factorizations weren't done with GPUs. It could just be the lack of a good implementation and too few people interested in the topic, but it could also be that the algorithm isn't well suited to GPUs.


CUDA only had its initial release in 2007 (compared to the mentioned crack in 2009), and I remember it being a fairly limited product at that point. GPUs were also much slower back then.



