Hacker News

I'm very curious whether they're going to ditch Google by providing on-device search. A monthly Common Crawl snapshot is under 100 terabytes, and if you strip it down to raw text, deduplicate it, and maybe drop what you don't care about, the dataset might already fit on my iPhone. They could do a lot without making a network call, reaching out to a server only for whatever the device doesn't have; many user queries might never need to leave the phone. In another couple of years, device storage will be even larger.
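A minimal sketch of the "clean and deduplicate" step, assuming exact-duplicate removal by content hash; real crawl pipelines typically also do near-duplicate detection (e.g. MinHash/simhash), which this deliberately skips:

```python
import hashlib

def normalize(text: str) -> str:
    """Collapse whitespace and lowercase so trivially different copies hash the same."""
    return " ".join(text.lower().split())

def dedupe(docs):
    """Keep the first occurrence of each distinct document, keyed by content hash."""
    seen = set()
    kept = []
    for doc in docs:
        digest = hashlib.sha256(normalize(doc).encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            kept.append(doc)
    return kept

# Toy corpus: two copies of the same page with different spacing/case.
pages = [
    "Apple unveils new iPhone storage tiers.",
    "apple unveils   new iPhone Storage tiers.",
    "A completely different article.",
]
print(len(dedupe(pages)))  # 2
```

Hashing normalized text is cheap enough to run over a full crawl on commodity hardware, which is why it's usually the first pass before anything fancier.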


I was hopeful for on-device AI too, but any AI processing so far sucks up the battery, heats up the phone, and, most importantly, isn't nearly good enough. Without a breakthrough in batteries, chips, or the models and algorithms themselves, the way forward is thin clients that connect to servers sitting next to a solar farm or a nuclear plant.


A 64GB Mac mini can run a local model at a speed that handles the majority of everyday user queries.

It won't rewrite a large codebase (although local coding models can handle small functions), but it can do a kid's homework or rewrite an email.
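The split both comments describe (handle the everyday stuff on-device, escalate the rest) amounts to a query router. A toy sketch, where `looks_simple`, `run_local`, and `run_remote` are all hypothetical stand-ins; a real system would presumably use a learned classifier rather than keyword matching:

```python
def looks_simple(query: str) -> bool:
    """Crude stand-in for a complexity classifier: short, everyday tasks stay local."""
    local_tasks = ("rewrite", "summarize", "homework", "email", "define")
    return len(query) < 200 and any(word in query.lower() for word in local_tasks)

def route(query: str, run_local, run_remote):
    """Answer on-device when the query looks simple, otherwise escalate to a server."""
    return run_local(query) if looks_simple(query) else run_remote(query)

# Hypothetical backends, for illustration only.
answer = route(
    "rewrite this email to sound more polite",
    run_local=lambda q: f"[local] {q}",
    run_remote=lambda q: f"[server] {q}",
)
print(answer)  # [local] rewrite this email to sound more polite
```

The interesting design question is where to put the threshold: every query kept local saves battery on the network radio but spends it on inference, which is exactly the trade-off the parent comment is worried about.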


This doesn't get anyone a bonus or a bigger boat, so it won't happen, technical challenges aside.



