I recently picked up a Threadripper 3960X, 256GB of DDR4, and an RTX 2080 Ti (11GB), running Debian 13 with Open WebUI and ollama.
It runs well, not much different from Claude etc., but I'm still learning the ropes and how to get the best out of it and local LLMs in general. Having tonnes of memory is nice for switching between models in ollama quickly, since everything stays in the page cache.
The GPU memory is the weak point, though, so I'm mostly using models up to ~18B parameters that can fit in the VRAM.
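The arithmetic I use to sanity-check what fits is rough: weights at the quantised bit-width plus a flat allowance for KV cache and CUDA buffers. A minimal sketch, assuming ~4.5 bits/weight for a Q4_K_M-style quant and 1GB of overhead (both just ballpark figures, not measured):

    # Rough VRAM estimate for a quantised model. The bit-width and
    # overhead are ballpark assumptions, not measured numbers.
    def vram_gb(params_b: float, bits_per_weight: float = 4.5,
                overhead_gb: float = 1.0) -> float:
        """Weights at the quantised bit-width plus a flat allowance
        for KV cache and runtime buffers."""
        weights_gb = params_b * 1e9 * bits_per_weight / 8 / 1e9
        return weights_gb + overhead_gb

    # An 18B model at ~4.5 bits/weight lands right at the edge of
    # an 11GB card once the KV cache is counted.
    print(f"{vram_gb(18):.1f} GB")  # ~11.1 GB

Which is why 18B is about where my card tops out; a longer context window eats into that overhead allowance fast.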
As a neurodiverse British person I tend to communicate more directly than the average English speaker, and I find LLMs' manner of speech very off-putting and insincere, which in some cases it literally is. I'd be glad to find a switch that made them talk more like I do, but they'd probably consider that too robotic :/
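With a local setup there sort of is a switch: a blunt system prompt gets you part of the way. A minimal sketch against ollama's /api/chat endpoint (assuming a local instance on the default port and a model you've already pulled; the prompt wording is just my guess at "direct"):

    import requests

    # A no-pleasantries system prompt; adjust to taste.
    SYSTEM = ("Be terse and literal. No compliments, no apologies, "
              "no filler. Answer the question and stop.")

    def ask(prompt: str, model: str = "llama3") -> str:
        # ollama's chat endpoint; stream=False returns one JSON blob.
        r = requests.post(
            "http://localhost:11434/api/chat",
            json={
                "model": model,
                "stream": False,
                "messages": [
                    {"role": "system", "content": SYSTEM},
                    {"role": "user", "content": prompt},
                ],
            },
            timeout=120,
        )
        r.raise_for_status()
        return r.json()["message"]["content"]

    print(ask("Why is the sky blue?"))

It doesn't fully cure the sycophancy, in my experience, but it tones it down a lot.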
I had a dog, then kids, and now we've just got a puppy, and I think there's perhaps some truth to it: dogs are certainly much, much lower effort/stress/cost but provide a good amount of companionship. It wasn't enough for us, obviously, but we also have minimal family connections outside our household; for others the equation may add up such that a dog or cat is enough, and if it does, more power to you.
I used TiddlyWiki for years and never stopped liking it, but eventually migrated to Logseq, which fit my note-taking style better. It looks like things have moved on since then, though, so I'll have to catch up on what's happening with it!
I've only seen this happen when the plug was fitted badly (pinched or damaged wires inside the plug) or someone used a nail in place of the fuse. People do stupid things like that all the time, but it's not the fault of the standard; a fuse should blow if it's run over-current for too long.