Can't speak specifically for this site, but these days many prove-you're-human tests have been added because of overzealous AI scraping consuming server resources to an unreasonable degree.
What kind of blog gets flooded by, what, 10-100 req/s at most? It seems somewhere along the line we forgot how to deploy and run infrastructure on the internet, if some basic scrapers manage to take down your website.
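For what it's worth, by "basic" I mean something as simple as per-IP rate limiting in front of the blog. Here is a minimal sketch using only the Python standard library; the window length and request budget are placeholder numbers I picked for illustration, not anyone's real config:

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10   # sliding window length (assumed value)
    MAX_REQUESTS = 50     # requests allowed per IP per window (assumed value)

    _hits: dict[str, deque] = defaultdict(deque)

    def allow_request(ip: str, now: float | None = None) -> bool:
        """Return True if this IP is still under its per-window request budget."""
        now = time.monotonic() if now is None else now
        window = _hits[ip]
        # Drop timestamps that have fallen out of the sliding window.
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= MAX_REQUESTS:
            return False  # over budget: serve a 429 instead of the page
        window.append(now)
        return True

    if __name__ == "__main__":
        # Simulate a scraper hammering one IP within a single window:
        # the first 50 requests pass, the remaining 10 are rejected.
        results = [allow_request("203.0.113.7", now=0.0) for _ in range(60)]
        print(results.count(True), "allowed,", results.count(False), "rejected")

Real deployments would put this kind of logic in the reverse proxy or CDN rather than the app, but the point stands: shedding a scraper's excess requests doesn't require putting every human visitor through a captcha.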
I feel the same disconnect, which makes it seem like I'm missing something. I'm so often hit with very onerous captcha-like demands, or blocked entirely, when making a single page or login request. I understand restrictions for rate limiting and abuse prevention, but at this rate it feels like it's only a matter of time before the whole web sits behind voluntary ID-scan checks for "security", even if such laws never pass.