
What would happen if someone used photoshop to create CSAM? Should Adobe be held responsible because they didn't prevent it?

Grok is just another tool, and IMO it shouldn't have guard rails. The user is responsible for their prompts and what they create with it.



Someone spending 40 hours drawing a nude is not equivalent to someone saying "take this photo and make them naked" and having a naked photo in 4 seconds.

Only one of these is easily preventable with guardrails.


bet I can guess which of those two is more profitable




No, silly billy, but the responsibility when your SaaS platform is generating it falls on you as a developer.

The user is not creating it; you are, based on a prompt you could easily say no to.


Is Grok simply a tool, or is it itself an agent of the creative process? If I tell an art intern to create CSAM, he does, and then I publish it, who's culpable? Me? The intern? Both of us? I don't expect you to answer the question; it's not going to be a simple answer, and it's probably going to involve the courts very soon.


It's a tool. It isn't human, and (currently) is not intelligent. It's a conversational UI on top of a software program.


So, if that "software program" had a traditional button UI, a button said "Create CSAM," and the user pushed it, the program's creator is not culpable at all for providing that functionality?


I think intent comes into play here. Grok was not created to produce CSAM, just like Photoshop wasn't. But both can be used to create it.


I would agree with this if Grok's interface was "put a pixel there, put a line there, now fill this color there" like Photoshop. But it's not. Generative AI is actively assisting users to perform the specific task described and its programming is participating in that task. It's not just generically placing colors on the screen where the user is pointing.


"if I hired a hitman to kill someone, he does, who's culpable? Me? The hitman? Both?"

It's both. Very simple. You can't get around liability by forming a conspiracy [0].

https://en.wikipedia.org/wiki/Criminal_conspiracy


Right, but the makers of the murder weapon aren't culpable.

Or do you think a Microsoft exec should go to jail every time someone uses Word to write a death threat?


The hypothetical imagined hiring an intern to do a crime and supposed that this might make liability harder to determine. It doesn't!


An intern is a human, unlike Microsoft Word or an LLM, which are tools/machines/etc.


Automated DDOS-for-hire services are not legal either. They're tools/machines/etc, possibly running more or less autonomously.

https://www.justice.gov/usao-ak/pr/federal-prosecutors-alask...


I think we all know it's illegal to sell illegal services.


Don't know about CSAM, but Photoshop won't open an image that shows more than 25% of a dollar bill, to prevent counterfeiting.


> Grok is just another tool, and IMO it shouldn't have guard rails.

How is the world improved by an AI tool that will generate sexual deepfake images of children?



