I've never known a good software engineer to run away from a problem because it's too hard. Ethics is hard - all the more reason to have smart people grapple with the issues rather than ignore them.
There are different kinds of problems, and some are super hard to solve while yielding very little actual benefit. Creating a formula for your personal ethics with all of its intricacies AND putting that into a succinct licence that will be enforceable in court (and doesn't do more than you want it to) is probably one of those.
It's reasonable to spend time on other things instead, especially since such a licence would be very individual, because very few people have the exact same views when you look at the details.
There's a middle ground here. I don't need all of my individual ethical beliefs encoded in a software licence. I also don't think we have to stop short at the lowest common denominator of "freedom" without responsibility. Enough people share a similar enough concept of justice for another approach to be both possible and pragmatic.
Whether you are up for the challenge of thinking through the implications of your contributions or not, you hold some of the responsibility for what you create. Avoiding deeper reflection on those contributions is a dangerous thing.
> Enough people share a similar enough concept of justice for another approach to be both possible and pragmatic.
The issue isn't so much in "can you and some number of other people generally agree on what you want", it's in putting that in clear terms into a licence that a judge (or anybody, really) can read and clearly say who is and isn't in violation. And you'll want to do it in such a way that it's not open to interpretation, doesn't hinge on specific words etc, and preferably you'll do it under 100 pages. It is hard.
I believe that even agreement in what exactly you want isn't easy to achieve as the degrees of separation are hard to pin down. Plenty of people will agree to "no using this in drones that bomb people". Drones that only observe while other drones (or human-piloted jets) drop the bombs? Drones that aren't in the area but act as communication relays? A company that builds motors that, among others, are used in one of those drones? A company that builds desks for that drone-motor-company? And is this about all military things, or only those with offensive use, while e.g. a bunker-building company would be okay? What about a shoe manufacturer that sells to the army?
You'll quickly find that the agreement is largest while it's vague and gets smaller as you try to draw clearer lines.
At the same time, the more vague it is, the less it will be used, because nobody wants to risk basing their things on something that depends on how a judge in some jurisdiction understands some term.
It's not always that nobody cares or that people don't want to spend the energy on it; sometimes it's just really hard. You're welcome to give it a go, I'm sure all efforts are welcome in the field.
Using modular licenses might mitigate the problem of managing a frankendocument, but I have to agree: making ethos licensing viable would require a ton of effort. The SPDX (Software Package Data Exchange) standard includes boolean expressions to combine licenses[0], like `MIT AND ISC`. It would be difficult but conceivable for a lawyer to write human-readable and enforceable licenses each forbidding one specific use. Bundling licenses with `AND` into cohesive super-licenses covering ethical standards would take more effort. Figuring out what ethical standards a large-enough ecosystem of engineers agrees upon is another herculean task.
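To make the idea concrete, here's a minimal sketch of what bundling single-purpose ethos licences into an SPDX-style `AND` expression could look like. The licence identifiers (`NoMilitary-1.0`, `NoSurveillance-1.0`) are hypothetical, and the "violation check" is a toy lookup, nothing like the legal interpretation a real licence would need:

```python
# Hypothetical single-purpose ethos licences, each forbidding exactly
# one field of endeavour. "MIT" carries no use restriction.
FORBIDDEN_USES = {
    "NoMilitary-1.0": "military",       # hypothetical identifier
    "NoSurveillance-1.0": "surveillance",  # hypothetical identifier
    "MIT": None,
}

def bundle(*licence_ids):
    """Combine licence IDs into an SPDX-style AND expression."""
    return " AND ".join(licence_ids)

def forbidding(expression, use):
    """Return the licences in an AND expression that forbid `use`."""
    parts = [p.strip() for p in expression.split(" AND ")]
    return [p for p in parts if FORBIDDEN_USES.get(p) == use]

combo = bundle("MIT", "NoMilitary-1.0", "NoSurveillance-1.0")
print(combo)                         # MIT AND NoMilitary-1.0 AND NoSurveillance-1.0
print(forbidding(combo, "military"))  # ['NoMilitary-1.0']
print(forbidding(combo, "farming"))   # []
```

Mechanically composing expressions like this is the easy part; the hard part, as noted above, is making each fragment legally precise on its own.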
Still, I think the efforts are worth it: I'd like to opt out of subsidizing some fields of endeavor but not others. IANAL, just a programmer, but I'd be interested in contributing to ethos-licensing-related projects.
That would be a very interesting project indeed. The fun part would be to hand the judge the syntax document and ask them to please "interpret" the licence according to these rules.
Slightly related, I wonder whether complexity/length of licences plays a factor in adoption. If you have something that is very widely known, somewhat short and readable by lay-persons, you don't need to check it every time, you figure out once that you're okay with working with XYZ Licence or you're not. If you'd have to essentially parse complex expressions of licensing fragments, I expect less adoption because of higher risk of catastrophic issues being overlooked (much like I'd probably not buy a candy bar if the store asked me to read & sign 12 pages of fine print to do so).
A particular piece of geo software is heavily used by the Joint Special Operations Command (JSOC), famously dubbed "Dick Cheney's kill squads" and largely unaccountable to anybody except the president. Was it "good" to support them during the Obama era? How about the Bush era, or the Trump era? And how do you know who's next?