Over the past few years, I've seen a lot of people try to improve the licensing of their software by adding "ethical" clauses. A few examples:
These all start with good intentions, trying to limit the harm done with the software they've written, yet they're all flawed in one way or another.
Most of these ethical licenses are so new that they haven't been tested in most legal systems. Yes, you can have lawyers write them and have dozens of people review them, but unless they're actually brought up in court, the plain truth is that there's no way to truly know whether any morality clause can be upheld.
While this issue is true for any new license, it is especially painful for users of ethical licenses. The raison d'ĂȘtre of these licenses is to provide legal backing for your ethical beliefs. If the efficacy of such a license is nebulous at best, then it offers nothing but hollow empowerment.
While my intent isn't necessarily to promote the GPL, it at the very least has a track record of holding up in court, both in the US and in Germany.
You might argue, "Just having the license is meaningful!" I understand the sentiment. The MIT-no-ICE license did spectacularly bring attention to what ICE was doing. But if raising awareness was the objective, there are better ways to do it. The MIT-no-ICE license ultimately caused confusion and churn, and sowed distrust in the open source software world. Because of all that, it was reverted, leaving little to no effect on the entities it was originally targeting.
There are better ways to support a cause than injecting what is effectively a legal watering-hole attack into the software stack. Properly relicensing (with due process) to a more "hostile" license with known effects, such as the GPL, is significantly less controversial. Promoting the cause in the README or documentation, which nearly every engineer will see at least once, would be far more effective at spreading the word. Even refusing support to people who oppose your cause, or who work for entities that do, while certainly hostile, would make your intent clear without impacting others.
What you define as "moral" is not necessarily the same as someone else's definition of "moral", or some organization's or nation's for that matter. A broad "Don't use it for inappropriate behavior." or "Don't use it in connection with hatred." is ambiguous at best. If you live in a country where empowering women is outlawed, is it then inappropriate behavior to use that software to support women in a different country? In a totalitarian country that releases open source software, would using that software to denounce the leader be considered hate speech? If your nation's judicial system might rule in your favor against the demands of that country, that only proves the efficacy of these licenses is questionable at best.
In a perfect world, these would be hypotheticals, but concrete examples of both exist: Saudi Arabia and China. If you support the existence of ethical licenses, then you also support the possibility of ethical licenses that run counter to your beliefs. I ask you: can you resist deeming their licenses invalid because they are morally impermissible to you? Are you willing to permit the existence of licenses that only corporations profit from, that only countries you deem immoral are permitted to use, that promote morals you consider fundamentally reprehensible?
Let's say the above flaws seem reasonable to you, and you instead decide to enumerate the entities you find offensive. Certainly, this way you limit the effects of your license while still expressing your moral beliefs. You bar a large company from using your software. Great! But what about the new shell company owned by that same company? Or, even worse, a shell company that is independently owned and whose only client is that large company? These companies have free rein to use your software, despite practically being part of the larger company. The best part? That independent company need not disclose any information to you about who it works with. You could be working with an entity that funnels its work to an entity you object to.
The fundamental problem with this solution is twofold: it's impossible to obtain the transparency needed to identify such companies, and the number of companies like this may be unreasonably large to enumerate. It is unachievable, especially for smaller projects that cannot maintain such a list.
Okay, so if an enumerated list fails, then perhaps the other way works: a blanket ban on anyone or anything that interacts with the target company. Wait, but it's common practice for major companies to hire developers to work on the systems they use. Tools that an entire ecosystem relies on are now in limbo because their developers work there. Certainly it makes sense to grant them an exception. Oh, wait, you're telling me this tool was made by a single developer in their free time, who happens to work for that horrible company? Okay, let's make an exception for them as well.
This repeats ad infinitum until you either have a moral clause with so many exceptions that it becomes useless, or one so large that it becomes unmaintainable. It is no better than the previous solution, except you now risk fallout that extends significantly farther than intended.
This is an unfortunate truth: you can't target a small group of evildoers without causing immense damage across the broader ecosystem. To some, this is a benefit, as it forces evildoers to cooperate with the free and open source community. But if you strictly want to deny a specific monopoly the use of your code, this means either that your code wasn't going to be used by that company to begin with, or that the blast radius was so large it needlessly hurt those you were trying to support.
The most pessimistic flaw is that, should a company find your license unacceptable, it can side-step you entirely. Behavior can be reproduced with enough effort, and since software reduces down to discrete inputs and outputs, a perfect replication is possible without any involvement on your end. You could prevent this with software patents, but in turn you'd lose the support of much of the community you intend to protect.
If you truly wish for a license that caters to your own perspective of morality, then there's an often-overlooked license that lets you vet each use of your software, so you can ensure no harm is truly done with it:
© <Year> <Name>.
This is a simple license, yet as powerful as the law under which your nation functions. It lets you make sure that only those doing social good can use your software. It lets you deny the use of your software to those looking to cause what you define as harm. It's infinitely scalable: no new individual, organization, or government can bypass it. You can handle things on a case-by-case basis, making sure no one can use your software without your approval. It can be perfectly tuned to your sense of morality and ethics. And the best part? It's been proven in law: you can use this license without any worry of it being untested. With this, you're guaranteed a license that's been accepted for hundreds of years.
Source code can be published in the open while under a full copyright license. Just because you display your work to the public does not mean you immediately lose your rights to it, or that you give others permission to use it. That's the entire point of copyright. Artists don't let their artwork hang in museums in constant fear of it being copied.
"That's a false analogy!", you respond. "It's a lot easier to 'steal' code, and people copy and use copyrighted work all the time and get away with it!" Yes, that's exactly right. But if you're so concerned about your work being stolen and being used for nefarious purposes, then what good will a license with legally hazy clauses at best do for you, if completely restrictive copyright won't stop those people from using it?
I've seen people claim that there is an ethical free and open source solution, yet they fail to understand that the basis of free and open source code is fundamentally in opposition to ethical licenses. You cannot claim to empower the users of your code while simultaneously segregating a sub-population of them. Free and open source fundamentally detaches itself from morality in order to provide equality (not fairness, nor justice) to its users.
You're missing the fundamental problem. You cannot define a license that grants broad freedoms to everyone while restricting freedoms based on your own morality. Accepting that people can and should freely use code, even with exceptions or restrictions, means that there will always be a group that falls into the grey area where your morality isn't well defined. And if it is well defined, then you shouldn't be using an off-the-shelf license; you should be using a license that's truly unique to you: one where all rights are reserved for you to hand out, rendering these ethical licenses all but moot.
No one understands your morality. The burden of defining your own morality (and thus who uses your software) is placed on you. If you truly want a license that expresses your morality, not having a license is the best way forward. Licenses are legal documents, and the legal system is far from supporting individual morality.
While I really do wish there were a perfect solution to this, I've come to the conclusion that the best we have, if we must choose a license, is one that forces cooperation, such as the GPL family of licenses, rather than one that attempts denial.
I encourage people to continue to improve and iterate on these ethical licenses. I personally have little expectation, but I would love to be proven wrong. It's possible that some iteration will get it right; it's possible that none ever will. Only continued effort will determine which is the case.