There are a few different angles to this.
1. If any other state had done this, we'd be correctly calling this a terrorist attack and there wouldn't be any question about it; and
2. Palantir was a partner in developing several AI systems used for targeting missile strikes in Gaza, the best known of which is called Lavender [1][2]. Another of these systems is called "Where's Daddy". What does it do? It tracks alleged militants so they can be struck at home, where their families will be collateral damage [3]; and
3. These systems could not exist without the labor of the humans who create them, which raises questions about the ethics of everything we do as software engineers and tech people. This is not a new debate. For example, there were debates about who should be culpable for the German death machine in WW2. Guards at the camps? Absolutely. Civilians at IG Farben making Zyklon B? Did they know what it was being used for? Did they have any choice in the matter?
My personal opinion is that anyone continuing to work for Palantir can no longer plead ignorance. You're actively contributing to, and profiting from, the killing, starving and torturing of civilians. Do with that what you will. In a just world, you'd ultimately have to answer for your actions at The Hague or a Nuremberg 2.0.
[1]: https://www.business-humanrights.org/es/%C3%BAltimas-noticia...
[2]: https://www.972mag.com/lavender-ai-israeli-army-gaza/
[3]: https://www.businessinsider.com/israel-ai-system-wheres-dadd...