I don't think LLMs are really AI. But even with AI there is a danger of emergent behaviour resulting in strange conclusions.
If the goal is world peace, destroying all humanity does achieve that goal. If the goal is to end a war, using nuclear weapons achieves that goal.
There are a lot of strange conclusions you can come to if empathy for human life isn't a factor. AI is intelligence without empathy. A human who has intelligence but no empathy is considered a psychopath. Until AI has empathy, it should be regarded the same way we regard psychopaths.