I don't think so. It's really hard to sort the poison out of the data unless you actually have enough reading comprehension to know that it's gibberish - humans do, bots don't. And even if they discard 80% of the poison, the remaining 20% is already screwing with the model.
They could prevent you from editing your posts/comments, but that would cause an uproar.