There are models especially tuned for it, even open-weight ones. LLMs, even multimodal ones, are not up to the task. You know what doesn't help the discussion at all? That everyone's response, as usual, is just about titties.
Four months ago I tried every dedicated NSFW-image-classifier model I could find on Hugging Face or GitHub. They have a high false-positive rate on certain kinds of benign content, like close-up photographs of hands with painted fingernails, and a high false-negative rate on artistic nude photographs. I even tried combining multiple models with gradient boosting, but the accuracy barely improved; maybe everyone is training on very similar datasets. At this point I should train my own model, but I was hoping to find something capable off the shelf, since content moderation is such a common task.
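For anyone curious, the stacking setup I mean is roughly this: treat each base classifier's NSFW score as a feature and fit a gradient-boosted meta-model on labeled examples. The sketch below uses synthetic stand-in scores (no real classifiers are run); the shared-noise term is my assumption to illustrate why correlated base models leave the meta-model little to gain.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n = 2000
labels = rng.integers(0, 2, size=n)  # 1 = NSFW, 0 = benign

# Three hypothetical base classifiers with correlated errors, mimicking
# models trained on similar datasets: each score is the true label plus
# a noise term shared across all three, so their mistakes overlap.
shared_noise = rng.normal(0, 0.3, size=n)
scores = np.column_stack([
    np.clip(labels + shared_noise + rng.normal(0, 0.1, size=n), 0, 1)
    for _ in range(3)
])

X_train, X_test, y_train, y_test = train_test_split(
    scores, labels, test_size=0.25, random_state=0
)

# Gradient-boosted meta-model stacked on top of the base scores.
meta = GradientBoostingClassifier(n_estimators=100, max_depth=2)
meta.fit(X_train, y_train)
acc = meta.score(X_test, y_test)
print(f"stacked accuracy: {acc:.3f}")
```

Because the base scores share most of their noise, the stacked accuracy ends up barely above what thresholding any single score would give, which matches what I saw with the real models.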