Let's first admit that Essential v2 is not designed for, or able to generate, any explicit imagery whatsoever. The only way I can describe what it is capable of is vanilla.
From that point on, the only concern and effort should go into blocking a) content involving minors / child sexual abuse material and b) abusive content (probably involving violence). I would presume a) would come from the reference image and b) from the prompt.
It is impossible for us to know how the "machine" checks for infringement or copyright, and I assume it is just as hard on your end, or at upload, to determine what an image is or what it is intended for. That's just an assumption, though.
Anything (else?) beyond that strikes me as a patchy service. At the moment the "frog" censoring seems mostly random. I can't know what the algorithm is actually censoring, but from past experience it can't be much, because... vanilla. On top of that, the most "outrageous" vanilla may or may not get censored, judging by the current results.
I'd love to stay here (mostly because I have better things to do than write lengthy comments and/or teach myself "how to NSFW with Stable Diffusion"), but we need something that actually works.
As for other members: if you like to use the convenient NSFW features on occasion, like I do, maybe provide the devs with some usable feedback. As for the devs: maybe tell us how we can help.
PS: not posting any screenshots, at the risk of the post getting taken down.