recently i've been finding that people, more often than not, seem to be afraid of the naked human body. this phobia of something so natural and beautiful is baffling, and frankly, it's also a rather disturbing testament to the grip of censorship and authority.
the idea that it's "okay" to reject what lies underneath the superficial (clothing, makeup, accessories, etc.) and to decide that the sight of a man's or woman's genitals is rightfully "inappropriate" strikes me as deeply wrong. maybe i'm missing something? maybe there's some book i never read or some morbid fact i need to know about a man's penis? do vaginas really spit acid and have teeth?
i think