The culture around birth in the United States is a damaging culture of fear, guilt, and shame. It is a culture that teaches us that once we become pregnant, we are no longer capable of making our own decisions, no longer the stewards of our own bodies. It tells us that our bodies are broken and can’t bring a baby into this world without the help of synthetic hormones or a scalpel, while simultaneously reinforcing the idea that childbirth should be a perfect and beautiful experience where we act like amazing warrior goddesses who don’t yell or poop or beg for drugs.
Source: jezebel.com