Witches of East End
It seems in Hollywood, you can’t talk about women without talking about witches.
Since paganism revolves around the ideas of female and male deities, with special emphasis placed on the role of women’s bodies and their natural connection to the earth, it’s accessible and inspiring.
In the end, most of these films and shows become a tangled dichotomy of supernatural darkness and violence contrasted with very standard plots about career and love; there is also, usually, a lot of “girl talk” about boys and shoes.
This raises the question: do women ask for these shows, or are they merely consuming what media executives think they want?