I've been seeing a huge rise of zombies in the media. It seems like werewolves tried to upstage the vampire trend, but zombies ate both of them. What is it that mainstream culture loves about these creatures? Is it the post-apocalyptic settings, the gore of it all, or are those lovable rotting faces just too adorable? Everywhere I look, there are zombie books, movies, TV shows, and mass-produced merchandise. Even the CDC did a post on emergency preparedness citing the zombie apocalypse as an example. (here's the link) There have even been zombie walks, where people dress as zombies and parade through the streets. I have personally attended a local zombie walk, and it was great fun!

[Image: My best friend Justean hugging the "free hugs" zombie]

[Image: The first zombie movie]

Zombies are not a new thing. (The zombie had its origin in Haitian culture.) So why is it just now that zombies have especially been on the r...