Hollywood makes me so bitter. It's always hogging the spotlight from every other town. For some reason Hollywood takes credit for all the movies made in the world. I mean, what about poor Studio City, or even Los Angeles? Do you have any idea how bitter LA is, knowing that you have some stupid sign and that makes you the movie capital of all creation? And what about Pasadena? They make movies there sometimes.

That isn't even the most bitter thing about Hollywood (other than white legs hanging out of black dresses). Hollywood needs to just admit to the whole world that it is one big studio. The actors are all played by actors themselves, all the historic movie sets are just part of a movie set, and so on and so forth. It makes me so bitter that they continue to think we believe people like Angelina Jolie and Julia Roberts are real people. We all know their lives are just a reality show, and that some really good acting is going on to make them look so annoying.

So come on, Hollywood, stop making me bitter. Just come out of the studio closet and admit what you are: a place that is bitter, and one that just needs to admit what it is. One big stage and one big reality show that isn't fooling anyone. ARRGGHHH, so bitter!