Well, I was looking for a new TV show to watch and stumbled on Awkward. I tried hard to stomach it and made it to the end of episode 2 before giving up. I've noticed a trend lately: TV aimed at young people sucks now. It focuses on sex, the pressures of school (a.k.a. trying to have sex), how looking popular matters more than having a shred of dignity, and how great wealth is. No mention of the importance of actual success, at all.

This is honestly depressing to me. Shows like Awkward, Laguna Beach, the Laguna Beach prequel, that horrible show with Snooki, Twilight, Pretty Little Liars, etc. encourage kids to be pathetic and childish. What kind of world will these kids create? On the guys' side, we get shows like Entourage, which, while entertaining, basically teaches that being a douchebag leads to awesome things. What happened to shows that tackled real issues in an interesting way? Even cartoons have changed so much. Then we turn to music, which is mostly just hip-hop beats with lyrics about getting high, drinking, and having more Rolexes than the next guy. It's fucking depressing...

I turn 24 this year, and I'm seriously thinking of settling down. In a few years I think I want kids. But looking at the mass marketing of today, how the hell can I seriously believe my kid won't turn out really shitty? Do I really want to raise kids in this kind of society? Do today's teenagers stand a chance when they go out into the world? Are we all just going to fail after this generation?

Maybe this is a sign that I'm getting old. I should probably get a rocking chair, sit on my porch all day, and shout at anyone who goes by to get off my lawn.