Breast cancer is obviously a serious disease that kills many people every year (including men). I'm personally friends with two women who lost their mothers to breast cancer, so I don't mean to undermine or demean the disease at all, having witnessed its effects firsthand. That being said, have you guys noticed that breast cancer has become the disease du jour? Any time there's a charity or benefit, from the chicks in my office doing a wine-tasting fundraiser to Demi Moore, Jennifer Aniston, and Alicia Keys injecting glamour into the fight, breast cancer seems to be the illness of choice. I remember, a long time ago, during one of those awful "My Super Sweet 16" episodes, one of the rich girls throwing a party for herself had a donation box for Susan G. Komen's fight against breast cancer. The NFL even requires its players to wear pink gloves during breast cancer awareness month, for Christ's sake! And this has been going on for a while.

I'm not great at articulating this, but if these people truly cared about saving the maximum number of women, they would focus more on heart disease, which kills far more women every year than breast cancer does. But you rarely hear anything about that. It's just breast cancer this and breast cancer that. And why doesn't prostate cancer get any love? The NFL should require players to wear brown gloves during prostate cancer awareness month.