I was having a discussion with a buddy about college football. I HATE college football, and he was curious why. When it came down to it, I hate college sports because universities end up losing money just to keep athletics going. I feel like college athletics are a huge clusterfuck that's draining resources that could be spent on something that actually matters: like college. You know, books and shit. Learning. What college is actually supposed to be about. Few universities actually MAKE money off their football programs. What do you guys think?