Bully Culture in American Schools
There's an interesting new documentary out called Bully that chronicles the hell many teens go through due to rampant bullying in American schools. I haven't seen the full film yet, but from clips and interviews it seems to argue that, despite anti-bullying campaigns, bullying has actually gotten worse over the past 20-30 years.
So, here's my question: do you guys think that bullying is far worse in today's high schools, or is it simply getting more attention? If it's worse, why? Does this reflect a broader moral decay and apathy in our society? And the million-dollar question: should bullying be seen as just another part of growing up, and are we sheltering our children and giving them unrealistic expectations if we don't teach them how to deal with bullies?