I'm not a mental health professional, and I won't claim to be extensively knowledgeable about the statistics on depression, but it seems like a lot of people have been diagnosed with depression in the last couple of years. I realize this is partly the result of efforts to de-stigmatize mental health issues, and I think that's awesome. I'm in no way trying to come off as insensitive or mean, but I think it's getting out of control.

I work at a high school part-time, and 70-80% of the students I see are on antidepressants. I'm also currently in college, and the majority of my friends are on them too. I'm becoming increasingly concerned about the willingness of doctors to lazily prescribe these kinds of meds. Now, I'm not saying these people aren't sad, or even possibly depressed, but there's absolutely no way all of them have clinical depression, or depression that's best treated with medication. What happened to therapy and talking it out?

This really worries me, and I'm fearful for the future of psychiatric medicine if this is the direction it's headed in! Thoughts?