I'm from Europe, and none of the people I know would take drugs for psychological problems. I've seen it depicted in movies that Americans take (prescribed) drugs when they feel bad, but I can't believe anyone would really do this. Is this for real, or are the movies exaggerating? I mean, why would a doctor tell you to drug yourself, even if you feel depressed?
I've had this discussion before, and Americans won't understand your point of view. Drugs are so commonplace in American society that taking them for anything, or in some cases nothing, is seen as normal. There are cases where medication is the right answer, but those are a minority; medication appears to be prescribed in the vast majority of cases.
So yes, Americans get prescribed drugs when they tell their doctor they feel bad, even when those drugs are inappropriate. People rarely question their doctor, because the doctor went to medical school and so must know what they are doing.