The problem, of course, is that Disney is evolving into the most liberal, subversive, and prosperous studio on the block.
Is it really? Did you have anything specific in mind?
Maybe I'm just not as familiar with their movies, but my impression is quite the opposite, especially with Disney/Marvel: war is awesome (fuck yeah!), women exist for decoration and/or manpain, and gay people don't exist. Isn't that about as non-liberal a viewpoint as it gets?
Aren't they just now starting to get out of the "token black guy" phase?