And now you have a situation where the top leaders of these companies are harming their own culture
Ugh...I just don't get it with the whole company "culture" and the importance some people put on it.
Back in my early days of working, when I was W2 direct....a number of companies went all out on the "culture"....we had rah-rah meetings, hell, we had company outings to promote whatever the latest culture motto was that year.
They spent money on consultants to come out with new mottos and branding....and such.
And just what does that gain or give you as an employer or employee, other than wasting a ton of money on consultants every year or so?
What is company "culture"?
I mean, you go in....work to get paid and leave.
WTF culture is involved in that?
It's not like your employer really cares about you....they want work from you.
I'm old enough to remember when companies were a bit more "caring" toward their employees...and even then the culture thing was bullshit.
Don't get me wrong, I enjoyed the paid days off on day trips for team building....we went bowling, tubing down a river all day, laser tag...etc.
But it didn't really mean squat as far as work went...
Again, it was fun...but I can't see that it did fuck all for work or productivity....and hell, I'd rather have had bigger raises than the money they spent on this crap.
So, I really don't get it.
Work is work....I wouldn't be doing it if I didn't need a paycheck, and the more $$ the better.
I went 1099 contracting....and just got away from all that crap...strictly business.
Anyway...does anyone actually see the culture thing as something real? Valuable? Does it make you more money? The company more money?
I just don't get it....yet I see it mentioned quite frequently, especially nowadays with the WFH topic.