The cost of a new language is that programmers have to take time to learn it and to build an infrastructure for it: software libraries for math, communications, user interfaces, and so on. Not every programmer will learn every language, let alone learn it well, which can be a handicap for employers, who end up with a smaller pool of candidates to hire from in their chosen language.
Companies also have to spend time deciding which language to use, with plenty of people pitching this language over that one, and developing competing languages involves a lot of duplicated effort.
The benefit is that new languages may improve productivity, with the emphasis on 'may'. There's plenty of controversy about how good some languages really are, even ones that have been widely deployed for a long time. Ideally there would be a shakeout and the best languages would win, but I'm not convinced that's the way things happen. More likely the winner is whichever language gets the best salesmen out there selling it, and, to paraphrase Mae West, 'goodness has nothing (or little) to do with it.'
The earliest writing systems were complex and took years to learn. When easier-to-learn alphabets came along, the old scribal class resisted them, because it meant they could no longer command the high pay and respect they were used to. I suppose something similar may exist, to some extent, with programmers and programming languages. It's hard to know who is being genuinely fair in judging languages.
The holy grail for employers, and for people who just want to use their computers to get work done (as opposed to people who make a living writing code, and I was one of those for over 20 years), is an artificial intelligence program to which you can simply explain what you want in human language, and which will then generate an optimal block of binary code to do it, after clearing up any ambiguities in your specification.