A computer scientist can implement any algorithm in any language.
Just because it's possible doesn't mean it's effective. Developers could write applications in Brainfuck or Whitespace, but they'd take far longer, produce a lot more bugs, and be incredibly unhappy.
Programming languages vary a great deal, and that variation makes a big difference in how productive programmers are. Better programming languages are valuable.
Why are these companies using their own languages?
Because they saw an opportunity to provide better tools for their developers. Take a look at the bridges between Objective-C and other languages: they're pretty clumsy. Apple designed Swift with Objective-C interoperability in mind, which means using the system libraries is easier from Swift than from other languages.
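As a small illustration of that interoperability (this example is my own, not from the original text): Foundation's `NSString`, an Objective-C class, bridges directly to and from Swift's `String`, so Objective-C system APIs can be called in plain Swift syntax with no wrapper layer.

```swift
import Foundation

// NSString is an Objective-C class, but a Swift string literal bridges
// to it automatically -- no conversion glue required.
let raw = "  hello, world  " as NSString

// trimmingCharacters(in:) is the bridged form of Objective-C's
// stringByTrimmingCharactersInSet:, called with ordinary Swift syntax.
let trimmed = raw.trimmingCharacters(in: .whitespaces)

print(trimmed) // prints "hello, world"
```

In most other languages, reaching the same API would mean going through a foreign-function bridge with manual type conversions at every call.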
Work a few years at XYZ company working on their proprietary algorithms in their ABC programming language?
Good luck getting another job.
All of the decent developers I know can make those kinds of leaps without a problem.
There are always transferable skills and there are always non-transferable skills. Using one language doesn't lock developers into that language in the future, and using a common language doesn't avoid lock-in. If iOS developers used Java, they'd still struggle with Android development at first, because most of the knowledge you need relates to the platform, not the language. Likewise, just because iOS developers work with Objective-C doesn't mean they can't make the leap to Android.