Clojure is designed to be compatible with Java - not backwards compatible, but intercalling compatible. The consequence is that a Clojure program can run out of stack while it still has masses of free heap. Why? Well, the JVM was designed for small, low-powered embedded devices with limited memory, running small programs that weren't expected to do a lot of recursion; allocating the stack as a fixed-size vector was seen as an efficiency win. The fact that most of the time we don't run Java on small embedded low-power limited-memory systems is beside the point: Java was designed to work in those circumstances, and therefore it allocates each thread's stack as a vector of fixed (limited) size. When a program hits the top of that stack it's stuck, and falls over hard.
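A minimal sketch of the problem (the function name is my own, chosen for illustration): a naive recursive sum consumes one JVM stack frame per call, so a large enough argument throws StackOverflowError even though gigabytes of heap may remain free.

```clojure
;; Each recursive call pushes a new frame onto the thread's
;; fixed-size JVM stack; the heap is never the bottleneck here.
(defn sum-to [n]
  (if (zero? n)
    0
    (+ n (sum-to (dec n)))))

(sum-to 100)
;; => 5050

;; (sum-to 10000000)
;; => throws java.lang.StackOverflowError, with plenty of heap left
```

Note that the depth at which this dies depends on the JVM's thread stack size (tunable with -Xss), not on how much heap is available.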
Clojure doesn't need to be like that. Even running on the JVM, it would be possible to implement a separate Clojure spaghetti stack in heap space. But the design decision was to make Java interoperability easy at the expense of limiting recursion depth. Similarly, Clojure does not automatically fail over from storing integers as Java longs to storing them as bignums, as many much older Lisps do. It easily could have, but it doesn't. Again, I think this is for interoperability with Java; otherwise it looks like a really odd decision.
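Both design decisions show up directly in the language: the workaround for the fixed stack is the explicit loop/recur form, which compiles to iteration and so runs in constant stack space, and bignum promotion exists but is opt-in via the primed arithmetic operators. A sketch:

```clojure
;; loop/recur: the same sum as a naive recursion would compute,
;; but compiled to a loop, so it never grows the JVM stack.
(defn sum-iter [n]
  (loop [i n acc 0]
    (if (zero? i)
      acc
      (recur (dec i) (+ i acc)))))

(sum-iter 10000000)
;; => 50000005000000, no StackOverflowError

;; Integer promotion is opt-in, not automatic: ordinary +
;; throws on long overflow, while +' promotes to a bignum.
;; (+ Long/MAX_VALUE 1)  ;; throws ArithmeticException: integer overflow
(+' Long/MAX_VALUE 1)
;; => 9223372036854775808N
```

So the escape hatches exist; the point is that the programmer must ask for them, where an older Lisp would just quietly keep recursing or quietly grow the number.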
Easy Java interop is not a bad thing. It's a good thing. It allows access to a wealth of pre-existing Java libraries. But it's a choice, and one should not blind oneself to the fact that other choices could have been made - and would have had significant merits.
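And the payoff of that choice is real: calling into Java from Clojure needs no wrapper layer or foreign-function bridge at all. A tiny illustration:

```clojure
;; Java methods are called directly on Java objects; Clojure
;; strings simply are java.lang.String instances.
(.toUpperCase "clojure")
;; => "CLOJURE"

;; Static methods and classes from the Java standard library
;; are equally direct.
(java.util.UUID/fromString "00000000-0000-0000-0000-000000000000")
```

Every Java library on the classpath is available at this same near-zero cost, which is exactly the wealth being traded for.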