Make scales just fine. Using make badly, through mistakes like using recursive make, is what causes scalability problems.
The paper "Recursive make considered harmful" by Peter Miller identifies common mistakes in using make and shows how to fix them. The biggest mistake is using recursive make; this is a common mistake that is NOT required by make. Once you stop making this mistake, make is suddenly much faster.
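Here's a minimal sketch of the non-recursive alternative Miller describes: instead of invoking make again in each subdirectory, a single top-level makefile includes per-directory fragments, so make sees the whole dependency graph at once (the directory and file names here are hypothetical):

```make
# Top-level Makefile: include per-directory fragments instead of
# recursing with "$(MAKE) -C subdir". Each fragment contributes
# its targets to one global dependency graph.
include src/module.mk
include lib/module.mk

# A fragment such as src/module.mk might contain just:
#   src_objs = src/main.o src/util.o
prog: $(src_objs) $(lib_objs)
	$(CC) -o $@ $(src_objs) $(lib_objs)
```

Because make sees every dependency, it can build exactly what's out of date and parallelize correctly, instead of re-walking whole subdirectories.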
Two other issues stemmed from capabilities missing from standard (POSIX) make, but they are part of POSIX now:
Issue 1: Historically, standard make only implemented deferred assignment (where values are calculated when referenced, not when set).
This meant that as project size grew, there was an exponentially increasing calculation effort (eek). Miller recommends using an immediate-assignment operator, but although GNU make has one (as :=), that wasn't in the POSIX standard. He also suggests using an appending assignment (+=), which wasn't in POSIX either.
Since then, POSIX has added the immediate-assignment operator ::= and the append-assignment operator += (see http://austingroupbugs.net/view.php?id=330). GNU make 4.0 implements "::=", so you can now start using it. This gets rid of a major scalability problem.
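A quick sketch of the difference (variable names are made up for illustration):

```make
# Deferred assignment: the right-hand side is re-expanded every
# time the variable is referenced, so expensive calls like
# $(wildcard ...) get repeated over and over as a build grows.
OBJS_SLOW = $(wildcard *.o)

# Immediate assignment (POSIX ::=; GNU make also accepts :=):
# the right-hand side is expanded once, when this line is read.
OBJS_FAST ::= $(wildcard *.o)

# Append-assignment, also now in POSIX:
OBJS_FAST += extra.o
```

With deferred assignment the expansion cost is paid at every reference; with immediate assignment it's paid once.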
Issue 2: The "obvious" ways to implement automatic dependency generation in make require the ability to "include" multiple files from one line, and the ability to silently ignore errors when including, and those weren't in POSIX either. These have since been added to POSIX (in http://austingroupbugs.net/view.php?id=333 and http://austingroupbugs.net/view.php?id=518).
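Here's one common sketch of automatic dependency generation, assuming a GCC/Clang-style compiler (the -MMD -MP flags) and GNU make's pattern-rule syntax; the source file names are hypothetical:

```make
SRCS ::= main.c util.c
OBJS ::= $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# Ask the compiler to write a .d dependency file as a side effect
# of each compile (-MMD lists included headers, -MP adds dummy
# targets so deleted headers don't break the build).
%.o: %.c
	$(CC) -MMD -MP -c -o $@ $<

# Include all generated .d files in one line; the leading "-"
# silently ignores files that don't exist yet, so a fresh
# checkout still builds.
-include $(SRCS:.c=.d)
```

This is exactly where the two POSIX additions matter: one include line pulls in many files, and missing files on a first build are ignored instead of being fatal.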
Just getting something into the POSIX spec doesn't cause anything magical to happen. But if a capability is in a standard, it's way more likely to be implemented, and people are far more willing to depend on it.