The explanation offered by the authors is that these higher-level languages do a lot of work behind the scenes to handle the concatenation, such as creating new objects and copying the strings to accommodate the extra bytes of data. “The above explanation applies to any data structure that has to be stored contiguously and increases in size, or is immutable,” they wrote. Conversely, the disk-access approach was faster because the operating system handled the writes efficiently via buffering, only actually touching the disk when necessary.
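The copy-on-concatenate behavior described above can be sketched in Python, which the discussion centers on. The snippet below contrasts repeated `+=` on an immutable string (worst case, each step copies the whole buffer, giving quadratic work overall) with accumulating parts in a list and joining once (a single linear pass). Note this is an illustrative sketch: CPython in particular can sometimes resize a string in place when it holds the only reference, so the measured gap varies by interpreter and version.

```python
import timeit

def concat_loop(n):
    # Repeated += on an immutable str may allocate a new object and
    # copy the old contents each iteration: O(n^2) in the worst case.
    s = ""
    for _ in range(n):
        s += "x"
    return s

def join_list(n):
    # Appending to a list is amortized O(1) per element; a single
    # "".join() then copies each character once: O(n) overall.
    parts = []
    for _ in range(n):
        parts.append("x")
    return "".join(parts)

if __name__ == "__main__":
    n = 200_000
    print(f"+= loop: {timeit.timeit(lambda: concat_loop(n), number=3):.3f}s")
    print(f"join   : {timeit.timeit(lambda: join_list(n), number=3):.3f}s")
```

The same reasoning extends to the disk result the authors report: `file.write()` normally lands in an in-memory buffer managed by the runtime and the OS, so the program rarely pays the cost of a real disk operation per call.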
They're pointing out exactly what everyone here is saying they missed. Not really sure it warrants a research paper, but yeah, it's common sense if you've ever studied computer science at all.