I would love to see physicists stop writing garbage like this, which is completely ignorant of the literature it purports to analyse. There are so many problems with the basic data gathering here that I don't even know where to start. They seem to think that research and argument on the Táin stopped somewhere in the 1960s, and that using a known modern editorial admixture is the same as using the original text.
The University of Chicago has already done this.
The only way to fix the patent problem is to shove GOBS OF MONEY down the throats of ever-hungry politicians and their banks.
I see this all the time (I have a PhD in the humanities and I am a software engineer): someone from outside the field does something, claims it is a universal law, but really they just worked on English and cannot (or will not) show that it works for other languages. Usually these papers also lack any kind of literature review, which would have uncovered many of these problems. I saw one paper by a physicist that tried to use bit fields to model language change; for all its mathematical rigour, it was massively reductionist and couldn't explain anything at all.
I go to my university's language lunch, which features a lot of this, and scare the pants off grad students by asking, "This is all very well, but does it work for Japanese or Old Irish or any other language?" Their faces usually go white, because naturally English is the ONLY language that matters and is therefore "universal".
The world is coming to an end--save your buffers!