I must confess, I am not familiar with how popular Australian history treats the country's treatment of Aboriginal peoples. But in America, it seems to me that there is a very dim view of the treatment of Native Americans. Aside from the feel-good stories told around Thanksgiving, popular culture seems to regard early Americans as barbaric toward the native peoples. I welcome further input on the subject, in any case.