Of course, the checkpoint system was originally created back when consoles had minuscule internal storage capacity (usually a small chip on the game cartridge itself), and only the most basic game data could be retained. It was quite revolutionary for its time in the console world: players could work through complex worlds without having to restart the entire game when they ran out of lives or hit a particularly nasty boss battle. This allowed games to progress beyond the early plotless era, when each "level" often just sped things up or added more enemies until the player died. Given the storage limitations, the checkpoint system (often just a simple flag indicating which level a player had reached, with no other data saved) made sense.
But now that modern consoles ship with hard drives and SSDs, with gigabytes of storage to spare, why does the checkpoint system remain in so many games? Many open-world games and RPGs have shown the appeal of letting players save anytime (some restricting this only during combat). And constructing a good checkpoint system (which can mean the difference between replaying a few minutes and throwing a controller at the wall) is still a tricky proposition, fraught with pitfalls.
Grand Theft Auto 4 and Dead Rising, for example, were particularly notorious for their checkpoint problems. And since many games give players access to only one checkpoint save file, an autosave triggered at a particularly bad moment can mean replaying an entire level.
So, why do console developers keep using the checkpoint system in games? Is it force of habit? Is it a reluctance to move into the modern "save anytime" age or to adapt their game styles? Or is there still some merit to the checkpoint system that warrants its continued use?