Hmm... As I suspected, you don't really understand climate models and their limitations. Climate models aren't expected to predict a 10-year period. That's more akin to a weather prediction than a climate prediction. As I have said, the standard averaging period for climatology is 30 years.
In fact it's impossible at the present time, and may never be possible, to produce a climate model that would accurately (by your standards) predict a 10-year period. That's because it's impossible to predict ahead of time the natural variability of things like ENSO and volcanic activity (to name a couple of the big ones). If no one can predict ahead of time the phases of ENSO or the timing of volcanic eruptions, how can you expect a climate model to factor them into its predictions?
The thing about natural variability is that it's mostly quasi-cyclical (with the possible exception of volcanic activity). That is, natural-variability factors like ENSO and other ocean oscillations swing from one extreme to the other over time periods that are not precisely predictable, but in the long run their effects net out to near zero. Thirty years is, for the most part, the minimum time period for that to happen.
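To make the averaging point concrete, here is a toy numeric sketch (not a climate model). It stands in for an ENSO-like oscillation with a made-up amplitude of 0.2 C and a made-up ~5.3-year period; real ENSO is irregular, but the cancellation-over-30-years argument works the same way.

```python
import math

# Illustrative assumptions only -- not real ENSO parameters:
AMPLITUDE = 0.2   # degrees C swing, assumed for illustration
PERIOD = 5.3      # years, assumed for illustration

def natural_variability(year):
    # a clean sine stands in for a quasi-cyclical oscillation
    return AMPLITUDE * math.sin(2 * math.pi * year / PERIOD)

# In any single year the variability can be near its full amplitude...
peak = max(abs(natural_variability(y)) for y in range(30))

# ...but averaged over a 30-year climatology window it nearly cancels.
mean_30 = sum(natural_variability(y) for y in range(30)) / 30

print(f"peak single-year swing: {peak:.3f} C")
print(f"30-year mean:           {abs(mean_30):.4f} C")
```

The single-year swing is around twenty times larger than the 30-year mean in this toy setup, which is why a 10-year window is dominated by variability while a 30-year window mostly isn't.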
A note on terminology: Climate models make projections, not predictions. I used "predictions" above to avoid confusing you. They are called projections because many of the real-world factors that affect climate, like natural variability, are not predictable ahead of time, and how fast the level of CO2 in the atmosphere rises depends on what we may or may not do to reduce emissions. So modelers feed realistic simulations of those factors to the model, and the output is a projection of what will happen if the real world matches the simulation. They run the model many times with different realistic simulations to capture the full range of possible ways the climate could evolve, and what the public mostly sees is a graph of all of the model runs averaged together, with uncertainty bars that cover the range and variability of the individual runs.
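The many-runs-averaged-together idea can be sketched with toy numbers. Everything here is assumed for illustration: a made-up forced trend of 0.02 C/year, plus a different random draw of ENSO-like variability in each "run". It is not any real model's output.

```python
import math
import random

random.seed(42)
TREND = 0.02     # assumed forced warming, C/year (illustrative)
N_RUNS = 100
YEARS = 30

def one_run():
    # each run gets its own randomly drawn natural variability
    phase = random.uniform(0, 2 * math.pi)
    period = random.uniform(3, 7)   # ENSO-like pseudo-period, years
    return [TREND * y + 0.2 * math.sin(2 * math.pi * y / period + phase)
            for y in range(YEARS)]

runs = [one_run() for _ in range(N_RUNS)]

# What the public mostly sees: the per-year ensemble mean plus a spread.
for y in (0, 15, 29):
    vals = [run[y] for run in runs]
    mean = sum(vals) / N_RUNS
    spread = max(vals) - min(vals)
    print(f"year {y:2d}: ensemble mean {mean:.2f} C, "
          f"run-to-run spread {spread:.2f} C")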
Which brings me to a scientific study published in Nature last year. Nature is paywalled, but here is an article on it by one of its authors.
The gist of the study is that when they selected individual model runs in which the simulated ENSO coincidentally happened to match real-world observations of ENSO well, the models' temperature output also matched real-world temperature observations well (maybe even well enough to satisfy you). When they selected the runs whose simulated ENSO least matched the real world, the temperature projections were way off.
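The selection idea can be sketched with toy numbers. This is my own hypothetical illustration of the logic, not the paper's actual method or data: build fake "observations" and fake ensemble runs, rank the runs by how well their simulated variability matches the observed variability, and compare temperature errors at the two ends of the ranking.

```python
import math
import random

random.seed(1)
TREND = 0.02    # assumed forced trend, C/year (illustrative)
YEARS = 10      # the short window the argument is about

def variability(phase, period):
    # ENSO-like stand-in oscillation, amplitude 0.2 C
    return [0.2 * math.sin(2 * math.pi * y / period + phase)
            for y in range(YEARS)]

# toy "observed" ENSO and the resulting "observed" temperatures
obs_var = variability(1.0, 4.5)
observed = [TREND * y + v for y, v in zip(range(YEARS), obs_var)]

# a toy ensemble: same forced trend, randomly drawn variability per run
runs = []
for _ in range(200):
    v = variability(random.uniform(0, 2 * math.pi), random.uniform(3, 7))
    temp = [TREND * y + x for y, x in zip(range(YEARS), v)]
    runs.append((v, temp))

def rmse(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# rank runs by how well their *variability* matches observed variability
runs.sort(key=lambda r: rmse(r[0], obs_var))
best_temp_err = rmse(runs[0][1], observed)
worst_temp_err = rmse(runs[-1][1], observed)
print(f"best-matching run, temperature error:  {best_temp_err:.3f} C")
print(f"worst-matching run, temperature error: {worst_temp_err:.3f} C")
```

In this toy setup the runs whose variability happens to line up with the "observations" also nail the short-term temperatures, and the worst-matching runs miss badly, which is the study's point about 10-year windows.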
AGW is not based on temperature observations at all. It's based on the radiative absorption characteristics of carbon dioxide and the expected knock-on effects of that. The effect started at the dawn of human industrialization in the late 1700s, although it was small enough until some time in the early 1900s to be hardly noticeable.