What's wrong with this picture?
In the ~30 years since these early models, computers have increased in speed by a factor of 2^15, according to Moore's law.
This should have produced models that are perhaps 32,000 times better, yet the deviations, Err = (model - data)/data, have barely improved by a factor of 2. Err = 100% is still considered good agreement.
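A quick check of the arithmetic (the ~2-year doubling time is the conventional assumption; the slide itself only gives the total exponent):

  speedup over ~30 years ≈ 2^(30/2) = 2^15 ≈ 32,768 ≈ 32,000

  Err = (model - data) / data

so Err = 100% means the model deviates from the data by as much as the data itself.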
Why has the convergence been so slow?