A recent Scientific American article, A Great Leap in Graphics (subscription required), contains a major statistical blunder. Discussing the time it took to render the images in Pixar's recent movie Cars, SciAm wrote: “Even with Pixar’s fast network of 3,000 state-of-the-art computers, each second of film took days to render.”
This is absurd on its face. The film is 116 minutes long, or 6,960 seconds. If each second took “days” and you assume that merely means two, that would be 13,920 days, or over 38 years. I don’t think it took that long to make the movie.
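To make the arithmetic explicit, here is a minimal back-of-the-envelope check, taking “days” to mean just two days per second of film, the most charitable reading:

```python
# Back-of-the-envelope check of the "days per second of film" claim,
# reading "days" as exactly 2 (the most charitable interpretation).
film_minutes = 116
film_seconds = film_minutes * 60             # 6,960 seconds of footage
days_per_second = 2                          # "days" read as just two

total_days = film_seconds * days_per_second  # 13,920 days
total_years = total_days / 365.25            # roughly 38 years

print(f"{film_seconds} seconds of film x {days_per_second} days each "
      f"= {total_days} days, about {total_years:.1f} years")
```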
Days of CPU time per second of film, spread across the reported 3,000 computers, I would buy. But not days of elapsed wall-clock time per second.
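Here is a rough sketch of why the CPU-time reading is plausible while the wall-clock reading is not. The hours-per-frame figure is purely hypothetical for illustration; the article gives no per-frame number:

```python
# Rough sketch of the CPU-time vs. wall-clock distinction.
# The per-frame render cost below is a made-up illustrative figure.
frames_per_second = 24                       # standard film frame rate
cpu_hours_per_frame = 10                     # hypothetical render cost per frame

cpu_hours_per_film_second = frames_per_second * cpu_hours_per_frame  # 240 CPU-hours
cpu_days_per_film_second = cpu_hours_per_film_second / 24            # 10 CPU-days

computers = 3000                             # the reported render farm size
wall_clock_hours = cpu_hours_per_film_second / computers             # 0.08 hours

print(f"{cpu_days_per_film_second:.0f} CPU-days per second of film, "
      f"but only {wall_clock_hours * 60:.1f} wall-clock minutes "
      f"with {computers} computers working in parallel")
```

On those assumed numbers, a second of film really does cost “days” of CPU time, yet the farm grinds through it in a few wall-clock minutes, which is presumably the distinction the article blurred.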