The article “Big data, speed and the future of computing” in today’s New York Times hit home with my own experience.
The author, Steve Lohr, describes how the design of computers is evolving, and has to evolve further, to meet the needs of today’s demanding applications. I am one of the people pushing the machines to their limits. I have a Mac Pro with two 6-core 2.66GHz Xeon processors, 4x1TB hard drives (in a RAID-5 setup), and 16GB of RAM. That sounds like a lot of power, but I recently finished running simulations that took over 10 years of processor-core time and generated over 3GB of data. That data is then fed into a series of programs that analyze it, generating a series of reports (with anywhere from 100 to 25,000 pages) and graphs (one of which is displayed here).
It took a long time to run these simulations and generate these reports. While 12 cores in one machine is good, the way the system is configured I could generally run only 12-20 simulations at a time before running out of CPU cycles or RAM. Hard disk space is cheap and plentiful, but I needed far more processor capacity and RAM, on the order of 10x more. And if you gave me that, I would probably ask for 10x more again.
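For readers curious what that kind of batch looks like in practice, here is a minimal sketch, in Python, of running many independent simulations with the concurrency capped at the core count. It is purely illustrative: run_simulation and its parameters are hypothetical stand-ins, not my actual simulation code.

    # Minimal sketch: farm independent simulation runs out across the
    # available cores, never starting more at once than the machine can
    # handle. run_simulation() is a hypothetical stand-in for the real work.
    import multiprocessing as mp

    def run_simulation(params):
        # Placeholder workload; a real run would burn years of core time here.
        seed, n_steps = params
        return {"seed": seed, "steps": n_steps}

    if __name__ == "__main__":
        # One worker per core; lower this if each run needs a lot of RAM.
        max_workers = mp.cpu_count()
        jobs = [(seed, 1_000_000) for seed in range(100)]

        with mp.Pool(processes=max_workers) as pool:
            # imap_unordered keeps every core busy and yields results as
            # individual runs finish rather than waiting for the whole batch.
            for result in pool.imap_unordered(run_simulation, jobs):
                print(result)

On a 12-core box like mine, a pool like this stays pegged at roughly a dozen concurrent runs, which is about the ceiling I kept bumping into.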
The Von Neumann architecture has served us well for 65 years and counting. Given its remarkable track record, it will take the promise of significant improvement to move a whole industry away from that model; however, the time has come, and the demand is here (at least in some quarters), for the next stage in the computer’s history.