The biggest time sink at my job is a system that exports CSV files for use in Excel. If you don't select your data and copy it into a fresh Excel spreadsheet, recalculating a 70MB file takes 90 minutes; on a clean spreadsheet it's not a problem.
I'm curious - in a previous post you said you were a programmer. Why can't you write a program that reads in that CSV file and spits out the numbers and charts you need?
I did that once for a multi-megabyte spreadsheet with millions of rows that took roughly 60 minutes to recalculate in Excel on the user's machine. They ran it perhaps once a day. Exporting the data to CSV took them a few minutes, and my program (written in R, of all things) processed it and spat out the correct matrices (as CSV) in a matter of minutes.
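For a sense of what that looks like, here's a minimal R sketch, assuming the export is a flat CSV of numeric columns. The file names, column names, and the particular summary computed here are made-up stand-ins for illustration, not the actual job:

    # Read the exported CSV straight into RAM
    # (fread is data.table's fast CSV reader).
    library(data.table)

    dt <- fread("export.csv")    # hypothetical file name

    # Whatever Excel was recomputing, express it as vectorized
    # operations. Here: a per-category sum of a numeric column,
    # with "amount" and "category" standing in for the real fields.
    result <- dt[, .(total = sum(amount)), by = category]

    # Write the result back out as CSV for whoever needs it.
    fwrite(result, "result.csv")

The whole round trip (read, compute, write) runs in one pass with no spreadsheet recalculation engine in the way.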
Most people have no idea how fast their calculations can run when a command-line program reads all those millions of numbers from a file straight into RAM. The numbers in your 70MB spreadsheet easily fit in RAM as a single contiguous block; after that it's simply a matter of iterating over that memory with the right functions written in native code.
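A rough back-of-the-envelope sketch in R to make that concrete (the vector size is my assumption: ~9 million doubles at 8 bytes each is roughly 70MB in memory, which won't exactly match the file size on disk):

    # ~9 million doubles is roughly 70MB of RAM (8 bytes each).
    x <- runif(9e6)

    # R stores a numeric vector as one contiguous array of doubles,
    # and sum() iterates over it in compiled C code.
    system.time(sum(x))    # completes in well under a second

Exact timings vary by machine, but the point stands: a single native-code pass over a contiguous array is a different world from a spreadsheet recalculating cell by cell.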