I enjoy looking at new storage technology, measuring its speed and trying to figure out where this new gear fits in a media workflow.
Recently, as I was writing about upgrading my network, I discovered something that is making me change my process when measuring storage speed. (Read about it here.)
Here’s an example. The same storage hardware, the same test run four times each about a minute apart. Write speeds are within 5% of each other. But read speeds varied by up to 100%!
In most cases, the first test is the slowest. Then, as the storage system warms up and disk caches kick in, performance – especially read performance – skyrockets.
I’ve noticed this on a number of performance tests, but this example is a good illustration; and I remembered to retain my test numbers. I mention it here because, while I knew test results varied, I did not realize they varied by this much.
Going forward, when I test for performance, I plan to run at least five tests, then average them to get a better representation of performance. In the past, I would only do two or three. Also, I probably won’t record the first test, because it does not seem to be representative of what the gear can do in real life.
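That averaging approach is simple to automate. Here is a minimal sketch of the idea – the function name and the throughput numbers are hypothetical, standing in for readings you would collect from your own speed-test tool:

```python
import statistics

def summarize_runs(results_mb_s, discard_first=True):
    """Average a series of benchmark runs, optionally discarding the
    first (warm-up) run, which tends not to reflect steady-state speed.

    results_mb_s: throughput readings in MB/s, in the order measured.
    """
    if discard_first and len(results_mb_s) > 1:
        results_mb_s = results_mb_s[1:]  # drop the warm-up run
    return statistics.mean(results_mb_s)

# Hypothetical read-speed readings (MB/s) from five back-to-back tests,
# with a slow first run before caches warm up:
reads = [480, 900, 950, 930, 940]

print(summarize_runs(reads))         # average of runs 2-5: 930.0
print(summarize_runs(reads, False))  # naive average including warm-up: 840.0
```

Including the warm-up run drags the average down noticeably, which is exactly why discarding it gives a fairer picture of real-world performance.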
I found these results interesting and wanted to share them with you to use for your own tests.