How to determine IOPS needs after getting the Transfers/sec data?


http://serverfault.com – Let's say I have gathered Disk Transfers/sec data over a 2x24-hour period, i.e., instantaneous samples taken every 15 seconds. What statistical analysis can/should I apply to the samples if I want to use the data to, for instance, provision storage? Should I simply use the peak value (which occurs less than 1% of the time)? Should I use the mean/average value? Or a formula involving the mean and the standard deviation?
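As a starting point for comparing the options the question lists (peak, mean, or something in between), one common approach is to compute a high percentile of the samples alongside the mean and peak. Below is a minimal sketch in Python; the function name `iops_summary`, the nearest-rank percentile choice, and the sample data are all illustrative assumptions, not part of the original question:

```python
import statistics

def iops_summary(samples, percentile=95):
    """Summarize Transfers/sec samples for capacity planning.

    samples: list of instantaneous Transfers/sec readings (illustrative helper).
    Returns the mean, the peak, and the given nearest-rank percentile.
    """
    s = sorted(samples)
    # Nearest-rank percentile: the smallest sample that is >= the
    # requested fraction of all samples, clamped to a valid index.
    k = max(0, min(len(s) - 1, round(percentile / 100 * len(s)) - 1))
    return {
        "mean": statistics.mean(s),
        "peak": s[-1],
        f"p{percentile}": s[k],
    }

# Hypothetical 15-second Transfers/sec samples with two short bursts
samples = [120, 150, 110, 400, 130, 125, 140, 135, 900, 128]
summary = iops_summary(samples)
```

Comparing `summary["mean"]`, `summary["p95"]`, and `summary["peak"]` makes the trade-off concrete: sizing to the mean under-provisions for bursts, sizing to the peak pays for capacity used less than 1% of the time, and a high percentile sits between the two.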