IOPS (input/output operations per second)

IOPS (input/output operations per second) is the standard unit of measurement for the maximum number of reads and writes to non-contiguous storage locations. IOPS is pronounced EYE-OPS.

IOPS is frequently referenced by storage vendors to characterize performance in solid-state drives (SSD), hard disk drives (HDD) and storage area networks. However, an IOPS number is not an actual benchmark, and numbers promoted by vendors may not correspond to real-world performance.

Along with transfer rate, which measures how fast data can be transferred from contiguous storage locations, IOPS can be used to measure storage performance. While transfer rate is measured in bytes per second, IOPS is expressed as a plain integer count of operations.

As a measurement, IOPS can be compared to the revolutions per minute (rpm) of a car engine. If a vehicle is in neutral, stating that the engine can spin at 10,000 rpm in that moment is meaningless. Likewise, without taking into account the data block size (or I/O size), the read/write mix or the I/O stream, IOPS as a stand-alone measurement says little.

IOPS, latency and throughput explained

Throughput measures how many units of information a system can process in a period of time. It can refer to the number of I/O operations per second, but is typically measured in bytes per second. On their own, IOPS and throughput cannot provide an accurate performance measurement.

Latency measures the time between issuing a request and receiving a response. With regards to IOPS, latency is a measure of the length of time it takes for a single I/O request to be completed from the application's point of view.

While not providing a complete picture, combining latency, IOPS and throughput measurements can help gauge performance.
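The relationships among these three metrics can be sketched with two commonly used approximations: throughput equals IOPS times block size, and (by Little's law) IOPS is roughly the number of outstanding I/Os divided by average latency. This is a simplified model for illustration; real devices deviate from these idealized formulas.

```python
# Illustrative relationships among IOPS, throughput and latency.
# Simplified model for illustration only; real devices deviate.

def throughput_mbps(iops: float, block_size_kb: float) -> float:
    """Throughput (MB/s) = IOPS x block size."""
    return iops * block_size_kb / 1024

def iops_from_latency(queue_depth: int, latency_ms: float) -> float:
    """Little's law: IOPS ~= outstanding I/Os / average latency (s)."""
    return queue_depth / (latency_ms / 1000)

# 10,000 IOPS at a 4 KB block size moves about 39 MB/s; the same
# drive tested with 64 KB blocks would report far fewer IOPS for
# similar throughput -- which is why block size must be stated.
print(throughput_mbps(10_000, 4))    # ~39.06 MB/s
print(iops_from_latency(16, 0.5))    # 32,000 IOPS at queue depth 16
```

Note how the same hardware produces very different IOPS numbers depending on block size and queue depth, which is exactly why a bare IOPS figure needs those parameters attached.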

Measuring IOPS

IOPS is often measured with an open source I/O measurement tool called Iometer, which determines peak IOPS under differing read/write conditions. Measuring both IOPS and latency can help a network administrator predict how much load a network can handle without performance being negatively affected. Intel discontinued work on Iometer in 2006, so some may find the tool dated.

It is possible to calculate IOPS without Iometer, but results will vary depending on the performance category of the workload. In general, there are three types of workload performance: random, sequential and a mixture of the two. RAID can also affect IOPS calculations, since each write operation results in multiple writes to the storage array.
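The RAID effect mentioned above is often modeled with a write-penalty factor. The penalty values below are the commonly quoted figures for each RAID level, not vendor specifications, and the formula is an approximation:

```python
# Effective (front-end) IOPS after applying a RAID write penalty.
# Commonly quoted penalties: RAID 0 = 1, RAID 1/10 = 2, RAID 5 = 4,
# RAID 6 = 6. Actual arrays vary; this is an approximation.

RAID_WRITE_PENALTY = {"raid0": 1, "raid1": 2, "raid10": 2, "raid5": 4, "raid6": 6}

def effective_iops(raw_iops: float, read_pct: float, raid_level: str) -> float:
    """Front-end IOPS a workload sees, given raw back-end IOPS,
    a read fraction (0-1) and a RAID level."""
    penalty = RAID_WRITE_PENALTY[raid_level]
    write_pct = 1.0 - read_pct
    return raw_iops / (read_pct + write_pct * penalty)

# 10,000 raw IOPS, 70% reads, RAID 5: each write costs 4 back-end I/Os,
# so the workload sees only ~5,263 IOPS.
print(round(effective_iops(10_000, 0.70, "raid5")))   # -> 5263
```

A pure-read workload sees no penalty at any RAID level, which is why the read/write mix must accompany any quoted IOPS figure.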

IOPS can be measured using an online IOPS calculator, which determines IOPS based on the drive speed, average read seek time and average write seek time.
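A minimal version of what such a calculator does for an HDD uses the standard approximation IOPS ≈ 1 / (average seek time + average rotational latency), where rotational latency averages half a revolution at the drive's spindle speed:

```python
# Estimate HDD IOPS from spindle speed (rpm) and average seek time.
# Standard approximation: IOPS ~= 1 / (avg seek + avg rotational latency).

def hdd_iops(rpm: int, avg_seek_ms: float) -> float:
    # On average the platter must rotate half a revolution to reach the data.
    rotational_latency_ms = 0.5 * (60_000 / rpm)
    return 1000 / (avg_seek_ms + rotational_latency_ms)

# A 7,200 rpm drive with a 9 ms average seek time:
# rotational latency ~= 4.17 ms, giving roughly 76 IOPS.
print(round(hdd_iops(7200, 9.0)))   # -> 76
```

This mechanical ceiling of a few hundred IOPS for even fast HDDs is the backdrop for the SSD comparison in the next section.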

IOPS in SSDs vs. HDDs

HDDs follow the standard seek-time equation for determining IOPS, but SSDs behave differently. With HDDs, IOPS depends on mechanical seek time; with SSDs, it depends primarily on the device's internal controller. SSD performance also changes over time, peaking early on. However, even after it drops into a steady state, an SSD still outperforms an HDD in terms of IOPS. HDDs also grapple with higher latency and longer read/write times.

Do IOPS numbers matter?

Despite being touted by storage vendors, it's questionable how much IOPS as a measurement matters. Depending on the workload, the numbers can vary wildly, so grading performance based solely on IOPS makes little sense.

Because IOPS numbers are affected by the size of the data block and workload performance, it's unlikely that vendors will use standardized variables when listing IOPS. Even if a standard system were used to determine IOPS, with a set block size and read/write mix, that number means nothing unless it matches up to a specific workload.

This was last updated in December 2016
