This is a great observation, and it made me do some math:
If my point of comparison is something like a Seagate Ironwolf 4TB vs a WD Ultrastar 4TB:
Seagate Ironwolf:
- 3.7W * 24 hours/day * 365 days/year = 32.4 kWh per year * $0.18/kWh = $5.83 per year in power usage * 12 disks in an array = ~$70 per year
*Edit: Looking at this closer, a more reasonable comparison would be an Ironwolf PRO disk, since this is a NAS use-case (24/7 run time, large and repeated reads and writes, etc.). The power consumption for that is 5.5W, which is a lot closer to the Ultrastar.*
WD Ultrastar:
- 7W * 24 hours/day * 365 days/year = 61.3 kWh per year * $0.18/kWh = $11.04 per year in power usage * 12 disks in an array = ~$132 per year
Seems like I'd save maybe $60 per year. That extra cost might even be justifiable if the enterprise drives are half as likely to fail (the Seagate Ironwolf has an AFR of 0.87%, the WD Ultrastar 0.44%).
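For anyone who wants to plug in their own wattages or electricity rate, the back-of-envelope math above can be sketched as a tiny script (the $0.18/kWh rate and 12-disk array size are the assumptions from my comment, not universal figures):

```python
# Yearly power cost for an array of drives, using the assumptions above:
# $0.18/kWh electricity and a 12-disk array. Swap in your own numbers.
RATE_PER_KWH = 0.18   # assumed electricity rate, $/kWh
DISKS = 12            # assumed array size
HOURS_PER_YEAR = 24 * 365

def yearly_cost(watts, disks=DISKS, rate=RATE_PER_KWH):
    """Annual power cost in dollars for `disks` drives drawing `watts` each, 24/7."""
    kwh_per_drive = watts * HOURS_PER_YEAR / 1000
    return kwh_per_drive * rate * disks

ironwolf = yearly_cost(3.7)   # Ironwolf at 3.7 W
ultrastar = yearly_cost(7.0)  # Ultrastar at 7 W
print(f"Ironwolf: ${ironwolf:.2f}/yr, Ultrastar: ${ultrastar:.2f}/yr, "
      f"difference: ${ultrastar - ironwolf:.2f}/yr")
```

This makes it easy to see that the gap mostly disappears with the Ironwolf PRO's 5.5 W figure, since the cost scales linearly with wattage.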