SRAM leakage is a significant fraction of the total power consumption of a chip. Scaling down the SRAM supply voltage reduces leakage power, but it increases the stored-data failure rate (e.g., due to soft errors). Accordingly, this work studies SRAM leakage power reduction under a data-reliability constraint, using system-level design techniques such as error correction, supply voltage reduction, and data refresh (scrubbing). A probabilistic framework is used to model failure mechanisms such as soft errors and process variations, with error probability as the metric for the data-failure rate. Error models that combine multiple SRAM cell failure mechanisms are developed. Using these error models, system-level minimization of leakage power subject to a fixed data error-probability requirement is studied. Circuit-level simulation results and leakage power reduction estimates for a 90nm CMOS technology are presented.
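The constrained optimization described above can be illustrated with a toy numerical sketch: leakage falls as the supply voltage is lowered, the per-cell failure probability rises, and error correction relaxes how low the voltage can go while still meeting an error-probability target. All model forms and parameter values below are illustrative assumptions for exposition only, not results or models from this work.

```python
import math

def cell_fail_prob(vdd):
    """Assumed per-cell failure probability: grows as Vdd drops (toy fit)."""
    return 1e-9 * math.exp(4.0 * (1.0 - vdd))

def leakage_power(vdd):
    """Assumed leakage model: roughly exponential in Vdd (arbitrary units)."""
    return 1e-6 * math.exp(3.0 * vdd)

def word_fail_prob(p, n=72, t=1):
    """Word failure probability with an ECC correcting up to t of n bits,
    assuming independent cell failures (binomial tail beyond t errors)."""
    return sum(math.comb(n, k) * (p ** k) * ((1 - p) ** (n - k))
               for k in range(t + 1, n + 1))

def min_vdd(target=1e-13, lo=0.4, hi=1.0, steps=200):
    """Lowest Vdd on a sweep whose word error probability meets the target."""
    for i in range(steps + 1):
        v = lo + (hi - lo) * i / steps
        if word_fail_prob(cell_fail_prob(v)) <= target:
            return v
    return hi

v = min_vdd()
print("min Vdd:", round(v, 3), "leakage:", leakage_power(v))
```

Under these toy assumptions the minimizer lands at an interior supply voltage: below it, the ECC-protected word error probability exceeds the target; above it, leakage is needlessly high.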