The power-gating (PG) technique is widely used in modern digital ICs to reduce standby leakage power during idle periods. The virtual supply voltage (VVDD) of a power-gated circuit is a function of the strength of the PG device and the total current flowing through it. As a result, VVDD is susceptible to 1) bias temperature instability (BTI) degradation, which weakens the PG device, and 2) temperature variation, which affects the active leakage current (and thus the total current) of the IC. To account for BTI degradation, the PG device must be upsized to guarantee the minimum VVDD level that prevents any timing failure over the chip lifetime. Moreover, the PG device is also sized for the worst-case voltage drop, which is partly caused by the large active leakage current at high temperature. Sizing the PG device for both effects leads to a higher VVDD (and thus higher active leakage power) than necessary at low temperature in early chip lifetime. In this paper, to minimize the leakage power due to the aforementioned effects, we propose two techniques that adjust the PG device strength (and thus VVDD) at runtime based on the usage of the PG device and the temperature of the IC. The techniques are applied to an experimental setup modeling the total current of an IC in 32nm technology, and their efficacy is demonstrated in the presence of within-die process and temperature variations. On average, the proposed techniques reduce active leakage power by up to 10% in early chip lifetime.
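The dependence of VVDD on PG-device strength and total current described above can be illustrated with a simple first-order model, treating the PG device as a resistance between VDD and the virtual rail. This is only an illustrative sketch; every numeric value (supply voltage, resistances, leakage currents, aging factor, temperature rule of thumb) is an assumption for demonstration, not data from this work.

```python
# First-order model: the PG device acts as a resistance r_pg between VDD
# and the virtual rail, so VVDD = VDD - i_total * r_pg.
# All numbers below are illustrative assumptions, not figures from the paper.

VDD = 1.0  # nominal supply voltage (V), assumed


def vvdd(r_pg, i_total):
    """Virtual supply voltage for PG resistance r_pg (ohm), current i_total (A)."""
    return VDD - i_total * r_pg


def aged_resistance(r_fresh, aging_factor):
    # BTI degradation weakens the PG device over lifetime: resistance grows.
    return r_fresh * (1.0 + aging_factor)


def leakage_current(i_ref, temp_c, t_ref_c=25.0, doubling_c=20.0):
    # Leakage roughly doubles every `doubling_c` degrees (a common rule of thumb).
    return i_ref * 2.0 ** ((temp_c - t_ref_c) / doubling_c)


i_ref = 0.010     # active leakage current at 25 C (A), assumed
vvdd_min = 0.95   # minimum VVDD that avoids timing failures (V), assumed

# Conventional sizing targets the worst case: end-of-life aging (+20%,
# assumed) combined with high-temperature leakage (100 C, assumed).
worst_i = leakage_current(i_ref, 100.0)
r_sized = (VDD - vvdd_min) / (worst_i * 1.2)

# Early in life at low temperature, VVDD sits well above vvdd_min --
# the excess (and its leakage cost) is what runtime adjustment reclaims.
early_cool = vvdd(aged_resistance(r_sized, 0.0), leakage_current(i_ref, 25.0))
late_hot = vvdd(aged_resistance(r_sized, 0.2), leakage_current(i_ref, 100.0))
print(f"early life, 25 C:  VVDD = {early_cool:.3f} V")
print(f"end of life, 100 C: VVDD = {late_hot:.3f} V")
```

Under these assumptions the worst-case-sized PG device lands VVDD exactly at the minimum only at end of life and high temperature; in early life at 25 C it overshoots, which is the headroom the proposed runtime techniques exploit.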