As technology scales, the aging effect caused by Negative Bias Temperature Instability (NBTI) has become a major reliability concern for circuit designers. Consequently, substantial research effort has been devoted to NBTI analysis and mitigation techniques. At the same time, reducing leakage power remains one of the major design goals. Both NBTI-induced circuit degradation and standby leakage power depend strongly on the circuit's input patterns. In this paper, we propose a co-simulation flow to study NBTI-induced circuit degradation and leakage power, taking into account the different behaviors during circuit active and standby time. Based on this flow, we evaluate the efficacy of the Input Vector Control (IVC) technique in mitigating circuit aging and reducing standby leakage power, with experiments on benchmark circuits implemented at the 90nm, 65nm, and 45nm technology nodes. IVC proves effective in mitigating NBTI-induced circuit degradation, reducing performance degradation by up to 56% at the 65nm technology node and by 30% on average across technology nodes. Meanwhile, IVC can reduce the worst-case leakage power by up to 18%. Since leakage power and NBTI-induced circuit degradation have different dependencies on the input patterns, we propose deriving Pareto sets that let designers explore trade-offs between lifetime reliability and leakage power.