Ensuring reliable operation under increasing temperature and process variations has emerged as a major design concern at the nanometer scale. Adaptive systems that can dynamically change their operating conditions (voltage, frequency, etc.) to accommodate temperature and process variations have emerged as a promising solution. Such a system, however, requires efficient design techniques that can detect these variations and calibrate the system dynamically to adapt to them. In this paper, we propose a low-overhead design technique that helps an adaptive system accurately estimate temperature-induced delay variations and adapt to them, thereby avoiding an overly conservative design approach with respect to temperature variations. The principal idea is to choose the most temperature-sensitive timing paths in a circuit, dynamically transform them into a ring oscillator during calibration, and use the frequency of the ring oscillator to estimate the operating frequency of the system. Simulation results demonstrate the effectiveness of the technique for a set of ISCAS89 benchmark circuits, with less than 3% error in delay estimation while incurring average overheads of 1.7% in delay, 3.4% in area, and 0.2% in power.
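The relationship between a ring-oscillator reading and the estimated system frequency can be sketched as follows. This is a conceptual illustration only, not the paper's implementation: the function name, the guard-band parameter, and the two-traversals-per-period assumption (one rising and one falling transition of the path per oscillation) are all assumptions made here for exposition.

```python
def estimate_operating_frequency(ro_frequency_hz: float, guard_band: float = 0.05) -> float:
    """Estimate a safe clock frequency from the oscillation frequency of a
    timing path that has been closed back on itself as a ring oscillator.

    Assumption: one ring-oscillator period covers two traversals of the
    path (a rising and a falling transition), so
        path_delay ~= 1 / (2 * f_RO).
    A guard band is then applied before setting the system clock.
    """
    path_delay_s = 1.0 / (2.0 * ro_frequency_hz)
    return (1.0 - guard_band) / path_delay_s


# Example: a 500 MHz ring-oscillator reading implies a ~1 ns path delay,
# giving a guard-banded clock estimate just below 1 GHz.
f_clk = estimate_operating_frequency(500e6)
```

In a real calibration flow, the ring-oscillator count would come from an on-chip counter sampled over a fixed reference interval, and the guard band would be chosen to cover measurement and modeling error.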