In signal processing applications, large energy gains can be obtained by accepting some degradation in output signal quality. Filters are at the core of many such systems. In this paper, we demonstrate the potential of a new paradigm for achieving favorable quality-energy trade-offs in digital filter design, based on directly accepting timing errors in the datapath under aggressively scaled VDD. In an unmodified design, such scaling leads to a rapid onset of timing errors and, consequently, quality loss. In a modified filter implementation, the onset of large errors is delayed, permitting significant energy reduction while maintaining high quality. Specifically, the design innovations include techniques for: 1) run-time adjustment of datapath bitwidth, and 2) design-time reordering of filter taps. We tested the new design strategy on several audio and image processing applications. The designs were synthesized using a 45nm standard cell library. SPICE simulations of the complete designs show that up to 70% energy savings can be achieved while maintaining excellent perceived signal-to-noise ratios (SNRs). Compared to a traditional filter design, the area overhead of our architecture is about 2%.
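The quality-energy knob described above can be illustrated with a minimal software model. The sketch below is purely illustrative and is not the paper's hardware implementation: it emulates a reduced-bitwidth FIR datapath by truncating each partial product to a chosen number of fractional bits (standing in for run-time bitwidth adjustment), and measures the resulting output SNR against a full-precision reference. The function names and the floor-truncation quantization model are assumptions for the sake of the example.

```python
import numpy as np

def fir_reduced_bitwidth(x, taps, frac_bits):
    """FIR filter with each partial product truncated to `frac_bits`
    fractional bits, a software stand-in for dropping datapath LSBs.
    (Illustrative model only, not the paper's RTL.)"""
    scale = 2.0 ** frac_bits
    y = np.zeros(len(x))
    for n in range(len(x)):
        acc = 0.0
        for k, h in enumerate(taps):
            if n - k >= 0:
                # Floor-truncate the product to the reduced precision.
                acc += np.floor(h * x[n - k] * scale) / scale
        y[n] = acc
    return y

def snr_db(ref, approx):
    """Output SNR (dB) of the reduced-precision filter vs. full precision."""
    err = ref - approx
    return 10.0 * np.log10(np.sum(ref ** 2) / np.sum(err ** 2))

# Example: a sinusoid through a small low-pass filter at two bitwidths.
x = np.sin(0.3 * np.arange(64))
taps = np.array([0.22, 0.45, 0.22, 0.08])
ref = np.convolve(x, taps)[:len(x)]      # full-precision reference
snr_4 = snr_db(ref, fir_reduced_bitwidth(x, taps, 4))
snr_12 = snr_db(ref, fir_reduced_bitwidth(x, taps, 12))
```

Widening the datapath (here, 12 versus 4 fractional bits) raises the output SNR at the cost of more switched capacitance per sample, which is the trade-off the run-time bitwidth adjustment exploits.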