As device sizes shrink, device-level manufacturing challenges have led to increased variability in physical circuit characteristics. Exponentially increasing circuit density has not only raised concerns about the reliable manufacturing of circuits, but has also exacerbated variations in dynamic circuit behavior. The resulting uncertainty in performance, power, and reliability, imposed by compounding static and dynamic non-determinism, threatens to halt the continuation of Moore's law, which has arguably been the primary driving force behind technology and innovation for decades. As the marginal benefits of technology scaling continue to diminish, a new vision for stochastic computing has begun to emerge. Rather than hiding variations under expensive guardbands, designers have begun to relax traditional correctness constraints and deliberately expose hardware variability to higher levels of the compute stack, tapping into potentially significant performance and energy benefits while exploiting software and hardware error resilience to tolerate the resulting errors. In this paper, we present our vision for design-, architecture-, compiler-, and application-level stochastic computing techniques that embrace errors in order to ensure the continued viability of semiconductor scaling.