Semiconductor devices are aggressively scaled each technology generation to achieve high integration density, while the supply voltage is scaled to lower the switching energy per device. To maintain high performance, however, the transistor threshold voltage (Vth) must be scaled commensurately, and Vth scaling causes an exponential increase in the subthreshold leakage current. Aggressive device scaling in the nanometer regime not only increases subthreshold leakage but also has other negative impacts, such as increased drain-induced barrier lowering (DIBL), Vth roll-off, a reduced on-current to off-current ratio, and increased source-drain resistance. DIBL strengthens the dependence of Vth on channel length: a small variation in channel length can produce a large Vth variation, making device characteristics unpredictable.

To suppress these short-channel effects (SCE), oxide thickness scaling and higher, nonuniform channel doping must be incorporated as devices are scaled into the nanometer regime. The International Technology Roadmap for Semiconductors (ITRS) predicts a gate oxide thickness of 1.2 to 1.6 nm for sub-100 nm CMOS. Such a thin oxide gives rise to a high electric field, resulting in considerable direct tunneling current. This current invalidates the classical infinite-input-impedance assumption for metal-oxide-semiconductor (MOS) transistors and thus severely affects circuit performance. Higher doping produces a high electric field across the p-n junctions (source-substrate and drain-substrate), which causes significant band-to-band tunneling (BTBT) of electrons from the valence band of the p-region to the conduction band of the n-region. The peak halo doping (P+) is therefore restricted so that the BTBT component remains small compared with the other leakage components.
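The exponential dependence of subthreshold leakage on Vth can be sketched with the standard subthreshold conduction model, I ∝ exp((Vgs − Vth)/(n·vT)), where vT = kT/q is the thermal voltage. This is a minimal illustration, not a model from the chapter; the subthreshold slope factor n = 1.5 and the prefactor i0 are illustrative assumptions, and the DIBL and drain-voltage terms are omitted.

```python
import math

def subthreshold_leakage(vgs, vth, n=1.5, temp=300.0, i0=1e-7):
    """Off-state subthreshold current (A) from the simple exponential
    model I = i0 * exp((Vgs - Vth) / (n * vT)).

    n and i0 are illustrative assumptions; DIBL and the
    (1 - exp(-Vds/vT)) drain term are ignored for clarity.
    """
    k_over_q = 8.617e-5          # Boltzmann constant / electron charge, V/K
    v_thermal = k_over_q * temp  # thermal voltage, ~25.9 mV at 300 K
    return i0 * math.exp((vgs - vth) / (n * v_thermal))

# Lowering Vth by 100 mV (0.4 V -> 0.3 V) multiplies the off-state
# (Vgs = 0) leakage by exp(0.1 / (n * vT)) -- roughly an order of magnitude.
ratio = subthreshold_leakage(0.0, 0.3) / subthreshold_leakage(0.0, 0.4)

# Equivalent subthreshold swing S = n * vT * ln(10), in V per decade of current.
swing = 1.5 * 8.617e-5 * 300.0 * math.log(10)
```

With these assumed parameters the swing is about 89 mV/decade, so each 100 mV of Vth reduction buys performance at the cost of roughly a 13x increase in subthreshold leakage, which is the trade-off the passage describes.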
Original language: English (US)
Title of host publication: Low-Power Electronics Design
State: Published - Jan 1 2004
Bibliographical note: Publisher Copyright © 2005 by CRC Press LLC.