We present anchored multi-touch, a technique for extending multi-touch interfaces with gestures that combine multi-touch surface input and 3D movement of the hand(s) above the surface. These interactions offer nearly the same potential for rich, expressive input as freehand 3D interactions, with the added advantage that the passive haptic feedback provided by the surface makes them easier to control. In addition, anchored multi-touch is particularly well suited to working with 3D content on stereoscopic displays. This paper contributes two example applications: (1) an interface for navigating 3D datasets, and (2) a surface-bending interface for freeform 3D modeling. Two methods for sensing the gestures are introduced, one of which employs a depth camera.