An algorithm for the robust detection and recognition of gestures for interaction between humans and a domestic floor-cleaning robot is presented. The gestures are selected through a user study in which participants are asked to show natural gestures to the robot in given interaction scenarios. The gestures selected are those repeated by the majority of participants and comprise both commanding gestures (e.g., start cleaning) and social interaction gestures (e.g., greeting). The gesture recognition algorithm is developed using a combination of robust angular, positional, and directional features. The frontal and sagittal planes of the human body are identified, and invariant angular features are extracted from the skeletal data of a Kinect sensor. Robust positional and directional features are extracted by skeletal reconstruction using the invariant angular features and link rotation matrices. Dynamic time warping of the features makes the algorithm robust to gesturing speed. Gestures are detected and recognized by calculating multiclass probability estimates using the pairwise coupling method. The algorithm achieved 97.26% recognition accuracy on a ten-class robot-commanding gesture database collected from multiple subjects.
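The dynamic time warping step mentioned above can be sketched as follows. This is a minimal illustration of standard DTW over frame-wise feature vectors, not the paper's exact implementation; the function name and the Euclidean local cost are illustrative assumptions.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two feature sequences.

    seq_a, seq_b: arrays of shape (T, D) -- T frames of D-dimensional
    features (e.g., joint angles per frame; illustrative assumption).
    Warping the time axis makes the comparison robust to differences
    in gesturing speed.
    """
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two frames' feature vectors
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]
```

A gesture performed slowly (each frame duplicated) then yields the same distance to the template as the original-speed gesture, which is the robustness property the abstract refers to.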