A novel algorithm based on biologically inspired features is proposed for hand posture recognition. The C2 Standard Model Features of the hand images are extracted using a computational model of the ventral stream of the visual cortex. The features are extracted so as to maximize discrimination between classes: during the training phase, class-specific C1 image patches that discriminate well between classes are identified, and the features of test images are then extracted using these patches. Classification is performed by comparing the extracted C2 features. The algorithm needs only one image per class for training, and the simple classification phase makes it computationally efficient overall. The algorithm is implemented in real time for interaction between a human user and a virtual character, Handy, using ten classes of hand postures. Handy responds to the user by symbolically expressing the identified posture and by pronouncing the class number. The system is programmed in C# on the Windows platform. Images are captured with a webcam, and the presence of a hand is detected by skin-color segmentation in the YCbCr color space. A graphical user interface is provided to train the algorithm, display intermediate and final results, and present the virtual character. The algorithm achieved a recognition accuracy of 97.5% when tested on 20 hand postures per class, performed by different persons, with large variations in the size and shape of the posture and under different lighting conditions. These results show that the algorithm is robust to such variations.
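The comparison-based classification described above can be sketched as a nearest-prototype rule: one C2 feature vector is stored per class (from the single training image), and a test image is assigned to the class whose stored vector is closest. This is a minimal illustration, assuming C2 features are fixed-length vectors; `classify_by_c2` and the toy prototypes are hypothetical names, and the actual C2 extractor is not shown.

```python
import numpy as np

def classify_by_c2(test_features, class_prototypes):
    """Return the class whose stored C2 feature vector is closest
    (Euclidean distance) to the test image's C2 feature vector."""
    best_class, best_dist = None, float("inf")
    for label, proto in class_prototypes.items():
        dist = np.linalg.norm(test_features - proto)
        if dist < best_dist:
            best_class, best_dist = label, dist
    return best_class

# Toy demo with synthetic 4-D "C2" vectors standing in for three posture
# classes; real C2 features would come from the HMAX-style extractor.
prototypes = {
    1: np.array([1.0, 0.0, 0.0, 0.0]),
    2: np.array([0.0, 1.0, 0.0, 0.0]),
    3: np.array([0.0, 0.0, 1.0, 0.0]),
}
print(classify_by_c2(np.array([0.9, 0.1, 0.0, 0.0]), prototypes))  # closest to class 1
```

Because only one prototype per class is stored and compared, the classification phase stays inexpensive, which is consistent with the abstract's claim of computational efficiency.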
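The hand-detection step relies on skin-color segmentation in the YCbCr space, where skin tones cluster in the chrominance (Cb, Cr) channels largely independently of luminance. A minimal sketch follows; the Cb/Cr bounds shown are commonly used values and are an assumption, since the abstract does not give the exact thresholds.

```python
import numpy as np

def skin_mask(ycbcr):
    """Boolean mask of candidate skin pixels from a YCbCr image (H x W x 3).
    Thresholds on the chrominance channels are assumed, not taken from
    the paper: 77 <= Cb <= 127 and 133 <= Cr <= 173."""
    cb = ycbcr[..., 1].astype(np.int32)
    cr = ycbcr[..., 2].astype(np.int32)
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

# Toy check: the first pixel falls inside the skin Cb/Cr ranges,
# the second does not.
img = np.array([[[120, 100, 150], [200, 30, 200]]], dtype=np.uint8)
print(skin_mask(img))
```

Thresholding only Cb and Cr (ignoring Y) is what gives this kind of detector some tolerance to the lighting variation the abstract reports.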