This paper presents a new vision-based method for real-time assessment of the upper-body posture of a subject sitting at a desk studying or operating a computer. Unlike most existing vision-based methods, which perform offline assessment on human skeletons extracted from RGB video or depth maps, the proposed method directly analyses single images captured by a webcam in front of the subject, avoiding the error-prone step of extracting skeleton data from images or depth maps. To this end, the paper proposes to assess postures by classifying them into predefined classes, without explicitly measuring the variables required for calculating risk scores. Each posture class is associated with a configuration of the upper body and assigned an ergonomics risk score according to an established scoring method, e.g. Rapid Upper Limb Assessment (RULA). To evaluate the proposed method quantitatively, a data set of upper-body postures was collected that covers the various scenarios in which a subject sits in front of a desk, as well as some extreme cases in which the subject turns away from the desk. On this data set, the proposed method achieved an average accuracy of 99.5% for binary classification (low- vs. high-risk postures), 88.2% for classification into 19 risk levels and 81.5% for classification into 30 risk levels, and a demo built on the method runs in real time on a regular computer.