Development of a haptic-rendered virtual model for automatic gear assembly is reported. The overall objective of the project is to develop a new paradigm for programming robotic manipulators that perform complex constrained-motion tasks. A human operator uses the virtual model to perform the task in a virtual environment. The position, contact-force, and torque data generated in the virtual environment, combined with a priori knowledge about the task, are used to identify and learn the skills in the newly demonstrated task, which are then used to perform the assembly with a robotic manipulator. The meshing engagement of two metric-module spur gears is used as a case study. The gears are modelled using the virtual proxy algorithm. The operator's control traces are recorded from the virtual environment and are used by machine learning algorithms to acquire the manipulation skills. The development platform consists of a six-degree-of-freedom PHANToM Premium 1.5 and the Reachin API. An overview of the whole project is provided, the development carried out to construct the haptic-rendered virtual model is described, and progress to date is reported.
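As background for the rendering method named above, the core of proxy-based (god-object) haptic rendering can be sketched for a single planar constraint: the proxy point is kept on the constraint surface while the device point penetrates it, and the rendered force is a virtual spring pulling the device toward the proxy. This is a minimal illustrative sketch, not the paper's gear-surface implementation; the function name and stiffness value are assumptions.

```python
import numpy as np

def proxy_update(device_pos, plane_point, plane_normal, stiffness=0.5):
    """One update step of a minimal proxy-based haptic rendering scheme
    for a single plane constraint (illustrative sketch only).

    Returns the new proxy position and the spring force to render
    on the haptic device.
    """
    n = plane_normal / np.linalg.norm(plane_normal)
    # Signed distance of the device point from the plane.
    d = np.dot(device_pos - plane_point, n)
    if d >= 0.0:
        # Free motion: the proxy coincides with the device, no force.
        return device_pos.copy(), np.zeros(3)
    # Contact: hold the proxy on the surface by projecting the
    # penetrating device point back along the plane normal.
    proxy = device_pos - d * n
    # Render a spring force pulling the device toward the proxy.
    force = stiffness * (proxy - device_pos)
    return proxy, force
```

In a full implementation the proxy would also slide along the surface under friction and track multiple gear-tooth constraints, but the contact/no-contact split and the spring coupling shown here are the essence of the technique.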