This paper proposes perceptual pruning, a technique for improving the scalability of immersive video conferencing (IVC) systems. The aim of perceptual pruning is to enable servers to reduce the bit rate of each video on the fly in response to changes in its virtual distance and orientation relative to the receiving client. It is shown that each participant's video undergoes a varying degree of texture-to-pixel mapping before it is rendered by the client. By reducing the resolution of the video at the scale of individual transform blocks in response to the nuances of this mapping, the bit rate can be lowered without any noticeable degradation of quality. This paper presents the results of experimental tests, as well as real-world scalability tests, that quantify the impact of perceptual pruning. It is concluded that perceptual pruning offers significant benefit, especially when the number of videos in the client's viewport is large.
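To illustrate the intuition behind texture-to-pixel mapping, the sketch below estimates how many screen pixels a participant's video panel actually covers after perspective projection, which bounds the resolution worth transmitting. The function name, the simple pinhole-projection model, and the cosine falloff for oblique viewing are illustrative assumptions, not the paper's exact formulation.

```python
import math

def pruned_resolution(src_w, src_h, distance, yaw_deg,
                      screen_h=1080, panel_h=1.0, vfov_deg=60.0):
    """Illustrative sketch (not the paper's method): estimate the
    on-screen pixel footprint of a video panel placed at a given
    virtual distance and viewing angle, then cap the delivered
    resolution at that footprint."""
    # Vertical pixels covered by the panel under a pinhole projection
    # with the given vertical field of view.
    pixels_h = (panel_h /
                (2 * distance * math.tan(math.radians(vfov_deg) / 2))) * screen_h
    # Oblique viewing shrinks the horizontal footprint; a cosine
    # falloff is a common first-order assumption.
    pixels_w = pixels_h * (src_w / src_h) * abs(math.cos(math.radians(yaw_deg)))
    # Never request more pixels than the source provides.
    return (min(src_w, max(1, round(pixels_w))),
            min(src_h, max(1, round(pixels_h))))
```

Under this model, a distant or sharply angled participant maps to far fewer screen pixels than a near, frontal one, so most of its source resolution (and hence bit rate) can be pruned with no visible loss.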