Computer science (CS) subjects have rapidly grown in popularity, and demand for CS education and training has placed increasing pressure on teaching resources in higher education (HE) and elsewhere. HE in the People's Republic of China (PRC) has also been developing, and one product of this evolution is the Sino-foreign HE institution (SfHEI). Much of the growth in CS's popularity can be linked to the growth of CS-based technology and innovation, especially Artificial Intelligence (AI) and Machine Learning (ML). AI/ML-based innovation has been forecast to improve consumers' quality of life. However, AI/ML systems pose a challenge for software quality assurance (SQA): they are so-called 'untestable systems', for which determining the correctness of system outputs or behaviour may not be feasible, a situation known as the oracle problem. Preparing SQA professionals to assure the quality of AI/ML systems will require innovative and creative education and training. Metamorphic testing (MT) is an SQA approach with a proven track record of alleviating the oracle problem, and has great potential as a testing methodology for AI/ML systems. Metamorphic exploration (ME) is a recent addition to the MT literature, and involves developing the user's understanding of the system under study. This paper reports on experiences at an SfHEI of using ME and MT to test an AI/ML system.
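To make the idea concrete, MT checks relations that should hold between the outputs of related inputs, rather than checking any single output against a known-correct value. The following minimal Python sketch (an illustration of the general technique, not an example taken from the paper) uses the metamorphic relation sin(x) = sin(pi - x): the relation can be verified without knowing the true value of sin(x), which is how MT sidesteps the oracle problem.

```python
import math

def mt_check_sine(x, tol=1e-9):
    # Metamorphic relation: sin(x) == sin(pi - x).
    # We compare two outputs of the function under test against each other,
    # so no oracle (known-correct value of sin(x)) is required.
    return abs(math.sin(x) - math.sin(math.pi - x)) <= tol

# Apply the relation across a range of source test cases;
# any entry in `violations` would indicate a potential defect.
violations = [x for x in (0.1 * i for i in range(100)) if not mt_check_sine(x)]
```

The same pattern generalises to AI/ML systems, where relations such as "a label-preserving transformation of the input should not change the classifier's prediction" play the role that the sine identity plays here.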