Generating complete, high-quality question sets (the question, answer, and distractors) for reading comprehension tasks is both challenging and rewarding. This paper proposes a question-distractor joint generation framework (QDG) that automatically generates both questions and distractors given a background text and a specified answer. Our work makes it possible to assemble complete multiple-choice reading comprehension questions that can be readily applied in educators’ work. While question generation and distractor generation have each been studied independently, joint question-distractor generation has received little attention. In prior joint generation approaches, distractors could only be constructed by first generating questions and then ranking candidate answers with similar wording; generating question-distractor pairs in an end-to-end, unified joint fashion was not possible. To the best of our knowledge, we are the first to propose an end-to-end question-distractor joint generation framework on the RACE dataset. We observe that distractors are often related to certain parts of the background article; by suppressing those related parts, the generated questions can focus more closely on the parts relevant to the correct answer. Experimental results show that the model achieves substantial improvements on the question-distractor pair generation task and outperforms baselines on question generation. For further evaluation, we also conduct a human evaluation, which demonstrates the educational value of our model in generating high-quality question-distractor pairs.