For the past decade the general notion of ‘digital natives’ has attracted considerable attention in both academia and the popular media. While proponents of the idea use a variety of labels, such as ‘Net Generation’ or ‘millennial learners’, the claim they make is essentially the same: younger generations have grown up with digital technologies as part of their everyday worlds and so behave and think differently from older generations, who encountered these technologies later in life (Howe & Strauss, 2000; Palfrey & Gasser, 2008; Prensky, 2001; Tapscott, 1998, 2008). This claim has led to the argument that supposedly old-fashioned teachers and outdated education systems are failing to meet the needs of these younger generations of learners (e.g., Prensky, 2010; Tapscott, 1999, 2008). This chapter critically examines the idea of ‘digital natives’ by identifying findings from research that can shed light on questions about young people’s aptitude for and interest in digital technologies. We analyze the key features of claims about digital natives and consider their possible implications for education and educational research. In short, we argue that most claims made about digital natives lack a rigorous and transparent empirical basis and do little to advance educational thinking or policy. It is time, we argue, to move this debate on, not simply to more nuanced versions of the idea, such as ‘digital wisdom’ (Prensky, 2009), but beyond the very notions and ways of thinking that underpin claims made about digital natives. Indeed, we suggest that moving on from the grounds of this debate is necessary to provide firmer foundations for educational technology research as a serious intellectual field, and to prevent the field from becoming akin to an article of faith.