Opinion: Robots should help, not replace, early-childhood educators

The new Australian Curriculum: Digital Technologies has arrived, shifting the conversation from whether to include computing in the curriculum to how best to teach it. Schools are now considering how to incorporate science, technology, engineering and mathematics (STEM) across all year levels, with computer coding (programming) the major focus for the T in STEM. Consideration is even being given to teaching coding in early childhood: coding blocks, simple languages and apps are available for young children to learn the essence of coding.

Simple robots are already being used to introduce students to technology and to teach essential computational concepts. In the future, however, robots could be used in the early-childhood arena in a much more interesting way. We suggest that placing robots in the classroom could help both teachers and students achieve better learning outcomes, provided that their designs continue to advance and overcome some current obstacles.

We believe the robots of future classrooms need to work with teachers, not replace them. Unfortunately, the possibility of robots co-operating with teachers receives less attention than the prospect of robots replacing them. Some zealots advocate dispensing with teachers altogether. For example, the Hole in the Wall project pioneered by Sugata Mitra suggested that students could learn completely unsupervised.

Robo-benefits

Here are three ways robots could assist teachers in the classroom. First, a robot or computer can provide differentiated instruction, tailoring it to individual needs. One example is the NAO robot, developed by Aldebaran Robotics and distributed in Australia by The Brainary. The NAO has a suite of games designed for children with autism. It works well with these children because it is completely predictable and does not tire of repetitive behaviour, and its expressions of emotion are easy for autistic children to read. Having the robot engage with these children separately could allow them to stay in the classroom with the other students, without the teacher constantly having to give them attention.

Second, engaging with a robot can help students maintain their attention, which is a prerequisite for learning. A current project between Swinburne University of Technology and the Victorian Paediatric Rehabilitation Service has programmed a NAO robot to do exercises. The robot is deployed by physiotherapists to work with children at the hospital and motivates them to do their exercises. Initial results are promising: children are indeed motivated to engage better with their exercises. Similar levels of attention could help children in early-childhood classrooms.

Third, a robot can act as eyes around the classroom, to help a teacher monitor students. A challenge teachers face in classrooms of 25 or more students is how to be aware of all that is happening, especially when students are working in small groups. Having a roving, co-operating robot in the early-learning centre observing the students and notifying the teacher when extra help is needed would be an advantage.

Robo-challenges

What are the challenges that need to be addressed to allow a robot to augment the capabilities of a good teacher?

The first is communication and the human-computer interface. Currently, two-way interactions between robots and students are not possible in the close-to-naturalistic ways that would allow productive learning exchanges with young children. This is because natural language processing – the ability of a robot to understand our spoken language – is not yet good enough.

There have certainly been advances in natural language processing over the years. In the late 1990s, for example, the speech-recognition software Dragon NaturallySpeaking needed extensive training before it could recognise your speech and turn your dictation into text. More recent versions of Dragon require less training and can be attuned to particular accents. They make fewer mistakes, but they still make nonsensical errors that would be a severe impediment in the classroom. Apple’s Siri and the other digital assistants on our phones can recognise our speech and are constantly improving, but they too are still far from flawless. This lack of proficiency in recognising speech, even when we are precise in our diction and pronunciation, is a major limiting factor in creating an intelligent system that interacts like a teacher. Responses also need to be instantaneous with young children, or their attention may stray.

Another barrier to communication between humans and robots is that computers are poor at recognising symbolic representations such as handwriting and mathematical notation. Again, there are no systems that can, like a human teacher, look at a student’s attempt at solving an algebraic equation or their diagrammatic explanation of photosynthesis and make sense of it. While a robot might get by in some topics without the ability to recognise written or drawn symbols, it would be a serious limitation in teaching science, for example. Similarly, robots cannot currently look at an arbitrary picture or photograph and know what it represents, which would impose major constraints on working with young children, given that communicating about pictures is part of current approaches to good teaching.

A third barrier is the limited ability of robots to detect and understand human emotions. We know that the emotional state of the learner is a key component of effective learning. If a student is not emotionally ready to learn because they are agitated, worried, depressed or upset, then productive learning is much less likely. Progress has been made in robots recognising emotions by sensing the tone of a person’s voice and the expression on their face. However, facial expressions vary across cultures, and software is not yet able to deal with such variations. Detecting non-verbal cues such as posture and body language presents considerable challenges for robotics.

Teach and be taught AI

For robots to be truly useful in supporting teachers, the fields of robotics and artificial intelligence in education (AIED) need to come together. AIED is a research field that has, for three decades or more, been developing tutoring systems and intelligent learning environments – but for computers, not robots. The work has tended to concentrate on mathematical and scientific domains, where the subject matter lends itself more readily to interpretation by machine. There are few intelligent learning systems that deal with topics such as writing or history, so the field cannot yet supply systems that could be embedded in an all-purpose robot teacher.

Deploying AIED systems in robotic bodies would bring advantages. Some cognitive researchers believe the computational components of artificial intelligence are insufficient on their own to be regarded by learners as human intelligence. They argue that human intelligence sits within a physical body, whose sensorimotor systems play a strong part in how we make sense of the world around us and act in it. This embodied cognition approach proposes that our physical bodies influence our brains, just as our minds influence our bodily actions. Human beings interact with one another through our visual, auditory and other sensory systems, using skills we have evolved over millions of years: recognising objects, moving around physically, judging people’s motivations, recognising a voice, setting appropriate goals and paying attention to things that interest us. These are also the kinds of skills needed to support teachers in the classroom.

So, while robots are beginning to appear in early-childhood classrooms, at the moment they are being used mainly to teach aspects of STEM. It will be a while before teachers are supported in their work by intelligent social robots, but the ingredients exist and we now need to bring them together.

Leon Sterling is a professor emeritus in Swinburne University of Technology’s Faculty of Science, Engineering and Technology. Mike Timms is the director of assessment and psychometric research at the Australian Council for Educational Research (ACER).
