From chits to chatbots: cheating in India’s education system
Cheating and plagiarism are already widespread in India’s higher education sector. AI tools are making the problem worse but could also make things better.
In June 2024, a video clip of mass cheating during MA and MBA exams conducted by the Indira Gandhi National Open University in Bihar raised serious questions about the state’s education system.
It’s the latest in a string of such incidents in Bihar.
In February 2023, a video clip surfaced in Bihar’s Samastipur district, which showed family members of Class X students passing chits to their wards through window grills and telling or showing them answers to questions at an examination centre.
Five years before this video went viral, young men, again in Bihar, were photographed climbing up buildings and passing handwritten chits to students so they could cheat during an exam.
Regrettably, such traditional forms of cheating occur regularly in states such as Bihar and Uttar Pradesh.
A combination of factors pushes desperate students to cheat in exams, particularly in these two states: a lack of political will to stop the use of unfair means, poor in-school learning and a shortage of sufficiently qualified teachers.
But now a new form of cheating appears to be gaining ground across India’s education system, especially in higher education.
Students and researchers are turning to artificial intelligence-driven technology to help them cheat, which makes detection of their wrongdoing difficult if not impossible.
The real fear among education academics across India is that college and university students and researchers who routinely use AI-driven tools to plagiarise or cheat will enter a variety of job markets and continue using such unfair means to advance their careers.
The perpetual problem of plagiarism
Examples of using AI to cheat or plagiarise have already been seen.
In 2023, British universities were confronted with a worsening problem.
About 7,300 student applications to various undergraduate programmes were identified as containing plagiarised content, more than double the number detected the year before.
Of these, 765 came from applicants in India who were found to have used AI tools to “cheat” on specific application segments, such as the personal statement.
There is no data available on the magnitude and extent of cheating at the undergraduate, Master’s or PhD levels in India.
But back in 2016, the country’s higher education regulator, the University Grants Commission, drafted a law for the central government to prevent “rampant plagiarism in academia”.
This became a reality in 2018 when the “Promotion of Academic Integrity and Prevention of Plagiarism in Higher Educational Institutions” regulations were adopted.
This followed the regulator’s 2015 decision to make it mandatory for anti-plagiarism software to be used to “check PhD theses”.
That decision came after a number of “central university vice-chancellors and teachers” were charged with plagiarism. The aim was to prevent plagiarism at the Master’s, MPhil and PhD levels, as well as among teachers.
Yet even as strong suggestions have been made abroad, particularly in the United States, to use AI to “scrutinise academic works for potential plagiarism”, the lack of academic integrity in India has received patchy attention.
Endemic cheating
In 2023, a former Delhi University vice-chancellor revealed that a Supreme Court judge had written an “angry note” to him when he was in charge of the university saying that he had “detected a blatant case of plagiarism in a doctoral dissertation by a research student” at the university.
To his consternation, the vice-chancellor subsequently found that a “significant portion” of academic contributions by university faculty “turned out to be plagiarised”.
At that time, AI — in the form of plagiarism checking software — came to his aid.
Later, as his awareness grew, the same vice-chancellor found “more and more cases of plagiarism in several universities”, a phenomenon which he described as “endemic” at the level of India’s undergraduate institutions where students “routinely took material from several sources without acknowledging”.
More importantly, the teaching community could do little to put an end to this malpractice, as plagiarism detection software was not in widespread use at the time.
In 2018, the University Grants Commission also introduced stringent measures to address cheating in academic research; it is now an open secret that even researchers use chatbots to produce scholarly papers.
Depending on the severity of plagiarism, these regulations outlined various levels of prescribed penalties, ranging from resubmission of work to the cancellation of degrees.
Universities across India have widely adopted the use of regulator-approved plagiarism detection tools, which can also detect plagiarism by students using generative AI tools.
But the ways in which students use AI tools are becoming increasingly sophisticated.
In essay writing, for example, students use generative AI tools to draft answers and then run the output through paraphrasing software.
This multi-step process makes it difficult for plagiarism detection tools to spot copying and plagiarism.
Or, in maths classes and exams, AI software can generate code that solves the problems being set rather than producing the answers directly, thereby escaping detection.
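To make the idea concrete, the sketch below shows the kind of program a chatbot might hand a student in place of a bare numerical answer. It is an illustrative assumption only: the equation, the function name and the choice of Python are not drawn from any particular exam or tool.

```python
# Hypothetical example of AI-generated code a student might submit:
# instead of the bare answer, the chatbot returns a small program that
# computes the roots of a quadratic equation via the quadratic formula.
import math


def solve_quadratic(a: float, b: float, c: float) -> tuple[float, float]:
    """Return the two real roots of ax^2 + bx + c = 0 (assumes a non-negative discriminant)."""
    discriminant = b * b - 4 * a * c
    root = math.sqrt(discriminant)
    return (-b + root) / (2 * a), (-b - root) / (2 * a)


if __name__ == "__main__":
    # Solving x^2 - 5x + 6 = 0 prints the roots 3.0 and 2.0.
    print(solve_quadratic(1, -5, 6))
```

Because the submission is working code rather than copied prose, text-matching plagiarism detectors have little to latch on to.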
Reliance on detection tools not enough
The problem is that students can produce good work using AI tools, but in doing so they often fail to develop the learning and problem-solving skills they will need to navigate work and life.
Educators’ natural response to this problem is to rely on detection tools such as Turnitin, GPTZero and Grammarly, which are evolving to cope with AI-generated material.
Some are even turning to AI-powered exam proctoring software to monitor students during online exams. Such software can use facial recognition to verify a student’s identity and can detect suspicious behaviour such as unusual movements or accessing prohibited devices, websites or programs.
The rapid and widespread use of AI in India’s higher education, and the angst it has generated, is reminiscent of the days when calculators were rapidly growing in capability.
At first, no one was much bothered, as these little machines performed only simple arithmetic. Then they became more sophisticated, replacing the slide rule and adding trigonometric and allied functions, and the education establishment was flummoxed.
For a while, these machines were prohibited in maths exams.
But that battle was quickly lost, as educators concluded that maths tests had to accept that machines could do more than what human students were being tested on.
There is a close parallel with ChatGPT and related AI tools which can do almost everything we traditionally tested in the social sciences and humanities.
But restricting the use of AI in exams and research work largely reflects a lack of imagination on the part of professors.
Instead, they could design tests that accept the availability and use of such technology in the tests themselves, and so unleash a whole new level of higher education.
Of course, concerns about new technology are not always misguided.
There is a real fear that chatbots are eroding students’ and researchers’ critical thinking skills.
It is important that we train young (and older!) minds to read, write, think, research and communicate succinctly as they prepare to enter the job market.
However, changes to the way we teach, including how we assess students, must also be made.
Professor Naresh Singh is Executive Dean of the Jindal School of Government and Public Policy at O.P. Jindal Global University in Sonipat, Haryana, India.
Originally published under Creative Commons by 360info™.