  • Updated: January 20, 2023

ChatGPT: How to Manage the Stunning Originality Threat From AI

ChatGPT AI Tool

As an interactive chatbot that conversationally answers user queries, writes poems, reviews code, and solves maths problems, ChatGPT has the potential to compromise human originality.

As a tool from OpenAI, ChatGPT has raised global concerns about its impact on education.

In fact, the app can produce high-quality essays and articles with near-zero human input.

ChatGPT is the latest chatbot from OpenAI, the research company founded in 2015 by Elon Musk, Sam Altman, and others; it became publicly available in November 2022.

Remarkably, ChatGPT can generate text on virtually any subject in response to a prompt or query.

Owing to its potential to promote plagiarism and undermine genuine, self-driven efforts to learn, the AI tool has already been banned on all devices and networks in New York City’s public schools.

Australian universities, for their part, are strongly considering a return to conventional pen-and-paper examinations.

Similarly, UK university lecturers have been urged to review their course assessment methods in light of the new technology.

Moreover, the Guardian has described ChatGPT as “a game-changer” ready to challenge the status quo in universities and schools. Although GCSE and A-level courses are assessed largely through traditional end-of-course examinations, experts are concerned that technophile pupils could use the technology to do their homework and become dependent on AI-generated answers, eroding proper acquisition of knowledge and skills.

Geoff Barton, General Secretary of the Association of School and College Leaders, acknowledges that schools could face negative consequences if they fail to get a firm grip on how ChatGPT is used.

“As with all technology, there are caveats around making sure that it is used responsibly and not as a license to cheat, but none of that is insurmountable,” he said.

Dr Thomas Lancaster, a computer scientist at Imperial College London best known for his research into academic integrity, contract cheating, and plagiarism, also said the tool was in many ways a game changer.

“It’s certainly a major turning point in education where universities have to make big changes.”

“They have to adapt sooner rather than later to make sure that students are assessed fairly, that they all compete on a level playing field and that they still have the skills needed beyond university.

"There’s been technology around for several years that will generate text.

"The big change is that this technology is wrapped up in a very nice interface where you can interact with it, almost like speaking to another human. So it makes it available to a lot of people.”

Meanwhile, the University of Sydney’s latest academic integrity policy now specifically cites “generating content using artificial intelligence” as a form of cheating.

A spokesperson said that while few instances of cheating had been observed, and cases were generally of a low standard, the university was preparing for change by redesigning assessments and improving detection strategies.

“We also know AI can help students learn, and will form part of the tools we use at work in the future – so we need to teach our students how to use it legitimately,” they said.

Conclusion

It may be difficult, if not impossible, to sustain originality once an AI tool like ChatGPT enters the scene.

While hybrid use (balancing ChatGPT with human intellect) may be recommended in corporate settings to speed up work processes and add value, strong efforts and policies must be put in place to discourage its use in schools.

This will help ensure that natural learning processes take place among students.
