There’s been a lot of buzz lately around the use of artificial intelligence to help with academic research and writing. I’ll get the positives out of the way first. I think AI can be incredibly useful, especially for parsing through large amounts of data, or for transcription or proofreading. It can be an excellent learning tool and it’s probably going to be important for a lot of researchers to know how to use AI effectively.
However…
There are huge risks to using AI in your PhD, and you need to be very careful about how you use it.
In some ways, the risks are the same as with any use of software. For example, with statistical software it’s very easy to plug in numbers and get statistical information out, saving a huge amount of time compared to doing it manually. However, in some ways it’s too easy to use, meaning you can get statistical information without understanding what the software is doing on your behalf, or even what the numbers mean or whether they are appropriate to use.
You need a certain amount of statistical knowledge to be able to interpret the data correctly or to check whether there’s a mistake, but all too often researchers just copy and paste from the software (and all too often this gets past the reviewers).
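To make that concrete, here's a minimal sketch (my own made-up illustration, not from any real study, using Python and scipy purely as an example) of how readily statistical software hands back a p-value whether or not the test is appropriate for the data:

```python
# A made-up illustration: the software gives you a number either way;
# checking whether the test was appropriate is entirely on you.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Heavily skewed data for two hypothetical groups -- the kind of data
# for which a plain t-test may not be appropriate.
group_a = rng.lognormal(mean=0.0, sigma=1.0, size=30)
group_b = rng.lognormal(mean=0.3, sigma=1.0, size=30)

# One line gives you a p-value, no questions asked...
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t-test p-value: {p_value:.3f}")

# ...but it's up to you to know that the t-test assumes roughly normal
# data, and to check that assumption (or pick a different test) yourself.
_, shapiro_p = stats.shapiro(group_a)
print(f"Shapiro-Wilk normality check, group A: p = {shapiro_p:.3f}")
```

The software will happily report a result in both lines; only the researcher's own statistical knowledge tells them whether the first number means anything at all.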
AI multiplies the potential for these kinds of problems, with some people talking about using AI to automatically generate papers directly from the raw data.
If you don’t know what the AI has done to analyse the data (which may not necessarily be what it says it’s done), but you put your name on it, you’re responsible for it.
A good reviewer or examiner should catch it (especially an examiner who gets to question you), but it could actually be worse for you if you get away with it, as it could mean losing your job several years after the fact.1
So if you’re using AI (and again, to those who comment on videos without watching them, I’m not saying you shouldn’t use it), I think you should follow Kevin Kelly’s2 take… AI is like an intern: You can get it to do certain useful tasks, but you have to check its work.
This means that no matter how good AI gets, you still need well-developed research skills to be able to check what it’s done.
No matter how good AI might be at summarising papers, you still need to be able to read and understand them yourself. And no matter how good ChatGPT is at generating text, you still need to be able to write and express yourself clearly.
My worry is that AI will make it too easy for those who want to cheat. There have always been people willing to fake data1 3 4, but if we end up with AI-generated papers, citing other AI-generated papers, being reviewed using AI, then we're in real trouble.
But my hope is that this just makes the human factor more important. Maybe conferences will become the dominant forum for peer review in the future, where humans can ask questions to other humans, and maybe the whole process will become more transparent.
So I don’t want to be too negative about AI… by all means use it, but use it carefully, and make sure you know enough to check what it’s done and to defend it if you’re going to put your name on it.
All the text on this site (and every word of every video script) is written by me, personally, because I enjoy writing. I enjoy the challenges of thinking deeply and finding the right words to express my ideas. I do not advocate for the use of AI in academic research and writing, except for very limited use cases.