Answered By: Research support team
Last Updated: Aug 09, 2024

ChatGPT will invent citations for nonexistent articles.

We frequently see cases where ChatGPT has simply invented articles rather than suggesting ones that actually exist, which is especially frustrating because AI-generated citations look very real.

Here are some clues for determining whether citations are AI-generated hallucinations:

  • you've already searched for the article in Discovery, Google Scholar, and Google, and still can't find it
  • you've searched the journal website directly, the volume or issue number doesn't make sense, and you can't find the article
  • the citation lists real authors who have published similar work, but not this article
  • the DOI link is broken or resolves to a different article
  • the citation features real journal names and authors, but the page numbers, volume, or issue number don't match

Googling a real citation will usually confirm that the article exists; hallucinated citations fail this check. If you're still unsure, feel free to ask us for assistance with a specific example.
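The DOI check mentioned above can also be done programmatically. A minimal sketch in Python (the function names here are illustrative, not from any library; it assumes network access, and some publisher sites block automated requests, so a failure should only prompt manual checking, not prove the citation is fake):

```python
import urllib.request

def doi_url(doi: str) -> str:
    # Build the canonical resolver URL for a DOI string
    return "https://doi.org/" + doi.strip()

def doi_resolves(doi: str, timeout: float = 10.0) -> bool:
    # Send a HEAD request to the doi.org resolver; a DOI that was never
    # registered typically returns 404, while a real one redirects to
    # the publisher's page. Network errors and blocks count as failures.
    req = urllib.request.Request(doi_url(doi), method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except Exception:
        return False
```

Remember the clue above: even a DOI that resolves can point to a *different* article than the one cited, so always compare the landing page against the citation.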

Additional information on this topic

The RRU Writing Centre has developed a guide with information about using ChatGPT.

The University of Adelaide has created a guide with excellent information on AI tools and related aspects of AI such as scholarly publishing and acknowledgement. Important considerations they outline include: whether the tools meet your research needs; whether your instructor, publisher, or grant funder allows their use; copyright, privacy, and data-retention issues; and the limitations of such tools.

Still have questions? Ask us!

Contact Us

Toll free in North America (8:30 AM-8:30 PM, Pacific Time): 1-800-788-8028 and ask to be put through to the Library
