Statement on research integrity and digital assistance tools


Acknowledgement and use of digital assistance tools in research outputs

University of Melbourne researchers are responsible for ensuring that the material in their research outputs, including research theses, is their own or, where it is derived from other sources, that its use is transparent and appropriately acknowledged and cited.

Digital assistance tools, such as AIs, may be used only within the following constraints:

  • Use of digital assistance tools to falsify or fabricate data is a breach of the University of Melbourne’s Research Integrity and Misconduct Policy [MPF1318] and the Code.

  • Use of digital assistance tools by graduate researchers to substantially redraft text that is included in a thesis is a breach of clause 4.77 of the University of Melbourne’s Graduate Research Training Policy [MPF1321].

Authorship and digital assistance tools

The University of Melbourne’s Authorship Policy [MPF1181] sets out that authorship of a research output is attributed when:

  • A researcher has made a significant intellectual or scholarly contribution to a research output, and is willing to take responsibility for the contribution.

All named authors must consent to being named and must be able to ensure the accuracy and integrity of the reported research. Digital assistance tools cannot be named authors, as they are unable to provide consent or confirm the accuracy and integrity of the research output.

For a research thesis, text can be edited as set out in clause 4.77 of the University of Melbourne’s Graduate Research Training Policy [MPF1321], which states that assistance (including assistance via digital tools) can be sought in accordance with the Australian Standards for Editing Practice. As explained in the Standards, this assistance is limited to elements such as language and expression (for example clarity, grammar, and spelling), completeness, and consistency. Assistance with elements such as content and structure can only be provided by supervisors.

Use of digital assistance tools in research writing

Support for authoring can be obtained through digital writing assistance tools. At an elementary level, authoring tools such as Microsoft Word correct spelling and suggest grammatical corrections. More recent tools provide further assistance. An example is Grammarly, which checks expression and grammar but can also be used to assist with rephrasing, a use that may in some cases be of concern.

Tools based on artificial intelligence technologies, sometimes known as ‘AIs’ or, more specifically, as large language models (LLMs), are now widely available. Example tools include ChatGPT and you.com. These tools create novel text in response to questions or prompts, can change the style in which a provided text is expressed, or can produce summaries or expansions. Other tools, such as QuillBot, offer rephrasing and rewriting, producing entirely new wordings of existing material. There are also digital assistance tools that can automatically generate or alter images in various ways, some of which can be misleading.

There are some legitimate uses of AIs and LLMs. For example, they can provide informative summaries of topics and are thus a mechanism for becoming informed, noting that their accuracy cannot be assured.

To quote the output of an LLM, we recommend that:

  • Authors undertake the necessary due diligence to establish the veracity of the quoted material, where appropriate.
  • The material be subject to a plagiarism check, as LLM output can be a close paraphrase of another source.
  • A footnote be used instead of an in-text citation. The footnote should note the tool used, the version (if available), the date on which it was used, and the exact text of the prompt.

For coursework subjects, the University has published advice that sets out, as for research writing, that all use of LLMs in work submitted for assessment must be fully acknowledged.