Statement on responsible use of digital assistance tools in research

University of Melbourne researchers are responsible for ensuring that all material in their research outputs, including research theses, publications and non-traditional research outputs, is created by themselves or their co-authors; or, where materials are derived from elsewhere, the sources are specifically and transparently acknowledged and cited.

Authorship and digital assistance tools

Tools based on artificial intelligence technologies, often called ‘generative AI’ and more specifically generative tools built on large language models (LLMs), are now widely available; examples include ChatGPT and Bard. These tools create novel text in response to questions or prompts, and can change the style in which a provided text is expressed, or produce summaries or expansions of it. Other tools, such as QuillBot and Grammarly, offer rephrasing and rewriting, producing entirely new wordings of existing material. There are also digital assistance tools that can automatically generate images or alter them in various ways.

The University of Melbourne’s Authorship Policy [MPF1181] sets out that authorship of a research output is attributed when:

  • A researcher has made a significant intellectual or scholarly contribution to a research output; and is willing to take responsibility for the contribution.

All named authors must consent to being named and must be able to ensure the accuracy and integrity of the reported research. Digital assistance tools cannot be named as authors, as they are unable to provide consent or confirm the accuracy and integrity of the research output.

Acknowledgement of digital assistance tools

Digital assistance tools that generate or alter text, such as generative AI, may be used in the preparation of research outputs only if their use is transparently and appropriately acknowledged.

Note that generative AI can be used in a wide variety of ways in research, including authoring or redrafting; translating text between languages; literature discovery and summarisation; coding; and generation of images or data. The kinds of acknowledgement required in research outputs will vary with the task for which the generative AI is being used.

Many publishers and journals prohibit use of generative AI for authoring or informing paper reviews. Researchers should ensure that they comply with community expectations and publisher requirements when considering the use of generative AI for such tasks.

Responsible use of digital assistance tools in research outputs

Digital assistance tools can offer various levels of writing support for researchers along a spectrum: from suggested edits of text that has been provided by the author (editing assistance) through to generating new text (content assistance).

Editing assistance:

At an elementary level, authoring tools such as Microsoft Word correct spelling and suggest grammatical improvements. More recent tools provide further editing assistance; an example is Grammarly, which is used to check expression and grammar but can also assist with rephrasing.

Whether rephrasing text moves beyond editing assistance, and therefore affects the integrity of the research output, depends on whether the rephrasing goes beyond what is acceptable under the Australian Standards for Editing Practice. As explained in the Standards, editing assistance is limited to elements such as language and expression (clarity, grammar, spelling), completeness, and consistency. Substantial redrafting therefore falls outside these limits, as does translation into English of a substantial body of text drafted in another language.

For a research thesis, text can be edited as set out in clause 4.77 of the University of Melbourne’s Graduate Research Training Policy [MPF1321], which states that assistance (including assistance via digital tools) can be sought in accordance with the Standards. Assistance with elements such as content and structure may only be provided by supervisors or other academics (such as advisory committee members), and such assistance should be acknowledged.

For research outputs other than a research thesis, a similar distinction can be made between editorial assistance and content assistance offered by more sophisticated digital assistance tools. Editing assistance may be acknowledged, but acknowledgement is not required for digital assistance tools that limit edits to language and expression.

Content assistance:

There are legitimate uses of generative AI and other such tools in research. For example, they can assist with literature discovery or provide summaries of topics, and thus can help researchers become informed, although accuracy and completeness cannot be assured without independent checks. We emphasise that all authors are responsible for ensuring the correctness of published work. If a paper includes text generated by AI, the onus is on the authors to establish that it is correct and sufficient, and to acknowledge the content assistance provided for specific sections of the work.

Several digital assistance tools are designed to assist with the preparation of literature reviews. As set out in the Statement on Graduate Researchers and Digital Assistance Tools, the University discourages the use of generative AI to write literature reviews. We note further that such tools should not be relied on for understanding of the literature, and that the collections of papers they identify are usually neither comprehensive nor the most pertinent.

In order to quote the output of a digital assistance tool that has provided content assistance, it is recommended that:

  • Authors undertake the necessary due diligence to establish the veracity of the generated material, where appropriate;
  • The material be subject to a plagiarism check, as generative AI output can be a close paraphrase of another source;
  • The material be subject to a copyright infringement check, as generative AI may produce outputs protected by copyright law;
  • A footnote be used instead of an in-text citation, stating the tool used, the version (if available), the date on which it was used, and the exact text of the prompt;
  • Authors be explicit as to which material, such as specific passages of text or images, was created or modified by AI tools.

Beyond editing assistance and content assistance for text, it should be noted that the use of digital assistance tools to falsify or fabricate (in the sense of ‘inventing in order to deceive’) data is a breach of the University of Melbourne’s Research Integrity and Misconduct Policy [MPF1318] and the Code.

Implications:

Authors should be aware that unacknowledged use of digital assistance tools to generate or alter text that they imply is their own may constitute a breach of the Code. Even when the use of digital assistance tools for content assistance is acknowledged, it may affect the degree to which the work is considered their own (with consequences for graduate research thesis examination), or trigger plagiarism concerns.

For coursework subjects, the University has published advice that sets out, as for research writing, that all use of LLMs in work submitted for assessment must be fully acknowledged, and may impact the assessment outcomes.

The use of open-source tools can have significant implications for intellectual property. Researchers should be aware that, when using open-source tools, text provided as input may be made available for others to use freely, without copyright protection; likewise, any generated text they incorporate may infringe copyright.