
Generative AI information

🔁 GenAI and the research lifecycle

The use of GenAI should complement scholarly methods and expert knowledge when undertaking academic research, not replace it. As with any AI-generated content, it's crucial to critically assess any outputs and to consider potential ethical, copyright and academic integrity issues. 

To ensure alignment with university and national policies, take a look at Southern Cross University's Guidelines for HDR candidates – use of Artificial Intelligence tools and the Top 10 tips for using GenAI in research created by the Tertiary Education Quality and Standards Agency (TEQSA). 

 

💬 GenAI prompts

Most of the research activities outlined below can be supported by generalist Large Language Models (LLMs) such as Microsoft Copilot and ChatGPT. These tools can assist with tasks like brainstorming, refining questions, drafting, translating, and summarising. 

Designing a good prompt will help you maximise the quality of this support and any outputs generated by GenAI. Click on the stages of the research workflow below for prompt ideas based on the Context → Task → Output structure. The more context you provide, the more useful the output. 

✅ Tip: Always check, refine, and fact-check GenAI outputs before using them in your research.

Research planning & topic generation

Brainstorming ideas

Context: I’m starting a project in environmental engineering on renewable energy.
Task: Brainstorm research topics addressing technical + social aspects.
Output: 10 ideas with 1-sentence explanations.

Refining research question / finding gaps

Context: My topic is AI in academic libraries.
Task: Refine my draft question: “How are AI tools transforming library services?” Identify gaps in the literature.
Output: 3–5 refined questions + list of gaps.

Research design / methodology

Context: I want to study mindfulness practices and student stress.
Task: Suggest suitable qualitative, quantitative, or mixed-methods designs with pros/cons.
Output: Comparison of 2–3 designs with rationale.

Research discovery & literature reviews

Literature search / keywords

Context: I need to search for studies on urban green spaces and mental health.
Task: Suggest keywords, synonyms, and Boolean strings for databases like Scopus.
Output: A table of terms + 3 example searches.
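A search strategy like the one this prompt requests can also be assembled programmatically. The sketch below is illustrative only: the synonym groups are invented examples, and real Scopus searches would use field codes such as TITLE-ABS-KEY around each block.

```python
# Synonym groups for each concept in the research question.
# The terms below are illustrative assumptions, not a validated search strategy.
concepts = {
    "urban green space": ["urban green space*", "city park*", "green infrastructure"],
    "mental health": ["mental health", "wellbeing", "psychological distress"],
}

def boolean_query(groups):
    """Join synonyms with OR inside each concept, then AND across concepts.

    Multi-word terms are wrapped in quotes so they are searched as phrases.
    """
    blocks = [
        " OR ".join(f'"{t}"' if " " in t else t for t in terms)
        for terms in groups.values()
    ]
    return " AND ".join(f"({b})" for b in blocks)

print(boolean_query(concepts))
# ("urban green space*" OR "city park*" OR "green infrastructure")
#   AND ("mental health" OR wellbeing OR "psychological distress")
```

Building strings this way keeps each concept block editable on its own, which makes it easier to test and refine the strategy across databases.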

Mapping research connections

Context: I’m reviewing blockchain in supply chain management.
Task: Map themes, key authors, and seminal papers.
Output: A concept map (or outline) of connections.

Translating texts

Context: I have a Spanish journal abstract on climate policy.
Task: Translate it into English, keeping terms accurate.
Output: A fluent, technical English version.

Summarising / synthesising literature

Context: I have 5 articles on social media in disaster communication.
Task: Summarise findings, similarities, and differences.
Output: A table + 150–200 word synthesis.

Data collection, analysis & writing

Data analysis / visualisation

Context: I have survey data from 300 people on public transport use.
Task: Suggest statistical analyses and visualisations.
Output: A step-by-step analysis plan + 3–4 chart types.
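To make the kind of plan this prompt asks for concrete, here is a minimal sketch of descriptive analysis on survey-style data. The records are invented for illustration; a real analysis of 300 responses would typically use a library such as pandas and add inferential tests.

```python
import statistics
from collections import Counter

# Hypothetical survey records: (mode of transport, trips per week)
survey = [
    ("bus", 5), ("train", 3), ("bus", 7), ("car", 1),
    ("train", 4), ("bus", 6), ("car", 0), ("train", 5),
]

# Step 1: descriptive statistics for a numeric variable
trips = [t for _, t in survey]
print("mean trips/week:", statistics.mean(trips))
print("median trips/week:", statistics.median(trips))

# Step 2: frequency table for a categorical variable (basis for a bar chart)
counts = Counter(mode for mode, _ in survey)
for mode, n in counts.most_common():
    print(f"{mode}: {n}")

# Step 3: group comparison (basis for a grouped bar or box plot)
by_mode = {}
for mode, t in survey:
    by_mode.setdefault(mode, []).append(t)
for mode, values in sorted(by_mode.items()):
    print(f"{mode} mean trips/week: {statistics.mean(values):.1f}")
```

Each step maps onto a visualisation choice: histograms for the numeric variable, bar charts for frequencies, and box plots for group comparisons.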

Code generation / review

Context: I want to run sentiment analysis on survey responses using Python.
Task: Generate code to clean text and apply sentiment analysis. Flag common pitfalls.
Output: Well-commented Python script with explanations.
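As a minimal sketch of what such a script might look like, the example below cleans free-text responses and applies a simple lexicon-based classifier. The word lists and responses are invented for illustration; production work would use a validated resource such as NLTK's VADER lexicon, which handles negation and intensity far better than raw word counts.

```python
import re

# Tiny illustrative lexicons (an assumption for this sketch, not a real resource)
POSITIVE = {"good", "great", "helpful", "reliable", "excellent"}
NEGATIVE = {"bad", "poor", "slow", "unreliable", "terrible"}

def clean(text):
    """Lowercase a response and split it into word tokens, dropping punctuation."""
    return re.findall(r"[a-z']+", text.lower())

def sentiment(text):
    """Classify a response as positive, negative, or neutral by word counts.

    Pitfall: this ignores negation ("not great") and sarcasm, which is why
    validated tools and manual spot-checks matter in real analyses.
    """
    tokens = clean(text)
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

responses = [
    "The new service was great and very helpful!",
    "Slow, unreliable, and generally poor.",
    "It was fine, I suppose.",
]
for r in responses:
    print(sentiment(r))
# positive
# negative
# neutral
```

Whatever GenAI generates for this task, check the cleaning step against your actual responses (emoji, typos, multiple languages) before trusting any sentiment scores.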

Publishing research / journals

Context: My article is on sustainable housing in Australia.
Task: Suggest journals suited to this topic.
Output: Ranked list of 5 journals + justification.

Publishing & sharing research

Editing assistance

Context: I drafted the introduction for my marine biodiversity article.
Task: Edit for clarity, conciseness, and academic tone.
Output: A polished version with suggested changes.

Drafting grant applications

Context: I’m applying for a grant on Indigenous knowledge in land restoration.
Task: Draft a compelling 300-word project summary highlighting aims, significance, and outcomes.
Output: A persuasive grant-style summary.

Finding conferences

Context: I’m in digital humanities and want to present on AI + text mining in 2025.
Task: Identify relevant international conferences.
Output: A list of 5 conferences with dates, locations, and deadlines.

GenAI academic research tools

Specialised GenAI academic research tools can assist with various stages of the research lifecycle and offer capabilities that go beyond Large Language Models (such as Copilot), especially when conducting systematic-style reviews. However, discipline-specific database searching, along with the human validation and screening of results, remains essential to meet the gold-standard PRISMA guidelines required for systematic reviews.

Tools such as Elicit, Undermind, and Consensus are among the growing number of platforms that can fast-track parts of the research process, including:

  • Identifying seed references and seminal research
  • Mapping citation connections between studies
  • Exploring emerging themes and research gaps
  • Suggesting research methodologies
  • Testing the effectiveness of search strategies

You can also view the AI Search Tools table from Monash Health Library, which evaluates a range of these tools.

Click on the tools below to learn more about their features, strengths, and limitations. Keep in mind that this is a rapidly evolving field, and access may change over time.

Elicit

Cost: Free version with limited features; Plus and Pro plans add higher extraction limits and advanced tools.

Coverage: Searches across millions of papers in the Semantic Scholar corpus and PubMed (strongest in STEM and medicine). May not include books, grey literature, or content outside Semantic Scholar.

Benefits

  • Extracts and organises findings from academic papers.
  • Helps identify study designs, sample sizes, and key results quickly.
  • Useful for systematic review tasks such as screening and summarising evidence.
  • Provides links back to the original sources for verification.

Limitations

  • Coverage limited to Semantic Scholar – may miss relevant publications elsewhere.
  • Extracted information sometimes requires manual checking for accuracy.

Undermind

Cost: Free version available; premium plans add advanced analytics and export options.

Coverage: Indexes research papers from major open-access and publisher sources, though details of its coverage are less transparent than for Elicit.

Benefits

  • Designed to streamline literature reviews.
  • Summarises research findings and highlights key themes.
  • Aims to reduce time spent searching across multiple databases.
  • Supports efficient discovery and synthesis of literature.

Limitations

  • Newer tool with less visibility; coverage and reliability may vary.
  • Transparency about data sources is limited.
  • May not integrate with institutional library subscriptions.
  • AI summaries require critical evaluation for accuracy and bias.

Consensus

Cost: Free version available with core features; paid plans unlock advanced tools. Pricing varies, with institutional licenses available.

Coverage: Built on a corpus of over 200 million academic papers and book chapters, primarily sourced from Semantic Scholar. Strongest in STEM and health sciences; coverage in humanities and social sciences is more variable.

Benefits

  • Uses AI to synthesise findings from peer-reviewed academic literature.
  • Summarises research findings and highlights key themes.
  • Can assist in identifying seminal research and seed references at the start of a systematic review, and in testing the effectiveness of the search strategy at the end.

Limitations

  • Coverage limited to Semantic Scholar and indexed sources – may miss grey literature, books, or non-indexed studies.
  • AI-generated summaries still require critical evaluation for accuracy, bias and context.
  • Some advanced features (e.g., Pro Analysis, Deep Search) are not accessible with the free version.