Artificial Intelligence
Last updated: September 2025
Definitions
Below are the definitions for a few key terms often used when discussing AI.
- Artificial Intelligence (AI) is a broad term for digital technology that can perform tasks that require “intelligence”, such as reasoning, making decisions, learning from mistakes, communicating, and problem solving.
- An Algorithm is a set of instructions or rules that a computer (or smartphone) uses to complete a task.
- Generative AI is a type of AI that generates text, images, audio, video, or other media based on user prompts using machine learning. Chatbots are a common example of generative AI.
- Hallucinations refer to responses from large language models such as ChatGPT that seem plausible but are inaccurate. An example would be a citation for a book that doesn’t exist.
- Large Language Models (LLMs) are a type of foundation model, that is, a machine learning model trained on vast amounts of data. LLMs are trained on massive amounts of text to carry out language-related tasks. Google Translate and ChatGPT both use LLMs.
- Machine Learning is a type of AI that uses algorithms to ‘learn’ without all of its instructions being explicitly programmed. Examples include LLMs (see above), virtual assistants like Alexa, and facial recognition.
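To make the distinction between an algorithm and machine learning concrete, here is a minimal illustrative sketch (not part of the glossary): in a traditional algorithm every step is written out explicitly by a programmer, whereas a machine learning model infers its behaviour from data.

```python
# A traditional algorithm: every step is explicitly programmed.
# Given a list of numbers, compute their average.
def average(values):
    total = 0
    for v in values:   # step 1: add up every value
        total += v
    return total / len(values)  # step 2: divide by the count

print(average([2, 4, 6]))
```

A machine learning system, by contrast, would not be given these steps; it would be shown many examples of inputs and desired outputs and would adjust its internal parameters to approximate the mapping itself.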
Further glossaries
- Parliamentary Office of Science and Technology: Artificial Intelligence (AI) Glossary.
- The Alan Turing Institute: Data science and AI glossary.
- NHS AI and Digital Regulations Service: Glossary.
Keep updated
The library produces a regular AI update with the latest published research. Read the latest edition below or visit our Current Awareness page to sign up to receive updates direct to your inbox.
NHS England produces a Technology Update – sign up to receive a weekly update covering genomics, AI, and digital medicine.
Guidance and statements
- NHS AI and Digital Regulations Service: All adopters’ guidance – compiles guidance for adopters of AI technologies.
- NHS England: Guidance on the use of AI-enabled ambient scribing products in health and care settings (Updated April 2025).
- Government Digital Service: Artificial intelligence resources for the public sector – compiles guidance on using AI for public sector organisations (Updated June 2025).
- AI Playbook for the UK Government (Published February 2025).
- Department for Science, Innovation and Technology: Implementing the UK’s AI regulatory principles: initial guidance for regulators (Published February 2024).
- International AI Safety Report 2025 (Updated February 2025).
- NICE Guidance: All AI guidance and advice.
- The British Standards Institution (BSI): AI in Medical Devices.
- British Medical Association (BMA): Principles for artificial intelligence (AI) and its application in healthcare (Published October 2024).
- NHS Postgraduate National Recruitment Programme Board: Position statement on the Use of Artificial Intelligence (AI) During Interviews (Published October 2025).
AI in healthcare
- NHS England: AI knowledge repository (Updated July 2025).
- NHS Confederation: AI in healthcare: navigating the noise (Published September 2024).
- The Health Foundation: AI in health care: what do the public and NHS staff think? (Published July 2024).
AI for study
If you are currently studying, your university may have a policy regarding the use of AI in assignments.
- University of Buckingham: Academic Integrity Policy and Procedures.
- University of Chester: Using AI in your academic studies.
- Keele University: Artificial intelligence.
- University of Manchester: Can I use a chatbot or AI tool in my assignments?.
- Manchester Metropolitan University: Artificial Intelligence and Assignments.
- Staffordshire University: AI libguide.
Ethical considerations
Data protection
- Information Commissioner’s Office (ICO): Artificial intelligence guidance and resources.
- NHS England: Information governance – artificial intelligence (Published April 2025).
Transparency and fairness
- ICO: Explaining decisions made with AI.
- The Lancet: Tackling algorithmic bias and promoting transparency in health datasets: the STANDING Together consensus recommendations (Published January 2025).
Environmental impact
- Forbes: AI Is Accelerating the Loss of Our Scarcest Natural Resource: Water (Published February 2024).
- Massachusetts Institute of Technology (MIT): Explained: Generative AI’s environmental impact (Published January 2025).
Examples of AI tools
Below are some examples of tools that use AI. Do not enter patient identifiable information into any generative AI tool as this would be a data breach.
Research tools
- Ask Trip – Trip has an AI search tool that uses a large language model (LLM) to produce answers to clinical questions from the Trip Medical Database. Responses will include citations that link through to articles and guidance. You can also search other clinical questions that have been previously asked. Log in using your NHS Athens account to access Trip Pro.
- Elicit uses machine learning and natural language processing (NLP) to help you find relevant research papers. It will then summarise the paper and extract key information such as formulas or statistical tests.
- Consensus also uses machine learning to identify research papers that answer a specific research question. Based on the results of those papers it provides a ‘consensus’ answer to your question. It only searches published research via Semantic Scholar, so it is more reliable than general-purpose GPT tools. Note that there is currently a six-month lag in the data due to indexing, so very recent research may not appear.
Chatbots
- Microsoft Copilot uses GPT-4 (an LLM) to answer questions in a conversational way. Copilot can generate images using DALL-E 3, a text-to-image generator created by OpenAI. Generated images, such as anatomical images, may not be accurate. Make sure you’re logged in to your NHS Microsoft account (the same as your NHSmail account) to access a more secure version of Copilot.
- ChatGPT is an LLM chatbot developed by OpenAI (GPT stands for Generative Pre-trained Transformer). It is trained on a massive dataset of text and code and can generate human-like text in response to a wide range of prompts and questions.
- Gemini is an LLM chatbot developed by Google. It answers questions in a similar way but is trained on a dataset that includes both text and code (earlier GPT models were trained primarily on text).
