Notitia's Technical concepts
If you’ve landed here because you're trying to make sense of technical terms in a project, proposal, report or conversation, you’re in the right place.
This is Notitia’s A–Z of technical concepts, written in clear language and grounded in real data, design and engineering practice.
We’ve expanded this glossary to reflect our core service areas: Data & Analytics, Digital Innovation, Data Governance & Literacy, Data Strategy, Design & Development, and Cloud & Managed Services.
It’s designed to support leaders, analysts, designers and technical teams across healthcare, government, community services, and industry-based organisations in Australia.
We’ll keep adding to this list so it remains one of the most comprehensive plain-English data and digital glossaries available.
Agile
An iterative approach to project management and software development that helps teams deliver value to customers faster. Agile teams deliver work in small but consumable increments and respond to change quickly. Read about Notitia's approach to project management here.
Alerts and monitoring systems (for data workflows)
Alerts and monitoring systems for data workflows are tools that provide real-time tracking and notifications about the performance and health of data processes. They help detect anomalies, errors, or deviations in the workflow, allowing prompt identification and resolution of issues to maintain the reliability and integrity of the data pipeline. Read more here.
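To make this concrete, here is a minimal sketch in Python of the kind of freshness check a monitoring system might run on a schedule. The pandas library is real, but the orders.csv extract, the loaded_at column and the six-hour window are illustrative assumptions, not a specific Notitia implementation.

```python
# A minimal freshness check, assuming a hypothetical orders.csv extract
# with a loaded_at timestamp column.
from datetime import datetime, timedelta
import pandas as pd

df = pd.read_csv("orders.csv", parse_dates=["loaded_at"])

# Alert if the newest record is older than the agreed freshness window.
threshold = timedelta(hours=6)
lag = datetime.now() - df["loaded_at"].max()

if lag > threshold:
    # A real system would page a team or post to a chat channel here.
    print(f"ALERT: data is {lag} old, exceeding the {threshold} window")
```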
API (Application Programming Interface)
A structured way for systems to talk to each other. APIs allow applications, databases, cloud services or products to exchange data or trigger actions reliably and securely. Our expert web development team works with APIs across our project work; find out more about them here.
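As a rough illustration, here is what calling an API might look like in Python. The requests library is real, but the endpoint, resource and token are hypothetical placeholders.

```python
# Fetch a resource from a hypothetical REST API using the 'requests'
# library; the URL and token are illustrative only.
import requests

response = requests.get(
    "https://api.example.com/v1/projects/123",
    headers={"Authorization": "Bearer <token>"},
    timeout=10,
)
response.raise_for_status()  # surface HTTP errors early

data = response.json()  # most web APIs exchange structured JSON
print(data)
```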
Artefacts
Artefacts are the pieces of information that describe the product being developed, the actions required to produce it, and the actions performed during the project. In software development, the term artefact refers to key information and outputs needed during the development of a project.
BI Analytics
Business Intelligence (BI) Analytics refers to the process of using various tools, techniques, and technologies to analyse and interpret data in order to make informed business decisions. It involves collecting, integrating, and analysing large volumes of data from multiple sources to surface insights that can help drive business performance and competitiveness.
BI Analytics uses data visualisation tools, dashboards, and reports to present data in a way that is easy to understand and act on. The insights gained can be used to improve operational efficiency, identify new business opportunities, optimise customer experience, and gain a competitive edge in the market.
Overall, BI Analytics provides businesses with a comprehensive view of their operations, enabling them to make data-driven decisions that help achieve their business objectives. Read about our BI dashboard development here.
Big Data
Big data is a term used to describe extremely large datasets that are too complex and voluminous for traditional data processing methods to handle. It refers to the massive volume of structured, semi-structured, and unstructured data that is generated by various sources such as social media, sensors, web applications, and other digital platforms.
The concept of big data is often associated with the three Vs: Volume, Velocity, and Variety. Volume refers to the enormous amount of data generated every day, while velocity refers to the speed at which this data is generated and processed. Variety describes the diverse types of data that can be generated, such as text, images, videos, and audio.
Big data technologies, such as Hadoop, Spark, and NoSQL databases, have emerged to help organisations manage, store, and analyse this data and extract valuable insights from it.
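As a small taste of what working with one of these technologies looks like, here is a minimal PySpark sketch. It assumes a local Spark installation and a hypothetical events.json file; the same code runs distributed across a cluster when one is available.

```python
# Read and aggregate semi-structured JSON with Spark; the file name
# and 'source' column are placeholders for this sketch.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("glossary-example").getOrCreate()

events = spark.read.json("events.json")   # schema inferred from the data
events.groupBy("source").count().show()   # distributed aggregation

spark.stop()
```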
The insights derived from big data can be used to improve business decision-making, optimise processes, and identify new opportunities for growth. Read about our data analytics services here.
Change Data Capture (CDC)
A technology (used in tools like Qlik Talend) that tracks changes in source systems and applies them downstream in real time, so datasets and analytics stay current without full reprocessing. Read about how we used Change Data Capture (CDC) in our project with Victoria's Barwon Water.
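Production CDC tools such as Qlik Talend typically read the database's transaction log. As a rough illustration of the underlying idea only, here is a naive timestamp-watermark approach in Python; the database, table and column names are hypothetical.

```python
# Naive change capture using a timestamp watermark. Real CDC reads the
# transaction log instead, which also captures deletes reliably.
import sqlite3

conn = sqlite3.connect("source.db")  # hypothetical source database

def fetch_changes(last_seen: str):
    """Return rows modified since the previous watermark."""
    rows = conn.execute(
        "SELECT id, name, updated_at FROM customers WHERE updated_at > ?",
        (last_seen,),
    ).fetchall()
    # The next watermark is the latest change we have processed.
    new_watermark = max((r[2] for r in rows), default=last_seen)
    return rows, new_watermark

changes, watermark = fetch_changes("2024-01-01T00:00:00")
print(f"{len(changes)} changed rows; next watermark: {watermark}")
```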
Cloud & Managed Services
Ongoing support, optimisation and management of data platforms, analytics environments, and cloud infrastructures, ensuring secure, reliable and scalable operations. Read about Notitia's many cloud transformation projects with Australian organisations in our case study section here.
Core Software Stack
A core software stack refers to the foundational set of essential technologies, frameworks, and programming languages that form the basis of a software application or system. It typically includes components such as the operating system, database management system, programming language, and other fundamental tools required for developing and running the software. Read about our data analytics services here.
Dashboard
A visual display of your data. It pulls together a comprehensive overview of data from different sources. Dashboards are useful for monitoring, measuring, and analysing relevant data in key areas. Read about Notitia's dashboard projects with Australian organisations in our case study section here.
Data
Information, especially facts or numbers, collected to be examined, considered and used to support decision-making; or information in an electronic form that can be stored and used. Read about our data analytics services here.
Data Analytics
Data analytics is the management and analysis of data to drive business processes and improve business outcomes through more effective decision making and enhanced customer experiences. Read about our data analytics services here.
Database
A database (or electronic database) is any collection of data that is organised for rapid search and retrieval. Databases are structured to facilitate the storage, retrieval, modification, and deletion of data in conjunction with various data-processing operations. A database management system (DBMS) extracts information from the database in response to queries. Read about our data analytics services here.
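To illustrate, here is a tiny example using Python's built-in sqlite3 module as the DBMS; the table and data are invented for the sketch.

```python
# Create, populate and query a throwaway in-memory database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staff (id INTEGER PRIMARY KEY, name TEXT, age INTEGER)")
conn.executemany(
    "INSERT INTO staff (name, age) VALUES (?, ?)",
    [("Alice", 34), ("Bob", 51)],
)

# The DBMS retrieves matching rows in response to a declarative query.
for (name,) in conn.execute("SELECT name FROM staff WHERE age > 40"):
    print(name)  # -> Bob
```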
Data Dictionary
A data dictionary provides detailed information about the structure, contents, and relationships between the data elements in a database or other data repository. It typically includes a list of all the data elements or fields in the database, along with their data types, descriptions, and any constraints or business rules that apply to them.
It may also include information about the source of the data, the data format, and any transformations or calculations that have been applied to the data. It can be used as a reference guide for database administrators, programmers, analysts, and other stakeholders who need to understand and work with the data in the repository.
In short, a data dictionary serves as a metadata repository that helps to ensure consistency and accuracy in data management. Read more on creating a data dictionary in our blog post on financial data quality management, data governance and quality, and data literacy.
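As a purely illustrative example, a single data dictionary entry might capture something like the following. The field and its details are invented, and real dictionaries often live in catalogue tools or spreadsheets rather than code.

```python
# One hypothetical data dictionary entry, expressed as a Python dict.
data_dictionary = {
    "patient_age": {
        "type": "integer",
        "description": "Age in whole years at admission",
        "source": "admissions.age",
        "constraints": "0 <= value <= 120",
        "transformation": "derived from date_of_birth and admission_date",
    }
}
```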
Data Evaluation Script
A data evaluation script is a set of instructions written in a programming language that analyses and assesses data against predefined criteria or algorithms. It typically processes and interprets data to derive meaningful insights, identify patterns, or support informed decisions within a given context.
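Here is a minimal sketch of what such a script can look like in Python. It assumes pandas and a hypothetical sales.csv file; the criteria are examples only.

```python
# Evaluate a dataset against predefined criteria and report the results.
import pandas as pd

df = pd.read_csv("sales.csv")  # hypothetical input file

checks = {
    "no_missing_order_ids": df["order_id"].notna().all(),
    "amounts_are_positive": (df["amount"] > 0).all(),
}

for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```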
Data Governance
Data governance is the set of processes, roles, policies, standards, and metrics that ensure the effective and efficient use of data. It establishes the processes and responsibilities that ensure the quality and security of the data used across a business. Data governance defines who can take what action, upon what data, in what situations, using what methods. Read more about data governance and AI and data governance and quality.
Data Literacy
The ability of individuals across an organisation to read, understand, interpret and use data confidently. Notitia delivers data literacy uplift programs as part of its Data Governance & Literacy services. Read why organisations are training their teams in data literacy here, read about Notitia's approach to data literacy here, or visit our services section to find out about Notitia's data literacy training here.
Data Lake
A data lake is a centralised repository designed to store, process, and secure large amounts of structured, semi-structured, and unstructured data. It can store data in its native format and process any variety of it, regardless of size limits. Find out more about Notitia's data lake services here.
Data Pipelines
Data pipelines are a set of processes that extract, transform, and load (ETL) data from various sources to a destination, ensuring a smooth and organised flow of information. They enable efficient data management, analysis, and storage, facilitating the integration of disparate data into a unified and usable format. Read more about data pipelines on Notitia's services page here.
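A toy end-to-end example helps show the three ETL stages. This sketch assumes pandas, a hypothetical sales.csv source and an SQLite file as the destination; production pipelines use dedicated tooling, but the shape is the same.

```python
# Extract, transform and load in miniature.
import sqlite3
import pandas as pd

# Extract: read raw data from the source system.
raw = pd.read_csv("sales.csv")

# Transform: clean and reshape into an analysis-ready format.
clean = raw.dropna(subset=["order_id"])
clean["amount"] = clean["amount"].round(2)

# Load: write the result to the destination for reporting.
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("sales_clean", conn, if_exists="replace", index=False)
```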
Data Pipeline Debugging
Data pipeline debugging involves identifying, analysing, and resolving issues or errors within the data pipeline infrastructure to ensure the smooth and accurate flow of data from source to destination. It often includes tracking and troubleshooting data inconsistencies and transformation errors, and addressing any bottlenecks or disruptions that may impact the reliability and efficiency of the pipeline.
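One common debugging aid is to log row counts between stages so silent data loss becomes visible. A minimal sketch, assuming pandas and an invented source file:

```python
# Log what each transformation step did to the data.
import logging
import pandas as pd

logging.basicConfig(level=logging.INFO)

def check_stage(name, before, after):
    logging.info("%s: %d -> %d rows", name, len(before), len(after))
    if len(after) == 0:
        raise ValueError(f"{name} produced an empty dataset")

raw = pd.read_csv("sales.csv")              # hypothetical extract
deduped = raw.drop_duplicates("order_id")   # one transformation stage
check_stage("deduplicate", raw, deduped)
```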
Data Point
A data point is a single unit of information or observation within a dataset, representing a specific value or measurement related to a particular variable. In statistical analysis, data points serve as the building blocks for generating insights, trends, and patterns by collectively forming the dataset.
Data Quality
Data quality is the degree to which data meets a company's expectations of accuracy, validity, completeness, and consistency. Read more about data governance and quality here.
Data Strategy
A data strategy is a long-term plan that defines the technology, processes, people, and rules required to manage an organisation's information assets. Read more on Notitia's approach to data strategy on our services page here, or check out our data strategy client case studies section here.
Data Source
A data source is a location or system from which data is collected or retrieved. It can be a database, file, sensor, application, or any platform that generates or stores information, serving as the origin point for data acquisition and analysis.
Data Visualisation
Data visualisation is the process of representing data and information in a graphical or pictorial format. It involves creating visual representations of data to help people understand complex information and to make it easier to identify patterns, trends, and relationships.
Data visualisation tools can be used to create charts, graphs, maps, and other visual representations of data. These tools allow people to explore data and to see how different data points relate to each other. By visualising data, people can better understand complex information and make more informed decisions.
Data visualisation is used in a variety of industries, including business, finance, healthcare, and science. It is a key component of data analytics and is an important tool for communicating insights and findings to stakeholders. Read more on Notitia's blog about data visualisation: "Data Visualisation, why you need it", "Data visualisation, how to select the right graph or table for the job", "2026 Guide to data visualisation", "Human-centred design: Data visualisation tools".
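For a flavour of how little code a basic visualisation needs, here is a sketch using matplotlib; the admissions figures are invented for illustration.

```python
# Turn a small dataset into a bar chart so the trend is easy to see.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr"]
admissions = [120, 135, 160, 150]  # illustrative data only

plt.bar(months, admissions)
plt.title("Admissions by month")
plt.ylabel("Admissions")
plt.show()
```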
Data Warehouse
A centralised repository that collects, integrates, and stores large volumes of structured data from various sources within an organisation. Read about Notitia's data warehouse services here or our blog article on the types of fact tables in a data warehouse.
Data Workflow
A data workflow refers to the end-to-end sequence of steps and processes involved in handling, processing, and analysing data from its initial collection or acquisition to its final output or decision-making. It encompasses tasks like data ingestion, transformation, analysis, and visualisation, providing a structured framework for managing and extracting insights from data throughout its lifecycle.
Databricks
A cloud-based data engineering and analytics platform used to build scalable pipelines, dimensional models, and gold-layer data structures. It is used extensively in Notitia's healthcare and government projects. Read more about Notitia's partnership with Databricks here.
Dimensional Modelling
A method of structuring data for analytics using fact and dimension tables. It is foundational for Qlik Cloud Analytics, Databricks gold layers, and TAC's hub-and-spoke analytics model.
Developer (or Web Developer)
Developers, web developers, and software engineers focus on developing applications, features, and functionality for end-users. Read about our web development services here, read more about Notitia's web developers Livia Gui and Ben Cuttance on our blog, or visit our team page to contact our web developers.
Discovery (Human-Centred Discovery)
A structured process used at Notitia to understand users, needs, workflows, challenges and context before any technical decisions are made. Discovery reduces rework and ensures that solutions are aligned with the real problem.
Emailing or Messaging API Service
An emailing or messaging API service is a software interface that allows developers to integrate email or messaging functionality into their applications, websites, or systems. It enables automated sending, receiving, and management of emails or messages, providing a seamless communication layer for applications without the need for developers to build the entire messaging infrastructure from scratch.
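Under the hood, such services wrap the kind of sending step shown below. This sketch uses Python's standard smtplib and assumes a reachable SMTP server and illustrative addresses; commercial messaging APIs replace all of this with a single authenticated HTTP call.

```python
# Compose and send one email via a hypothetical local SMTP server.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alerts@example.com"
msg["To"] = "team@example.com"
msg["Subject"] = "Pipeline finished"
msg.set_content("The nightly data pipeline completed successfully.")

with smtplib.SMTP("localhost", 25) as server:
    server.send_message(msg)
```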
ETL / ELT
Data integration approaches that extract, transform and load (ETL) or extract, load and then transform (ELT) data into a target system. Modern platforms increasingly use ELT for cloud-based pipelines.
Fact Table
A fact table is a central table in a data warehouse that stores quantitative data about a business process or activity. It typically contains measurements, metrics, and keys that connect it to dimension tables for comprehensive analysis in data analytics. Read "What are fact tables and why do data analysts use them?", and "Types of Fact Tables in a Data Warehouse".
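To make the fact/dimension relationship concrete, here is a miniature star schema in Python's built-in sqlite3; the tables, rows and measure are invented for the sketch.

```python
# A tiny star schema: one fact table joined to one dimension table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales VALUES (1, 1, 9.5), (2, 1, 4.0), (3, 2, 20.0);
""")

# Keys in the fact table join out to dimensions for analysis.
query = """
    SELECT p.name, SUM(f.amount) AS total_sales
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
"""
for row in conn.execute(query):
    print(row)  # ('Gadget', 20.0) then ('Widget', 13.5)
```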
Feature Driven Development (FDD)
A methodology within an agile framework that organises software development around making progress on features (user stories).
Information Architecture (IA)
The structure and organisation of content, data and interactions within a digital product. IA is foundational for human-centred design and digital innovation projects.
Minimum Viable Product
A version of a product with just enough features to be usable by early customers who can then provide feedback for future development. A focus on releasing an MVP means that developers avoid unnecessary work. Instead they iterate on working versions and respond to feedback, challenging and validating assumptions about a product’s requirements.
Mood board
A mood board is a visual arrangement of images, materials, text, and other design elements that conveys the style of the final design. Mood boards can be used for brand design, product design and other design projects.
Modern Cloud Technologies
Modern cloud technologies refer to a set of tools, platforms, and infrastructure that enable the delivery of on-demand computing resources over the internet. These technologies are designed to provide businesses and organisations with scalable and flexible IT infrastructure, as well as the ability to rapidly develop, deploy, and manage applications and services.
Some examples of modern cloud technologies include:
- Infrastructure as a Service (IaaS): This technology provides virtualised computing resources, such as virtual machines, storage, and networking, over the internet.
- Platform as a Service (PaaS): This technology provides a platform for developers to build, test, and deploy applications without having to worry about the underlying infrastructure.
- Software as a Service (SaaS): This technology provides access to software applications over the internet, typically through a web browser.
- Serverless computing: This technology allows developers to write and run code without having to manage the underlying infrastructure. The cloud provider automatically scales the computing resources up and down based on the application's needs (see the sketch after this entry).
- Containers and container orchestration: This technology provides a lightweight, portable way to package and deploy applications. Container orchestration tools, such as Kubernetes, help manage and scale containerised applications.
- Hybrid cloud: This technology allows businesses to use a combination of public cloud and private cloud infrastructure to achieve greater flexibility, scalability, and cost savings.
Overall, modern cloud technologies enable businesses and organisations to build and deploy applications and services more quickly and efficiently, while also reducing the need for costly on-premises infrastructure.
Read about Notitia's clients who have migrated to the cloud in our case study section here.
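Picking up the serverless item in the list above, here is a minimal handler sketch in Python. It follows the common AWS Lambda signature, but the event shape and greeting logic are illustrative assumptions rather than a deployable service.

```python
# A serverless-style function: the platform invokes it per event and
# scales instances automatically; we manage no servers ourselves.
import json

def handler(event, context):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```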
Observability (Data Observability)
The practice of monitoring data freshness, quality, lineage, schema changes and pipeline behaviour to detect issues before they affect dashboards, decisions or reporting.
PlotBeam
Notitia’s secure, real-time data visualisation platform used to connect, display and update datasets dynamically. PlotBeam is designed for responsive, interactive analytics across sectors including healthcare and government.
Qlik Cloud Analytics
Qlik’s cloud-native analytics platform enabling interactive dashboards, governed self-service reporting, real-time insights, and AI-supported analysis. Notitia is recognised with Qlik Healthcare and Public Sector Partner badges.
Qlik Talend Data Fabric
A data integration and quality suite enabling ingestion, transformation, application integration, data quality management, lineage tracking and real-time CDC pipelines. Read more about Notitia's partnership with Qlik here.
Scheduled Log
A scheduled log is a record of events, activities, or system information that is systematically captured and stored at predetermined intervals or specific times. This structured logging approach allows for the organised tracking of changes, performance metrics, or other relevant data on a regular and scheduled basis.
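Python's standard logging module can produce exactly this kind of record. The sketch below rolls the log file over at a predetermined interval (midnight), keeping a week of history; the file name and retention period are illustrative choices.

```python
# Write log records to a file that rotates on a schedule.
import logging
from logging.handlers import TimedRotatingFileHandler

handler = TimedRotatingFileHandler("pipeline.log", when="midnight", backupCount=7)
logging.basicConfig(level=logging.INFO, handlers=[handler])

logging.info("Nightly job started")  # captured on a regular, scheduled basis
```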
Solution Design (Human-Centred Solution Design)
A structured design approach used at Notitia to align user needs, organisational context, data requirements and technical architecture. This ensures solutions are fit for purpose and deliver impact. Read more about how Notitia uses Human-Centred Design (HCD):
- Human-centred design in data and digital transformation: why your project depends on it
- Design meets data: Why UX/UI & human-centred design is key to results
- How a human-centred approach to data design enhances Australian healthcare
- IN THE NEWS: TechDay Australia speaks to Notitia about its approach to better software solutions
- Human-centred design essential in software development
- What does a digital designer do and how can they help your project?
- Digital Innovation: Better Discovery, Better Design and Better Tech Deliver Real Outcomes
Transformed Data
Transformed data refers to data that has been altered or modified from its original form to make it more useful or informative for analysis or modelling. Data transformation involves applying a set of mathematical or statistical operations to the data, which can include scaling, normalisation, standardisation, encoding, imputation, aggregation, or feature extraction.
Transformed data is useful in many contexts, such as data preprocessing for machine learning, exploratory data analysis, data visualisation, or data cleaning. Transforming data often reveals patterns, relationships, or trends that are not apparent in the raw data, and makes the data more amenable to statistical analysis or modelling.
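Two of the operations named above, scaling and encoding, take only a few lines with pandas; the columns and values here are invented for illustration.

```python
# Min-max scaling and one-hot encoding on a toy dataset.
import pandas as pd

df = pd.DataFrame({"age": [25, 40, 60], "city": ["Geelong", "Melbourne", "Geelong"]})

# Scaling: map 'age' onto the range [0, 1].
df["age_scaled"] = (df["age"] - df["age"].min()) / (df["age"].max() - df["age"].min())

# Encoding: expand the categorical 'city' column into indicator columns.
df = pd.get_dummies(df, columns=["city"])

print(df)
```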
UI/UX Design
In the context of data analytics, digital and graphic designers work with the client to ensure their dashboard is visually appealing and in line with the business brand, while also being simple to use and functional. They need to understand the requirements of the developer, the data analyst and the client.
UI/UX design refers to the process of designing digital products such as websites, mobile applications, software programs, and other user interfaces with a focus on user experience (UX) and user interface (UI) design.
UX design is concerned with creating a positive and seamless experience for the user by understanding their needs and behaviours and designing the product accordingly. This involves conducting user research, creating user personas, wireframing, prototyping, and testing to ensure that the final product is usable and effective for the user.
UI design, on the other hand, focuses on the visual and interactive aspects of the product, such as the layout, typography, colour scheme, and graphic design. The aim is to create an aesthetically pleasing and intuitive interface that allows the user to interact with the product in a simple and effective manner.
Overall, UI/UX design aims to create a product that is not only visually appealing but also user-friendly and meets the needs of the target audience. Meet our in-house design team.
Vendors (Analytics Software Vendors)
Data analytics software vendors are companies that develop and sell software solutions designed to help organisations analyse large volumes of data to extract insights and make data-driven decisions.
These software solutions typically include features such as data visualisation, data mining, predictive analytics, and machine learning algorithms that allow users to identify patterns and trends within their data and gain valuable insights into their business operations.
There are many different types of data analytics software vendors, ranging from large enterprise software companies to smaller niche vendors that specialise in specific industries or applications. See Notitia's list of partners.
Wireframe
A wireframe is a visual representation of a website or dashboard that shows where elements are positioned and how they interact with each other. It allows the team to understand the visitor's journey to complete certain actions. Read about how our UI/UX design team create wireframes here.