Current maturity

It is good practice to establish the current state of your data maturity before embarking on improvement activity within the data maturity framework.

Data touches multiple areas of organisational operations, and it is likely that some are better developed than others. The data maturity framework includes maturity levels to help pinpoint issues which may benefit from focus. Since some areas of relatively high maturity may be easier to improve further than those of low maturity, the intent is not to prescribe an order in which issues should be addressed. An overview of the existing landscape should allow organisations to flexibly choose the next steps that work for them.

That said, reaching the highest levels of maturity in output-focused areas (reporting, decision making) is unlikely to be possible or sustainable without a base level of maturity in input areas (strategy, governance, systems).

The reality of activities in HE and FE organisations is that maturity is likely to be inconsistent across different functional areas. There may be pockets of good governance practice, well managed systems, visually creative and engaging reporting products and confident teams who are utilising these. This good practice should be championed, but it is not unusual for this to be based on individual skillsets and innovation, particularly in reporting production, and can mask relatively low maturity in other areas.

While this framework can be used within individual areas, departments or functions, progress towards the upper end of maturity requires organisational-level capabilities in strategy and architecture, without which progress will likely be compromised.

Data strategy

Level 1 reactive

  • No data improvement proposals are sponsored or generally visible at a senior level
  • We have no senior experience of data and analytics in an organisational context
  • Statutory responsibilities are known to our leadership teams, but we do not focus on delivery processes or wider use
  • There is no recognition that data management is a priority for the organisation beyond statutory obligations
  • Our senior teams have no oversight or responsibility for data and reporting activities 
  • We don't know whether our data is secure, and we don't know how to find out
  • Siloed working is normal and activities are mostly performed by individuals, which means duplication and discrepancies are common
  • It is not clear who is responsible for data activities beyond IT Services, and we have not considered this strategically
  • Roles are inconsistent across functions and there is little or no expertise in some areas 
  • The only priority for data activities is to fix whatever is currently broken 
  • We don't consider the impact of change on data at all. We assume our teams fit it in as it happens
  • Opportunities to use data to add value are not promoted or supported because data is not seen as valuable for outcomes 
  • We don’t know what data we hold or where it is
  • We don’t understand risk appetite, so it is not part of our decision making or processes
  • We have not really considered ethics - we just use data as and when we need to
  • The value of outputs from data is unknown - both in terms of what they are for and to whom they may provide value
  • Data is not used for decision making – most decisions are made on gut instinct, and this is not validated
  • We have no visibility or awareness of limitations. Any access restrictions are decided in an inconsistent manner

Level 3 proactive

  • Senior management are aware of the value of data improvement, but support is limited to specific projects or issues, rather than an organisational priority
  • Organisational data literacy is starting to be valued by our senior team and we understand our role in enabling improvements
  • Our senior leaders are starting to see the value of data, but our position remains generally defensive
  • As a senior team we have a role in the formal management of data so that our teams understand their responsibilities
  • Our senior team understand current legislation and take active steps to ensure we adhere to this as an organisation
  • We have recognised classifications in place for our core datasets, and are confident enough in this to be formally internally audited 
  • Some of our teams are starting to discuss how they can work together more effectively and they collaborate on ad hoc projects
  • We are starting to assess skills and capabilities across our teams and understand the need for engagement and delivery outside IT 
  • We recognise that our teams need investment in tools and skills development, but this remains limited to a few specialist individuals
  • We recognise that we need to prioritise and plan how to permanently improve our data management and reporting
  • We ensure the right people are represented on larger projects and initiatives though many changes still happen before being assessed properly
  • We have access to quite a lot of data and reports but now we need to align these to the metrics we need to better understand our strategic goals 
  • We understand that data is an asset and have prioritised and begun to develop key data policies and processes 
  • We are aware of how to manage risk, but it does not influence our approach
  • Our teams understand that our data assets should be used responsibly but we rely on them to do this rather than support them
  • Data is valued as the source of our key institutional, statutory and internal outputs and the costs are understood
  • Our key activities and responsibilities are data informed 
  • Some key data sets and outputs are accessible, but many remain limited by default, and it can be hard to address this

Level 5 integrated

  • All data improvement initiatives are sponsored by senior management with strong support for providing the resources needed to undertake them
  • Our leadership team includes a range of senior experience in data and analytics delivery and use. We consider these skills in our senior recruitment
  • Our teams use data to find points of difference that deliver clear benefits for our students and our staff
  • We have published a framework to define our data management approach. Accountable individuals ensure we meet the highest external auditing levels.
  • We promote data as an asset that is everyone's responsibility, and as a senior team we accept accountability for the role data plays in successful delivery of key activities
  • Our teams are collaborating to consider how new technologies will impact on our data and to ensure new tools can be rolled out and used without risk to our data assets
  • Collaboration across teams is proactive and we are confident that upcoming requirements and change activity are effectively managed 
  • We have a defined staffing establishment for our data and analysis functions and capabilities, and this is represented within our succession planning
  • Our people, both report providers and report users, are a point of difference and we consider them as leaders who innovate and look to future capabilities
  • Data activities are core to our operations and strategy and are high priority as they underpin and enable delivery in other areas
  • Data capabilities are fundamental to how we operate. We undertake impact assessments early when our functions change to ensure appropriate resource is in place
  • Data is considered offensively and drives questions that prompt new strategic thinking 
  • Our strategy requires data to be managed effectively across the whole institution or college which ensures it remains in the spotlight
  • Risk is clearly measured, considered and embedded in all our practices
  • We have defined acceptable use cases and have appropriate controls in place to ensure our data is never used unethically, whether knowingly or unknowingly
  • The outputs of data are an integral part of how we deliver our most important services, and their value is well understood across our whole organisation
  • We consider our organisation to be data enabled. Everybody understands what that means for them and role specifications and objectives support this
  • Our controls ensure that access and restrictions are appropriate to business needs. We manage risk and changes in circumstances as part of our business as usual processes

Data governance

Level 1 reactive

  • Our datasets have no clear accountability for ownership or sign-off 
  • We don't have specific roles for data management or any concept of why we would need them beyond considering data protection issues
  • There is no collaboration between cross-disciplinary teams to diagnose, troubleshoot or resolve data issues 
  • We don’t know who our key stakeholders are, so they are not part of any process considerations
  • We have no concept of best practice. Our data management is chaotic, and we don't have time to understand scope and reach 
  • It's a struggle to get anyone interested in data governance and we cannot really point to any area that we would say is engaged
  • If anything about our organisational structure changes, no one considers how our data will be affected 
  • When new systems are proposed, understanding the impact on data is not seen as a priority and our data products have broken unexpectedly
  • We have little visibility of the dependencies we have on individuals, which leaves us vulnerable if people move roles or leave the organisation
  • We don't believe data is referenced in any organisational policies 
  • We do not have an approach to improving data quality and there is a sense of apathy towards such initiatives across the business
  • We fix errors as they are reported to us. No analysis of why the failure occurred is done and we do not track issues after resolution 
  • It is not clear what definitions are in place and there is no standard approach to managing them
  • We have no processes in place to track what happens to the data we use, so we struggle when we find errors
  • There is no oversight of how our datasets relate to each other and no processes in place for how they might be combined effectively
  • We do not perform risk assessments on our datasets. Therefore, nothing is recorded in any logs 
  • We've never seen a business continuity plan, and we assume there is no strategy for data assets. 
  • The organisation fears risks associated with GDPR and there are assumptions that data cannot be shared even where there may be a legitimate purpose 

Level 3 proactive

  • Key datasets are a jointly owned asset between those managing the data and those accountable for it. Other data has unclear accountability 
  • We have defined some formal data management responsibilities in our key areas like student support teams
  • We have a forum for sharing issues around quality and other data management issues which meets regularly and is respected 
  • Our stakeholders are known to us and are consulted with on an ad hoc basis - we need to ensure they trust us and will honestly share their informal as well as formal processes
  • The data management function is fit for purpose and no more. Therefore, we do have some best practice, but it is not organisation-wide 
  • Our leadership and operational teams understand what data governance is. We are developing a plan for each area, so people are clear what will drive value
  • When our organisational structure changes, we have processes in place to manage any impact on our data, though the focus is statutory compliance
  • When new systems are proposed, we know there will be an impact on data, and we get enough notice to find and manage downstream impacts
  • We know which of our people understand our processes, though we can sometimes struggle when key individuals leave their posts
  • We have data policies, but adherence is poorly monitored and sanctions are rarely invoked when policies are not followed
  • We have data quality targets and Key Performance Indicators (KPIs), but these are not regularly assessed or challenged
  • We track and record all data issues using automated error checking though we do not yet have standards in place for resolution
  • We have designed and populated a data dictionary for our top priority data area that we believe can be adopted across the university or college
  • We are defining standards to document how data transformations are undertaken and piloting these in one area
  • We understand and have documentation in place for at least two datasets and we can point to examples of where combining them has added value to the organisation
  • We record and review data related risks and issues in our corporate risk logs 
  • Data is included in the Business Continuity Plan (BCP), and we are developing regular testing to ensure it will work under stress
  • Data is generally made available to those that need it, subject to GDPR. However, we struggle with volumes, archiving and deletion.

Level 5 integrated

  • We have a senior management sponsor for all data assets. Information asset ownership is embedded and effectively distributed across the organisation 
  • Data citizenship is embraced by people in all parts of the organisation, who advocate and manage data as an organisational asset 
  • Data issues are worked on collaboratively between all functions, and prioritised according to wider business initiatives and needs 
  • Stakeholder relationships are firmly established and stakeholders are consulted on key topics - they tell us how they actually work, not how they know they are supposed to
  • We ensure that the principles and goals of best practice data management are embedded and advocated in all appropriate policy documents 
  • There is a shared understanding and collaboration across our teams. Connecting how we work and how we monitor our effectiveness is seen as being everyone's responsibility
  • Senior leaders consider the impact on data and reporting outputs when discussing organisational structure changes so that the cost of change is fully understood. 
  • We ensure data specialists join project initiatives at an early stage to advise on the impact of system and process changes on our data, which ensures there is no downstream impact
  • Business continuity is critical. Our framework defines roles and responsibilities and managers reallocate roles and ensure adequate knowledge transition when people leave
  • We have defined and respected data principles, goals and practices which are consistently applied to all our data operations 
  • We ensure all our data is rigorously maintained to the published levels of quality using well-understood metrics
  • Our data quality plan is supported by automated error reporting which is monitored by senior management
  • We have oversight of a data dictionary that covers all our data assets. We actively ensure content remains accurate and engagement remains high
  • We have automated our processes wherever possible to track changes to data values and meaning. This ensures we can manage our data assets effectively
  • The data we hold is a strategic capability. Where new systems or data sources could add value, our standards ensure we are clear how they relate to our existing entities and attributes
  • Risk and quality are actively managed for all datasets. Breaches and metrics are flagged and responded to by senior management 
  • Data drives a significant portion of the BCP as data availability is core to our operations during an event. Plans are always up to date and regularly tested 
  • Subject to GDPR, everyone has access to the data they need in a secure environment and our assurance processes ensure no data is retained any longer than needed 

Processes and systems

Level 1 reactive

  • There is no concept of data architecture in our organisation. Consequently, we have no formal vision, metrics or principles for data 
  • Technology dependencies are hindering us in managing our data 
  • We usually feel on the back foot when the university or college needs something new. We have started system projects that have failed
  • We continue with data collections for no obvious rationale other than 'we've always done this' 
  • We have not mapped out any processes for data so we cannot integrate them with wider business processes 
  • There is no recognisable understanding from any business unit of how data is managed by IT 
  • We do not have clear sight of the systems environment and therefore the extent of data sitting on shadow or non-shadow IT
  • We really struggle with our data - it's all over the place, on individual computers and in some cases in hardcopy formats
  • We are dependent on manual activity to move data from one application to another
  • We have no consistency of data input - our systems do not support this and none of our people have access to any standards in relation to this
  • Data resides in our systems and onward use is through manual extraction on an ad hoc basis
  • When we extract or move data in our systems, we are totally dependent on the person responsible for the extraction to consider if the data is accurate and documented effectively
  • We have no idea how many datasets we have, where they are or what they are used for. Therefore, no master copy exists 
  • Data has no formal presentation. There are no links between what the organisation does and the data that supports it 
  • We don’t have any formal metadata available for any of our core or non-core datasets 
  • We don't have any understanding or visibility of the data available to us or what our systems hold
  • There is no consistency across our systems in how our organisational units, business terms, or their attributes are described
  • We do not have any visibility of the reference data used in systems and onward reporting - what we see is inconsistent

Level 3 proactive

  • We have a data architecture function, but it is not fully staffed, nor does it have a mandate for real enterprise-wide change 
  • We have limited data management specific technology, but this is not well integrated with wider management solutions 
  • Our structures and knowledge ensure we can react to new priorities in a timely fashion, though we would like to be more proactive
  • There is some confusion around why some data is collected, but we understand our systems of record and our primary data feeds
  • We have mapped out our operational processes allowing us to replicate frequent activities in our departmental teams 
  • IT is included in wider business processes around outputs, but not fully integrated with new requirements and change 
  • We know what systems our organisation needs to support its processes. A roadmap is in place to deliver those we do not yet have in place
  • Our data is mostly stored in a secure, backed up environment. This is usually a local server with managed access
  • We have mapped the dependencies between our applications and have an agreed plan and budget in place to automate integrations
  • We know how data should be collected to support our strategic goals. New system developments contain appropriate field level validation
  • We have mapped out how automation of data flows could support more effective analytics, and are starting to plan out how to deliver this
  • We capture limited audit processes and are starting to consider how we might use this information to improve our processes
  • We have an agreed single source for core datasets, even if this means we just know where the copies are 
  • Data is represented through a variety of models and schematics but not well integrated with wider business operating and change models 
  • Metadata is available (if not complete) for our core datasets, but generally developed and maintained within the student team
  • There is knowledge of the data within our systems but it is siloed and we have no formal mechanism to collate it in one place
  • There is some consistency across systems with key identifiers and we have a plan in place to expand this
  • We manage key reference data, including our organisational hierarchy, in a central function though its use in reporting is limited, and we do not yet have appropriate governance processes in place

Level 5 integrated

  • Data architecture is part of our enterprise level blueprint - it plays a major part in operation and change within the organisation 
  • Effective use of technology enables us to manage data across its full lifecycle by analysing, improving and controlling information assets 
  • Our target operating model includes provision to manage new requirements and we aim to pre-empt systems and process changes
  • We regularly review our data collection activities in line with our operational and strategic needs. Data collection is driven directly from these models 
  • Data processes are clearly documented and rigorously maintained; performance monitoring is in place as a 'business as usual' activity 
  • Operational and change activity is seamlessly integrated between IT and the wider business with all roles and responsibilities defined and understood 
  • Our systems architecture forms part of our enterprise architecture. Our focus is on managing change so we are prepared for new requirements from the organisation
  • We are streamlining the data we store as part of our strategy to ensure we only hold the data we need, in a supported, securely accessed environment
  • We have an abstract data layer linking our applications via APIs. This ensures we can change applications when we need to without impact on downstream data needs
  • We have processes and standards to support data collection and entry. All our corporate systems are configured to support these standards.
  • All appropriate data processes, from system administration to corporate priorities like reporting, are supported by automated, scheduled and maintained data flows
  • We have visible and documented audit data and impact assessment for our systems, appropriate error handling processes and a schedule to help us understand how to manage a variety of scenarios
  • Our data is created, integrated, consumed and purged with traceability to the master data model, and supported by rigorous business process 
  • We have a common business vocabulary for data which builds into a master view of the data journey through the organisation 
  • Metadata is complete, rich, managed and maintained. This lack of ambiguity enables high re-use and ensures we can develop new services in a timely fashion
  • Our business has good knowledge of the data held in our systems, it is well-documented, and we understand its potential in helping us address our strategic priorities
  • We have a complete centralised view of the master data in all our business systems. It is well documented in an appropriate format and we consider the impact of changes before we make them
  • We have agreed reference data sets across our organisation, with consensus on content, onward governance, and who is accountable for its purpose and change requests

Reporting

Level 1 reactive

  • We have not considered what reporting is needed across our organisation and any development is purely ad hoc
  • We do not consider "business questions" in our reporting provision and tend to make assumptions about the reporting that is needed
  • The definitions for the reports we build are held in individuals' heads and there is no visibility beyond this
  • Extracting, transforming and loading data seems to be labour intensive as it is not standardised - even for frequent operational activities 
  • We do not manipulate our data; we only analyse it as it is structured in our systems, and we are unable to integrate datasets
  • No one seems to understand how or when our data processing happens or what the impact will be on our reporting if something goes wrong. Fixes happen when a user spots a problem
  • All our data is siloed. There is no modelling or analysis performed when creating or modifying datasets and entities within them 
  • We have little exposure to data storage, definitions, and management options
  • Our data is primarily in raw tables only accessible to IT or business systems owners. We usually start from scratch each time we need something
  • We create reports whenever they are needed, and they are provided solely to those that request them
  • Our reporting outputs are usually in data tables, created either in spreadsheet software or from largely transactional reporting solutions (e.g. SQL Server Reporting Services)
  • Reports are emailed directly to individuals or groups who are then responsible for storing them locally
  • When a senior leader wants to understand an issue, we struggle to get some data together and provide a spreadsheet with basic outputs that often contradicts other reporting
  • We only look at data we have been asked to look at by whoever needs it
  • We can barely provide descriptive data and there is little confidence in the accuracy of the outputs we produce
  • It always feels like we are firefighting, because we are struggling to keep up with demands of operational and change activity 
  • We have no analytics capability other than spreadsheets in individual departments 
  • Teams or individuals involved in data and reporting activity work in siloes and may not be aware of how others work with data and what areas they are responsible for

Level 3 proactive

  • We understand our business areas and sub areas, and have visibility of the reporting in each area
  • We hold our requirements information in a central repository and define key business questions to direct our reporting
  • It is clear what definitions have been used in some of our reporting but there is minimal consistency across reports which can create multiple versions of the truth
  • Our processes are documented and can be carried out by any trained individual. We struggle when the process doesn't create the output we expect 
  • We have standard processes to transform our key datasets but they still depend on some level of manual intervention
  • Our ETL is monitored through ad hoc checks. Where there are issues, our scripts are updated so that our data remains accurate. We do not yet have a consistent approach across people or datasets
  • We have the concept of mastering data through our primary datasets, but interfaces and extracts do not follow a recognisable model 
  • We understand how our operational data store supports some of our reporting but we are not yet clear how we can improve our analytics capability
  • We have standard datasets for some of our business areas containing trend data but coverage is quite limited, and they are only available to the BI or MIS team
  • We provide standard trend reports in our most important business areas at key times of year. Transactional reporting is available year-round
  • We are using software to create standard dashboards where visualisations display key metrics. We can also exploit spreadsheet software to produce visually attractive outputs
  • We have created a dedicated space to hold reports. This ensures data and reports are not emailed or stored unnecessarily. We are working to ensure all reporting is stored there
  • Our analysts can provide information to senior leaders when asked, though it can still take some time. We recognise it could be more customer friendly and put less onus on the end user to work out what it means
  • We are aware that we have access to lots of different data. When we have time, we might have a look at what is there and think about how it might be useful but it is not really a priority
  • We have a good foundation of descriptive data outputs that are mostly trusted by the organisation. We want to consider how to harness data for predictive activity but we lack the skills to do this
  • Our most repeated activities are reasonably well resourced. We struggle to deal with change or new initiatives
  • We have a basic, people-driven analytical capability for one or two datasets 
  • A central function is in place to define best practice. We are starting to define which team owns which area, and where teams should collaborate

Level 5 integrated

  • We have an agreed hierarchy covering the full scope of business activities and users, enabling the management and analysis of our reporting products
  • We present a business question backlog as part of our service. Business questions are the starting point for our data orchestration activities
  • Definitions reflect business questions and statutory specifications. They are available to technical users and the wider business in appropriate formats and are maintained as part of our reporting governance structures
  • Our processes are fully integrated with our organisation's operating model and most of our data operations are automated 
  • We have a documented and well-structured approach to transforming data, creating consistent datasets across all systems and business areas
  • We have a well-documented and maintained schedule for ETL. We know who is responsible for our processes, their accuracy and relevance and issues are flagged and rectified before reporting is impacted
  • We have complete models that drive our data design, and our approach ensures the whole organisation sees 'one version of the truth' 
  • We have considered which methodologies best suit our needs and can articulate to key stakeholders the benefits they will bring to our analytical data management
  • Managed and governed datasets, consistent with our core reporting offer, are available to "superusers" via appropriate reporting software and with appropriate permissions
  • We provide a core offer of transactional and analytical reports by defined user group across the full range of our business areas, accompanied by user focused documentation
  • We have a suite of interactive dashboards which utilise best practice visualisation techniques, built from a consistent corporate template using colour and accessibility best practice
  • A dedicated website or portal provides access to all corporate reporting, guides and documentation, for all user groups. Role-based permissions ensure individuals can access what they need at appropriate granularity
  • Our analysts can access standard datasets to investigate our organisational challenges in innovative ways. Outputs are presented as a visual narrative, directly addressing the issue, informing decision makers
  • We understand our existing and likely new data assets and are proactive in considering how they could help solve our organisational challenges. We support our analysts to develop the skills to exploit them.
  • We are trialling the use of predictive techniques in one of our business areas as the first step on our journey to leverage our data assets more efficiently and effectively
  • Daily activities are largely automated and supported by simple business processes. It is very rare we need to intervene 
  • We have made an investment in predictive analytics capability and supporting technology in support of our most important activities and aspirations 
  • We are collaborating to consider if data mesh architecture would benefit our organisation and how that might be technically enabled to allow our teams to work with consistency and agility

Decision making

Level 1 reactive

  • We have very little visibility of individual skills and confidence in data work and interpretation and we don't really think about it as a priority
  • Decision making is largely intuitive and not supported by data
  • Our teams do not tend to think about how data can support them in making decisions
  • We struggle to access the basic information we need to deliver our processes. It feels like we are always firefighting
  • We find that the information we can access does not tend to be at the level we need it. This means we often keep our own records.
  • The information we see in centrally produced reporting rarely matches what we can see on our systems, so we don't really trust it to help us deliver our processes
  • When we receive a freedom of information request, it's difficult to find anyone who has the time or capability to respond to it.
  • Professional, statutory and regulatory body (PSRB) returns are dealt with by individual areas and there is no oversight or central support for this activity
  • Our major statutory returns are incredibly resource intensive to deliver and we are totally dependent on a small number of key individuals for their completion
  • We struggle to find out basic information like "How many students have we had for the last five years?"
  • We struggle to understand our baseline position, far less get a sense of how different types of cohorts or activity vary from that
  • Benchmarking feels like something that is done to us - via league tables and statutory bodies. We always feel like we are on the back foot.
  • We are focused on our customers - students and external bodies. There always seems to be a new priority and we never seem to have enough people
  • When we consider our strategic direction, we don't really focus on the sector, and we cannot say that we use data to inform our priorities
  • We are mostly focused on our own operations and do not really consider how other organisations utilise data in their decision making
  • When we undertake projects, we don't tend to think about using data to support them
  • We don't really have PIs - our KPIs cover all our areas of measurable activity
  • We have a series of KPIs which someone reports on. It's not clear what actions they drive, and there seem to be lots of them
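The basic question quoted above ("How many students have we had for the last five years?") is, in a more mature setup, a trivial query against a trusted dataset. A minimal sketch using hypothetical, illustrative records:

```python
# Sketch: answering "how many students have we had for the last five years?"
# from a hypothetical flat record set of (year, student_id) pairs.
# The data below is invented purely for illustration.
from collections import Counter

enrolments = [
    (2019, "s1"), (2019, "s2"),
    (2020, "s1"), (2020, "s3"), (2020, "s4"),
    (2021, "s2"), (2021, "s5"),
    (2022, "s1"), (2022, "s6"), (2022, "s7"),
    (2023, "s8"),
]

# Headcount per year (a student enrolled in several years counts each year).
headcount_by_year = Counter(year for year, _ in enrolments)
for year in sorted(headcount_by_year):
    print(year, headcount_by_year[year])

# Distinct students across the whole period.
distinct_students = len({sid for _, sid in enrolments})
print("distinct:", distinct_students)
```

Even in this toy form, the sketch surfaces the definitional questions a reactive organisation struggles with: does "how many students" mean headcount per year or distinct individuals across the period? Agreeing such definitions is part of moving up the maturity levels.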

Level 3 proactive

  • We are actively looking at job specifications to ensure our expectations are clear. We are also starting to ensure decision making responsibilities are clearer to help our teams understand where data may add value
  • Data is trusted to support several key decisions for daily operations and planning purposes
  • Our teams talk about data and reporting a great deal, and often ask for more information. Our focus needs to be on understanding materiality, so that lack of data does not hold us back
  • There is a lot of operational reporting, in systems and produced by central teams. It is not always easy for people to find what they need.
  • We are starting to assess what information we are holding to feed into our reporting teams as requirements so that we do not need to hold unnecessary data locally
  • We understand that we need to be clear what information we need to deliver our operations and how often it needs to be updated. We have not really started progressing this yet.
  • We recognise that it is crucial for our data to be accessible to those dealing with FOI requests so they can respond efficiently and effectively. We are actively considering how to manage this better.
  • We have agreed requirements for PSRB obligations but we are trying to decide how they should be delivered between a reporting team and the department or subject area
  • We have developed processes to create the data we need for our returns. We can still struggle with interpreting guidance and managing the quality of the data we need
  • We can access a basic level of trend information for our key areas of activity but many datasets remain inconsistent
  • Our core reporting includes many of the attributes we want to understand better. It is still hard to tell what we should do next, though
  • We can access external data for our major areas of activity. The datasets can feel very lagged but they do give us a sense of how we compare to the sector
  • We understand the performance of each of our areas but we tend to have a one-size-fits-all approach. We want to use data to understand our different markets better
  • We recognise there are finite resources and customers in the market, and we should define what we believe our share of this should be. We are conscious that we could be better informed
  • We know we want to be data enabled. We know we need foundational work to achieve this and that we need to define what data will bring to our organisation
  • We ensure a data or reporting specialist is part of our project teams, to ensure we can access baseline and impact information if we need it
  • We are trying to align the collection and management of PI data in our area with reporting in the organisation so that it is consistent
  • We have an automated KPI dashboard produced by a named team or individual. It is consistent with other reporting, updated in a timely fashion and accessible across the organisation

Level 5 integrated

  • We view data capability as a core skill for most of our people. Everyone understands the data expectations in their job and has a development plan in place to improve where this is needed
  • Data is presented in customisable analytical outputs and supports us in sophisticated analysis
  • Our organisation is data enabled. We are aware of the benefits and limitations of our data, and people communicate with confidence. Data is embraced as part of the fabric of how we work and how we innovate
  • We actively consider the impact of new processes and projects well in advance to ensure the reporting we need is available, fit for purpose and will be retired when we no longer need it
  • We can access information at multiple levels of aggregation. Where there is a business need, and subject to GDPR, we can access individualised student or staff data to ensure we can progress activities effectively
  • We can access information updated to an agreed schedule, appropriate to the process or decision we need to complete. Our teams understand how this information relates to what we can see on our business systems
  • Managing FOI requests is part of our operating procedures. It is clear who is responsible for data and communications activity and we are comfortable rejecting requests. This is covered within our governance frameworks.
  • We know exactly what PSRB obligations we have and in what format. We proactively manage our relationships with our funders and accreditors to ensure we can manage changes to specifications
  • Our organisation prioritises its statutory obligations. Our governance processes ensure guidance is interpreted appropriately for our environment, our teams understand definitions and usage, and change is managed
  • We can access a core level of trend information for all our major areas of activity that is trusted and utilised across the organisation
  • We can ask specialists to use statistical techniques to interrogate datasets to determine causality between factors - this can help us get a much more nuanced understanding of what we should be doing better
  • We have agreed comparator groups tailored to our different business areas. We know where we rank against these in internal and external metrics and use these to inform our strategy and priorities
  • Our areas have clearly defined roles within our operating plan. We assess performance against targets suitable for each market and focus on organisational strengths rather than intra-subject comparators
  • We consider the future - population and volume growth - and diversifying markets in our strategic thinking. We can access or request information which informs our priorities
  • We aspire to be market leading in how we use data. We encourage our people and our specialists to innovate in how they think, and the tools they use. We all agree this approach will benefit individuals and the organisation
  • Our change activity is driven by our strategy. This means we already understand our baselines, so our focus is on monitoring the impact of our development activity
  • Our business areas have agreed PIs based on the operating plan. Where an area is an organisational priority, we monitor our PIs to help inform progress in the linked KPI
  • Our KPI reporting includes drill through to linked PIs, metrics, trend and transactional activity, to ensure our senior teams can monitor how we are progressing towards meeting our targets

Culture

Understanding your current maturity may reinforce what you feel you already know – or it may provide visibility of a wider range of relevant processes and considerations.

Next section: culture