OfS consultations

The Office for Students (OfS) has issued three related consultations with detailed proposals on its approach to regulating quality and standards in higher education.

As the three consultations are extremely long and detailed, the EPC has produced brief summaries of each of the consultations’ key points to help distil the detail for our members (with thanks to London Higher for their starting point summary) and an introduction to the general policy implications for engineering.

Although each consultation stands alone, the EPC has produced two member surveys to ensure the EPC’s response to the consultations is fully representative: one on quality and standards (B3) and one on excellence (TEF). Where we think there is a particular relevance to engineering, we have also included some general questions on the detailed metric proposals; please skip these if they are too technical in nature, or provide details where you think we should be making more technical comments.

We hope the information we have provided will help you understand the proposals and respond to our surveys by Wednesday 9th March. Your comments will enable us to respond fully on members’ behalf by the OfS deadline of 17th March.

If you require further information, links to the consultations themselves are included below, and the OfS has provided introductory video presentations about each of the consultations and recorded events and slide packs on their website.

  1. Consultation on a new approach to regulating student outcomes (changes to B3 baseline)
  2. Consultation on the Teaching Excellence Framework (TEF) 
  3. Consultation on constructing student outcome and experience indicators for use in OfS regulation 

Regulating student outcomes - February 2022

The 'Regulating student outcomes' consultation outlines a new approach to setting ‘minimum requirements’ for positive HE outcomes in England. The continuing narrative is the threat that ‘low performing providers’ pose to the public purse and public confidence. The consultation is part of the wider risk-based regulatory approach we have seen, which is couched as reducing regulatory burden for some (low-risk) providers. The proposals include ‘non-determinative’ minimum outcomes thresholds, meaning that falling short of these will trigger other processes prior to a limiting judgement (i.e. any penalties). A host of metric estimates will identify potential underlying performance issues which, depending on OfS’s focus, would lead to a closer look at the provider’s context submission and compliance history. This could lead to an improvement notice. While minimum requirements are proposed for a dizzying number of indicators, no subject-level thresholds are identified, although it is proposed to unpick provider performance at subject level (‘engineering’, not course or engineering discipline) to ‘enable us to identify pockets of provision where a provider is delivering outcomes below our numerical thresholds even though its overall performance is above our numerical thresholds.’ OfS is looking to introduce the changes as soon as July 2022, with publication of all indicators in September and identification of providers for assessment by October 2022.

Regulating student outcomes: summary of OfS proposals


  • Condition B3 will include minimum requirements for ‘positive student outcomes’.
  • These will be set using a series of numerical thresholds. These are sector-wide numerical minimum expectations.
  • Minimum thresholds will be based on performance in absolute terms. While there is no benchmarking, in calculating the minimum thresholds, OfS has made some downwards adjustments to take account of sector-wide historical (data) context.
  • Numerical thresholds are calculated for three student outcomes:
    1. the proportion of students continuing on a higher education course;
    2. the proportion of students completing a higher education qualification; and
    3. the proportion of students progressing to managerial or professional employment, or further study (note, it is no longer proposed to use LEO earnings data for this measure).
  • Degree classification metrics have also been dropped from earlier B3 proposals and will continue to form part of Access & Participation Plans.
  • A set of indicators will be produced, each with its own threshold. Each indicator will be formed by student outcome + mode + level.
    1. As described above, student outcomes are: continuation; completion; progression +
    2. Mode relates to mode of study: full-time; part-time; apprenticeships +
    3. Level relates to nine levels of study: first degree; other undergraduate; undergraduate with postgraduate components; postgraduate taught masters; postgraduate research; PGCE; other postgraduate; undergraduate apprenticeships; postgraduate apprenticeships.
  • This could mean each provider having up to 48 indicators – and 64 thresholds (there are two completion thresholds) – depending on the range of modes and levels of courses it offers.  See figure D1, p95.
  • There will additionally be split indicators to disaggregate performance by indicator. Split indicators are: time series, subject, student characteristics, course type, teaching arrangements.
  • Split indicators will not have thresholds. This means there will be a single numerical threshold across all subjects and across course types not reflected in the nine levels of study above (e.g. integrated Foundation Years and HTQs).
  • This extensive dataset will be published to providers as a dashboard every year. There will also be publicly available information on provider performance. It is not clear if all provider data will be in the public domain.
  • For the purposes of the consultation, the accountable officer in all registered providers has been given access to the provider dashboard. Meanwhile, sector level data and a fictional provider dashboard are available on the OfS website.
  • Engineering performs well above threshold in OfS sector-level data, except for part-time continuation on undergraduate courses with postgraduate elements (where approximately half is below threshold).
  • For those of you in Materials and technology (CAH10-03), completion is predominantly below threshold for undergraduate apprenticeships; part-time first degree, postgraduate taught and other postgraduate; and full-time other postgraduate. There are also pockets of continuation underperformance part-time (postgraduate taught and other postgraduate).
  • The numerical thresholds will not operate as an automatic mechanism for determining compliance with condition B3. Instead, where a provider’s outcome data is not at or above the numerical thresholds, contextual information will be used by OfS to make a judgement about whether the provider has nevertheless achieved positive outcomes.
  • This will take the form of information provided by the institution plus information to which the OfS already has access, including an assessment of benchmarked values.
  • Information which may be taken into account includes factors that may explain the reasons for a provider’s historical performance and ‘actions a provider has taken, or will take, to improve its performance, and the extent to which those actions appear credible and sustainable and capable of improving the provider’s performance.’
  • The following contextual information has been deemed irrelevant and will not inform the judgement: the mission and strategy of a provider; funding; entry tariff; sustainability; and reputation.
  • Regulatory action will take the form of an Improvement Notice, which will specify indicators requiring improvement, require a provider to take actions and/ or require evidence of sustained improvement on relevant indicators. Regulatory action will not be taken in relation to every provider with an indicator below one or more numerical thresholds each year.
  • Providers would be selected by a yet-to-be-determined prioritisation exercise. The following ideas are mooted in the consultation: thematic; focus on the most severe breaches, by either the number or proportion of a provider’s students or the distance between indicator value and threshold; focus on those breaches related to particular groups of students, where there is the strongest statistical confidence in the data; a random approach.
  • Regulatory action may focus on either the single indicator which is below the numerical threshold or look across all the indicators which fall below the numerical threshold across the institution. This is a trade-off between more frequent, shorter assessments versus in-depth assessments across institutions.
  • Numerical thresholds will be reviewed every 4 years, in line with but staggered from the proposed TEF assessment cycle.
  • The OfS does not intend to ‘ratchet up’ the numerical thresholds over time as continuous improvement is incentivised through TEF – at undergraduate level, that is.
  • Eligibility to apply for and retain a TEF award will be determined by a provider’s current and previous compliance with the B conditions, including condition B3, as it may not be considered ‘appropriate for a provider to obtain a TEF award that signals ‘teaching excellence’ if there is or has been a breach of our minimum requirements for student outcomes.’
  • Separate consultations are proposed to consider:
    1. Benchmarking approaches (see the current TEF consultation).
    2. Student outcomes for courses which are delivered in partnership agreements; these are, nonetheless, expressly in scope, and registered providers will take responsibility for them.
    3. Transnational education (TNE) which is out of scope of these proposals.
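
To make the indicator arithmetic above concrete, here is a minimal sketch of how up to 48 indicators and 64 thresholds can arise from outcome + mode + level. The mode–level pairings below are our assumption (full-time and part-time pairing with the seven non-apprenticeship levels, and the apprenticeship mode with the two apprenticeship levels); figure D1 (p95) of the consultation gives the definitive matrix.

```python
# Illustrative arithmetic only: our reading of how the proposed indicator
# set is constructed. The mode-level pairings are an assumption; the
# consultation's figure D1 (p95) is definitive.

outcomes = ["continuation", "completion", "progression"]

non_apprenticeship_levels = [
    "first degree", "other undergraduate",
    "undergraduate with postgraduate components",
    "postgraduate taught masters", "postgraduate research",
    "PGCE", "other postgraduate",
]
apprenticeship_levels = [
    "undergraduate apprenticeships", "postgraduate apprenticeships",
]

# Assumed pairings: full-time/part-time with the seven taught and research
# levels; the apprenticeship mode with the two apprenticeship levels.
mode_level_pairs = (
    [(mode, level)
     for mode in ("full-time", "part-time")
     for level in non_apprenticeship_levels]
    + [("apprenticeships", level) for level in apprenticeship_levels]
)

# One indicator per outcome x (mode, level) combination.
indicators = [(outcome, mode, level)
              for outcome in outcomes
              for (mode, level) in mode_level_pairs]

# Completion carries two thresholds, so each completion indicator
# contributes one extra threshold.
thresholds = len(indicators) + sum(
    1 for outcome, _, _ in indicators if outcome == "completion"
)

print(len(mode_level_pairs), len(indicators), thresholds)  # 16 48 64
```

On this reading, 16 mode–level combinations across three outcomes give 48 indicators, and the extra completion threshold on each of the 16 completion indicators brings the total to 64 thresholds, matching the figures in the consultation.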

Regulating student outcomes: implications for engineering


Access and participation

OfS’s proposals are based on the assumption that ‘a provider has a considerable influence over the outcomes students achieve, and that factors beyond its control are not so extensive as to make it impossible to establish a minimum expected level of performance.’ They also hold that ‘meaningful’ improvements in access and participation must be built on a minimum level of quality and standards including ‘taking steps to meet the needs of students from underrepresented groups where those needs are different from other students’ needs.’

There is a vast literature on the structural barriers experienced by disadvantaged learners throughout their education. By explicitly rendering universities accountable for addressing the failings of pre-higher education, OfS will encourage HE institutions to avoid selecting students with lower prior attainment or any other circumstances that might hamper their employment outcomes (such as socioeconomic disadvantage, ethnicity, gender, disability, etc.). The effect will obviously be the opposite of that intended, undermining OfS’s legal duty to promote fair access.

Ultimately, the approach as it stands is likely to encourage universities to be risk-averse at admissions, effectively raising the bar for any students who don’t look like those who have been successful in the past. This will likely close down opportunities and make higher education ever more the preserve of those whose pre-existing privilege is the most ingrained, while marginalising those for whom the transformative potential of higher education is the greatest. Furthermore, although we do not want to endorse a post-pandemic deficit model, it is unrealistic to ignore pre-HE ‘gaps’. The widely evidenced amplification of disadvantage will distort the creaming effect further.

Universities will be incentivised by these proposals not only to recruit those who are less likely to drop out and more likely to ’succeed’ in the graduate labour market, but they will also face potentially severe punishments for doing anything other than that. With entry tariff cited as irrelevant context – notwithstanding the messaging that KS5 results are not an equitable proxy for potential – creaming is inevitable. Meanwhile, providers who currently make the greatest contribution to the access and participation agenda risk being penalised for their efforts.

Diversity

There will be pressures against diversity of students for the same reasons as above, and this will be felt in engineering where SEND teaching approaches are commonly applied even when there is no diagnosis. Engineering’s efforts to better balance gender (women earn less than men) would be disincentivised. It also seems likely that these proposals would trigger the re-introduction of the traditional maths and physics A level prerequisite, undoing years of work to improve access and diversity, particularly gender diversity in engineering which is essential to address the well documented skills gaps.

With HE planners able to pore over a host of split indicators, the volume of data required to hunt the outliers to improve progression rates – together with the complexity of internal analysis of split indicators by other split indicators (e.g. subject by sex and domicile) – could reasonably be such that the resource needed to understand the student factors affecting underperformance is greater than the benefit. Why bother when the known ‘problem areas’ (e.g. courses in the arts and any students who don’t look like those who have been successful in the past) can simply be cut off at source? Even without such a sweeping approach, without the ability to target adequately, providers will have little choice but to use a sledgehammer to crack a walnut; a disaster for diversity.

Teaching and learning

This approach threatens multidisciplinarity and innovation (if-it-ain’t-broke-don’t-you-dare-try-to-fix-it and copy-whatever-works-for-someone-else-or-always-used-to-work). Institutional autonomy allows and encourages innovative approaches, driving quality and adaptation to an ever-changing world (which is particularly important in Engineering). It facilitates innovation and diversity to serve diverse needs and a diverse body of students, serving diverse societal, economic and labour market needs and is often cited as a contributing factor to the UK’s high quality HE sector. OfS should use its position and powers to encourage innovation rather than incentivise homogenous approaches designed to deliver metric-satisfying outcomes.

Outcomes

Defining a ‘positive outcome’ solely in terms of ‘the proportion of students progressing to managerial or professional employment, or further study’ fails to address the concerns raised by the EPC and others that the purpose of higher education is broader than employment, as previous attempts to define “graduateness” have proved.

By recognising only the ‘exchange value of HE’ – that is, the development of skills and knowledge in order to pass the required assessments and gain a ‘qualification’ that will allow the individual to access career choices that are not possible without this qualification – the OfS rejects the other aims of HE, including:

  • use value: the development of skills and knowledge for use in employment and everyday life;
  • self-actualisation: the benefits for mental and spiritual wellbeing given to an individual when they are able to reach their full potential;
  • education of the next generation: by educating one generation, we are providing the generations to come with a support network better able to nurture their educational needs. The critical role played by the family in supporting education has never been clearer than during the pandemic, when parents and guardians have been expected to home-school.

This leads to a distortion of education where “teaching to the test” becomes the norm; it also leads students to view their educational experience as transactional rather than the rich, transcendent experience that it should be. This generates an employee pool which does not reflect the requirements of business: lifelong learners driven to deliver their very best every day.

Context

Employment outcomes are only indirectly related to factors within universities’ control, such as the quality of teaching and support. The outcomes baselines do not adequately consider that these measures are highly influenced by region, by industrial sector, by imbalances in recruitment practices (gender, socio-economic background, etc.) and by wider economic conditions. Outcomes will always be the product of the health of the economy in the context of regional, national and global market forces that are far beyond the control of the sector. We note that factors outside a provider’s control inform the historical performance aspect of the context submission, but with recession, pandemic and recovery accounting for most of the past decade and a half, and accepting that employment measures for the next few years may be wildly unrepresentative of actual standards or performance over time, this surely undermines the whole process anyway?

Regional development and levelling up

The value of higher education in improving social mobility, developing local economies, and “levelling up” – especially post-pandemic – is in addressing the precise inequalities that these proposals ignore. Under these proposals, institutions whose graduates supply workplaces in disadvantaged parts of the country will be in the bottom third of the table for employment and at risk of being fined. It has been pointed out that the best thing a university like Sunderland could do is relocate to London.

The approach would benefit from a regional appraisal and nuance to prevent local brain drains and enforced geographic mobility. Local retention and employment is not only a positive outcome for graduates, but also a positive choice for the Government’s goal of levelling up regions by creating high-skilled employment in disadvantaged areas. Given regional variations in the labour market, the goal of regional development through educating and upskilling the local workforce is inconsistent with the desirable outcomes as set out in the strategy as it stands. It should not fall to universities to resource consultants to outline regional priorities to a government regulator.

Standards

Universities will be disincentivised from letting anyone fail, leave or even repeat a year. Instead, everyone will pass (grade inflation) and students who have lost enthusiasm for their course will be conveyor-belted through rather than allowed to drop out.

Engineering is heavily regulated; in any accredited engineering course there is already a very clearly defined set of standards, governed by the Engineering Council and assured by Professional Engineering Institutions. Indeed, PSRBs already set standards in terms of learning outcomes in engineering (and many other professional higher education courses). Accreditation standards are high; but if meeting accreditation standards threatens performance metrics, universities may opt not to be accredited rather than risk letting students fail or drop out. This is a threat to standards.

We would suggest prioritising subjects where there are no PSRBs.

High-cost courses

While expensive to deliver, engineering is likely to be seen by universities as having a positive effect on their indicators overall, which could be good news for the discipline. However, these proposals are clearly a threat to the arts. How will universities be able to afford to run the more costly engineering courses without cross-subsidisation?

Non-standard provision

These proposals may deter future partnerships, limit student choice, and disproportionately impact small, specialist providers. For example, data for student cohorts of fewer than 23 will be suppressed. The proposed statistical uncertainty approach in this context relates to the size of the student population, where smaller numbers can lead to greater uncertainty.

Teaching Excellence Framework (TEF) - February 2022

As promised, the OfS is consulting on the future of the TEF, based on the narrative that putting a reputational spotlight on provider quality informs student choice and drives improvement. The TEF builds on the proposed continuous minimum requirements for positive student outcomes under Condition B3, shifting the focus periodically to excellence in teaching, learning and student outcomes. The consultation follows a statutory independent review chaired by Dame Shirley Pearce, which argued for a rebalancing of metrics and qualitative information articulating an institutional view, evidence of excellence, and institutional educational gain measures. The Pearce Review also recommended that students be supported to make their own independent submission alongside their provider. Under the new proposals, universities and colleges in England would be assessed on undergraduate courses for a TEF award – of gold, silver or bronze – every four years, plus a new designation of ‘requires improvement’. It is proposed to open the submission window between September and November 2022 and announce outcomes in spring 2023. The last TEF exercise was carried out in 2019.


TEF: implications for engineering


Indicators

Students’ self-reported satisfaction in the NSS is emphatically not equivalent to teaching quality, but rather a reflection of the gap between the student’s expectations and what they perceive to have been delivered. This can be seen in the long-standing variance in self-reported satisfaction rates in NSS data between students studying STEM subjects, which tend to be higher than among students studying arts subjects. Most students lack an objective point of reference for what good teaching at higher education level looks like and, while student satisfaction is an important indicator to monitor, it is not appropriate to use it as a proxy for teaching quality.

There are also other problems with NSS data, including: (a) the likelihood that satisfaction is less dependent on teaching quality than on demographic patterns (gender, age, socioeconomic background, ethnicity, etc) and on students’ circumstances (commuter students, part-time study, part-time work, etc); (b) the possibility of gaming; and (c) non-comparability owing to factors such as student boycotts.

The NSS is also subject to a continuing review.

Educational gain

Educational gain is a seriously underdeveloped concept intended to recognise the diversity of mission. It is not to be confused with learning gain, the HEFCE work on which all but disappeared under the OfS. Under the proposals, providers will report on the educational gain of their students (and, importantly, how this may vary across subjects) with little guidance. See Wonkhe for an interesting blog on this important facet.

Requires improvement

The proposals introduce a new award category of “requires improvement” (RI). So, what is TEF if not designed to be used heuristically? That is to say, it is used over-simplistically as a shortcut to ‘good’, ‘okay’ and ‘bad’. Heuristic information discourages students from further consideration of their choice and encourages sub-optimal choices. A greater range of outcomes compared with previous iterations of the TEF will simply shortcut some of those institutions previously at the lower end of the Bronze level to ‘really bad’. ‘Requires improvement’ can only be extremely damaging unless it is applied very leniently in terms of context.

Given the availability and presentation of the proposed student outcomes data, in Engineering the dashboards will, in almost all cases, provide the confidence students need without the need for TEF. Research shows that students choose subjects first, not universities.

Qualitative statements

In the context of RI, these will be essential to pick out institutions adding a great deal on the basis of prior attainment, specialist (and arts) institutions, and those delivering local education and regional levelling up. However, giving additional weight to a provider’s case for its own excellence in its submission will create inequity: smaller universities and specialist providers (those most likely to be rated RI) will not have the resources for specialist consultants to write these.

Accreditation

In Engineering, accreditation is the baseline, regardless of the B3 proposals. There is an established system of accreditation of engineering degrees, and degrees accredited by Professional Engineering Institutions licensed by the Engineering Council are recognised internationally through a number of international accords. The accreditation process focuses on assuring that degrees will deliver to at least a threshold standard of learning outcomes specified by the engineering profession. These learning outcomes are developed and maintained in consultation with employers and other stakeholders. There may be synergies between subject-level TEF and accreditation, although the two have distinct purposes and should never be conflated. At the moment, the two exercises create obfuscation rather than clarity. For example, some Bronze courses may be accredited under the Engineering Council while some courses deemed Gold may not be. Of the two, accreditation serves its purpose more effectively.

The new four-year TEF cycle means a provider will hold an award for longer; coupled with the accreditation timescales in Engineering, this is likely to mean that anyone using the rating may be looking at information that is constantly partial, out of date and potentially conflicting.

Timing

The plan to open the TEF submission window in September 2022 will make for a high-pressure start to the new academic year, putting added strain on already-stretched staff who are set to be dealing with a third academic year under the uncertainty of Covid.

The exclusion of TNE (i.e. the Aggregate Offshore Record) and of HE modules and credit-only courses is not commensurate with current policy thinking around microcredentials and the Lifelong Loan Entitlement (LLE). We fear that the overall approach adopted across the OfS does not adequately accommodate courses which allow students to do anything other than join a course, stay the duration, and graduate. Given the Government’s intention to expand flexibility throughout the education sector to encourage lifelong learning, modularity and hop-on-hop-off courses, the failure to consider the TEF’s applicability to hop-on-hop-off provision and emergent Higher Technical Qualifications (HTQs) is remiss. This is a game changer for the sector.

Student outcomes and experience indicators

The OfS proposals for regulating student outcomes and the TEF involve using detailed data and analytical evidence to inform their regulatory judgements. The focus of this consultation is on the technical aspects of the outcomes: how the data thresholds and indicators proposed are constructed, presented and interpreted. This consultation also brings the OfS Access & Participation data dashboards into scope, to establish consistency and alignment of indicators and their parameters across the B3 conditions, the TEF and the Access & Participation dashboards (despite differences in their coverage). It is proposed to continue with an update of the latter in Spring 2022, then to publish another version later in 2022 following consultation on these measures. TEF data will be updated annually.

Student outcome and experience indicators: summary of OfS proposals


  • The consultation proposes a continued use of the indicators that currently inform assessments of condition B3 and TEF awards, and of the regulation of access and participation, with some changes:
  1. ‘Measures of continuation, completion and progression’ – i.e. student outcomes indicators – are proposed as the continuing basis for assessments of condition B3.
  2. TEF assessments will draw on both student outcomes and student experience indicators (measures of continuation, completion, progression, and student experience).
  3. Regulation of access and participation will continue to use access and outcomes indicators (‘measures of access, continuation, degree outcomes and progression’).
  • The consultation proposes the introduction of a completion measure alongside continuation.
  • It proposes to measure progression using data from Graduate Outcomes (for which data exists for two cohorts currently – 2017-18 and 2018-19).
  • International students do not feature in the Access and Participation data dashboard student populations, but they do in some of the populations used to assess B3/make TEF awards. The proposal is to exclude them from the calculation of progression measures, with a commitment to review this in future.

Student outcome and experience indicators: implications for engineering


While the EPC welcomes consistency of approach, we are concerned that the limitations and constraints of the existing metrics and their collection methods undermine the utility of this approach to the point where it is no longer fit for purpose.

Of particular concern in engineering:

Value of HE

The use of the Graduate Outcomes dataset supports out-of-date thinking around measuring success through high-achieving entrants going on to earn large sums. It does not account for students’ own views about their post-HE destinations, nor does it capture the distance travelled by students from different backgrounds entering higher education. What’s more, the survey’s current below-target response rate, together with structural response bias, has not been helped by changes to the methodology around telephoning students.

While the use of LEO data has been dropped from the formal metrics, the OfS has shared that it expects LEO data to form part of context statements.

Professional pathways into engineering

We note the limitations of the Graduate Outcomes survey’s Standard Occupational Classification (SOC 1-3) approach. Specialist engineering graduates, in particular, may not progress into what is classified as conventional graduate work (for example, aid work) yet may be considered world-leading.

The SOC’s 10-year revision schedule is particularly limiting for the fast-paced technological discipline of engineering. The survey’s approach to interim study and employment activities also disadvantages engineering graduates, since those going on to professional pathways or taught postgraduate programmes may disproportionately record negative interim outcomes.

International

The partial exclusion of international students from metrics – in particular progression – is likely to lead to an incomplete and potentially inconsistent picture, particularly in Engineering, where the proportion of international students is higher than in almost any other discipline.

Foundation years

Foundation year students will not be separated out from first degree numbers. This will discourage institutions from offering this pathway for access in case students do not progress. This may align with the Augar review’s recommendation that foundation years should be defunded, but it is at odds with the same review’s call for greater flexibility in the delivery of courses, particularly in the form of hop-on-hop-off models and recognition (through interim qualifications) of each level reached.

Lifelong learning

This approach is not fit for purpose in a more modular system of higher education as presented in current Government thinking for the LLE. The consultation does not adequately accommodate courses which allow students to do anything other than join a course, stay the duration, and graduate, and its inapplicability to hop-on-hop-off provision and emergent Higher Technical Qualifications (HTQs) is a serious gap given the Government’s intention to expand flexibility, modularity and lifelong learning throughout the education sector. In Engineering, there are already calls for the Institute for Apprenticeships and Technical Education to invite Ofsted to join accreditation visits; making better use of existing quality mechanisms should be more fully considered by the OfS.

Student factors

OfS has not evidenced a sufficient grasp of how student factors influence progression and outcomes; evidence shows that one of the most significant influences is whether a student comes from an affluent background, with school attainment a stronger predictor of graduate earnings than subject or university choice. BTEC students, for example, have higher non-continuation rates on engineering courses than students with A level Maths and Physics. When they do graduate, they face higher hurdles in gaining employment because they may lack the social capital and may not have had the same extra-curricular opportunities. However, the relative earnings premium for BTEC students in engineering is greater than for their high-achieving A level counterparts. This complexity is lost in the proposals.

Local Skills Priorities

The proposed progression benchmark draws on ‘geography of employment and earnings’ quintiles, but the reality of how higher education institutions help to deliver local skills priorities is far more complex than this, particularly in engineering.

Proxy measures

The use of proxies (e.g. outcomes for teaching quality, compound indicators for completion) inevitably leads to gaming. Continuation and completion measures are particularly liable to this, given the proposals to treat the outcomes of students who have transferred to another course or provider as neutral, and to recognise completion of a lower award than originally intended as a potentially positive outcome. For engineering, this means that MEng to BEng transfers would count as a positive outcome.

Further information


You can read the full OfS consultation and watch introductory video presentations here. You can watch the recorded OfS consultation / Q&A event here.