Surveys for Enhancement Conference 2011

  • Date: 19 May 2011

  • Location/venue: The National College for School Leadership, Triumph Road, Nottingham, Nottinghamshire, England, NG8 1DH

 

This conference brings together research and practice in using student experience surveys in order to explore their implications for the enhancement of learning and teaching. The event will be attended by a broad range of people interested in using student experience data to enhance policy and practice, including senior management with strategic learning & teaching roles, educational developers and lecturers at the chalk face.

 

The main themes for the conference are:

 

  • Using the National Student Survey to support enhancement;
  • Postgraduate Taught and Research Experience Surveys;
  • Linking undergraduate and postgraduate surveys to explore institutional, subject or programme-based issues;
  • HEI internal surveys;
  • Making the most of qualitative data.

 

This page contains the programme for the day (with links to presentations), as well as some useful links.

 

 

Programme

 

0930 - 1000: Registration

Morning Keynote

1000 - 1030: Students’ evaluations of university teaching: dimensionality, reliability, validity, potential biases and usefulness
Presenter: Professor Herb Marsh (University of Oxford)

Parallel sessions 1-4 (1040 - 1125)

  • Making NSS data count: Using qualitative data-gathering to illuminate NSS results and stimulate institutional action
    Paul Richter and Sarah Walsh (Newcastle University)
  • Rediscovering a formula to enhance the student learning experience: Lessons from advanced analyses of the NSS dataset
    Dr Aftab Dean (Leeds Met University), Steve Probert (BMAF subject centre) and Dr Mark Langan (Manchester Metropolitan University)
  • Improving student satisfaction: A whole-institution strategic enhancement approach?
    Dr Andrew Turner, Dr Christine Broughan, Katie Hartless and Ian Dunn (Coventry University)
  • PRES and PTES: promoting the surveys, engaging with students
    Paul Tobin (NUS)

Parallel sessions 5-8 (1135 - 1220)

  • Student engagement, satisfaction and study within different research environments: an analysis of NSSE, NSS and RAE data at the University of Reading
    Dr John Creighton (University of Reading)
  • Can’t get no satisfaction: discrepancies between NSS qualitative & quantitative data and implications for quality enhancement
    Dr Clare Milsom (Liverpool John Moores University)
  • Using the Assessment Experience Questionnaire to engage Course Teams in the revision of programme-level assessment regimes
    Professor Graham Gibbs and Yassein El-Hakim (University of Winchester)
  • Using survey data to enhance the taught postgraduate experience: issues and challenges
    Dr Rachel Segal and Dr Laura Hodsdon (Higher Education Academy)

Lunch (1220 - 1300)

Afternoon Keynote

1300 - 1330: Measuring student engagement: Findings from the Australian Survey of Student Engagement (AUSSE)
Presenter: Ali Radloff (Australian Council for Educational Research)

Parallel sessions 9-12 (1340 - 1425)

  • From individualised quality enhancement to national quality assurance: an Australian perspective
    Deanne Gannaway (University of Queensland)
  • Rage against the machine? The views of academic staff towards the National Student Survey
    Adam Child (Lancaster University)
  • Compared to whom? Benchmarking NSS scores, from subject comparisons to identifying institutional priorities
    Jason Leman (Sheffield Hallam University)
  • Exploring the postgraduate experience
    Dr Rachel Segal and Dr Laura Hodsdon (Higher Education Academy)

Question and Answer sessions

1435 - 1515: Questions from delegates on the theme “The benefits and challenges of using surveys for enhancement”
Panellists: Professor Adrian Randall (Chair, University of Birmingham); Alex Bols (NUS); Dr Rachel Segal (HEA); Professor Herb Marsh (University of Oxford); Hannah Pudner (HEFCE); Ali Radloff (Australian Council for Educational Research)

1530 - 1600: Opportunity for informal Q&A with the panellists


 

 

Keynote Presentations

Professor Herb Marsh (University of Oxford)

Students’ evaluations of university teaching: dimensionality, reliability, validity, potential biases and usefulness

 

Herb Marsh is Professor of Education at Oxford University, having spent much of his career in Sydney after completing his PhD at UCLA. He is widely published (350 articles in 70 journals, 60 chapters, 14 monographs, 350 conference papers) and co-edits the International Advances in Self Research monograph series. He founded the SELF Research Centre, which has 450 members and satellite centres at leading universities around the world. In 2008 he was awarded one of the highly competitive ESRC Professorial Fellowships (awarded to only 3-5 social science researchers across all of the UK).

 

Ali Radloff (Australian Council for Educational Research)

Measuring student engagement: Findings from the Australian Survey of Student Engagement (AUSSE)

 

Ali Radloff works as a Research Fellow as part of the Higher Education Research Programme at the Australian Council for Educational Research (ACER). During her time at ACER, she has been closely involved in several large-scale survey and assessment projects run within Australia and internationally. She has worked on a diverse range of projects in the areas of student admissions, student learning, teaching, student outcomes, and academic leadership. She has worked with all Australasian universities as well as several other higher education and tertiary education institutions, peak bodies and government agencies. As part of her work in higher education research, Ali manages the Australasian Survey of Student Engagement (AUSSE), a large cross-national survey of undergraduate students, postgraduate coursework students and academic staff at Australian and New Zealand universities, private providers and tertiary education institutions.

 


 

Presentations

 

Parallel session 1

 

Making NSS data count: Using qualitative data-gathering to illuminate NSS results and stimulate institutional action

Paul Richter and Sarah Walsh (Newcastle University)

 

The NSS exercise clearly represents a useful source of student opinion data for HEIs. According to a recent HEFCE-commissioned study, the NSS “has become generally accepted across the sector as a valuable contribution to quality assurance and enhancement.” It is based on a robust instrument, it generally achieves a high response rate, and it has a “strong foundation in a theory of student learning that links student experiences to learning outcomes.” (HEFCE, 2010: 9)

 

At the same time, as with all surveys, it has its limitations. While the data can point towards areas of the student experience with which respondents are (dis)satisfied, there is often ambiguity about which particular aspects of students’ experience led to those feelings, and therefore about what lessons HEIs can draw about student expectations and what action it might be appropriate to take. So while it is useful to know, for example, that half of the respondents disagreed with the notion that ‘the course is well organised and is running smoothly’, it is unclear which elements of the course were poorly organised and in what ways. Nor does the statistic help us understand what students’ expectations were in terms of course organisation. Thinking specifically of how useful the NSS exercise is as a basis for implementing enhancements, an HEA-commissioned report observed that “how to use the data as part of an integrated set of knowledge resources within institutions is a challenge.” (2007: 6)

 

In this presentation we will set out how one Russell Group institution is turning to qualitative data-gathering in an effort to better understand and further explore some of the issues raised by the NSS data. While the NSS free text comments section does provide respondents with an opportunity to qualify their opinions, more systematic qualitative data-gathering exercises can be very valuable for posing the ‘how’ and ‘why’ questions to students, for challenging students’ opinions in a non-threatening way, and for seeking students’ recommendations for practical improvements.

 

In recent years Newcastle University’s Student Opinion Steering Group has commissioned a series of research studies employing qualitative data-gathering methods. In this presentation, we will set out the research we have undertaken for the University which has yielded some interesting, and at times unexpected, findings to supplement and illuminate the broad trends identifiable from the NSS data. The latest study we are undertaking is designed to maximise the impact of the findings. As such, it will involve direct engagement between the research team and school-based staff with a view to stimulating meaningful discussion of the student voice and of how schools might enhance the student experience. During this presentation, we will also report on this experience.

 

We are confident that the work we have undertaken at Newcastle University holds lessons for the sector more broadly. We will invite audience members to critically reflect on our work and encourage them to share any experiences of utilising qualitative data-gathering methods to supplement NSS data.

 

References

 

HEA (2007) An exploratory evaluation of the use of the National Student Survey (NSS) Results Dissemination website. http://www.heacademy.ac.uk/assets/York/documents/ourwork/nss/web0274_national_student_survey_full_report.pdf

 

HEFCE (2010) Enhancing and Developing the National Student Survey. http://www.hefce.ac.uk/pubs/rdreports/2010/rd12_10/rd12_10a.pdf  

 


 

Parallel session 2

 

Rediscovering a formula to enhance the student learning experience: Lessons from advanced analyses of the NSS dataset

Dr Aftab Dean (Leeds Met University), Steve Probert (BMAF subject centre), Dr Mark Langan (Manchester Metropolitan University)

 

Student evaluation schemes in the US and the UK are becoming increasingly important sources of information, giving students a clearer picture of what to expect from the educational experience at a specific university. While it could be argued that the current National Student Survey (NSS), administered to final-year students in the UK, is a blunt instrument that does not provide a truly holistic impression of the circumstances and experience of students in higher education, it is nevertheless being used as a measure of university performance in delivering a quality and valued learning experience to students. Furthermore, national newspapers use the NSS results as part of the calculations that determine a university’s league table ranking. Due to the increasing prominence of NSS results, universities are investing in numerous projects to improve their NSS scores. Analysis of the NSS results by year reveals incremental improvements by virtually all universities, although several universities have consistently achieved significantly higher NSS scores.

 

Analysis of the 2010 NSS dataset, of over 400,000 responses, has revealed a number of key areas in which action will significantly improve student satisfaction. The results challenge previous views and demonstrate, through regression analysis, that targeted efforts in these areas will improve not only NSS scores but also league table ranking. The paper proposes a number of practical solutions, previously highlighted by Dean (2010), on how universities can practically enhance the learning experience of students in higher education.
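
The abstract does not include the analysis itself, but the kind of driver analysis it describes can be sketched as a regression of overall satisfaction on the NSS scale scores. The sketch below is purely illustrative: the column names, simulated responses and weights are invented and are not the presenters' 2010 NSS dataset or their model.

# Illustrative only: regressing overall satisfaction on NSS scale scores to
# see which areas are the strongest drivers. Column names and data are
# hypothetical, not the presenters' 2010 NSS dataset.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
scales = ["teaching", "assessment_feedback", "academic_support",
          "organisation", "resources", "personal_development"]

# Simulated respondent-level agreement scores on a 1-5 scale.
df = pd.DataFrame(rng.integers(1, 6, size=(1000, len(scales))), columns=scales)
df["overall_satisfaction"] = (
    0.5 * df["teaching"] + 0.3 * df["organisation"]
    + 0.1 * df["assessment_feedback"] + rng.normal(0, 0.5, len(df))
).round().clip(1, 5)

# Ordinary least squares via numpy's least-squares solver.
X = np.column_stack([np.ones(len(df)), df[scales].to_numpy()])
y = df["overall_satisfaction"].to_numpy()
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)

for name, beta in zip(["intercept"] + scales, coefs):
    print(f"{name:22s} {beta:+.3f}")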

 


 

Parallel session 3

  

Improving student satisfaction: A whole-institution strategic enhancement approach?

Dr Andrew Turner, Dr Christine Broughan, Katie Hartless, Ian Dunn (Coventry University)

 

In this presentation and workshop, a whole-institution strategic enhancement approach to improving student satisfaction will be presented, followed by an activity to promote the sharing of practice between participants. As a result of the strategic enhancement approach, return rates for internal surveys significantly improved, enabling the evaluation of subject areas that were previously ‘hard to reach’. Increased and rapid reporting of results has enabled the use of data to feed back to students and to inform interventions and the enhancement of practice within an appropriate time frame.

 

In common with other institutions where the NSS is conducted, targets for student satisfaction and performance in various university league tables are key institutional performance indicators. A relatively small percentage change in overall student satisfaction is often amplified into a change in an institution’s league table position.

 

The context of the presentation is a post-1992 institution where, in response to student satisfaction and employment indicators, a co-ordinated range of key strategic management interventions and enhancement activities were introduced with the aim of enhancing the student experience and satisfaction.

 

Key interventions and activities included:

 

  • Student satisfaction became a key institutional performance indicator, and all faculties and schools were required to provide a response and strategy to enhance student satisfaction.
  • An institutional survey unit was established as part of a research unit within Student Services allowing a research-led approach to the design and analysis of internal institutional surveys.
  • Introduction of a mid-term internal survey of all final year modules administered by student ambassadors within a defined two week period with a rapid turnaround of data.
  • The introduction of a new academic and personal tutor scheme with specified minimum requirements for student entitlement and with personal tutors supporting student employment as well as pastoral support.
  • Provision of a continuing professional development programme open to all staff with an explicit focus on the enhancement of teaching and learning practice.
  • Revision of accredited professional practice courses for new members of academic staff to provide a greater focus on the processes of teaching and learning.
  • Involvement of the Student Union in the planning process.

 

The introduction of a revised range of paper-based internal surveys administered by paid student ambassadors in timetabled slots was key to enabling the identification of priority areas for interventions to enhance the student experience. The survey unit was responsible for timetabling the administration of surveys by paid student ambassadors, with the result that student response rates were very high and virtually all modules were surveyed. Under the previous tutor-administered paper-based and online scheme, response rates were generally lower, compliance in administering surveys was variable, and various subject areas were viewed as ‘hard to reach’.

 

A critical evaluation of the process will be presented, together with the key lessons learned about implementing a university-wide evaluation.

 


 

Parallel session 4

  

PRES and PTES: promoting the surveys, engaging with students

Paul Tobin (NUS)

 

During 2010-11 the NUS and the HE Academy have coordinated a campaign designed to encourage students' unions to engage more effectively with the postgraduate surveys PRES and PTES. Students' unions were identified as potentially useful in the promotion and dissemination of the surveys to students, and as key actors in ensuring that the surveys were picked up and used for enhancement purposes.

 

This workshop will explore good practice emerging from students' unions championing PRES and PTES in the 2011 cycle, both in promoting the surveys to students and establishing the surveys as an underpinning evidence base for institutional and student union engagement with the postgraduate student voice. The role of student experience surveys as a valuable area of shared work between HEIs and students' unions will also be explored, and opportunities and challenges identified.

 

The workshop will be delivered as a combination of presentation and discussion among participants.

 

Parallel session 5

  

Student engagement, satisfaction and study within different research environments: an analysis of NSSE, NSS and RAE data at the University of Reading

Dr John Creighton (University of Reading)

 

In 2008 the University of Reading undertook an NSSE/AUSSE-style survey of all its taught students. This activity-focused survey gathered very different data from the satisfaction-based NSS. At the same time the RAE was underway. Together, these sources provided us with information about student activity, satisfaction and the research cultures within which students were being taught, and enabled us to drill into the relationship between the three.

 

The presentation will be divided into three parts:

 

The first briefly introduces our survey instrument, based on the NSSE. As part of the work was funded by a CETL looking at undergraduate research, we added questions to ensure that we were gathering data about students’ direct engagement in research as active learners rather than passive receptors of knowledge. Marked differences across years and disciplines were found. The data acquired enabled us to appreciate which activities, undertaken in some disciplines but not others, were pushing and driving our students.

 

Our correlation of the NSSE-style data with NSS satisfaction data proved worrying. Apart from a few satisfaction questions in our survey instrument which had a highly significant correlation with NSS data (these were our control questions), we found very few other significant correlations between the activities that the originators of the NSSE called ‘educationally enhancing experiences’ and student satisfaction. From a policy point of view, what then is more important: providing potential students with data about how satisfied/happy students are, or information about the level of the different types of activity they will be engaging in for their £6-9K fees?

 

Our comparison with the RAE data, however, generated a large number of statistically significant correlations, some positive and some negative. This was fascinating, as the current orthodoxy based on a meta-analysis by Hattie and Marsh states that there is no linkage between the quality of teaching and research. However, much of the data that fed into the analyses they pulled together was based on student satisfaction rather than measures of student activity. In the current market positioning of university mission groups, some research-intensive and others more teaching-focused, these results suggest that research-focused departments have certain great strengths which are reflected in student engagement, but at a cost to certain other activities; the same is the case with teaching-focused departments. Of course, these correlations, or their absence, need repeating in other institutions. These results have always generated heated discussion.
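
For readers who want to replicate this kind of analysis, a minimal sketch of department-level correlations between engagement scales, NSS satisfaction and an RAE-style research quality measure is given below. All values and column names are invented for illustration; the Reading analysis itself also tested statistical significance.

# Illustrative only: department-level Pearson correlations between engagement
# scale scores, NSS satisfaction and an RAE-style research quality measure.
# Data and column names are invented for the sketch.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n_depts = 40
depts = pd.DataFrame({
    "engagement_active_learning": rng.normal(50, 10, n_depts),
    "engagement_staff_contact":   rng.normal(45, 8, n_depts),
    "nss_overall_satisfaction":   rng.normal(80, 6, n_depts),
    "rae_quality_profile":        rng.normal(2.5, 0.4, n_depts),
})

# Pairwise correlations across departments; a real analysis would also test
# significance and, as the abstract notes, repeat it in other institutions.
print(depts.corr(method="pearson").round(2))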

 


 

Parallel session 6

  

Can’t get no satisfaction: discrepancies between NSS qualitative & quantitative data and implications for quality enhancement

Dr Clare Milsom (Liverpool John Moores University)

 

Since the NSS published its first results in 2005, the ‘scores’ have dominated research into the validity of this form of student evaluation. Factor analysis confirmed the factors the survey was designed to measure (Marsh and Cheng 2008) and supports high internal consistency, although some subject groupings show consistent differences (Fielding et al. 2010). Psychometric robustness has been demonstrated, with evidence of satisfactory levels of internal consistency, construct validity and concurrent validity (Richardson et al. 2007). However, York (2010) states that in spite of its robustness it may ‘nevertheless become compromised by the sociological and psychological circumstances of its use’ (p.735). Most research concentrates on the quantitative data; very little attention has been given to the qualitative, free-text data. Richardson et al. (2007) note that free-text responses revealed the same issues as the quantitative data and that having such questions appears to help the overall response rate, but there is no analysis of the students’ comments.

 

At Liverpool John Moores University in 2010, in addition to scrutinising satisfaction scores, thematic and content analysis were used to examine the qualitative student comment data. Results show that the comments reflect the scores in only 40% of the subject reporting groups, and in 5% of the groups the comments contradict the scores. Meta-analysis revealed that some scales were more likely than others to show discrepancies between the quantitative and qualitative data, teaching in particular, as comments tended to be polarised, with positive and negative occurring together. Students use extremely positive vocabulary (fantastic, brilliant!!) in their descriptions of lecturers. However, negative comments suggest that a small number of very poor and unsupportive lecturers account for a disproportionate amount of the dissatisfaction with teaching. By comparing the qualitative and quantitative data, an institution or subject group is able to get beneath the skin of the score profiles. Qualitative data can also help differentiate between more transient causes of dissatisfaction (for example, disruption related to building change) and more deep-rooted issues.
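
As a rough illustration of how comment data and scores can be compared at subject-group level, the sketch below flags groups where crude comment polarity and mean scores point in different directions. The comments, keyword lists and thresholds are invented; the study described above used proper thematic and content analysis rather than keyword matching.

# Illustrative only: flagging subject groups where free-text comments and
# quantitative scores point in different directions. The comments, keyword
# lists and thresholds are invented for the sketch.
import pandas as pd

POSITIVE = {"fantastic", "brilliant", "excellent", "supportive"}
NEGATIVE = {"poor", "unsupportive", "disorganised", "unhelpful"}

def comment_polarity(text: str) -> int:
    words = set(text.lower().replace("!", "").split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

responses = pd.DataFrame({
    "subject_group": ["Biology", "Biology", "History", "History", "Law"],
    "score": [4.5, 4.2, 3.1, 2.9, 4.0],          # quantitative scale score
    "comment": [
        "Fantastic lecturers, brilliant support",
        "Excellent teaching overall",
        "Some lecturers were unsupportive and disorganised",
        "Feedback was poor",
        "Brilliant course but poor room bookings",
    ],
})

responses["polarity"] = responses["comment"].map(comment_polarity)
by_group = responses.groupby("subject_group").agg(
    mean_score=("score", "mean"), mean_polarity=("polarity", "mean"))

# A group is flagged when a high score sits alongside negative comments
# (or vice versa); the thresholds here are arbitrary.
by_group["discrepant"] = (
    ((by_group["mean_score"] >= 4.0) & (by_group["mean_polarity"] < 0)) |
    ((by_group["mean_score"] < 3.5) & (by_group["mean_polarity"] > 0)))
print(by_group)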

 

From the perspective of quality enhancement, we feel that the qualitative data are more helpful and informative when aiming to improve the student experience. These data explain the context, extent and strength of the issues summarised in the scores and help to identify the causes. A solely summative approach results in over-analysis of the scores and a tendency to focus only on the past situation (Flint et al. 2009). Student comments enable formative action to be taken in the context of quality enhancement. With this intention, the technical quality of the survey instrument can be taken to be ‘good enough’ for the purpose (York 2010) and the value of the data used to maximum effect.

 


 

Parallel session 7

  

Using the Assessment Experience Questionnaire to engage Course Teams in the revision of programme-level assessment regimes

Professor Graham Gibbs, Yassein El-Hakim (University of Winchester)

 

The NSS identifies assessment and feedback as the primary areas of student disquiet, but provides very little diagnostic information to help course teams to adopt more effective assessment strategies. The Assessment Experience Questionnaire (Gibbs and Simpson, 2003) is not as widely used as the student experience questionnaires that have been adopted nationally. Nevertheless, it has been used in universities in many countries and has been translated into Mandarin Chinese and Spanish. The programme-level version of the AEQ (Dunbar-Goddet and Gibbs, in press) is currently being used in more than a dozen UK universities, across a wide range of disciplines, supported by the TESTA project (Transforming the Experience of Students Through Assessment) based at the University of Winchester (http://www.testa.ac.uk/). Unlike the NSS and the CEQ, the AEQ has an evidence-based underlying pedagogic rationale (Gibbs and Simpson, 2004) and its scale scores demonstrate clear and consistent patterns of relationships with assessment practices (Gibbs and Dunbar-Goddet, 2007).

 

The workshop will:

 

  • introduce participants to the AEQ and illustrate very wide differences in AEQ scale scores between degree programmes and between institutions. Participants will examine and discuss sample data for programmes, and what it means.
  • demonstrate, with case study data, how it is being used in conjunction with a quantitative audit of assessment practices (so that clear relationships between practice and students’ experience can be demonstrated, Gibbs and Dunbar-Goddet, 2009) and qualitative focus group interviews with students (so as to illustrate the meaning of scale scores). Participants will examine one set of triangulated data in the form it is presented to Course Teams.
  • explain the way the data is discussed in entire course teams, giving examples of wide changes introduced across all modules on a programme.

 

Course Teams find AEQ data convincing and use its underlying pedagogic rationale to select alternative assessment strategies at programme level. ‘Before’ and ‘after’ AEQ data is being collected to evaluate the impact of these changes on student experience.

 

The focus of the ensuing discussion in the workshop will not be on the AEQ itself, but on the way Course Teams are being engaged with, and acting on, AEQ data.

 

References

 

Dunbar-Goddet, H. and Gibbs, G. (in press) A research tool for evaluating the effects of programme assessment environments on student learning: the Assessment Experience Questionnaire (AEQ). Assessment and Evaluation in Higher Education.

 

Gibbs, G. and Dunbar-Goddet, H. (2007) The effects of programme assessment environments on student learning. Oxford Learning Institute, University of Oxford. Accessed 10 September 2008. Available at: http://www.heacademy.ac.uk/assets/York/documents/ourwork/research/gibbs_0506.pdf  

 

Gibbs, G. & Simpson, C. (2003) Measuring the response of students to assessment: the Assessment Experience Questionnaire. 11th International Improving Student Learning Symposium, Hinckley.

 

Gibbs, G. & Simpson, C. (2004-5) Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1 (2004-5).

 

Gibbs, G. and Dunbar-Goddet, H. (2009) Characterising programme level assessment environments. Assessment and Evaluation in Higher Education.  

 


 

Parallel session 8

  

Using survey data to enhance the taught postgraduate experience: issues and challenges

Dr Rachel Segal, Dr Laura Hodsdon (Higher Education Academy)

 

This session will address the particular challenges involved in using surveys such as the HEA’s Postgraduate Taught Experience Survey (PTES) to enhance taught postgraduate (PGT) provision. It will include data derived from PTES results but will be mainly discussion-based, aiming to allow attendees to discuss challenges they have encountered and share effective practice about issues specific to enhancing the PGT experience using survey data in this often ‘grey’ area.

 

Issues to be discussed might include:

 

  • How to effectively engage students and staff in using PTES and other PGT surveys for enhancement (often difficult given that PGT programmes typically last only a year, meaning students can be less invested in long-term enhancement).
  • How PTES results can be used to greatest effect at faculty or departmental level.
  • What are the distinctive features of the PGT student body that should be taken into account (with large numbers of international, part-time or distance learning students, for example), and how do these affect enhancement activities?
  • How to situate PGTs within the whole university context: are they seen as advanced undergraduates or early researchers?

 


 

Parallel session 9

  

From individualised quality enhancement to national quality assurance: an Australian perspective

Deanne Gannaway (University of Queensland)

 

The University of Queensland (UQ) in Brisbane, Australia is a large, research-intensive institution with a long and proud tradition of excellence in teaching and learning. In keeping with its ethos of ensuring a quality student learning experience, UQ was an early Australian adopter of formally incorporating student feedback into promotion and tenure application processes (Moses, 1986). Similarly, UQ embraced the use of the Course Experience Questionnaire (CEQ) (Ramsden, 1991) and the Australasian Survey of Student Engagement (AUSSE) (Coates, 2010) prior to universal implementation. Each of these initiatives was initially adopted to provide direction for professional development, for formative evaluation, and for localised improvement in teaching practices and learning experiences (see, for example, Moses, 1988; Timpson & Andrew, 1997; Wilson & Lizzio, 1997; Richardson, 2005).

 

This presentation explores the ways that student experience surveys have been used in Australian higher education as a mechanism for rewarding performance since 1991. It does this by reporting the evolution of UQ’s policies and processes in response to national directives. One such example is the compact system described in Transforming Australia’s Higher Education System (DEEWR, 2009). This Australian Commonwealth government policy revolves around ensuring an alignment of “government and university priorities for teaching and learning, and research, research training and innovation” through the development of a “holistic, strategic framework for the relationship between the university and the Commonwealth”, and draws heavily on student feedback data as primary measures of satisfaction and quality of learning outcomes.

 

The presentation outlines the implementation in 2010 of a process and policy aligned to the Commonwealth directives. It describes the processes and mechanisms of gathering and reporting institutional student feedback on teaching and course experience and data gathered via a full census version of the AUSSE survey. The presentation describes the tools used to communicate those data and the mechanisms designed to support responding to the data to improve student experience. The presentation provides an overview of how these data and responses are used for faculty and school performance funding, before reporting the impact of these policies and processes on teaching and learning at UQ.

 

References

 

Coates, H. (2010). Development of the Australasian survey of student engagement (AUSSE). Higher Education, 60(1), 1-17.

 

DEEWR. (2009). Transforming Australia’s Higher Education System. Retrieved from www.deewr.gov.au

 

Moses, I. (1986). Student evaluation of teaching in an Australian university -- staff perceptions and reactions. Assessment & Evaluation in Higher Education, 11(2), 117-129.

 

Moses, I. (1988). Academic staff evaluation and development: a university case study. Brisbane: University of Queensland Press.

 

Ramsden, P. (1991). A performance indicator of teaching quality in higher education: The Course Experience Questionnaire. Studies in Higher Education, 16(2), 129-150.

 

Richardson, J. (2005). Instruments for Obtaining Student Feedback: A Review of the Literature. Assessment and Evaluation in Higher Education, 30(4), 387-415.

 

Timpson, W. W., & Andrew, D. (1997). Rethinking Student Evaluations and the Improvement of Teaching: Instruments for Change at the University of Queensland. Studies in Higher Education, 22(1), 55-65.

 

Wilson, K. L., Lizzio, A., & Ramsden, P. (1997). The development, validation and application of the Course Experience Questionnaire. Studies in Higher Education, 22(1), 33-53.

 


 

Parallel session 10

  

Rage against the machine? The views of academic staff towards the National Student Survey

Adam Child (Lancaster University)

 

This presentation/workshop is related to the conference theme of ‘Using the NSS to support enhancement’.

 

A recent report outlining the potential future of the National Student Survey (NSS) concluded that the survey should be used for three purposes: student choice, quality assurance and quality enhancement (CHES 2010). The report asserts that the NSS is ‘generally accepted’ within the higher education sector, and there is a related assumption that this acceptance is widespread within the academic community. However, there is currently no systematic evidence to support this claim, and anecdotal evidence suggests that the opposite is in fact the case. Studies focusing on other national surveys suggest that issues with the survey tool itself can distract people from discussing the meaning of the data, or responding to it in a useful way (Gregory et al, 1995). An understanding of the views of academic staff towards the NSS would enable colleagues to develop strategies that have greater buy-in from staff across a university and in turn lead to more successful enhancement activities.

 

This session will present the findings from a survey of academic staff that explores views and opinions about the NSS and its use as a tool for improving teaching and learning. The questionnaire was completed by 324 academic staff from twelve institutions across three disciplinary areas: Education, History and Physics. Academics were asked to describe the way that their institution uses the NSS, as well as the way it is used within their individual department. Respondents were also asked to complete a number of Likert scale items indicating their level of agreement with statements relating to the NSS, for example ‘The NSS provides useful information to help me improve my teaching’ and ‘I think that my own teaching has improved as a result of making changes informed by NSS data’.
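
A minimal sketch of how such Likert items might be summarised as percentage agreement is shown below; the item wordings are adapted from the abstract, but the responses are invented and the five-point coding (4 = agree, 5 = strongly agree) is an assumption.

# Illustrative only: summarising five-point Likert items as % agreement.
# Responses are invented; coding 4 = agree, 5 = strongly agree is assumed.
import pandas as pd

likert = pd.DataFrame({
    "The NSS provides useful information to help me improve my teaching":
        [2, 4, 5, 3, 4, 2, 4],
    "My teaching has improved as a result of changes informed by NSS data":
        [1, 3, 2, 4, 2, 3, 2],
})

pct_agree = (likert >= 4).mean() * 100   # share of respondents answering 4 or 5
print(pct_agree.round(1).to_string())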

 

Discussion during the session will focus on the implications of this research for the design of interventions that intend to improve the quality of the student experience, particularly those that use NSS data. Attendees will have the opportunity to reflect, in the light of this new research, upon the strategies they employ within their own institutional and departmental contexts for using the NSS.

 

References

 

Centre for Higher Education Studies. (2010). Enhancing and Developing the National Student Survey. Bristol: HEFCE.

 

Gregory, R., Harland, G., and Thorley, L. (1995). Using a student experience questionnaire for improving teaching and learning, in Gibbs, G. (Ed.) Improving student learning through assessment and evaluation. Oxford: Oxford Centre for Staff Development.

 


 

Parallel session 11

 

Compared to whom? Benchmarking NSS scores, from subject comparisons to identifying institutional priorities

Jason Leman (Sheffield Hallam University)

 

Analysis of public NSS data indicates that institutional scores are typically pushed down by just a few poorly performing subject areas. Institutions also tend to lag behind in just one or two areas of the NSS. There therefore appears to be potential within institutions to improve NSS scores significantly through focused effort. However, benchmarking areas of concern is complex, as student expectations, experiences and perceptions vary greatly across institutions and subjects.

 

This discussion of NSS scores will draw on HEPI student survey and HESA data to explore variation across institutions and subject areas with regard to factors such as class size, UCAS tariff and institution type. The presentation proposes the use of local competitor institutions as the most relevant benchmark for subject areas. It will reflect on the dissemination of information around benchmarking within Sheffield Hallam, and its apparent efficacy in encouraging the enhancement of learning and teaching practice. The presentation will conclude that institutional actions and research may be informed by the NSS, particularly where it is used to reinforce known good practice, but that correct interpretation is key. This interpretation must rely greatly upon existing pedagogic research, qualitative analysis and institutional feedback mechanisms, owing both to the limitations of the NSS with regard to identifying practice that will enhance learning outcomes and to the lack of access to student-level NSS data.
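
A minimal sketch of the local-competitor benchmarking idea is given below: subject-level satisfaction for one institution is compared with the mean of nearby competitors, and the largest negative gaps indicate priority areas. The institutions, subjects and scores are invented, not real NSS results.

# Illustrative only: ranking an institution's subject areas by their gap to a
# benchmark built from local competitor institutions. Institutions, subjects
# and scores are invented for the sketch.
import pandas as pd

nss = pd.DataFrame({
    "institution": ["Us", "Us", "Us", "CompetitorA", "CompetitorA",
                    "CompetitorA", "CompetitorB", "CompetitorB", "CompetitorB"],
    "subject":     ["Business", "Nursing", "History"] * 3,
    "pct_satisfied": [78, 70, 88, 84, 82, 86, 83, 80, 90],
})

ours = nss[nss["institution"] == "Us"].set_index("subject")["pct_satisfied"]
benchmark = (nss[nss["institution"] != "Us"]
             .groupby("subject")["pct_satisfied"].mean())

gap = (ours - benchmark).sort_values()
print(gap.rename("gap_to_local_benchmark"))   # most negative = priority areas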

 

The presentation will include example analysis of participants’ own public NSS data, to evaluate priority subject areas and NSS issues within an institution. The main aims of the discussion will be to consider appropriate benchmarking and how NSS scores can be used to identify or support institutional priorities for enhancement.

 


 

Parallel session 12

 

Exploring the postgraduate experience

Dr Rachel Segal, Dr Laura Hodsdon (Higher Education Academy)

 

This session will explore the HEA’s postgraduate surveys PRES (Postgraduate Research Experience Survey) and PTES (Postgraduate Taught Experience Survey) in the light of taught and research postgraduate student focus groups that will be run the night before the conference. It will allow attendees to discuss issues arising from the focus groups that relate to the surveys themselves (e.g. how students interpret the questions, or their motivations for responding to surveys) as well as looking in more depth at some implications of past survey findings (e.g. what the intellectual climate would have to look like for a researcher to rate their institution ‘5’). The session is therefore a valuable opportunity to look beyond the survey data themselves and to explore in more detail what the postgraduate surveys can tell us about postgraduates’ experiences of their taught and research degree programmes.  

 


 

Useful Links

 

NSS Resources - Annotated bibliography of key resources.

 

Surveys for Enhancement 2010 - Page containing the material from the HEA Surveys for Enhancement Conference held in May 2010, including the programme, the abstracts and a selection of presentations.

 

 

    

EvidenceNet is the Academy's service to promote and support the use of evidence in higher education, and is the parent site for this Wiki. It contains a variety of material relevant to the NSS, including research papers, case studies, events and networks (for instance, the resources linked from this page are housed on EvidenceNet). If you require resources beyond those that are linked on this page, please visit EvidenceNet for further exploration of the NSS, or of other national surveys.

 

EvidenceNet is also a place to submit material of your own. We hope that, as you undertake enhancement work, you will consider writing a short case study using the case study submission form.

 


 
