Higher Education Academy – SEDA Workshop


Using Student Experience Data to Support Academic Development and the Enhancement of Learning & Teaching

Friday 28th January 2011, Higher Education Academy, York

 

This joint workshop between the HEA and SEDA provided participants with the opportunity to share ideas and examples of using the National Student Survey (NSS) to support the enhancement of learning and teaching. It was targeted at educational developers across the UK and included presentations from colleagues representing both convening organisations.

 

Programme

Notes

 

Facilitators:

Sally Brown, Independent consultant and Emeritus Professor, Leeds Metropolitan University

Julie Hall, Head of LTEU, Roehampton University and co-Chair of SEDA

Helen King, Head of Academic Staff Development, University of Bath

Alex Buckley, Adviser (Evidence-Informed Practice), Higher Education Academy

 

Programme

 

10:00 – 10:30           Coffee and registration

 

10:30 – 10:45           Welcome and Introduction (AB & JH)

                                     

10:45 – 11:15           How the NSS is Used in Different HEIs (HK)

                                     

11:15 – 12:00           Roles of Educational Developers in using the NSS for Enhancement (JH)

 

12:00 – 12:30           Opportunities and Barriers (HK)

 

12:30 – 13:15           Lunch

 

13:15 – 15:15           Institutional Case Study, Sharing Experiences and Considering Long-Term Pro-active Approaches (SB)

 

15:15 – 15:30           Action Planning

 

15:30                      Close

 

 

Notes from the workshop


 

Introduction: Purpose and Uses of the NSS

 

Alex Buckley (Adviser, Evidence-Informed Practice, HEA) introduced the day and discussed the HEA’s perspective on using the NSS, emphasising its use as a starting point for discussion on learning and teaching. The HEA provides a variety of activities to support the use of the NSS for enhancement, including data analysis at a national level, an Institutional Working Group as a forum for sharing practice, the annual Surveys for Enhancement conference (19th May 2011), case studies and other resources. Alex then provided a useful overview of the NSS in the national context, as outlined in his PowerPoint presentation.

 

Julie Hall (Head of the Learning & Teaching Enhancement Unit at Roehampton University and co-Chair of SEDA) then described SEDA’s perspective.

 

For many SEDA members the NSS has resulted in interesting educational development work in departments, helping academic teams reflect on their scores, understand student perspectives more fully, and plan changes to curricula, teaching methods or assessment practices. However, this work can occasionally be problematic, and for this reason SEDA is also interested in the processes educational developers use in such situations.

 

 

How the NSS is Used in Different HEIs

 

The large number of participants (~30) meant it was not possible to do ‘round table’ introductions. Instead, colleagues introduced themselves at their tables and discussed which elements of the NSS they were each focusing on in their institutions, and why. These discussions were captured on post-it notes and categorised according to the scales of the NSS:

 

Teaching on my course

Organisation and management

Assessment and feedback

Academic support

Personal development

Other

 

Teaching on my course

  • Teaching the value of feedback with staff and students
  • More small group work
  • Recognition that internal surveys play a bigger role in enhancement: the NSS is the final check – it shouldn’t be a starting point.
  • Research into international student experience
  • National Fellowship Scheme, to reward and recognise good teaching
  • Analysis of student interpretations of questions between disciplines to aid contextual understanding of learning and teaching
  • Q3: When you get really bad results it is normally an outcome of a particular large factor, e.g. a highly popular lecturer being absent.
  • Teaching and learning: feedback is a vital part of the process.
  • Curriculum development in a department where there were low scores for ‘teaching on my course’ and ‘overall satisfaction’.

Organisation and management

  • Event bringing together academic and admin/faculty staff to understand each other’s perspectives and improve organisation and management.
  • Finding out what students mean by this. What are the issues for quality enhancement?
  • Unpacking, e.g. what do students understand as ‘organisation and management’?
  • Rise of student engagement agenda – push onto student voice, e.g. ensure students are on committees.
  • Emphasis on timetable: easier to understand / smooth running.
  • What does it look like from a student’s perspective?
  • Need to communicate with students at times of organisational change.
  • We’ve asked subject areas to report back on what they’ve done in response to the NSS.

Assessment and feedback

  • Might lead to improved marking procedures.
  • Formative assessment and interim feedback.
  • We initially focussed on poor A&F scores:
    • established a 4-week turnaround time
    • encouraged quality of feedback – cover sheets for feedback.
  • Group work: techniques for managing assessed group work, especially for final-year students.
  • Highlighting when feedback is given.
  • Can uncover simple issues such as handwriting on feedback sheets.
  • Gimmicks and strategies.
  • Feedback: student perception; what is it?
  • Resourcing projects relating to improving feedback practice, especially Q7 & Q8.
  • Department driving to improve assessment and feedback scores from Dental students.
  • Issues tend to be echoed by external examiners.
  • Leeds University Union / student reps working with the general student body to ‘drill down’ to issues and find solutions.
  • Can help EDUs design meaningful staff development.
  • Pushing support provided on assessment and feedback: leaflets and workshops.
  • Timeliness and awareness of when feedback is being given.
  • ‘Timely’ return of coursework.
  • Feedback: development of university-wide principles of feedback to students.
  • Feedback: timing of when it can be used – formative.
  • Assessment for learning initiative – student/staff partnership campaign:
    • a response to students needing/wanting to be more engaged with feedback (understanding etc.).
  • Helping students to understand the A&F process.
  • Use of ‘Dialogue Days’: staff and students talk about assessment.

Academic support

  • Workshops for academic staff addressing NSS issues.
  • Raising awareness of the NSS:
    • as a positive
    • of its content
    • encouraging students to complete it.
  • General workshop to inform staff about the NSS.
  • We’ve established a more rigorous personal tutoring and student representation policy.

Personal development

  • Thinking about how we can improve personal tutorials to benefit students.
  • Continued our work around PDP (positive results).
  • ePDP to recognise skills.
  • Introduce student society and careers talks.

Other

  • Does putting a low-scoring course into ‘special measures’ help or hinder the student experience?
  • To what extent are there differential scores from mature and 18-year-old entrants? Are ‘diverse’ HEIs disadvantaged disproportionately?
  • Would there be a correlation between academic achievement and satisfaction? Is it just a shadow of A-level history?
  • Some managers want ‘quick wins’.
  • Is there such a thing as the ‘student as customer’ model? Does it actually exist?
  • University doesn’t want scores to be low: depends on VC.
  • QAA audit can be a real focus, but need to use it as a way in, not a stick.
  • To what extent does cohort size impact on student satisfaction, for example, on feedback? Are ‘diverse’ HEIs disadvantaged disproportionately?
  • Who owns the process of evaluating student satisfaction? Is it the students? Teachers? Quality managers? Senior managers?

 

The Roles of Educational Developers in using the NSS for Enhancement

 

Julie discussed the particular orientation and role of educational developers in relation to the NSS. Educational development is a complex role, interestingly positioned between supporting academic colleagues and being agents for change. Julie suggested a variety of metaphors to describe the role of the educational developer in this context, such as:

  • Counsellor
  • Detective
  • Court jester
  • Policeman
  • Confidant
  • Market trader

 

Julie also noted that espoused approaches and approaches in practice may sometimes be different. SEDA members often find the underpinning SEDA values helpful in guiding action:

  • An understanding of how people learn
  • Scholarship, professionalism and ethical practice
  • Working in and developing learning communities
  • Working effectively with diversity and promoting inclusivity
  • Continuing reflection on professional practice
  • Developing people and processes.

 

 

NSS Provides an Opportunity to...

 

By way of capturing the discussions and issues raised through the previous activities, the morning closed with group discussions under the following headings: ‘The NSS provides an opportunity to…’, ‘Yes, but…’ and ‘And so…’.

 

The NSS provides an opportunity to… / Yes, but… / And so…
  • Talk about teaching and learning
  • Listen to the students
  • Prioritise the student experience
  • Celebrate improvement, even if small
  • Catalyst for change
  • Challenge bureaucracy
  • Drive through gimmicks and quick wins
  • Implement institution wide approaches to enhancement
  • Get resources for library
  • Plan staff development which links to perceived needs
  • Working in different ways
  • Monitor whether university strategies are working
  • Change things which aren’t very good
  • Expose negative and problem areas
  • Expose positive areas – provides opportunities to share good practice.
  • Amazing that we get an 80% satisfaction rate at this stage of a UG programme.
  • University experience may not be key to all students
  • Unfair to compare across different kinds of institutions and different courses
  • Staff under the cosh can’t engage in learning and teaching development: another managerial device to grind them down
  • Class dimension / entry background impacts on experience / ethnicity / cultural perspectives
  • NSS can’t be only measure
  • Use benchmarks or similar institutions
  • Beat people up who aren’t playing the game
  • Share good practice
  • If you use NSS in a sentence, everyone pays attention
  • Identify and address real problems
  • Refocus central issues
  • Promote discussion of evaluative work
  • Different disciplines’ interpretation of wording on questionnaire
  • Stifling innovation – avoiding risk
  • Potentially damaging for ‘low scoring’ schools / academic staff
  • No indication of student engagement
  • Somebody has to be at the bottom
  • Change in student behaviour (coercion)
  • Additional funding to ‘bottom’ to help develop and move to ‘top’.
  • Hear the student voice
  • Gauge student satisfaction
  • Act as a catalyst
  • Compare performance of institutions / departments
  • Recognise good practice
  • Examine problem areas
  • It is only a snapshot; it does not account for the full period of a student’s time at university
  • How can different types of institutions be compared with each other?
  • Is it desirable if institutions start to produce identical students?
 
  • Enhance learning and teaching
  • Starting point for dialogue with students
  • Starting point for dialogue with staff
  • Think about the student experience as a whole / joining up the experience
  • Think about a joined up university
  • Celebrate positive change

Constraints:

  • Time
  • External examiners
  • Accrediting bodies
  • Understanding / interpretation expectations
  • Conflicting priorities
  • Central vs local provision
  • Develop / build a staff / student partnership.
  • Think about other metrics that we should also be looking at
  • Look more closely at areas where scores have changed and resulted in subject-specific actions
  • Encouraged a more evidence-based approach to changing practice
  • Empower academics to use the NSS to effect change / for their own benefit
  • Raise awareness of differences / links between quality assurance and quality enhancement
  • Opens a discussion about L&T in appraisal
  • Potentially develop ‘parity of esteem’ between teaching and research

Barriers:

  • Time; willingness of staff to engage.
  • Differences between research / teaching led institutions
  • Links to reputation: are students using the NSS? Does it reflect what they want to know?
  • NSS dominates the discussion when we have richer evidence in-house that reflects the experience for current students
  • We don’t own it
  • Are there real opportunities for educational developers to change practice locally?
  • Change has to be owned locally
  • Contextualise and connect the NSS to discipline/local culture
  • Identifying ‘opinion leaders’ within school
  • If the first time we hear of issues is via the NSS, there is something wrong; we need to critically evaluate student voice / dialogue mechanisms.
  • Different parts of the HEI to talk to each other
  • Look at learning and teaching issues they wouldn’t otherwise
  • Recognise where we’re doing well / identify good practice
  • See where changes have made a difference
  • Spotlight on good areas
  • To talk to students about the process of learning, teaching and assessment (rather than senior management taking a knee-jerk reaction)
  • Talk about / acknowledge learning and teaching (in the context that it matters)
  • Explore professionalism / identity around learning and teaching
  • Lever for change
  • Setting expectations from 1st year / long-term planning
  • Different departments perform differently – those that do well might be perceived as ‘blowing our own trumpet’ rather than as an opportunity for sharing practice
  • But some people get carried away, which leads to coaching
  • We could overload with information
 

 

 

A Case Study from Leeds Metropolitan University

 

The afternoon of the workshop was led by Sally Brown (Independent consultant and Emeritus Professor, Leeds Metropolitan University), who provided an animated and interactive case study on the measures Leeds Met put in place in response to low scores on the NSS and issues raised in a Quality Assurance Agency (QAA) institutional audit. The principal actions taken by the institution, which over the next eighteen months resulted in significant improvements in the student experience, included:

  1. Using a university-wide approach with change agents, including Teacher Fellows and the Associate Deans for Assessment, Learning and Teaching in each faculty
  2. Building on positive outcomes where satisfaction was good, e.g. the Library
  3. Avoiding questionnaire fatigue by banning competing surveys
  4. Zero tolerance on cancellation of classes
  5. Developing a mutual expectations document for students across the university
  6. Requiring teachers to engage in peer observation of teaching (with 85%+ compliance)
  7. Focussing on improving feedback on assessed work, with a three-week turnaround
  8. Changing the orientation of the university to focus more strongly on the student experience
