Adelaide Business School teaching and learning quality improvement plan
With accreditation in mind, and driven by a desire to improve the quality of teaching and learning across the Business School, the Assurance of Learning Taskforce designed a strategy to standardise specific pedagogies that enhance overall student learning, and to make the collection of data on this progress efficient.
The strategy involves three distinct phases, staggered over time to give course coordinators the time needed to understand and implement the changes. The entire process is supported by LEI.
WATCH: the project overview
Phase 1 - Alignment of assessment to Course Learning Outcomes (CLOs)
Rationale
The alignment of assessment to the CLOs is the most fundamental and crucial pedagogy: what the course is advertised as teaching is then actually taught and assessed. This alignment drives curriculum design, with the course coordinator developing a series of learning sequences that build knowledge towards students being able to achieve the CLOs. Assessment that doesn't test the achievement of these outcomes is very poor practice, and ultimately deeply unfair to students who enrolled with a particular understanding of what they would be learning.
To encourage the connection and alignment of assessment to CLOs, the Business School recognises that the widespread use of rubrics will be of value. A rubric helps to make explicit and transparent the skills and knowledge each assessment is testing. When assessment is designed, creating a rubric at the same time helps the academic to consider the specific purpose of the task and how it will be perceived and addressed by students. The process makes the implicit, tacit knowledge a marker applies significantly more explicit, and helps students to better understand the marking process and what markers are looking for in their work. As experience in designing precise rubrics grows, this will reduce the workload of marking. Rubrics also facilitate consistency across markers, relieving pressure on tutors and helping them feel confident that they are marking to the expectations of the course coordinator and the School's quality standards. As the reliability of marking, and therefore its validity, improves, this will provide a stronger picture of assessment across the School.
WATCH: the pedagogy of rubrics
It is not axiomatic, however, that a rubric will achieve what it is supposed to. Careful design is necessary to mitigate some limitations that can reduce the validity of its use. This design discussion will form part of the resources that accompany the rollout of this school-wide pedagogy.
When rubrics are used consistently in the School, the practice brings several pedagogical advantages and naturally addresses other teaching and learning shortcomings:
- The criteria make it clear what is being tested, and thus make it very clear if the curriculum didn't teach it sufficiently. This will positively affect future sequence design.
- The descriptions within each criterion allow the outcome to be further defined and can capture the range of relevant skills and knowledge – again helping curriculum design to be aligned.
- Using codes when marking student work can reduce overall marking time, freeing academics to redirect their time to quality assurance.
- Students can better understand their areas of strength and weakness, which can shape their preparation for upcoming assessment. Academics will soon realise that if they label their course content so it matches the areas of the rubric, students will self-regulate more, reducing the workload of answering questions about 'how can I improve'.
- Consistency in marking across a course gives each academic greater confidence in how the course needs to be taught and how tutorials should be run. Students then receive a consistent level of quality, and fluctuation in quality between tutors is reduced.
- The use of a rubric encourages a moderation process, which improves consistency across marking.
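One of the advantages above – using codes in marking – can be illustrated with a short sketch. The codes and comments below are invented examples, not School policy: the marker records shorthand codes against the rubric while marking, then expands them into full feedback comments afterwards.

```python
# Hypothetical feedback-code bank: shorthand codes a marker can enter
# quickly, mapped to reusable full comments. All codes and wording here
# are invented for illustration.
FEEDBACK_CODES = {
    "A1": "Argument is asserted rather than supported by evidence.",
    "R2": "Referencing does not follow the required style guide.",
    "S3": "Structure: paragraphs lack clear topic sentences.",
}

def expand_codes(codes):
    """Turn a marker's shorthand codes into full feedback comments."""
    return [FEEDBACK_CODES[c] for c in codes if c in FEEDBACK_CODES]

# A marker's annotations for one script, expanded for the student:
comments = expand_codes(["A1", "S3"])
```

Because every marker draws from the same bank, the codes also reinforce consistency of feedback across tutors.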
A natural corollary of creating rubrics is to add them to MyUni. When a rubric is attached to an assessment in MyUni, it begins to facilitate more efficient data collection.
WATCH: How to add rubrics to MyUni
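For coordinators who prefer to script this step rather than use the interface, the Canvas platform behind MyUni exposes a REST endpoint for creating rubrics. The sketch below only assembles the endpoint and request payload; the course id, rubric title, and criterion are hypothetical, and the field names follow the public Canvas "Create a single rubric" endpoint, so they should be checked against the current API documentation before use.

```python
# Assumed: MyUni runs Canvas, whose Rubrics API accepts
#   POST /api/v1/courses/:course_id/rubrics
# with form-encoded rubric fields. Ids and text below are invented.
COURSE_ID = 12345  # hypothetical MyUni course id

endpoint = f"/api/v1/courses/{COURSE_ID}/rubrics"

payload = {
    "rubric[title]": "Essay 1 rubric",
    # One criterion aligned to a CLO; real rubrics would list several.
    "rubric[criteria][0][description]": "CLO1: Analyse market structures",
    "rubric[criteria][0][points]": 10,
    # Associate the rubric with the course so it appears in MyUni.
    "rubric_association[association_type]": "Course",
    "rubric_association[association_id]": COURSE_ID,
}
```

The payload would then be POSTed with an authorised API token; in practice, most coordinators will simply follow the video above.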
THE METHOD: Click here to see how this phase was delivered to the School
The method - Each discipline will have a 'champion for rubrics' who will sit down with Paul Moss (Learning Design and Capability Manager) and Associate Professor Jean Canil (AOL task force coordinator) to create a rubric for an assessment. This will serve as a contextualised example used to demonstrate the process when the discipline team next meets and the idea of rubrics is discussed with them. The discipline's vision of how rubrics could work in their own practice is strengthened by the fact that one of their own has been through the process and can offer advice and assurance of its usefulness.
Once the rubric champion has designed their rubric, they will assist Paul Moss in presenting a workshop on rubric design to each discipline. The workshop will be preceded by Noel Lindsey's communication to each discipline head on the School's new rubric policy, and its dissemination to course coordinators. The workshop will run for at least 1 hour 30 minutes, and academics will be able to walk out with a rubric added into MyUni (I will provide a dummy one for those who haven't designed one yet). Multiple videos on how to add a rubric to MyUni are already available.
The workshop will focus on the design of rubrics and discuss how they should be used. Included in this will be advice on a strategy to begin using them, which will be as follows:
- When to create the rubric:
    - The rubric should be created before the assessment (it doesn't need to be published, however*) – this helps the academic to align the assessment to the CLOs
    - If not published, clear advice on what students need to do in the assessment must be provided (essentially a less specific rubric)
- How to use the CLOs to create criteria
- What range of knowledge/skill each criterion should cover
- How many criteria should be included in an assessment
- What the rubric says about attribution of effort
- Why to make the rubric qualitative
- How to add rubrics in MyUni
- When to use whole CLOs as criteria
- How to use codes in the rubric to reduce written feedback
The workshop will not be a place for objections to the use of rubrics, even though the presentation will discuss and provide solutions to the understandable limitations rubric use can have. The central theme will be that whilst there are limitations, the opportunity far outweighs the cost.
* Likely objections
Rubrics are powerful, but not infallible. Of particular note, the workshop will discuss the need to give academics some flexibility over whether they provide a rubric to students pre-assessment to guide their work, or use a rubric only post-assessment. For more subjective types of assessment this flexibility is necessary: one of the strongest objections to rubric use in that context is that a pre-assessment rubric constrains the marker when students' interpretations of the task produce ideas not conceived of during rubric creation – the marker cannot reward insightful ideas that are not represented in the rubric. Using a rubric to 'mark' as opposed to 'guide' keeps available the global, gut-feel style of marking that most are comfortable with, as it provides some elasticity in feedback. Having said that, well-designed rubrics can 'allow' for such subjectivity – but that design skill may be difficult for some in the initial implementation of this policy.
Time frame
Meetings with the initial rubric champions will take place between Feb 26th and March 7th, with workshops run shortly afterwards. Rubrics should be used for ALL assessment beginning Trimester and Semester 2. Many will begin to use them before then, and LEI will offer support in advising on and evaluating rubric creation. Once the policy is in place, the Business School Assurance team will conduct an audit of rubric use in the School, pointing the recalcitrant both to the policy and to the best practice that others have created. Peer influence will be a large driver here.
Phase 2 – Using MyUni to collect Course Learning Outcomes (CLOs) for 3rd year courses
Rationale
The efficient collection of data serves not only accreditation purposes but overall quality assurance administration. Utilising the University's LMS, MyUni, to collect data is a sensible option if the tools allow it with little effort on behalf of the academic. 3rd year courses are chosen as the starting point of the strategy because they are most closely aligned with Program Learning Outcomes (PLOs).
Some assessment directly measures whole CLOs – for example, end-of-semester assessments. These CLOs are used as the criteria in the corresponding rubric. When the rubric is applied to the assessment, the CLO results are fed not just into the gradebook but also into a learning mastery pathway view, which allows administration, and students, to see how the student has fared in the pursuit of whole-CLO attainment. The display of whole CLOs can then easily be mapped to PLOs.
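The CLO-to-PLO mapping just described can be sketched in a few lines. The mapping and mastery scores below are invented for illustration; in practice, the CLO results would come from the MyUni learning mastery view for each student.

```python
# Hypothetical mapping of course outcomes to program outcomes.
# In a real course this mapping comes from the program's AOL documentation.
CLO_TO_PLO = {
    "CLO1": "PLO1",
    "CLO2": "PLO1",
    "CLO3": "PLO2",
}

def plo_view(clo_mastery):
    """Roll one student's CLO mastery scores up to an average per PLO."""
    totals, counts = {}, {}
    for clo, score in clo_mastery.items():
        plo = CLO_TO_PLO[clo]
        totals[plo] = totals.get(plo, 0) + score
        counts[plo] = counts.get(plo, 0) + 1
    return {plo: totals[plo] / counts[plo] for plo in totals}

# Invented mastery scores on a 5-point scale for one student:
result = plo_view({"CLO1": 4, "CLO2": 3, "CLO3": 5})
```

A simple average is assumed here; a program may instead weight CLOs differently when reporting PLO attainment.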
WATCH: how to add outcomes to MyUni
WATCH: how to add outcomes to a rubric
THE METHOD: Click here to see how this phase was implemented
Method – Paul Moss and Jean Canil met with individual course coordinators to guide them through the process. The personal touch also helped secure buy-in to the pedagogical rationale of the strategy. Once momentum builds and a critical mass of staff have gone through the process, it is believed that others will be able to self-apply the necessary steps to add outcomes to rubrics, assisted by colleagues who have been through the process and can vouch for both its efficiency and efficacy.
Time frame
It is possible that this could happen at the end of Semester 2 / Trimester 3. By then, staff will be comfortable using rubrics, so the expectation of applying whole outcomes to a final assessment won't be a step too far. Early adopters of rubrics may even be able to provide data at the end of Semester 1.
Phase 3 – Using MyUni to collect Course Learning Outcomes (CLOs) for multiple assessments across all courses
Rationale
Creating summative evaluations on all assessments, in terms of how criteria relate to whole CLOs, makes the learning mastery pathway view useful for students' self-regulation in closing learning gaps. Students can ascertain where their strengths and weaknesses lie before the final assessment and study accordingly.
This process involves identifying how each criterion maps to an outcome; if more than one criterion maps to the same outcome, an aggregation needs to be conducted. This adds complexity and makes the marker's job slightly harder in determining whether an outcome has been achieved. Possible permutations include two or three criteria all relating to the same outcome but carrying different weightings within the assessment, which means more concentration and effort is required to assess the outcome's level of attainment. From experience so far, however, this has not been a difficult or common occurrence.
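The aggregation just described can be sketched as a weighted average. The weights and the mastery cut-off below are invented for the sketch, not School policy; each course would set its own.

```python
def outcome_attainment(scores, weights):
    """Weighted average of criterion scores that all map to one outcome.

    scores and weights are parallel lists; weights need not sum to 1.
    """
    total_weight = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total_weight

# Three criteria all mapped to the same CLO, with different importance
# in the assessment (invented 5-point scores and weights):
attained = outcome_attainment(scores=[4, 3, 5], weights=[0.5, 0.3, 0.2])

# Assumed mastery cut-off on the 5-point scale:
mastered = attained >= 3.5
```

Even this simple formula shows why the marker's job gets slightly harder: the same raw scores can fall either side of the cut-off depending on how the criteria are weighted.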
THE METHOD: Click here to see how this phase was implemented
Method – As in Phase 2, Paul Moss and Jean Canil met with individual course coordinators to guide them through the process, securing buy-in to the pedagogical rationale along the way. Again, once a critical mass of staff have been through the process, others are expected to self-apply the necessary steps with the support of colleagues who can vouch for both its efficiency and efficacy.
Paul Moss, Learning Design and Capability Manager, LEI