
How We Assess Quality in Online Instructional Design

12/03/19 | Courtney Mobilia

Online learning is an ever-shifting landscape. Not only is technology changing rapidly, but learning and teaching styles vary and change too.

So, when everything is always in flux, how do online instructional designers (also known as learning designers) know if what they’re doing is top quality?

In this interview, we sit down with Keypath Education Australia Learning Designer Ges Ng to hear about the team's approach to assessing the quality of online learning design. Ges shares how our philosophy, grounded in theory and informed by data, is also collaborative, iterative, and learner-centric, providing a 360-degree view of quality.

We also hear about the limitations of our approach to assessing quality, and how we are continuing to evolve our understanding of what makes a high-quality online learning experience.

Quality in Online Instructional Design: Q&A with Ges Ng

Tell us about your role as a Learning Designer at Keypath Education Australia.

It’s pretty cool! I get to work with various academics with a range of expertise from all around Australia, and sometimes even across borders. We work together to design innovative courses for learners, using the latest online pedagogies and technologies. We design courses that prepare our learners for the jobs of the future and ensure that they receive the best learning experience.

A week ago, I designed a dating-app game that lets learners explore a personality framework for a psychology course. It’s satisfying to create courses that are constructively aligned and fun for the learners.

What are we looking at when assessing quality in online learning? 

We are looking at the entire learning experience and its impact on learners: whether they are engaged with and satisfied by the course, their completion rates and the quality of their assessment tasks (pass rates), and above all, how well the course and assessments set them up to achieve and succeed in their current or future careers.

What is our approach to assessing quality in online learning? 

We currently take a four-pronged approach to assessing quality: data analysis, agile retrospective workshops, a course quality rubric, and an iterative course development model. Our culture of collaboration and self-reflection supports all four.

Tell us about the data analysis. What data are we using, and how?

Learning analytics is a critical component of our evaluation phase. Where possible, we collect quantitative data from each partner’s learning management system and run analyses to measure aspects such as student retention rates, identify students at risk, and gauge the success of the learning activities we design. We ensure that we have the appropriate permissions to access the data and that we use it ethically.
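To make the idea concrete, here is a minimal sketch of the kind of analysis this describes. It is illustrative only: the column names, thresholds, and student IDs are hypothetical assumptions, not Keypath's actual LMS schema or tooling.

```python
# Illustrative sketch only: the engagement fields and thresholds below are
# hypothetical assumptions, not an actual LMS schema.

def retention_rate(enrolled, completed):
    """Share of enrolled students who completed the course."""
    return len(completed) / len(enrolled) if enrolled else 0.0

def at_risk(activity_by_student, min_logins=3, min_submissions=1):
    """Flag students whose engagement falls below simple thresholds."""
    flagged = []
    for student, stats in activity_by_student.items():
        if stats["logins"] < min_logins or stats["submissions"] < min_submissions:
            flagged.append(student)
    return flagged

# Toy data standing in for an LMS export.
activity = {
    "s001": {"logins": 12, "submissions": 3},
    "s002": {"logins": 1,  "submissions": 0},  # low engagement
    "s003": {"logins": 5,  "submissions": 2},
}

print(retention_rate(["s001", "s002", "s003"], ["s001", "s003"]))  # 2 of 3 completed
print(at_risk(activity))
```

In practice, these signals would feed back into course design and Student Success outreach rather than stand alone, and the thresholds would be tuned per course.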

As for our qualitative data analysis, we collect student and academic feedback to inform quality. This is extremely important to us. How would we know what works unless we ask the end-user? We regularly use student feedback surveys sent by our university partners, and have started to look at ways to bring in students for group feedback sessions.

We also regularly speak to our academics about course quality. Our academics are an excellent resource for gathering feedback on which activities or assessments are working and which are not.

Within Keypath, we listen attentively to our Student Success Advisors who support our students through their studies. They offer us a bird’s eye view of how learners are progressing, not only in a course but throughout the entire program. We run meetings at the end of each course iteration to relay the feedback and plan our next steps for improvement.

What is the course quality rubric? 

We have developed a Course Quality Rubric that focuses on these standards:

  • Constructive alignment
  • Teaching and learning activities
  • Interactive object design
  • Assessment strategy
  • Presentation and UX
  • Engagement

A manager or a colleague reviews our course design against this rubric and marks how we have done, and we can then look to improve on the next course design or course refresh.
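A rubric review of this kind could be recorded very simply. The sketch below uses the six standards named above, but the 1-to-5 scale, the equal weighting, and the sample marks are illustrative assumptions rather than Keypath's actual scoring scheme.

```python
# Hypothetical sketch: the standards come from the rubric above, but the
# 1..5 scale, equal weighting, and sample marks are illustrative assumptions.

RUBRIC_STANDARDS = [
    "Constructive alignment",
    "Teaching and learning activities",
    "Interactive object design",
    "Assessment strategy",
    "Presentation and UX",
    "Engagement",
]

def overall_score(scores):
    """Average a reviewer's marks across all rubric standards."""
    missing = [s for s in RUBRIC_STANDARDS if s not in scores]
    if missing:
        raise ValueError(f"Unscored standards: {missing}")
    return sum(scores[s] for s in RUBRIC_STANDARDS) / len(RUBRIC_STANDARDS)

# One reviewer's marks for a single course (sample values).
review = {
    "Constructive alignment": 5,
    "Teaching and learning activities": 4,
    "Interactive object design": 4,
    "Assessment strategy": 3,
    "Presentation and UX": 5,
    "Engagement": 4,
}
print(overall_score(review))  # average mark across the six standards
```

Requiring every standard to be scored keeps a review from silently skipping a criterion; the lowest-scoring standards are the natural targets for the next refresh.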

You mention that our process is collaborative and agile. Can you explain how this is so?

Our process varies according to the partner. One source of inspiration is IBM’s Enterprise Design Thinking model, which keeps us focused on user outcomes, diverse empowered teams, and restless reinvention (IBM, 2019). The framework consists of principles, a loop, and keys; for now, let’s concentrate on the loop, a rapid iteration cycle of observing, reflecting, and making.

This particular design thinking model provides a robust framework for Keypath’s user-centric approach to all stages of a learner’s journey. 

One way we drive reflection is by holding retrospective workshops that foster collaboration and peer-to-peer feedback. These workshops act as a formative evaluation, occurring continually between observing and making, and allow us to lean on the strength and skill diversity of our teammates to ensure that we meet our quality standards.

Our workshops are a place where all Learning Designers at Keypath meet to talk about their successes, challenges and share their lessons learned along the way. The focus of the retrospective is not to reflect on the whole course; it can be as simple as building an activity, an interactive, an assessment, or improving the processes of a Learning Designer in the team.

Learning Designers at Keypath typically have four to five courses under our belts, each in a different design phase at any point in time. The rapid iteration process prompts reflection throughout the design phases so that we can identify potential threats, stay aligned to the end-user and the product, and learn from our failures faster.

This is the retrospective workshop model that we are trialling:

1. Set the stage – get the team ready to engage in the retrospective, perhaps with a warm-up (10 minutes)

2. Discuss what went well (20 minutes)

3. Discuss what needs improvement (20 minutes)

4. Discuss next steps (10 minutes)

How do we use critical reflection?

As a starting point, educators can consider Brookfield’s (1998) questions for reflection, which are divided into self-reflective questions and questions from the perspective of a student or tutor.

For a retrospective workshop to be successful, we must self-reflect, seek feedback, or research the current literature to find out what is working effectively and what is not. Here are some questions we might ask everyone during the set-up phase to prompt self-reflection:

  • What is one aspect of your learning design that made you feel proud this week, and why?
  • In what moment(s) did you feel most connected, engaged or affirmed with your academic?
  • In what moment(s) did you feel most disconnected, disengaged or distanced from your academic?
  • What activity/topic did you have the most fun designing and building?
  • What activity/topic did you find the most difficult to design and build?
  • What activity did the tutor, students or your colleagues find most affirming or helpful?
  • What activity did the tutor, students or your colleagues find most puzzling or confusing?
  • What does the current literature say about best practice in designing this activity/topic?
  • How do others feel about the retrospective workshops?

I’ve been getting positive feedback about the retrospective workshops. Learning Designers feel like they have a platform to share their thoughts and rapidly iterate. Someone can say, “Hey, I’m having trouble designing this; it doesn’t feel intuitive for the learner. What are your thoughts?” and then we crowdsource opinions and suggestions to assist.

Learning designers do just that. We learn through other people’s designs and critical reflection. Whether it be a colleague presenting something that they are proud to have achieved, or perhaps presenting a challenge or obstacle they are facing, we are continually learning and aiming for best practice in our industry. 

The next step for us at Keypath is to iterate quickly, as we work at a rapid pace.

What role do academics play in delivering quality?

They play a pivotal role in delivering quality. I often say, no matter how good our designs are, our academics are the ones who bring our courses to life. Our academics co-create the courses with us by sharing their subject matter expertise, teaching experience and research. 

Our academics also give us design feedback by running quality assurance checks. They ensure the content is current and accurate, fit for purpose, and prepares the learner with skills to address current and future situations.

Their feedback doesn’t end there. After the delivery of a subject, our academics provide us with feedback once again and propose recommendations for the next iteration of the course. They feel very much responsible for their courses and students, as well as for the other academics delivering the course.

What are some of the limitations of our process to assess quality?

An agile development model has its trade-offs: we move forward more than we move back. Often we are building for the next week, and once we get feedback, we rarely have time to go back.

This is why Keypath has course maintenance and course refresh processes in place. If we identify an area that needs updating, it can be addressed in the next course refresh or maintenance cycle, as it’s unlikely to happen immediately.

What’s exciting about instructional design?

I think that it is exciting to learn from others. It’s what motivates me to be a Learning Designer and is what brought me into Keypath in the first place. To find a workplace that has so many different partners, so many design thinking and educational models, so many different types of technologies, is something very special. 

At Keypath, we are encouraged to take risks, try something new, and use the technology we see as most relevant. We often say that it is important to ‘fail-fast’.

Everyone at Keypath is innovative; we have future-focussed mindsets. Employees aren’t held back by unnecessary barriers, which allows us to create the highest-quality programs we can for our partners’ students. Seeing this in the retrospectives inspires me, and I believe my colleagues too, to keep improving our understanding of quality and how we deliver it.

How would you summarise the Keypath approach to quality?

In two words, I would say it is iterative and flexible. Our adaptive approach to staying current and innovating pushes us to the forefront of instructional design. We acknowledge that we will never reach a perfect course, because everything is continually changing.

Change within our industry is inevitable; it is just something we need to be aware of and adapt rapidly.


References

  • Brookfield, S. (1998). Critically reflective practice. Journal of Continuing Education in the Health Professions, 18(4), 197-205.
  • IBM Design. (2019). Enterprise design thinking. https://www.ibm.com/design/approach/design-thinking/