September 2017
UNDERSTANDING QUALITY "AS IS"
All agree – especially students – that quality is critical in determining not only the value of a degree, diploma, certificate or learning experience, but also the long-term viability of a program, course or institution. Quality matters.
But how we define quality can act both as a standard for accountability and as an inhibitor of innovation and change.
Our notions of quality are very much focused on inputs and a limited range of outputs:
- We are concerned about qualifications of the students admitted.
- We are concerned about the qualifications of faculty.
- We are concerned about the design of the program and its "equivalence" to other similar programs already operating.
- We are concerned about the management of processes within a program – assessment rubrics, appeals, academic integrity, and academic governance.
- We are concerned about the rigor of marking.
- We are concerned that the outputs match the intended outcomes of the course.
While formal quality assurance regimes have increasingly focused on the student experience, they have not made student engagement the key driver of quality. Nor do such regimes look at whether a program is innovative and flexible, makes good use of technology for learning analytics and assessment, and engages students with potential applications of their learning.
Indeed, the model of pedagogy prized by many quality assurance regimes is particular and specific, derived from face-to-face teaching as the "gold standard", despite the varied quality and experience of face-to-face instruction.
A great many quality assurance regimes do not look seriously and critically at learning impacts over time – impact on career, on lifelong learning, and on community – nor do they look in depth at student engagement and faculty satisfaction as drivers of quality. We still have a lot to do to bring our thinking about quality "up to date".
QUALITY AS A DRIVER FOR INNOVATION
If we want to see quality as a driver for innovation rather than as a barrier to it, we need to start rethinking our approach to quality. In particular, we should ask ourselves:
- The How 1: How do the students experience their learning?
Is it the best experience it could be, given the resources available to the institution, the faculty member/instructor and the learner?
Were real attempts made to engage the learner with other learners worldwide, with experts worldwide and with their faculty member/instructor? Did the learning design fit all learners? Were good routes provided for learners who struggled or those who needed to fast-track?
- The How 2: How satisfied is the faculty member/instructor with their conditions of practice?
Do they have the supports they need to be able to provide the learning opportunities to truly engage learners? Do faculty/instructors feel they "own" the learning agenda and their teaching? Do faculty feel they have a genuine voice in the governance of programs and courses?
- The What: What are the outcomes of a student’s learning?
What matters most is what the student can do or understand now, which they could not do or understand when the program/course began.
- The So What: What are the impacts of the students we produce in society?
Focus on impacts of the learning in practice, not just immediately, but over time (e.g. in the workplace, in society and in their engagement in the community).
- The Then What: How does the experience of learners lead to innovation and change within the university or college?
After the learning occurred, what changes are made to the design, deployment and delivery of the program the next time it is offered? Can the program and the course be significantly improved for both learners and faculty/instructor?
We need to escape from the "ISO 9000" thinking about quality, which so informed the quality movement in the 1980s, and move to a much more experiential and outcome-focused view of quality if it is to be the engine of transformation.
TEN KEY DEVELOPMENTS WHICH IMPACT OUR UNDERSTANDING OF QUALITY
There are ten key developments which will drive new thinking about quality and quality assurance:
- The development of learning analytics
- The use of student engagement as a basis for benchmarking and evaluation
- New forms of flexible learning which focus on outcomes not process
- New forms of assessment
- The focus on skills and competencies
- New kinds of credit and skills recognition
- New providers for learning with new institutional models and processes
- The internationalization of learning
- A changed expectation about qualifications and outcomes from employers
- A renewed focus on outcomes and impact
Let us unpack each of these in turn.
The development of learning analytics
The development of analytics, made possible by student use of online learning tools, social media and other digital resources, presents an opportunity to "look under the hood" of learning.
Universities and colleges are using analytics to develop models of student behaviour, including: (a) the behavioural characteristics which help predict when a student is likely to drop out of a course, fail a course or underperform in a course; (b) learning patterns – when and how they use learning resources, and the extent to which they use text, video, audio or other learning materials; (c) how they approach learning tasks – their learning and problem-solving style; (d) how they engage and interact with peers, mentors, instructors and others; and (e) how they tackle simulations and games – their social and behavioural characteristics when they undertake this work. These rich sources of data permit not simply retrospective analysis of learner behaviour, but predictive analysis and the opportunity to improve learner self-management.
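To make the idea of predictive analysis concrete, here is a minimal sketch – not any institution's actual model – of how dropout risk might be estimated from hypothetical engagement data (weekly logins, share of resources viewed, forum posts, quiz scores) using standard logistic regression; the feature names, synthetic data and coefficients are illustrative assumptions only.

```python
# A minimal, illustrative sketch (not any institution's production model):
# estimate each student's dropout risk from synthetic engagement features
# using logistic regression. Feature names and data are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-student engagement features:
# weekly logins, share of resources viewed, forum posts, average quiz score.
X = np.column_stack([
    rng.poisson(4, n),
    rng.uniform(0, 1, n),
    rng.poisson(2, n),
    rng.uniform(40, 100, n),
])

# Synthetic "dropped out" labels: lower engagement raises risk (illustrative only).
logit_risk = 2.0 - 0.3 * X[:, 0] - 2.0 * X[:, 1] - 0.2 * X[:, 2] - 0.02 * X[:, 3]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_risk))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Predicted dropout probabilities could flag students for early, supportive outreach.
print(model.predict_proba(X_test)[:5, 1])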
Universities and colleges are systematically using student data to help make informed decisions, which can lead to improved student engagement, satisfaction, retention and attainment. Analytics also provide a strong basis for evidence-based quality assurance, focused on the student experience and on teaching and learning. Rather than peer reviewers having to adduce evidence of quality, analytics can provide such evidence where students are using online learning and related tools.
Some outstanding work is taking place at The Open University (UK)[1] and through a variety of collaborative projects supported by JISC (UK)[2] and Educause (US)[3]. While there are several concerns – e.g. privacy, security, ethics, cost, technological infrastructure – truly creative work is taking place based on the idea of improving the experience of learning for students, increasing the effectiveness of teaching and learning, and strengthening impact: all of the things quality assurance is focused on. Data analytics is the genie out of the bottle – it is beginning to show results and will not be "put back in the bottle".
The use of student engagement as a basis for benchmarking and evaluation
The National Survey of Student Engagement (NSSE), an international study involving universities and colleges from several countries, focused the attention of universities and colleges on the relationship between student engagement, learning outcomes and student performance. There are good reasons to do so. The findings from 20 years of research on undergraduate and college education are unequivocal: the more actively engaged students are — with faculty/instructors and staff, with other students, and with the subject matter they study — the more likely they are to learn, to stick with their studies, and to attain their academic goals. Engagement is a predictor of outcomes.
Student engagement refers to an array of learning activities and experiences associated with such critical outcomes of a university or college education as critical thinking, problem finding and solving, and communications, among others. Specifically, it represents the time and energy students dedicate to mindful and purposeful educational activities: studying; interacting with faculty members/instructors and collaborating with peers about substantive matters; synthesizing what they’re learning and applying it in new contexts; and participating in enriching experiential learning or high-impact practices, including service-learning, internships, diversity and global learning, learning communities, capstone courses, and undergraduate research.
A recent article[4] which explores the link between measures of student engagement and quality assurance ends with this statement:
"To fulfill their responsibility for oversight of educational quality, boards must understand important concepts like student engagement and the prevalence of effective educational practices at their institutions. Boards should set expectations for evidence about educational quality, identify the best ways to share results, and set aside time to discuss relevant assessments. Allocating board and academic affairs committee time to the thorough consideration of student engagement results demonstrates that enhancing the quality of the student experience is a priority. Finally, a commitment to invest in improvement initiatives—in collaboration with administrative leaders, faculty, and staff—is vital to achieving proper board oversight of educational quality".
New forms of flexible learning, which focus on outcomes not process
A variety of new delivery models and practices are emerging in response to student demand for flexibility. Whether these are outcome-based assessments for learning against a competency-based rubric (University of Wisconsin, Western Governors University and many others), credit for MOOCs, nanodegrees or partial credit for modular courses offered 365 days a year (Kentucky Community and Technical College System – KCTCS), they are "game changers" in terms of assumptions about the design, development, deployment and delivery of learning. Given the growth of "unbundling", which is explored extensively by Ryan Craig (2015)[5], quality processes become different for different modes of learning and different kinds of learning outcome. Put simply: one size for quality assurance does not fit all.
Let us take the example of Kentucky's modular courses[6]. Normal full-semester courses were split into competency-based modules of two or three weeks' duration. A learner can enroll in a module at any time (365 days a year) but has a fixed time to complete it. As they register for the module, they are given a pre-test to determine their knowledge and readiness. If they score 65 or higher, they are immediately given the post-test for the module; if they score 65 or higher on that as well, they earn credit (.25, .30 or whatever credit weight is attached to the module). If they score less than 65 on either test, they proceed to the course and are connected to their coach and mentor. Two or three weeks later, they sit their course examination or complete their final assessment.
As they progress through the collection of modules, they earn course credit equivalent to a full course, which is then transferable within the college and university system in the United States. All of this learning is online. This is a very different kind of course from the full semester, classroom-based course. It requires a different kind of quality assurance process focused on design, delivery and support. What matters to the KCTCS system is that the learner performs well on outcome-based evaluations, not how they interacted with their coach.
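The routing logic described above can be sketched in a few lines of code – a minimal illustration under stated assumptions, not KCTCS's actual system. The 65 pass mark comes from the text; the function name and the default credit weight are hypothetical.

```python
# A minimal sketch of the pre-test/post-test routing described for Kentucky's
# modular courses. The 65 pass mark is stated in the text; everything else
# (function name, default credit weight) is illustrative only.
from typing import Optional, Tuple

PASS_MARK = 65


def route_learner(pretest_score: int,
                  posttest_score: Optional[int] = None,
                  credit_weight: float = 0.25) -> Tuple[str, float]:
    """Return (next step, credit earned) for a learner registering for a module."""
    if pretest_score >= PASS_MARK:
        # Ready learners go straight to the post-test.
        if posttest_score is not None and posttest_score >= PASS_MARK:
            return ("credit awarded", credit_weight)
    # Otherwise the learner proceeds to the two- to three-week module,
    # connected to a coach and mentor, and sits the final assessment.
    return ("proceed to module with coach and mentor", 0.0)


# A learner who passes both the pre-test and the immediate post-test:
print(route_learner(pretest_score=80, posttest_score=72))
# A learner who scores below 65 on the pre-test:
print(route_learner(pretest_score=50))
```

The point of the sketch is simply that quality assurance for such a model has to focus on the design of the tests, the support from coaches and mentors, and the outcome-based evaluations, rather than on seat time.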
As more universities offer flexible routes to achieving qualifications, making much more extensive use of prior learning assessment and recognition (PLAR), work-based learning agreements with professional bodies and companies, block-transfer arrangements with colleges and other innovations, quality assurance needs to change and adapt accordingly.
New forms of assessment
Peter Hill and Sir Michael Barber (2014)[7] signaled that a strong focus for the future is reimagining and redesigning how learning is assessed. They suggest a transformation of assessment is underway, in part because emerging technologies enable change, but also because current modes of assessment are no longer "fit for purpose". While their focus is mainly on school systems, their analysis and recommendations for action apply equally to higher education.
Core to their proposition is that we are now better able to define what students are expected to master and when, and that we have access to much more sophisticated tools to assess students, both to aid their learning and to certify mastery. They also suggest our current assessment practices dictate and constrain learning rather than enabling it and providing for the full documentation of student capabilities.
They point to emerging assessment systems which make use of simulations, games, immersive experiences, portfolios of student work, adaptive assessment and other tools as examples of changes already gathering momentum in assessment practice. Others suggest that the significant growth of e-portfolios, which capture student work, feedback and analytics, provides a stronger basis for capturing a student's learning than the mid-term or end-of-term examination.
A great many new tools are emerging, which facilitate the renaissance in assessment which Hill and Barber are describing, many developed by private corporations such as Pearson and McGraw Hill. Both see assessment as their major opportunity for new revenues and work.
As assessment changes, so quality assurance systems need to adapt. In particular, they need to explore a much wider range of approaches to assessment and learning outcomes and look at the richness and intensity of assessment.
The focus on skills and competencies
A great many governments – including Japan, the UK, Canada and the US – are concerned that universities are not graduating students with the "work ready" skills their respective economies need to enable growth, spur innovation and ensure social well-being. This has led to several important developments.
First, there is a strong focus on science, technology, engineering and math (STEM), despite the fact that many fast-growing sectors of the global economy require strong social understanding, design and arts skills. For example, the creative industries in the UK are amongst the fastest growing in that economy, employing over 1.8 million persons, with employment in these creative industries (architecture, fashion, interior design, etc.) growing at twice the rate of other sectors[8]. Closing faculties of humanities, social science and arts (as occurred in Japan in response to government concerns about skills[9]) may reduce opportunities for social and economic development rather than increase them.
Second, 65% of the jobs to be taken by those who began primary school in 2015 do not yet exist. They will be imagined and created while today's students are still in school, college and university. Work is changing, and the way in which skills are deployed a decade from now will look very different from the way skills are deployed today. Students in universities and colleges need adaptability, resilience and "grit" – all strong predictors of learning outcomes. These "soft", non-academic skills are essential for individuals seeking to enter the workforce.
They need to be explored and understood as part of the quality assurance process: are universities and colleges developing resilience, adaptability and grit?
Finally, to what extent is a degree, diploma or other university or college qualification a statement of skill? According to a study conducted by the U.S. Commerce Department[10], only 25 per cent of the 15 million Americans who have a STEM degree work in a STEM job. And of all the people working in STEM fields, less than half hold a STEM degree. This suggests the key skills a certificate, diploma or degree should signal are the skills employers say they need: critical thinking, teamwork, effective communication, the ability to learn, and problem-finding and solving.
These skills can be found in a range of disciplines, including the arts and humanities. While some technical skills are required for some positions, many companies prefer to find the right people and let them quickly acquire the specific skills they lack. A pre-occupation with specific skills may be counterproductive, especially given forecasts about the future of work (Ross, 2016)[11].
Quality assurance systems rarely attend to skills, whether formal or non-cognitive. In rethinking quality and its meaning for universities and colleges, we need to look more closely at adaptability, resilience, grit and the soft skills employers are looking for.
New kinds of credit and skills recognition
There are new forms of credit beginning to appear, triggered by the development of MOOCs. We list them here:
- Specializations: Coursera began specializations in 2014 and now has some 83 specializations. They consist of a group of related courses designed to help learners deepen expertise in a subject. According to Coursera, 1.5 million Coursera learners have signed up for courses that are part of a specialization.
- Nanodegrees: Udacity began offering nanodegrees in June 2014, partnering with companies and major employers such as Google, AT&T, Tata and others to create custom MOOCs that meet the competency and skill needs of these employers. Some of these nanodegrees come with job guarantees.
- XSeries MOOC: Launched by MIT through edX in 2013, each XSeries covers content equivalent to two to four traditional residential courses and takes between six months and two years to complete. In a break from previous offerings, the XSeries sequences are composed of shorter, more targeted modules without one-to-one residential course equivalents.
- HBX CORe: This is Harvard Business School's Credential of Readiness (CORe). Irrespective of the background of the learner, all take three modules: Business Analytics, Economics for Managers, and Financial Accounting. The aim is to enable basic competency across these three components of business practice. Look for other developments along these lines as colleges, universities and MOOC providers seek both relevance and revenue.
As these programs are presented as "fast track" passages to work, what is the quality assurance process for the design, development and delivery of the courses? For the engagement of learners? For the assessment of learning outcomes? Given a significant number of students may pursue these routes to employment or credential, are our current quality assurance practices robust and adaptable to the development of such approaches?
New providers for learning with new institutional models and processes
A challenge for many public institutions is the growing private sector in higher education. In Africa, for example, the number of private universities will soon outstrip public universities, according to Professor Olugbemiro Jegede, former Secretary General of the Association of African Universities (AAU). In India, according to the University Grants Commission[12], there are 235 recognized private universities.
As Badr Aboul-Ela (2016)[13] points out in his important chapter in the book The CIQG Quality Principles – Toward a Shared Understanding of Quality: "Governments also have a challenge in dealing with fake quality assurance agencies and higher education institutions, most of which operate online. International cooperation in this regard would help minimize the negative impact of such fake entities, and in turn, improve quality" (page 36).
Growth of new providers is critical in meeting the sustainable development goals related to lifelong learning – they can make a real difference to both access and outcomes and can respond nimbly to the socio-economic needs of a jurisdiction. They can also innovate quickly in ways many public institutions find difficult. Many private universities are outstanding – Harvard, CalTech, Yale, Cornell, Princeton, University of Chicago, and MIT are all high ranking institutions and all private. The challenge is to sort out which are quality institutions and which are not.
As the higher education sector becomes more complex, more international and more online, this challenge becomes more acute, causing many to explore new approaches to quality assurance through standards-based assessment. Is this the right way to go?
The internationalization of learning
Finland recently released a report[14] on the impact of its international higher education student placement program on learning. It shows very clearly the international experience of students has a significant impact on their social skills, empathy, communication, tolerance and adaptability. The report is aligned with a similar study from the European Union (EU)[15], which had similar findings.
Globalization is having an impact on higher education, and learner mobility is a critical feature of the current higher education landscape. With a growing number of transnational qualification frameworks (an outstanding example being the Transnational Qualifications Framework (TQF) for the Virtual University for Small States of the Commonwealth) and with students studying abroad for part of their degree or diploma, the complexity of a student's journey to a degree or diploma is growing.
In Canada, programs in some institutions now have student bodies that are 30% or more international. More programs include international study components and more students are completing part of their Canadian degree programs abroad. More learners are coming to Canada with part of a program completed in another country, and more courses have international components and links to international research, applied research or organizations. Higher education is increasingly an international business.
The growth of the international student body in each jurisdiction will continue, though it will become an increasingly competitive market as more institutions seek to capture these students. Each country seeking to grow its international student body competes with the USA, UK, Australia, Canada and New Zealand. A variety of estimates suggest that, by 2030, some 3 million individuals will be seeking to study in one of these countries – an increase of 1 million from 2015. At this time, the USA, UK and Australia are the preferred destinations, especially for post-graduate study. Indeed, the UK has become increasingly dependent on international students to fund its complex system and requires some 100,000 or more new international students each year to sustain it. Recruitment depends very much on immigration rules, costs, relevance, security and quality of student life.
Internationalization is not just about who the learners are; it is also about what they are learning. As access to knowledge becomes much more universal (aided significantly by advances in automatic translation engines and open education and research resources), the curriculum itself also needs to reflect who the learners are, where they come from and where they are likely to return to. As knowledge develops at a faster rate than ever before and is much more globalized, a failure to ensure international content and focus is likely to lower the interest of international students in a specific program or area of study (with some exceptions).
From a quality assurance perspective, what attention is paid to the experience of international students – especially those spending just part of their study time in a different country? Are we attending to their different needs and are we systematically assessing the impact of their "time abroad" experience?
A changed expectation about qualifications and outcomes from employers
Employers are increasingly interested in what a potential employee can do rather than what or where they studied. For example, Google recognized that past academic performance and achievement, at least in its work environment, does not predict future performance at work. It moved to behavioural and skills indicators (competencies) and behavioural interviewing (Foster, 2013)[16] as the basis for staff selection. Penguin Random House, Amazon, Apple and a growing list of other employers have followed suit. In all of the examples of innovative, competency-based programs offered by institutions, employers are at the table as part of the team defining learning outcomes and the competencies to be mastered, so that there is alignment between learning and the skills they are looking for – something the college sector has done since its inception.
Many professional bodies are moving to competency-based and mastery models for admission to the profession. These include nurse and medical education, accounting, project management, counselling, human resources, pharmacy technician, some engineering professions and many more. Over the coming decade, we can expect more professional bodies to adopt a competency-mastery approach to professional certification.
What is interesting is that a significant percentage of those holding a degree recently awarded by a quality-assured university do not succeed in passing a professional certification or licensing examination focused on skills and competency. Take nursing as an example. Canadian (except Quebec) and US graduates of first-degree-level nursing programs are required to take the same licensing examination, administered online, so they may practice as qualified nurses. The overall 2015 pass rate in Canada was 70.6%, while in the U.S. 78.3% of those who took it passed[17]. Put another way, between 21.7% and 29.4% of students who completed their nursing degree within the past six months failed a test of knowledge and competency.
The shift in the practice of employers away from a focus on the credential to a focus on what graduates can do places emphasis on experience as well as skills. They are increasingly looking at co-op experience, internships, voluntary service, international experience and "skills beyond school" – at the complete portfolio of the person who wishes to work for them. Quality assurance regimes need to look beyond the program, courses and assessment into the range of experiences which a student is able to add to their portfolio during their time at college or university. A quality program is no longer enough to satisfy employers, who are engaged in a global war for talent.
A renewed focus by Government on outcomes and impact
Governments, many of whom face financial challenges, are exploring their "return on investment" in higher education. The educational pipeline is viewed as the key avenue to increasing a state’s "educational capital" or highly qualified people (HQP) in the workplace. Educational capital is assumed to have a direct impact on a state’s economy and quality of life. Because of this, governments are increasingly requiring assessments of higher education outcomes and evaluations of social, economic, health and other impacts of higher education in their jurisdiction. Over and above looking at completion rates for degrees or equity, governments want to know the answer to the "so what?" question – "what difference does having all these graduates make for our society?".
This is not an easy question to answer with compelling and persuasive evidence. Universities in the UK do their best to present the case in their short pamphlet Why Invest in Universities?[18]. In this document, the authors focus on the impact of graduates on firm productivity, the spin-offs from research as job creators, research producing new products which can transform industries (e.g. the development of graphene), the value of higher education as an export (in the UK this is estimated to be worth £10.7 billion) and other aspects of the contribution universities make to the fabric of society, to economic growth and development and to non-profit corporations. Similar documents have been prepared by other higher education systems around the world.
From a quality perspective, how far down the path of assessing outcomes and impact should a quality assurance regime go? Is this part of a different sphere of work or is it related to the assessment of institutional capacity and strategic intention? Quality is about more than just the experience and performance of students; it is also about the performance of the institution in society.
WE NEED NEW APPROACHES TO QUALITY ASSURANCE
These ten key developments, chosen from a longer list, have an impact on both how we understand quality and how we practice quality assurance – points well made in the recent collection of papers The CIQG Quality Principles – Toward a Shared Understanding of Quality[19].
Just as universities and colleges are changing and adapting to emerging technologies, shifts in demographics, new economic realities, new demands from their governments and changing employer expectations, so must there be a similar shift in our understanding and practice of quality assurance.
At the heart of this shift is a move away from compliance and the idea of quality as "regulation" to a more "fit for purpose" understanding of quality, which is what both W. Edwards Deming and Joseph Juran – the fathers of the quality movement – always argued was the key to understanding what quality is about. Each college and university should determine its purpose and its intentions for its students, faculty and instructors, and then make extensive use of a range of quality tools to assess whether what it is doing is fit for purpose. At the heart of this work is understanding and evaluating the student experience of the institution in its totality and the learning outcomes it delivers. It is also increasingly necessary to look more critically at the institution itself, not only in terms of its efficiency and effectiveness, but also in terms of its social and economic impact.
The idea that one approach to quality – one set of "standards" – fits all is increasingly problematic. As new forms of degree-granting institutions appear and as higher education becomes both more international and more complex, we need new approaches to quality assurance.
[4] Kinzie, J. et al (2016) Using Student Engagement Results to Oversee Educational Quality. Trustee Magazine, January/February 2016. Available at http://agb.org/trusteeship/2016/januaryfebruary/using-student-engagement-results-to-oversee-educational-quality
[5] Craig, R. (2015) College Disrupted: The Great Unbundling of Higher Education. New York: Palgrave Macmillan.
[6] For a full review of what is occurring in Kentucky, see https://teachonline.ca/tools-trends/quality-guidelines-and-practice/how-...
[7] See Hill, P. and Barber, M. (2014) Preparing for a Renaissance in Assessment. Link expired
[8] Dead link
[9] For details: link expired
[10] Dead link
[11] Ross, A. (2016) The Industries of the Future. New York: Simon and Schuster.
[12] Source: dead link
[13] Aboul-Ela, B. (2016) Quality and Government. In Uvalic-Trumbic, S. [editor] (2016) The CIQG Quality Principles – Toward a Shared Understanding of Quality. Washington, DC: Council for Higher Education Accreditation / International Quality Group.
[14] Dead link
[15] European Commission (2014): The Erasmus Impact Study. Effects of the mobility on the skills and employability of students and the internationalization of the higher education institutions.
[16] Foster, T. (2013) Hiring Talent, Decoding Levels of Work in the Behavioral Interview. New York: Foster Learning Corporation.
[18] Dead link
[19] Uvalic-Trumbic, S. [editor] (2016) The CIQG Quality Principles – Toward a Shared Understanding of Quality. Washington, DC: Council for Higher Education Accreditation / International Quality Group. Available at https://eric.ed.gov/?id=ED569216