Dr. James Colliander, a Professor at the University of Toronto, has been teaching first-year Calculus for 20 years, working with teaching assistants (TAs) for marking. He has also been involved with the Canadian Mathematical Society (CMS), which organizes the annual Canadian Open Mathematics Competition (COMC). This competition involves about 5,000 students, each completing a 16-page exam. The exams arrive at the University of Toronto to be marked by 100 volunteers. Because some markers are able to mark only limited sections of the papers, the distribution and coordination of the paper exams presented a considerable logistical challenge.
Assessing the difficulties of this situation, Professor Colliander saw the solution as “breaking the stapled corners” of the exam papers: using the web to deliver the pages to markers online rather than relying on circulating paper. With each page graphically coded, then completed by students and scanned, exams could be marked collaboratively, quickly, and efficiently, while providing improved feedback for students. From this concept, he worked with advisors and partners to develop Crowdmark, an online collaboration tool for teams of markers.
Crowdmark is structured so that multiple markers can simultaneously assess a set of exams; each marker can assess entire papers or be assigned specific questions.
- As the first step, the instructor authors and uploads the exam template as a PDF to Crowdmark, together with the roster of students scheduled to write the exam.
- Crowdmark returns a printable PDF file containing exams for all the students, with each page having a unique code to identify it.
- The exams are printed from this file and distributed to students; they are not matched with students at this point, which allows them to be distributed randomly.
- After the exams are completed by the students, they are scanned to produce JPG files and, using the Crowdmark interface, each exam is matched to the student’s e-mail.
- The exams can now be marked online by TAs or other markers who can add comments and equations, links to sources, and other feedback.
- When marking is complete, student marks are recorded, exams are stored electronically and students can access their exams to read comments.
- The system can be scaled and modified to respond to any size of exam cohort, with any type of question and answer format.
- The system can be integrated with institutional learning management systems and online grade books for recording of marks.
- Crowdmark is designed for use in any discipline; it has been used in History, Classics, Philosophy, Physics, Mathematics, and Engineering.
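The page-coding and matching steps in the workflow above can be sketched in a few lines of Python. This is purely illustrative: Crowdmark's actual encoding scheme and interfaces are not public, so the code format, function names, and data structures here are assumptions.

```python
import uuid

def create_exam_booklets(template_pages, num_students):
    """Generate one exam booklet per student, each page tagged with a
    unique code (hypothetical format: '<booklet_id>-<page_number>')."""
    booklets = []
    for _ in range(num_students):
        booklet_id = uuid.uuid4().hex[:8]  # short random booklet identifier
        pages = [
            {"booklet": booklet_id, "page": i, "code": f"{booklet_id}-{i:02d}"}
            for i in range(1, len(template_pages) + 1)
        ]
        booklets.append(pages)
    return booklets

def match_scans_to_booklets(scanned_codes):
    """Group scanned page codes back into booklets, regardless of the
    order in which the pages came out of the scanner."""
    matched = {}
    for code in scanned_codes:
        booklet_id, page = code.rsplit("-", 1)
        matched.setdefault(booklet_id, []).append(int(page))
    return matched
```

Because every page carries its own code, booklets need not be matched to students before the exam is written, and scanned pages can be reassembled in any order — the properties the workflow above depends on.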
This list of capabilities reflects the current Crowdmark, which has been refined and improved over the past two years, not the initial product. The process of developing Crowdmark as an online platform was matched by steps in establishing a start-up company to commercialize the product.
In February 2012, Professor Colliander began developing Crowdmark as a product and a business through meetings with the Innovation and Partnership Office (IPO) at the University of Toronto. These led to discussions at MaRS Innovation, a commercialization agency located at the MaRS Discovery District, an innovation centre connecting science, technology, and social entrepreneurs with business skills, networks, and capital. In the summer of 2012, Crowdmark joined UTEST (University of Toronto Early-Stage Technology), a joint initiative of MaRS Innovation and the U of T Innovation and Partnership Office that offers start-up funding, mentoring, and office space to companies at the beginning of idea generation.
At the same time, Dr. Colliander asked Martin Muñoz, a PhD student with software design experience, to join him as Co-Founder. Martin’s condition for joining was that the product they developed had to be “awesome”.
During this initial development, Donna Shukaris at the IPO and Lyssa Neel at MaRS Innovation (now the Chief Operating Officer at Crowdmark) provided essential expertise and guidance.
Dr. Colliander approached the CMS about the possibility of using the Crowdmark platform for the 2012 COMC exams; the response was that a prototype would have to be ready for assessment by October. An alpha version of the software was built using public and private funding. Using Crowdmark, the exam template was scanned, and the exam sets were created and sent to those supervising the students. The completed exams were returned to the University of Toronto, scanned, and sent to 150 markers at 8 Canadian universities, who marked the papers over the weekend. The technology had bugs that were fixed in real time; this first version of Crowdmark delivered a substantial productivity boost, requiring only half the person-hours to complete the marking.
The university markers involved in this alpha test expressed enthusiasm for the platform and wanted access to it for their own courses, once it was fully developed. They also provided useful feedback concerning product improvements, such as:
- The capacity to write comments on the exam;
- The ability to use mathematical notations in feedback to students; and
- A more streamlined system for saving the marks, with confirmation of the marks being received and recorded.
All of these improvements were made.
A preliminary beta version was made available on a limited basis to instructors at the University of Toronto in the summer of 2013. Their feedback, both positive and negative, was used to build the public beta version, released in October 2013. This version was tested in large first-year courses.
One of the users, Alfonso Garcia-Saz, who oversees the 6 instructors who teach the 1,000 first-year Calculus students, asked for a number of improvements.
- The option to print extra exams – the number had been limited to the number of students, but this created problems with multiple exam sites;
- The possibility of awarding fractional marks rather than being limited to integers; and
- Most importantly, he did not want to be the one who scanned in all the returned exams.
Responding to this final request, Crowdmark arranged with RICOH, a company that provides scanning services, to pick up the 12,000 pages of the Calculus mid-term at 9:30 p.m. on the evening the exam was written. By 3:00 a.m., all the pages had been scanned, and by the next morning, Martin Muñoz had completed the upload of all the scanned exam page images into Crowdmark. With the online exams distributed to TAs for assessment, the marking was completed that same day. The students received custom-coded e-mails with access to their exams, scores, and feedback on their papers, including corrected formulas, hyperlinks to relevant parts of texts, and comments, many of which had been prepared in advance and stored for easy access by markers. In effect, using Crowdmark, the exams had been completed, marked, and returned within 24 hours.
Crowdmark has had pilots at the University of Southern California, York University, University of Waterloo, Humber College, University of Rochester, University of Illinois, and University of Toronto, as well as the Toronto District School Board. Further pilots are planned at Northwestern University, the Niagara District School Board, University of British Columbia, Queen’s University, Ryerson University, and elsewhere. As of March 2014, over 120,000 pages of assignments and exams have been evaluated using Crowdmark.
Crowdmark has now been officially launched for use at universities and colleges with pricing assessed on a per student basis. The company continues large-scale pilot tests, product development, and surveys of users to guide improvements and expansions.
Outcomes and Benefits
The feedback from Crowdmark users has positioned it as a useful tool for productivity gains, with exams able to be marked quickly by an individual or a group. Marking time is greatly reduced, while timeliness and feedback for students are improved.
The exams are stored online so that institutions are no longer required to keep paper copies. Over the period of a degree, a retrievable portfolio of exams for each student is created.
Crowdmark can gather all feedback provided by a TA, allowing faculty to consider whether it is positive, negative, and/or helpful, and to engage with TAs on using feedback as part of the learning process.
In the University of Toronto program Writing in Teaching, a literacy coach talks with graduate-student TAs about how they can provide helpful comments on students’ writing skills when marking exams in the science, mathematics, and technology streams. Sean Uppal, an Instructor at the University of Toronto, is using Crowdmark for the assessment of assignments in his Linear Algebra course, with the TAs providing comments on language use as well as math skills. This application is being researched for effectiveness.
Part of Professor Colliander’s initial vision for Crowdmark is the creation of “grading as a service through a web-based labour market”, and this remains part of the company direction. One of the steps toward this is what he calls “evaluation multiplicity”. With teams of markers for each question, considerable discrepancies in awarded marks can be found. A chief marker can pinpoint TAs who award marks that fall outside the norm and the lead instructor’s expectations. This creates opportunities for training the marking team and improving the marking. Combined with a review of the quality of feedback provided, this can lead to the identification of markers who perform exceptionally well.
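A minimal sketch of such a discrepancy check, under the assumption that it compares each marker's average awarded mark against the pool (Crowdmark's actual analysis is not public, and the function and parameter names here are hypothetical):

```python
from statistics import mean, stdev

def flag_outlier_markers(marks_by_ta, threshold=2.0):
    """Flag TAs whose average awarded mark deviates from the mean of all
    TA averages by more than `threshold` standard deviations.
    Illustrative only; not Crowdmark's actual method."""
    ta_means = {ta: mean(marks) for ta, marks in marks_by_ta.items()}
    overall = mean(ta_means.values())
    spread = stdev(ta_means.values())
    if spread == 0:  # all markers agree on average; nothing to flag
        return []
    return [ta for ta, m in ta_means.items()
            if abs(m - overall) / spread > threshold]
```

A chief marker could run a check like this per question, then review the flagged markers' papers before releasing results.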
In the longer term, these exceptional markers could be formed into subject-specialist teams made available to universities to provide full-service marking for their large classes. The Crowdmark platform and teams would allow universities to move beyond such restrictions as multiple-choice exams to offering students more complex exams and feedback.