Voting Systems – Metrics in Teaching and Learning

Abstract

The purpose of this theoretical case study is to explore three main approaches to using voting systems within the teaching and learning process, and how the resulting metrics can support an inclusive curriculum design.

The findings indicate that the critical factor in selecting the correct voting system is the data collected. For the university to data mine, allowing it to identify student support opportunities, the data needs to be linked to a specific student, which means using the university's own software or a third party's.

In terms of value, the most cost-effective and accessible method is approach two, which uses QR codes and Google Forms. However, other issues, such as data ownership and teachers' performance, need to be considered and mitigated against.

Keywords

Voting System, Metrics, Teaching and Learning, Pedagogy, Andragogy, Higher Education in Further Education, Inclusive Curriculum Design, Diversity, Fostering Learning in Large Groups

Introduction

Using technology within the classroom has promised to revolutionise the teaching and learning process. From the humble calculator to the computer, the interactive whiteboard and the virtual learning environment, state schools have been serving this mantra for many years. However, this shift to technology driving teaching and learning has led to a paradigm shift in teaching, from teacher centred (pedagogy) to student centred (andragogy), as explored by Pollard (2010, p.11). The HEA (2011) framework supports this shift (K1). Setting aside the pedagogy and andragogy debate (Davenport & Davenport 1985; Delahaye et al. 1994; Holmes & Abington-Cooper 2000; Samaroo et al. 2013), each education sector has had to come to terms with the constant demand from students, governments and other stakeholders to integrate technology within the teaching and learning process. One such solution for integrating metrics within teaching and learning is the student voting system (K2).

Scope and Development

To limit the scope, a student voting system for this case study is any electronic system which allows students to cast a vote, although it is important to recognise that there are other voting tools, such as students raising their hands to agree or disagree with a question (K2, K3). Before the development and general acceptance of the ‘bring your own device’ movement, voting systems were integrated into the chair or table, or clickers were issued to every student. However, prolific use of mobile technology by students has provided a cheap alternative, where the university provides software (a phone app) and Wi-Fi and the students use their own mobile devices (K4).

While this case study will refer to theories and practices such as Thomas and May's (2010) four-pronged typology for student diversity (educational, dispositional, circumstantial, cultural) and theories of inclusive design, it will not exclusively refer to particular theories (K2, K3, K5). Instead, key terms and concepts will be used, where appropriate, within the case study. The principal higher education thematic area of this case study will be fostering learning within large groups.

Common Approaches to Mobile Voting

Approach One: Most mobile voting apps require the students to install the app (software) onto their personal phone before they can place a vote. Once it is installed, the student needs to link their account with the university, often by activating an account provided by the university. The student can then use the voting software (Wi-Fi dependent) to interact within the learning experience. The advantage of this approach is that individual students' performance can be monitored and tracked during the module, year or length of the course. As a side benefit, the students' engagement and the teacher's ability to share new knowledge in a meaningful way can be gauged (K2, K3, K4). However, this system is dependent on the student, their technology and their willingness to engage with a system which allows the university to track and monitor their performance. Linked to this is how the university and teachers actually use the collected data, in respect of the Data Protection Act.

Often this approach does not integrate fluently within the presentation process: having to swap and change between different software and interfaces to view the incoming data makes the lesson feel disjointed (K1, K2, K3, K4). However, more recent attempts, such as PollEveryWhere (2016), have provided plugins which allow the student responses to be displayed directly in presentation software, like MS PowerPoint. This approach is focused on student-centred technology, which promotes greater independence and thus a student-centred learning approach that supports the andragogical demands (K2, K3, K4, K5).

This approach offers inclusive curriculum design as it allows flexibility, accountability, collaboration and, to a large extent, transparency and equality. It also offers reasonable adjustment and technology-enhanced learning, and fosters student engagement. Financially, this approach can become costly due to the constant updating of the app and server-related hardware (K1, K2, K3, K4, K5, K6).

Approach Two: can be considered a quick-and-dirty approach as it relies on online survey software, for example, Survey Monkey, Google Forms or quizzes via the VLE. A principal issue with this approach is presenting the question behind a login function, such as using the VLE. Although this allows the university to track which students have engaged with the process, it does not necessarily provide an instant whole-class graphical profile of the student responses (K2, K3, K4, K5, K6). Also, the students quickly disengage with the process, as they dislike having to log in to a system to cast their vote or response (K1, K2, K5, K6). However, software like Google Forms offers an instant profile of all responses, allowing immediate feedback to the teacher and students. This approach does not require additional software to be installed on the students' mobile devices, as a web link can be provided via the learning materials, the VLE or a QR code.

This approach offers inclusive curriculum design as it allows flexibility, collaboration and, to a large extent, transparency and equality. However, there is little accountability as the lecturer holds the data. It offers reasonable adjustment and technology-enhanced learning, and fosters student engagement. Financially, the cost of this approach can be minimal as the software and hardware are provided through a free online service, which is linked to the lecturer, not the university (K1, K2, K3, K4, K5, K6).

Approach Three: removes all technology from the student and places the burden on the university and lecturer. Using apps which employ the mobile device camera, the lecturer scans a pre-printed voting card held by each student, who rotates the card to give the different answers A, B, C or D (1, 2, 3 or 4). Although this reduces the number of variables, like the students' devices and operating systems, it brings in additional issues, such as providing voting cards for each student (K1, K3, K6).

This approach offers inclusive curriculum design as it allows flexibility, accountability, collaboration and, to a large extent, transparency and equality. It also offers reasonable adjustment and technology-enhanced learning, and fosters student engagement. Financially, this approach can become costly due to the cost of licence agreements and printed artefacts (K1, K2, K3, K4, K5, K6).

Value of Voting Systems

Students' feedback on their current understanding of the presented materials is critical in informing lesson development, pace and future learning (Stewart et al. 2013). Knowing where additional support is needed can inform workshop and tutorial development or allow the teacher to provide additional real-world examples to foster the learners' understanding (Roth 2012). This supports the student engagement, collaboration, flexibility and anticipatory needs of inclusive curriculum design (K2, K3, K4, K5, K6). Typically this is achieved through class questioning and teacher observation; however, this becomes problematic in a large lecture theatre. The issue of Socratic teaching (engaging students with questions) within this context has been addressed by allowing the audience to post questions during or after the lecture, which can be answered by the lecturer as they pop up on the screen, or by other experts in real time. However, these posted questions can direct the lecture off topic, cause cognitive overload for the lecturer and often pose the same question in multiple ways, requiring a filtering process (K1, K2, K3, K4, K5). More importantly, the anonymity of the response promotes more authentic student engagement, as students are less likely to conform to peer pressure when responding privately (as explored in Asch's classic line experiment (Asch 1951)) (K1, K2, K3, K5).

A more controlled method of engaging students within their learning process is to pose carefully crafted question(s) to which the students respond (Cline et al. 2012). Based on the students' responses, the teacher makes an informed decision about moving on or readdressing the previous content (within the lecture or in the associated workshop/tutorial). The student responses could also be displayed back to the students, allowing them to compare their response to the group's. This allows the lecturer to make judgements about the constructive alignment of the students' understanding in relation to the learning objectives, as outlined by Biggs & Tang (2011, p.281) (K1, K2, K3, K4, K5, K6).
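The tally-and-display step of this feedback loop can be sketched in a few lines of Python. This is a minimal illustration, assuming responses arrive as a list of option letters; the function name and the A–D option set are assumptions, not taken from any particular voting product:

```python
from collections import Counter

def summarise_votes(responses, options=("A", "B", "C", "D")):
    """Tally raw student responses and return the percentage of
    the class choosing each option, ready to display back to the
    students for comparison against their own answer."""
    counts = Counter(responses)
    total = sum(counts[o] for o in options) or 1  # avoid division by zero
    return {o: round(100 * counts[o] / total, 1) for o in options}

# e.g. 20 responses to one multiple-choice question
votes = ["A"] * 12 + ["B"] * 5 + ["C"] * 3
print(summarise_votes(votes))  # {'A': 60.0, 'B': 25.0, 'C': 15.0, 'D': 0.0}
```

Displaying the returned percentages (rather than raw counts) keeps the comparison meaningful regardless of how many students voted.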

Voting Systems Analysis

There are many voting systems which can be integrated within the teaching and learning process. However, there are certain critical requirements which need to be considered when integrating mobile (phone and tablet) based voting systems into the learning process (K1, K2, K3, K4, K5). For example:

  • Student device ownership/bring your own device (inclusion/exclusion)
  • Power levels of the student's mobile device
  • Student required to install the app on their own device
  • Whether the app will work with the student's device (hardware and operating system)
  • Most apps require an internet connection for the student to register their vote
  • Most apps require the student to register before voting (setup time)
  • Teacher's computer requires the internet to view students' votes

Mobile device ownership in the USA stands at 85% of students, and similar results are presented for the UK, where mobile phone ownership is 93% across the UK (Ofcom 2015, pp.10, 65). However, is it reasonable to assume that all students own a mobile device which has the capacity to install a voting app and connect to the internet, or is there a case for reasonable adjustment? Wi-Fi access across the university is fairly standard, although there are some areas with poor or no access to the university's Wi-Fi. On the whole, it could be argued that reasonable adjustment has taken place in terms of access to free Wi-Fi. With reference to device ownership, a bank of tablet devices could be provided, allowing students to choose between using their own device (power levels, app installation etc.) or the university's device (K1, K2, K3, K4, K5, K6).

The issue around installing and using applications on mobile devices can be an emotive one, where, for example, students using iPhones cannot access a voting app designed to work on Android and Windows Mobile only. The university cannot govern this, as the app is not owned by the university. However, in recent years the sector has recognised this as an issue and there has been a shift towards multiple-operating-system software development. This leads into the “Camera vs Voting App” debate (K1, K5). The premise is that by using the camera there are no software or hardware conflicts, thus removing the voting app issue. Also, QR code reader apps are freely available across all mobile operating systems, and short hyperlinks (or hyperlinks embedded within the learning materials or VLE) can be provided for students who do not want to install additional software. Therefore, the student's mobile device needs only access to Wi-Fi, a free QR code reader and a camera. All of these requirements can be provided via the university tablet devices. This approach also removes the need to register or log into a voting account. However, it will not be possible to automatically monitor specific students' performance (K1, K2, K3, K4, K5, K6).

Application of Student Voting

The use of student voting is diverse and offers many opportunities to work more efficiently and effectively.

  • Poster Feedback (QR code – Google Form)

The students' posters were displayed along with a poster number and a QR code. Students were asked to review and provide feedback on the posters during a 30-minute period, choosing to use either their own device or a university iPad; all students elected to use the iPads provided by the university. The online review form mirrored the requirements presented to the students at the start of the assessment. Alongside this peer review was the teachers' marking, where four faculty members marked the posters using the same process. Because Google Forms was used, the data was anonymous, differentiated only by student or staff. By the end of the 30 minutes there were 120 responses across all posters. When the students returned to the classroom, the overall results were displayed via the projector, allowing students to compare their peer marking against the teacher marking; the two were very similar, increasing transparency and equitability within marking. The financial cost is very low, whilst offering technology-centred learning and different approaches to learning and assessment (K1, K2, K3, K4, K5, K6).

Due to the use of rating scales within the feedback form, the results were converted into grades, which reduced the marking workload. However, a recommendation would be to include a declaration statement which asks the students and staff to agree a marking boundary (fail, pass, merit and distinction) for the poster (K2, K4, K5).
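The rating-to-grade conversion described above might be sketched as follows. The 1–4 rating scale and the boundary values are illustrative assumptions only; as recommended, the actual boundaries should be agreed with staff and students before marking:

```python
def grade_from_rating(mean_rating,
                      boundaries=((3.5, "distinction"),
                                  (2.5, "merit"),
                                  (1.5, "pass"))):
    """Map a mean rating on an assumed 1-4 scale to a grade band.
    Boundaries are checked highest first; anything below the
    lowest threshold is a fail."""
    for threshold, grade in boundaries:
        if mean_rating >= threshold:
            return grade
    return "fail"

ratings = [4, 3, 4, 3, 4]           # peer/staff ratings for one poster
mean = sum(ratings) / len(ratings)  # 3.6
print(grade_from_rating(mean))      # distinction
```

Keeping the boundaries as a parameter means the agreed marking boundary can be changed without touching the conversion logic.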

  • Lesson Progression

Presenting questions to the students helps the lecturer to make judgements about the current knowledge and understanding of the class. However, these need to be carefully crafted to avoid drifting off point and to ensure that the student responses are specific and relevant to the taught materials (Cline et al. 2012). The questions can be presented at any point during the lecture. For example, asking students to respond to questions at the start of the lecture can act as a diagnostic tool (identifying areas where learning is already strong and therefore increasing the lesson pace) or as a baseline, followed by an end-of-lesson assessment to show how much progression the students feel they have made. Using the voting system within the mini-plenary process will also allow lecturers to gauge the lecture pace and knowledge transfer (K1, K2, K3, K4, K5, K6).
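The baseline-versus-plenary comparison can be reduced to a simple percentage-point calculation. A minimal sketch, assuming the lecturer has the count of correct responses at each point and the cohort size (the function name is hypothetical):

```python
def progression(pre_correct, post_correct, cohort_size):
    """Compare start-of-lecture (diagnostic/baseline) and
    end-of-lesson vote results, returning the gain in
    percentage points of the cohort answering correctly."""
    pre = pre_correct / cohort_size
    post = post_correct / cohort_size
    return round(100 * (post - pre), 1)

# 8 of 40 students answered correctly at the start, 31 by the plenary
print(progression(8, 31, 40))  # 57.5
```

A near-zero or negative figure would suggest readdressing the content in the associated workshop or tutorial rather than moving on.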

Within small groups there appears to be little advantage to using voting apps, as teacher questioning can be used effectively. However, through voting systems, students' progression can be tracked, allowing for the identification of students who are at risk. This is important for satisfaction levels, retention and belongingness whilst addressing student diversity (K2, K3, K4, K5, K6).

  • Lesson and Teaching Feedback

Using the voting system within the lecture provides valuable data which can be used to inform teaching, learning and planning. For example, in the final plenary students can indicate which topic(s) they want to focus on during the workshops, or request additional reading or support. This process can also be used to make informed judgements about the suitability of the examples and images used within the lecture, supporting elements of Thomas and May's (2010) typology for student diversity (K1, K2, K3, K4, K5, K6).

This could be extended to identify which areas need addressing or improving for a given lesson, helping to refine the learning process and maximise learning potential. Critical issues like cultural diversity and personal experience are likely to influence the relevance of explanations and images. This could also link to teaching style and approach (K2, K3, K5, K6).

  • Student Engagement

Controversially, voting apps can be used to track student engagement, in the same way as the university virtual learning environment. Knowing which students are actively engaged with the learning process and how successful they are (number of correct vs incorrect responses) can help to target support and intervention (Roth 2012). It could also be used to identify gifted and talented students and provide specific intervention, increasing the added-value potential and authentic learning experience of these students. However, this could also be used to identify students who will achieve highly within their dissertation, allowing staff to cream off the best students for themselves (K1, K2, K3, K4, K5, K6).
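Targeting support from correct-versus-incorrect response data might look like the sketch below. It assumes per-student data of the kind only approach one provides (votes linked to accounts); the 50% threshold and the data layout are illustrative assumptions:

```python
def flag_at_risk(results, threshold=0.5):
    """results: {student_id: (correct, total)} aggregated from a
    voting app that links votes to student accounts. Returns the
    ids of students whose success rate falls below the threshold,
    as candidates for targeted support or intervention."""
    at_risk = []
    for student, (correct, total) in results.items():
        if total and correct / total < threshold:
            at_risk.append(student)
    return sorted(at_risk)

results = {"s001": (9, 10), "s002": (3, 10), "s003": (4, 12)}
print(flag_at_risk(results))  # ['s002', 's003']
```

Any such use of identifiable response data would, of course, fall under the Data Protection Act considerations raised earlier.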

This information can also be used to tackle complaints and retention issues; see Student Voting and Teacher's Performance, below.

The Pitfalls of Student Voting

  • Student Voting and Teacher’s Performance (remote observation)

Arguably, student responses can indicate the teacher's effectiveness at explaining concepts and critical information. The responses can also indicate the level of student engagement and active learning within the teaching and learning process. These metrics can be data mined to create a normalised comparison of lecturers and the student experience (K1, K2, K4, K5, K6).
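One common way such a normalised comparison could be produced is via standard (z-) scores, so that lecturers teaching different cohorts sit on a common scale. This is only a sketch of one possible normalisation; the input here is assumed to be each lecturer's mean correct-response rate, which is not the only, or necessarily a fair, metric:

```python
from statistics import mean, stdev

def normalised_scores(raw):
    """Convert raw per-lecturer metrics into z-scores
    (distance from the group mean in standard deviations)."""
    mu, sigma = mean(raw.values()), stdev(raw.values())
    return {name: round((score - mu) / sigma, 2)
            for name, score in raw.items()}

raw = {"Lecturer A": 0.62, "Lecturer B": 0.71, "Lecturer C": 0.55}
print(normalised_scores(raw))
```

Even normalised, such figures conflate teaching quality with question difficulty and cohort mix, which is precisely why this use is flagged as a pitfall.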

  • Student Not Engaging Within The Learning Process

Using technology within the learning environment is always open to abuse. For example, using a laptop to type notes during lectures could lead to students updating their Facebook profile, checking their emails or placing bids on their eBay items, which all distract from the learning intention (K2, K3, K4, K5, K6).

  • Accessibility, Cost Of Software and Hardware

A university-controlled system would be very expensive to implement and maintain compared to using a third-party app and allowing the students to use their own mobile devices. In effect, the major expenses would be the Wi-Fi and the licence agreement for the voting software (K2, K4, K6).

  • Technology Failure

Technology failure, when allowing users to bring their own devices, would be focused on a) Wi-Fi issues, b) the third-party app and c) the student and their device. However, if the university implements a university-controlled system, additional issues arise, such as a) server access, b) software updates and security, c) software compatibility and d) usability and accessibility legal requirements (K5, K6).

Conclusion

In conclusion, the use of a student voting system offers many advantages which can be used to support student engagement, constructive alignment and an inclusive curriculum design. The critical factor in selecting the correct voting system is the data collected. For the university to data mine, allowing it to identify student support opportunities, the data needs to be linked to a specific student, which means using the university's own software or a third party's. Other issues, such as data ownership and teachers' performance, need to be considered and mitigated against. The most cost-effective and accessible method is approach two, which uses QR codes and Google Forms.

References

Asch, S.E., 1951. Effects of group pressure on the modification and distortion of judgments. Groups, leadership and men, pp.177–190.

Biggs, J. & Tang, C., 2011. Teaching For Quality Learning At University 4th ed., England: Open University Press. Available at: https://books.google.co.uk/books?id=XhjRBrDAESkC&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false.

CiCS, 2011. CiCS: Student Mobile Device Survey, Sheffield. Available at: https://www.sheffield.ac.uk/polopoly_fs/1.103665!/file/mobilesurvey2011.pdf .

Clickers: Beyond the Basics, 2016. Blog The Innovative Instructor. Johns Hopkins University, p.1. Available at: http://ii.library.jhu.edu/tag/clickers/ [Accessed May 15, 2016].

Cline, K. et al., 2012. Addressing Common Student Errors With Classroom Voting in Multivariable Calculus. PRIMUS, 23(1), pp.60–75. Available at: http://www.tandfonline.com/doi/abs/10.1080/10511970.2012.697098.

Dahlstrom, E. & Bichsel, J., 2014. ECAR Study of Undergraduate Students and Information Technology, Available at: https://net.educause.edu/ir/library/pdf/ss14/ERS1406.pdf.

Data Protection Act, 1998. Data Protection Act, Great Britain: legislation.gov.uk. Available at: http://www.legislation.gov.uk/ukpga/1998/29/contents.

Davenport, J. & Davenport, J.A., 1985. A Chronology and Analysis of The Andragogy Debate. Adult Education Quarterly, 35(3), pp.152–159. Available at: http://www.umsl.edu/~henschkej/henschke/more henschke_5_11_04/a_chronology_and_analysis_of_the_andragogy_debate.pdf.

Davies, S., 2014. Mobile device adoption in UK Higher Education, Manchester. Available at: http://www.elearning.eps.manchester.ac.uk/blog/2014/m-article-series-mobile-device-adoption-in-uk-he/ .

Delahaye, B.L., Limerick, D.C. & Hearn, G., 1994. The Relationship between Andragogical and Pedagogical Orientations and the Implications for Adult Learning. Adult Education Quarterly, 44(4), pp.187–200. Available at: https://core.ac.uk/download/files/310/10873874.pdf .

Harris, P., 2015. Pearson: Student Mobile Device Survey, Available at: http://www.pearsoned.com/wp-content/uploads/2015-Pearson-Student-Mobile-Device-Survey-College.pdf [Accessed May 15, 2016].

HEA, 2011. Framework Guidance Note 2: What are the UK Professional Standards Framework Descriptors?, London. Available at: https://www.heacademy.ac.uk/sites/default/files/downloads/what_are_the_uk_professional_standards_framework_descriptors.pdf.

Holmes, G. & Abington-Cooper, M., 2000. Pedagogy vs. Andragogy: A False Dichotomy? The Journal of Technology Studies, 26(2). Available at: https://scholar.lib.vt.edu/ejournals/JOTS/Summer-Fall-2000/holmes.html .

Knowles, M.S., Holton III, E.F. & Swanson, R.A., 2011. The Adult Learner 7th ed., California: Routledge. Available at: https://books.google.co.uk/books?id=urUVrB1hLKAC&printsec=frontcover#v=onepage&q&f=false.

Ofcom, 2015. Communications Market Report, London. Available at: http://stakeholders.ofcom.org.uk/binaries/research/cmr/cmr15/CMR_UK_2015.pdf.

Pollard, A.J., 2010. Professionalism and Pedagogy: a contemporary opportunity, Bristol: University of Bristol. Available at: http://www.tlrp.org/pub/documents/TLRPGTCEProf&Pedagogy.pdf.

PollEveryWhere.com, 2016. PollEveryWhere.com. About Us, p.1. Available at: https://www.polleverywhere.com/ [Accessed May 15, 2016].

Roth, K.A., 2012. Assessing Clicker Examples Versus Board Examples in Calculus. PRIMUS, 22(5), pp.353–364. Available at: http://www.tandfonline.com/doi/abs/10.1080/10511970.2011.623503.

Samaroo, S., Cooper, E. & Green, T., 2013. Pedandragogy: A way forward to self-engaged learning. New Horizons in Adult Education and Human Resource Development, 25(3), pp.76–90. Available at: http://doi.wiley.com/10.1002/nha3.20032.

Stewart, A., Storm, C. & VonEpps, L., 2013. Analyzing Student Confidence in Classroom Voting With Multiple Choice Questions. PRIMUS, 23(8), pp.718–732. Available at: http://www.tandfonline.com/doi/abs/10.1080/10511970.2013.801381.

What’s New with Clickers?, 2012. Blog The Innovative Instructor. Johns Hopkins University, p.1. Available at: http://ii.library.jhu.edu/tag/in-class-voting-system/ [Accessed May 15, 2016].

Wilson, C., 2006. No One Is Too Old To Learn: Neuroandragogy: A Theoretical Perspective on Adult Brain Functions and Adult Learning, iUniverse, Inc. Available at: https://books.google.co.uk/books?id=J2EGFaH19vUC&pg=PA96&dq=Knowles%E2%80%99+Andragogy&hl=en&sa=X&ved=0ahUKEwj357Sr-dzMAhXpAsAKHc4HAdcQ6AEINzAE#v=onepage&q=Knowles%E2%80%99 Andragogy&f=false.

 
