Voting Systems – Metrics in Teaching and Learning


The purpose of this theoretical case study is to explore the three main approaches to using voting systems within the teaching and learning process, and how the resulting metrics can be used to support an inclusive curriculum design.

The findings indicate that the critical factor in selecting the correct voting system is linked to the collected data. For the university to data-mine the results, allowing it to identify student support opportunities, the data needs to be linked to a specific student, which means using the university's own software or a third-party system.

In terms of value, the most cost-effective and accessible method is approach two, which uses QR codes and Google Forms. However, other issues such as data ownership and teachers' performance need to be considered and mitigated against.


Voting System, Metrics, Teaching and Learning, Pedagogy, Andragogy, Higher Education in Further Education, Inclusive Curriculum Design, Diversity, Fostering Learning in Large Groups


Using technology within the classroom has promised to revolutionise the teaching and learning process. From the humble calculator to the computer, the interactive whiteboard and the virtual learning environment, state schools have been serving this mantra for many years. However, this shift to technology driving teaching and learning has led to a paradigm shift in teaching, from teacher-centred (pedagogy) to student-centred (andragogy), as explored by Pollard (2010, p.11). The HEA (2011) framework supports this shift (K1). Setting aside the pedagogy and andragogy debate (Davenport & Davenport 1985; Delahaye et al. 1994; Holmes & Abington-Cooper 2000; Samaroo et al. 2013), each education sector has had to come to terms with the constant demand from students, governments and other stakeholders to integrate technology within the teaching and learning process. One such solution for integrating metrics within teaching and learning is the student voting system (K2).

Scope and Development

To limit the scope, a student voting system for this case study is any electronic system which allows students to cast a vote, although it is important to recognise that there are other voting tools, such as students raising their hands to agree or disagree with a question (K2, K3). Before the development and general acceptance of the 'bring your own device' movement, voting systems were integrated into the chair or table, or clickers were issued to every student. However, prolific use of mobile technology by students has provided a cheap alternative, where the university provides software (a phone app) and Wi-Fi and the students use their own mobile device (K4).

While this case study will refer to theories and practices such as Thomas and May's (2010) four-pronged typology for student diversity (educational, dispositional, circumstantial, cultural) and theories of inclusive design, it will not exclusively refer to particular theories (K2, K3, K5). Instead, key terms and concepts will be used, where appropriate, within the case study. The principal higher education thematic area of this case study is fostering learning within large groups.

Common Approaches to Mobile Voting

Approach One: Most mobile voting apps require the students to install the app (software) onto their personal phone before they can place a vote. Once installed, the student needs to link their account with the university; often this is achieved by activating an account provided by the university. The student can then use the voting software (Wi-Fi dependent) to interact within the learning experience. The advantage of this approach is that individual students' performance can be monitored and tracked during the module, year or length of the course. As a side benefit, the students' engagement and the teacher's ability to share new knowledge in a meaningful way can be gauged (K2, K3, K4). However, this system is dependent on the student, their technology and their willingness to engage with a system which allows the university to track and monitor their performance. Linked to this is how the university and teachers actually use the collected data, with respect to the Data Protection Act (1998).

Often this approach does not integrate fluently within the presentation process, requiring the lecturer to switch between different software and interfaces to view the incoming data, making the lesson feel disjointed (K1, K2, K3, K4). However, more recent attempts, such as PollEverywhere (2016), have provided plugins which allow the student responses to be displayed directly in presentation software, like MS PowerPoint. This approach is focused on student-centred technology which promotes greater independence and thus a student-centred learning approach which supports the andragogical demands (K2, K3, K4, K5).

This approach offers inclusive curriculum design as it allows flexibility, accountability, collaboration and, to a large extent, transparency and equality. It also offers reasonable adjustment and technology-enhanced learning, and fosters student engagement. Financially, this approach can become costly due to the constant updating of the app and server-related hardware (K1, K2, K3, K4, K5, K6).

Approach Two: can be considered a quick and dirty approach as it relies on online survey software, for example Survey Monkey, Google Forms or quizzes via the VLE. A principal issue with this approach is presenting the question behind a login function, such as using the VLE. Although this allows the university to track which students have engaged with the process, it does not necessarily provide an instant whole-class graphical profile of the student responses (K2, K3, K4, K5, K6). Also, students quickly disengage with the process, as they dislike having to log in to a system to cast their vote or response (K1, K2, K5, K6). However, software like Google Forms offers an instant profile of all responses, allowing immediate feedback to the teacher and students. This approach does not require additional software to be installed on the students' mobile device, as a web link can be provided via the learning materials, the VLE or a QR code.
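
As a rough sketch of what this approach yields in practice: form tools such as Google Forms can export their responses as CSV, and a whole-class profile can be produced by tallying the answer column. The column name `Q1` and the export layout below are illustrative assumptions, not a fixed Google Forms schema.

```python
import csv
import io
from collections import Counter

def tally_votes(csv_text, question="Q1"):
    """Count responses per option from a (hypothetical) survey CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row[question] for row in reader if row.get(question))

# Illustrative export: a timestamp column plus one multiple-choice question.
export = "Timestamp,Q1\n10:01,A\n10:01,B\n10:02,A\n10:02,C\n10:03,A\n"
print(tally_votes(export))  # option A leads with 3 votes
```

Because the tally is anonymous, this mirrors the approach's trade-off: an instant class profile, but no link back to individual students.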

This approach offers inclusive curriculum design as it allows flexibility, collaboration and, to a large extent, transparency and equality. However, there is little accountability, as the lecturer holds the data. It offers reasonable adjustment and technology-enhanced learning, and fosters student engagement. Financially, the cost of this approach is minimal as the software and hardware are provided through a free online service, which is linked to the lecturer, not the university (K1, K2, K3, K4, K5, K6).

Approach Three: removes all technology from the student and places the burden on the university and lecturer. Using apps which employ the mobile device camera and pre-printed voting cards, the lecturer scans the voting card of each student, who rotates the card to give a different answer: A, B, C or D (1, 2, 3 or 4). Although this reduces the number of variables, like the students' devices and operating systems, it brings in additional issues, such as providing voting cards for each student (K1, K3, K6).

This approach offers inclusive curriculum design as it allows flexibility, accountability, collaboration and, to a large extent, transparency and equality. It also offers reasonable adjustment and technology-enhanced learning, and fosters student engagement. Financially, this approach can become costly due to the cost of licence agreements and printed artefacts (K1, K2, K3, K4, K5, K6).

Value of Voting Systems

Students' feedback on their current understanding of the presented materials is critical in informing lesson development, pace and future learning (Stewart et al. 2013). Knowing where additional support is needed can inform workshop and tutorial development, or allow the teacher to provide additional real-world examples to foster the learners' understanding (Roth 2012). This supports student engagement, collaboration, flexibility and the anticipatory needs of inclusive curriculum design (K2, K3, K4, K5, K6). Typically this is achieved through class questioning and teacher observation; however, this becomes problematic in a large lecture theatre. The issue of Socratic teaching (engaging students with questions) within this context has been addressed by allowing the audience to post questions during or after the lecture, which can be answered by the lecturer as they pop up on the screen, or by other experts in real time. However, these posted questions can take the lecture off topic, cause cognitive overload for the lecturer, and often the same question can be posed in multiple ways, requiring a filtering process (K1, K2, K3, K4, K5). More importantly, preserving student anonymity promotes more authentic student engagement, as students are less likely to conform to peer pressure when their response is private (as explored in Asch's classic line experiment (Asch 1951)) (K1, K2, K3, K5).

A more controlled method of engaging students within their learning process is to pose carefully crafted question(s) to which the students respond (Cline et al. 2012). Based on the students' responses, the teacher makes an informed decision about moving on or readdressing the previous content (within the lecture or in the associated workshop/tutorial). The student responses could also be displayed back to the students, allowing them to compare their response to the group's, and allowing the lecturer to make judgements about the constructive alignment of the students' understanding in relation to the learning objectives, as outlined by Biggs & Tang (2011, p.281) (K1, K2, K3, K4, K5, K6).

Voting Systems Analysis

There are many voting systems which can be integrated within the teaching and learning process. However, certain critical requirements need to be considered when integrating mobile (phone and tablet) based voting systems into the learning process (K1, K2, K3, K4, K5). For example:

  • Student device ownership/ bring your own device (inclusion/ exclusion)
  • Power levels of the student's mobile device
  • The student is required to install an app on their own device
  • Will the app work with the student's device (hardware and operating system)?
  • Most apps require an internet connection for the student to register their vote
  • Most apps require the student to register before voting (setup time)
  • The teacher's computer requires an internet connection to view the students' votes

Mobile device ownership among students in the USA is 85%, and similar results are presented for the UK, where mobile phone ownership is 93% (Ofcom 2015, pp.10, 65). However, is it reasonable to assume that all students own a mobile device which has the capacity to install a voting app and connect to the internet, or is there a case for reasonable adjustment? Wi-Fi access across the university is fairly standard, although there are some areas with poor or no access to the university's Wi-Fi. On the whole, it could be argued that reasonable adjustment has taken place in terms of access to free Wi-Fi. With reference to device ownership, a bank of tablet devices could be provided, allowing students to choose between using their own device (power levels, app installation etc.) or the university's device (K1, K2, K3, K4, K5, K6).

The issue around installing and using applications on mobile devices can be an emotive one, for example where students using iPhones cannot access the voting app because it is designed to work on Android and Windows Mobile only. The university cannot govern this, as the app is not owned by the university. However, in recent years the sector has recognised this as an issue and there has been a shift towards multi-platform software development. This leads into the "Camera vs Voting App" debate (K1, K5). The premise is that by using the camera there are no software or hardware conflicts, thus removing the voting app issue. Also, QR code reader apps are freely available across all mobile operating systems, and short hyperlinks (or hyperlinks embedded within the learning materials or VLE) can be provided for students who do not want to install additional software. Therefore, the student's mobile device only needs access to Wi-Fi, a free QR code reader and a camera. All of these requirements can be provided via the university tablet devices. This approach also removes the need to register or log into a voting account. However, it will not be possible to automatically monitor specific students' performance (K1, K2, K3, K4, K5, K6).

Application of Student Voting

The use of student voting is diverse and offers many opportunities to work more efficiently and effectively.

  • Poster Feedback (QR code – Google Form)

The students' posters were displayed along with a poster number and a QR code. Students were asked to review and provide feedback on the posters during a 30-minute period, choosing to use either their own device or a university iPad; all students elected to use the iPads provided by the university. The online review form mirrored the requirements presented to the students at the start of the assessment. Alongside this peer review was the teachers' marking, where four faculty members marked the posters using the same process. Because Google Forms was used, the data was anonymous, differentiated only by student or staff. By the end of the 30 minutes there were 120 responses across all posters. When the students returned to the classroom, the overall results were displayed via the projector, allowing students to compare their peer marking against the teacher marking; the two were very similar, increasing transparency and equitability within marking. The financial cost is very low whilst offering technology-centred learning and different approaches to learning and assessment (K1, K2, K3, K4, K5, K6).
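
The peer-versus-staff comparison described above can be sketched as a simple aggregation. The record shape (poster number, reviewer role, rating out of 5) and the figures are invented for illustration; a real form export would need cleaning first.

```python
from statistics import mean

# Hypothetical response records: (poster number, reviewer role, rating out of 5).
responses = [
    (1, "student", 4), (1, "student", 5), (1, "staff", 4),
    (2, "student", 3), (2, "student", 2), (2, "staff", 3),
]

def mean_by_role(poster, role):
    """Mean rating a given poster received from one reviewer group."""
    scores = [r for p, who, r in responses if p == poster and who == role]
    return mean(scores) if scores else None

for poster in (1, 2):
    print(poster, mean_by_role(poster, "student"), mean_by_role(poster, "staff"))
```

Displaying the two columns side by side is what lets the class judge how closely peer marking tracked staff marking.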

Due to the use of rating scales within the feedback form, the results were converted into grades, which reduced the marking workload. However, a recommendation would be to include a declaration statement which asks the students and staff to agree a marking boundary (fail, pass, merit and distinction) for the poster (K2, K4, K5).
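
The rating-to-grade conversion might look like the sketch below. The thresholds are purely illustrative, standing in for whatever boundaries the students and staff agree.

```python
def rating_to_grade(mean_rating,
                    boundaries=((3.5, "distinction"), (3.0, "merit"), (2.0, "pass"))):
    """Map a mean rating (out of 5) to a grade band; thresholds are illustrative."""
    for threshold, grade in boundaries:  # checked highest band first
        if mean_rating >= threshold:
            return grade
    return "fail"

print(rating_to_grade(3.2))  # "merit"
```

Making the boundaries an explicit parameter is one way to implement the recommended declaration statement: the agreed values are recorded up front rather than decided after the marks are in.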

  • Lesson Progression

Presenting questions to the students helps lecturers make judgements about the current knowledge and understanding of the class. However, these need to be carefully crafted to avoid drifting off point and to ensure that the student responses are specific and relevant to the taught materials (Cline et al. 2012). The questions can be presented at any point during the lecture. For example, asking students to respond to questions at the start of the lecture can act as a diagnostic tool (identifying areas where learning is already high so the lesson pace can be increased) or as a baseline, followed by an end-of-lesson assessment to show how much progression the students feel they have made. Using the voting system within the mini-plenary process will also allow lecturers to gauge the lecture pace and knowledge transfer (K1, K2, K3, K4, K5, K6).
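
The baseline-versus-plenary comparison reduces to simple per-topic differences. The topics and shares of correct votes below are invented to show the shape of the calculation.

```python
# Hypothetical per-topic shares of correct votes at the start and end of a lecture.
baseline = {"loops": 0.40, "recursion": 0.25}
end_of_lesson = {"loops": 0.80, "recursion": 0.45}

def progression(before, after):
    """Gain per topic between the diagnostic vote and the plenary vote."""
    return {topic: round(after[topic] - before[topic], 2) for topic in before}

print(progression(baseline, end_of_lesson))
```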

Within small groups there appears to be little advantage to using voting apps, as teacher questioning can be used effectively. However, through voting systems students' progression can be tracked, allowing for the identification of students who are at risk. This is important for satisfaction levels, retention and belongingness whilst addressing student diversity (K2, K3, K4, K5, K6).

  • Lesson and Teaching Feedback

Using the voting system within the lecture provides valuable data which can be used to inform teaching, learning and planning. For example, in the final plenary students can indicate which topic(s) they want to focus on during the workshops, or request additional reading or support. This process can also be used to make informed judgements about the suitability of the examples and images used within the lecture, supporting elements of Thomas and May's (2010) typology for student diversity (K1, K2, K3, K4, K5, K6).

This could be extended to identify which areas need addressing or improving for a given lesson, helping to refine the learning process and maximise learning potential. Critical issues like cultural diversity and personal experience are likely to influence the relevance of explanations and images. This could also link to teaching style and approach (K2, K3, K5, K6).

  • Student Engagement

Controversially, voting apps can be used to track student engagement, in the same way as the university virtual learning environment. Knowing which students are actively engaged with the learning process and how successful they are (number of correct vs incorrect responses) can help to target support and intervention (Roth 2012). It could also be used to identify gifted and talented students and provide specific intervention, increasing the added-value potential and authentic learning experience of these students. However, it could equally be used to identify students who are likely to achieve highly in their dissertation, allowing staff to cream off the best students for themselves (K1, K2, K3, K4, K5, K6).
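
A minimal sketch of the targeting idea, assuming per-student counts of correct and incorrect votes; the 50% cut-off and the data shape are assumptions for illustration only.

```python
def at_risk(record, threshold=0.5):
    """Flag students whose share of correct votes falls below the threshold.
    Students with no recorded votes are skipped rather than flagged."""
    flagged = []
    for student, (correct, incorrect) in record.items():
        total = correct + incorrect
        if total and correct / total < threshold:
            flagged.append(student)
    return flagged

votes = {"s1": (8, 2), "s2": (3, 7), "s3": (0, 0)}
print(at_risk(votes))  # ["s2"]
```

Note that a student with no votes at all (s3) is arguably the strongest disengagement signal, which is why such a rule would need refining in practice.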

This information can also be used to tackle complaints and retention issues, see student voting and teacher’s performance, below.

The Pitfalls of Student Voting

  • Student Voting and Teacher’s Performance (remote observation)

Arguably, student responses can indicate the teacher's effectiveness in explaining concepts and critical information. The responses can also indicate the level of student engagement and active learning within the teaching and learning process. These metrics can be data-mined to create a normalised comparison of lecturers and the student experience (K1, K2, K4, K5, K6).
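
One common way to build such a normalised comparison is a z-score over each lecturer's mean correct-response rate. The figures below are invented, and a real comparison would need far more care (different cohorts, question difficulty, class sizes); this only sketches the mechanics.

```python
from statistics import mean, pstdev

def z_scores(scores):
    """Normalise lecturers' mean-correct-response rates for comparison."""
    mu, sigma = mean(scores.values()), pstdev(scores.values())
    return {name: round((value - mu) / sigma, 2) for name, value in scores.items()}

print(z_scores({"lecturer_a": 0.7, "lecturer_b": 0.5, "lecturer_c": 0.6}))
```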

  • Student Not Engaging Within The Learning Process

Using technology within the learning environment is always open to abuse. For example, using a laptop to type notes during lectures could lead to students updating their Facebook profile, checking their emails or placing bids on their eBay items, which all distract from the learning intention (K2, K3, K4, K5, K6).

  • Accessibility, Cost Of Software and Hardware

A university controlled system would be very expensive to implement and maintain compared to using a third party app and allowing the students to use their own mobile device.  In effect, the major expense would be the Wi-Fi and the licence agreement of the voting software (K2, K4, K6).

  • Technology Failure

Technology failure, when allowing users to bring their own device, centres on a) Wi-Fi issues, b) the third-party app and c) the student and their device. However, if the university implements a university-controlled system, additional issues arise, such as a) server access, b) software updates and security, c) software compatibility and d) usability and accessibility legal requirements (K5, K6).


In conclusion, the use of a student voting system offers many advantages which can be used to support student engagement, constructive alignment and an inclusive curriculum design. The critical factor in selecting the correct voting system is linked to the collected data. For the university to data-mine the results, allowing it to identify student support opportunities, the data needs to be linked to a specific student, which means using the university's own software or a third-party system. Other issues, such as data ownership and teachers' performance, need to be considered and mitigated against. The most cost-effective and accessible method is approach two, which uses QR codes and Google Forms.


Asch, S.E., 1951. Effects of group pressure on the modification and distortion of judgments. Groups, leadership and men, pp.177–190.

Biggs, J. & Tang, C., 2011. Teaching For Quality Learning At University, 4th ed. England: Open University Press.

CiCS, 2011. CiCS: Student Mobile Device Survey. Sheffield.

Clickers: Beyond the Basics, 2016. Blog: The Innovative Instructor. Johns Hopkins University. [Accessed May 15, 2016].

Cline, K. et al., 2012. Addressing Common Student Errors With Classroom Voting in Multivariable Calculus. PRIMUS, 23(1), pp.60–75.

Dahlstrom, E. & Bichsel, J., 2014. ECAR Study of Undergraduate Students and Information Technology.

Data Protection Act, 1998. Data Protection Act. Great Britain.

Davenport, J. & Davenport, J.A., 1985. A Chronology and Analysis of The Andragogy Debate. Adult Education Quarterly, 35(3), pp.152–159.

Davies, S., 2014. Mobile device adoption in UK Higher Education. Manchester.

Delahaye, B.L., Limerick, D.C. & Hearn, G., 1994. The Relationship between Andragogical and Pedagogical Orientations and the Implications for Adult Learning. Adult Education Quarterly, 44(4), pp.187–200.

Harris, P., 2015. Pearson: Student Mobile Device Survey. [Accessed May 15, 2016].

HEA, 2011. Framework Guidance Note 2: What are the UK Professional Standards Framework Descriptors? London.

Holmes, G. & Abington-Cooper, M., 2000. Pedagogy vs. Andragogy: A False Dichotomy? The Journal of Technology Studies, 26(2).

Knowles, M.S., Holton III, E.F. & Swanson, R.A., 2011. The Adult Learner, 7th ed. California: Routledge.

Ofcom, 2015. Communications Market Report. London.

Pollard, A.J., 2010. Professionalism and Pedagogy: a contemporary opportunity. Bristol: University of Bristol.

PollEverywhere, 2016. About Us. [Accessed May 15, 2016].

Roth, K.A., 2012. Assessing Clicker Examples Versus Board Examples in Calculus. PRIMUS, 22(5), pp.353–364.

Samaroo, S., Cooper, E. & Green, T., 2013. Pedandragogy: A way forward to self-engaged learning. New Horizons in Adult Education and Human Resource Development, 25(3), pp.76–90.

Stewart, A., Storm, C. & VonEpps, L., 2013. Analyzing Student Confidence in Classroom Voting With Multiple Choice Questions. PRIMUS, 23(8), pp.718–732.

What's New with Clickers?, 2012. Blog: The Innovative Instructor. Johns Hopkins University. [Accessed May 15, 2016].

Wilson, C., 2006. No One Is Too Old To Learn: Neuroandragogy: A Theoretical Perspective on Adult Brain Functions and Adult Learning. iUniverse, Inc.


Mobile Devices: Getting A Bit More From The Battery

In the last five years we have seen an explosion of mobile devices within the general population, so much so that it is estimated that the typical British teenager owns six mobile devices, with 84% owning a smartphone (BBC 2013). Whereas my generation grew up with vinyl, cassettes and the ZX Spectrum, this generation is firmly plugged into connectivity and instant gratification, where operating technologies such as the touchscreen computer is as second nature to them as feeding their addiction by looking for charging points.

As a business we can harness this almost inherent link to touchscreen devices to reduce staff training and development costs by converting our systems to mirror the operating systems found on these devices (iOS and Android). By taking this pathway we reduce our ICT hardware costs, as these devices start at £50, unlike laptops and desktop computers. When combined with the development of online software such as MS Office 365 and Google Documents, businesses and organisations no longer require local installation of software, only a platform to view and interact with the internet.

However, there are some key issues, well, one key issue: that of power. For touchscreen tablets used in a fixed location, such as in the office or other back rooms, it is simply a matter of having the device plugged into the power socket. However, roaming is much more problematic, especially when roaming is offsite. The quickest and simplest solution for roaming on site is to provide a recharging bank which offers fully charged batteries or devices. Offsite is somewhat harder: spare batteries offer a short-term fix, but they are by no means a true solution.


Recent research by Carroll and Gernot (2010) and Perrucci, Fitzek and Widmer (2011) suggests how smart design and intelligent usage can extend battery life. Please note that this is not a definitive explanation of all the influencing factors on battery life. These findings show that if you need a wireless connection you should use Wi-Fi rather than 3G, as you get an extra 3.5 megabytes of data for the same 1400 mW of power. However, the frequency of remote connection needs to be reduced to a minimum, and where possible a hard-docking, synchronising and charging approach should be used to remove the need for wireless connections altogether. The brightness of the screen should be lowered to about 60% intensity, saving 273 mW for a white background and 161 mW for a black background; this suggests that interface design should use darker colours to reduce power consumption. Finally, data should be written to flash memory, as this saves 22 mW for every megabyte stored.
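
The figures above can be combined into a rough saving estimate. Treat this as illustrative arithmetic on the quoted numbers, not a battery model.

```python
# Power savings quoted above (Carroll & Gernot 2010; Perrucci, Fitzek & Widmer 2011,
# as cited in the text). Illustrative magnitudes only.
BRIGHTNESS_SAVING_MW = {"white": 273, "black": 161}  # lowering screen to ~60%
FLASH_SAVING_MW_PER_MB = 22                          # writing data to flash memory

def estimated_saving_mw(background, megabytes_written):
    """Combined saving from dimming the screen and writing to flash."""
    return BRIGHTNESS_SAVING_MW[background] + FLASH_SAVING_MW_PER_MB * megabytes_written

print(estimated_saving_mw("white", 5))  # 273 + 22*5 = 383
```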

Understanding Mobile Apps: Is There A Difference?


We all use apps (applications) on our phones or touchpads, but have you ever thought about how they differ? There are many ways in which we can classify or group apps, but I want to explore how we group apps in terms of accessing and storing data. (For the full article see Understanding Mobile Apps.)

This means that apps can be grouped into three categories:

  1. Native App
  2. Integrated App
  3. Web App

The Native App

Installation: The native app is downloaded and installed on your device, and everything it needs and saves is stored on the device. For example, an alarm clock, phone lock or sending a text message.

Advantages: The app does not need to access the internet but has access to all functions and resources (hardware and software) on the device. It can be described as 'install and forget'.

Disadvantages: The biggest issue with this type of app is that it is not updated unless the user manually undertakes this process. This in turn opens the app to security risks as coding practices change to counter hacking threats.

Quick Start: PhoneGap and RhoMobile

The Integrated App

Installation: This is the middle ground, as the app is installed on the device, giving access to all the functions and resources, whilst allowing an "occasional connection" to the internet to update the software or stored data. Now, the term occasional is a little misleading. For example, the Facebook app requires almost constant access to the internet to allow for regular updates, whereas software used to unlock the phone may check for updates every ten weeks. So we could say that the unlock software is more like a native app and the Facebook app is more like a web app.

Advantages: The app can use all the functions and resources of the device while being able to synchronise with a remote site. For example, fitness apps use the device's global positioning system (GPS) to track your position and Bluetooth to connect with external devices such as a heart monitor. This data is stored locally (on the device) during exercise and synchronised to a website afterwards. Once synchronised to the remote location, the data is used to provide a vast array of information about the training event.

Disadvantages: There are three main issues with this approach. The first is the synchronisation of multiple uses of the same record (data): which version is correct? The second is the security of data during transfer (hacking), although encryption can reduce this risk. Finally, any changes to the remote server will result in conflict issues with the installed app, especially if the database configuration is changed.

Quick Start: PhoneGap and RhoMobile

The Web App

Installation: This is little more than an icon and a weblink to the webpage which you want to display. Think of it as a window in which a website is displayed; as such, the app requires constant access to the internet. In effect, you are viewing a webpage in a web browser without the address bar and toolbars.

Advantages: With this approach it takes less than five minutes to create an app, and all functions and resources are provided via the website, not the device, reducing issues linked to synchronisation, software updating and data security.

Disadvantages: You need to be connected to the internet, which is a trade-off between internet availability (lack of signal or strength) and the cost of internet roaming.

Quick Start: AppsGeyser


Customer Relationships, Management or Managed?

Traditionally, customer relationships were managed through software tools, marketing and advertising. This was the era of 'I have a product and you need it.' For example, when mobile phones first came within the reach of the general public, it was the business sector which embraced them. Now 94% of the British adult population own a mobile communication device of some description (Ofcom 2013).

It has been argued by many, but Greenberg (2009) most clearly describes how this relationship between the customer and the company/organisation has now migrated into a new era: the era of the enlightened consumer, or "Social Customer". He makes his point by highlighting how the customer ecosystem has changed due to social media and tools which are freely available to consumers.

This new era (CRM 2.0) of the social customer has resulted in a power transfer from the company/organisation to the customer/consumer. The new mantra is 'tell us what you want so we can sell it to you.' This has led to a dynamic supply chain which is able to adjust to customer needs. For example, when purchasing a new car the customer can now select the body colour and personalised images, the internal finishing, and electrical goods such as the music player and air conditioning, resulting in a customised product and personalised experience.

CRM 2.0 is a philosophy & a business strategy, supported by a system and a technology, designed to engage the customer in a collaborative interaction that provides mutually beneficial value in a trusted & transparent business environment (CRM 2.0)

Why bother to manage the customer relationship, I hear you ask? Simple: Facebook! For example, I have a negative experience at my local branch and I update my status as I am waiting to resolve my problem. Two minutes later I update my Facebook wall again, indicating that the company is rubbish and that none of my friends should shop there. I have an average of 130 friends (Facebook 2013) who have similar interests to me, and they each have 130 friends who have similar interests to them, and so on. Before I have left the shop, 16,900 people have potentially seen my comments about the shop and my experience.
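
The 16,900 figure is just 130 friends-of-friends squared. As a sketch, the upper bound at any depth is:

```python
def potential_reach(avg_friends=130, degrees=2):
    """Upper-bound audience if every friend-of-friend could see the post.
    Ignores overlapping friendship circles, so real reach is far smaller."""
    return avg_friends ** degrees

print(potential_reach())  # 130 * 130 = 16,900
```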

Facebook (2013) has shown that friends amplify and echo each other's emotions and views, which means that my friends will post negative comments about their previous experiences at the shop, or recount someone else's experience to support my post. So much for developing a 'collaborative relationship based on trust and transparent business.'

Gone are the days when a customer complaint is managed (lost) within the company's formal complaint process and the customer feels powerless. This new era of CRM 2.0 requires companies to develop new strategies to maintain and develop the relationship with their current and future customers. Data-mining tools within Facebook allow companies to quickly identify comments which include their name and the associated sentiment (negative, neutral, positive), allowing them to respond quickly via a Facebook comment to maintain and reinforce trust and a transparent business process.
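
The sentiment tagging described above can be caricatured as a keyword lookup. Real data-mining tools are far more sophisticated; this deliberately naive sketch, with invented word lists, only shows the negative/neutral/positive classification step.

```python
# Toy keyword lists; a real system would use trained models, not word matching.
NEGATIVE = {"rubbish", "terrible", "avoid"}
POSITIVE = {"great", "love", "recommend"}

def sentiment(comment):
    """Classify a comment by counting matched positive and negative keywords."""
    words = set(comment.lower().split())
    neg, pos = len(words & NEGATIVE), len(words & POSITIVE)
    if neg > pos:
        return "negative"
    if pos > neg:
        return "positive"
    return "neutral"

print(sentiment("this company is rubbish"))  # "negative"
```

A company monitoring its name alongside such a classifier could route negative posts to a response team within minutes, which is exactly the rapid-reply behaviour described above.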

Who is in charge, the customer or the organisation? Or does the organisation control the relationship, or merely manage it?