Quick and Dirty View of an Artefact Project

This is only intended to help you conceptualise the requirements of a project. Please do not treat it as a complete template: a lot of project requirements are missing, so relying on it will lose you marks or could even fail you.


Title: Improve whole-school student attendance through targeted policy change
Context/Background: Schools are judged by Ofsted and parents using an array of indicators which influence the school's league-table ranking. Whole-school attendance is one of these indicators; it influences not only the stakeholders' judgement (Campbell, 2013) but students' day-to-day behaviour within the classroom (Campbell, 2014). Research has shown that tackling attendance increases the level of students' positive behaviour and learning (Campbell, 2015), which results in increased student performance (Campbell, 2016). This in turn impacts on the school's league-table ranking.
Problem: Currently, whole-school student attendance (83%) is lower than the 95% statutory requirement (DoE, 2017).
Impact: Students are missing literacy and numeracy lessons, which means they are falling short of their expected learning targets, which will influence the school's league-table ranking.
Aim: Create a new attendance policy and associated procedures which identify and tackle absenteeism on a daily basis.
Solution: The creation of a new attendance policy based on best practice. For example, office staff phone parents by 9:30am to identify why their child is not in school. After two phone calls home, the child is referred to the educational welfare officer with a view to issuing an attendance fine.
Objectives

  1. To investigate best practice in tackling absenteeism
  2. To examine how a new system could be implemented
  3. To explore how software can be used to support the attendance policy procedures, tracking and monitoring.
  4. To document the attendance policy procedures in line with the new absenteeism policy
Deliverables

  1. Write a literature review covering absenteeism best practice and policy design
  2. Create a project plan in the form of a Gantt chart
  3. Create a risk assessment plan
  4. Produce policy procedures for staff to follow
  5. Manufacture a prototype of the ‘software’ to test the tracking and monitoring process of attendance.
  6. Demonstrate that the new approach to absenteeism is more effective

Project Success Criteria

  1. The attendance figure will improve to 95%
  2. Students will have more assessment evidence in their subject books
  3. Students' learning levels will start to improve

Project Scope

The principal actors of the new attendance policy and the associated system will be the school's administration team. Data will be transferred to other stakeholders such as teachers and the school leadership team, but they will not have any day-to-day interaction. There are two critical constraints: a) time availability at the start of the school day and b) access to the tracking software. (Note, some students prefer to mindmap their scope.)

Project Resources

  • Hardware
    • Bullet point list of all required hardware
    • Based on all the software demands
  • Software
    • Bullet point list of all required software
  • Electronics
    • Bullet point list of all required electronics
      • Camera
      • Specialist equipment
  • Library
    • Bullet point list of all required library needs
    • State what topics you will need access to
Project Management Approach 
  • Justify why your chosen approach is the best management approach for your project
  • Linear (waterfall) or non-linear (agile) management style
    • You may use a different approach during the artefact creation
Project Plan
  • Gantt Chart and narrative of time scale
    • Milestones are reference points (literature review), they are not tasks
    • Must have sub-milestones, which are tasks positioned within a milestone.
    • All milestones and sub-milestones must have a predecessor identified, where appropriate.
Project Risk Assessment
  • You must state your fall back position if your project fails e.g. the system will be restored to its current state. 
  • Provide a list of risks (resulting in time slippage or project failure) which demonstrates you understand your project
  • Include the likelihood of the risk and the severity of the impact if it happens. 
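One common way to combine likelihood and severity (an illustrative technique, not a mandated format for this assignment) is to score each on a 1-5 scale and multiply them to give a risk rating, so the register can be ranked. A minimal sketch, with made-up example risks:

```python
# A minimal risk-register sketch: likelihood and severity scored 1-5,
# risk rating = likelihood x severity (the example risks are illustrative only).
risks = [
    ("Office staff unavailable at 9:30am", 3, 4),
    ("Tracking software access denied",    2, 5),
    ("Parents' phone numbers out of date", 4, 3),
]

# Rank the register so the highest-rated risks are mitigated first.
for name, likelihood, severity in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
    print(f"{name}: rating {likelihood * severity}")
```

The same table can of course be kept in a spreadsheet; the point is that each risk carries both scores and a combined rating.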
Research Approach
  • You need to use specific keywords to outline your research approach
  • You do not need to define or reference them, just use them in a sentence. 
  • This section will be a small paragraph 
  • e.g. A positivist-deductive perspective, with a mono-method cross-sectional approach will be used to….
Secondary Data
  • Approaches to improving whole school attendance within the UK
  • Approaches to improving whole school attendance outside the UK
  • What is the best practice for writing a whole school policy
  • What design features will increase accessibility and usability of the new attendance policy.
Primary Data
  • Whole school attendance figures (before/ after)
  • Number of students who are persistently absent (before/ after)
  • Classify persistently absent students by risk level (high, medium, low)
  • Number of phone calls home per child
  • Number of referrals to the education welfare office per child
  • Teachers book scan data (evidence of student work) (before/ after)
  • Student assessment level identified by the teacher (before/ after)
Ethical Consideration
  • See this link for details – Ethical and Legal Factors
    • Data Protection (GDPR)
    • Ethical Framework
    • Consider the impact on user and organisation
  • Complete a proportionate ethical form 
  • Create a participant ethical statement (this is the statement which the respondent will agree to just before they answer your questions)
  • Work-based Learners & Apprenticeship students need written confirmation from the workplace approving the project.
Artefact Success Criteria
  • Create a list of indicators that when achieved will demonstrate that your artefact is successful
  • Identify each success criterion with a unique number
  • The success criteria must link back to your project success criteria
    1. Identify the number of students not attended (whole school attendance rate)
    2. Identify the number of phone calls home (reducing number of phone calls supports an increase in students attendance)
    3. Identify the number of referrals to the educational welfare officer (reducing number of referrals supports an increase in students attendance)
Artefact Requirements
  • Explicitly identify everything that the artefact will do.
  • What key information from the secondary research will you include in your new attendance policy
  • What key information from the primary research will you include in your new attendance policy
  • Identify each requirement with a unique number and link to your stated artefact success criteria (and because your artefact success criteria are linked to your project success criteria you are closing the circle)
Testing/ Validating
  • Use focus groups with senior managers to identify their understanding of what a new attendance policy will include and look like (and map across to secondary research).
  • Check to make sure that the employees' job roles allow for the change/ increase in workload (and get this signed off by senior management)
  • Make any changes to the new attendance policy and send out to all stakeholders requesting feedback.
  • Have office staff enter data into the data tracking tool and act on their feedback.
  • Make any further changes based on feedback and get final approval from senior management
  • Demonstrate (using data) that your artefact requirements have been achieved successfully
Test/ Pilot Study
  • Speak to individual staff (or groups) who are required to implement the new attendance policy to make sure that they are able to fulfil the policy requirements.
  • Run the new policy for one week and review the collected data
  • Make any changes and implement
Results/ Discussion
  • After one term (12 weeks) analyse the data and present it in charts, graphs and tables
  • Link back to your secondary research
  • Link back to your project title
  • Link back to your stated success criteria
Critical Evaluation: see Writing Your Critical Evaluation
Conclusion: see Writing Your Conclusion
Recommendations: see Writing Your Recommendations
Critical Reflection
  • Link back to your logs
  • Link back to your updated Gantt Chart

For more detail click here

Summarise Testing Results

  • It is important that you summarise your testing as this helps you to make sure that you have fully completed the testing stage.
  • It also helps you to ensure that you can prove that your artefact has achieved the stated project success criteria (outcomes).
  • Within the summary you need to:

You need to be explicit if you have achieved your success criteria (outcomes) using evidence (data/ feedback).

Table 1) Success Criteria – Achievement Summary

No | Success criteria | Evidence | Achieved
1 | Increase in website sales | The data outlined in the web metric section (pg64) indicates that the number of sales increased by 10% when comparing the four weeks before the website updates to the four weeks following the updates. | Yes


You need to be explicit if the artefact has achieved the stated requirements using evidence (data/ feedback). In your viva, you may want to highlight a few of the critical requirements rather than all of them!

Table 2) Testing of Requirement Summary

No | Requirement | SC No | Evidence | Achieved
1 | The time taken to navigate to an item will be recorded | 1 | The test data displayed in table 15 and figure 12 (pg72) indicate that this web metric feature is working. Further testing information is displayed in the test plan (see page 65). | Yes


Midpoint Assessment

  • Most students will have to complete a midpoint assessment to:
    • Check that the student is on target to complete (done some work!)
    • Check that the first marker and student have not missed anything (fresh eyes)
    • Provide additional guidance
    • For the second marker to gain an insight into the project before they mark it
  • Every course and university will require different components to the midpoint
  • However, they are either a formal presentation or an informal chat

Your Midpoint

  • Have a chat with your first marker and ask them for advice
  • Look at the assessment criteria and make sure you understand what they mean
    • Make a list of requirements (things to cover) for each assessment point
    • Discuss them with your first marker
  • Be clear about the etiquette (rules) and expectations of the midpoint
    • Who contacts who?
      • Does the student request a meeting or does the second marker issue one
    • What information (evidence/ work) does the second marker require?
      • Sometimes this is explicitly written other times you may want to ask
    • Who is in-charge (controls) the midpoint?
      • Often it is assumed that the student will direct the midpoint as it is your presentation
    • When does the second marker ask questions/ provide feedback
      • Some students and markers prefer addressing questions as they go; other students find this off-putting.
      • However, most second markers will take lead from the student
  • Think of ways to visualise your project to make it quick and easy for the second marker to understand your project (here and here), this significantly increases your grade potential.
    • It demonstrates to the marker that you understand your project and its requirements
    • It also serves as a talking frame
    • It also provides a very quick summary of your project which you can share with stakeholders

Typical Midpoint Requirements

  1. The reason the project was chosen including a background to the problem
    • This was covered in your project proposal
    • What are the key factors which influence the current situation
    • What are the key factors in the current situation which create your problem
    • State your problem in one sentence
  2. Identify your artefact
    • This was covered in your project proposal
    • State what your artefact is in one sentence
    • Explain how this artefact will overcome the stated problem
  3. Explore the ethical issues of your project
    • This was covered in your project proposal
    • What are the ethical concerns for your project
  4. Identify your research strategy (approach)
    • Discuss your population and sample
    • Discuss your data (qualitative and quantitative)
    • Discuss your experiment design
    • Discuss how you will achieve triangulation
    • Harvard referencing to ensure academic robustness
  5. Identify what you will cover in your secondary research (Literature Review)
    • What information will you need to read about
    • How will this information link to your artefact
    • Harvard referencing to ensure academic robustness
  6. Identify what primary data collection tool(s) you will use
    • What data collection tool(s) will you use?
    • Why will you use this tool(s), link to triangulation?
    • How will the data be used to inform your artefact?
    • Harvard referencing to ensure academic robustness
  7. Identify what analysis tools (software/ hardware) and techniques you will use
    • Spreadsheet to generate central tendencies
    • Harvard referencing to ensure academic robustness
  8. Identify your artefact requirements 
    • What are your artefact success criteria?
    • How do the success criteria link to your secondary research (triangulation)?
    • How do the success criteria link to your primary research (triangulation)?
  9. Justification of your chosen Development Methodology (SDLC) for the artefact
    • What project management approach will you use to manage the artefact development
    • Why is this the best approach (Linear vs Nonlinear)?
  10. Identify how you will test and validate your artefact
    • How will you test the artefact to make sure it achieves the requirements?
    • How will you test the artefact to demonstrate that it overcame the stated problem?
    • Create a test plan


Look at this report outline as a way of summarising your project

Work Based Project – Kick Start

Remember that this is a university academic report and you must write it to achieve the academic requirements of the university. It just so happens that you are using your workplace as a context.

  • When you are thinking about your Work-Based Project I recommend the following:
    • You turn something you are currently doing into a project
    • Consider doing something which will enhance your status or opportunities
    • Recognise that your project is likely to be very different to that of other students
  • The project needs an output (artefact) e.g.
    • Software Application
    • Best practice guidance
    • System drawings for a proposed new system
  • The report itself is not an output (artefact), the artefact is the output

Start to think about your project

  1. What primary data will you collect to prove there is a problem?
    • System performance data
    • Customer complaints
    • Personal observations supported by colleagues
  2. How will this primary data be used later to support artefact success?
    • The system performance data will indicate ….
    • Customer complaints regarding a specific issue will reduce/ stop
    • Colleagues' feedback data will indicate a change
  3. What key themes will you need to research (secondary research)?
  4. What are the artefact requirements (success criteria)?
    •  This is determined by:
      • Your primary data collected from question one
      • Your secondary research, question three
      • Current system integration requirements
  5. How will you test your artefact?
    • You need to prove that your artefact is fit for purpose otherwise, how do you really know that you can achieve question six! (here and here)
  6. How will you prove that your artefact has made a difference?
    • See question two

Now look at this report outline and Undertaking a midpoint assessment

SQL Data Definition Language

SQL Data Definition Language is about creating the structure of your database. Typically, most people use the user interface (SQL Developer) to create tables, triggers etc… However, it is very important for you to understand the SQL language which is being generated in the background, as it helps you to see how the attributes, data types and relationships interact.

Creating an Entity (table)

Creating an entity is actually very simple once you understand some key principles, which I will outline below. First, there is a wrap which encases all the attributes (columns) within the table. This wrap tells the database to create a table called “Bookings” and add the listed attributes.

CREATE TABLE Bookings ( Attribute list here );

The attribute list mirrors the Entity-Relation Diagram (ERD) as we can see in the example below. However, we need to consider additional factors which are not listed on the ERD such as CHECKs. I have used a table because it helps to structure your thinking and syntax.



If I now convert the above example into actual SQL code it would look like the example below. Note that the “FOREIGN KEY REFERENCES Customer(Cust_ID)” component identifies the table name (Customer) and the required attribute (Cust_ID).

NumberOfChairs INT CHECK (NumberOfChairs>=1)
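Putting the wrap and the attribute list together, a complete statement might look like the sketch below, run through Python's sqlite3 purely for illustration. Only NumberOfChairs and Customer(Cust_ID) come from the text above; the other column names and the Customer table's structure are assumptions, and Oracle's types differ slightly (e.g. NUMBER rather than INT).

```python
import sqlite3

# In-memory database for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite only enforces FKs when asked

# Parent table referenced by the foreign key (assumed structure).
conn.execute("""
CREATE TABLE Customer (
    Cust_ID   INTEGER PRIMARY KEY,
    Cust_Name VARCHAR(100)
)""")

# The wrap (CREATE TABLE ... ;) encasing the attribute list,
# including the CHECK and FOREIGN KEY factors discussed above.
conn.execute("""
CREATE TABLE Bookings (
    Booking_ID     INTEGER PRIMARY KEY,
    NumberOfChairs INT CHECK (NumberOfChairs >= 1),
    Cust_ID        INTEGER,
    FOREIGN KEY (Cust_ID) REFERENCES Customer(Cust_ID)
)""")

# The CHECK rejects a zero-chair booking.
try:
    conn.execute("INSERT INTO Bookings (NumberOfChairs, Cust_ID) VALUES (0, NULL)")
except sqlite3.IntegrityError as e:
    print("Rejected:", e)
```

The same CREATE TABLE body can be pasted into SQL Developer with the types adjusted for Oracle.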



Alter an entity


Delete (Drop) an entity

Deleting tables is very simple but can cost you a lot if you get it wrong! If you simply use the “DROP TABLE Bookings” SQL command you will delete the table and all its data. However, it will still be in your recycle bin, so you will need to add the “PURGE” command, e.g. “DROP TABLE Bookings PURGE;”. If you do not use the “PURGE” command you can recover your entity by using “FLASHBACK TABLE”.


There is an added problem if there are other entities linked to the Bookings table, so we need to use the “CASCADE CONSTRAINTS” option, e.g. “DROP TABLE Bookings CASCADE CONSTRAINTS;”, which removes any foreign-key constraints referencing the table before it is dropped.

Add bookings to table

Business Rules:

  • The operator must be able to create a booking
  • Each booking must be linked to one driver


INSERT INTO tells the database to add (create a new record with) the following data in a database table. For example:

INSERT INTO Bookings (CustName, CustPhone) VALUES ('Campbell', '1234567');


The following table explains this SQL string in plain language:

Add to table | Table name | List of table attributes | Data to follow | User input
INSERT INTO | Bookings | (CustName, CustPhone) | VALUES | ('Campbell', '1234567');



Add the data in the table below into your database table ‘bookings’ using the INSERT INTO SQL string. For example:

  • INSERT INTO Bookings
  • (C_Name, C_Phone, B_Date, S_Time, S_Location, S_PCode, F_Location, F_PCode, Ach, Driver_ID)
  • VALUES ('Campbell', '123456', '17-10-17', '0730', 'Home', 'ST2', 'Work', 'ST3', '1', '1');

Note: there is no Booking_ID within the table below as this is automatically inserted by the trigger which you created.
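The INSERT INTO pattern can be exercised end-to-end with a quick sketch. This uses Python's sqlite3 purely for illustration: the cut-down two-column table is an assumption, and SQLite's INTEGER PRIMARY KEY auto-numbering stands in for the Oracle trigger mentioned above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Cut-down bookings table; INTEGER PRIMARY KEY auto-numbers rows,
# standing in for the Oracle sequence/trigger which fills Booking_ID.
conn.execute("""
CREATE TABLE Bookings (
    Booking_ID INTEGER PRIMARY KEY,
    CustName   VARCHAR(100),
    CustPhone  VARCHAR(15)
)""")

# INSERT INTO <table> (<attribute list>) VALUES (<data>);
conn.execute("INSERT INTO Bookings (CustName, CustPhone) VALUES ('Campbell', '1234567')")

row = conn.execute("SELECT Booking_ID, CustName, CustPhone FROM Bookings").fetchone()
print(row)  # Booking_ID was filled in automatically
```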




From this table (bookings) we can see that there is a many-to-one (N:1) relationship between the booking table and the driver table. We know this because the same Driver_ID is repeated in the Driver_ID column. In plain English, we can see that one booking has one driver but a driver can have many bookings. Another way of thinking about this is to say, ‘one row in the driver table can have many rows in the booking table’.

Figure 1) Many to One Entity Relationship
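The "one row in the driver table can have many rows in the booking table" reading can be checked with a GROUP BY query. A sketch via Python's sqlite3 for illustration; the minimal Driver/Bookings columns and the sample names are assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Driver (Driver_ID INTEGER PRIMARY KEY, D_Name VARCHAR(100));
CREATE TABLE Bookings (
    Booking_ID INTEGER PRIMARY KEY,
    CustName   VARCHAR(100),
    Driver_ID  INTEGER REFERENCES Driver(Driver_ID)
);
INSERT INTO Driver VALUES (1, 'Jones'), (2, 'Patel');
-- Driver 1 appears in several bookings: many bookings to one driver (N:1)
INSERT INTO Bookings (CustName, Driver_ID) VALUES
    ('Campbell', 1), ('Smith', 1), ('Khan', 2);
""")

# One row per driver, counting the booking rows that point at it.
for driver, n in conn.execute("""
    SELECT D_Name, COUNT(*) FROM Bookings
    JOIN Driver ON Driver.Driver_ID = Bookings.Driver_ID
    GROUP BY D_Name ORDER BY D_Name"""):
    print(driver, n)
```

The repeated Driver_ID in the Bookings rows is exactly the repetition described in the paragraph above.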


Creating a table called Bookings

Business Rules: Each booking must have

  • Booking_ID [PK, Integer]
  • customer name [varchar, 100],
  • customer phone number [varchar, 15],
  • date of booking [char, 10],
  • booking start time [char, 4],
  • booking start location [varchar, 200],
  • Start postcode [varchar, 8],
  • booking finish location [varchar, 200],
  • finish postcode [varchar, 8],
  • achieved [Char, 1, Default 0],
  • assigned driver [FK, Integer]


  • UK phone numbers are targeted to be less than 15 digits in length
  • UK Postcodes are up to 8 characters in length
  • Date is defined as “17-10-2017”
  • Time is defined as “1020”
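Pulling the business rules together, the full statement might look like the sketch below, again run through Python's sqlite3 for illustration. Oracle would use VARCHAR2 and a sequence/trigger for Booking_ID, and the one-column Driver parent table here is an assumption so the foreign key has something to reference.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE Driver (Driver_ID INTEGER PRIMARY KEY)")  # assumed parent

# Each column maps one business rule to a name, type and size.
conn.execute("""
CREATE TABLE Bookings (
    Booking_ID INTEGER PRIMARY KEY,   -- PK, integer
    C_Name     VARCHAR(100),          -- customer name
    C_Phone    VARCHAR(15),           -- UK phone numbers, under 15 digits
    B_Date     CHAR(10),              -- date, e.g. '17-10-2017'
    S_Time     CHAR(4),               -- start time, e.g. '1020'
    S_Location VARCHAR(200),          -- booking start location
    S_PCode    VARCHAR(8),            -- UK postcodes up to 8 characters
    F_Location VARCHAR(200),          -- booking finish location
    F_PCode    VARCHAR(8),
    Ach        CHAR(1) DEFAULT '0',   -- achieved flag, defaults to 0
    Driver_ID  INTEGER,               -- FK to the assigned driver
    FOREIGN KEY (Driver_ID) REFERENCES Driver(Driver_ID)
)""")

conn.execute("INSERT INTO Driver VALUES (1)")
conn.execute("""INSERT INTO Bookings
    (C_Name, C_Phone, B_Date, S_Time, S_Location, S_PCode,
     F_Location, F_PCode, Driver_ID)
    VALUES ('Campbell', '123456', '17-10-2017', '0730',
            'Home', 'ST2', 'Work', 'ST3', 1)""")
print(conn.execute("SELECT Ach FROM Bookings").fetchone()[0])  # default applied
```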



Designing a Relational Database

Key Terms

These key words are explained using everyday language.

  • Schema is the structure (architecture) of the whole database
  • Entity is a table
  • Attribute is a column
  • Tuple is a row or record
  • Data Type is the type of data which will be stored in the attribute
  • Primary Key is a unique number value
  • Entity Integrity means every table has a primary key (Codd’s Rule 2)
  • Foreign Key is a primary key from a different table
  • Referential Integrity means tables are connected using primary key (Codd’s Rule 10)
  • Entity Relationship how two tables are connected
  • First Normal Form (1NF) group your attributes in a meaningful way and add a primary key
  • Second Normal Form (2NF) remove all duplicate attributes and rows
  • Third Normal Form (3NF) remove any attributes which can be created using a SQL query
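The 3NF point above, that a derivable attribute should be computed rather than stored, can be illustrated with a quick sketch (the Fares table and its columns are hypothetical, using Python's sqlite3 for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Store miles and rate only; the fare is derivable from them, so under 3NF
# it should NOT be stored as an attribute of its own.
conn.execute("CREATE TABLE Fares (Journey_ID INTEGER PRIMARY KEY, Miles REAL, RatePerMile REAL)")
conn.execute("INSERT INTO Fares (Miles, RatePerMile) VALUES (4.0, 2.5)")

# Recreate the would-be 'Fare' attribute with a SQL query instead.
fare = conn.execute("SELECT Miles * RatePerMile FROM Fares").fetchone()[0]
print(fare)  # 10.0
```

Storing Fare as a column would risk it disagreeing with Miles and RatePerMile whenever one of them changes.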

Task One: What data do I need?

You have been asked to create a booking system for a local taxi firm, which will be used internally by the phone operator only.

  1. Make a list of attributes (data) which you will need
  2. Group the attributes in a logical way e.g. Booking, Drivers (1NF)
  3. Add a primary key to each group of attributes (1NF)
  4. Remove any duplicate attributes from all groups (2NF)



Introduction to SQL

Database Homepage

By working your way through all these tasks you will have a very good understanding of SQL queries and how to use them. As part of this tutorial you will create a very simple relational database, however, the tutorial can be used as a quick reference source too.


Table of Contents

  1. Database Connection
  2. Creating a table called Bookings
  3. Adding bookings to the table
  4. Search your booking table
    1. SELECT
  5. Create a table called driver
  6. Update your driver and booking data
    1. UPDATE
  7. View bookings list with the drivers details
    1. Table JOINS
  8. SQL: Single Row Functions
    1. SubStrings
    2. GroupBy
  9. Create a table called notes
  10. Three-way table joins
  11. Find a free taxi
  12. Driver completed job sheet


Voting Systems – Metrics in Teaching and Learning


The purpose of this theoretical case study is to explore the three main approaches to using voting systems within the teaching and learning process and how the metrics can be used to explore an inclusive curriculum design.

The findings indicate that the critical factor in selecting the correct voting system is linked to the collected data. For the university to data-mine, allowing it to identify student support opportunities, the data needs to be linked to a specific student, which means using the university's own software or a third-party system.

In terms of value, the most cost-effective and accessible method is approach two, which uses QR codes and Google Forms. However, other issues such as data ownership and teachers' performance need to be considered and mitigated against.


Keywords: Voting System, Metrics, Teaching and Learning, Pedagogy, Andragogy, Higher Education in Further Education, Inclusive Curriculum Design, Diversity, Fostering Learning in Large Groups


Using technology within the classroom has promised to revolutionise the teaching and learning process. From the humble calculator to the computer to interactive whiteboards and the virtual learning environment, state schools have been serving this mantra for many years. However, this shift to technology driving teaching and learning has led to a paradigm shift in teaching, from teacher-centred (pedagogy) to student-centred (andragogy), as explored by Pollard (2010, p.11). The HEA (2011) framework supports this shift (K1). Setting aside the pedagogy and andragogy debate (Davenport & Davenport 1985; Delahaye et al. 1994; Holmes & Abington-Cooper 2000; Samaroo et al. 2013), each education sector has had to come to terms with the constant demand from students, governments and other stakeholders for integrating technology within the teaching and learning process. One such solution for integrating metrics within teaching and learning is the student voting system (K2).

Scope and Development

To limit the scope, a student voting system for this case study is any electronic system which allows students to cast a vote, although it is important to recognise that there are other voting tools, such as students raising their hands to agree or disagree with a question (K2, K3). Before the development and general acceptance of the ‘bring your own device’ movement, voting systems were integrated into the chair or table, or clickers were issued to every student. However, prolific use of mobile technology by students has provided a cheap alternative, where the university provides software (phone app) and Wi-Fi and the students use their own mobile device (K4).

Where this case study refers to theories and practices such as Thomas and May's (2010) four-pronged typology for student diversity (educational, dispositional, circumstantial, cultural) and theories of inclusive design, it will not exclusively refer to particular theories (K2, K3, K5). Instead, key terms and concepts will be used, where appropriate, within the case study. The principal higher education thematic area of this case study will be fostering learning within large groups.

Common Approaches to Mobile Voting

Approach One: Most mobile voting apps require the students to install the app (software) onto their personal phone before they can place a vote. Once installed, the student needs to link their account with the university; often this is achieved by activating an account provided by the university. Now the student can use the voting software (Wi-Fi dependent) to interact within the learning experience. The advantage of this approach is that individual students' performance can be monitored and tracked during the module, the year or the length of the course. As a side benefit, the students' engagement and the teacher's ability to share new knowledge in a meaningful way can be gauged (K2, K3, K4). However, this system is dependent on the student, their technology and their willingness to engage with a system which allows the university to track and monitor their performance. Linked to this is how the university and teachers actually use the collected data, in respect of the Data Protection Act.

Often this approach does not integrate fluently within the presentation process, with the lecturer having to swap and change between different software and interfaces to view the incoming data, making the lesson feel disjointed (K1, K2, K3, K4). However, more recent attempts, such as PollEverywhere (2016), have provided plugins which allow the student responses to be displayed directly in presentation software, like MS PowerPoint. This approach is focused on student-centred technology which promotes greater independence and thus a student-centred learning approach which supports the andragogical demands (K2, K3, K4, K5).

This approach offers inclusive curriculum design as it allows flexibility, accountability, collaboration and, to a large extent, transparency and equality. It also offers reasonable adjustment and technology-enhanced learning, and fosters student engagement. Financially, this approach can become costly due to the constant updating of the app and server-related hardware (K1, K2, K3, K4, K5, K6).

Approach Two: can be considered a quick and dirty approach as it relies on online survey software, for example Survey Monkey, Google Forms or quizzes via the VLE. A principal issue with this approach is presenting the question behind a login function, such as the VLE. Although this allows the university to track which students have engaged with the process, it does not necessarily provide an instant whole-class graphical profile of the student responses (K2, K3, K4, K5, K6). Also, students quickly disengage with the process, as they dislike having to log in to a system to cast their vote or response (K1, K2, K5, K6). However, software like Google Forms offers an instant profile of all responses, allowing immediate feedback to the teacher and students. This approach does not require additional software to be installed on the students' mobile device, as a web link can be provided via the learning materials, the VLE or a QR code.

This approach offers inclusive curriculum design as it allows flexibility, collaboration and, to a large extent, transparency and equality. However, there is little accountability as the lecturer holds the data. It offers reasonable adjustment and technology-enhanced learning, and fosters student engagement. Financially, this approach can be minimal as the software and hardware are provided through a free online service, which is linked to the lecturer, not the university (K1, K2, K3, K4, K5, K6).

Approach Three: removes all technology from the student and places the burden on the university and lecturer. Using apps which employ the mobile device's camera and pre-printed voting cards, the lecturer scans each student's pre-printed voting card; the student rotates the card to give different answers, A, B, C or D (1, 2, 3 or 4). Although this reduces the number of variables, like the students' devices and operating systems, it brings in additional issues such as having voting cards for each student (K1, K3, K6).

This approach offers inclusive curriculum design as it allows flexibility, accountability, collaboration and, to a large extent, transparency and equality. It also offers reasonable adjustment and technology-enhanced learning, and fosters student engagement. Financially, this approach can become costly due to the cost of licence agreements and printed artefacts (K1, K2, K3, K4, K5, K6).

Value of Voting Systems

Students’ feedback on their current understanding of the presented materials is critical in informing lesson development, pace and future learning (Stewart et al. 2013). Knowing where additional support is needed can inform workshop and tutorial development or allow the teacher to provide additional real-world examples to foster the learners' understanding (Roth 2012). This supports the student engagement, collaboration, flexibility and anticipatory needs of inclusive curriculum design (K2, K3, K4, K5, K6). Typically this is achieved through class questioning and teacher observation; however, this becomes problematic in a large lecture theatre. The issue of Socratic teaching (engaging students with questions) within this context has been addressed by allowing the audience to post questions during or after the lecture, which can be answered by the lecturer as they pop up on the screen, or by other experts in real time. However, these posted questions can direct the lecture off topic, cause cognitive overload for the lecturer, and often the same question is posed in multiple ways, requiring a filtering process (K1, K2, K3, K4, K5). More importantly, the anonymity of a private response promotes more authentic student engagement, as students are less likely to conform to peer pressure (as explored in Asch's classic line experiment (Asch 1951)) (K1, K2, K3, K5).

A more controlled method of engaging students in their learning process is to pose carefully crafted question(s) to which the students respond (Cline et al. 2012). Based on the students' responses, the teacher makes an informed decision about moving on or readdressing the previous content (within the lecture or in the associated workshop/ tutorial). The student responses could also be displayed back to the students, allowing them to compare their response to the group's, and allowing the lecturer to make judgements about the constructive alignment of the students' understanding in relation to the learning objectives, as outlined by Biggs & Tang (2011, p.281) (K1, K2, K3, K4, K5, K6).

Voting Systems Analysis

There are many voting systems which can be integrated within the teaching and learning process. However, there are certain critical requirements which need to be considered when integrating mobile (phone and tablets) based voting systems into the learning process  (K1, K2, K3, K4, K5).  For example,

  • Student device ownership/ bring your own device (inclusion/ exclusion)
  • Power levels of the student’s mobile device
  • Students must install the app on their own device
  • Whether the app will work with the student’s device (hardware and operating system)
  • Most apps require an internet connection for the student to register their vote
  • Most apps require the student to register before voting (setup time)
  • The teacher’s computer requires an internet connection to view students’ votes

Mobile device ownership among students in the USA is 85%, with similar results for the UK, where mobile phone ownership is 93% (Ofcom 2015, pp.10, 65). However, is it reasonable to assume that all students own a mobile device with the capacity to install a voting app and connect to the internet, or is there a case for reasonable adjustment? Wi-Fi access across the university is fairly standard, although there are some areas with poor or no access to the university’s Wi-Fi. On the whole, it could be argued that reasonable adjustment has taken place in terms of access to free Wi-Fi. With reference to device ownership, a bank of tablet devices could be provided, allowing students to choose between using their own device (power levels, app installation etc.) or the university’s device (K1, K2, K3, K4, K5, K6).

The issue around installing and using applications on mobile devices can be an emotive one, where, for example, students using iPhones cannot access a voting app designed to work on Android and Windows Mobile only. The university cannot govern this, as the app is not owned by the university. However, in recent years the sector has recognised this as an issue and there has been a shift towards multi-platform software development. This leads into the “camera vs voting app” debate (K1, K5). The premise is that by using the camera there are no software or hardware conflicts, thus removing the voting app issue. QR code reader apps are freely available across all mobile operating systems, and short hyperlinks (or hyperlinks embedded within the learning materials or VLE) can be provided for students who do not want to install additional software. Therefore, the student’s mobile device needs only Wi-Fi access, a free QR code reader, and a camera; all of these requirements can be met by the university tablet devices. This approach also removes the need to register or log into a voting account. However, it will not be possible to automatically monitor specific students’ performance (K1, K2, K3, K4, K5, K6).

Application of Student Voting

The use of student voting is diverse and offers many opportunities to work more efficiently and effectively.

  • Poster Feedback (QR code – Google Form)

The students’ posters were displayed along with a poster number and a QR code. Students were asked to review and provide feedback on the posters during a 30-minute period, choosing between their own device and a university iPad; all students elected to use the iPads provided by the university. The online review form mirrored the requirements presented to the students at the start of the assessment. Alongside this peer review was the teachers’ marking, in which four faculty members marked the posters using the same process. Because Google Forms was used, the data was anonymous, differentiated only as student or staff. By the end of the 30 minutes there were 120 responses across all posters. When the students returned to the classroom, the overall results were displayed via the projector, allowing students to compare their peer marking against the teacher marking; the two were very similar, increasing transparency and equitability within marking. The financial cost is very low whilst offering technology-centred learning and different approaches to learning and assessment (K1, K2, K3, K4, K5, K6).
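The student-vs-staff comparison described above can be sketched in Python. This is only an illustrative sketch: the row shape (rater type, poster id, score) and the scores themselves are assumptions, not the actual Google Forms export format.

```python
# Summarise anonymous poster reviews (rater_type, poster_id, score)
# into mean student vs staff scores per poster. Data is illustrative.
from collections import defaultdict

def summarise(rows):
    """Return {poster: {rater_type: mean score}} from flat review rows."""
    groups = defaultdict(lambda: defaultdict(list))
    for rater_type, poster, score in rows:
        groups[poster][rater_type].append(score)
    return {poster: {r: round(sum(s) / len(s), 2) for r, s in raters.items()}
            for poster, raters in groups.items()}

rows = [
    ("student", "P1", 4), ("student", "P1", 5), ("staff", "P1", 4),
    ("student", "P2", 3), ("staff", "P2", 3),
]
print(summarise(rows))  # mean student vs staff score for each poster
```

Displaying such a summary via the projector is what allows the class to see how closely peer and staff judgements align.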

Because rating scales were used within the feedback form, the results could be converted into grades, which reduced the marking workload. However, a recommendation would be to include a declaration statement which asks the students and staff to agree a marking boundary (fail, pass, merit and distinction) for the poster (K2, K4, K5).
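The ratings-to-grades conversion could be sketched as follows; the 1–5 scale and the grade boundaries here are illustrative assumptions, not the scheme actually agreed with students and staff.

```python
# Map mean rating-scale (1-5) responses for a poster to a grade band.
# Boundary values are illustrative, not the real agreed boundaries.
GRADE_BOUNDARIES = [      # (minimum mean score, grade)
    (4.2, "distinction"),
    (3.4, "merit"),
    (2.5, "pass"),
    (0.0, "fail"),
]

def grade_from_ratings(ratings):
    """Return the grade band for a list of 1-5 ratings."""
    mean = sum(ratings) / len(ratings)
    for boundary, grade in GRADE_BOUNDARIES:
        if mean >= boundary:
            return grade
    return "fail"

print(grade_from_ratings([5, 4, 4, 5]))  # mean 4.5 -> "distinction"
print(grade_from_ratings([2, 3, 2, 2]))  # mean 2.25 -> "fail"
```

A declaration step, as recommended above, would fix the boundary values before any marking takes place.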

  • Lesson Progression

Presenting questions to the students helps the lecturer to make judgements about the current knowledge and understanding of the class. However, these need to be carefully crafted to avoid drifting off point and to ensure that the student responses are specific and relevant to the taught materials (Cline et al. 2012). The questions can be presented at any point during the lecture. For example, asking students to respond to questions at the start of the lecture can act as a diagnostic tool (identifying areas where learning is already strong, allowing the lesson pace to increase) or as a baseline, followed by an end-of-lesson assessment to show how much progression the students feel they have made. Using the voting system within the mini-plenary process will also allow lecturers to gauge the lecture pace and knowledge transfer (K1, K2, K3, K4, K5, K6).
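The baseline-versus-end-of-lesson comparison could be computed as below. The topic names and correct-response fractions are invented for illustration only.

```python
# Compare start-of-lecture (baseline) and end-of-lecture votes per topic
# to estimate progression in percentage points. Figures are illustrative.
def progression(baseline, end):
    """Return percentage-point change in correct responses per topic."""
    return {topic: round(100 * (end[topic] - baseline[topic]), 1)
            for topic in baseline}

baseline = {"loops": 0.40, "recursion": 0.25}  # fraction answering correctly
end      = {"loops": 0.85, "recursion": 0.60}
print(progression(baseline, end))  # {'loops': 45.0, 'recursion': 35.0}
```

A large positive change suggests the pace was appropriate; a small one may indicate a topic to readdress in the workshop.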

Within small groups there appears to be little advantage to using voting apps, as teacher questioning can be used effectively. However, through voting systems students’ progression can be tracked, allowing for the identification of students who are at risk. This is important for satisfaction levels, retention and belongingness whilst addressing student diversity (K2, K3, K4, K5, K6).

  • Lesson and Teaching Feedback

Using the voting system within the lecture provides valuable data which can be used to inform teaching, learning and planning. For example, in the final plenary students can indicate which topic(s) they want to focus on during the workshops, or request additional reading or support. This process can also be used to make informed judgements about the suitability of the examples and images used within the lecture, supporting elements of Thomas and May’s (2010) typology for student diversity (K1, K2, K3, K4, K5, K6).

This could be extended to identify which areas need addressing or improving for a given lesson, helping to refine the learning process and maximise learning potential. Critical issues like cultural diversity and personal experience are likely to influence the relevance of explanations and images. This could also link to teaching style and approach (K2, K3, K5, K6).

  • Student Engagement

Controversially, voting apps can be used to track student engagement, in the same way as the university virtual learning environment. Knowing which students are actively engaged with the learning process and how successful they are (number of correct vs incorrect responses) can help to target support and intervention (Roth 2012). It could also be used to identify gifted and talented students and provide specific intervention, increasing the added-value potential and authentic learning experience of these students. However, it could equally be used to identify students likely to achieve highly in their dissertation, allowing staff to cream off the best students for themselves (K1, K2, K3, K4, K5, K6).
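Targeting support from correct/incorrect vote counts could look like the sketch below; the student identifiers and the 50% threshold are illustrative assumptions, and in practice such tracking raises the data-ownership issues discussed later.

```python
# Flag students whose correct-response rate across in-lecture votes
# falls below a support threshold. Ids and threshold are illustrative.
def at_risk(results, threshold=0.5):
    """results maps student id -> list of booleans (was the vote correct?)."""
    flagged = []
    for student, votes in results.items():
        if votes and sum(votes) / len(votes) < threshold:
            flagged.append(student)
    return sorted(flagged)

votes = {
    "s001": [True, True, False, True],    # 75% correct
    "s002": [False, False, True, False],  # 25% correct
    "s003": [True, False, False, False],  # 25% correct
}
print(at_risk(votes))  # ['s002', 's003']
```

Note that this requires votes to be linked to individual students, which the anonymous QR-code approach deliberately does not provide.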

This information can also be used to tackle complaints and retention issues; see Student Voting and Teacher’s Performance, below.

The Pitfalls of Student Voting

  • Student Voting and Teacher’s Performance (remote observation)

Arguably, student responses can indicate the teacher’s effectiveness in explaining concepts and critical information. The responses can also indicate the level of student engagement and active learning within the teaching and learning process. These metrics can be data-mined to create a normalised comparison of lecturers and student experience (K1, K2, K4, K5, K6).
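One plausible form of such normalisation is a z-score comparison, sketched below; the lecturer names and engagement figures are invented, and this is one possible normalisation method rather than any system actually in use.

```python
# Normalise a per-lecturer engagement metric as z-scores so lecturers
# teaching different cohorts can be compared. Figures are illustrative.
from statistics import mean, stdev

def z_scores(metric_by_lecturer):
    """Return each lecturer's metric as standard deviations from the mean."""
    values = list(metric_by_lecturer.values())
    mu, sigma = mean(values), stdev(values)
    return {name: round((v - mu) / sigma, 2)
            for name, v in metric_by_lecturer.items()}

engagement = {"Lecturer A": 0.72, "Lecturer B": 0.55, "Lecturer C": 0.80}
print(z_scores(engagement))
```

This is precisely the kind of remote-observation metric that, as noted above, needs careful governance before being used to judge teacher performance.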

  • Student Not Engaging Within The Learning Process

Using technology within the learning environment is always open to abuse. For example, using a laptop to type notes during lectures could lead to students updating their Facebook profile, checking their emails or placing bids on their eBay items, which all distract from the learning intention (K2, K3, K4, K5, K6).

  • Accessibility, Cost Of Software and Hardware

A university-controlled system would be very expensive to implement and maintain compared to using a third-party app and allowing students to use their own mobile devices. In effect, the major expenses would be the Wi-Fi and the licence agreement for the voting software (K2, K4, K6).

  • Technology Failure

Technology failure, when allowing users to bring their own device, would centre on a) Wi-Fi issues, b) the third-party app, and c) the student and their device. However, if the university implements a university-controlled system, additional issues arise, such as a) server access, b) software updates and security, c) software compatibility, and d) usability and accessibility legal requirements (K5, K6).


In conclusion, the use of a student voting system offers many advantages which can be used to support student engagement, constructive alignment and inclusive curriculum design. The critical factor in selecting the correct voting system is linked to the data collected. For the university to data-mine, allowing it to identify student support opportunities, the data needs to be linked to a specific student, which means using its own software or a third party’s. Other issues, such as data ownership and teachers’ performance, need to be considered and mitigated against. The most cost-effective and accessible method is the second approach, which uses QR codes and Google Forms.


Asch, S.E., 1951. Effects of group pressure on the modification and distortion of judgments. Groups, leadership and men, pp.177–190.

Biggs, J. & Tang, C., 2011. Teaching For Quality Learning At University 4th ed., England: Open University Press. Available at: https://books.google.co.uk/books?id=XhjRBrDAESkC&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false.

CiCS, 2011. CiCS: Student Mobile Device Survey, Sheffield. Available at: https://www.sheffield.ac.uk/polopoly_fs/1.103665!/file/mobilesurvey2011.pdf .

Clickers: Beyond the Basics, 2016. Blog The Innovative Instructor. Johns Hopkins University, p.1. Available at: http://ii.library.jhu.edu/tag/clickers/ [Accessed May 15, 2016].

Cline, K. et al., 2012. Addressing Common Student Errors With Classroom Voting in Multivariable Calculus. PRIMUS, 23(1), pp.60–75. Available at: http://www.tandfonline.com/doi/abs/10.1080/10511970.2012.697098.

Dahlstrom, E. & Bichsel, J., 2014. ECAR Study of Undergraduate Students and Information Technology, Available at: https://net.educause.edu/ir/library/pdf/ss14/ERS1406.pdf.

Data Protection Act, 1998. Data Protection Act, Great Britain: legislation.gov.uk. Available at: http://www.legislation.gov.uk/ukpga/1998/29/contents.

Davenport, J. & Davenport, J.A., 1985. A Chronology and Analysis of The Andragogy Debate. Adult Education Quarterly, 35(3), pp.152–159. Available at: http://www.umsl.edu/~henschkej/henschke/more henschke_5_11_04/a_chronology_and_analysis_of_the_andragogy_debate.pdf.

Davies, S., 2014. Mobile device adoption in UK Higher Education, Manchester. Available at: http://www.elearning.eps.manchester.ac.uk/blog/2014/m-article-series-mobile-device-adoption-in-uk-he/ .

Delahaye, B.L., Limerick, D.C. & Hearn, G., 1994. The Relationship between Andragogical and Pedagogical Orientations and the Implications for Adult Learning. Adult Education Quarterly, 44(4), pp.187–200. Available at: https://core.ac.uk/download/files/310/10873874.pdf .

Harris, P., 2015. Pearson: Student Mobile Device Survey, Available at: http://www.pearsoned.com/wp-content/uploads/2015-Pearson-Student-Mobile-Device-Survey-College.pdf [Accessed May 15, 2016].

HEA, 2011. Framework Guidance Note 2: What are the UK Professional Standards Framework Descriptors?, London. Available at: https://www.heacademy.ac.uk/sites/default/files/downloads/what_are_the_uk_professional_standards_framework_descriptors.pdf.

Holmes, G. & Abington-Cooper, M., 2000. Pedagogy vs. Andragogy: A False Dichotomy? The Journal of Technology Studies, 26(2). Available at: https://scholar.lib.vt.edu/ejournals/JOTS/Summer-Fall-2000/holmes.html .

Knowles, M.S., Holton III, E.F. & Swanson, R.A., 2011. The Adult Learner 7th ed., California: Routledge. Available at: https://books.google.co.uk/books?id=urUVrB1hLKAC&printsec=frontcover#v=onepage&q&f=false.

Ofcom, 2015. Communications Market Report, London. Available at: http://stakeholders.ofcom.org.uk/binaries/research/cmr/cmr15/CMR_UK_2015.pdf.

Pollard, A.J., 2010. Professionalism and Pedagogy: a contemporary opportunity, Bristol: University of Bristol. Available at: http://www.tlrp.org/pub/documents/TLRPGTCEProf&Pedagogy.pdf.

PollEveryWhere.com, 2016. PollEveryWhere.com. About Us, p.1. Available at: https://www.polleverywhere.com/ [Accessed May 15, 2016].

Roth, K.A., 2012. Assessing Clicker Examples Versus Board Examples in Calculus. PRIMUS, 22(5), pp.353–364. Available at: http://www.tandfonline.com/doi/abs/10.1080/10511970.2011.623503.

Samaroo, S., Cooper, E. & Green, T., 2013. Pedandragogy: A way forward to self-engaged learning. New Horizons in Adult Education and Human Resource Development, 25(3), pp.76–90. Available at: http://doi.wiley.com/10.1002/nha3.20032.

Stewart, A., Storm, C. & VonEpps, L., 2013. Analyzing Student Confidence in Classroom Voting With Multiple Choice Questions. PRIMUS, 23(8), pp.718–732. Available at: http://www.tandfonline.com/doi/abs/10.1080/10511970.2013.801381.

What’s New with Clickers?, 2012. Blog The Innovative Instructor. Johns Hopkins University, p.1. Available at: http://ii.library.jhu.edu/tag/in-class-voting-system/ [Accessed May 15, 2016].

Wilson, C., 2006. No One Is Too Old To Learn: Neuroandragogy: A Theoretical Perspective on Adult Brain Functions and Adult Learning, iUniverse, Inc. Available at: https://books.google.co.uk/books?id=J2EGFaH19vUC&pg=PA96&dq=Knowles%E2%80%99+Andragogy&hl=en&sa=X&ved=0ahUKEwj357Sr-dzMAhXpAsAKHc4HAdcQ6AEINzAE#v=onepage&q=Knowles%E2%80%99 Andragogy&f=false.