In the last 20 years there has been a nearly six-fold increase in the number of students in higher education declaring disabilities, equating to over 14,000 students in 2019. It is also worth noting that this figure covers only the students who declare a disability; I am confident that there are many more throughout the sector. For every "visible" disability, e.g. a wheelchair user or a blind student, there are many "invisible" disabilities such as dyslexia, autism and ADHD. As lecturers, therefore, we will never know for certain whether we have students with disabilities in our class, making the adoption of a "Universal Design for Learning" (UDL) approach more vital than ever. While we encourage lecturers to expand access and increase flexibility by moving towards a more blended provision of their courses, are they adhering to the principles of Universal Design? They are specialists in their respective disciplines, not necessarily web developers or accessibility experts. This post outlines how we assessed the accessibility of the course pages on our VLE (Moodle).
Figure 1: Number of students with disabilities in higher education
Last week I had the pleasure of presenting at the ALT (Association for Learning Technology) annual conference. I co-presented with Gavin Henrick from Brickfield Education Labs, describing the research conducted to evaluate the accessibility of course pages within our virtual learning environment. The presentation described how, using existing open source libraries, we built a reporting tool, defining which checks were carried out, how they were carried out, and how the data was stored and reported on at module, programme and faculty level. As the report is available at these various levels, a lecturer can self-evaluate their own course pages and staff developers can identify the training and support that may be needed across an entire faculty.
A subset of the Web Content Accessibility Guidelines (WCAG) was chosen for this study. These guidelines, created by the World Wide Web Consortium, set out how to improve web accessibility. Twelve separate modules within one programme were analysed against checks such as: are the web links and images used on the courses accessible? Are headings within long passages of text used appropriately? The results, while promising, did highlight that there is still room for improvement.
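To give a flavour of what such checks involve, here is a minimal sketch of two WCAG-style checks in Python: images missing alt text, and links whose text is not descriptive. This is illustrative only; our actual tool builds on existing open source libraries rather than this hand-rolled parser.

```python
from html.parser import HTMLParser

class AccessibilityChecker(HTMLParser):
    """Flags two WCAG-style issues in a snippet of course-page HTML:
    images without alt text, and links with non-descriptive text."""

    VAGUE_LINK_TEXT = {"click here", "here", "link", "read more"}

    def __init__(self):
        super().__init__()
        self.issues = []
        self._in_link = False
        self._link_text = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.issues.append("image missing alt text")
        if tag == "a":
            self._in_link = True
            self._link_text = []

    def handle_data(self, data):
        if self._in_link:
            self._link_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a":
            text = "".join(self._link_text).strip().lower()
            if text in self.VAGUE_LINK_TEXT:
                self.issues.append(f"non-descriptive link text: '{text}'")
            self._in_link = False

def check_page(html):
    """Run the checks over one page and return the list of issues found."""
    checker = AccessibilityChecker()
    checker.feed(html)
    return checker.issues
```

For example, `check_page('<img src="x.png"><a href="/notes">click here</a>')` flags both an image and a link, while a page with proper alt text and descriptive link text comes back clean.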
Figure 2 – Course checks per page
Figure 2 illustrates the results from one course in particular: 5% of the images on this course have no alt text, 10% have issues with poor layout and 8% have poorly displayed links to other webpages. Figure 3 provides an alternative breakdown of the results, illustrating which feature of the VLE generates the most issues. For example, we can clearly see that the majority of the issues on this particular course are related to the Moodle "book".
Figure 3 – Analysis of checks per Moodle feature
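The reports behind Figures 2 and 3 are essentially two aggregations of the same raw check results: failure rate per check type, and issue count per Moodle feature. A rough sketch of that aggregation (the data shapes here are illustrative, not the tool's actual schema):

```python
from collections import Counter

def failure_rates(results):
    """Aggregate raw check results into the two report views:
    failure percentage per check type (Figure 2 style) and issue
    counts per Moodle feature (Figure 3 style).

    `results` is a list of (feature, check, passed) tuples, one per
    check run against one element of the course."""
    per_check_total = Counter()
    per_check_failed = Counter()
    per_feature_issues = Counter()
    for feature, check, passed in results:
        per_check_total[check] += 1
        if not passed:
            per_check_failed[check] += 1
            per_feature_issues[feature] += 1
    rates = {c: 100 * per_check_failed[c] / per_check_total[c] for c in per_check_total}
    return rates, dict(per_feature_issues)
```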
These are just two of the reports available; several more exist at both course and programme level. We look forward to providing an update on the next stage of this research at the World Conference for Online Learning in Dublin later this year.
World Wide Web Consortium Web Accessibility Initiative. 2008. Web Content Accessibility Guidelines (WCAG). [Online]. [Accessed 2 April 2019]. Available from: https://www.w3.org/WAI/standards-guidelines/wcag/
Data on students in Irish higher education with disabilities. 2018. [Online]. Available from: https://www.ahead.ie/datacentre18-yearonyear
Following our move from the text-matching service TurnItIn to a rival product, Urkund, the most common complaint we received was the loss of TurnItIn's "Feedback Studio" feature, and in particular the ability to easily provide audio feedback to students. Following consultation with our contacts in the Moodle community, we concluded that the ability to provide audio feedback to students directly through Moodle would be a very valuable addition to Moodle's core capabilities.
In Moodle 3.5 a new feature was added to core Moodle to provide WebRTC audio and video recording in the Atto editor (see https://tracker.moodle.org/browse/MDL-60848). This was initially written by Blindside Networks, who work on the BigBlueButton open source conferencing tool, and contributed to the plugin database.
We found the relevant tickets on the Moodle tracker and commented, along with others, that it was essential for this to also work when providing assignment feedback. We also encouraged members of the community to comment on and vote for the issue if they thought it would be worthwhile.
Power of the community
Moodle HQ responded as you would expect and agreed to examine the potential of adding audio feedback to core. I'm delighted to report that, 12 months later, thanks to the team at HQ we have a "patch" enabling us to provide audio feedback on assignments in our Moodle 3.5 instance, and audio feedback will be available as part of core in Moodle 3.6. See https://tracker.moodle.org/browse/MDL-27520
The video below illustrates how simple it now is to provide feedback directly through Moodle.
This development is yet another example of the benefit of open source and the awesome Moodle community. I firmly believe that it would take significantly more time and persuasion to convince a commercial LMS provider not only to take on board user feedback but to make significant changes to the core product in such a short space of time. The community were able to investigate the code themselves, provide suggestions to Moodle HQ developers and troubleshoot problems as they arose. And that is not to mention that this significant enhancement is available at no extra cost to end users once adopted into core.
All too often, assessment across a programme can be disconnected. More often than not, lecturer "A" doesn't talk to lecturer "B" to discuss their assessments, to identify opportunities for collaboration or, at the very least, to avoid poor scheduling of assignments. Even where discussion does take place, it normally relies on one individual, such as the programme coordinator, to instigate conversations with each of the individual lecturers and collate assessment information for the student handbook. Beyond the extra workload created for the coordinator, this handbook information can very quickly go out of date if an individual lecturer decides to change the assessment for their module.
To help address this we have built a report in Moodle that works at the course-page level and is accessible by the teachers associated with the course. This report uses Moodle's core ability to tag courses in the course page settings. Each programme has a unique code which we use as a Moodle tag for each year, e.g. "science1" for BSc in Science first year, "science2" for BSc in Science second year, etc. Moodle then pulls all of the assignments from each of the modules assigned that tag.
The following example may help illustrate how it works.
A first-year science student studies three modules: Chemistry 101, Physics 101 and Biology 101. The plugin we have developed first adds the tag "science1" to each of these modules. Any teacher on the aforementioned modules can then generate the report through the reports feature in the administration block on their course page. The report pulls all of the assignments from each of the modules and presents them in several formats:
A bar chart with the number of assignments across the next 12 weeks, presented on a weekly basis.
A list of each assignment with its opening and closing dates, the assignment description, whether it is a group assignment, and which module the assignment is from.
The assignment description is available by rolling your mouse over the assignment title. The data is also available to download in CSV or XLS spreadsheet format.
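The shaping of the report data can be sketched roughly as follows (the field names are illustrative, not the plugin's actual schema): bucket the tagged modules' assignments into per-week counts for the bar chart, and build a flat schedule for the list view and spreadsheet export.

```python
from datetime import date

def assessment_report(assignments, start, weeks=12):
    """Bucket assignments due within the next `weeks` weeks into
    per-week counts (the bar chart), and build a flat schedule for
    the list view and CSV/XLS export. Each assignment is a dict with
    keys: module, title, opens, closes (dates), group (bool)."""
    counts = [0] * weeks
    schedule = []
    for a in sorted(assignments, key=lambda a: a["closes"]):
        week = (a["closes"] - start).days // 7
        if 0 <= week < weeks:
            counts[week] += 1
        schedule.append((a["module"], a["title"], a["opens"], a["closes"], a["group"]))
    return counts, schedule
```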
Following an evaluation of this report with our staff we intend to make further enhancements to the report and then release it to the Moodle community.
We’re all in the same boat – so why not share? We are all understaffed and under funded. We all want the best learning experience for our students. We all are juggling numerous projects it doesn’t make sense to me that we are all doing it independently of one another.
This belief was driven home to me at the end of last week. On Friday 20th October, Cambridge University hosted the first Mahoodle conference in the UK, where educators interested in Moodle (the learning management system) and Mahara (the ePortfolio platform) presented on developments and projects involving one or both platforms. While I always welcome events like this for the opportunity to network with colleagues from other institutions, I was particularly looking forward to this one: we got to present the final stages of the AAIM project. This project was the result of a collaboration between the University of Sussex and Dublin City University. More details are available on the project website, but in a nutshell the goal was to improve the analytic reporting within Mahara.
We are delighted to announce that, as of yesterday, the reporting capabilities developed as a result of this project are now in core Mahara.
— Catalyst IT (@CatalystNZ) November 1, 2017
We are now going to release as many of our Mahara plugins as we can to the community and will work with Catalyst (the company behind Mahara) to get as many of these developments into core as possible, so everyone can benefit.
Following our tremendously successful experience working with Sussex, we are looking for more Mahara projects to collaborate on. If you have any ideas or are up for being involved in a collaboration, please contact me (@glynnmark) or David Walker from Sussex (@drdjwalker).
I am delighted to announce that a consortium involving DCU, led by Ilia State University in Georgia, has been successful in its application for Erasmus funding for our project on academic integrity. The core objective of the project is to enhance the quality of teaching and learning processes based on the principles of academic integrity, supported by policies, mechanisms and tools that help prevent and detect cases of plagiarism in higher education institutions in Georgia. The three-year project will involve the design of workshops and resources targeted at both staff and students, and involves collaborating with 16 other partners from the UK, Austria, Sweden and numerous institutions throughout Georgia.
All resources resulting from this project will be made available openly under a Creative Commons licence. Updates will be posted here and also on the website created specifically for the project (details to follow shortly).
A huge congratulations to the Y1 Feedback team on the final event of their National Forum funded project. I would strongly recommend that anyone teaching in higher education visit their project website. Although targeted at first years, the learnings from this project are applicable to everyone. The two-year project culminated in a symposium last Friday, an excellent event with distinguished keynote speakers such as Prof David Carless, Prof David Nicol, Prof Tansey Jessop and Dr Naomi Winstone.
I enjoyed each presentation and took key points from each one, with Naomi's work on exporting assignment feedback to a student's eportfolio being a standout for me personally. But aside from the aforementioned excellent speakers, another element of this symposium was particularly satisfying, and it is what I believe will be the real legacy of this project: the diversity of speakers from across the partner institutions. It was not just a case of the usual suspects representing their institutions, but a mixture of new and experienced lecturers from a variety of disciplines presenting their work. The National Forum projects have definitely raised the scholarship of teaching and learning across the sector, but Y1 Feedback has, in my opinion, led the way in this respect in a very sustainable manner.
While the video below shows my personal contribution to this project, it is only one small piece of the jigsaw that made up this project. A huge congratulations to Lisa O’Regan and the project team. For more information visit http://www.y1feedback.ie
Every day in school, primary or post-primary, the teacher called the roll – at least in my school anyway. I see the logic behind this and understand the reasoning, but higher education is different: we're dealing with adults, so at the very least it should be different. So should we measure class attendance, and if we do, should we value it?
Without meaning to sound too much like a politician, my answer to the question is – it depends! It depends on the situation and it is definitely not something to be measured in isolation.
The following two examples are typical scenarios in higher education.
In a laboratory session for a science student, or a clinical skills session for nurses, for example – yes, absolutely, attendance is crucial and should be compulsory. It should not be possible for a chemistry student to graduate without ever attending a laboratory practical, yet I am aware of several courses where, at least in theory, it is: courses where there is no individually failed element, i.e. the student can fail the continuous assessment but do well in the final paper and scrape through with an overall pass because the final result is weighted heavily in favour of the final paper. I'm not here to discuss bad course design and poorly constructed course learning outcomes; we'll leave that for another post. However, let's take this example of lab attendance to explain the "value" of measuring attendance. If we measure lab attendance, we can set a requirement such as "a student must attend 80% of the lab sessions". We must therefore measure attendance to determine whether each student meets this requirement to progress. This raises the question: do we value attendance, or do we value the knowledge and ability to perform that lab activity? Or do we value both? What if a student misses the lab but writes up the lab report, illustrating their knowledge of the topic? Should they get marks for this, and should course design allow it? Before answering, let's flip this on its head: what if a student attends the lab but fails to demonstrate the knowledge or ability in the lab report? How do these two fictitious students compare, in your opinion – who deserves to pass?
Moving away from this type of scenario to a typical lecture: should we take attendance? Should we value attendance? Technology can make collecting attendance easier, but just because we can measure it, does that mean we should value it? In today's world information is plentiful – it is not limited to the four walls of a classroom. Instead of focusing on classroom attendance, should we not be focusing on our lecturers, asking them to produce quality learning resources, e.g. short videos? We should be encouraging them to "flip the classroom", engaging the students when they are in the classroom by moving away from the "lecture" towards a discussion. There are plenty of examples where flipping the classroom engages the students, and this in turn leads to better attendance.
It is widely accepted in the literature that engaging students leads to better student attainment. This is what we should value, and therefore this is what we should measure – not attendance at a lecture.
So I go back to my opening statement on whether we should be measuring attendance: "it depends". And I return to my mantra: "we should measure what we value, not value what we measure".
Learning analytics is not a new term in higher education. In fact, learning analytics was mentioned in the NMC Horizon report as far back as 2010. Last year I had the privilege of attending the LAK conference in Edinburgh, Scotland, a very established international gathering of people researching learning analytics. So learning analytics is far from new. But, in my experience, until recently the conversations around student data and learning analytics were limited to a few people within institutions. In recent months, however, it is a term that is gathering a lot of interest in everyday conversations within higher education institutions. More and more people are seeing the potential, and in some cases even realising the benefits, of learning analytics. Dr Bart Rientes, Prof Shane Dawson and Prof Dragan Gasevic are people I follow with great interest, each of them leading the way in researching different aspects of learning analytics. The National Forum for Teaching & Learning, an organisation supporting teaching and learning in Irish higher education, has recently launched a project to raise awareness of learning analytics, so all of the indicators are that this is an area that is here to stay and will only get bigger in the years to come.
Personally speaking, I got involved in learning analytics in DCU nearly three years ago, examining whether there was a correlation between a student's engagement with the VLE and their success rate in the module. As you might expect, there is a very strong correlation. But we took it one step further: using historical data, we built algorithms to predict a student's success based on their current interactions, and we gave this information back to students in the hope of improving completion rates. More information can be found here. Without spoiling the surprise, while the project was a success it only gave one piece of the jigsaw.
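As an illustration of the general approach (not our actual model or features, which are richer), predicting pass/fail from a single engagement measure can be sketched as a simple logistic regression fitted to historical data:

```python
import math

def train_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit a one-feature logistic model P(pass) = sigmoid(w*x + b)
    by stochastic gradient descent. Here x is a normalised weekly
    VLE interaction count and y is 1 if the student passed."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(w * x + b)))
            w += lr * (y - p) * x  # gradient of the log-likelihood
            b += lr * (y - p)
    return w, b

def predict(w, b, x):
    """Predicted probability of passing for engagement level x."""
    return 1 / (1 + math.exp(-(w * x + b)))
```

Trained on historical cohorts, a model like this gives each current student a probability of success from their interactions so far, which is the kind of signal we fed back to students.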
Several different initiatives emerged from the findings of this project, each one giving a different piece of the jigsaw. The next series of posts will outline the various learning analytics projects that we have conducted in DCU.
In conclusion, learning analytics has proven to be a powerful tool that can help improve the student learning experience in many ways, but each bit of data is only one piece of the jigsaw. The word of caution I would give to anyone interested in learning analytics is: measure what you value, don't value what you measure!
Learning any new software can be difficult at the best of times, but if you are not a "digital native" even an apparently simple system can be a huge obstacle. This is most certainly the case with Moodle. If you can't use it, it won't matter how good it is – it just won't be used. In my opinion Moodle has an added complication: as an open source product designed by many people to suit a large array of needs, it may not always be as simple as you would like it to be.
To tackle this issue we commissioned the development of a plugin for Moodle that gives users a guided tour of each page when they arrive on it for the first time. The guides can be for students or teachers, and the plugin is designed in such a way as to allow generic tours to be used on any type of course page, no matter how it is laid out or which blocks you have included on your course page.
It has great potential for student orientation in Moodle, and also for removing the fear factor for staff who want to try out new features.
Hopefully you will find this plugin useful.
Here is a short presentation which I gave to staff recently – hopefully it explains things a little better.
FYI “Loop” mentioned in this video is the name of our Moodle instance.
Please pass this post on to anyone that you feel would be interested.
Anybody involved in teaching recognises the benefits of assessment within their courses, but equally they curse assessment at the same time: trying to get the balance right between providing enough assessment and over- (or under-) assessing; giving detailed feedback versus prompt feedback; offering flexibility while at the same time minimising the opportunity for plagiarism. In DCU we have made several improvements to core Moodle to help improve the assignment process for everyone involved. Some of these improvements are listed below, with each one to be discussed in detail in future blog posts.
Every staff member and student has a Google Apps for Education (GAFE) account. We have now linked Google Calendar with the Moodle assignment. If a student has an assignment in Moodle, an "event" is created in their Google Calendar. This event contains the assignment title, deadline date, a brief description and a link directly to the assignment on their Moodle page. The information is dynamic, insofar as if a lecturer makes changes to their assignment in Moodle, the student's Google Calendar is updated.
Sitting beside every lecturer to explain how to create a quiz in Moodle is just not practical. So, in addition to creating a suite of instructional videos and user guides, we have created a virtual assistant in Moodle. Essentially, when a user arrives on a new page in Moodle they are given a "tour" of the page. This tour guides them through key points/blocks on the page, with an explanation attached to every point. These tours can be created for every page and every feature in Moodle; we have so far focused on the assessment-based features of Loop. This "tours" plugin is available to the Moodle community (https://moodle.org/plugins/local_usertours); all we ask in return is that if you create any tours, you are willing to share them back to the community via https://moodle.net/mod/data/view.php?id=17
If students have not submitted an assignment, Moodle now emails them reminders. A lecturer can choose how many reminders students receive in advance of and after the deadline.
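The reminder logic amounts to a scheduled task that, on each configured offset from the deadline, emails every enrolled student without a submission. A sketch (the offsets and names here are illustrative, not the actual configuration options):

```python
from datetime import datetime, timedelta

def students_to_remind(deadline, now, enrolled, submitted, offsets_days=(7, 1, -1)):
    """Return the students due a reminder today: those enrolled but
    without a submission, on any configured offset from the deadline
    (days before; a negative offset means days after)."""
    if not any((deadline - timedelta(days=d)).date() == now.date() for d in offsets_days):
        return []
    return sorted(set(enrolled) - set(submitted))
```

Run daily by the scheduler, this yields an empty list on most days and the non-submitters on each reminder day.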
A student is now presented with their relative grade, i.e. how they have performed in a particular assignment relative to their peers: are they above or below average, and are they close to the lowest or the highest grade?
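Computing the relative grade is a small aggregation over the class's grades for that assignment; something along these lines:

```python
def relative_grade(grades, student_grade):
    """Summarise where a grade sits within the class: the class
    average, the lowest and highest grades, and the percentage of
    the class scoring strictly below the student."""
    below = sum(1 for g in grades if g < student_grade)
    return {
        "average": sum(grades) / len(grades),
        "lowest": min(grades),
        "highest": max(grades),
        "percent_below": 100 * below / len(grades),
    }
```

Only the summary figures are shown to the student, so no individual classmate's grade is ever disclosed.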
You may have centralised course material, e.g. a course on citation and referencing, that you want your students to complete as part of your course. Lecturers can now link out to that "other course" via a sub-course link. When students click this link they are automatically enrolled in the other course, and all of the grades from it are passed back to your course gradebook.
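Conceptually, the grade passback copies each student's final grade from the linked course into a grade item in the parent gradebook. A toy sketch of that step (real Moodle gradebooks are far richer than a dict, and the item name here is made up):

```python
def push_subcourse_grades(subcourse_finals, parent_gradebook, item="Citation and referencing"):
    """Copy each student's final grade from the sub-course into a
    named grade item in the parent course's gradebook, creating the
    student's row if needed."""
    for student, grade in subcourse_finals.items():
        parent_gradebook.setdefault(student, {})[item] = grade
    return parent_gradebook
```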