Category Archives: Learning Analytics

We’re all in the same boat – so why not share? – reflections from Mahoodle 2017

We’re all in the same boat – so why not share? We are all understaffed and underfunded. We all want the best learning experience for our students. We are all juggling numerous projects, so it doesn’t make sense to me that we are all doing it independently of one another.

This belief was driven home to me at the end of last week. On Friday 20th October, Cambridge University hosted the first Mahoodle conference in the UK, where educators interested in Moodle (the learning management system) and Mahara (the ePortfolio platform) presented on various developments and projects involving one or both platforms. While I always welcome events like this for the opportunity to network with colleagues from other institutions, I was particularly looking forward to this one: we got to present the final stages of the AAIM project. This project was the result of a collaboration between the University of Sussex and Dublin City University. More details are available on the project website, but in a nutshell the goal was to improve the analytic reporting within Mahara.

We are delighted to announce that, as of yesterday, the reporting capabilities developed as a result of this project are now in core Mahara.

We are now going to release as many of our Mahara plugins as we can to the community, and will work with Catalyst (the company behind Mahara) to get as many of these developments into core as possible, so everyone can benefit.

Following our tremendously successful experience working with Sussex, we are looking for more Mahara projects to collaborate on. If you have any ideas or are up for being involved in a collaboration, please contact me (@glynnmark) or David Walker from Sussex (@drdjwalker).

 


Should we measure class attendance in higher education?

Every day in school, primary or post-primary, the teacher called the roll – at least in my school anyway. I see the logic behind this and understand the reasoning, but higher education is different: we’re dealing with adults, so at the very least it should be different. So should we measure class attendance, and if we do, should we value it?

 

Without meaning to sound too much like a politician, my answer to the question is: it depends! It depends on the situation, and it is definitely not something to be measured in isolation.

The following two examples are typical scenarios in higher education.

In a laboratory session for a science student or a clinical skills session for nursing students, for example, attendance is crucial and should be compulsory. It should not be possible for a chemistry student to graduate without ever attending a laboratory practical, yet I am aware of several courses where, at least in theory, that is possible: courses with no individually failed element, where a student can fail the continuous assessment but do well in the final paper and scrape through with an overall pass because the final result is weighted heavily in favour of the final paper. I’m not here to discuss bad course design and poorly constructed course learning outcomes; we’ll leave that for another post.

However, let’s take this example of lab attendance to explain the “value” of measuring attendance. If we measure lab attendance, we can set a requirement such as a student must attend 80% of the lab sessions. We must then measure attendance to determine whether each student meets this requirement to progress. This raises the question: do we value attendance, or do we value the knowledge and ability to perform that lab activity? Or do we value both? What if a student misses the lab but writes up the lab report, illustrating their knowledge of the topic? Should they get marks for this? Should course design allow this? Before answering, let’s flip this on its head: what if a student attends the lab but fails to demonstrate the knowledge or ability in the lab report? How do these two fictitious students compare in your opinion? Who deserves to pass?
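To make the 80% rule concrete, here is a minimal sketch of how such a progression check might look. All names, numbers and the threshold itself are illustrative assumptions, not details from any real course.

```python
# Illustrative sketch: checking a hypothetical "attend 80% of labs" rule.
# The threshold, session count and student data are all made up.

REQUIRED_FRACTION = 0.8   # e.g. "must attend 80% of the lab sessions"
TOTAL_SESSIONS = 10

def meets_requirement(sessions_attended, total=TOTAL_SESSIONS,
                      required=REQUIRED_FRACTION):
    """True if the student attended at least the required fraction of labs."""
    return sessions_attended / total >= required

# Fictitious attendance records: sessions attended out of TOTAL_SESSIONS.
attendance = {
    "student_a": 9,
    "student_b": 7,
}

for name, attended in attendance.items():
    status = "may progress" if meets_requirement(attended) else "falls short"
    print(name, status)
```

Note how crude the rule is: it counts presence only, and says nothing about whether the student can actually perform the lab activity, which is exactly the tension the two fictitious students above expose.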

Moving away from this type of scenario to a typical lecture: should we take attendance? Should we value attendance? Technology can make collecting attendance easier, but just because we can measure it, does that mean we should value it? In today’s world information is plentiful; it is not limited to the four walls of a classroom. Instead of focussing on classroom attendance, should we not be focussing on our lecturers, asking them to produce quality learning resources, e.g. short videos? We should be encouraging them to “flip the classroom”, engaging the students when they are in the classroom and moving away from the “lecture” towards a discussion. There are plenty of examples where flipping the classroom engages the students, and this will lead to better attendance.

It is widely accepted in the literature that engaging students leads to better student attainment. This is what we should value, and therefore this is what we should measure – not attendance at a lecture.

So I go back to my opening statement on whether we should be measuring attendance: “it depends”. And I return to my mantra: “we should measure what we value, not value what we measure”.

Learning Analytics ….. measure what you value

Learning analytics is not a new term in higher education. As far back as 2010, learning analytics was mentioned in the NMC Horizon Report. Last year I had the privilege of attending the LAK conference in Edinburgh, Scotland, which is run by a well-established international network of people researching learning analytics. So learning analytics is far from new. But, in my experience, until recently the conversations around student data and learning analytics were limited to a few people within institutions. In recent months, however, it has become a term that is gathering a lot of interest in everyday conversations within higher education institutions. More and more people are seeing the potential, and in some cases even realising the benefits, of learning analytics. Dr Bart Rienties, Prof Shane Dawson and Prof Dragan Gasevic are people I follow with great interest, each of them leading the way in their respective fields, researching different aspects of learning analytics. The National Forum for Teaching & Learning, an organisation supporting T&L in higher education in Ireland, has recently launched a project to raise awareness of learning analytics, so all of the indicators are that this is an area that is here to stay and will only get bigger in the years to come.

Personally speaking, I got involved in learning analytics in DCU nearly three years ago, examining whether there was a correlation between a student’s engagement with the VLE and their success rate in the module. As you might expect, there is a very strong correlation. But we took it one step further: using historical data, we built algorithms to predict a student’s success based on their current interactions, and we gave this information back to students in the hope of improving completion rates. More information can be found here. Without spoiling the surprise, while the project was a success, it only gave one piece of the jigsaw.
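The first step of that kind of analysis can be sketched in a few lines. This is not the DCU project’s actual method or data – the figures below are invented purely to illustrate the idea of correlating VLE activity with module marks and flagging low-activity students.

```python
# Illustrative sketch only: correlating VLE activity with module marks
# and producing a naive "early warning" flag. All data is fabricated.

import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up weekly VLE interactions per student and their final module marks.
logins = [5, 22, 9, 30, 14, 3, 25, 18]
marks  = [38, 68, 45, 74, 55, 35, 70, 60]

r = pearson_r(logins, marks)
print(f"correlation between VLE activity and marks: {r:.2f}")

# Crude risk flag: students whose activity falls below a threshold that,
# in a real project, would be derived from historical cohorts.
at_risk = [i for i, n in enumerate(logins) if n < 10]
print("students flagged at risk:", at_risk)
```

A real predictive model would go well beyond this – more signals than login counts, a properly fitted classifier, and validation against past cohorts – but even this toy version shows the shape of the pipeline: measure engagement, relate it to outcomes, feed a signal back to students.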

[Image: puzzle pieces illustrating student success]

Several different initiatives emerged from the findings of this project, each one giving a different piece of the jigsaw. The next series of posts will outline the various learning analytics projects that we have conducted in DCU.

In conclusion, learning analytics has proven to be a powerful tool that can help improve the learning experience of students in many ways, but each bit of data is only one piece of the jigsaw. The word of caution I would give to anyone interested in learning analytics is: measure what you value, don’t value what you measure!