In my last post I described four types of learning analytics products. Here I’ll go into more detail on some of the VLE-based engagement reporting tools. These products for Blackboard and Moodle sit within the virtual learning environment (VLE/LMS), look at its data only, and provide simple indications of a student’s progress, raising flags when a student appears to be at risk. Unlike many learning analytics tools they are aimed at teachers rather than at the students themselves or at managers within the institution.

Blackboard Retention Centre

Bundled with Blackboard Learn is Retention Centre, which provides a simple dashboard view of learner participation within a single course. The functionality evolved from earlier features in Learn such as an "early warning system". Retention Centre is primarily aimed at identifying students at risk of failure, based on rules set by teachers.

The key dashboard provides an overview of a single cohort on a course, enabling you to identify students at risk of failure. You can decide who you want to flag as at risk, add notes on individual learners, and contact them directly from the Retention Centre. It’s also possible from here to change due dates for assignments. There are four basic measurements of student engagement:

- Course activity: a measure of the time from a student’s first click on something in the course until he or she clicks outside the course or logs out.
- Grade: thresholds can be set for particular grades, flagging students who have fallen below that value - or those who have fallen a certain percentage below the class average.
- Course access: a flag is set when users fail to access the system within a defined number of days.
- Missed deadline: triggered when a deadline is not met within a set number of days or when a specified number of deadlines have been missed.

There are default rules for each of these (which can be customised):

- Course activity in the last week is 20% below average
- Grade is 25% below class average
- Last course access was more than 5 days ago
- There has been 1 missed deadline
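The rule logic is simple enough to sketch in a few lines of Python. The record structure and field names below are my own invention for illustration - Blackboard’s actual implementation is not public - but the thresholds are the four defaults above:

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    # Illustrative fields; Retention Centre's internal data model is not published.
    name: str
    weekly_activity_minutes: float   # time spent in the course this week
    grade_percent: float             # current overall grade
    days_since_last_access: int
    missed_deadlines: int

def risk_flags(student, class_avg_activity, class_avg_grade):
    """Evaluate rules modelled on Retention Centre's four default rules."""
    flags = []
    if student.weekly_activity_minutes < 0.8 * class_avg_activity:
        flags.append("course activity 20% below class average")
    if student.grade_percent < 0.75 * class_avg_grade:
        flags.append("grade 25% below class average")
    if student.days_since_last_access > 5:
        flags.append("last course access more than 5 days ago")
    if student.missed_deadlines >= 1:
        flags.append("missed deadline")
    return flags

s = StudentRecord("A. Learner", weekly_activity_minutes=40, grade_percent=48,
                  days_since_last_access=7, missed_deadlines=1)
print(risk_flags(s, class_avg_activity=60, class_avg_grade=62))
# -> flags for activity, course access and missed deadline (grade is within 25% of average)
```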
Another screen allows instructors to view a summary of their own participation in a course and to link directly to activities that are required of them, such as grading. Blackboard suggests that instructors use this functionality to prioritise areas of their courses which need attention. This is a kind of basic "teaching analytics" where instructors can see an overview of their activity and link easily to tasks such as marking or blogging.

The data on what actions you’ve taken through the Retention Centre as a teacher is available only to you. While this may give you confidence that no-one is snooping on your teaching activities, it limits the options for institutions which want to understand better how learning and teaching are taking place. Another limitation is that, as Retention Centre works only at the level of a course, you can’t get a view of programme-level or student-level activity.

Moodle options

Other than deploying a generic business intelligence system such as QlikView, Cognos or Tableau, Moodle-specific options include a few plugins for the system, in particular Engagement Analytics and Moodle Google Analytics, and a commercial option, Intelliboard. As an open source system Moodle allows users to develop their own plugins, and a number of institutions have built analytics tools using the data from user log files.

Currently these appear to be considerably less developed than the analytics capabilities of other virtual learning environments, notably Desire2Learn Brightspace and Blackboard Learn. The last release of Moodle Google Analytics was August 2013, though Engagement Analytics, initially released in August 2012, is still being maintained. As with Blackboard Retention Centre, the tools are primarily aimed at staff - not students as yet. Documentation is limited for all these options. Intelliboard’s product is clearly at an early stage of development, with an offer on its website that any paying customer can request additional reports for free.

Moodle Google Analytics takes the data available from Google Analytics and the web server logs and presents it in Moodle. It is thus more of a web analytics tool than a learning analytics one, though analysing how learners navigate through a course website may be of interest.

Engagement Analytics presents risk percentages for students based on three indicators:

- Login activity: how often and how recently are students logging in, and for how long?
- Forum activity: are students reading, posting and replying?
- Assessment activity: are students submitting their assessed work, and are they submitting on time?

You can configure the weighting of each indicator - e.g. 60% for logins, 30% for assessment and 10% for forums - depending on the relative importance of these activities in the course. You can also add other indicators such as downloading files.
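The plugin’s exact calculation isn’t spelled out in its documentation, but a weighted risk percentage of this kind can be sketched as follows (the sub-scores here are invented for the example):

```python
# Each indicator produces a risk sub-score between 0 (no risk) and 1 (high risk).
# The overall risk percentage is a weighted sum of the sub-scores.

WEIGHTS = {"login": 0.6, "assessment": 0.3, "forum": 0.1}  # must sum to 1.0

def overall_risk(sub_scores, weights=WEIGHTS):
    """Combine per-indicator risk sub-scores into a single risk percentage."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return 100 * sum(weights[k] * sub_scores[k] for k in weights)

# A student who rarely logs in but submits assessed work on time:
print(overall_risk({"login": 0.9, "assessment": 0.1, "forum": 0.5}))  # 62.0
```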
Limitations and take-up

For institutions using Blackboard or Moodle these tools provide simple ways of viewing engagement by students. It’s surprising that, given how long the VLEs have been in existence, it’s taken so long for such basic reporting facilities to emerge.

As I noted earlier, these systems use data from the VLE only; there’s no integration with student information or other systems. None of them appears to facilitate any automated interventions, so teachers have to decide what action to take based on the dashboards.

As Retention Centre comes bundled with Learn, no additional technical expertise is required to install and maintain this functionality - it merely has to be switched on for a particular course by the instructor. It should be relatively easy for a Moodle administrator to install the plugins.

It’s unclear how widespread the use of these tools is; however, many institutions are no doubt experimenting with Retention Centre. One university I spoke to found the interface "ugly" and the functionality not very useful, but I’m sure many teachers will find it gives them a better indication of students at risk. Retention Centre allows users to try out some basic reporting and analysis, perhaps later leading institutions to consider purchasing the much more sophisticated Blackboard Analytics for Learn or some business intelligence software.

As far as the Moodle tools are concerned, Intelliboard claims a few corporate clients on its website - none so far in the UK. It is not clear how many institutions are deploying the plugins, but the initial response to Engagement Analytics on the Moodle forums is positive and it’s been downloaded nearly 10,000 times from the moodle.org site.

Indicators of engagement

What is particularly interesting about these tools is to what extent their data provides an accurate indication of student engagement - which we know can correlate with measures of success.

Michael Feldstein points out that the four Retention Centre indicators of activity in the VLE are those considered most indicative of student success by John Campbell, the inventor of Purdue’s Course Signals. But how do we know that the Retention Centre indicators are more accurate than those measuring login, forum and assessment activity in Engagement Analytics? Courses have different types and balances of content, communication and assessment - and the tools recognise this by allowing you to customise the indicators. However, there are all sorts of other factors at play, such as the features of the software itself, the alternative ways that students have to communicate, the institutional context and nature of the student body, and the extent to which the teacher encourages students to use the tools.

Learning analytics is an inexact science and there will always be individuals who perform differently from how we think they will. Monika Andergassen and her colleagues at the Vienna University of Economics and Business found correlations between time spent in the VLE and final grade, and between self-assessment exercises undertaken and final grade. The correlations in both cases, though, were modest, and the repeated solving of the same exercises didn’t correlate with better results - implying, unsurprisingly, that what you do online may be more important than how long you spend in the VLE.

Various people I’ve spoken to on my recent visits to UK institutions believe that the more information we have about students, the more accurately we’ll be able to predict their success or likelihood of dropout. A key question is whether adding all the possible sources of data we have will make a significant difference to the accuracy of these predictions. Will a few indicators from the VLE be sufficient to pick up the majority of your students who are at risk, or do we need more sophisticated predictive analytics software which also takes data from the student information system and elsewhere?

This post was first published on the Jisc Effective Learning Analytics blog, 3rd Oct 2014. Images are from https://docs.moodle.org/22/en/report/analytics/index and are copyright Kim Edgar, available under a GNU General Public License.
Niall Sclater . Blog . Aug 19, 2015 01:12am
I’ve described some of the basic reporting functionality available for Moodle and Blackboard, but this is just scratching the surface of what is possible with learning analytics. In this post I’ll look at ways in which analytics from other data sources, such as video and e-books, are being brought into the VLE to help gain a better picture of student engagement. I’ll also describe a system which makes analytics on their activities available to the learners themselves - and helps staff to manage subsequent interventions.

Adding analytics from e-book usage

While most learning analytics systems use data available in the virtual learning environment to measure engagement, VitalSource CourseSmart Analytics delves into students’ use of e-books in order to help "improve retention, control costs and improve learning outcomes". VitalSource is a company which rents out textbooks and enables usage monitoring through its own e-book reader software. The system assesses engagement using various metrics including the duration of the e-book reading session, the number of pages viewed, and activities such as highlighting or taking notes.

The (as yet fairly simplistic) analysis is presented in a dashboard which can be viewed from within the virtual learning environment. It includes an "engagement index" derived from the usage data. The company claims that its research shows this index to be a stronger predictor of student academic outcomes than prior educational attainment. The dashboards are built on top of the GoodData business intelligence platform, and the software uses the IMS Learning Tools Interoperability framework to make data available to the VLEs.

Various US institutions which have deployed the software are profiled on the company website and in some white papers. The suggested users of the product are: instructors (to assess performance and intervene as appropriate), deans and other senior staff (to assess the uptake and effectiveness of e-books), and publishers (to assess the relative impact of digital course materials in order to improve their offerings). There’s no suggestion by VitalSource that learners would benefit from viewing the data, and whether students are comfortable about having their e-book usage analysed in this way is another matter. I did some thinking about this a while back in: Making ebooks more interactive: logistics and ethics.

Analytics from video viewing and contributing

One thing not handled very well by most VLEs is video. Various plugins have emerged to deal with this, and notable amongst them is Kaltura, an open source system available for all the main VLE products. Kaltura deals with the content management aspects of hosting videos, enabling contributions by students as well as staff. It also provides analytics on the viewing and contributing of video clips. This allows staff to see:

- Which videos are students watching the most?
- Which students contribute the most videos?
- Which students watch the most videos?
- How long are students watching each video?

This information can certainly help you discover what your most engaging video content is. It can also give some indication of engagement for individual students, both in viewing and posting. A table shows the top five most engaged users, including how many videos they watched, how many minutes they spent, what their average view time was, and their "average drop-off" percentage, i.e. how much of the videos they actually watched.
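Kaltura doesn’t publish the formula, but an "average drop-off" figure of this sort could be computed along these lines (the log format is invented for illustration):

```python
# Each viewing event records how far into a video the student got.
# views: list of (seconds_watched, video_length_seconds) tuples for one student.

def average_drop_off(views):
    """Average percentage of each video left unwatched."""
    if not views:
        return 0.0
    completion = [watched / length for watched, length in views]
    return 100 * (1 - sum(completion) / len(completion))

# A student who watched 3 minutes of a 10-minute clip and all of a 5-minute clip:
print(average_drop_off([(180, 600), (300, 300)]))  # 35.0
```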
This is very limited, though, for the purposes of learning analytics; a natural evolution for the functionality would be to produce an indicator of student engagement in viewing video (and contributing it, if appropriate). In a similar way to the CourseSmart Analytics engagement index, this could be made available to the VLE together with other engagement data to build a fuller picture of student participation in courses with high video content or a requirement to contribute video. The Kaltura website lists a number of high profile US universities as customers, together with Durham and Manchester Metropolitan in the UK.

Course Signals

A more sophisticated engagement reporting system than the ones I’ve described so far for Moodle and Blackboard is Course Signals. This was originally developed at Purdue University and is now one of five products which comprise the Ellucian Student Success Suite. It’s received much publicity due to claims of dramatic positive correlations between use of the software and measures of student success. Retention in particular was claimed to be 21% higher among students who had used the system. However, Purdue’s findings were subsequently challenged as not necessarily demonstrating a causal relationship.

Like the other simple VLE reporting tools, Course Signals provides indicators of whether students are on track to complete their course, based on their online participation. The software was built to integrate with VLEs including Blackboard Learn and Desire2Learn Brightspace.

At the heart of the system are traffic light indicators, displayed in the VLE, which tell students if they are performing well, holding steady or underperforming, and prompt them to take action. So a staff member might specify a minimum performance requirement for a particular course. If a student’s performance falls below this, a red signal is shown on the staff dashboard and an email sent to the student. A yellow signal shows that a student’s performance is approaching the minimum acceptable level, while green suggests that a student is doing what is required to pass. The main metrics are:

- Grade: allows you to set value ranges for grades which equate to red, yellow and green signals.
- Effort: a measure of how much a student uses specified course resources in the VLE within a specified date range.

It’s possible to filter students based on the red/yellow/green "signals" that are generated for grade and effort, for example those who have worsened in a class or those who have red signals in two or more classes.

There are at least five things which make Course Signals much more sophisticated than the basic reporting tools for Moodle and Blackboard:

- it can bring in data from outside the VLE such as prior qualifications
- it presents the indicators to students as well as staff
- it works across courses rather than just on a single course
- it can trigger automated interventions
- it adds workflow management for interventions

Tools are provided for detecting at-risk students, triggering a response, setting up and tracking an action plan and monitoring its success. It’s possible to tailor and track your communications using the system, and to add automated event-triggered reminders to students to carry out specified tasks. With this level of workflow management, Course Signals begins to have the feel of a customer relationship management system rather than a simple VLE reporting system.
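A minimal sketch of the traffic-light idea, assuming instructor-defined grade bands (the thresholds here are illustrative - the real system also folds in effort and other data):

```python
def grade_signal(grade_percent, red_below=50, yellow_below=60):
    """Map a grade to a red/yellow/green signal using instructor-set bands."""
    if grade_percent < red_below:
        return "red"      # below minimum acceptable level: intervene
    if grade_percent < yellow_below:
        return "yellow"   # approaching the minimum acceptable level
    return "green"        # on track to pass

for g in (45, 55, 72):
    print(g, grade_signal(g))
# 45 red / 55 yellow / 72 green
```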
The indicators of engagement, with all the potential data sources behind them, are boiled down to simple red/yellow/green traffic lights for grade and effort. But these then trigger a range of automated and human interventions and communications which are tracked by the system. You can have all the metrics you like for measuring engagement, but effective management of interventions is what could really start to make an impact on student outcomes across an institution.
Niall Sclater . Blog . Aug 19, 2015 01:12am
Vendors are rapidly developing products for analysing learners and their activities. There’s a battle going on between the companies whose primary product is the VLE, those which sell student information systems (SISs), and those which have developed business intelligence systems for industry in general but are now targeting the education sector as a key market.

In previous posts I’ve given an overview of the different types of learning analytics systems available and described some of the simple reporting tools for the VLE/LMS. I’ve also looked at how data can be brought in from other sources to provide more sophisticated indicators of engagement, and how systems are beginning to build in workflow around managing the subsequent automated and human interventions. This post is about the emerging analytics systems built around the VLE - the ones with an online learning-centric view of the world.

Given the vast amount of data accumulated by students as they use their VLE, and the fact that for many learners this is the primary way in which they interact with their institution online, it’s not surprising that VLE vendors have spotted an opportunity to develop products which use this data to help institutions improve student success. But VLE log files are only part of the picture. Much of the rest of the data needed for learning analytics is held in the SIS. Vendors such as Blackboard and Desire2Learn have therefore been building in functionality to combine data from some of the most commonly used SISs with data from their own systems, and developing ever more sophisticated analytics products to sit alongside their VLEs.

Blackboard Analytics for Learn

This product is designed to provide more in-depth analysis of student activity and performance than is available in the Blackboard Retention Centre, enabling the correlation of student behaviours with educational outcomes. Whereas Retention Centre works at the level of the course and uses data from Learn only, Analytics for Learn integrates data from the VLE with the student information system and allows analysis of individuals, groups and instructors across multiple courses by staff such as deans, senior management and educational researchers.

The software facilitates the analysis of student activity and performance patterns, the identification of at-risk student behaviour, and the measurement of learning outcomes against course grades. It also allows the tracking of KPIs through institutional dashboards. Blackboard gives some examples of the questions which Analytics for Learn can help answer:

- How does student performance differ between courses where the VLE is used and those where it is not?
- Is there a difference in student performance between courses where the instructor went through a training class and those where they did not?
- How are our efforts at improving course quality making a difference?

The second question highlights another key difference from Retention Centre. Analytics for Learn allows the analysis of teachers’ performance as well - something that will prove controversial in many institutions but will be increasingly of interest to senior managers who wish to monitor the quality of teaching in ways that were never possible in traditional classroom settings.
A whole host of reports are provided on areas such as activity and gradebook exceptions, student performance against learning outcomes, organisational unit performance against learning outcomes, "student-at-a-glance", "course-at-a-glance", most active instructors, and aggregated activity by organisational unit. Learning-related metrics include:

- gradebook scores
- submissions to interactive tools such as discussions
- session and course logins; time, tools and content accessed; time on task
- organisational unit and term information for aggregation and comparison
- "academic standing"

Unlike Course Signals, the software is for carrying out analysis alone: Blackboard has chosen not to facilitate interventions or any workflow features at this stage through Analytics for Learn.

Analytics for Learn is part of the Blackboard Analytics Suite, which consists of a data warehouse with data models, a data transformation process and "out of the box" integration with Blackboard Learn and three student information systems: Oracle PeopleSoft, Datatel, Inc. and SunGard Higher Education. Integration with other student information systems is, according to Blackboard documentation, "easy". Good luck to you, though, if you have a bespoke SIS. The software is of course only for users of Blackboard Learn.

Visualisations can be done with the Pyramid business intelligence tool, which integrates with the data warehouse; other BI tools can be used as well. Blackboard provides consultancy for implementing Analytics for Learn and performs the initial installation, which takes "several days". Institutions must have skills in the core technologies such as SQL Server and Windows Server.

Desire2Learn Insights

Desire2Learn has recently rebranded its VLE package as Brightspace. Built on top of this is a system called Insights, which enables reporting on engagement, assessment and outcomes. It includes predictive modelling aimed at helping identify at-risk learners, and a large number of data visualisation and analytics dashboards are available. Insights includes the Student Success System, which is specifically aimed at enhancing retention. Target users are instructors, educational researchers and senior staff such as deans.

Desire2Learn has done some rapid development and added an impressive array of analytics capability to Insights in a short space of time. It provides a number of pre-configured reports, including some on learning outcomes and how well students have met them. "Competency" reports are available for individuals and at other levels, including across courses. Reports can also be obtained on the use of course resources and on "engagement" by users, measured by their logins to the VLE. Other reports are available on enrolments and withdrawals, course grades, quiz scores, and statistics on the quiz items themselves, which may help to identify questions that discriminate well or poorly between the strongest and weakest students.

There are further reports on students at risk, showing differences between courses and highlighting individuals. There is also a series of "data mining reports", enabling insight into student behaviour, grade achievements and usage patterns - as well as the effectiveness of learning content. Reports can be obtained on tool access by users. Predictive analytics are deployed to assess the risk levels of students using historical course data. Weekly predictions are then prepared, generating a "success index" for each student in the course.
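Desire2Learn hasn’t published the model behind the success index, but the general pattern - train on historical course data, then score the current cohort weekly - might look something like this sketch using scikit-learn, with invented feature names and data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical course data: one row per past student, e.g.
# [logins_per_week, content_items_accessed, quiz_average, discussion_posts]
X_hist = np.array([[5, 40, 72, 8], [1, 10, 38, 0], [3, 25, 55, 2], [6, 50, 81, 12]])
y_hist = np.array([1, 0, 0, 1])  # 1 = completed the course successfully

model = LogisticRegression().fit(X_hist, y_hist)

# Each week, score the current cohort: the predicted probability of
# success becomes that week's "success index" for each student.
current = np.array([[2, 15, 45, 1], [5, 35, 70, 6]])
for student, p in zip(("Student A", "Student B"),
                      model.predict_proba(current)[:, 1]):
    print(f"{student}: success index {p:.2f}")
```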
An "assessments predictive chart" shows a student’s performance across all course assessments and compares it with that of their classmates. Arrows inside the trend signs indicate whether predictions are negative or positive compared with the previous week. The dashboard can be filtered based on success levels.

Unlike Analytics for Learn, the system does facilitate some interventions and workflow management. Emails can be directed to all students in a particular risk category, for example, and a pin can be added as a visual reminder beside those students the instructor wishes to follow up with.

Insights is tightly integrated with the Brightspace virtual learning environment (formerly Desire2Learn), and built using the IBM Cognos business intelligence system. Data comes from "multiple source systems" - in particular the Brightspace VLE. Austin Peay State University reports that it is integrating data from its student information system. A large number of metrics are provided relating to course access, content access, social learning, assessments and preparedness. It really is becoming quite an impressive offering. What a pity that hardly anyone this side of the Atlantic is using Brightspace…

Further thoughts

Most higher and further education institutions in the UK use Moodle or Blackboard Learn. If you’re using Moodle and you want to carry out in-depth learning analytics, there are no sophisticated Moodle-specific analytics systems currently available, and you may need to use a business intelligence tool.

Moodle hosting organisations are beginning to provide their own reporting functionality to their clients. Two of these are the University of London Computer Centre (ULCC) and MoodleRooms, headquartered in Baltimore, Maryland. MoodleRooms’ services include the reporting of activity, grades and engagement proxies, flagging lack of activity and low grades - both within and across courses. MoodleRooms also claims to provide "a 360 degree view of your students by integrating with industry-wide SIS systems", so it is basing its analytics on more than just data from Moodle.

Both ULCC and MoodleRooms no doubt see learning analytics as a key potential differentiator for their services and will be looking to develop their reporting and analytics tools further and deploy them across their growing client bases. Already 33% of UK higher education institutions are using hosted services for their institutional VLE (see the UCISA Technology Enhanced Learning Survey 2014). Learning analytics provided alongside such hosted services will no doubt prove attractive for many institutions, particularly smaller ones without data scientists or sufficient technical expertise to install and maintain their own analytics systems. On the other hand, one institution I have spoken to wants to get usage data from its VLE hosting service into its own data warehouse, alongside other institutional data, to carry out its own analytics.

Blackboard Learn users can deploy Analytics for Learn if they want more than the simple tools offered in Retention Centre. But not many institutions have yet used Analytics for Learn in anger - certainly not in the UK. One Blackboard presentation mentions three organisations where it’s being piloted: Montgomery County Community College, University of Maryland Baltimore County and Grand Rapids Community College.
Given the necessity to enhance retention in many universities and colleges, and the high penetration of Blackboard Learn, there are likely to be many more institutions investigating purchasing the software.

Meanwhile, only two of the 96 UK higher education institutions which completed the UCISA technology enhanced learning survey in 2014 reported using Brightspace as their primary virtual learning environment. It’s not clear whether either of these is yet using Insights.

So is the battle for supremacy in learning analytics systems being lost by the VLE vendors? It would seem so at the moment - at least in the UK. Certainly many educational institutions are already using business intelligence software for their broader requirements and are beginning to explore its potential for analysing learners and learning. The business intelligence software is necessary anyway: Analytics for Learn is built on top of the Microsoft business intelligence stack, and Insights requires IBM Cognos.

If only the VLE-centric analytics products could be made more interoperable, then Moodle users, for example, could plug in Analytics for Learn or Insights and benefit from the innovations of companies based in the world of online learning - as well as benefitting from the underlying business intelligence software and visualisation tools. Meanwhile the SIS vendors are moving quickly into the market and trying to promote an SIS-centric view of learning analytics, some of them even co-developing their products with UK universities.
Niall Sclater . Blog . Aug 19, 2015 01:11am
While VLE vendors would like their systems to be at the heart of an institution’s learning analytics efforts, vendors of student information systems (SISs) are now attempting to place their products in the analytics driving seat.

The SIS (also known as the student record system) is a vital product for any educational institution. It includes data such as prior educational qualifications, ethnicity and gender, the courses and modules in which students are enrolled, records of absence, and assessment results. More sophisticated SISs can cover many other aspects of the institution’s business. All of them provide potentially helpful data for learning analytics, so it’s no surprise that SIS vendors are developing products to help analyse this data - and integrate it with data from other sources such as the VLE.

In the UK, some of the biggest players in the higher education SIS market are Tribal and Ellucian. Meanwhile in further education, use of Compass’s "Pro" suite of tools is widespread. In this post I’ll be examining how the learning analytics offerings of these three vendors are shaping up.

Ellucian Student Retention Performance

Ellucian Student Success includes five "solutions" designed to integrate with Ellucian’s Banner Student SIS. In an earlier post I’ve already described one of them: Course Signals. Another, Student Retention Performance, aims to help identify at-risk students, examine retention and degree completion rates, and analyse the effectiveness of retention strategies. In a complex hierarchy of products, Student Success is one of three modules which comprise the Ellucian Enterprise CRM (customer relationship management system). We’ll ignore the rest of them just now and assume that your institution uses Banner Student as its SIS.

Ellucian is aiming Student Retention Performance at "academics, advisors, institutional researchers, administrators and executives". It comprises scorecards, dashboards, reports and "analytic capabilities" aimed at improving student retention and success initiatives. Scorecards provide a visual way of measuring progress towards institutional goals and objectives. In one example that is given, "Improve Academic Performance" is the institutional goal, with a measurable objective of "Increase Number of Undergraduate Students in Good Standing". Figures for the number of students in good standing are presented as a key performance indicator, together with a status indicator and a trend. Further visualisations such as historical performance can also be generated. There are various preconfigured scorecards and dashboards with suggested metrics; however, it’s possible to create other customised metrics based on the data in Banner Student.

By some definitions this is higher-level business intelligence or academic analytics rather than learning analytics, but it’s possible to drill down to the performance of individual students too. Such products don’t fit neatly into convenient categories for researchers attempting to map out the different types of analytics processes and tools that are emerging. One thing is clear though: Student Retention Performance is not designed as an intervention tool; you’d need to use Course Signals if you want to automate and manage actions by staff.

Student Retention Performance uses IBM Cognos. Data appears to be fed from Banner Student only - I cannot find any reference to integration with a VLE product as yet, which makes it somewhat limited. It may be, though, that combining this tool with Course Signals, which does integrate data from other systems, would help.
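To illustrate the kind of scorecard metric involved, here’s a toy calculation of the "students in good standing" KPI with a status and trend indicator (the target figure and thresholds are invented; Ellucian’s product will compute these differently):

```python
def kpi_scorecard(in_good_standing, total, previous_pct, target_pct=80):
    """Compute a percentage KPI with a simple status and trend indicator."""
    pct = 100 * in_good_standing / total
    status = "on target" if pct >= target_pct else "below target"
    trend = "up" if pct > previous_pct else "down" if pct < previous_pct else "flat"
    return pct, status, trend

pct, status, trend = kpi_scorecard(in_good_standing=1520, total=2000, previous_pct=74.0)
print(f"{pct:.1f}% in good standing ({status}, trend {trend})")
# 76.0% in good standing (below target, trend up)
```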
Tribal Student Insight

Tribal is currently developing an analytics offering for its SITS:Vision SIS product with the University of Wolverhampton. Called Student Insight, it aims to predict student performance and identify at-risk students so that interventions can be carried out, subsequent student activity tracked and - interestingly - the impact of those interventions assessed. Few analytics products so far offer this facility.

Tribal refers to library and VLE data being integrated together with demographic data and assessment results. The product is intended to provide information to staff on issues the student might have, such as preparation for higher education, engagement, "academic integration" and "social integration" - presumably producing metrics for each of these. It will allow comparison with other students across cohorts too.

The company claims that its product can predict student success with an accuracy of 70% - some room for improvement there then, still. It has found in its research with Wolverhampton that VLE usage is highly predictive of student success. We already know that there is a correlation, but it will be interesting to find out what aspects of VLE usage they are measuring - and whether the metric is based on more than a simple analysis of VLE click data from the log files. Finally, they state that social background and the distance students live from the campus are strongly correlated with their success when combined with other factors. All of these metrics are combined into a single overall "Student Success Prediction" metric.

Compass ProMonitor

The final SIS-based product I’ll look at here is widely used in further education and in some respects is ahead of the SIS-centric offerings being developed for higher education. One of the products within the SIS range from Compass Computer Consultants Ltd is ProMonitor. This presents key details about the learner; enables teachers to record meetings and input comments, track them and follow them up; and provides automated email notifications. There are reports on assessments, and others which show student performance against the targets set for them. Other data sources can be integrated, and at-risk students can be identified.

This has much in common with a customer relationship management system and facilitates the workflow around interventions as well as various visualisations. It’s unclear how the at-risk metric is calculated, but a more sophisticated predictive analytics engine might help in this regard.

Conclusions

The SIS-centric analytics tools are still in their early stages of development, let alone being used in anger by many UK institutions. The three I’ve profiled here all require you to be using that particular SIS, so if you’re not, you’ll need to look at a VLE-centric analytics product or a business intelligence system. In summary:

- Student Retention Performance could be used alongside Course Signals, if you’re a Banner Student institution, to carry out some interesting analysis together with some automated interventions and workflow management.
- Student Insight looks promising for Tribal SITS:Vision users, but you’ll have to wait till it’s released to assess its full potential alongside other types of products.
- Further education colleges are already using ProMonitor and beginning to understand how they can use it to monitor student engagement, but the system so far lacks some of the tools of emerging analytics solutions, such as sophisticated predictive models.
I’ve suggested in an earlier post that VLE-centric analytics products are not currently winning the battle for market share in the UK. Well, nor it seems are the SIS-based ones at the moment, with the exception perhaps of ProMonitor in the further education sector. That leaves us still to investigate to what extent the generic business intelligence tools such as Autonomy, Cognos, QlikView and Tableau are being successfully customised as learning analytics systems.
Niall Sclater . Blog . Aug 19, 2015 01:10am
In previous posts I’ve looked at the emerging tools for monitoring user engagement and adding more sophisticated engagement metrics. I’ve also examined the learning analytics systems being offered by vendors of virtual learning environments (learning management systems) and student information systems. VLEs and SISs are the places where most data about students and their learning is held, so it can make sense to deploy a system built on top of one of the key systems already in use in your institution.

Image: BME Solutions Ltd

But many universities and colleges are using generic business intelligence (BI) systems to obtain insight on various aspects of their operations - and are beginning to use them for analysing the student learning experience too. I’m going to look at three of the most widely used systems here: IBM Cognos, QlikView and Tableau.

Cognos is a well-developed BI system in use by many educational institutions to help analyse all aspects of their business, including finance, enrolments, human resources and student success. The product is marketed as a comprehensive BI tool rather than being aimed primarily at retention or student success. With the incorporation of SPSS Modeler and the SPSS statistical engine within Cognos, users can perform predictive analytics alongside their other BI functions.

QlikView is marketed as an easy-to-use BI system (and is reported to be so by some of the institutional users I’ve met, as well as being relatively inexpensive) and already has significant adoption in education. Some institutions have been jointly building a number of educational apps with the company, which are available to users of the system.

Tableau is another generic BI and visualisation tool which is being used for analysing many aspects of business, including learning, at educational institutions worldwide.

What do the products do?

Cognos is a generic solution for all aspects of business intelligence and analytics. Its core functionality is described as enabling users to answer key business questions and to:

- easily view, assemble and personalise information
- explore all types of information from all angles to assess the current business situation
- analyse facts and anticipate tactical and strategic implications by simply shifting from viewing to more advanced, predictive or what-if analysis
- collaborate to establish decision networks to share insights and drive towards a collective intelligence
- provide transparency and accountability to drive alignment and consensus
- communicate and coordinate tasks to engage the right people at the right time
- access information and take action anywhere, taking advantage of mobile devices and real-time analytics
- integrate and link analytics in everyday work to business workflow and process

This list could be used to describe much of the functionality present in the other products. QlikView allows users to integrate data from a variety of sources and develop dashboards and visualisations of areas such as student performance and retention, easily breaking these down by attributes such as age, gender, course and faculty. It does not, however, deal with workflow and process. The largest UK reseller of QlikView, BME Solutions, has developed a range of education-specific applications for the system which make it relatively easy for an institution to monitor engagement and other metrics.

Image: BME Solutions Ltd

Because of the generic nature of all three of these systems, any required metrics can be generated from the underlying data.
QlikView provides "models" with a number of educational metrics such as NSS scores and an "at risk" indicator. No customised learning analytics applications appear to be offered with Tableau; a range of visualisations can be created using a drag-and-drop interface once the data has been linked to the system. At the University of Kansas, academic advisors are given a dashboard of individual students built on Tableau, synthesising information from various databases. The tool has been customised to enable communication with students who fall into particular categories, such as those who are not performing well or who have not had an advising appointment. Interventions such as emails to groups of users or students at risk can also be triggered based on analytics in all three products.

Who are the systems aimed at?

Because of the generic nature of all three systems, dashboards, scorecards and reports can be generated for any type of user across the organisation.

How extensively are they being used?

Already in use by many universities and colleges for a number of years (Sheffield Hallam was using it as far back as 2006, for example), Cognos is increasingly being seen as a solution for learning analytics. IBM is working closely with various universities in the UK to customise Cognos for their needs, notably London South Bank University.

36 universities in the UK use QlikView, and there is a BME Solutions QlikView UK user group which meets every six months for institutions to share experience. BME also has a small number of customers in UK further education.

Though it has a smaller market share in the UK than Cognos and QlikView, many institutions worldwide are using Tableau, including various Ivy League universities. It is not clear to what extent they are using the product for learning analytics rather than other functions such as enrolment and fund-raising. However, I do know of at least one UK university which has been trialling it for learning analytics visualisations.

Technical architecture

Cognos is built on a service-oriented architecture and runs alongside a data warehouse which is populated from key systems. A database is packaged with Tableau and sits alongside it; data can be visualised from the local database or directly from other sources, which can include Oracle databases and Excel spreadsheets. With QlikView, data is stored "in memory" rather than in a data warehouse, and the dashboards store the front end plus the entire dataset they are working with.

Final thoughts

Most UK universities have at least one generic BI tool in use for the monitoring and planning of their business activities. According to the UCISA CIS survey, SAP Business Objects is used by 22% of institutions, Microsoft PerformancePoint by 19% and Oracle’s BI systems by 14%. Considering that Cognos, QlikView and Tableau have a lower market share than these others, it’s interesting that they are gaining such prominence in the learning analytics field. Meanwhile I’m not hearing of the SAP, Microsoft and Oracle products being extensively used as yet for learning analytics, despite their penetration of the BI market in higher education.

BI systems do, however, already provide the underlying capability for some of the learning analytics solutions available. Blackboard Analytics for Learn uses Microsoft PerformancePoint to build its dashboards. Cognos sits under Desire2Learn’s Insights learning analytics product and Ellucian’s Banner Student Retention Performance.
Tools such as QlikView and Tableau are reportedly easy to use and relatively cheap to deploy. Sitting outside the VLE and the SIS has both pros and cons. Unlike the new learning analytics products emerging from VLE and SIS vendors, BI systems are so generic that they may require considerable customisation, both to the institutional context and for carrying out learning analytics specifically. The educational applications and the active communities around some products (notably QlikView) may make that contextualisation much easier. Meanwhile, not being tied closely into another product has the benefit of reducing vendor lock-in.
Niall Sclater . Blog . Aug 19, 2015 01:09am
When I blogged about learning analytics systems being developed by vendors of student information systems (SISs), I mentioned Tribal’s emerging system, Student Insight, which it’s been developing with the University of Wolverhampton. Tribal’s SITS:Vision product is used by over half of UK higher education institutions. Meanwhile the company’s ebs4 SIS (or MIS as it’s known in further education) is in use by more than 120 colleges in England and Wales. There wasn’t much available on the web about the project, so I contacted the company’s Data Scientist, Chris Ballard, to find out more.

Niall: What’s the current state of the product?

Chris: Student Insight has emerged from R&D carried out with the University of Wolverhampton. We’re currently working with them and two or three other institutions to validate the capabilities of the product - and next year we plan to expand the user base considerably. The system’s available now for early adopters.

Niall: How would you describe its functionality?

Chris: Student Insight enables customers to make the best use of the data they hold - not just in SITS but from external data sources, in particular the institution’s VLE or library. Firstly, it allows you to build a model or set of models to predict student risk. We’re focussing at the moment on retention and academic performance risk, in other words the risk of dropping out or failing a module. We use the data in SITS to build this model and supplement it with data relating to engagement. It also shows historic data about the student and enables you to aggregate risk predictions up to module, course and faculty level, as well as bringing in performance, engagement and retention data at each of these levels.

We feel it’s important that an institution is able to use student predictions and analytics of historic data side by side in order to make an informed decision about what should happen. Aggregating this data allows institutions to use the analytics to understand student risk and outcomes at course level, as well as for individual student support. This enables us to meet the needs of those institutions which want a "top down" approach to monitoring student cohorts, as well as a "bottom up" approach where it’s used to proactively identify specific students who may be at risk. The predictive aspect of the system is customisable - we recognise that every institution has different datasets and differing requirements.

Niall: What technologies are you using? Is there an underlying business intelligence system, for example?

Chris: It’s delivered as a software-as-a-service cloud-based system - and also available for on-site installation. We use open source technology and data science libraries built by our team. The database is MongoDB, the widely used NoSQL platform. It’s a document-oriented database, which we use because it enables flexible modelling of the different structures at an institution.

The predictive engine is built on top of well-established machine learning libraries implemented in Python. We use a machine learning ensemble technique to combine multiple predictive models to produce an overall prediction for individuals and groups. We’re not giving you a black box here but trying to make things transparent to the user. It shows whether there’s a high risk of underperformance and lets you drill down into that to get predictions on individual elements of performance.

Niall: Is it built just to work with SITS or can it integrate with other student information systems?
Chris: Actually it could work with any SIS. You can push data to it in a number of ways: by importing CSV files, using the system’s API, or pulling in data by directly interrogating a database. When we were designing the system, a priority was for it to be open so that it could be used with all of Tribal’s education management systems, though our initial focus is on SITS customers. We decided to provide an API because we recognised that institutions had different approaches and might want to deliver analytics at different levels. Many institutions are keen to start at a more aggregated level and to embed the analytics in existing portals. We’re working closely with early adopter partners because we want to make sure institutions can properly interpret the data they see - and we’re using the results to continuously refine the product.

Niall: What VLEs and other systems does it integrate with?

Chris: You can basically pull in any dataset. If you wanted to integrate Moodle or Blackboard you’d need to extract data from the user log files. This raw data is then loaded into Insight and processed to extract higher-level predictive features. The next step is to have API integration with key VLEs so we can pull it in directly. Most institutions seem to be interested in using SIS data first and then starting to integrate data from their VLE.

Niall: What about the metrics it produces? Can you tell me a bit more about them?

Chris: The two main areas, as I mentioned earlier, are the prediction of retention risk and of academic performance risk. Student Insight allows you to define datasets and map them to factors you determine, such as information you know about the student when they enrol, like their prior results. The system builds a machine learning ensemble against each dataset - you supply it with a set of historic training data and the outcome you know happened historically. In the case of academic performance that might be whether each student passed or failed the module, or achieved a certain grade. The models the system builds learn the patterns in the underlying data, which are then used to predict the outcome you’ve trained it against.

At its heart it’s a very generic machine learning data modelling tool. We’re not enforcing any fixed models, but we’re providing a framework that allows a set of predictive models to be built up and then delivered to different users, using role-based security, through dashboards built using HTML5. But we’re developing a standard set of models that can then be customised and automatically optimised for the institution using them.

Niall: Some of the learning analytics systems available are now building in workflow so that you can manage interventions and feed data on the effectiveness of those interventions back into the models. Does Student Insight do that at all?

Chris: Yes, managing interventions is a key part of the system. It allows you to flag cohorts of students or individuals at risk and then undertake an intervention. It integrates with another of our products called ESD (Enterprise Service Desk), which allows an institution to manage student support processes and also provides helpdesk capabilities. Intervention delivery can be managed and monitored by the student support team using ESD, based on the institution’s support workflow and policies. On our product roadmap is to consume what happens as a result of the intervention: we’ll look at the history of predicted outcomes for the student and then feed that back into the predictive model.
Niall: Well that’s helped to fill in the gap in my knowledge about Student Insight, which pretty much completes my review of the learning analytics systems available.  Thanks very much, Chris - and please keep us posted about how the product is developing.
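As a postscript: Tribal hasn’t published its implementation, but the kind of ensemble Chris describes - several models trained on historic outcomes, their probability estimates combined into a single prediction - can be sketched with scikit-learn (the features and data here are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression

# Historic training data: enrolment information plus engagement features,
# e.g. [prior_results_score, vle_logins_per_week, library_loans_per_term]
X = np.array([[55, 1, 0], [72, 4, 3], [40, 0, 1], [68, 5, 2],
              [48, 2, 0], [80, 6, 4], [35, 1, 0], [60, 3, 2]])
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])  # 1 = passed the module historically

# An ensemble combines several models' probability estimates into one prediction.
ensemble = VotingClassifier(
    estimators=[("logit", LogisticRegression()),
                ("forest", RandomForestClassifier(n_estimators=50, random_state=0))],
    voting="soft",  # average predicted probabilities rather than class votes
).fit(X, y)

# Probability of passing for a current student, usable as a risk score:
print(ensemble.predict_proba([[50, 2, 1]])[0, 1])
```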
Niall Sclater . Blog . Aug 19, 2015 01:09am
I’m back from yesterday’s excellent Workshop on Ethics & Privacy Issues in the Application of Learning Analytics in Utrecht, organised by LACE and SURF. Hendrik Drachsler from the Open University of the Netherlands kicked off the session by presenting a background to learning analytics and some of the resulting ethical and privacy issues. He mentioned the situation in the Netherlands, where universities are now partially funded on the basis of how many students they graduate, and concerns that this gives them an incentive not to accept students who are predicted to fail.

He also discussed the InBloom debacle in the US - "a perfect example of not taking care about privacy issues". There was another situation in the Netherlands where an app used on tablets in schools collected data on which further analysis was carried out. Problems arose because this analysis wasn’t described in the terms and conditions of use.

Hendrik mentioned that his call for ethical and privacy issues in the application of learning analytics had produced over 100 issues. These were then put into four categories: privacy, ethics, data and transparency. The aim of the day was to discuss these issues and begin to look for solutions to them. The group decided that there are often no clear boundaries between these categories. Certainly I’ve found it artificial to try to split legal issues from ethical ones when carrying out my recent literature review of the area. Much of the law is based on ethics - and sometimes an ethical stance has to be applied when interpreting the law in particular situations.

The workshop wasn’t purely full of Hendriks, but a second Hendrik, Hendrik vom Lehn, then gave an informative presentation on practical considerations around some of the legal issues arising from learning analytics. Much of what he said, and of the subsequent discussions during the day, related to the EU Data Protection Directive. Hendrik thought that a common misconception concerns the scope of "personal data" under the Directive: it is much broader than most people think, and includes absolutely everything which makes data personally identifiable.

Another interesting concept in the Directive is that data can potentially be processed without consent from individuals if it is in the "legitimate interest" of the organisation to do so. However, in practice it’s likely to be better to inform students and obtain their consent for data collection and the resulting analytics. Hendrik also discussed the US concept of "reasonable expectation": people whose data is being processed should have reasonable expectations of what is being done with it. Thus if you start using it in new ways (e.g. the recent Facebook mood-altering experiment) you’re on dangerous ground.

Anonymisation is often proposed as an alternative to obtaining consent, but this can be difficult to achieve. It’s particularly problematic in small groups, where behaviours can easily be attributed to an individual.

Hendrik felt that where grading is based on learning analytics, or can in some way affect the career of the student, this could have legal implications. Another issue he mentioned, which I hadn’t come across before, was the subordinate position of students: they might feel obliged to participate in data collection or learning analytics activities because they are being graded by the same person (or institution) that is analysing them. Would any consent given be regarded as truly voluntary in that case?
A member of the audience then asked whether there is a difference between research and practice in learning analytics. Hendrik suggested that ethically our approach should be the same, but from a legal perspective there may be a difference.

So what happens if a student asks for all the data that an institution has about them? Hendrik thought that the Directive implied that we do indeed need to make everything we know about students available to them. However, there might be a conflict between full data access and the wider goals of learning analytics - it might make it easier for students to cheat, for example. Also, it may be difficult to provide meaningful access for an individual while excluding other students’ data.

Another potentially difficult area is outsourcing and data transfers to third parties. This is particularly problematic, of course, when that data is being transferred outside the European Union. For students, the process of understanding what is happening to their data - and accessing it - can then become more complex, and they may have to go through several steps. Ownership of the data is not complete in this situation for any party (though in a later discussion it was proposed that "ownership" is not a helpful concept here - more relevant are the EU concepts of "data controller" and "data processor").

We then split into groups and had the benefit of some great input from Jan-Jan Lowijs, a privacy consultant from Deloitte. He described nine general themes in the Directive, which we found a useful way to propose answers to some of the 100 issues that had been submitted. These are:

- Legitimate grounds: why you should have the data in the first place
- Purpose of the data: what you want to do with it
- Data quality: minimisation, deletion etc.
- Transparency: informing the students
- Inventory: knowing what data you have and what you do with it already
- Access: the right of the data subject to access their data - when can you have access to it and what can you see
- Outsourcing: the responsibilities of your institution as data controller and of the third party as data processor
- Transport of data: particularly problematic if outside the EU
- Data security

Attempting to answer some of the questions submitted, using the themes as guidance, resulted in the following:

- Who has access to data about students’ activities? Students themselves, and certified access for teachers, researchers etc., based on theme 2 above (purpose of the data).
- What data should students be able to view? All data on an individual should be provided at any time they request it - that’s the situation to aim for, based on theme 6 (access).
- Should students have the right to request that their digital dossiers be deleted on graduation? Yes, so long as there are no other obligations on the institution to keep the data, e.g. names, date of birth, final grades - based on theme 3 (data quality).
- What are the implications of institutions collecting data from non-institutional sources (e.g. Twitter)? Consent must be obtained from the students first, based on theme 4 (transparency). A case in Finland, where two students sued their university for re-using their Twitter data, was noted.
Something else interesting that Jan-Jan mentioned was that data subjects’ attitudes to privacy differ, and that a number of studies have shown a fairly consistent split:
- 25% "privacy fundamentalists", who don’t want to share their data
- 60% pragmatists, who are happy to share some of their data for particular purposes
- 15% people who "don’t care" what happens to their data

An organisation therefore needs to make an active decision as to whether it attempts to cater for these different attitudes or finds some middle ground.

Some of the conclusions from the final session of the day were:
- Students were absent from the discussions and should be involved in future.
- We should fully articulate the risks of learning analytics for institutions. What are the showstoppers? Are they reputational, or based on a fear of losing students?
- "Privacy by design" and user-centred design, with much better management by users of their own data, were thought to be vital.
- InBloom was suggested as an "anti-pattern", to be studied further to establish what we shouldn’t be doing.
- If you think something’s dodgy then it probably is. I have to admit being slightly concerned to hear that one university has equipment in its toilets to ensure you’re not using your mobile phone to cheat if you have to nip out during an exam. A good rule of thumb proposed by Jan-Jan: if you feel uneasy about some form of data collection or analysis, then there’s probably something to be worried about.

The outcomes from the day were being written up much more rigorously than I have done above, by a professional writer, and will feed into subsequent workshops held by LACE, with the aim of producing a whitepaper or longer publication in the area.
Niall Sclater . Blog . Aug 19, 2015 01:09am
Jisc today released a new report: Learning Analytics: the current state of play in UK higher and further education. It was written after a series of visits I made recently to universities and colleges across the UK which were known to be carrying out interesting work in learning analytics.

I was inspired by campuses filled with enthusiastic freshers out enjoying the late summer sunshine, no doubt largely unaware of the technological innovations underway aimed at enhancing their chances of academic success. Indoors I had fascinating discussions with some of the staff who are pioneering the use of these new technologies in the UK.

Various things struck me as I carried out structured interviews at each institution. The institutions varied tremendously in their organisational structures and approaches to education, and hence in their motivations for using learning analytics. Increasing retention, for example, was vital for some, while others, who didn’t have huge problems with student drop-out, were more interested in improving the learning experience, the tutor-student relationship or the institution’s scores in the National Student Survey.

It was also interesting to discover just what an early stage the UK is at in its use of learning analytics. The activities discussed ranged from general business intelligence and the monitoring of key performance indicators, to tools which predict students at risk of failure, to systems which help manage the workflow around staff-student interactions. The distinction between academic analytics and learning analytics which some commentators have attempted to make didn’t really seem to apply - most people see the data as part of a continuum which can be used by people at all levels of the institution.

The approach to gathering data for learning analytics also varies widely across the institutions. Student information systems and virtual learning environments provide the bulk of the data. But one of the most surprising findings is that there is little common ground among the participating institutions in the analytics systems they are using. This seems to confirm the nascent state of the technologies and the lack of consolidation in the marketplace. Contrast this with the market for VLEs, where Blackboard and Moodle dominate.

Most interviewees were reluctant to claim any significant outcomes from their learning analytics activities to date. Several mentioned the strong correlation they have found between physical attendance and achievement. Others found that a significant outcome of the analytics work has been improved connections between disparate parts of their organisations.

When asked about future plans, most institutions intended to consolidate the use of the systems they had recently put in place, but also to integrate new data sources and improve the presentation of dashboards. There was a strong desire as well to put tools in the hands of students so they can better monitor their own behaviours and performance.
Niall Sclater . Blog . Aug 19, 2015 01:08am
As MOOCs began to proliferate, it was clear that services would emerge to make it easier for potential learners to find courses of interest. MOOC providers are now making data about their courses available through RSS feeds and APIs, so it’s possible to harvest these and develop tools that allow people to find and review courses more easily. But what if you want to build MOOC study into your professional development? You might need your manager to allow you time off, or your company to pay for the certification. Last week I met two brothers from South Africa, now based in London, who are developing a system to enable just that: Michael and Greg Howe from GroupMOOC.

I was acting as a mentor for the Open Education Challenge - a European incubation programme offering mentoring, coaching and investment to some of the most innovative education start-ups worldwide. I’ve found that some of the most inspiring developments in educational technology come from small, young companies. The Start-up Alley at Educause, for example, is for me the most fascinating aspect of that conference. Back in London at the Open Education Challenge I was fortunate to meet a range of enthusiastic entrepreneurs who’d given up secure jobs to pursue their business ideas.

GroupMOOC enables you to search for MOOCs you’re interested in, read reviews and plan your workload, exporting key events and deadlines to your calendar - particularly useful if you’re studying more than one course at once (a rough sketch of this idea follows below). You can also create groups of friends and colleagues to share your experiences with. The product helps HR directors develop an overview of which MOOCs their staff are taking, and builds in workflow enabling managers to authorise time off (the default setting is "100% in your own time!") or agree to pay for the final certificate.

GroupMOOC is well-designed and useful. Whatever MOOCs evolve into in the future, there will certainly remain a massive and growing market for "massive" and not-so-massive online education - and a need for tools which help organise the complexity of thousands of courses from multiple providers. Greg and Michael’s next challenge is to convince a panel of investors to provide funding to further expand the product’s functionality. They’ve made a great start.
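The calendar export mentioned above is easy to picture. Here is a rough sketch of the general idea - rendering one course deadline as a minimal iCalendar (.ics) event. This is purely my own illustration under my own assumptions; I have no knowledge of how GroupMOOC actually implements the feature, and the course and task names are invented:

```python
# Sketch of exporting a MOOC deadline as an iCalendar event. Purely
# illustrative - not GroupMOOC's actual implementation.
from datetime import datetime

def deadline_to_ics(course: str, task: str, due: datetime) -> str:
    """Render a single course deadline as a minimal VCALENDAR/VEVENT."""
    stamp = due.strftime("%Y%m%dT%H%M%SZ")
    uid = f"{course}-{task}".lower().replace(" ", "-")
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//example//mooc-planner//EN",  # placeholder product id
        "BEGIN:VEVENT",
        f"UID:{uid}@example.org",
        f"DTSTAMP:{stamp}",
        f"DTSTART:{stamp}",
        f"SUMMARY:{course}: {task} due",
        "END:VEVENT",
        "END:VCALENDAR",
    ])

# The output can be saved as a .ics file and imported into any calendar app.
print(deadline_to_ics("Data Analysis MOOC", "Week 3 quiz", datetime(2014, 9, 21, 23, 59)))
```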
Niall Sclater . Blog . Aug 19, 2015 01:08am
The latest LACE Project event was held in the illustrious surroundings of the Allard Pierson Museum at Amsterdam University this week. The focus this time was on open learning analytics. After some lightning presentations on participants’ interests in the area, we split into groups to look at what exactly "open" means in the context of learning analytics. We discussed some of the issues with Adam Cooper of Cetis.

Probably the most obvious aspect of openness in learning analytics is the reuse of code (and documentation) that others have created. Predictive models can also, of course, be opened up to others. Having open APIs in learning analytics products is important too - as is interoperability between the different systems. And a vibrant community of people and institutions who wish to share and enhance all these things is essential.

Openness was thought to "keep the market honest", and it also improves transparency for learners and other users of learning analytics systems. Openness may also mean that the data from one environment can be connected to that in another, not only across the different systems in one institution but potentially with other institutions too. Adam has documented some of the organisational initiatives to share data for learning analytics.

In a group discussion later we looked at some of the next steps or "big ideas" for open learning analytics:
- Clarifying the technical architectures - Apereo has defined an architecture, and Jisc is commissioning the components of a basic learning analytics system
- Developing a privacy-enhanced technology infrastructure
- Student-facing tools allowing learners to monitor and analyse their own progress
- Tools to evaluate accessibility issues - an example was given of a system which determines whether users are dyslexic and then adapts the learning accordingly

The other groups reported back on their proposed essential next steps:
- Understanding the organisational consequences (or "systemic impact") of deploying learning analytics
- Gathering evidence for which specifications and standards work or don’t work
- Obtaining consent from students to help evolve learning analytics, instead of just agreeing that their data can be used (see Sharon Slade and Paul Prinsloo’s suggestions on students being collaborators in the learning analytics process rather than mere data subjects)
- Building reference implementations of learning record stores
- Understanding the barriers to the deployment of learning analytics

One of the clearest messages to come out of the day was just how important tackling the privacy issues is becoming in order to avoid an InBloom-type fiasco in Europe. While this is a problem everywhere, concerns about the lack of systems for granting consent to the collection of data are massively holding up implementations of learning analytics in countries such as the Netherlands and Germany. A further event being planned for Paris in February will attempt to progress understanding in this area and to develop transparent learning analytics systems which include mechanisms to obtain consent from students at granular levels to process their data.
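It’s worth sketching what consent "at granular levels" might look like in practice. The following is a minimal illustration under my own assumptions - the purpose names and fields are invented, and this is not based on any LACE specification. The idea is simply that each processing purpose is granted or refused individually, and anything not explicitly granted is treated as refused:

```python
# Sketch of a per-purpose consent record. Field and purpose names are my
# own assumptions, not any LACE or institutional specification.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ConsentRecord:
    student_id: str
    # Each processing purpose is granted or refused separately, rather
    # than one blanket "I agree to everything" checkbox.
    purposes: dict = field(default_factory=dict)
    recorded_at: datetime = field(default_factory=datetime.now)

    def permits(self, purpose: str) -> bool:
        """Anything not explicitly granted is treated as refused."""
        return self.purposes.get(purpose, False)

consent = ConsentRecord(
    student_id="s1234567",
    purposes={
        "engagement_dashboard": True,   # show my VLE activity to my tutor
        "predictive_risk_model": True,  # include me in at-risk modelling
        "external_research": False,     # don't pass my data to third parties
    },
)

assert consent.permits("predictive_risk_model")
assert not consent.permits("social_media_harvesting")  # never asked, so refused
```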
Niall Sclater . Blog . Aug 19, 2015 01:08am