
A while back I wrote a post called The Learning Walk: A Primer which proved to be quite a bit more popular than I expected. I recommend reading it - everything I say in it still holds true - but the main idea is that walking is a simple habit that can contribute significantly to learning. Since that time, I’ve come across any number of articles and studies that confirm the benefits of walking. For the purposes of this post, I thought I’d highlight two: one that addresses the mental health benefits of walking and one that highlights its impact on creativity.

The Power of a Walk in the Woods

Not all learning walks are equal. In a recent New York Times article titled How Walking in Nature Changes the Brain, health writer Gretchen Reynolds discusses how walking in nature - in a park, or in the woods, for example - "may soothe the mind and, in the process, change the workings of our brains in ways that improve our mental health." Walking in an urban environment does not have the same impact. This claim is based on a recent study by Stanford researchers that found that individuals who "went on a 90-min walk through a natural environment reported lower levels of rumination and showed reduced neural activity in an area of the brain linked to risk for mental illness compared with those who walked through an urban environment." This study was a follow-on to an earlier study suggesting that walking in nature improves both mood and cognitive abilities. The bottom line is that there is mounting evidence that walking in nature makes you feel better mentally and may improve your ability to think.

Walking Your Way to Eureka!

Speaking of walking as fuel for thinking, a New Yorker article Why Walking Helps Us Think by Ferris Jabr links to a range of articles and studies on the connection between walking and thinking. Jabr also highlights another Stanford-based study that purports to "directly measure the way walking changes creativity in the moment." In the Stanford study, researchers Marily Oppezzo and Daniel Schwartz tested the performance of seated individuals and individuals who walked on various creative thinking tasks. They also tested whether walking outdoors, as opposed to on a treadmill, resulted in differences in performance. Walking, in general, improved creativity significantly, but walking outdoors produced the biggest impact. Overall, Oppezzo and Schwartz concluded that walking "opens up the free flow of ideas, and it is a simple and robust solution to the goals of increasing creativity and increasing physical activity."

So, Get Moving

If you have not yet made the Learning Walk a part of your lifelong learning habits, I hope this extra bit of data, in combination with my original post, will sway you. The greatest thing about walking, of course, is that - assuming you have no physical limitation - it requires nothing you don’t already have. If you are able to get outside and walk in nature, all the better, but walking of any type is beneficial. The key is just to get up and go.

Looking for the perfect holiday gift for the lifelong learners in your life? Be sure to also check out 10 Ways to Be a Better Learner from Mission to Learn founder Jeff Cobb. The post The Learning Walk, Continued appeared first on Mission to Learn - Lifelong Learning Blog.
Jeff Cobb . Blog . Aug 19, 2015 01:18am
The printer may have to be next

I’ve finally bitten the bullet and gone paperless. I hardly ever look at any of the junk I accumulate in my filing cabinet anyway so it’s all gone in the bin or the shredder. Everything I need (except a few things which will be scanned) is now in digital format. So why did I keep all this stuff anyway?
- I didn’t review what I kept on a regular basis to see if it was worth holding onto
- For absurd sentimental reasons e.g. offprints of papers I’d written (which are digitally preserved anyway)
- Some items I had only on paper - e.g. papers people had given me, reports I’d picked up etc
- Until I got an iPad I felt it was easier to read long documents on paper than on the screen
To preserve the near empty state of my filing cabinet I have a cunning plan:
- Ask people to send me digitally anything they hand me on paper to which I think I might wish to refer again
- Instead of printing out articles, read them on the iPad and bookmark them with Delicious
- Scan in anything worth keeping which is not already digital and ditch the original
I have a slight confession to make at this point. I’m not really paperless yet - I still have bookshelves. This is mainly because books look nice and I like to be surrounded by them, not because I refer to them very often. Ditching my books, as recommended by Alexander Halavais, is a step too far for me at this point. But I’m thinking about it.

The entire contents of my filing cabinet. Even this will be gone soon.
Niall Sclater . Blog . Aug 19, 2015 01:17am
Year on year use of online assessment is nearly doubling here at the OU. In the last year around half a million quizzes were delivered to students in our virtual learning environment using a combination of the Moodle quiz engine and the University’s in-house OpenMark system.

Interactive Computer Marked Assessment delivery at the OU

The use of the e-assessment tools for summative purposes (affecting the final mark for a module) has risen to around 16% of all quizzes delivered. Meanwhile a new question engine for Moodle has been pioneered by Tim Hunt and Phil Butcher and is scheduled for release this December. Phil says "the new engine has a crispness and consistency that inspires confidence" and he’s pleased to "wave farewell to many of the inconsistencies of the old engine". Enhancements planned over the next year include:
- Drag and drop of words onto images
- Drag and drop of images onto images
- New short answer question using pattern-matching algorithm (see the sketch after this list)
- New question type using drop-down lists
- New question type to enable placing of markers on an image
- New numerical question type enabling use of mathematical and scientific notation
- New question type to enable incorporation of Java applets (including automated marking of diagrams)
- Audio recording question type
- New authoring interface
- Inclusion of STACK maths questions
- Interface to Learnosity audio recording tool
- Dragging implemented on touch screen devices e.g. iPad
- Better import and export from question bank to facilitate off-line authoring
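To give a rough sense of what a pattern-matching short-answer marker does - this is only an illustrative sketch, not the OU’s OpenMark or Moodle implementation, and the patterns and feedback strings are invented - a minimal version might normalise the student’s response and test it against tutor-authored patterns:

```python
import re

def normalise(text):
    """Lower-case, strip punctuation and collapse whitespace."""
    text = re.sub(r"[^\w\s]", " ", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def mark_short_answer(response, patterns):
    """Return (correct, feedback) for the first tutor-authored pattern that matches.

    `patterns` is a list of (regex, feedback) pairs written by the question
    setter; the regexes below are hypothetical examples.
    """
    answer = normalise(response)
    for pattern, feedback in patterns:
        if re.search(pattern, answer):
            return True, feedback
    return False, "No match - try rephrasing your answer."

# Example: accept 'evaporation' or 'the water evaporates'
patterns = [(r"\bevaporat(es|ion)\b", "Correct - the water evaporates.")]
print(mark_short_answer("The water slowly Evaporates!", patterns))
```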
Niall Sclater . Blog . Aug 19, 2015 01:17am
Tweetstats provides a great way of visualising statistics about people’s tweets - such as their frequency etc. You can also see your most commonly used words and export them to a Wordle. Thus you can very quickly see what people like to tweet about the most. Mine doesn’t reveal much except what appears to be a fairly positive if bland collection of words such as good, yes, thanks, think and great. Tony Hirst demonstrates a strong interest in data and Google, and also appears to say "heh" rather a lot. So you can also instantly pick up something about users’ use of language. People like Gráinne Conole use particular words so often that they completely dominate the wordle of their tweets. In this case, I guess, Gráinne uses Cloudworks as a way of pointing people to other resources so her tweets are not purely about the Cloudworks system itself! In the case of John Kirriemuir, he retweets a lot which certainly shows he’s monitoring others’ tweets and likes to share. Putting aside the retweets, he has no particular obsessions apart from, perhaps, libraries and Birmingham. Endless possibilities for psychological profiling here. So long as I don’t come out as positive but bland!
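The underlying idea is straightforward word-frequency counting. As a hedged illustration (not how Tweetstats itself is implemented, and with made-up sample tweets and stopwords), a few lines of Python produce the counts a Wordle-style cloud is drawn from:

```python
from collections import Counter
import re

# A tiny, illustrative stopword list; a real tool would use a fuller one.
STOPWORDS = {"the", "a", "to", "and", "of", "in", "is", "rt", "for", "on"}

def word_frequencies(tweets):
    """Count how often each non-stopword appears across a list of tweet strings."""
    words = []
    for tweet in tweets:
        for word in re.findall(r"[a-z']+", tweet.lower()):
            if word not in STOPWORDS:
                words.append(word)
    return Counter(words)

# Hypothetical sample tweets
tweets = [
    "Good point, thanks - I think that's a great idea",
    "Thanks! Yes, good to see the data visualised",
]
print(word_frequencies(tweets).most_common(5))
```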
Niall Sclater . Blog . Aug 19, 2015 01:17am
I’ve just had some interesting conversations at an event for new OU module chairs at Cranfield University after presenting on some of the possibilities of elearning for our students. One academic wondered how he could be expected to design courses for smartphones and tablets when the University was not prepared to buy him these devices - and wouldn’t even upgrade his operating system to Windows 2010 from Windows 2003. Well he clearly had an axe to grind on the latter issue, justifiably perhaps, but he may be missing the point: the tools are primarily server side and all he needs to access them both for authoring and consuming is a web browser. Also if he develops content using our XML-based structured content system the module websites that he creates should automatically look good on a smartphone, tablet, laptop or desktop - without him having to do anything different for each platform. Mock-up of new mobile interface for OU course websites Some of these devices do of course have clear affordances which may facilitate learning experiences only possible on that device - and necessitate alternative designs for different platforms. Thus writing an essay on a smartphone doesn’t make a lot of sense but learning applications involving geo-spatial awareness may well do. Similarly the touch-screen interface of an iPad makes it much easier to manipulate images than using a mouse with your desktop PC. So a visual learning activity designed for a tablet might not work on a desktop. There is a very good argument that this lecturer will never be able to see the pedagogic potential of these devices unless he not only gets them to play about with but takes personal ownership of them and uses them in his daily life. However another argument that occurred to me this week is that we only ever see significant adoption of technologies for teaching and learning when these are already commoditised. Thus while early adopters pioneered the use of the web browser for teaching in the mid ‘90s it was only a few years later when most people were googling for information and shopping online that the web really began to take off in education. Similarly we’re now getting 10% of our students accessing our online systems from mobile devices on a regular basis. The number is growing rapidly but probably more because smartphones are taking off in society than because we’re providing useful podcasts and websites optimised for small screens. I’ve seen a big change in attitudes over the past few years. As the internet encroaches on many aspects of life, and people become ever more used to googling, social networking etc, there are few people who don’t recognise that there must be at least some benefits of studying online. No longer do people say "Why is the OU moving online?" though there are reasonable objections of course to studying online exclusively. The innovators and early adopters need to keep pushing the limits but should we accept that most of our innovations will have minimal impact on learners until similar devices and applications are mainstreamed in society?
Niall Sclater . Blog . Aug 19, 2015 01:16am
Fed up being force-fed a whole lot of stuff of no great interest to you in your university’s virtual learning environment? Want to view only the parts of most relevance to your own learning - and blend them in with your other interests such as news, weather and sports updates? The Open University is now moving closer to that vision with a series of "widgets" or "gadgets" which take parts of the OU VLE and make them available on other platforms. The developments have been made possible with a grant from JISC for a project called DOULS (Distributed Open University Learning Systems). The first prototype gadget, built by project developer Jason Platts, makes the module planner from the Moodle module website available to students in iGoogle. Jason has built the authentication module which makes it possible for the gadget to communicate with Moodle. At the moment this is a one way flow of information from Moodle to the gadget, however future versions will enable updates to data held in Moodle via the gadgets. Future gadgets planned for development are: Forum Updates - showing latest updates to forums, blogs and wikis you’re subscribed to Assessment Helper - prompting for when next assessments are due Study Buddy - enabling students to connect with others who have similar interests The idea behind all this is to allow users to work in the environments most comfortable to them and not to be forced to visit an institutional website all the time, which might not be configured in the way they want it.  Learners will be able to create their own dashboard including updates to do with their formal learning as well as anything else they’re interested in. We’re also investigating the development of similar applications in Facebook and LinkedIn. All the code will be made available freely to other institutions. An additional benefit is that we may be able to use the functionality of the other platforms to make possible something that can’t be done solely in the VLE. Of course many students may prefer to visit the VLE in its entirety and they’ll still be able to do so. There are also possible reasons why institutions might not want to lose them entirely from that environment - such as them potentially missing out on guidance and support, news items or knowledge of new courses. However overall it has to be a good thing to give control to learners over exactly what they want to see and in which environment they want to see it.  The VLE is not dead but merely fading away into the background.
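As a rough sketch of the one-way flow described above - the endpoint, token handling and JSON shape here are assumptions for illustration, not the DOULS project’s actual API - a gadget’s back end could simply pull planner items from an authenticated web-service feed and render them:

```python
import requests  # third-party HTTP library

# Hypothetical read-only planner feed and token; the real DOULS gadget
# used its own authentication module to talk to Moodle.
PLANNER_URL = "https://vle.example.ac.uk/webservice/planner"
TOKEN = "student-oauth-token"

def fetch_planner(module_code):
    """Fetch planner items for one module and return them as a list of dicts."""
    response = requests.get(
        PLANNER_URL,
        params={"module": module_code},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["items"]  # assumed response shape

def render_gadget(items):
    """Produce the plain-text listing a gadget might display."""
    return "\n".join(f"{item['week']}: {item['activity']}" for item in items)

if __name__ == "__main__":
    print(render_gadget(fetch_planner("MOD101")))  # MOD101 is a made-up module code
```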
Niall Sclater . Blog . Aug 19, 2015 01:16am
I’ve finally managed to install iBooks 2 and the example book "Life on Earth". This was a frustrating process that took two and a half hours of upgrades and downloads, requiring the right versions of iOS and iTunes with various reboots of both the PC and the iPad. Arguably I should have upgraded to iOS5 long ago but I didn’t when it was released because of allegations that it wasn’t robust. On the first attempt to "read" the Life on Earth iBook the app hung and I had to reboot the iPad - again. On the second attempt it crashed during one of the interactives requiring me to start at the beginning. When it worked it was an enjoyable experience with beautiful images, useful videos and informative interactives - and you can envisage the transformational effects this kind of experience will have for millions of learners in the very near future. There’s nothing that isn’t already done through applications on PCs or via web browsers but a few things make it inherently better on a tablet: portability, use of the touch screen for interaction and page turning, the book metaphor itself rather than the browser metaphor which frequently requires vertical scrolling, the feeling of immersiveness you get because it’s not within a browser window, and no need for internet connectivity once it’s downloaded. My inclination was however to "play", looking for the next fun thing to do rather than to read the text. Presumably many learners, particularly those who’ve grown up without reading much, would act in the same way. To be really useful in education on a massive scale a few things need to be sorted out with iBooks 2: 1. The bugs need to be fixed so the app actually works - or the entire slickness of the user experience is wrecked. It’s surprising that Apple released an app with such fanfare which falls over so easily (at least on my iPad). 2. Players need to be developed for web browsers, android tablets etc. 3. Authoring tools need to be developed for other platforms too so you’re not forced to buy a Mac. 4. You must be able to get hold of the books without going through the iTunes Store. 5. The iBooks 2 format needs to be as open as ePub. Fellow tweeters assure me that it is based on HTML5 and JavaScript but I’d be very surprised if these books work seamlessly as they don’t even work properly on Apple’s own app. Given the incredible commercial success of tying in the iPad so closely to the iTunes store numbers 2 to 5 aren’t going to happen any time soon which leads me to think that an enhanced ePub-type competitor format which runs on and can be developed on all platforms, and distributed freely, is necessary.
Niall Sclater . Blog . Aug 19, 2015 01:16am
Ok I’m not exactly comparing like with like here but I am very interested in the potential of ebooks as an alternative way of structuring learning experiences - particularly where there is a large amount of reading involved. With the growing penetration of tablet devices (ownership in the US doubled to 19% over last Christmas) ebooks now have enormous potential for providing learning opportunities. And with nearly a third of UK citizens already owning a smartphone, many of them may be prepared to study extensively from ebooks on smaller screens. Due to my deteriorating eyesight I’m not one of them; tablets are clearly more comfortable devices for extended periods of study so this post relates mainly to tablets. While the information delivered through an ebook may be identical to that provided on a website, there are several attributes of ebooks which may make them more appealing to learners than accessing content in a VLE:

Learners can own an ebook - they can’t own their institution’s VLE
An ebook is a digital version of a familiar physical product that people have grown up with. Physical books cost money and the transition to paying for a digital copy may not be too painful but people don’t like to pay for access to websites, showing that they value ebooks more. This sense of ownership may encourage learners to engage more with the content of an ebook than a website.

Ebooks can be viewed offline
Once you’ve taken possession of your ebook onto your device you’re free to view it whenever you like which is particularly useful when travelling or away from internet access.

Ebooks are self-contained
The web is a confusing place with an overwhelming number of sites and pages. It’s easy to get distracted when using the web by hyperlinks and other applications.

Ebook readers on tablets take up the whole screen
A web browser has all sorts of tool bars, menus and icons which may distract the reader and provide a less immersive experience than reading an ebook on a tablet.

You know how much you’ve read and how much there is to go
By default an ebook clearly signposts how far through its content you are. Websites may not make this clear - and indeed can’t normally do this as precisely as ebooks due to the variable page lengths of the web.

Automatic pagination makes reading easier
Due to the variety of devices, browsers and configurations, the designer of a web page cannot produce content that will consistently fill the entire screen. Users have a more complex navigational process which may involve vertical scrolling as well as page turning. One of the key features of ebook reader software is the automatic repagination to suit the platform and user preferences such as font size.

Page turning is physically easier with an ebook on a tablet
The touch screen of a tablet or smartphone allows the user to move forwards and backwards between pages with a touch or swipe - a simpler and faster process than turning the page of a physical book and also much easier than using a mouse to navigate to and click on a particular part of a web page.

Far be it from me to suggest that the VLE is dead but given all the affordances of ebooks accessed on tablets it looks like much of the learning activity currently taking place in VLEs is heading to the ebook instead.
Niall Sclater . Blog . Aug 19, 2015 01:15am
Module website

At the moment the Open University’s policy is to remove module websites in our Moodle-based virtual learning environment three years after the module presentation. To have all the module content simply vanish after that period is becoming untenable. For our students the module website is where their learning activities are coordinated and increasingly where they access their learning content in the form of web pages, PDFs, audio and video. In the past they were sent books containing most of the content, could display the books on their shelves and refer back to them and the handwritten notes they’d added in the margins if required at a later date. Now with an increasing amount of content delivered online they stand to lose the permanent access they have had to their core learning materials (barring fire or theft of their physical book collection, that is).

Meanwhile collaborative learning activities are increasingly taking place in forums, wikis, blogs and other social media tools within the virtual learning environment. All the students’ carefully crafted comments or assignment work held in these tools are lost forever when the module website is removed. Our annotation tool, OUAnnotate, is being used by students to comment on their course content and other websites relevant to their studies. We can hardly encourage learners to use this tool to comment on learning materials only for them to discover later that their annotations are no longer accessible because the associated content has vanished.

An additional issue for the Open University is that students (in England at least) are now required to sign up for a whole qualification in order to obtain a loan rather than selecting individual modules one at a time and later deciding how to piece them together into a qualification. This is having profound implications for the way the University organises itself with the qualification becoming the primary focus rather than the module. Content, support and communities are thus available from the qualification website as well as the module site. Assessment may also potentially be organised around the qualification rather than solely at the modular level. Students might therefore want or need to access online materials they were studying more than a decade previously.

So why do we have this policy of switching off module sites in the first place? There are various reasons including not retaining personal data indefinitely and not increasing data storage by keeping multiple presentations of the same content for many years. Another problem is that content authored for one version of Moodle (e.g. a quiz) may require a particular version of Moodle to display it properly. You might end up having to keep multiple versions of Moodle available indefinitely - a technical and logistical nightmare.

The solution, a small group of us concluded this morning, is probably that first of all the student has to take responsibility for retaining their learning materials in the same way that they’re responsible for not losing their books right now. But we need to give them the advice and tools to do so. We prompt them somehow to say "now that you’re coming to the end of your module would you like to export the content so you can refer to it in the future?" And again three years later, just before the site is about to disappear, the student is prompted to export the content if they wish. We add an "Export Module" button on each module website which exports the core module content i.e. 
web pages, PDFs, audio and video into a zip file to be stored wherever they want - perhaps also to their Google Drive or another cloud based storage facility of their choice. For forums we differentiate between those which are to be used primarily for social purposes and those provided for core learning activities. The latter are flagged as such and the "Export Module" facility is built to incorporate the forum content in the zip file, perhaps as a PDF. Capturing interactive content such as quizzes is problematic as it would mean replicating complex server-based functionality in e-books or other complex formats, maintaining those platforms when they’re upgraded etc but we agreed to look at this possibility. Students’ annotations to course content remain a problem.  These could be important for revision several years after the module has been studied.  If we export web-based course content as web pages then could the annotations be embedded with the content so they pop up when the content is viewed in the future? Another issue is that of found content.  If this is paid for there may be licensing issues which prevent the found content from being saved in the export file.  If it’s "free" and merely linked to we could just retain the hyperlink in the web content and not worry too much about linked content vanishing in the future.  That’s assuming the content isn’t absolutely key to the module.  We could also possibly find a way to embed an instance of that linked content at the time of exporting. Bespoke software such as chemical modelling tools where the student is provided with a licence for the duration of the module would clearly not be able to be exported in this way and will end up disappearing.  But arguably that happens anyway so no change there. A project is being set up within our Learning and Teaching Solutions unit to tackle all this and we’ll hopefully be able to put an initial solution together quite quickly incorporating the core course content, and later adding things such as forum content and annotations when suitable export functionality has been developed.
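As a minimal sketch of what the core of an "Export Module" routine might do - the folder layout and file names are illustrative assumptions rather than the actual OU implementation - content already rendered as static files could simply be packaged into a zip archive for the student to keep:

```python
import zipfile
from pathlib import Path

# Hypothetical layout: core content (web pages, PDFs, audio, video)
# already rendered to static files under one folder per module.
def export_module(content_dir, zip_path):
    """Package every file under content_dir into a single zip archive."""
    content_dir = Path(content_dir)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as archive:
        for path in sorted(content_dir.rglob("*")):
            if path.is_file():
                # Store paths relative to the module folder
                archive.write(path, path.relative_to(content_dir))
    return zip_path

# e.g. export_module("/vle/modules/MOD101", "MOD101-export.zip")
```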
Niall Sclater . Blog . Aug 19, 2015 01:15am
My thinking on MOOCs has been consolidated after doing a fair bit of reading, chatting and thinking recently.  Much has been written on the disruptive potential of MOOCs and also about the problems associated with them such as lack of quality, plagiarism and lack of tutor support.   I have no desire to add to the noise and hype but want to set down two paradoxes that it seems to me are at the heart of the MOOC movement. Paradox 1:  Most MOOCs are offered by elite institutions which don’t need to expand their student base So why are they developing MOOCs?  Are they basically caught up in the hype and working on the proven Amazon business principle of build fast and worry about money later? Maybe, but here are some other reasons why they may be launching into MOOCs: Some providers have argued that MOOCs are aimed at helping them accumulate data on how students learn online which will then allow them to enhance their teaching for regular students. MOOCs, like open educational resources, provide a genuine opportunity to spread an institution’s educational mission outside the campus.  Call me old-fashioned but I believe that people in education are still frequently driven by altruistic motivations such as knowledge creation and a desire to spread the love of learning - as well as economics. It may help boost the profile of an individual professor and develop his or her international reputation.  When it comes to promotion will saying you’ve successfully taught thousands of students via a MOOC boost your career prospects? It may provide additional revenue though for the foreseeable future this is likely to be minimal for the institution and is dependent on developing as yet undiscovered viable business models. Putting aside issues such as quality assurance, plagiarism and lack of tutor support, let’s suppose that MOOCs develop coherent curricula, peer support mechanisms and robust assessment processes which lead to qualifications at very low cost from credible institutions - and employers begin to take them seriously.  That leads us to the next paradox. Paradox 2: Highly successful MOOCs attack the core business of those who are offering them Elite institutions offering MOOCs will therefore never allow them to become as credible as their regular fee-incurring provision.  If an equivalent experience can be had for free no-one will pay fees.  MOOCs therefore will by definition remain an inferior educational experience and have to be offered under a sub-brand or a completely different brand - presumably one reason why institutions are rushing to sign up to Udacity and Coursera so they can jump on the MOOC bandwagon without diluting their own brands.  But successful partnerships where institutions club together to offer modules which build up to full qualifications are fraught with difficulties and have led to some spectacular debacles such as the UK’s £62m e-university. High quality assessed and accredited MOOCs from Ivy League institutions will not be allowed to disrupt their own core business but may ultimately provide viable alternatives to expensive qualifications from less prestigious institutions.  This is where MOOCs could begin to disrupt the higher education market.  Learners are becoming ever more discerning and there is further evidence that the higher education bubble in the US has burst particularly in the for-profit sector with recent announcements such as the University of Phoenix closing half of its campuses.  
MOOC-based qualifications will have to be very good and much cheaper to gain ground in an increasingly competitive market.
Niall Sclater . Blog . Aug 19, 2015 01:14am
Should Open University students be entitled to particular types and amounts of tuition during their studies?  Should provision be consistent across tutor groups, regions and nations, and qualifications?  What are the most successful pedagogical strategies for online synchronous sessions?  How should we engage with Facebook as an institution?  What role do face to face sessions provide in an increasingly online world? I’m just out of a workshop with an enthusiastic and experienced bunch of people from various parts of the University which was examining some of these questions in greater detail.  The OU has never had a tuition strategy before; practice has developed organically across different faculties and regions.  This leads to inconsistencies of approach and inefficiencies, while also allowing great flexibility and responsiveness to local and individual requirements. Various factors are coming together however which make the development of an overarching strategy for tuition a necessity for the University: The necessity to optimise our use of tuition resource and methods in order to help retain students The availability of a range of tools which can be used for tuition within the virtual learning environment - and understanding the many possibilities for how to deploy them The greater use of the Internet in society as a whole and increased acceptance of technology (with the caveat that computer literacy and access to technology is variable) Students’ increased expectations in a world of higher fees The group today brought together a variety of perspectives but achieved consensus on how to develop the tuition strategy and on a number of key issues, namely: There is a lack of evidence on current practice in tutorials and on student perceptions and expectations.  We need to build up an evidence base for what is working in tutorial provision. The default situation for the University should be the provision of online tuition.  We should then supplement this with face to face provision where appropriate. Whatever ends up being in the strategy there needs to be some flexibility to organise provision at a local level to meet changing needs. It may make sense to organise face to face tuition on a local basis while organising online tuition across all regions/nations. We need a clear policy about how we engage with external environments such as Facebook where we have limited ability to take action on misuse. Finally we need to think about tuition at the earliest levels of module production.  In the past our Fordist production methods led us to think of tuition as an add-on, quite separate to the development of learning content. I’ll be drafting the first version of the tuition strategy and then passing it to my colleague Pat Atkins and others to refine.  It will then travel through the University’s governance processes for further enhancement.  The aim will be for the document to set the direction of travel for the University in tuition, to provide guidance for faculties, module teams and associate lecturers, and to ensure that we maintain excellence in and enhance our tutorial support for students.  The challenge is to produce a document that is concise enough for people to be motivated to read, at a high enough and generic enough level that it is acceptable across all faculties and regions/nations, but low level enough that it can actually achieve something concrete.   
It needs to help increase consistency in the student experience without being so prescriptive that it restricts the flexibility to respond to dynamic circumstances.  Fortunately I enjoy a challenge.
Niall Sclater . Blog . Aug 19, 2015 01:14am
Accessing online content and services has become a vital part of the OU experience. The virtual learning environment has been carefully designed over the last seven years and has some excellent features such as a custom-built forum tool and quiz engine. Meanwhile we have other systems such as StudentHome, Open Learn and Library Services, full of useful content and tools. These websites have grown up organically, are owned by different parts of the organisation, have different user interfaces and are not as well integrated as they could be. Navigating through them to find the information or tools you need, particularly if you’re new to the University, can be a confusing experience. A new initiative called MyOU aims to put this right and will optimise the online experience for our students. The initiative is currently in the requirements gathering stage, and we are consulting heavily with our learners and with the various stakeholders across the University. MyOU will provide a new layer on top of existing systems, making the online experience much better for students.
Niall Sclater . Blog . Aug 19, 2015 01:14am
MOOCs tend to involve consuming online content, taking automated assessments and peer networking. While students may feel some connection to the academics who create the courses by watching recorded videos of them, the opportunities for synchronous connection with subject experts are limited. Dave Middleton is a tutor manager with the Open University and has been training tutors to use Elluminate effectively for several years. When online tutorials were offered to students in Wales on Exploring Psychology, one of our most popular modules with around 4,000 participants each year, students elsewhere began to complain that providing the events only to the Welsh was unfair. So Dave spotted an opportunity to try something new. He opened up an Elluminate room to the entire module population, advertised a two-hour event, sat back and hoped that 3,850 registered students wouldn’t all turn up at once. In the event 200 did. The received wisdom is that online tutorials become unworkable when numbers exceed 20 but clearly the way this was being handled didn’t result in the expected chaos. Dave was able to enable group work and problem-based learning rather than simply lecturing at the students. 75% of them responded that the session had met or exceeded their expectations. Comments included:
"The tutorial exceeded my expectations! It was well organised, easy to understand and packed with useful information"
"This was truly amazing and inspirational. The concept is fantastic."
"Very enjoyable - I initially thought 2 hours would seem a long time for an online tutorial but the time just flew by. Great to be taking part from the comfort of your own sofa!"
"I think these tutorials should be available for all modules as many, like myself, cannot attend face to face ones for the same reason we cannot attend brick uni’s and have chosen to study with the OU. I would like to say well done to the tutors, the organisation and structure was a great improvement. I only wish there were more of them."
Faculty policy on online tutorial provision was changed after Dave’s experiment. For the first time there was evidence not only that the tutorials could offer an excellent learning experience to large numbers of students but would also be highly popular with those who didn’t otherwise have the chance to attend face to face sessions. The lesson for MOOCs is that mass synchronous online sessions with subject experts can be motivational and effective. The tools available in Elluminate (now Blackboard Collaborate) and similar systems enable effective interactive teaching with hundreds of students simultaneously. Such sessions have to be properly planned of course both logistically and technically to avoid a "MOOC mess" such as the one which happened on Georgia Institute of Technology’s module with Coursera which resulted in the course being withdrawn. Is Dave’s experience of dealing with a couple of hundred students at once the limit? I suspect someone somewhere some time soon is going to push the technology and the logistics to accommodate many thousands of students in an engaging synchronous session.
Niall Sclater . Blog . Aug 19, 2015 01:13am
Ebook reader image (c) Andrew Mason licensed under Creative Commons I had an interesting discussion yesterday with Phil Butcher, the Open University’s e-assessment guru.  He wanted to talk about whether we should invest in embedding our wide range of interactive question types in ebooks. Since the 1970s Open University course teams have attempted to get students to think more deeply about the content they are reading by embedding questions within the course texts.  Accessing such questions on digital devices has some clear advantages over paper: many different question types are available and you can receive instant feedback, tailored to your response. This is of course quite possible with web-based learning.  We have OpenMark questions embedded at various points in some of our texts and are considering whether we should adapt Moodle to be able to present single questions within texts in the same way. But what do we do about ebooks?  Almost all of our content is now available in ebook format on a range of platforms.  If the interactive questions work in print and are even better presented online then shouldn’t we be incorporating them in the ebooks too? To do this at scale we would have to have an automated process to export the items (questions together with possible responses etc) from Moodle into the ebooks.  Ebooks can render interactive questions using HTML5 but there is a variety of ebook formats, differing hardware and software platforms and a range of ebook reader apps.  The software for this export process would require continual tricky and expensive maintenance to stay on top of all the various formats and there would inevitably be features which worked on one platform and not another.  Another option would be to build our own ebook reader software to be able to optimise the user experience but that too would be complex, costly, have to work on multiple platforms and require ongoing maintenance. There’s something about the paradigm of a book as a way of presenting digital content in the confusing world of the Internet which gives it appeal (I expanded on this in Are ebooks better than virtual learning environments?): in particular you can download an ebook as a complete self-contained package, access it offline and feel a sense of ownership over it in a way which you can’t with the content on websites.  You can also quickly grasp how many pages it is, navigate easily and know where you are in it.  And of course the ability to alter the font size of an ebook and have it repaginate automatically, to hold it in your hands, and to not have your experience cluttered with the many icons and menus of a PC-based interface all add to its usability. One of the advantages of ebooks may however be problematic for educational institutions: offline reading.  That means no opportunity to see how students are using any interactive questions.  A valuable source of data for learning analytics to monitor uptake and performance is never gathered - and opportunities for enhancing problematic questions and the associated learning content are lost.  Meanwhile the learner potentially loses out too: there is no chance for institutions to target interventions at students who might be at risk of dropping out or might benefit from being able to compare their performance with other students. A way around this would be to incorporate recording of user activity into ebook reader software and send it to a server every time the user goes online.  
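As a hedged sketch of how that deferred upload might work - the event names, local buffer file and upload endpoint below are illustrative assumptions, not an actual OU or vendor implementation - the reader software could queue interaction events locally and flush them whenever a connection is available:

```python
import json
import time
from pathlib import Path

import requests  # third-party; used only when the reader goes online

QUEUE_FILE = Path("pending_events.jsonl")               # local buffer on the device
UPLOAD_URL = "https://analytics.example.ac.uk/events"   # hypothetical endpoint

def record_event(event_type, payload):
    """Append one interaction event (e.g. a question attempt) to the local queue."""
    event = {"type": event_type, "at": time.time(), "data": payload}
    with QUEUE_FILE.open("a") as f:
        f.write(json.dumps(event) + "\n")

def flush_events():
    """Send queued events to the server; called whenever the device is online."""
    if not QUEUE_FILE.exists():
        return
    events = [json.loads(line) for line in QUEUE_FILE.read_text().splitlines()]
    response = requests.post(UPLOAD_URL, json={"events": events}, timeout=10)
    response.raise_for_status()
    QUEUE_FILE.unlink()  # clear the buffer once the upload succeeds

# e.g. record_event("question_attempt", {"question": "q3", "correct": True})
```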
And if we’re recording information about the use of interactive questions why not record data such as how often the book is being read, dwell time on pages or number of times pages are re-read? Again that might tell us something about how effective our learning content is or what difficult concepts need to be explained better in future iterations, ultimately benefiting students. This approach is already being taken by CourseSmart, a company which rents out textbooks and enables usage monitoring through their ebook reader software. But what are the ethics of this? While arguably most people are accepting, if not entirely comfortable, that anything they do online is potentially being monitored by those hosting the website, this may not be true for ebooks. Is there something fundamentally different about an ebook where we feel we own it, as we would a physical copy, and would resist the idea that we are being snooped on - even if the snooping was aimed at enhancing our learning? Universities should be entirely transparent about what they do with any data gathered while students are accessing their systems and content. It is quite easy to argue that most educational institutions will monitor usage primarily for the purpose of enhancing the educational experience for individuals and for future students. This is in contrast to commercial entities and social networking sites which monitor usage in order to target marketing at you more effectively or to sell information about you to others. But what if monitoring ebook usage has a negative effect on the learning experience? If I’m spending a quiet evening at home reading an ebook and I know that every page turn, click or interaction is being monitored, will that make me anxious and somehow less able to learn? Assuming that we could build this monitoring facility into ebooks, or buy it from someone else, the best way forward from an ethical and pedagogical perspective may be to allow users to decide whether data about them can be collected and sent back to the institution or not. Research is needed into what learners want out of their ebooks and whether they’re prepared to have data collected about their use of the interactive questions that are designed to promote deeper and more reflective learning.
Niall Sclater . Blog . Aug 19, 2015 01:13am
Is my professor watching every click? Mr_Stein, CC BY-NC-SA

Universities have been recording data digitally about their students for decades. No one would seriously question the necessity of collecting facts for administrative purposes, such as a student’s name and address, module choices and exam results. But as teaching and learning increasingly migrate to the internet, huge amounts of data about individuals’ activities online are being accumulated. These include everything from postings on forums, to participation in video conferences, to every click on every university-hosted website. Most of the records gather virtual dust in log files, never to be analysed by any computer system let alone viewed by a human. Universities have only recently started to realise the huge potential of using this data to help students succeed in their learning, or to improve the educational experience for others.

Privacy concerns

With these possibilities come dangers that the data could be used in ways undesirable to students. These include invading their privacy, exploiting them commercially by selling their data to third parties or targeted marketing of further educational products. Meanwhile, well-intentioned pedagogical innovations which access the data may have unforeseen negative consequences, such as demotivating students who are told they are at risk of failure. Institutions have clear legal responsibilities to comply with data protection legislation, restricting information from access by third parties and allowing students to view the data held about them when requested. Universities are commercial organisations, but are also motivated by altruistic concerns such as enhancing the life chances of individuals through education. The multinational technology corporations which we unquestioningly allow to collect vast amounts of data about us have altogether different motivations. For them, your data is of immense commercial value, enabling products to be targeted at you with increasing relevance. Most educational institutions need to act differently from for-profit organisations when dealing with users’ data.

What’s being done with the data?

Predictive modelling enables institutions to build a profile of a student. This can include information they have disclosed about themselves in advance of study, such as prior qualifications, age or postcode. This can then be mapped onto records of their online activity and assessment performance. Predictions can then be made as to the likelihood of a student dropping out or what grade they can be expected to achieve. The Open University is developing models to target interventions at students thought to be at risk. For example, a student who has no prior qualifications and has not participated in a key activity or assessment may be flagged for a telephone call by a tutor. Experience has shown that such a call may be what is required to motivate the student or help them overcome an issue which is preventing them studying. Various ethical issues emerge here. If we establish early on that a student is highly likely to fail, should we advise them to withdraw or to re-enrol on a lower level course? But what if we are limiting their opportunities by taking such an intervention? They might have continued successfully had we not intervened. Meanwhile, for those students thought not to be at risk, we are potentially denying them the possibility of beneficial additional contact with a tutor.
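As a highly simplified sketch of the kind of predictive modelling described above - the features and figures are invented for illustration and this is not the OU’s actual model - a basic classifier could estimate a completion probability from a student’s profile and activity data:

```python
from sklearn.linear_model import LogisticRegression

# Illustrative features per student: [has_prior_qualification, forum_posts,
# key_activity_completed] and whether they completed the module (1) or
# withdrew (0). All numbers are made up for the sketch.
X = [
    [1, 12, 1],
    [0,  2, 0],
    [1,  5, 1],
    [0,  0, 0],
    [1,  8, 0],
    [0,  6, 1],
]
y = [1, 0, 1, 0, 1, 1]

model = LogisticRegression()
model.fit(X, y)

# Probability that a new student with no prior qualifications, one forum
# post and no key activity completed will finish the module - a low value
# might flag them for a tutor phone call.
new_student = [[0, 1, 0]]
print(model.predict_proba(new_student)[0][1])
```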
Opt out option

If the primary purpose of learning analytics is to benefit learners, then should a student be able to opt out of their data being collected? There are two problems with this. We may be neglecting our responsibilities as education experts by allowing some students to opt out. This could deny them the assistance we can provide in enhancing their chances of success. The data collected can also be used to benefit other students, and every individual opting out potentially diminishes the usefulness of the dataset. One environment where a student might reasonably assume they are free from data being collected about them is while accessing an e-book offline on a personal device such as an iPad or a Kindle. Some US institutions are already providing students with e-reader software which captures data such as clicks and dwell times, storing them on the device and uploading them to a server for analysis. But unless users are made aware that this is happening, universities run the risk of being accused of unjustified snooping. It is unclear to what extent the constant collection of data on online activity inhibits learning or even worries students. Do students care any more about what universities do with data on their educational activities than they do about the data collected by Google or Facebook on their personal interests, relationships and purchasing habits? But the trust given to universities by students elevates the importance of caretaking their data and establishing clear policies for what we do with it. Transparency about the data we collect, and how and why we are using it, will help to avoid a backlash from learners worried about potential misuse. Institutions need to develop clear policies arguing why the collection and analysis of data on students and their learning is in their interest. This is a necessary step before being able to exploit the full potential of learning analytics to enhance the student experience.

Niall Sclater does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations. This article was originally published on The Conversation. Read the original article.
Niall Sclater . Blog . Aug 19, 2015 01:13am
How do higher and further education institutions in the UK best share their expertise in learning analytics?  Would a jointly developed code of practice for learning analytics help deal with the legal and ethical issues?  How can Jisc facilitate the development of tools and dashboards to help institutions develop their analytics capabilities to enhance student success? These were some of the questions being addressed by 31 participants last week from UK universities, colleges and organisations including SCONUL, RUGIT, UCISA and the NUS at a workshop in Jisc’s offices in London.  Increasing amounts of data are being collected on students and their learning - but it’s clear that our understanding of how best to interpret that data and make use of it is still at an early stage. Back in April the Co-design Steering Group developed the idea of an effective learning analytics challenge.  Then over the summer we gathered 21 ideas from the community to address the challenge of learning analytics.  Voting from more than 100 people narrowed the list down to nine ideas and accompanying service scenarios that we discussed at the workshop. One theme that emerged strongly from the discussions was a need to involve students in developing learning analytics products and services.  An app for students to monitor their own learning was seen as a critical requirement which Jisc could help provide.  While improving retention was regarded as important, the main long term goals would be to support learning attainment, assist decision making and pathway choice, and improve employability. After some engaging discussions in smaller groups we came back together to vote on the top three themes, merging some of them in the process to form the following priorities: Priority 1: Basic learning analytics solution and a student app One of the most popular ideas was a "freemium" solution for further and higher education institutions, allowing them to gain experience and eventually progress to a more advanced toolset.  It would be based on an existing solution from an institution, vendor or open initiative that could be ready to provide a working product in early 2015.  The product would require the development of an open standard for analytics and an API enabling other compatible basic analytics solutions in the future. Alongside the basic solution would be a student app which works with any learning analytics solution provider using a specified set of data inputs.  Students would be involved in scoping and designing their requirements for the app. Finally we felt there was a need for a tool for tracking and recording interventions which take place as a result of analytics.  This will inform the development of a learning analytics cookbook (see below) which will suggest appropriate ways that staff and systems can intervene to enhance student success. Priority 2: Code of practice for learning analytics A huge priority for institutions is how to deal with concerns around data protection and privacy both from a legal and ethical perspective.  The potential benefits of learning analytics are well recognised but there are also possibilities for misuse. A guide to learning analytics practice is needed and will be informed by a comparative review of existing codes of practice in this and related areas.  Jisc will then develop the code of practice in partnership with the NUS and others. 
Priority 3: Learning analytics support and networks The group also prioritised the development of a support and synthesis service around the use of learning analytics to share expertise, working on: Technical methods - the nuts and bolts of learning analytics such as what systems and data institutions are using A learning analytics cookbook - with recipes for the use of data and metrics - documenting successful implementations at universities and colleges Synthesis and analysis - giving a high level overview and showing trends across the sector Networks - building networks of institutions keen to share experience both at a basic and advanced level Next steps A great deal of enthusiasm for the possibilities of learning analytics was expressed at the workshop - and we benefitted from the considerable experience that has already been gained by many of the participants. The priorities agreed will now be developed into a project plan that can be taken forward by Jisc over the next two years. Full details will be posted to the Jisc Effective Learning Analytics blog. Educational institutions, vendors and other potential partners will be invited to comment and express interest in participating.  Meanwhile we’ll be looking for some expert critical friends to advise us on each of the projects as they progress. This post was first published on the Jisc Effective Learning Analytics blog, 16th Sept 2014.
Niall Sclater . Blog . Aug 19, 2015 01:13am
We had a lively session on ethics and legal issues at the Jisc Effective Learning Analytics workshop last week, kicking it off by outlining some of the key questions in this area:
- Who should have access to data about students’ online activities?
- Are there any circumstances when collecting data about students is unacceptable/undesirable?
- Should students be allowed to opt out of data collection?
- What should students be told about the data collected on their learning?
- What data should students be able to view about themselves?
- What are the implications of institutions collecting data from non-institutional sources e.g. Twitter?

Students Studying by Christopher Porter CC BY-NC-ND 2.0

Concern was also expressed about the metrics being used for analytics - how accurate and appropriate are they and could it be dangerous to base interventions by tutors on metrics which portray an incomplete picture of student activity? A number of the participants had already been thinking in detail about how to tackle these issues. There was a consensus that learning analytics should be carried out primarily to improve learning outcomes and for the students’ benefit. Analytics should be conducted in a way that would not adversely affect learners based on their past attainment, behaviour or perceived lack of chance for success. The group felt that the sector should not engage with the technical and logistical aspects of learning analytics without first making explicit the legal and ethical issues and understanding our obligations towards students. Early conversations with students were thought to be vital so that there were no surprises. It was suggested that informed consent is key - not just expecting students to tick a box saying they’ve read the institution’s computing regulations. Researchers elsewhere have begun to examine many of these areas too - see for example the paper by Sharon Slade and Paul Prinsloo: Learning analytics: ethical issues and dilemmas. Mike Day at Nottingham Trent University found that students in the era of Google and Facebook already expect data to be collected about their learning. His institution’s approach has been to make the analytics process a collaboration between the learner and the institution. They have for instance agreed with students that it’s appropriate and helpful for both themselves and their tutors to be able to view all the information held about them. Another issue discussed at some length was around the ethics of learners’ data travelling with them between institutions. Progress is being made on a unique learner number, and the Higher Education Data and Information Improvement Programme (HEDIIP) is developing more coherent data structures for transferable learner records. It will be possible for data on the learning activities of individuals to be transferred between schools, colleges and universities. But what data is appropriate to transfer? Should you be tainted for the rest of your academic life by what you got up to at school? On the other hand could some of that information prove vital in supporting you as you move between institutions? Data on disabilities might be one such area where it could be helpful for a future institution to be able to cater for a learner’s special needs. Ethically this may best be under the control of the student who can decide what information to present about their disability.
However the technology may be in place to detect certain disabilities automatically, such as dyslexia - so the institution might have some of this information whether the student wants them to know it or not.

Who owns the data about a student’s lifelong learning activity is another issue. Institutions may own it for a time, but once that student has left the institution is it appropriate to hold onto it? Perhaps the learner should take ownership of it, even if it is held securely by an institution or an outside agency. There may be a fundamental difference between attainment data and behavioural data, the latter being more transitory and potentially less accurate than assessment results and grades - and therefore it should be disposed of after a certain period. There are of course different ethical issues involved when data on learning activities is anonymised or aggregated across groups of students.

One parallel we discussed was that of medicine. A learner might visit a tutor in the way that a patient visits a GP. The doctor chats to the patient about their ailment with the patient’s file, including their previous medical history, in front of them. Combining what the patient says with physical observations and insight from the patient’s records, the doctor then makes a diagnosis and prescribes some treatment or suggests a change in lifestyle. Meanwhile, a tutor chats to a student about an issue they’re having with their studies and has analytics on their learning to date on a computer in front of them as they talk. The analytics provides additional insight into what the student is saying, so the tutor is able to make some helpful suggestions and provide additional reading materials or some additional formative assessment to the student. In both scenarios the professional takes notes on the interaction which are themselves added to the individual’s records. All the information is held under the strictest confidentiality. However the data in both scenarios can also be anonymised for research purposes (a minimal sketch of this appears at the end of this post), enabling patterns to be discovered and better treatment or advice to be provided to others in the future.

So, in order to help institutions navigate their way around the complexities, would a code of practice or set of guidelines be of interest? The consensus was that it would, and this was borne out in voting by the wider group later in the day. The schools sector has already developed codes of practice, and obviously the NHS is well advanced in the ethics of data collection already, so there is much to be learned from these professions - and from research ethics committees in many of our own institutions. There would need to be consultation with the various UK funding bodies - and working closely with the National Union of Students was seen as key to ensuring adoption.

A code of practice for learning analytics would have to be clearly scoped, easily understandable and generic enough to have wide applicability across institutions. The primary audiences are likely to be students, tutors and senior management. Mike at Nottingham Trent found the key to success was a four-way partnership between technology providers (who were required to adapt their products to meet ethical concerns), the IT department, students and tutors. There was a strong consensus in the group that this work would significantly help to allay the fears of students - and, often just as vocally, staff - enabling institutions to explore the real possibilities of using learning analytics to aid retention and student success.
In fact some stakeholders considered it an essential step at their universities and colleges before they could make progress.  Developing a code of practice for learning analytics will therefore be a key area of work for Jisc over the coming months. This post was first published on the Jisc Effective Learning Analytics blog, 18th Sept 2014.
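Several of the points above turn on anonymising or pseudonymising learner data before it is shared for research. As a purely illustrative sketch - the file layout, column names and salted-hash approach are my own assumptions, not anything prescribed by Jisc or the institutions mentioned - this is roughly what a minimal pseudonymisation step over a tabular activity export might look like in Python. Note that salted hashing is pseudonymisation rather than full anonymisation: re-identification may still be possible where the remaining fields are distinctive.

```python
import csv
import hashlib

SALT = "replace-with-a-secret-institutional-value"  # hypothetical; manage and store securely in practice
DIRECT_IDENTIFIERS = ("name", "email", "date_of_birth")  # hypothetical column names

def pseudonymise(row):
    """Replace the student identifier with a salted hash and drop direct identifiers."""
    row = dict(row)
    row["student_id"] = hashlib.sha256((SALT + row["student_id"]).encode()).hexdigest()[:16]
    for field in DIRECT_IDENTIFIERS:
        row.pop(field, None)
    return row

with open("vle_activity.csv", newline="") as src, open("vle_activity_pseudo.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    kept_fields = [f for f in reader.fieldnames if f not in DIRECT_IDENTIFIERS]
    writer = csv.DictWriter(dst, fieldnames=kept_fields)
    writer.writeheader()
    for record in reader:
        writer.writerow(pseudonymise(record))
```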
Niall Sclater . Blog . Aug 19, 2015 01:12am
Image: BME Solutions Ltd

Every educational institution wants its learners to reach their full potential. Learning analytics can help us to measure and predict student success using data relating to engagement, grades, retention, graduation and employability. But what products are out there to enable institutions to improve on the indicators of success, and to help visualise and analyse the increasing amounts of data being collected around our students? How well do the various tools facilitate interventions with learners and help us to monitor the effectiveness of those interventions? I’ve been reviewing some of the main analytics solutions for Jisc, and visiting some of the institutions who are beginning to use them - I’ll be discussing my findings in a series of blog posts.

There’s a diverse range of learning analytics products available, most of which are still at a very early stage of development. The marketplace is developing rapidly though, and I’m told that universities and colleges are being bombarded by vendors trying to get them to buy into their nascent analytics products. Picking the wrong solution is likely to be an expensive mistake. On the other hand, those institutions who seem to have got it right are already providing significantly enhanced information about their students to managers, academics, librarians and the learners themselves. Crucially, they’re attributing initial improvements in some of the indicators of student success to the use of these tools and the interventions being taken as a result of the improved information.

In order to carry out comprehensive analytics, data is generally extracted from various institutional systems, in particular the virtual learning environment (VLE/LMS), the student information system (SIS - also known as the student records system) and a variety of library systems. So the development of analytics products is complicated by the range of underlying software being used by educational organisations. To make things slightly simpler, in recent years the VLE market has consolidated, with most further and higher education institutions in the UK deploying either the commercial market leader Blackboard Learn or its open source competitor Moodle. Meanwhile, though some universities have in-house developed student information systems, there is a limited number of commercial products in use, such as Tribal SITS:Vision and Ellucian Banner. In further education these products tend to be called Management Information Systems (MIS) and include Capita UNIT-e Student Management Information System.

Vendors of VLEs and SISs are rapidly developing products to help institutions make sense of the large quantity of data that is being accumulated as students undertake their learning, drawing conclusions about their likely success by adding this to already established information such as ethnicity or prior educational attainment. To try to make sense of the range of products out there I’ve classified them in four categories, based on the software they originate from:

- VLE-based engagement reporting tools - which sit within the VLE, generally look at VLE data only, and provide simple indications of a student’s progress. Examples: Blackboard Retention Centre, Moodle Engagement Analytics plug-in.
- VLE-centric analytics systems - developed by VLE vendors, combining data from the VLE with data from the SIS to enable more extensive analysis. Examples: Blackboard Analytics for Learn, Desire2Learn Insights.
- SIS-centric analytics systems - which sit alongside the SIS but may also draw in data from the VLE, providing learning analytics alongside wider business intelligence. Examples: Ellucian Student Retention Performance, Compass ProMonitor.
- Generic business intelligence systems - developed to provide better analysis in any business, not specifically for education, sitting outside both the VLE and SIS but drawing data from those and other systems, often in conjunction with a data warehouse. Examples: QlikView, Tableau, IBM Cognos, HP Autonomy.

Most of the VLE-centric and SIS-centric solutions are themselves developed on top of one of the generic business intelligence (BI) solutions, so these categories are not mutually exclusive. I discussed this classification with Richard Maccabee of the University of London Computer Centre, which hosts Moodle for around a hundred institutions. He suggested a fifth type of learning analytics system: one which examines data from across a range of institutions and provides analysis back to the organisations. It might be that there is more to learn by comparing your modern language students with those in another institution than there is by comparing them with your own engineering students. Assuming institutional agreement, appropriate anonymisation and compliance with data protection legislation, the sum of big data from multiple organisations may indeed be greater than the parts. This concept has not been lost on companies which provide hosted VLE services such as MoodleRooms and Desire2Learn. Richard and I concluded though that rather than a separate type of analytics system, this is more of a dimension or additional set of services which could be applied to any of the above categories.

An institution’s choice of learning analytics products will be limited to some extent by the systems it has already deployed. Thus if you’ve invested significantly in Blackboard Learn as your sole institutional VLE you’ll not be deploying analytics tools based on Moodle or Desire2Learn Brightspace. Institutions can carry out limited analytics at low cost by deploying one of the VLE-based engagement reporting tools. The choice for more sophisticated analytics based on multiple data sources may be whether to buy into an analytics system developed by a trusted vendor of your institutional VLE or SIS - or to purchase a generic BI tool. Some of the latter can already be integrated relatively easily with underlying data sources and come pre-populated with education-specific dashboards.

A number of the products available are components of wider analytics offerings which cover activities such as student recruitment and fundraising (though some of these systems are currently quite US-specific). The learning analytics solution you choose may be best considered as part of a broader exercise to develop institutional business intelligence. Some of the learning analytics initiatives report that they’ve prompted institutions to re-examine and clean up a wide range of data sources and structures, with positive knock-on effects for other aspects of the business. The institution will also need to decide whether it has the expertise to go it alone or requires assistance from one of the increasing number of vendors eager for business in this area, or consultancy from a third party. Meanwhile communities of practice are springing up around some of the tools, which can provide excellent support and examples of successful implementations.
This post was first published on the Jisc Effective Learning Analytics blog, 2nd Oct 2014.
Niall Sclater . Blog . Aug 19, 2015 01:12am
In my last post I described four types of learning analytics products. Here I’ll go into more detail around some of the VLE-based engagement reporting tools. These products for Blackboard and Moodle sit within the virtual learning environment (VLE/LMS), look at its data only, and provide simple indications of a student’s progress, raising flags when the student looks to be at risk. Unlike many learning analytics tools these are aimed at teachers rather than the students themselves or managers within the institution.

Blackboard Retention Centre

Bundled with Blackboard Learn is Retention Centre, which provides a simple dashboard view of learner participation within a single course. The functionality evolved from earlier features in Learn such as an "early warning system". Retention Centre is primarily aimed at identifying students at risk of failure, based on rules set by teachers. The key dashboard provides an overview of a single cohort on a course, enabling you to identify students at risk of failure. You can decide who you want to flag as at risk, add notes on individual learners, and contact them directly from the Retention Centre. It’s also possible from here to change due dates for assignments. There are four basic measurements of student engagement:

- Course activity - a measure of the time from when a student first clicks on something in the course until he or she clicks outside the course or logs out.
- Grade - thresholds can be set for particular grades, flagging students who have fallen below that value - or those who have fallen a certain percentage below the class average.
- Course access - a flag is set when users fail to access the system within a defined number of days.
- Missed deadline - triggered when a deadline is not met within a set number of days or when a specified number of deadlines have been missed.

There are default rules for each of these (which can be customised):

- Course activity in the last week is 20% below average
- Grade is 25% below class average
- Last course access was more than 5 days ago
- There has been 1 missed deadline

Another screen allows instructors to view a summary of their own participation in a course and to link directly to activities that are required of them such as grading. Blackboard suggests that instructors use this functionality to prioritise areas of their courses which need attention. This is a kind of basic "teaching analytics" where instructors can see an overview of their activity and link easily to tasks such as marking or blogging. The data on what actions you’ve taken through the Retention Centre as a teacher is available only to you. While this may give you confidence that no-one is snooping on your teaching activities, it limits the options for institutions which want to understand better how learning and teaching are taking place. Another limitation is that, as Retention Centre works only at the level of a course, you can’t get a view of programme or student-level activity.

Moodle options

Other than deploying a generic business intelligence system such as QlikView, Cognos or Tableau, Moodle-specific options include a few plugins for the system, in particular Engagement Analytics and Moodle Google Analytics, and a commercial option, Intelliboard. As an open source system Moodle allows users to develop their own plugins, and a number of institutions have built analytics tools using the data from user log files.
Currently these appear to be considerably less developed than the analytics capabilities of other virtual learning environments, notably Desire2Learn Brightspace and Blackboard Learn. The last release of Moodle Google Analytics was August 2013, though Engagement Analytics, initially released in August 2012, is still being maintained. As with Blackboard Retention Centre, the tools are primarily aimed at staff - not students as yet. Documentation is limited for all these options. Intelliboard’s product is clearly at an early stage of development, with an offer on its website that any paying customer can request additional reports for free. Moodle Google Analytics takes the data available from Google Analytics and the web server logs and presents it in Moodle, and is thus more of a web analytics than a learning analytics tool, though analysing how learners navigate through a course website may be of interest.

Engagement Analytics presents risk percentages for students based on three indicators:

- Login activity: how often and how recently are students logging in, and for how long?
- Forum activity: are students reading, posting and replying?
- Assessment activity: are students submitting their assessed work, and are they submitting on time?

You can configure the weighting of each indicator, e.g. 60% for logins, 30% for assessment and 10% for forums, depending on the relative importance of these activities in the course (a rough sketch of this weighting idea appears at the end of this post). You can also add other indicators such as downloading files.

Limitations and take-up

For institutions using Blackboard or Moodle these tools provide simple ways of viewing engagement by students. It’s surprising that, given how long the VLEs have been in existence, it’s taken so long for such basic reporting facilities to emerge. As I noted earlier these systems use data from the VLE only; there’s no integration with student information or other systems. None of them appear to facilitate any automated interventions, so teachers have to decide what action to take based on the dashboards. As Retention Centre comes bundled with Learn, no additional technical expertise is required to install and maintain this functionality - it merely has to be switched on for a particular course by the instructor. It should be relatively easy for a Moodle administrator to install the plugins.

It’s unclear how widespread the use of these tools is, but many institutions are no doubt experimenting with Retention Centre. One university I spoke to found the interface "ugly" and the functionality not very useful, but I’m sure many teachers will find it does give them a better indication of students at risk. Retention Centre is functionality which allows users to try out some basic reporting and analysis, perhaps later leading institutions to consider purchasing the much more sophisticated Blackboard Analytics for Learn or some business intelligence software. As far as the Moodle tools are concerned, Intelliboard claims a few corporate clients on its website - none so far in the UK. It is not clear how many institutions are deploying the plugins, but initial response to Engagement Analytics on the Moodle forums is positive and it’s been downloaded nearly 10,000 times from the moodle.org site.

Indicators of engagement

What is particularly of interest about these tools is to what extent this data provides an accurate indication of student engagement - which we know can correlate with measures of success.
Michael Feldstein points out that the four Retention Centre indicators for activity in the VLE are considered the most indicative of student success according to the inventor of Purdue’s Course Signals, John Campbell. But how do we know that the Retention Centre indicators are more accurate than those measuring login, forum and assessment activity in Engagement Analytics? Courses have different types and balances of content, communication and assessment - and this is recognised by the tools in allowing you to customise the indicators. However there are all sorts of other factors at play, such as the features of the software itself, alternative ways that students have to communicate, the institutional context and nature of the student body, and to what extent the teacher encourages students to use the tools.

Learning analytics is an inexact science and there will always be individuals who perform differently from how we think they will. Monika Andergassen and her colleagues at the Vienna University of Economics and Business found that there were correlations between time spent in the VLE and final grade, and between self-assessment exercises undertaken and final grade. The correlations in both cases though were modest, and the repeated solving of the same exercises didn’t correlate with better results, implying unsurprisingly that what you do online may be more important than how long you spend in the VLE.

Various people I’ve spoken to at my recent visits to UK institutions believe that the more information we have about students the more accurately we’ll be able to predict their success or likelihood of dropout. A key question is whether adding all the possible sources of data we have will make a significant difference to the accuracy of these predictions. Will a few indicators from the VLE be sufficient to pick up the majority of your students who are at risk, or do we need more sophisticated predictive analytics software which also takes data from the student information system and elsewhere?

This post was first published on the Jisc Effective Learning Analytics blog, 3rd Oct 2014. Images are from https://docs.moodle.org/22/en/report/analytics/index and are copyright Kim Edgar, available under the GNU General Public License.
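To make the weighting idea concrete, here is a minimal, hypothetical sketch of how per-indicator risk percentages might be combined into a single score using the 60/30/10 split mentioned above. It illustrates the general approach only - it is not the Engagement Analytics plugin’s actual code, and the numbers are invented.

```python
# Hypothetical weights in the spirit of the Engagement Analytics configuration
# described above; each indicator risk is a percentage between 0 and 100.
WEIGHTS = {"login": 0.6, "assessment": 0.3, "forum": 0.1}  # must sum to 1.0

def engagement_risk(indicator_risks, weights=WEIGHTS):
    """Combine per-indicator risk percentages into one weighted overall risk."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(weights[name] * indicator_risks[name] for name in weights)

# Example: a student who rarely logs in but submits work on time and ignores the forums.
student = {"login": 80, "assessment": 10, "forum": 100}
print(f"Overall risk: {engagement_risk(student):.0f}%")  # 80*0.6 + 10*0.3 + 100*0.1 = 61%
```

Changing the balance per course is then just a matter of editing the weights, which mirrors the way the plugin lets you tune the relative importance of logins, assessment and forum activity.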
Niall Sclater . Blog . Aug 19, 2015 01:12am
I’ve described some of the basic reporting functionality available for Moodle and Blackboard, but this is just scratching the surface of what is possible with learning analytics. In this post I’ll look at ways in which analytics from other data sources such as video and ebooks are being brought into the VLE to help gain a better picture of student engagement. I’ll also describe a system which makes analytics on their activities available to the learners themselves - and helps staff to manage subsequent interventions.

Adding analytics from e-book usage

While most learning analytics systems use data available in the virtual learning environment to measure engagement, VitalSource CourseSmart Analytics delves into students’ use of e-books in order to help "improve retention, control costs and improve learning outcomes". VitalSource is a company which rents out textbooks and enables usage monitoring through its own e-book reader software. The system assesses engagement using various metrics including the duration of the e-book reading session, the number of pages viewed, and activities such as highlighting or taking notes. The (as yet fairly simplistic) analysis is presented in a dashboard which can be viewed from within the virtual learning environment. It includes an "engagement index" derived from the usage data. The company claims that its research shows that this index is a stronger predictor of student academic outcomes than their prior educational attainment.

The dashboards are built on top of the GoodData business intelligence platform, and can be viewed from within the VLE. The software uses the IMS Learning Tools Interoperability framework to make data available to the VLEs. Various US institutions which have deployed the software are profiled on the company website and in some white papers. The suggested users of the product are: instructors (to assess performance and intervene as appropriate), deans and other senior staff (to assess the uptake and effectiveness of e-books), and publishers (to assess the relative impact of digital course materials in order to improve their offerings). There’s no suggestion by VitalSource that learners would benefit from viewing the data, and whether students are comfortable about having their ebook usage analysed in this way is another matter. I did some thinking about this a while back in: Making ebooks more interactive: logistics and ethics.

Analytics from video viewing and contributing

One thing not handled very well by most VLEs is video. Various plugins have emerged to deal with this, and notable amongst them is Kaltura, an open source system available for all the main VLE products. Kaltura deals with the content management aspects of hosting videos, enabling contributions by students as well as staff. It also provides analytics on the viewing and contributing of video clips. This allows staff to see:

- Which videos are students watching the most?
- Which students contribute the most videos?
- Which students watch the most videos?
- How long are students watching each video?

This information can certainly help you discover what your most engaging video content is. It can also give some indication of engagement for individual students, both in viewing and posting. A table shows the top five most engaged users, including how many videos they watched, how many minutes they spent, what their average view time was, and their "average drop-off" percentage, i.e. how much of the videos they actually watched.
This is very limited though for the purposes of learning analytics; a natural evolution for the functionality would be to produce an indicator of student engagement in viewing video (and contributing it if appropriate).  In a similar way to the CourseSmart Analytics engagement index this could be made available to the VLE together with other engagement data to build a fuller picture of student participation in courses with high video content or a requirement to contribute video. The Kaltura website lists a number of high profile US universities as customers together with Durham and Manchester Metropolitan in the UK. Course Signals A more sophisticated engagement reporting system than the ones I’ve described so far for Moodle and Blackboard is Course Signals. This was originally developed at Purdue University and is now one of five products which comprise the Ellucian Student Success Suite.  It’s received much publicity due to claims of dramatic positive correlations between use of the software and measures of student success.  Retention in particular was claimed to be 21% higher among students who had used the system.  However Purdue’s findings were subsequently challenged as not necessarily demonstrating a causal relationship. Like the other simple VLE reporting tools, Course Signals provides indicators of whether students are on track to complete their course, based on their online participation.  The software was built to integrate with VLEs including Blackboard Learn and Desire2Learn Brightspace. At the heart of the system are traffic light indicators, displayed in the VLE, which tell students if they are performing well, holding steady or underperforming, and prompts them to take action.  So a staff member might specify a minimum performance requirement for a particular course.  If a student’s performance falls below this a red signal is provided on the staff dashboard and an email sent to the student.   A yellow signal shows that a student’s performance is approaching the minimum acceptable level, while green suggests that a student is doing what is required to pass.  The main metrics are: Grade - allows you to set value ranges for grades which equate to red, yellow and green signals. Effort - a measure of how much a student uses specified course resources in the VLE within a specified date range. It’s possible to filter students based on the red/yellow/green "signals" that are generated for grade and effort, for example those who have worsened in a class or those who have red signals in two or more classes. There are at least five things which make Course Signals much more sophisticated than the basic reporting tools for Moodle and Blackboard: it can bring in data from outside the VLE such as prior qualifications it presents the indicators to students as well as staff it works across courses rather than just on a single course it can trigger automated interventions it adds workflow management for interventions Tools are provided for detecting at risk students, triggering a response, setting up and tracking an action plan and monitoring its success.  It’s possible to tailor and track your communications using the system, and to add automated event-triggered reminders to students to carry out specified tasks. With this level of workflow management Course Signals begins to have the feel of a customer relationship management system rather than a simple VLE reporting system.  
The indicators of engagement, with all the potential data sources behind them, are boiled down to simple red/yellow/green traffic lights for grade and effort. But these then trigger a range of automated and human interventions and communications which are tracked by the system.  You can have all the metrics you like for measuring engagement but effective management of interventions is what could really start making an impact on student outcomes across an institution.
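As a rough illustration of the traffic-light idea - not Purdue’s or Ellucian’s actual algorithm; the thresholds and field names below are invented - the core logic can be thought of as a pair of banded lookups over grade and effort:

```python
# Hypothetical bands: (lower threshold, colour), checked from best to worst.
GRADE_BANDS = [(70, "green"), (50, "yellow"), (0, "red")]   # percentage grade
EFFORT_BANDS = [(10, "green"), (4, "yellow"), (0, "red")]   # resource accesses in the date range

def signal(value, bands):
    """Return the colour of the first band whose threshold the value meets."""
    for threshold, colour in bands:
        if value >= threshold:
            return colour
    return "red"

def student_signals(grade, effort):
    return {"grade": signal(grade, GRADE_BANDS), "effort": signal(effort, EFFORT_BANDS)}

print(student_signals(grade=45, effort=12))  # {'grade': 'red', 'effort': 'green'}
```

In the real product the bands are set per course by staff, and a red signal feeds into the intervention workflow described above - automated emails, action plans and tracking - rather than simply being printed.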
Niall Sclater . Blog . Aug 19, 2015 01:12am
Vendors are rapidly developing products for analysing learners and their activities. There’s a battle going on between the companies whose primary product is the VLE, those which sell student information systems (SISs) and those who have developed business intelligence systems for industry in general but are now targeting the education sector as a key market. In previous posts I’ve given an overview of the different types of learning analytics systems available and described some of the simple reporting tools for the VLE/LMS. I’ve also looked at how data can be brought in from other sources to provide more sophisticated indicators of engagement, and how systems are beginning to build in workflow around managing the subsequent automated and human interventions. This post is about the emerging analytics systems built around the VLE, the ones with an online learning-centric view of the world.

Given the vast amount of data accumulated by students as they use their VLE, and the fact that for many learners this is the primary way in which they’re interacting with their institution online, it’s not surprising that VLE vendors have spotted an opportunity to develop products which use this data to help institutions improve student success. But VLE log files are only part of the picture. Much of the rest of the data needed for learning analytics is held in the SIS. Vendors such as Blackboard and Desire2Learn have therefore been building in functionality to combine data from some of the most commonly used SISs with data from their own systems, and developing ever more sophisticated analytics products to sit alongside their VLEs (a rough sketch of the kind of VLE/SIS join involved appears at the end of this post).

Blackboard Analytics for Learn

This product is designed to provide more in-depth analysis of student activity and performance than is available in the Blackboard Retention Centre, enabling the correlation of student behaviours to educational outcomes. Whereas Retention Centre works at the level of the course and uses data from Learn only, Analytics for Learn integrates data from the VLE with the student information system and allows analysis of individuals, groups and instructors across multiple courses by staff such as deans, senior management and educational researchers. The software facilitates the analysis of student activity and performance patterns, the identification of at-risk student behaviour, and the measurement of learning outcomes against course grades. It also allows the tracking of KPIs through institutional dashboards. Blackboard gives some examples of the questions which Analytics for Learn can help answer:

- How does student performance differ between courses where the VLE is used and those where it is not used?
- Is there a difference in student performance between courses where the instructor went through a training class and those where the instructor did not?
- How are our efforts at improving course quality making a difference?

The second question highlights another key difference with Retention Centre. Analytics for Learn allows the analysis of teachers’ performance as well, something that will prove controversial in many institutions but will be increasingly of interest to senior managers who wish to monitor the quality of teaching in ways that were never possible in traditional classroom settings.
A whole host of reports are provided on areas such as activity and gradebook exceptions, student performance against learning outcomes, organisational unit performance against learning outcomes, "student-at-a-glance", "course-at-a-glance", most active instructors, and aggregated activity by organisational unit. Learning-related metrics include:

- gradebook scores
- submissions to interactive tools such as discussions
- session and course logins; time, tools and content accessed; time on task
- organisational unit and term information for aggregation and comparison
- "academic standing"

Unlike Course Signals, the software is for carrying out analysis alone, and Blackboard has chosen not to facilitate interventions or any workflow features at this stage through Analytics for Learn. Analytics for Learn is part of the Blackboard Analytics Suite, which consists of a data warehouse with data models, a data transformation process and "out of the box" integration with Blackboard Learn and three student information systems: Oracle PeopleSoft, Datatel, Inc. and SunGard Higher Education. Integration with other student information systems is, according to Blackboard documentation, "easy". Good luck to you though if you have a bespoke SIS. The software is of course only for users of Blackboard Learn. Visualisations can be done with the Pyramid business intelligence tool which integrates with the data warehouse. Other BI tools can be used as well. Blackboard provides consultancy for implementing Analytics for Learn. They perform the initial installation, which takes "several days". Institutions must have skills in the core technologies such as SQL Server and Windows Server.

Desire2Learn Insights

Desire2Learn has recently rebranded its VLE package as Brightspace. Built on top of this is a system called Insights, which enables reporting on engagement, assessment and outcomes. It includes predictive modelling aimed at helping identify at-risk learners. A large number of data visualisation and analytics dashboards are available. Insights includes the Student Success System, which is specifically aimed at enhancing retention. Target users are instructors, educational researchers and senior staff such as deans.

Desire2Learn has done some rapid development and added an impressive array of analytics capability to Insights in a short space of time. It provides a number of pre-configured reports, including some on learning outcomes and how well students have met them. "Competency" reports are available for individuals and at other levels including across courses. Reports can also be obtained on the use of course resources and on "engagement" by users, measured by their logins to the VLE. Other reports are available on enrolments and withdrawals, course grades, quiz scores and statistics on the items themselves, which may help to identify questions which discriminate well or poorly between the strongest and weakest students. There are further reports available on students at risk, showing differences between courses and highlighting individuals. There is also a series of "data mining reports", enabling insight into student behaviour, grade achievements and usage patterns - as well as the effectiveness of learning content. Reports can be obtained on tool access by users. Predictive analytics are deployed to assess risk levels of students using historical course data. Weekly predictions are then prepared, generating a "success index" for each student in the course.
An "assessments predictive chart" shows a student’s performance across all course assessments, and compares their performance with their classmates.  Arrows inside the trend signs indicate whether predictions are negative or positive compared to the previous week.  The dashboard can be filtered based on success levels. Unlike Analytics for Learn, the system does facilitiate some interventions and workflow management.  Emails can be directed to all students in a particular risk category, for example. A pin can be added as a visual reminder beside those students the instructor wishes to follow up with. Insights is tightly integrated with the Brightspace virtual learning environment (formerly Desire2Learn), and built using the IBM Cognos business intelligence system.  Data comes from "multiple source systems" - in particular the Brightspace VLE.  Austin Peay State University reports that they are integrating data from their student information system. A large number of metrics are provided relating to course access, content access, social learning, assessments and preparedness.  It really is becoming a quite impressive offering.  What a pity that hardly anyone this side of the Atlantic is using Brightspace… Further thoughts Most higher and further education institutions in the UK use Moodle or Blackboard Learn.  If you’re using Moodle and you want to carry out in-depth learning analytics there are no sophisticated Moodle-specific analytics systems currently available, and you may need to use a business intelligence tool. Moodle hosting organisations are beginning to provide their own reporting functionality to their clients. Two of these are the University of London Computer Centre (ULCC) and MoodleRooms, headquartered in Baltimore, Maryland.  MoodleRooms’ services include the reporting of activity, grades and engagement proxies, flagging lack of activity and low grades - both within and across courses. MoodleRooms also claims to provide "a 360 degree view of your students by integrating with industry-wide SIS systems", so they are basing their analytics not just on data from Moodle. Both ULCC and MoodleRooms no doubt see learning analytics as key potential differentiators for their services and will be looking to further develop their reporting and analytics tools and deploy them across their growing client bases.  Already 33% of UK higher education institutions are using hosted services for their institutional VLE (See UCISA Technology Enhanced Learning Survey 2014). Learning analytics provided alongside such hosted services will no doubt prove attractive for many institutions, particularly smaller ones without data scientists or sufficient technical expertise to install and maintain their own analytics systems.  On the other hand one institution I have spoken to wants to get usage data from its VLE hosting service into its own data warehouse alongside other institutional data to carry out its own analytics. Blackboard Learn users can deploy Analytics for Learn if they want more than the simple tools offered in Retention Centre.  But not many institutions have yet used Analytics for Learn in anger - certainly not in the UK.  One Blackboard presentation mentions three organisations where it’s being piloted: Montgomery County Community College, University of Maryland Baltimore County and Grand Rapids Community College. 
Given the necessity to enhance retention in many universities and colleges, and the high penetration of Blackboard Learn, there are likely to be many more institutions investigating purchasing the software. Meanwhile only two of the 96 UK higher education institutions who completed the UCISA technology enhanced learning survey in 2014 reported using Brightspace as their primary virtual learning environment. It’s not clear if either of these is yet using Insights.

So is the battle for supremacy in learning analytics systems being lost by the VLE vendors? It would seem so at the moment - at least in the UK. Certainly many educational institutions are already using business intelligence software for their broader requirements and are beginning to explore the potential of these tools for analysing learners and learning. The business intelligence software is necessary anyway: Analytics for Learn is built on top of the Microsoft business intelligence stack and Insights requires IBM Cognos. If only the VLE-centric analytics products could be made more interoperable, then Moodle users for example could plug in Analytics for Learn or Insights and benefit from the innovations of companies based in the world of online learning - as well as benefitting from the underlying business intelligence software and visualisation tools. Meanwhile the SIS vendors are moving quickly into the market and trying to promote an SIS-centric view of learning analytics, some of them even co-developing their products with UK universities.
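To illustrate the kind of VLE/SIS combination these products automate, here is a minimal sketch using pandas over two hypothetical CSV exports. The file and column names are assumptions for the example, not anything actually exported by Blackboard or Desire2Learn.

```python
import pandas as pd

# Hypothetical exports: column names are illustrative only.
vle = pd.read_csv("vle_activity.csv")   # student_id, course_id, minutes_online
sis = pd.read_csv("sis_results.csv")    # student_id, course_id, final_mark, vle_used

# Join activity onto the student record, keeping students with no VLE activity.
merged = sis.merge(vle, on=["student_id", "course_id"], how="left")
merged["minutes_online"] = merged["minutes_online"].fillna(0)

# The first example question quoted earlier: how does performance differ
# between courses where the VLE is used and those where it is not?
print(merged.groupby("vle_used")["final_mark"].mean())

# A crude engagement/outcome relationship within VLE-using courses.
used = merged[merged["vle_used"]]
print(used["minutes_online"].corr(used["final_mark"]))
```

A real product obviously does far more (term structures, many activity types, dashboards, predictive models), but the underlying join of SIS records to VLE activity is essentially this.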
Niall Sclater . Blog . Aug 19, 2015 01:11am
While VLE vendors would like their systems to be at the heart of an institution’s learning analytics efforts, vendors of student information systems (SISs) are now attempting to place their products in the analytics driving seat. The SIS (also known as the student record system) is a vital product for any educational institution. It includes data such as prior educational qualifications, ethnicity and gender, the courses and modules in which students are enrolled, records of absence and assessment results. More sophisticated SISs can contain many other aspects of the institution’s business. All of them provide potentially helpful data for learning analytics, so it’s no surprise that SIS vendors are developing products to help analyse this data - and integrate it with data from other sources such as the VLE. In the UK, some of the biggest players in the higher education SIS market are Tribal and Ellucian. Meanwhile in further education, use of the Compass Pro- suite of tools is widespread. In this post I’ll be examining how the learning analytics offerings of these three vendors are shaping up.

Ellucian Student Retention Performance

Ellucian Student Success includes five "solutions" designed to integrate with Ellucian’s Banner Student SIS. In an earlier post I’ve already described one of them: Course Signals. Another, Student Retention Performance, aims to help identify at-risk students, examine retention and degree completion rates, and analyse the effectiveness of retention strategies. In a complex hierarchy of products, Student Success is one of three modules which comprise the Ellucian Enterprise CRM (Customer Relationship Management system). We’ll ignore the rest of them just now and assume that your institution uses Banner Student as its SIS.

Ellucian is aiming Student Retention Performance at "Academics, advisors, institutional researchers, administrators and executives". It comprises scorecards, dashboards, reports and "analytic capabilities" aimed at improving student retention and success initiatives. Scorecards provide a visual way of measuring progress towards institutional goals and objectives. In one example that is given, "Improve Academic Performance" is the institutional goal, with a measurable objective of "Increase Number of Undergraduate Students in Good Standing". Figures for the number of students in good standing are presented as a key performance indicator, together with a status indicator and a trend. Further visualisations such as historical performance can also be generated. There are various preconfigured scorecards and dashboards with suggested metrics; however, it’s possible to create other customised metrics based on the data in Banner Student.

By some definitions this is higher level business intelligence or academic analytics rather than learning analytics, but it’s possible to drill down to the performance of individual students too. Such products don’t fit neatly into convenient categories for researchers attempting to map out the different types of analytics processes and tools that are emerging. One thing is clear though: Student Retention Performance is not designed as an intervention tool; you’d need to use Course Signals if you want to automate and manage actions by staff. Student Retention Performance uses IBM Cognos. Data appears to be fed from Banner Student only - I cannot find any reference to integration with a VLE product as yet, which makes it somewhat limited.
It may be though that combining this tool with Course Signals, which does integrate data from other systems, would help.

Tribal Student Insight

Tribal is currently developing an analytics offering for its SITS:Vision SIS product with the University of Wolverhampton. This is called Student Insight and aims to help predict student performance and identify at-risk students, enabling interventions to be carried out, subsequent student activity to be tracked and - interestingly - the impact of those interventions to be monitored. Few analytics products so far offer this facility. Tribal refers to library and VLE data being integrated together with demographic data and assessment results. The product is intended to provide information to staff on issues the student might have such as preparation for higher education, engagement, "academic integration" and "social integration" - presumably producing metrics for each of these. It will allow comparison with other students across cohorts too.

The company claims that its product can predict student success with an accuracy of 70% - so still some room for improvement there (a rough sketch of how such an accuracy figure might be checked against a naive baseline appears at the end of this post). It has found in its research with Wolverhampton that VLE usage is highly predictive of student success. We already know that there is a correlation, but it will be interesting to find out what aspects of VLE usage they are measuring - and whether the metric is based on more than a simple analysis of VLE click data from the log files. Finally, they state that social background and the distance students live from the campus are strongly correlated with their success when combined with other factors. All of these metrics are combined into a single overall "Student Success Prediction" metric.

Compass ProMonitor

The final SIS-based product I’ll look at here is widely used in further education and in some respects is ahead of the SIS-centric offerings being developed for higher education. One of the products within the SIS range from Compass Computer Consultants Ltd is ProMonitor. This presents key details about the learner, enables teachers to record meetings and input comments, track them and follow them up, and provides automated email notifications. There are reports on assessments and ones which show student performance against the targets set for them. Other data sources can be integrated and at-risk students can be identified. This has much in common with a customer relationship management system and facilitates the workflow around interventions as well as various visualisations. It’s unclear how the at-risk metric is calculated, but a more sophisticated predictive analytics engine might help in this regard.

Conclusions

The SIS-centric analytics tools are still in their early stages of development, let alone being used in anger by many UK institutions. The three I’ve profiled here all require you to be using that particular SIS, so if you’re not, you’ll need to look at a VLE-centric analytics product or a business intelligence system. In summary:

- Student Retention Performance could be used alongside Course Signals if you’re a Banner Student institution to carry out some interesting analysis together with some automated interventions and workflow management.
- Student Insight looks promising for Tribal SITS:Vision users but you’ll have to wait till it’s released to assess its full potential alongside other types of products.
- Further education colleges are already using ProMonitor and beginning to understand how they can use it to monitor student engagement.
But the system so far lacks some of the tools of emerging analytics solutions such as sophisticated predictive models. I’ve suggested in an earlier post that VLE-centric analytics products are not currently winning the battle for market share in the UK.  Well nor it seems are the SIS-based ones at the moment, with the exception perhaps of ProMonitor in the further education sector.  That leaves us still to investigate to what extent the generic business intelligence tools such as Autonomy, Cognos, QlikView and Tableau are being successfully customised as learning analytics systems.
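One way to read a "70% accuracy" claim is to ask how it compares with simply predicting the most common outcome. The sketch below is a hypothetical illustration of that check using scikit-learn; the file, feature and outcome names are invented, and this is not how any of the vendors above actually evaluate their models.

```python
import pandas as pd
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

history = pd.read_csv("historic_students.csv")   # hypothetical export of past cohorts
X = history[["prior_attainment", "vle_minutes_per_week", "distance_from_campus_km"]]
y = history["completed_module"]                  # 1 = completed, 0 = withdrew or failed

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
baseline = DummyClassifier(strategy="most_frequent").fit(X_train, y_train)

print("Model accuracy:   ", accuracy_score(y_test, model.predict(X_test)))
print("Baseline accuracy:", accuracy_score(y_test, baseline.predict(X_test)))
```

If, say, 70% of students complete their modules anyway, a model that is 70% accurate adds nothing over the baseline - which is one reason accuracy alone is a weak way to judge these products.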
Niall Sclater . Blog . Aug 19, 2015 01:10am
In previous posts I’ve looked at the emerging tools for monitoring user engagement and adding more sophisticated engagement metrics. I’ve also examined the learning analytics systems being offered by vendors of virtual learning environments (learning management systems) and student information systems. VLEs and SISs are the places where most data about students and their learning are held, so it can make sense to deploy a system built on top of one of the key systems already in use in your institution.

Image: BME Solutions Ltd

But many universities and colleges are using generic business intelligence (BI) systems to obtain insight into various aspects of their operations - and are beginning to use them for analysing the student learning experience too. I’m going to look at three of the most widely used systems here: IBM Cognos, QlikView and Tableau.

Cognos is a well-developed BI system in use by many educational institutions to help analyse all aspects of their business including finance, enrolments, human resources and student success. The product is marketed as a comprehensive BI tool rather than being aimed primarily at retention or student success. With the incorporation of SPSS Modeler and the SPSS statistical engine within Cognos, users can perform predictive analytics alongside their other BI functions.

QlikView is marketed as an easy to use BI system (and is reported to be so by some of the institutional users I’ve met, as well as being relatively inexpensive) and already has significant adoption in education. Some institutions have been jointly building a number of educational apps with the company which are available to users of the system.

Tableau is another generic BI and visualisation tool which is being used for analysing many aspects of business, including learning, at educational institutions worldwide.

What do the products do?

Cognos is a generic solution for all aspects of business intelligence and analytics. Its core functionality is described as enabling users to answer key business questions and to:

- Easily view, assemble and personalise information
- Explore all types of information from all angles to assess the current business situation
- Analyse facts and anticipate tactical and strategic implications by simply shifting from viewing to more advanced, predictive or what-if analysis
- Collaborate to establish decision networks to share insights and drive towards a collective intelligence
- Provide transparency and accountability to drive alignment and consensus
- Communicate and coordinate tasks to engage the right people at the right time
- Access information and take action anywhere, taking advantage of mobile devices and real-time analytics
- Integrate and link analytics in everyday work to business workflow and process

This list could be used to describe much of the functionality present in the other products. QlikView allows users to integrate data from a variety of sources and develop dashboards and visualisations of areas such as student performance and retention, easily breaking these down by areas such as age, gender, course and faculty. It does not however deal with workflow and process. The largest UK reseller of QlikView, BME Solutions, has developed a range of education-specific applications for the system which make it relatively easy for an institution to monitor engagement and other metrics.

Image: BME Solutions Ltd

Because of the generic nature of all three of these systems, any required metrics can be generated from the underlying data.
QlikView provides "models" with a number of educational metrics such as NSS scores and an "at risk" indicator. No customised learning analytics applications appear to be offered with Tableau.  A range of visualisations can be created using a drag and drop interface once the data has been linked to the system.  At the University of Kansas academic advisors are given a dashboard built on Tableau of individual students, synthesizing information from various databases.  The tool has been customised to enable communication with students who fall into particular categories such as those who are not performing well or who have not had an advising appointment. Interventions such as emails to groups of users or students at risk can also be triggered based on analytics in all three products. Who are the systems aimed at? Because of the generic nature of all three systems, dashboards, scorecards and reports can be generated for any type of user across the organisation. How extensively are they being used? Already in use by many universities and colleges for a number of years (Sheffield Hallam was using it as far back as 2006, for example), Cognos is increasingly being seen as a solution for learning analytics.  IBM is working closely with various universities in the UK to customise Cognos for their needs, notably London South Bank University. 36 universities in the UK use QlikView and there is a BME Solutions QlikView UK user group which meets every six months for institutions to share experience.  BME also has a small number of customers in UK further education. Though it has a smaller market share in the UK than Cognos and QlikView, many institutions worldwide are using Tableau, including various Ivy League universities.  It is not clear to what extent they are using the product for learning analytics rather than other functions such as enrolment and fund-raising.  However I do know of at least one UK university which has been trialling it for learning analytics visualisations. Technical architecture Cognos is built on a service-oriented architecture and runs alongside a data warehouse which is populated from key systems.  A database is packaged with Tableau and sits alongside it.   Data can be visualised from the local database or directly from other sources which can include Oracle databases and Excel spreadsheets.  With QlikView, data is stored "in memory" rather than in a data warehouse, and the dashboards store the front end plus the entire dataset they are working with. Final thoughts Most UK universities have at least one generic BI tool in use for monitoring and planning of their business activities.  According to the UCISA CIS survey SAP Business Objects is used by 22% of institutions, Microsoft Performance Point by 19% and Oracle’s BI systems by 14%. Considering that Cognos, QlikView and Tableau have a lower market share than these others it’s interesting that they are gaining such prominence in the learning analytics field.  Meanwhile I’m not hearing of the SAP, Microsoft and Oracle products being extensively used as yet for learning analytics, despite their penetration of the BI market in higher education. BI systems do already however provide the underlying capability for some of the learning analytics solutions available.  Blackboard Analytics for Learn uses Microsoft Performance Point to build its dashboards.  Cognos sits under Desire2Learn’s Insights learning analytics product and Ellucian Banner Student Retention Performance . 
Tools such as QlikView and Tableau are reportedly easy to use and relatively cheap to deploy. The fact that they sit outside the VLE and the SIS has both pros and cons.  Unlike the new learning analytics products emerging from VLE and SIS vendors, BI systems are so generic that they may require considerable customisation both to the institutional context and for carrying out learning analytics specifically.  The educational applications and the active communities around some products (notably QlikView) may make that contextualisation much easier.  Meanwhile the fact that they are not tied closely into another product has the benefit of reducing vendor lock-in.
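The intervention step mentioned above - emailing groups of at-risk students from a dashboard - is, at its simplest, a rule over an exported risk score. The following is a hypothetical sketch only: the threshold, file layout and column names are invented, and the messages are printed rather than sent. Anything like this in production would of course need to sit inside the consent, data protection and student support arrangements discussed in the earlier posts on ethics.

```python
import csv
from email.message import EmailMessage

RISK_THRESHOLD = 0.7  # hypothetical cut-off on a 0-1 risk score

def build_intervention_email(student):
    """Draft a supportive check-in message for one flagged student."""
    msg = EmailMessage()
    msg["To"] = student["email"]
    msg["Subject"] = "Checking in on your progress"
    msg.set_content(
        f"Hi {student['first_name']},\n\n"
        "Your tutor has noticed you may be finding this module difficult. "
        "Could you drop in during office hours this week?\n"
    )
    return msg

with open("at_risk_dashboard_export.csv", newline="") as f:
    for student in csv.DictReader(f):
        if float(student["risk_score"]) >= RISK_THRESHOLD:
            email = build_intervention_email(student)
            print("Would send:", email["To"], "-", email["Subject"])  # swap for smtplib in production
```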
Niall Sclater . Blog . Aug 19, 2015 01:09am
When I blogged about learning analytics systems being developed by vendors of student information systems (SISs) I mentioned Tribal’s emerging system, Student Insight, which it’s been developing with the University of Wolverhampton. Tribal’s SITS:Vision product is used by over half of UK higher education institutions. Meanwhile the company’s ebs4 SIS (or MIS as it’s known in further education) is in use by more than 120 colleges in England and Wales. There wasn’t much available on the web about the project, so I contacted the company’s data scientist, Chris Ballard, to find out more.

Niall: What’s the current state of the product?

Chris: Student Insight has emerged from R&D carried out with the University of Wolverhampton. We’re currently working with them and two or three other institutions to validate the capabilities of the product - and next year we plan to expand out the user base considerably. The system’s available now for early adopters.

Niall: How would you describe its functionality?

Chris: Student Insight enables customers to make the best use of the data they hold - not just in SITS but from external data sources, in particular the institution’s VLE or library. Firstly, it allows you to build a model or set of models to predict student risk. We’re focussing at the moment on retention and academic performance risk, in other words the risk of dropping out or failing a module. We use the data in SITS to build this model and supplement it with data relating to engagement. It also shows historic data about the student and enables you to aggregate up risk predictions to module, course and faculty level too, as well as bringing in performance, engagement and retention data at each of these levels. We feel it’s important that an institution is able to use student predictions and analytics of historic data side by side in order to make an informed decision about what should happen. Aggregating this data allows institutions to use the analytics to understand student risk and outcomes at course level, as well as for individual student support. This enables us to meet the needs of those institutions who want a "top down" approach to monitoring student cohorts, as well as a "bottom up" approach where it’s used to proactively identify specific students who may be at risk. The predictive aspect of the system is customisable - we recognise that every institution has different datasets and differing requirements.

Niall: What technologies are you using? Is there an underlying business intelligence system, for example?

Chris: It’s delivered as a software-as-a-service cloud-based system - and also available for on-site installation. We use open source technology and data science libraries built by our team. The database is MongoDB, the widely-used NoSQL platform. It’s a document-oriented database which we use because it enables flexible modelling of different structures at an institution. The predictive engine is built on top of well-established machine learning libraries implemented in Python. We use a machine learning ensemble technique to combine multiple predictive models to produce an overall prediction for individuals and groups. We’re not giving you a black box here but trying to make things transparent to the user. It shows whether there’s a high risk of underperformance and lets you drill down into that to get predictions on individual elements of performance.

Niall: Is it built just to work with SITS or can it integrate with other student information systems?
Niall: Is it built just to work with SITS, or can it integrate with other student information systems?

Chris: Actually, it could work with any SIS. You can push data to it in a number of ways: by importing CSV files, using the system's API, or pulling data in directly from a database. When we were designing the system, a priority was for it to be open so that it could be used with all of Tribal's education management systems, though our initial focus is on SITS customers.

We decided to provide an API because we recognised that institutions have different approaches and might want to deliver analytics at different levels. Many institutions are keen to start at a more aggregated level and to embed the analytics in existing portals. We're working closely with early adopter partners because we want to make sure institutions can properly interpret the data they see, and we're using the results to continuously refine the product.

Niall: What VLEs and other systems does it integrate with?

Chris: You can basically pull in any dataset. If you wanted to integrate Moodle or Blackboard you'd need to extract data from the user log files. This raw data is then loaded into Insight and processed to extract higher-level predictive features. The next step is to have API integration with key VLEs so we can pull the data in directly. Most institutions seem to be interested in using SIS data first and then starting to integrate data from their VLE.

Niall: What about the metrics it produces? Can you tell me a bit more about them?

Chris: The two main areas, as I mentioned earlier, are the prediction of retention risk and academic performance risk. Student Insight allows you to define datasets and map them to factors you determine, such as information you know about the student when they enrol, like their prior results. The system builds a machine learning ensemble against each dataset: you supply it with a set of historic training data and the outcome you know happened historically. In the case of academic performance that might be whether each student passed or failed the module, or achieved a certain grade. The models the system builds learn the patterns in the underlying data, which are then used to predict the outcome you've trained it against.

At its heart it's a very generic machine learning data modelling tool. We're not enforcing any fixed models; we're providing a framework that allows a set of predictive models to be built up and then delivered to different users, using role-based security, through dashboards built with HTML5. But we're also developing a standard set of models that can then be customised and automatically optimised for the institution using them.

Niall: Some of the learning analytics systems available are now building in workflow so that you can manage interventions and feed data on the effectiveness of those interventions back into the models. Does Student Insight do that at all?

Chris: Yes, managing interventions is a key part of the system. It allows you to flag cohorts of students or individuals at risk and then undertake an intervention. It integrates with another of our products called ESD (Enterprise Service Desk), which allows an institution to manage student support processes and also provides helpdesk capabilities. Intervention delivery can be managed and monitored by the student support team using ESD, based on the institution's support workflow and policies. On our product roadmap is to consume what happens as a result of the intervention: we'll look at the history of predicted outcomes for the student and then feed that back into the predictive model.
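As a rough illustration of the flagging step Chris mentions, the snippet below takes hypothetical risk scores from a predictive model, selects students above an institution-chosen threshold, and records a planned intervention. The threshold, data and field names are invented for the example; in Student Insight, intervention delivery is actually managed through ESD, and the feedback loop into the model is described above as a roadmap item.

```python
# A rough illustration of "flag at-risk students, then intervene". The
# threshold, data and record-keeping are assumptions for the example, not
# how Student Insight or ESD actually work.
import pandas as pd

# Hypothetical risk scores produced by a predictive model for one module.
predictions = pd.DataFrame({
    "student_id": ["s1", "s2", "s3", "s4"],
    "risk_of_failing": [0.81, 0.22, 0.64, 0.12],
})

RISK_THRESHOLD = 0.6  # illustrative cut-off chosen by the institution

# Flag the cohort above the threshold and record a planned intervention,
# which a support team could then track through its own workflow.
at_risk = predictions[predictions["risk_of_failing"] >= RISK_THRESHOLD].copy()
at_risk["intervention"] = "tutor_meeting"
at_risk["status"] = "open"

print(at_risk)
# Outcomes of these interventions could later be joined back onto the
# historic training data so future models learn from what actually happened.
```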
Niall: Well that’s helped to fill in the gap in my knowledge about Student Insight, which pretty much completes my review of the learning analytics systems available.  Thanks very much, Chris - and please keep us posted about how the product is developing.
Niall Sclater   .   Blog   .   <span class='date ' tip=''><i class='icon-time'></i>&nbsp;Aug 19, 2015 01:09am</span>