Jisc’s draft Code of Practice for Learning Analytics is now available for public consultation for the next two weeks: Code of Practice for Learning Analytics v04 [MS Word]. I’ve already had some very useful comments back from a number of invited organisations and individuals which will help to enhance the document and the accompanying online guidance.
The background to the Code is provided in an earlier blog post. The deadline for comments from the public consultation is 5th May. These will then be presented to the Code of Practice Advisory Group which will agree a final version.
I’d be very grateful for any further thoughts from individuals or organisations either by commenting on this blog post or emailed to me at niall.sclater {/at/} jisc.ac.uk
We intend to release the final version of the Code in June and will be developing accompanying online guidance and case studies over the coming months.
Niall Sclater
Aug 19, 2015 01:04am
In February we ran a workshop in London with university staff and a couple of students to gather requirements for a student app. I’m now carrying out sessions directly with students to find out what they would find most useful. Yesterday I had the pleasure of visiting the University of Lincoln at the invitation of Marcus Elliott. The students were from a variety of levels and backgrounds, ranging from engineering to drama.
[Image: MAB (Main Admin Building). Credit: University of Lincoln]
Most of them had little idea of what learning analytics was about so I introduced the session by describing a few things that were being done in the area - attempting not to influence their thinking too much. Marcus and I had agreed that we were better starting with a blank slate and then looking at whether there was any common ground with the conclusions of the London workshop.
As with the previous event it was a challenge to keep the group focussed on the applications of learning analytics without straying into all the useful things that apps could do for students. I felt it was better though just to let the ideas flow, and not to impede the creativity in the room.
The students came up with ideas for functionality, put them on stickies, and discussed them with a partner. Then they all came together and spontaneously grouped the ideas into four categories: academic, process of learning, social integration and system monitoring / institutional data.
At this stage we didn’t want to look too much at presentational issues; however, we provided the students with blank smartphone screen templates to scribble on in order to focus them on what the functionality might involve.
Inevitably there was a focus on assessment and, as with the London workshop, up-to-date data on grades was thought to be one of the most useful things a student app could provide.
Is this learning analytics? I don’t know - but ideas such as showing your ranking in the class and being able to manage processes from this screen such as clicking to arrange a tutorial would certainly be useful.
Other ideas included calendar reminders of assessment due dates and exams.
The app could provide a one-stop shop for all of a student’s results.
It could show not only what percentage of assessments the student has completed but also what grades they need to obtain in future assignments in order to receive different levels of degree award.
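The "grades needed" calculation behind such a screen is simple weighted arithmetic. Here's a sketch, assuming percentage marks and fractional assessment weights (the function name and the numbers are illustrative, not from the workshop):

```python
def required_average(completed, remaining_weight, target):
    """Average mark needed on remaining assessments to hit `target` overall.

    completed: list of (mark, weight) pairs for finished assessments.
    Marks and target are percentages; weights (including remaining_weight)
    are fractions summing to 1.0.
    """
    earned = sum(mark * weight for mark, weight in completed)
    if remaining_weight <= 0:
        raise ValueError("no assessments left to influence the result")
    return (target - earned) / remaining_weight

# A student has scored 62% on coursework worth 40% of the module; the exam
# is worth the remaining 60%. What average do they need for a 2:1 (60%)?
needed = required_average([(62, 0.4)], 0.6, 60)
print(round(needed, 1))  # 58.7
```

A real implementation would also need institution-specific classification boundaries and rules for capped or resit assessments, which vary widely.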
Better feedback to students from their lecturers was also thought to be something the app could facilitate. This student neatly links personalised feedback to more detailed suggestions on how to improve particular skills e.g. academic writing skills and options for self-development such as links to help sessions which could be placed directly into the student’s calendar.
Giving real-time feedback to lecturers rather than waiting till it’s too late via student surveys was another option. This could help speed up improvements to courses.
Providing reading list functionality was also popular with the attendees. Here students are presented with metrics showing how much they’re engaging with the reading list on each of their modules.
Reading list functionality could also include reviews, comments and recommendations from other students (perhaps building on the features of goodreads).
They also suggested Amazon-style recommendations for reading e.g. "if you liked x you may like y".
How you spend your time was another application which the students thought could be useful. This example shows the percentage of time spent by the student on various activities. The data itself could be assembled from timetables, calendars, geo-location and self-declared activity.
Recommendations on how much time should be spent on different activities could be another helpful feature.
Managing event attendance was another popular option. The student could be contacted about societies and social events, workshops, guest lectures etc - all of which would be based on their interests, which they could also specify via the app. This would cut down on the amount of "spam" messages from the University which they say have led to many students not bothering to read their emails.
You could invite people to events you are organising, or push events to their app - again based on the interests they have specified.
Rating events would also be a useful feature.
If analytics determined that a student was becoming disconnected the app could introduce them to opportunities such as open day volunteering. There was a suggestion that University and Student Union data could be combined to suggest such opportunities based on career aspirations and interests.
Another option is to use the app to check-in to lectures, perhaps automatically using geo-location, and to enter reasons (such as illness) for non-attendance. There could also be notifications on lecture cancellations.
The app could contribute the events you attend to a portfolio of attended lectures.
Finding other students with similar or complementary interests was a popular suggestion too. This idea came from a postgraduate student who recognised the value of interdisciplinary contact so that you could look for someone to help you in an area you were less familiar with. You could specify what skills you have on offer and what you’re looking for assistance with.
Though we didn’t ask her to do it, showing how the different functions of the app could be accessed was important to this student in order to understand how everything would fit together.
Another generic suggestion was that the app should keep you logged in all the time.
So all in all some great suggestions from this group of students in Lincoln. Some of them aren’t what are normally considered learning analytics applications but they all rely on data - some of it new data, such as students being able to specify their interests in more detail in order to receive targeted materials and details of events.
There’s a lot of complementarity with what staff thought of in the London workshop. It’ll be interesting to see now what students at other institutions come up with.
Niall Sclater
Aug 19, 2015 01:03am
On a fabulous spring day last Friday, around 44 people made it to Edinburgh for the second meeting of the Learning Analytics Network, jointly organised by Edinburgh University and Jisc.
I suspect a number of us may have been somewhat bleary-eyed after witnessing the political landscape of the UK being redrawn during the previous night. However the quality of the presentations and discussion throughout the day seemed to keep everyone awake. It was particularly interesting to hear about the various innovations at Edinburgh itself, which is emerging as one of the pioneering institutions in learning analytics.
Edinburgh’s visionary approach is highlighted by the recent appointment of Dragan Gašević as Chair in Learning Analytics and Informatics, and it was great to have Dragan give the first presentation of the day: Doing learning analytics in higher education: Critical issues for adoption and implementation PDF - 1.21MB.
Dragan outlined why learning analytics is increasingly necessary in education and examined some of the landmark projects so far such as Signals at Purdue and the University of Michigan’s development of predictive models for student success in science courses.
In an Australian study Dragan was involved with, he found a lack of a data-informed decision-making culture in universities and that, while researchers are carrying out lots of experimentation, they are not focussed on scaling up their findings. Finally Dragan looked at ethics and mentioned the Open University’s policy and Jisc’s (soon to be released) Code of Practice for Learning Analytics.
Next up was Sheila MacNeill on Learning Analytics Implementation Issues (Presentation on Slideshare). Sheila gained expertise in learning analytics while working for Cetis and has now been attempting to put this into practice at Glasgow Caledonian University. On arriving at the institution 18 months ago she found it difficult to get to grips with all the systems and data of potential use for learning analytics. She started by identifying the key areas: assessment and feedback, e-portfolios, collaboration and content. This data is hard to get at and needs a lot of cleaning before it can be used for learning analytics.
Sheila’s summary slide outlines the main issues she’s encountered:
Leadership and understanding is crucial - you need both a carrot and stick approach.
Data is obviously important - ownership issues can be particularly problematic.
Practice can be a challenge - cultural issues of sharing and talking are important.
Specialist staff time matters - learning analytics has to be prioritised for progress to be made.
Institutional amnesia can be an issue - people forget what’s been done before and why.
Zipping back to the East Coast, Wilma Alexander talked about Student Data and Analytics Work at the University of Edinburgh (PDF - 866kB). She discussed attempts to use Blackboard Learn and Moodle plug-ins for learning analytics, finding that neither of them were designed to provide data to students themselves. They then collected 92 user stories from 18 staff and 32 students. Much of what people wanted was actually already available if they knew where to look for it. Students wanted to understand how they compare with others, to track their progress, and to view timetables, submissions and assessment criteria.
The next presenter, also from Edinburgh, was Avril Dewar: Using learning analytics to identify ‘at-risk’ students within eight weeks of starting university: problems and opportunities (PPT - 351kB). Avril discussed her work at the Centre for Medical Education to develop an early warning system to identify disengaged students. 80% of at-risk students were identified by the system. Metrics included: engagement with routine tasks, completion of formative assessment, tutorial attendance, attendance at voluntary events, and use of the VLE.
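A rule-based early warning system of this kind might combine such metrics into a weighted engagement score. The sketch below is illustrative only - the weights and threshold are invented, not the Centre for Medical Education's actual model:

```python
# Invented weights over the kinds of metrics Avril described; each metric
# is normalised to a 0-1 value before scoring.
WEIGHTS = {
    "routine_tasks": 0.3,          # proportion of routine tasks completed
    "formative_assessment": 0.25,  # proportion of formative assessment done
    "tutorial_attendance": 0.25,
    "voluntary_events": 0.05,
    "vle_use": 0.15,               # VLE activity relative to cohort median
}

def engagement_score(metrics):
    """Weighted engagement score in [0, 1]; missing metrics count as 0."""
    return sum(WEIGHTS[name] * metrics.get(name, 0.0) for name in WEIGHTS)

def at_risk(metrics, threshold=0.4):
    """Flag a student whose overall engagement falls below the threshold."""
    return engagement_score(metrics) < threshold

student = {"routine_tasks": 0.2, "formative_assessment": 0.5,
           "tutorial_attendance": 0.3, "voluntary_events": 0.0,
           "vle_use": 0.4}
print(at_risk(student))  # True
```

Real systems typically fit such weights from historical outcome data rather than setting them by hand, which is where the 80% identification figure would come from.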
Yet another Edinburgh resident, though this one working for Cetis rather than the University, was next. Wilbert Kraan presented on The Feedback Hub - where qualitative learning support meets learning analytics (PPT - 1.86MB). The Feedback Hub is part of Jisc’s Electronic Management of Assessment project, working with UCISA and the Heads of eLearning Forum. It aims to provide feedback beyond the current module, looking across modules and years. Wilbert proposed that feedback related data could be a very useful input to learning analytics.
My colleagues Paul Bailey and Michael Webb (most definitely neither from Edinburgh) and I (from Edinburgh originally!) then updated attendees on progress with Jisc’s Effective Learning Analytics programme (PDF - 318kB). In particular we described the procurement process for the basic learning analytics system (which will be the subject of further publicity and another imminent post on this blog) to be made available freely to UK universities and colleges. We also discussed the Discovery Stage where institutions can receive consultancy to assess their readiness for learning analytics. Paul concluded by mentioning the next Network Event at Nottingham Trent University on 24th June (Booking Form).
Later we had several lightning talks, the first from Prof Blazenka Divjak of the University of Zagreb, though currently a visiting scholar at, you guessed it, the University of Edinburgh. Blazenka presented on Assessment and Learning Analytics (PPTX - 385kB). She’s found the main challenge in learning analytics to be the management and cleansing of data. She discussed two projects undertaken at Zagreb. The first examined the differences in performance between groups based on gender, previous study etc. The second analysed the validity, reliability and consistency of peer assessment. She demonstrated visualisations which allow students to compare themselves with others.
Paula Smith from Edinburgh gave another interesting lightning presentation on The Impact of Student Dashboards. She reported on an innovation in their MSc in Surgical Sciences which expanded on existing tracking of students via an MCQ system to create a student dashboard. This allowed students to monitor their performance in relation to others, provided staff with information on at-risk students, and made it possible to evaluate the impact of any interventions that took place as a result. Most students welcomed the dashboard and many thought they would want to view it monthly.
Finally, Daley Davis, from Altis Consulting, talked about what his company is doing in Learning Analytics (PDF - 663kB). Altis is an Australian company and Daley discussed how Australian institutions are extremely focussed on retention due to the funding regime. Working with the University of New England, Altis helped cut attrition rates from 18% to 12%. A student "wellness engine" was developed to present data at different levels of aggregation to different audiences. Inputs included data from a system which asked students to report their emotional state.
In the afternoon we split into groups to discuss the "burning issues" that had emerged for us during the day. These were:
Make sure we start with questions first - don’t start with a technical framework
Data protection and when you should seek consent
When to intervene - triage
Is the data any use anyway?
Implementing analytics - institutional service versus course/personal service
Metrics and reliability
Institutional readiness / staff readiness
Don’t stick with deficit model - focus on improving learning not just helping failing students
Treating cohorts / subject disciplines / age ranges differently
Social media - ethics of using Facebook etc for LA
Can’t not interpret data just because there’s an issue you don’t want to deal with
Using LA at either end of the lifecycle
Ethics a big problem - might use analytics only to recruit successful people
Lack of sponsorship from senior management
Essex found through student surveys that students do want analytics
I’m immensely grateful to Nicola Osborne for her comprehensive live blog of the event from which this summary draws heavily. I wish there was a Nicola at every event I attended!
Niall Sclater
Aug 19, 2015 01:03am
Today Jisc is launching the Code of Practice for Learning Analytics at the UCISA Spotlight on Digital Capabilities event here in the amazing MediaCityUK at Salford Quays.
Developing this document was chosen by institutions as the number one priority for Jisc’s learning analytics programme. The Code aims to help universities and colleges develop strategies for dealing with the various ethical and legal issues that may arise when deploying learning analytics.
It’s a brief document of four pages and is available in HTML or in PDF. The development of the Code was based on a literature review of the ethical and legal issues. From this a taxonomy of the ethical, legal and logistical issues was produced. The Code was drafted from this taxonomy and is grouped into seven areas:
Responsibility - allocating responsibility for the data and processes of learning analytics within an institution
Privacy - ensuring individual rights are protected and data protection legislation is complied with
Validity - making sure algorithms, metrics and processes are valid
Access - giving students access to their data and analytics
Enabling positive interventions - handling interventions based on analytics appropriately
Minimising adverse impacts - avoiding the various pitfalls that can arise
Stewardship of data - handling data appropriately
The Code was developed in the UK context and refers to the Data Protection Act 1998; however, most of it is relevant to institutions wishing to carry out learning analytics anywhere, particularly in other European countries which have similar data protection legislation. It can be adopted wholesale or used as a prompt or checklist for institutions wishing to develop their own learning analytics policies and processes.
If you find the document helpful or feel that anything is unclear or missing please let us know. Keeping it concise was thought to be important but that meant leaving out more in-depth coverage of the issues. Over the coming months we’ll be developing an associated website with advice, guidance and case studies for institutions which wish to use the Code.
Acknowledgements
The process has been overseen by a Steering Group consisting of Paul Bailey (Jisc), Sarah Broxton (Huddersfield University), Andrew Checkley (Croydon College), Andrew Cormack (Jisc), Ruth Drysdale (Jisc), Melanie King (Loughborough University), Rob Farrow (Open University), Andrew Meikle (Lancaster University), David Morris (National Union of Students), Anne-Marie Scott (Edinburgh University), Steven Singer (Jisc), Sharon Slade (Open University), Rupert Ward (Huddersfield University) and Shan Wareing (London South Bank University).
It was particularly good to have the student voice represented in the development of the Code by David Morris of the NUS. I’m also especially grateful to Andrew Cormack and Rupert Ward for their perceptiveness and attention to detail on the final draft. I received additional helpful feedback, most of which I was able to incorporate, from the following people (some in a personal capacity, not necessarily representing the views of their organisations):
Helen Beetham (Higher Education Consultant), Terese Bird (University of Leicester), Crispin Bloomfield (Durham University), Alison Braddock (Swansea University), Annemarie Cancienne (City University London), Scott Court (HEFCE), Mike Day (Nottingham Trent University), Roger Emery (Southampton Solent University), Susan Graham (Edinburgh University), Elaine Grant (Strathclyde University), Yaz El Hakim (Instructure), Martin Hawksey (with other members, Association for Learning Technology), Ross Hudson (HEFCE), John Kelly (Jisc), Daniel Kidd (Higher Education Statistics Agency), Jason Miles-Campbell (Jisc), George Munroe (Jisc), Jean Mutton (Derby University), Richard Puttock (HEFCE), Katie Rakow (University of Essex), Mike Sharkey (Blue Canary), Sophie Stalla-Bourdillon (Southampton University), Sarah Stock (University of Essex) and Sally Turnbull (University of Central Lancashire).
Finally, many thanks to Jo Wheatley for coordinating the production of the print and HTML versions of the Code.
Niall Sclater
Aug 19, 2015 01:03am
I’ve just submitted a paper to a forthcoming "Special Section on Ethics and Privacy" in the Journal of Learning Analytics (JLA). The paper documents the development of Jisc’s Code of Practice for Learning Analytics through its various stages, incorporates the taxonomy of ethical, legal and logistical issues, and includes a model for developing a code of practice which could be used in other areas.
A model for the development of a code of practice
As an open journal the JLA suggests that authors publish their papers before or during the submission and review process - this results in the work getting out more quickly and can provide useful feedback for authors. So here’s the paper - and if you have any feedback it would be great to hear from you.
Abstract
Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organisation which champions the use of digital technologies in UK education and research, has attempted to address this with the development of a Code of Practice for Learning Analytics. The Code covers the main issues institutions need to address in order to progress ethically and in compliance with the law. This paper outlines the extensive research and consultation activities which have been carried out to produce a document which covers the concerns of institutions and, critically, the students they serve. The resulting model for developing a code of practice includes a literature review, setting up appropriate governance structures, developing a taxonomy of the issues, drafting the code, consulting widely with stakeholders, publication, dissemination, and embedding it in institutions.
Niall Sclater
Aug 19, 2015 01:03am
Jisc’s Summer of Student Innovation in Birmingham
Our app to display learning analytics to students themselves is taking shape. After brainstorming requirements for the app with staff and students at a workshop in London and seeking further input from students at Lincoln University we obtained useful feedback on our draft designs from students at our Summer of Student Innovation (SOSI) session in Birmingham in early August.
Continuing my student app design tour of England, I joined colleagues from Jisc and Therapy Box last week in Manchester to apply the same methodology to app design that our SOSI students have been using. The technique, developed by live|work, includes devising personas and user journeys, carrying out competitor and SWOT analyses, defining business and technical requirements, walking through concepts with other teams, and developing and user-testing wireframes.
This was a highly effective process, enabling us in a day and a half of intensive work to narrow down a large number of potential requirements to a manageable feature set, and to tackle key aspects of presentation and navigation. The result is a set of wireframes for the designers at Therapy Box to get their hands on before they start the build.
A major influence on our thinking is the use of fitness apps such as MapMyRide and Google Fit: some of us are already avid users of these technologies. To emulate their addictive qualities in an app for learning is one of our aims. In developing the concepts we were informed by some assumptions which have emerged from earlier work:
That students can be motivated by viewing data on how they’re learning and how they compare with others
That making the app as engaging as possible is likely to result in greater uptake and more frequent access to it, with a corresponding positive impact on motivation
That increased motivation is likely to lead to better learning - with positive impacts on retention and student success
We do of course recognise that the app may have negative effects on some students who find it demotivating to discover just how badly they’re performing in relation to others. However there’s a strong argument that it’s not in these students’ interests to remain with their heads in the sand. Meanwhile if data exists about them shouldn’t we be helping students to see that data if they want to?
Moving on from ethical issues, which I’ve covered extensively in an earlier post, six principles which we want to embody in the app are now explicit. We believe it should be:
Comparative - seeing how you compare with class averages or the top 20% of performers for example may provide a competitive element or at least a reality check, driving you to increased engagement.
Social - being able to select "friends" with whom to compare your stats may add another level of engagement.
Gamified - an app which includes an element of gaming should encourage uptake by some students. This may be manifested in the use of badges such as a "Library Master" badge for students who have attended the library five times.
Private by default - while data that the institution knows about you from other systems may be fed to the app, the privacy of any information you input in the app yourself will be protected. However anonymous usage information may be fed back to the app backend.
Usable standalone - by students whose institutions are not using the Jisc architecture.
Uncluttered - the app should concentrate for the time being on learning analytics data and not be bloated with functionality which replicates what is already present in the VLE/LMS or in other applications.
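The "Gamified" principle could be driven by simple count-based rules over the events the warehouse already records. A minimal sketch - the badge names other than "Library Master" and all the thresholds here are invented:

```python
# Hypothetical badge rules: badge name -> (event type, count threshold).
BADGE_RULES = {
    "Library Master": ("library_visit", 5),
    "VLE Regular": ("vle_login", 10),
}

def earned_badges(events):
    """Return the badges earned, given a list of event-type strings."""
    counts = {}
    for event in events:
        counts[event] = counts.get(event, 0) + 1
    return [badge for badge, (event, threshold) in BADGE_RULES.items()
            if counts.get(event, 0) >= threshold]

print(earned_badges(["library_visit"] * 5 + ["vle_login"] * 3))
# ['Library Master']
```

Keeping the rules declarative like this would let institutions tune thresholds without code changes, though what actually motivates students is an open question for the user testing.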
So let me now take you through the draft wireframes to show how these principles are being taken forward.
When first logging in, the student is able to select their institution from a pre-populated list of UK universities. If the student’s institution is using other parts of Jisc’s learning analytics architecture, in particular the learning analytics warehouse, then much more data will be available to the app.
For simplicity we’re ignoring for the time being the use case of a student signed up with more than one institution.
But we’re incorporating functionality which we think will be of interest to students regardless of whether their institution has signed up. That’s setting targets and logging their learning activities, about which more later.
While what should go into an activity feed or timeline is likely to be the subject of much debate and future educational research, we plan to integrate this dynamic and engaging concept, so essential to applications such as Twitter and Facebook.
The wireframes are intentionally black and white and allow space to incorporate images but not the images themselves - in order to concentrate on concepts, layout and navigation at this stage.
Here the images may be of your friends or badges awarded. We include examples of possible notifications such as "Sue studied for 2 hours more than you!" but at this stage make no comment as to whether these would be motivational, irritating or otherwise. Future user testing will help clarify what should be included here and how the notifications should be worded.
The engagement and attainment overview mirrors what many fitness apps do: it provides an overview of your "performance" to date. Critically here we show how you compare to others. This will be based on data about you and others held in the learning analytics warehouse. It may include typical data used for learning analytics such as VLE/LMS accesses, library books borrowed, attendance records and of course grades.
We’ll research further how best to calculate and represent these comparators or metrics. At this stage we’ve chosen to avoid traffic light indicators for example as these would require detailed knowledge of the module and where the students should be at a particular point in time.
Now let’s see what happens when you click the More button.
In the activity comparison screen you’ll see a graph of your engagement over time and how it compares with that of others. You can select a particular module or look at your whole course. We’ll populate the drop-down list with options for who you can compare yourself with such as people on my course, people on this module and top 20% of performers (based on grades).
Comparing yourself to prior cohorts of students on a module might be of interest in the future too.
We may show a graph here with an overall metric for "activity" based on VLE/LMS usage, attendance etc. Or we may want to break this down into its components.
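A comparison like this might be computed along the following lines - a sketch only, with invented data and a naive "top 20% by grade" cut, not the warehouse's actual query:

```python
from statistics import mean

def comparison_stats(activity, grades, student_id):
    """Compare one student's weekly activity count with the cohort.

    activity: {student_id: weekly event count}, grades: {student_id: mark}.
    Returns (student's count, cohort mean, mean of the top 20% by grade).
    """
    cohort = list(activity)
    top_n = max(1, len(cohort) // 5)  # top 20%, at least one student
    top = sorted(cohort, key=lambda s: grades[s], reverse=True)[:top_n]
    return (activity[student_id],
            mean(activity[s] for s in cohort),
            mean(activity[s] for s in top))

# Illustrative cohort of five students.
activity = {"a": 12, "b": 30, "c": 18, "d": 25, "e": 20}
grades = {"a": 55, "b": 78, "c": 62, "d": 71, "e": 48}
print(comparison_stats(activity, grades, "a"))  # (12, 21, 30)
```

Whether "activity" is a single composite count, as here, or broken into VLE use, attendance and so on is exactly the design question left open above.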
The next feature of the app allows you to log your activities. This is some of the "self-declared" activity that we think students might want to input in order to gain a better understanding of what learning activities they’re undertaking and how much effort they’re putting into each.
Let’s click Start an activity.
Starting an activity allows you to select the module on which you’re working, choose an activity type from a drop-down list such as reading a book, writing an essay, or attending a lab, and select a time period you want to spend on the activity and whether you want a notification when that period is up.
A timer is displayed in the image box and you can hit the Stop button when you’ve finished. The timer will continue even if you navigate away from the app.
Setting a target is the final bit of functionality we want to include in the app at this stage. Again this is building on the success of fitness tracking apps where you set yourself targets as a way of motivating yourself.
In this example the user has set a target of reading for 10 hours per week across all their modules. The image will show a graphic of how close they are to achieving that target based on the activity data they have self-declared in the app.
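The progress calculation behind such a graphic is straightforward. A sketch, with illustrative session data rather than anything from the real app:

```python
from datetime import datetime, timedelta

def target_progress(logged, target_hours):
    """Fraction of a weekly target met, from self-declared sessions.

    logged: list of (start, end) datetime pairs for one activity type in
    the current week. Values over 1.0 mean the target has been exceeded.
    """
    total = sum((end - start for start, end in logged), timedelta())
    return total.total_seconds() / 3600 / target_hours

# Two reading sessions this week: 2.5 hours and 2 hours, of a 10-hour target.
week = [
    (datetime(2015, 8, 17, 9, 0), datetime(2015, 8, 17, 11, 30)),
    (datetime(2015, 8, 18, 14, 0), datetime(2015, 8, 18, 16, 0)),
]
print(f"{target_progress(week, 10):.0%}")  # 45%
```

Since the data is self-declared via the activity timer, the graphic can only ever be as honest as the student's logging habits - which is arguably fine for a self-motivation tool.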
Navigation to your next target may be through swiping.
Setting a target involves selecting a learning activity from a pre-populated list and specifying how long you want to be spending on it.
We added a "because" free text box so that learners can make it clear (to themselves) why they want to carry out the activity (e.g. "I want to pass the exam", "tutor told me I’m not reading enough").
Users may be more likely to select a reason from a pre-populated list than to fill in a text field but we’ll monitor this to see whether it’s being used.
We’re also considering the use of mood indicators here to show how happy or otherwise you’re feeling about how you’re getting on with meeting your target. Lots of potential for comparing your mood with others, showing how it’s changed over time or even sentiment analysis by institutions if students want to share such information with them - but that’s one to tackle later.
This doesn’t include all the screens we’ll need but we do now have a pretty good idea of the initial functionality to be incorporated in the app, its layout and navigation. There’ll no doubt be a fair bit of tweaking before v0.1 is built but you should get the general idea of what’ll be available from the screens above. We make no attempt at this stage to incorporate predictive analytics, which might show for example whether you’re on track to succeed or drop out. That will come in future iterations as, no doubt, will some of the other great ideas people have come up with that we’re leaving out of this first version of the app, scheduled for release in April 2016.
Niall Sclater
Aug 19, 2015 01:02am
I will begin by distinguishing between what we call 'must know' information and 'good to know' information. Must know information is directly linked to the learning outcomes. This information must be presented upfront and the learner should not have to search for it. Good to know information is information that the learner can view if he/she wishes to read a little extra about the topic. It can be displayed as 'click to know' text, hyperlinks, tabbed presentations and so on. It must not be presented upfront as it is not crucial to the learning objectives. The other reason for this is quite simple: learners tend to miss clicking the other tabs, links and buttons. Therefore, any information that will influence the learning outcome must be presented upfront.
Archana Narayan
Aug 19, 2015 01:02am
What is creativity? Thinking up something new or original? Not necessarily... The ideas we come up with are typically related to the knowledge we already have. No idea comes from nothing. Every idea is inspired by an old one or something you have read, seen or experienced. You either innovate on an old idea or put a few ideas together to come up with a more 'new' idea. Creativity is about thinking differently, stretching the boundaries, trying things you haven’t done previously, or improvising on an existing idea.
Creativity is inspired by passion. When you love what you do, you come up with better ideas to do it better. You need your own space and time to be creative. You also need to be free of tension, stress and pressure. Organizations must give their employees the space to think freely and the freedom to execute new ideas. This will encourage employees to be creative at work.
Is creativity a skill? I think so. You can consciously work on being creative.
Brainstorming helps hone your creative ability.
Stay in touch with what’s happening around you. This could be news, movies, good books, music, and so on.
Identify your personal space. You need room to think. Make this space for yourself.
Get inspired by creative things around you. This could be people, things, words, ideas.
Read, read and read some more. Read on varied topics. This will help open your mind.
Discuss, debate, argue. Engage in discussions with colleagues, friends, and family.
Ask questions like 'Why not?' rather than accepting things as they are.
Think you can and not you can’t. Sit with a notepad and list down various possibilities. You will never know if you can or can’t till you try it.
If you think you are stuck and no creative juices are flowing, take a break. Do something that will help you relax and loosen up.
Have confidence in yourself. Only if you are sure of yourself will you try to do something different.
Lateral thinking helps. Think beyond the obvious.
Know your facts/stuff.
Not all ideas are doable. But noting them down will help you filter and build on the idea that will work for you.
What hinders creativity?
Ignorance
Lack of confidence
Noise and crowd
Working mechanically with no thought
Rigidity
Laziness
Narrow mindedness
You can see creativity everywhere. You can see this in the way Tupperware boxes are designed, specific advertisements that capture our interest, the choice of clothes we wear, the interiors of your house, and so on.
Archana Narayan
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Aug 19, 2015 01:02am</span>
|
|
Why are gain-attention strategies important? Within the first few minutes of a training program, the learner decides whether the course is worth his/her time or not. An effective gain-attention strategy has the power to increase the motivational level of your learner. Using one, you can:
Arouse their curiosity.
Make them think about a particular concept.
Make them laugh or break the ice.
Help them grasp what is going to be covered in the course.
Build expectations.
Basically, it will make your learner want to see what comes ahead. Imagine! The learner is actually interested in learning. He/she is going to give you and your training program a chance.
What qualifies as a gain-attention strategy?
1. A pre-test that tells them where they stand at the beginning of the course
For example: Before we begin this module, let us attempt a brief questionnaire to identify your personality trait.
2. A challenge thrown their way
For example: You are a customer care executive at a call center. You have several customers calling you for information. You need to provide them with the information they require and close each call quickly to take the next one. How many calls can you close by the end of the day?
3. A problem-solution approach
For example: You have been appointed as the manager at SimCom. You manage a team of six smart and talented people. Your team's performance has been very poor over the past few months. You need to motivate your team and ensure that each person gives his/her best to this project. Your boss is keeping a close eye on you. Good luck!
4. A statistical report
For example: Attrition rates are within the range of 30-60% in the BPO industry. The typical reasons for attrition are salary, work timings, better jobs, and so on.
5. Did you know?
Did you know that the Big Five is a group of animals of Africa: cape buffalo, elephant, leopard, lion and rhino? The term was coined by hunters because of the challenge of hunting these ferocious wild animals when cornered.
6. A comic strip
7. A story/drama
For example: You are a detective. Weird things have been happening in Reth City. Reports show that the number of males has been rising rapidly, and no one seems to know the reason behind this. You have been offered this case. You need to go to the city to understand what is happening. You can talk to the city dwellers. If they seem secretive, you can look around the city for clues. Your assistant, Shweta, will hand out reports and newspaper clippings to help you crack the case. Hurry!
You can think of several innovative ways to design gain-attention screens. If you have come across any, share them.
Archana Narayan
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Aug 19, 2015 01:01am</span>
|
|
Every Tuesday and Friday, we have learning sessions presented by one of us. This week, it was my turn. I will be presenting on Multi-User Virtual Environments (I will blog about this later). As with all presentations, I was reading up on the topic. During my coffee break, I picked up MetroPlus (The Hindu) and read the first article, Trapped in the Net. The article talks about Internet addiction disorder (IAD), "...pathological use of computers, to engage in social interactivity."
"It is becoming common to know of someone, or have heard of someone, who has become obsessed with online activity to the point that their alternative online lives have masqueraded - and in some cases completely dominated - their identities."
"Broken marriages, lost jobs and plunging college grades are just some of the things that people who spend upto 18 hours per day in virtual reality face."
Interesting, isn't it? These quotes had me thinking of Second Life, a multi-user virtual environment in which you can create an avatar for yourself. This environment has its own economy (can you believe it?); the currency is the Linden dollar. You can buy and sell stuff in this environment. It must be so easy to dissolve yourself completely into a virtual environment that depicts real life through the eyes of the user. The avatar is probably everything you want to be and are not. It is the ideal person that you want to be.
Now, coming to the point about IAD: would you start believing that the avatar is actually the real you? It is up to the user to realize their responsibilities and not let their avatar become real.
Archana Narayan
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Aug 19, 2015 12:59am</span>
|







