About two weeks ago, we reached out to some SCORM Cloud users and asked them a few questions about how they use SCORM Cloud. You see, we built SCORM Cloud in such a way that people can really use it how they want to. The problem, though, is that we want to make sure we’re supporting our users in the way they need us to, and that requires knowing a bit about how they use our products.

[Personal disclosure: my hatred of spam and survey emails makes it incredibly hard for me to actually send these out, even to our customers. We’re going to do a bit more of that this year, so accept my apologies... and definitely opt out if you’re not interested in participating.]

This is what we learned:

Our customers are incredibly kind. Even those who had complaints had clearly gone to a class on how to offer constructive criticism. Great ideas were layered with respectful comments. We definitely came away appreciating the way you go about your business.

Our customers and trialists are using SCORM Cloud in several distinct ways:

- Test Track, redux. Many of you came to us in the days when we offered Test Track as the simplest way to test SCORM content, and those bones are still at the core of SCORM Cloud.
- The API. More of you are building applications against SCORM Cloud than we realized, and we love that. Hopefully that speaks well of the API documentation we’ve enhanced over the course of the year. Some of you rightfully complained about the early state of that documentation, and I think we’ve come a long way. If there are other things we could be doing to make building apps on top of SCORM Cloud easier, tell us.
- A training delivery system. This is definitely something less than an LMS, but it lets small organizations get content out to their constituents simply. Simple seems to be good.
- A public URL. It seems this simple, public URL option works well for you. The people who need to learn aren’t always in an LMS, but that doesn’t mean they shouldn’t be offered great content.
- In an application we helped build, like Sakai, Moodle, or WordPress.
- As a trial for our ever-popular SCORM Engine.
- To deploy your content to other LMSs via SCORM Dispatch. Even within Dispatch, we’re seeing different uses. Some take advantage of how tolerant our technology is (when their LMS’s isn’t). Others want a layer of protection and tracking placed around their valuable content. And we’ve got ideas about other ways we could dispatch content on your behalf. Has anyone heard of AICC PENS or LETSI RTWS?

We’ve got a bunch of functionality that we haven’t done enough to tell you about. People are asking for things they can already do, or that we already know how to do. We need to be sure that we’re properly exposing those things.

- Did you know you can use tags to organize your courses and learners?
- Did you know that those tags can be used to do some pretty sophisticated reporting?
- Did you know that you could do some reporting at all?!

We really want to hear more from all of you, but without bothering you in the least. As we reach out to you more over the course of this year, please tell us what we’re doing right and what we’re doing wrong. And don’t feel like you have to wait for us to ask… we want to hear from you all the time.
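To make the tagging idea concrete, here is a small illustrative sketch (not SCORM Cloud code; the data shapes are invented for the example) of how tags on courses and learners enable the kind of filtered reporting described above:

```python
from collections import defaultdict

# Invented sample data: registrations tagged the way you might tag
# courses and learners for later reporting.
registrations = [
    {"learner": "ann", "course": "safety-101", "tags": ["compliance", "2011"], "passed": True},
    {"learner": "bob", "course": "safety-101", "tags": ["compliance", "2011"], "passed": False},
    {"learner": "ann", "course": "sales-201",  "tags": ["sales"],              "passed": True},
]

def pass_rate_by_tag(regs):
    """Roll registrations up by tag, returning (passed, total) per tag."""
    totals = defaultdict(lambda: [0, 0])
    for reg in regs:
        for tag in reg["tags"]:
            totals[tag][1] += 1
            if reg["passed"]:
                totals[tag][0] += 1
    return {tag: (p, t) for tag, (p, t) in totals.items()}

print(pass_rate_by_tag(registrations))
# The "compliance" tag covers two registrations, one of which passed.
```

The point is simply that once registrations carry tags, a report like "pass rate for everything tagged compliance" falls out of a few lines of grouping.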
Rustici Software . Blog . Aug 26, 2015 07:42am
The US Department of Labor just announced its solicitation for grant applications (SGA), and it’s called this: "Employment and Training Administration Notice of Availability of Funds and Solicitation for Grant Applications for Trade Adjustment Assistance Community College and Career Training Grants Program". Whoa.

I’m no political pundit, so here’s my short version: the federal government has set up a large grant program that includes the creation of Open Educational Resources, and it has required (on page 8) that the output conform with SCORM 2004.

There’s been a lot of "reaction" to the inclusion of SCORM, and by reaction, I mean many people are pretty angry about its inclusion. Most of that angst, though, originates from Rob Abel’s post on the IMS forums.

I will say this very plainly and directly: Rob’s post contains many inaccuracies and convenient explanations of the sort you would see in a political campaign. While it is tempting to break down Rob’s post on a line-by-line basis, my ever-so-brief analysis of political campaigns (OK, I watched The West Wing) indicates that helps no one. I’ll limit my comments to a few:

- SCORM is not based on "outdated technology," as Rob claims repeatedly. The fundamental technologies employed by SCORM are JavaScript and XML, and both are absolutely core to today’s web.
- "SCORM does not provide reliable interoperability or reuse." Our SCORM Engine alone supports millions of learners and their use of interoperable content every year. Millions.
- "SCORM has no concept of or support for assessment." False again. Please see the SCORM books for details on cmi.interactions, which are widely used for the reporting of learner assessment.

Lest you think I’m one-sided here, there are truths in Rob’s post as well. SCORM is not well suited to "cohort-based" educational courses at this point, because it specifically governs single learner/host system communication.
SCORM also elects (intentionally) to remain silent on countless subjects, such as wider IT infrastructure and security.

Setting aside the technical errors in Rob’s post, my primary issue is with his misplaced vitriol. Rob has a vested interest in this debate. [Note: you could certainly argue that I do as well, given our domain name, but it’s worth noting that we have equal support for AICC, and IMS CC has come up as a potential addition for us. We are definitively not the standards body.] As the leader of IMS, Rob has plenty of reasons to espouse the virtues of the standards they are creating. Further, I think Rob would be justified in complaining about the exclusion of IMS CC as a potential approach to reuse as part of the grant program. Michael Feldstein pointed this out in his balanced perspective on the issue:

"SCORM and IMS Common Cartridge (the other main contender for a standard educational content interchange format) have substantially different affordances that are appropriate for substantially different use cases." (Michael Feldstein, in OER and Standards)

My challenge to Rob and others in the conversation would be this: argue the things that merit argument, and take far greater care when you lambast other solutions.

- Does IMS CC provide some affordances that might be of use for a program such as this, and should it be considered as a potential solution? I think it does.
- Should a directive such as this specify a single standard for clarity and simplicity? Or should other standards be options as well? I have no idea.
- Should SCORM, in its current state, be the only eLearning standard for the next 30 years? No way. Check out Project Tin Can and why SCORM needs to evolve, and tell us how it should evolve.

Ultimately, what’s the point here? eLearning standards have a fundamental purpose: to remove the friction that separates learners from what they need to learn.
Rob has succeeded in inciting more than a few folks to criticize SCORM, when few of them have the background to determine the accuracy and reasonableness of his statements.  A vitriolic argument like this does nothing but set us back in the goal of helping learners reach the learning they need. SCORM can absolutely increase the utility of the Open Educational Resources produced by this grant program.  IMS CC may well be able to as well.  Let’s move this discussion past politically motivated and inaccurate accusations to something that helps people get their learning.
Rustici Software . Blog . Aug 26, 2015 07:42am
There are so many channels now. Whether we’re talking radio, TV, or the web in general, there are so many ways that information is pouring over us. Like many companies, we’re doing our best to reach everyone wherever they are… Of late, we’ve been finding that people are missing some important things we have to say. So, I wanted to lay out the different places we’re talking, so you can be sure to visit if you care.

Our Blog (RSS). Well, you’re here, so you must know about it already. For the most part, we tell our big stories here: big new projects and products, major software releases, occasional client announcements, and industry brouhahas.

support.scorm.com. Many of you probably have no idea that our support forum even exists. We’re constantly answering questions from customers and others in these forums. As a customer, you’re invited to create tickets whenever you have a question you’d really like help with. More than anything, though, I’d really like to see our customers subscribing to the forum for the product they license. SCORM Engine customers can follow the RSS feed or use the built-in email subscription. No matter how you do it, this is a great way to know about our newest releases. (This applies to SCORM Driver customers too, of course.)

Project Tin Can. Project Tin Can might be the most important bit of work we’re doing these days. Along with a huge community of real SCORM users, we’re helping to figure out what comes next for SCORM and learning experiences in general. You can see our contributions daily on the UserVoice site, and we’d love to see your contributions there as well. You can also follow @projecttincan on Twitter.

Twitter. OK, half the time you’ll get inane stuff, but that’s the price you’ll have to pay to get the relevant stuff. Tim’s tweeting regularly, Mike too, and Joe even has something to say on occasion. If you’re a big SCORM Cloud user, we also use Twitter to let the world know when we’re having issues or changes. @scormcloud is pretty quiet, but it could be useful in an emergency. (SCORM Engine and SCORM Driver even have accounts, but we rarely use them.)

Old School. That’s the big picture. If you need something, you’re always welcome to call or email too. My phone number is 615.550.9522 (yes, that’s me, directly) and my email address is tim.martin@scorm.com. I actually want to hear from you, so bring it on.
Rustici Software . Blog . Aug 26, 2015 07:42am
Key Points:

- Need to separately track team performance and individual performance in team-based learning.
- Need to link incorrect simulator input with appropriate remediation.
- More scoring options are needed than just pass/fail and a measure.
- Instructors need to be able to grade an assessment; the assessment should be in a "pending" status while waiting for the grade. This could involve an "Instructor API."
- Need a trigger mechanism to notify instructors of poor or unexpected learner performance.
- Informal learning should be supported; formal learning is still important.

Nina (referring to "Hazmat Hot Zone"): The issue that we were really looking at was that it was an instructor-facilitated session, but it was a team activity, and the team could fail if one person’s knowledge wasn’t up to par. So, how can we account for the tracking of multiple learners in that kind of environment, whether it’s instructor facilitated or not, but have some kind of tracking model where the team itself performs well, their interactions are good, their communications are good, but one person’s failure to recognize a hazardous material causes the entire team to fail, when in fact it was really one person that needed some kind of remedial instruction? So do you hold the rest of the team back because of one person’s knowledge, which they should have had going in, or do you allow the rest of the team to move forward and then remediate that person? In which case, that person would need to be directed to appropriate instruction, not another module in the game, because if they don’t recognize some fundamental concepts, no matter how many times they play the game, they’re still going to fail.

You were suggesting that it’s dependent on the individual’s situation or scenario whether you’ll allow the team to pass when only one team member fails, or not, right?
You would have to. So in the real case, the way Hazmat Hot Zone has been used, the instructor decides if the team gets to move forward or if the team is going to have to go through another scenario to prove they can do this kind of thing. If somebody’s basic knowledge fails, then the instructor has to take them out of the class, and they have to take another course. So it’s the instructor that does it. But I think if you had the right algorithm, you could make those decisions based on whatever inputs the team and the individual have made, and that would determine who moved forward and who didn’t, and when they did, etc.

So I have that case, and then I have the simulation case, from the simulator, because the same thing happens. My background is actually in aviation; I started in this industry doing pilot training. We would try the same thing. So if someone got into the actual simulator and didn’t perform a procedure properly, or flipped the wrong switch in flight, or whatever, there was no way to track that and remediate it. Again, it is incumbent on a human to say, "Wow, you really messed this up; you don’t understand how the fuel system actually works. I’m going to reassign you to a fuel system module." So the instructor would have to manually go into the learning management system and fail the person on the simulator, or reassign the simulator module, and manually reassign any coursework they needed for remediation. So, if there was a way to link that up, then when something like that happens, depending again on the severity or situation, they would be reassigned automatically to whatever instructional material they needed before they could be allowed to progress.

Do you think that adding the team-based components, the collaborative components, to the data model is enough to support that sort of scenario at this point, or do you think we need more data in general, and if so, what sort?
I think adding the team-based piece would go a long, long way. I think there are a lot of data model elements that nobody uses anymore, and with the current technology landscape, I think adding a team-based model and a multiple-scoring-type model would help, because it’s not just having the team-based score, but having the ability to track both the individual’s progress and the progress of the team. And I’ve been finding lately (I just did this giant content migration) that one thing that would have really helped us was a more robust scoring model in general. So I guess I’m saying we do need more elements that would account for scoring and different types of scoring models, if that makes sense.

Did you recall which data model element you would have wanted for that in particular?

I think from a scoring perspective, we do need some model extensions that would enable more scoring options than just a numeric score or pass/fail. Right now we’re stuck with complete, incomplete, or unknown; pass/fail; and a numeric score. One other piece of feedback I’ve seen: they’d like to see a model where scoring doesn’t have to happen instantaneously, where there’s a way to track what the response to an essay question is without giving the learner the score, and later on the instructor can go in and score it.

That’s really important. Our Defense Ammunition Center client is having that exact situation right now, where, after they complete a series of activities, we’re going to have to basically mark them as incomplete. In that system, while they’re going through the instruction, they’re going to create a plan for an explosive storage site, and that plan has to be looked at by a human. So we’re going to have to have their content sit there, marked incomplete, until the human looks at their plan and goes back in and passes or fails them.
In the Army this is a problem, because that incomplete score will get passed to the Army Training Requirements and Resources System (ATRRS). So they can’t just leave the Army learning management system status to go on to this next piece. We want it all to be one course, but we’re going to have to just leave it as incomplete in their record, and then a human administrator is going to have to go back into the ATRRS system and override their grade to mark it complete.

So there’s also a need for someone to be able to see the difference between a course that’s just not completed and a course that’s complete pending approval?

Exactly.

Is there anything else you haven’t talked about yet that you would like to?

I have been wanting this for a very long time, because the old training management system that I had years and years ago, at what was then McDonnell Douglas, now Boeing, did this. It would trip a flag to the instructor after something like that happened. Like in a formal schoolhouse setting, there are instructors assigned to groups of students, and even though they’re doing web-based training, there is still a sort of lead instructor who oversees what they’re doing. It would be great to have a way to flag a human after someone’s performance has been poor for a certain amount of time. So if you’re doing training and you have, let’s say, 10 courses to complete, and you pass the first 2, and then in one you barely pass, the next one you barely pass, the next one you barely pass: something’s wrong. You’re passing, but you’re barely passing. So it would be great to have some kind of automated trigger to notify a human of these problems, because the human instructors don’t go into the system to check on you. As long as you’re passing, you’re passing.
But there are needs. One case would be with our Defense Ammunition Center customer: there are times when they want to know when something is going on with a student, and unless they physically go into the system for every student they have and check every record, they don’t know that. But if there was a way to set up flags (and this would be more at a curriculum level), then after so many scores in a given area, send a notice to a human being and let them know that this student is struggling. I think keeping the human in the loop, even in this distributed learning world, is really important in many domains.

As for reporting, most reports are pulled monthly. Nobody pulls reports daily. You might get some organizations that pull weekly, so if you have somebody who gets into a situation that requires manual intervention, nobody might look at their records for days or weeks, or even longer. I guess what I’m thinking of is a much better integrated system, or a way to integrate things more. So you talk to me, you realize I do know what I’m talking about, and I accidentally missed this question that caused me to fail. When you get that flag notice, you just want to hit a quick button where you assign me to new content, or, as we always called it, "certified pass": you certify that you’re going to pass me. This is a different kind of passing; instead of just a passing score, it shows that it was a manual pass. So, you certify-pass me and I move on. And for you as the instructor, it’s all in one little encapsulated communication protocol. I think we’ve tried so hard to take the human out of the loop on this that we’re shooting ourselves in the foot.

So there could be, essentially, an instructor API, which an LMS could build a UI on top of, or if the LMS doesn’t provide a good UI, then if the API is standard…

You could choose to do your own thing.
So it would be important to have not just the API between the content and the LMS; there should be other APIs.

Yeah, I’m all over the multiple-API thing.

How should learning happen today and in the future?

Any way it needs to happen. I guess, from an instructional design perspective, I love the whole concept of informal learning, collaborative learning, etc., but I want to be sure that we don’t forget about the formal, designed learning experiences. Especially in an environment like the DoD, where you’re teaching processes, procedures, equipment, etc., it’s very important to make sure you still have a robust learning model. But I do think it would be great to find some ways to account for the informal learning that I do on my own. If there was a button that could appear anywhere, after I go and read something or do some activity online, I could click it and it could store that to my performance record somewhere, showing that I have done that. And it may be that it only appears in certain contexts; maybe after I do that, it asks me three questions about the article I just read, and if I get them correct, I get some kind of credit for that.

A big thing we’re seeing a lot right now is communities of practice, a lot of knowledge sharing. In the whole domain of knowledge management, they talk about knowledge sharing being a key competency: your willingness to share information and how frequently you share it. So if I’m on the community of practice for whatever topic, say the SCORM instructional design community of practice, and I spend half my day on there answering questions, posting resources, that kind of thing, there should be some reward for that somewhere, somehow. So I guess the ability to integrate those kinds of things back into the learning realm, because not only am I learning by being on there and seeing what other people are doing, I’m helping others learn.
So having a way to interface those different systems, so that that type of informal learning could also be tracked.
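The themes in this conversation (pending scores awaiting a human grader, a manual "certified pass" distinct from a scored pass, and flags that pull an instructor back into the loop) could be sketched as a data model like the following. This is purely illustrative; none of these names come from any SCORM specification or existing system.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Status(Enum):
    INCOMPLETE = "incomplete"              # learner simply hasn't finished
    PENDING_APPROVAL = "pending approval"  # finished, awaiting instructor grading
    PASSED = "passed"                      # ordinary scored pass
    CERTIFIED_PASS = "certified pass"      # manual pass granted by an instructor
    FAILED = "failed"

@dataclass
class Attempt:
    learner: str
    course: str
    status: Status = Status.INCOMPLETE
    score: Optional[float] = None

    def submit_for_grading(self):
        """Learner finished, but a human must grade (e.g. an essay or a storage plan)."""
        self.status = Status.PENDING_APPROVAL

    def certify_pass(self, instructor: str):
        """Instructor override: a manual pass, distinguishable from a scored one."""
        self.status = Status.CERTIFIED_PASS

def needs_instructor_attention(attempts, barely=0.75):
    """Flag learners who keep barely passing, so a human gets notified."""
    return [a for a in attempts
            if a.status is Status.PASSED and a.score is not None and a.score < barely]
```

The key point is that "pending approval" is a first-class state, so a reporting system can tell the difference between "not done" and "done, waiting on a human."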
Rustici Software . Blog . Aug 26, 2015 07:42am
We’re pretty excited about the potential (and ever-growing reality) of the SCORM Cloud. Over the last year or two, besides putting the SCORM Cloud services out there for use, we created tools to plug into it. The biggest of these is the SCORM Cloud website. Besides the SCORM Cloud site, we produced integrations for open source learning systems such as Moodle and Sakai, we created a plugin that integrates with WordPress, and we even created an application to work within Google Apps for Domains.

But this isn’t about those applications. This is about the code behind them. Each of those integrations includes a key element that aids in the integration process: they all use an API library to communicate with the SCORM Cloud. Over the years, we have written a few libraries to make working with the API easier. These libraries cover four different languages: Java, C#, Python, and PHP.

After 18 months of building libraries that sort of look alike and sort of cover all the basic functionality of the SCORM Cloud, we found that maintaining these libraries was becoming difficult, and using them was more difficult than it needed to be. We therefore spent the last month creating some uniformity across the libraries and filling out basic functions where they were lacking. We have also created samples for the basic calls in each of the languages, so that you can see how things should work using the libraries.

We also have new documentation available for building integrations. The API documentation hasn’t really changed much in content, but the libraries documentation now covers the calls that exist uniformly across all the libraries. In addition, we have put all of the libraries in a public repository on GitHub, where anyone can download them and even contribute to the projects.
The libraries don’t yet provide exhaustive coverage of the full breadth of the SCORM Cloud API, but they do cover enough to create a well-functioning application capable of managing courses and training via the SCORM Cloud. (As a hint, the Java library is the most complete; it’s what our SCORM Cloud site uses.) With that said, we welcome input about areas that are lacking or could use improvement. Whether you fill in the holes or just let us know what they are, we want these libraries to be living projects, and we want them to be extremely useful and effective.

If you are new to developing for the SCORM Cloud, the best place to start is here. If you have ideas, comments, or questions, then the forums are a great place for you. We want to know how you want to use the SCORM Cloud. Don’t be a stranger.
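To give a feel for the kind of plumbing these libraries wrap, here is a minimal Python sketch of building a signed request against a web-service API of this style. The endpoint, method name, timestamp format, and signing scheme shown here are assumptions for illustration, not the documented SCORM Cloud contract; the libraries and the official API documentation are the authoritative source.

```python
import hashlib
import time
import urllib.parse

API_ENDPOINT = "https://cloud.scorm.com/api"  # assumed endpoint for illustration

def build_signed_url(app_id, secret_key, method, **params):
    """Build a signed API URL.

    Assumed scheme: the signature is the MD5 hex digest of the secret key
    followed by every parameter name/value pair concatenated in sorted order.
    """
    params.update({
        "appid": app_id,
        "method": method,
        "ts": time.strftime("%Y%m%d%H%M%S", time.gmtime()),  # assumed timestamp format
    })
    # Concatenate sorted name/value pairs and prepend the secret key.
    to_sign = secret_key + "".join(k + str(v) for k, v in sorted(params.items()))
    params["sig"] = hashlib.md5(to_sign.encode("utf-8")).hexdigest()
    return API_ENDPOINT + "?" + urllib.parse.urlencode(params)

# Hypothetical call: check whether a course exists.
url = build_signed_url("myAppId", "mySecretKey", "rustici.course.exists",
                       courseid="intro-101")
```

In practice, this boilerplate (timestamps, sorting, signing, URL encoding) is exactly what the client libraries hide from you, which is why using them beats hand-rolling requests.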
Rustici Software . Blog . Aug 26, 2015 07:41am
You pay for Rustici Software products, and we want to make sure that you’re getting the most out of them. Some of our customers prefer to tuck their use of our products away, and we’re fine with that. But others want to scream from the mountaintop that they’re using the best SCORM conformance software available. If you’re a screamer, then we want you to let the world know that you’re using our stuff. We’ve waded through all the legalese and created a way for you to do just that.     "Powered by" images are now available for you to put to work. Just visit our "powered by" page and grab the HTML or files for print that you need. We’ve provided 3 sizes for each image, but we understand that there will be exceptions. If you need a different size or format, just email support@scorm.com with your needs and we’ll get a custom image made for you — pronto.
Rustici Software . Blog . Aug 26, 2015 07:40am
Before we bought 30-inch monitors for everybody, we used to print out all of the SCORM specs as they came out. The hard copy made them a whole lot easier to digest, even though it meant the slaughter of many innocent trees. In unpacking the last boxes in our new office today, I came across all of them. It makes a nice visual for why it makes sense to work with Rustici Software if you’re serious about providing standards support.

What’s in the pile:

- SCORM 1.1
- SCORM 1.2
- SCORM 1.3 ("first edition" of SCORM 2004)
- SCORM 2004 2nd Edition
- AICC HACP
- AICC PENS
- IMS Content Packaging
- IMS Common Cartridge
- MedBiquitous Healthcare LOM
Rustici Software . Blog . Aug 26, 2015 07:40am
There is a trophy here at Rustici: the Donnelly Cup. Reserved for the lowest rung on the ping-pong ladder, the Donnelly Cup serves as a reminder to its owner that there is always room for improvement. The Cup changes hands often and currently resides in Joe’s office. Joe prominently displays the trophy and will challenge anyone to a duel to share its inspiration.

Another trophy also appeared at our offices recently: SCORM Cloud earned a Gold Medal in the Brandon Hall Excellence in Learning Technology Awards for Best Advance in Social Learning Technology. When we capture a gold medal, we like to celebrate, and we’d like to say "thank you" to Brandon Hall and our fellow winners. We are in good company on a list that includes many of our customers.

What’s even more exciting than our new bling are the possibilities SCORM Cloud offers to advance social learning technology. It allows you to turn virtually any online platform into a trackable learning environment. Whether it’s using the pre-built WordPress plugin to turn your blog into a full-blown training source, or developing your own app on top of our API, SCORM Cloud lets you bring learning to where learners live.

So while the Donnelly Cup recognizes the potential for ping-pong greatness, we see similar opportunities with SCORM Cloud to inspire new applications within eLearning. If you have an inspiring use of SCORM Cloud, share it with us, or let us know what you’d like to see it do next.
Rustici Software . Blog . Aug 26, 2015 07:39am
These days, with one click, you can buy a song from iTunes and automatically sync it to your iPod. Remember how long it used to take to buy a CD, rip the songs to your computer, and transfer them to your MP3 player? Just think about how much time you saved from this one little improvement: more time to listen to your music, which is what you wanted to do in the first place.

What if you could do the same thing with your content? It’s not always a simple process to import a SCORM package into an LMS. It may take 3 steps, or in some cases a dozen or more clicks, to publish a course into your LMS. Multiply that by a few courses, and the time and effort add up quickly. Imagine a tool that lets you export a course directly from your authoring tool into your LMS: one click puts that new course directly where it needs to go.

Here is where PENS comes in. PENS (Package Exchange Notification System) simplifies the process of content publishing by automating the transfer of content between systems. While SCORM makes content and systems work together, PENS takes it one step further. Think of it as rocket fuel for delivering content. As standards geeks, we get excited when specs emerge that further improve the efficiency of how content and systems play together. We see great potential in PENS and are eager to see its benefits realized by the larger eLearning community.

So, how can you get started? We recently deployed PENS support in two of our products: SCORM Cloud and SCORM Engine. Now you can publish content directly into your Cloud account or SCORM Engine LMS. Check out how it works. If you’re using an authoring tool, see if your provider supports PENS. Have you already adopted PENS? If so, tell us about your experience in the comments section below.
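Under the hood, a PENS "collect" notification is essentially an HTTP POST from the authoring tool to the target system, telling it where to fetch the package. The sketch below shows the rough shape of such a request in Python; the specific field names and values are my illustrative reading of the AICC PENS spec, so treat them as assumptions and consult the spec and your LMS’s documentation for the real set.

```python
import urllib.parse
import urllib.request

def build_pens_collect(target_url, package_url, receipt_email):
    """Build a PENS-style 'collect' POST request.

    Field names are illustrative approximations of the AICC PENS spec,
    not an authoritative list.
    """
    fields = {
        "pens-version": "1.0.0",
        "command": "collect",          # ask the target system to fetch the package
        "package-type": "scorm-pkg",   # assumed package-type identifier
        "package-format": "zip",
        "package-id": "example-course-001",
        "package-url": package_url,    # where the target system should download from
        "receipt": "mailto:" + receipt_email,  # where to send the PENS receipt
    }
    body = urllib.parse.urlencode(fields).encode("ascii")
    request = urllib.request.Request(target_url, data=body, method="POST")
    request.add_header("Content-Type", "application/x-www-form-urlencoded")
    return request  # caller would pass this to urllib.request.urlopen(...)
```

The appeal of PENS is exactly this simplicity: one small notification replaces the download-then-reupload dance, and the receipt field gives the publisher confirmation that the package arrived.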
Rustici Software . Blog . Aug 26, 2015 07:39am
The morning started out like any other: David decided to unwrap a package of Orbit gum. We’ve been headed down a path toward implementing a bit of IMS Basic Learning Tools Interoperability (BLTI), and so we have a copy of Dr. Chuck’s book on our island. (Dr. Chuck’s tattoo has been admired many times, but today we noticed the back of the book jacket.) Well, we really started wondering why Chuck was yelling the whole time. We decided that we would take the opportunity to reenact this important moment. And if we were going to have that moment, we might as well video it, so that we could share it with you, instead of just the guy who rode by on his bike. Enjoy.
Rustici Software . Blog . Aug 26, 2015 07:39am