
Higher Education


As a humanities teacher, I love using the RSS feed for Announcements.  There are some phenomenal news feeds and podcasts that support a variety of my course content, and it's awesome to have the announcements automatically appear in my Canvas courses.


My biggest frustration, though, was when I found great resources while navigating the internet that I wanted to make available for my students.  I would copy the address, open my Canvas instance, navigate to the particular course, open an announcement, embed the URL with an explanation for my students, and publish it to my course.


What if you're on your phone and find a great link while navigating social media?  The steps to posting can be prohibitive.  You can set up an external feed and "clip" articles to it!


There are two different methods (that we know of): Evernote Webclipper and OneNote Webclipper. This post will address Evernote, but the steps are similar for OneNote!


Steps for Creating a Customized RSS Feed using Evernote:

  1. Download and explore Evernote here.
  2. Create a specific Notebook that will be dedicated to your RSS feed.
  3. Download and install the Evernote Webclipper here.
  4. Create a free account with Zapier.  Note: You can create 5 free "Zaps."  If you are creating a feed or two, the free option will cover all of your basic needs!
  5. Begin a New Zap: Make a Zap
  6. Follow the prompts to create a "Trigger Event" (the action that starts the Zap process):
    1. Choose App: Evernote
    2. Choose Trigger Event: New Note
    3. Evernote Account: sign in to your Evernote Account to link it to Zapier
    4. When asked to Customize Note, select the Notebook that you created specifically for your feed.
  7. Follow the prompts to create an "Action" (the result of the Trigger event created above):
    1. Create the action: When asked, Choose App: RSS by Zapier
    2. Choose action Event: Create Item in Feed.
    3. Customize Item: Create a unique Feed URL
      1. Make sure to Copy to Clipboard your full Feed URL to use as you set up your Canvas RSS Announcement Feed.
    4. You do not need to enter anything under "Max Records"
    5. Set your Item Title: 
    6. Set your Source URL: 
    7. Provide a brief description of your Feed: 
    8. The remaining options (Author Name, Email, Link, etc) can be left blank.
    9. Select "Continue"
    10. Select "Test and Continue"
  8. Use the web clipper to start the process!
    1. Navigate to any website that you would like to add to an RSS Feed
    2. Use your web clipper and "Save Clip" to the pre-determined Evernote Folder that you established specifically for your RSS feed.
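Behind the scenes, each clipped note becomes an item in a standard RSS 2.0 document at your Feed URL, which Canvas then polls. Here is a minimal Python sketch of the structure a feed like this contains (the feed title, Feed URL, and item values below are hypothetical placeholders, not real Zapier output):

```python
# Sketch of the RSS 2.0 structure behind a clipped-article feed.
# All titles and URLs here are hypothetical placeholders.
import xml.etree.ElementTree as ET

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "My Clipped Resources"  # your feed title
ET.SubElement(channel, "link").text = "https://example.com/my-feed"  # your unique Feed URL

# One <item> per clipped article -- these fields correspond to the
# Item Title, Source URL, and Description set up in the Zap above.
item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "Great article for week 2"
ET.SubElement(item, "link").text = "https://example.com/article"
ET.SubElement(item, "description").text = "Why this resource matters for class."

print(ET.tostring(rss, encoding="unicode"))
```

Canvas's External Feed tool simply checks this document on a schedule and turns any new item into an announcement, which is also why there is a delay between clipping and posting.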


NOTE:  There will be a delay between when you clip an article and when it appears in your Announcement feed.  Most of my tests are delayed a few hours, but I have seen shorter and longer!


Enjoy customizing your own RSS feed!!

Over the summer we gave the New Gradebook a good “tire kicking” before releasing it en masse this fall.  Most of the changes are subtle; some are powerful but sit just under the surface.  Others are informational, yet confusing until you look a little deeper.  There are also some advanced features that you may be excited to try. So, “What’s new?”


Let’s start with the cool stuff.  It's cool on an Academic Technology level, anyway.  I'm referring to the new filtering feature, which lets you see only the specific group of students you want to see.  The new gradebook makes it possible to filter your list of students by assignment group, by module, by section, and by student group.


So, it might be used to find the:

  • weekly reflection (assignment group)
  • for the week 2 module
  • in section 004 
  • student group “the cooliest biologists”

 Of course, you don’t have to drill down that far for the filters to be useful.  The filter feature is super handy, but you have to pull it into view, or that functionality sits hidden below the surface.

To pull the filters into view, in the Gradebook, go to “View” > “Filters” and then click on the one(s) that you want to try.


In contrast, there are a few things that are immediately in view in the new gradebook that are, let’s say, not perfectly clear.  One such item is the color-based “status” indication.  By color, you can tell if an assignment is late, missing, or excused.  If a color doesn’t work for you, you can change it.

The defaults are:

Blue : Late submission
Red : Missing submission
Green : Resubmitted assignment
Orange : Dropped grade
Yellow : Excused assignment


Also in the realm of informative but potentially confusing are the icons that appear in the new gradebook.  With a glance you can tell a lot about the current grade situation; however, you may not have a clue what the iconography means.  I find it easiest to just look up the key and instructions in the Canvas document specific to the icons and colors in the New Gradebook.


However, there is one icon, the hidden icon, that I need to call to your attention right now. The hidden symbol may appear at the top of a grading column; it indicates that the manual grading policy is set for that column.  It also means that at least one grade is not visible (posted) to the student(s) for that assignment.  This new icon shows up by default for those who used the "mute/unmute" feature in the previous version of the gradebook.  That feature allowed you to hide (mute) the grades from view, enter the grades, and then post (unmute) all of the grades at a later time.  The change is that now you can start out with all the grades hidden and post (make visible to the student) one grade at a time if you want.  So, if you previously used mute/unmute, Canvas assumes that you want to use the manual grading policy.  You do have a choice, though: you can set your grades to post automatically/instantly by changing the Grade Posting Policy, but it isn't immediately obvious from the screen that this is what you need to do.  You may want to look closer at this one in How do I use New Gradebook?


More functionality includes setting late policies and curving grades.  You can also insert zeros in bulk using the set default grades feature, and you can choose to override your final grades column for whatever reason.


I hope that helps give you a better idea what the New Gradebook is all about.  If you want help with any of this, shoot us an email, or leave a message below. 



It's the start of another school year, so what better time to adopt a tool that will make communicating with your students much easier.  Announcements is so much better than just sending an email to your whole class. If you are not using the Canvas Inbox, you have to go through the trouble of collecting all your students' emails and making a group list in your email program.  If you are using the Canvas Inbox, it's pretty easy to email the whole class, but with the flood of email students get from instructors and TAs at the start of a semester, it is really easy for a message to get lost.


Wouldn't it be better if you had a tool that:

  • automatically notified every student in your class and
  • showed up on the course home page the next time students go to the course site and
  • had all the announcements you ever sent for the course in one place where students could easily find old ones and read new ones?

Bonus points if that tool would allow you to:

  • easily link to things in the course site and
  • create your announcements early and have them post later at a date you choose and
  • automatically shift those posting dates when you start a new semester!


This is why you should be using Announcements.  Announcements has all of those features, including all the bonuses!


Creating an announcement

When you want to send an announcement to your entire class, choose to add an announcement (instructions).  

  1. You create the announcement.  You can add links and images or videos using the rich text editor and link to assignments, pages, or files directly using the content selector.
  2. Underneath the text box choose to add a delay date. Even if your delay is minutes, having a delay date will allow the date to automatically shift when importing your announcements to a new Canvas site. (Thanks Elson Boles for the clarification!)
  3. You can also choose to attach files and allow commenting and liking. 
  4. When you save, students are notified there is an announcement based on their notification preferences.  By default this is usually email, but it can also be by text if students make that choice!
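For the curious, the same options map onto the public Canvas REST API: an announcement is created with POST /api/v1/courses/:course_id/discussion_topics with is_announcement set, and the delay date is the delayed_post_at field. Below is a hedged Python sketch of just the payload (the title, body, and timestamp are made-up examples; actually sending the request would also need your course ID and an API token):

```python
# Sketch: how the announcement options above map to Canvas REST API fields.
# Endpoint: POST /api/v1/courses/:course_id/discussion_topics
# All values below are hypothetical examples.

def build_announcement(title, message, delayed_post_at=None):
    """Build the form payload for an (optionally delayed) announcement."""
    payload = {
        "title": title,
        "message": message,          # the rich-text (HTML) body
        "is_announcement": True,     # distinguishes announcements from discussions
    }
    if delayed_post_at:
        # ISO 8601 timestamp; Canvas posts (and notifies students) at this time
        payload["delayed_post_at"] = delayed_post_at
    return payload

payload = build_announcement(
    "Week 2 reading posted",
    "<p>Check the module page for details.</p>",
    delayed_post_at="2019-09-02T09:00:00Z",
)
```

This is also why the delay-date trick works for imports: the date travels with the announcement as a field that Canvas can shift, rather than the announcement posting the moment it is created.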


Setting announcements to show on home page

To ensure your announcement shows at the top of the home page, you need to adjust the course settings.

  1. Go to Settings > Course Details
  2. Scroll down and click on More options
  3. Check the box by  "Show recent announcements on Course home page" and choose how many announcements you want to show.  One is usually enough, or choose two if you communicate often, but avoid three as it takes up a lot of space on the screen.
  4. You can also choose "Disable comments on Announcements" as the default.


Shifting posting dates for a new semester

This process is pretty easy.  When you import your course content from one Canvas site into another, choose to shift the dates (instructions).  Not only will the assignments' dates shift, so will your delay posting dates!  Never again will you have to create that test reminder email!


A few warnings:

  • If you allow comments on announcements, it looks a lot like a discussion, but it cannot be found under the Discussions.  The announcement and all its comments stay under the Announcements item in the navigation menu.  This can be confusing to students.
  • Announcements will not be sent out if the course is not published.  Additionally, students that have never been in Canvas and have never clicked the Canvas agreement (like freshmen) will also not get announcements.  For this reason we recommend that any announcements before the semester starts are sent through your institution's system of record. *For the University of Minnesota, that would be MyU.  Go to your course roster and scroll down to find the Notify All button.
  • This doesn't work for sending emails to individual students, sections or groups.  The Canvas Inbox is the tool that will do that for you.
  • When you import your course to a new site, if you select all content, all the announcements will also be copied over.  If you did not set a delay date, your old announcements may be visible to students!
    • Announcements with a post date before the course is published will be visible to students as soon as the course is published. These announcements are not sent to students unless you edit the announcement. After editing, as soon as you save the announcement, it will be emailed to the address students have set up in their notifications, unless you have set up a delay date. (Thanks again Elson Boles.)
    • Be sure you either delete or set a delay posting date (instructions) on imported announcements before publishing, to prevent students from seeing all the imported announcements immediately.


Did I miss any Announcement functions?  How do you use this tool in your courses?


*The CBS-RLT Tech Tip is written by academic technologists at the University of Minnesota, College of Biological Sciences.  It may contain references to Canvas settings and integrations that are specific to that institution. 


Updated 9/17/19

Measuring contract cheating

Posted by Wasi Khan Jun 27, 2019


"If you can't measure it, you can't improve it"* is the inspiration behind this blog post. In this post, I discuss why a way to measure contract cheating is necessary and propose a measurement metric. 


The motivation behind this (and future) posts is to journal the process of building this cheating measurement tool, collecting feedback, and getting some help along the way. So, if anyone has any thoughts or is interested in helping, please feel free to comment.


Okay, so, the question is...


Why do we need to measure cheating?


Over the years, we've all seen interventions in the area of contract cheating increase. Interventions come in many forms: technological (software), political (bans), and pedagogical (fewer writing assignments, raising awareness). While all such news is great, there is a larger question: how do we know the interventions are working? I feel this is a difficult, yet crucial, question to ask (and answer!).


A measurement tool is as necessary as the interventions themselves. Why? Because we will eventually need the measurement tool to gauge the efficacy of the detection/prevention tools. How else can we tell if any government policy/technology is really hurting the businesses of essay mills?


The next question then becomes...


How do we measure contract cheating?


Self-reporting seems like a sub-par way to measure contract cheating interventions, in my opinion. Since that approach is a bit biased (un-verifiable), my tiny brain proposes the following: we measure the popularity of contract cheating websites and essay mills. I mean, if cheating is decreasing, contract cheating websites will be less popular, and vice versa, right?


Since we obviously don't (and never will) have the actual data of students cheating, I think the popularity of contract cheating websites is the ideal proxy/stand-in to measure the cheating market.


The most straightforward (and reliable) data we can get on a website's popularity is its traffic/analytics data. But then there are hundreds, if not thousands, of essay mill and contract cheating websites.


The next question then becomes..


How do we monitor all of these websites?


Fortunately, other people have run into the same problem, and they solve it like this: they create an index. For example, there are 2,400 companies listed on the stock exchange, but the DJIA (Dow Jones Industrial Average) only pools the data of the 30 largest companies and monitors their prices over time. This 'average' then becomes a proxy for the entire stock market and, by extension, the economy. Much like how our website traffic data will be the proxy for the entire cheating economy.


The next question then becomes... 


What do we call this cheating measurement tool?


I'm going to go out on a limb and call it the 'Contract Cheating Index (CCI)'. But if you have any better names, please feel free to suggest. Anyway, I feel we have something to build upon now. 


Which begs the question...



Where do we start?


The plan of action is:


  1. Analyze the traffic of a sub-set of contract cheating websites over time
  2. Pick the top 30
  3. Create an 'index' which shows an upward or downward movement (much like the DJIA)
  4. Automate the process
  5. Display it
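To make step 3 concrete, here is a tiny Python sketch of what a DJIA-style index could look like. The traffic numbers are entirely invented for illustration (real data would come from a service like Alexa); the index is just the average traffic across sites, scaled so the base week equals 100:

```python
# Sketch of a DJIA-style 'Contract Cheating Index' (CCI).
# Traffic numbers are invented for illustration only.
weekly_traffic = {
    # site: [week 1 visits, week 2 visits, week 3 visits]
    "essaymill-a.example": [12000, 11500, 9000],
    "essaymill-b.example": [8000, 8200, 7000],
    "essaymill-c.example": [5000, 4800, 4100],
}

def cci(traffic, base_week=0):
    """Average traffic per week, scaled so the base week reads 100."""
    n_weeks = len(next(iter(traffic.values())))
    averages = [
        sum(site[w] for site in traffic.values()) / len(traffic)
        for w in range(n_weeks)
    ]
    base = averages[base_week]
    return [round(100 * a / base, 1) for a in averages]

print(cci(weekly_traffic))  # an index below 100 suggests declining traffic
```

A reading below 100 in later weeks would suggest the tracked sites are losing traffic, which is exactly the signal an intervention's efficacy could be judged against.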


In the next post I shall do tasks 1 and 2 and get a sense of the data. Just a heads-up: our traffic data will come from Alexa (not the speaker, the website), so if anyone can find the time to collaborate with me on this, that would be fun. Maybe Kona Jones, with your statistics experience?


For now, this journey has to stop here. I hope you enjoyed reading, as much as I did writing. What is getting me excited is: in the next post I'll actually have some numbers to play with and data to share! Ain't that fun! 


*I think the quote is attributed to Peter Drucker.

I realize this is a bit 'meta' but I wanted to highlight the usefulness of the Canvas guides. Each of the guides has a table of contents that makes it really easy to find the topic you are looking for. Below is a Google Doc on using the Canvas guides:


I recommend bookmarking the Canvas Instructor Guides -- you'd be surprised how many questions the folks at Instructure have already answered for us! Each guide is clearly written and includes screenshots with annotations.


Feel free to share with others or make a copy for yourself to distribute at your institution!

This CanvasTip actually came from one of my faculty and I thought it was definitely worth sharing.


In a user's account Notification preferences, there's an option under Alerts called Content Link Error. I never paid much attention to it, but if you hover your mouse over it, it explains that this preference will notify an instructor of the location and content of a broken link that a student has interacted with inside a course. The default setting for this preference is Daily, which may be fine, but I suggest changing it to Right away ✅ instead.


Content link error notification preference

Think about it: if you're teaching a course and a student tries to access something and is presented with an error, how do you think that student will feel? My guess is: probably annoyed. If you were notified right away and could potentially fix it in a matter of minutes, you could avoid any further headaches for your students.


Below is a link to the help guide in Google Doc form:


Get Notified Right Away About a Course 'Content Link Error'



How do I set my Canvas notification preferences as an instructor?

How do I add contact methods to receive Canvas notifications as an instructor?


Please share if this is helpful!

Quizzes.Next will eventually replace the default Canvas quizzing tool, but in the meantime, there's still a lot of development needed to bring it to feature parity. Here's what led The Wharton School to start using Quizzes.Next sooner rather than later.


Meeting Our Biggest Need

One of the largest core courses taken by all undergraduate students at Wharton is "Introduction to Operations, Information and Decisions" or OIDD 101. Depending on the term, this intro course will have up to 500 students enrolled. The bulk of the course grade comes from six online quizzes--each one has a mix of 10 multiple choice and numeric answer questions. Even with the faculty's substantial teaching experience, sometimes quizzes need to be created quickly without time to review them thoroughly. Often, there can also be more than one way to interpret a question, resulting in the need to regrade quizzes after they are submitted and recalculate student scores.


In classic Quizzes, regrading is triggered by certain actions (e.g., changing the correct answer) and is only available for certain automatically-graded question types. Unfortunately, classic Quizzes does not allow regrading for numeric question types. While infrequent, when the need to regrade a numeric question does arise, it's a pretty big headache. In the last instance of this course, even a small handful of regrades resulted in a few hours of manual regrading. And that's just for one course! Even as I was writing this blog post, I received a report of a manual regrade needed for a numeric question in a quiz taken by 240+ students . . .


Enter Quizzes.Next

If you've reviewed the Quizzes.Next FAQ or Feature Comparison pages recently or even started exploring the tool yourself, you know that while there are a lot of new features and question types in Quizzes.Next, there are still several pending features for development. These include some fundamental features, such as the Preview tool, the ability to allow additional attempts, LockDown browser compatibility, Surveys, and downloadable student and item analysis reports. After weighing the pros and cons of the feature comparison chart, the promise of a more robust regrade tool won us over and generated interest in piloting the tool for OIDD 101. 


We had hoped to start small, by migrating a few low-stakes practice quizzes to the new platform first. But when the faculty told us that practice quizzes would be given on paper this year and that Quizzes.Next would be used for the bulk of the course grades, we quickly went from dipping a toe into the pool to doing a full cannonball. Fortunately, we had the consolation of knowing that if anything did go wrong, we could always revert to classic Quizzes within the same course.


Spring 2019 Pilot


After securing faculty support (the lack of numeric regrade was a major pain point for the three instructors before, so they were eager to try something new), we enabled Quizzes.Next for a single sub-account and also enabled the "Quiz Log Auditing" feature option. This was key to accessing the View Logs, which were extremely helpful in troubleshooting issues later on. Two teaching assistants created the quizzes, after which we checked the settings thoroughly before the quizzes were published (our workaround for the lack of a Preview tool). Because the quizzes were named "Assignment 1, Assignment 2, etc.," rather than "Quiz 1, Quiz 2 . . . ," students were able to find them easily on the "Assignments" page. Students said they liked the look of the new interface, while the TAs and instructors found it intuitive to build new quizzes and add images to questions. The regrade feature correctly recalculated grades for numeric answer quizzes (hooray!) and even handled multiple regrades for the same question (a problem with classic Quizzes). Based on this success alone, the faculty have already agreed to continue using Quizzes.Next in the Fall term.



Challenges Along the Way

1. No Auto-Submit with "Until" Date: Each quiz was available to students for an entire week, and late submissions were not accepted. Expecting the same functionality as in classic Quizzes, faculty told students that any quiz not submitted by the "Available Until" date would be automatically submitted by Canvas. When this didn't happen as anticipated for Assignment 1 and 10-15 students were left with "In Progress" quizzes, faculty felt like they had lied to students. To fix this issue, we re-opened the quiz for the students with an "In Progress" status, masqueraded as them, and then submitted on their behalf the responses they had added as of the due date (found under "Moderate" > "Attempts in Progress" > "In Progress" log).


For the next quiz, faculty stressed the importance of manually clicking the "Submit" button in order for Canvas to process their quizzes. While there were still a few students each quiz who didn't deliberately click "Submit" (or assumed that clicking "Submit" once, without clicking "Submit" again when the Submission Confirmation message popped up, was sufficient), these incidences lessened over the course of the term. 


2. No Quiz Log Data Saved: In a small handful of instances, students claimed to have answered all the questions, but their responses were not recorded in the quiz logs. After much troubleshooting, we came to realize that a specific behavior was causing the loss of data. Since these quizzes were available to students for a week at a time with no time limit, many students were leaving the quizzes open on their browsers for extended periods of time, sometimes several days without refreshing or closing the page. In that time, the Canvas session was timing out, so that by the time students went to input their responses, the data was unable to push out to the server. Unfortunately, when this happens little information, other than a timestamp for when the student began the quiz, is recorded, even in Instructure's server logs. The problem is avoided by students refreshing the page often or preferably, closing out of the quiz any time they are not actively working on it. 


3. On-Time Submissions Marked Late: If a student submitted a Quizzes.Next quiz within a few minutes of the due date/time, sometimes a processing lag in SpeedGrader resulted in the submission being marked late in the Gradebook. This bug could even happen for on-time submissions that were initially marked as on-time, but then manually graded after the due date! In our situation, the faculty were very understanding of this bug and knew that students weren't actually submitting quizzes late because of the availability dates. But for courses that have New Gradebook enabled and set to automatically deduct points for late submissions, this would be a more serious concern. 


Lessons Learned So Far 

With only one course in the pilot and many more developments in the pipeline for Quizzes.Next, we still have a lot to learn. But we've also gained a lot of experience in this first go-round. Below are some things we've discovered along the way:

  • Saving Quiz Logs: For quizzes that are available to students for an extended period of time, instruct students to close out of quizzes any time they are not actively working on them. This will ensure that their answers are recorded in the quiz logs and not lost due to the Canvas session "timing out" or a disrupted Internet connection. 
  • Auto-Submit: While classic Quizzes would automatically submit when the "Available Until" time passes, this doesn't happen in Quizzes.Next. Make sure students know that unless there's a time limit for the quiz, they will need to click the "Submit" button and confirm their submission in order for it to actually process. 
  • Question Types: Be sure you're using the right question type when you create a question. The question type can't be changed once you start drafting the question so if you need to switch types, you'll have to create a new question. 
  • Accessing SpeedGrader: To view all submissions in SpeedGrader, you'll need to access the link through the Gradebook, not in the quiz itself. Only individual attempts are visible within the "Moderate" tool.
  • New Question Types: The stimulus question type is a good replacement for "text only" questions. Note: If you embed an image in the stimulus that is larger than 600 pixels wide, students will need to scroll to see the whole image. The word count for essay questions is really helpful and it's great to finally have ordering and matching question types! 
  • Item Banks: Item banks are tied to user accounts, not courses, so right now only the user who created the bank can view, edit, or use it. This presents an issue for co-instructors who want to share item banks. According to this post, the ability to share item banks is a pending feature.


Thanks for reading about Wharton's initial experience with Quizzes.Next! I'm looking forward to presenting about Quizzes.Next at InstructureCon 2019 and sharing follow-up blog posts as we continue this pilot. If you have used Quizzes.Next before and have other tips/tricks, or are holding off because of pending features, please comment below!

Hi Everyone,


I'm usually not one to write too many blog posts, and I really debated the best place to put this.  As Ally is an accessibility tool, it could certainly have gone in the accessibility group (and having begun my community college career in DSPS, I do have a soft spot for UDL and 508/ADA compliance--so important for student success), but this has more to do with implementation and the challenges regarding our processes and the complexity of getting a tool of this scope in place at a multi-college district with over 50k FTES.  I do believe this is more applicable to this Higher Education group, as there are specific challenges that we face in our environment that may not apply to some of the other sectors.  Also, please forgive me, as I've left some of this intentionally vague so that I don't identify any specific folks at our district; everyone is wonderful to work with here.


To begin, I was part of a subgroup charged with analyzing which tools we could adopt to enhance accessibility for students, and after looking at a few options it was determined that our best path forward was to explore Blackboard Ally.  We piloted Ally for a semester, and after positive feedback from the small testing group we signed a 3-year contract.  The thinking was that, since the tool had proven valuable in a somewhat limited capacity with that small group, we could begin an opt-in rollout to specific courses where faculty could use the tool the first semester (where we could provide additional training and use those experiences to develop additional resources), then roll it out to all courses the following semester.


The main complexity started when we began to look, as a district, at how the content that Blackboard Ally identified as needing some level of remediation was actually going to be remediated.  Given the sheer amount of content that we need to remediate, it is a daunting task.  As I mentioned above, we're a pretty large district, with four colleges and over 50k full-time equivalent students.  Looking back at just one semester of content that Ally identifies, we can see almost 800,000 pieces of content.  While the course numbers are a bit inflated because we create a course shell for every section, the content number fully reflects what's in Canvas.


LRCCD FA18 Ally Stats


This leads me to the challenge that we are still facing, and why we have had to delay our rollout: we simply need a comprehensive plan for how this content is going to be remediated.  Right now we have courses containing content that is not fully accessible, and we can see that in the account-level reporting.  We are not looking at or evaluating the course-specific accessibility reports, though they are available.  The challenge is that the content was there before we implemented Ally, just as it is there now with Ally implemented; the only difference is that we can no longer plead ignorance or pass the buck when we have reporting that shows we do have inaccessible content.


We are now having to come up, somewhat on the fly, with plans for how to help faculty remediate content.  Many of the courses we have are fully online (and fully developed) and have been taught, and have continually evolved, for years.  When there are hundreds of pieces of content, each of which can take between minutes and hours to remediate, it is too large a burden to expect faculty members to fully remediate the content themselves in a timely manner.  We are evaluating options such as hiring more faculty coordinators at each campus to help with remediation, hiring district-wide instructional designers to remediate content, offering faculty stipends for content remediation above their regular teaching load, etc.  With four colleges, many decision makers needing to be consulted, and the ultimate decision needing to be negotiated with faculty, this process cannot be accomplished in a week or even a month.  It is critical we get this done for students, as they need fully accessible content, but there are so many considerations to be made that it is quite the process.


In closing, the main reason for making this post was to inform others regarding the challenges that are presented once you begin identifying inaccessible content.  Hopefully you have a good experience using whatever tool or solution that your institution chooses, and I just want to make sure that those charged with making those decisions consider the implications when they choose to implement their solution.  Having a comprehensive plan regarding how to remediate content is very valuable.


Thanks for your time reading this.



We made some cool Canvas stickers! 


We plan to have a Canvas kick-off event for our students soon. If they show us they have the Canvas Student app installed on their mobile device, we're going to give them one of these cool Canvas stickers as a fun way to help with marketing and get them excited about our new LMS.


If anyone wants the Google Drawings files to create (or modify) for your own stickers, see below:



The vendor we used is Sticker Mule; they were fantastic to work with, and the stickers turned out great. These stickers are 2 inches in diameter, but the vendor offers all kinds of different shapes and sizes to choose from.


Colored Canvas sticker animated GIF

Are you interested in university governance and academic leadership? Join our LEAD-Community!


   The LEAD2 project is a Capacity Building in Higher Education project supported by the Erasmus+ programme. The overall objective of the LEAD2 project is to strengthen the capacity of higher education institutions in governance and academic leadership and build an online Knowledge Base and an EU-China Centre on university governance and academic leadership in the context of innovation and internationalisation of higher education. The project involves different European and Chinese universities.


We have created a space where teachers, academic leaders, managers, and administrators can exchange knowledge and ideas about university governance and academic leadership in higher education. We also post news and events from the LEAD2 project.

Join us to find out more!!!


We also offer LEAD MOOCs on university governance and academic leadership in Canvas. Enroll for FREE!! 


Visit our website  


LEAD2 project team



My name is Alan Kinsey, and I am the Instructional Design Specialist at Holmes Community College. We are doing research into how other community colleges and eLearning departments function and exist in their contexts. We have created a short survey to gather this information. This survey is geared to gather information regarding fully online courses, e.g. distance learning, eLearning, online learning, etc.


If you have a few moments, please fill out this survey to help us in our research efforts:


Any information gathered in this survey will be used for research purposes only. Your responses will be kept confidential and will not be shared.


If you have any questions about this survey, please let me know. Have a great day, and thank you for your time!

     Registration is now open for the Mainstreaming Virtual World Learning Colloquium @  The Colloquium is a free event -- on December 1, 2018 (9:00 a.m. - 2:30 p.m. Central Time) -- in the AvaCon Grid Quaternary Stadium (as well as a community event affiliated with the upcoming Open Simulator Community Conference). Still, seating is limited – so reserve your space now!  Featured speakers at the Mainstreaming Virtual World Learning Colloquium include:

  • Joyce Bettencourt (Nonprofit Commons);
  • Cynthia Calongne (Colorado Technical University);
  • Valerie Hill (Community Virtual Library Database);
  • Crista Lopez (University of California-Irvine);
  • Kay McLennan (Tulane University);
  • Eileen O’Connor (SUNY Empire State College);
  • Andrew Stricker (Air University);
  • Barbara Truman (University of Central Florida); and
  • Rachel Umoren (University of Washington).

     Learning simulations created in an Open Simulator virtual world are an excellent choice for educators and have immediate (and still-to-be-developed) uses within a Canvas course site.  Virtual world simulations are highly immersive, inexpensive (in comparison to commercially available platforms), infinitely customizable, easily duplicated/saved for new uses/re-uses, and FERPA-compliant (when a grid is closed to external visitors).  Also, while a single sign-in to an Open Simulator virtual world through a Canvas course site is not yet available, there are numerous ways to embed virtual world simulation content directly into a Canvas course site.  For example, instructors can use virtual world learning simulations to:

  • Create screen capture video clips of case study dramatizations created within a virtual simulation;
  • Create stationary image or 360 video scenes with embedded hotspots;
  • Capture unique [screen capture] images to illustrate course content; and
  • [When students are given the option to participate in virtual discussions and/or tours of course-related learning simulations...] Transmit information about the optional virtual world discussions/tours through discussion forum communications and/or content pages. 

     On the topic of the future integration of the virtual world viewer into Canvas, Tulane University successfully used virtual machine software to virtualize the viewer needed to access an Open Simulator virtual world (and accordingly, eliminated the "insufficient computing power" barrier that previously limited student participation in virtual world discussions/tours).  Still, educator/Canvas community help is needed to realize the option of the complete integration of an Open Simulator virtual world viewer into a Canvas course site page. 

     Again, registration is now open for the Mainstreaming Virtual World Learning Colloquium @  The Colloquium is a free event -- on December 1, 2018 (9:00 a.m. - 2:30 p.m. Central Time) -- in the AvaCon Grid Quaternary Stadium (as well as a community event affiliated with the upcoming Open Simulator Community Conference). Still, seating is limited – so reserve your space now!

This is somewhat of a note to myself.  Up until recently, YouTube would allow sharing videos without related videos using an option on the Share > Embed popup.  I was updating some pages last night and noticed they no longer offer this option.  It is still possible to keep related videos from showing; you just need to append ?rel=0 to the embed URL.


For example, the src attribute of the embed code has a value like this:


Here is where the ?rel=0 (that is a zero, btw) should be entered:
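As a sketch of where the parameter lands (the video ID below is a made-up placeholder, not a real one), a small Python snippet using the standard library's urllib.parse appends rel=0 while preserving any parameters already on the embed URL:

```python
from urllib.parse import parse_qs, urlencode, urlparse, urlunparse

def add_rel_zero(src: str) -> str:
    """Return the embed src with rel=0 added, keeping any existing query params."""
    parts = urlparse(src)
    query = parse_qs(parts.query)
    query["rel"] = ["0"]  # 0 is the value that suppresses related videos
    return urlunparse(parts._replace(query=urlencode(query, doseq=True)))

# "abc123" is a placeholder video ID
print(add_rel_zero("https://www.youtube.com/embed/abc123"))
# → https://www.youtube.com/embed/abc123?rel=0
```

Hand-editing works just as well; the rule of thumb is that parameters go after the video ID, joined with ? (or with & if the URL already contains a ?).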



Hope this helps someone else.  I can't be the only one who doesn't want my students to know that I'm looking for a "cute haircut for fat ladies."

The short answer is no, Cengage Unlimited doesn’t change Canvas courses you’ve already built. However, instructors who have students with subscriptions may ask you to add a new Cengage Unlimited link to their course.


To do that, check out this quick how-to video that walks you through it.


If you haven’t heard about it, Cengage Unlimited is a subscription service that students can purchase to gain access to all Cengage ebooks, digital learning platforms, and more. 


For more information, join the discussion in the Cengage community or feel free to message me with questions. 

Want students’ handwritten work, like diagrams, calculations, and drawings, to appear directly in Canvas the way typed work does? Check out our demo and learn about our exclusive early adopters program during our webinar on June 12th at noon Eastern Daylight Time. Sign up here:


- Miranda
