The Dashboard is the first screen you see in Canvas after successful authentication. At the University of Minnesota it defaults to “Card View,” which represents your courses as “cards,” though I tend to call them tiles. The cards may have images, or they may simply be a color.
If you don’t customize the Dashboard at all, Canvas decides what appears there, and in what order. It usually does not do a great job of guessing which courses you want as cards, so it’s best to dictate that yourself.
Customizing the “Card View” is easy, but not obvious.
The first thing you should know is that if you add even one course to the Dashboard manually, the default automatic list disappears. So, once you start customizing, you control it all.
Once you understand that this is how it works (and remember it again next semester), you are in business: you can quickly change your Dashboard to match your semester.
You can click a course card’s menu, click “Move,” then select “Unfavorite.”
Drag and drop your cards into your preferred order.
Once you understand how it works, your daily interactions with Canvas will become easier and, digitally speaking, more feng shui.
Related to this:
One of the big hassles with testing accommodations is having students submit a testing request EVERY TIME for each individual test within any course. This is no longer the case with New Quizzes.
Here is an example from Pasadena City College (PCC), our closest local community college, which also utilizes Canvas as their LMS.
Students will be prompted to submit their current Accommodation Plans, to let their instructors know that they will take tests under accommodated conditions, and to note that the local Disabled Student Program department will contact the instructors. They will need to have their syllabi ready.
For New Quizzes, the same rules apply. However:
TEST REQUEST FORM AVAILABILITY
Some institutions only open the request form during certain windows. Check with your school for more information.
As always, students should submit their requests as early as possible to guarantee a spot, especially during finals.
Students who work in groups simply learn better; that is what years of research have shown us. But managing groups can be difficult, and not just during class. Canvas has some group functionality that instructors can use to manage group work:
There are tricks to using groups, group assignments, and group discussions, but today we are focusing on how to change group membership in the middle of a term.
When you attempt to move a student into a new group after group submissions have already occurred, you get a very special warning:
To be safe, most instructors will then create a new group set and submit. Here are the problems that follow:
You have several more steps to finish changing the membership. Here's what you need to do next.
Now you have adjusted the groups, and all future group assignments and discussions will be set up for the new groups.
Unless you missed one. And if you did, once there is a submission, you can't change the group set; your only option is to duplicate the assignment, choose the correct group set, and ask students to submit again.
This way may not work for all instructors because it requires one best practice that must always be done when grading Group Assignments:
Once Group grades are entered, edit the assignment and check Assign Grades to Each Student Individually and save.
There may be reasons you already do this: it allows you to change the scores of group members who did not contribute or were absent. The key is that this setting LOCKS IN all the grades as individual grades, which gives you the freedom to change the existing group.
That's it, you are done. Future assignments are still using this group set with the newly modified group. Past graded assignments have the grades locked to the individuals and are not changed.
Unless you weren't done grading a group assignment. That's the only condition: all group grading must be done and set to Assign Grades to Each Student Individually. And this could be a pain if you have many changes to make.
Which option you choose will depend on where you are in the semester and on your normal grading practice for group grades. Because Canvas prompts users to clone the group set, that is the easiest, safest solution, but you must still do the additional work of changing the groups and then updating the future assignments and discussions. If you always change group grades to the individually graded option, ignoring the warning might be for you.
Need more information or a different explanation? Check out Canvas: Changing Group Membership during a Semester.
*The CBS-RLT Tech Tip is written by academic technologists at the University of Minnesota, College of Biological Sciences. It may contain references to Canvas settings and integrations that are specific to that institution.
My favorite Ideas for improving groups in Canvas. The ones without links are feature Ideas I haven't found yet.
As a humanities teacher, I love using the RSS feed for Announcements. There are some phenomenal news feeds and podcasts that support a variety of my course content, and it is awesome to have the announcements automatically appear in my Canvas courses.
My biggest frustration, though, was when I found great resources while navigating the internet that I wanted to make available for my students. I would copy the address, open my Canvas instance, navigate to the particular course, open an announcement, embed the URL with an explanation for my students, and publish it to my course.
What if you're on your phone and find a great link while browsing social media? The steps to posting can be prohibitive. Instead, you can set up an external feed and "clip" articles to it!
There are two different methods (that we know of): Evernote Web Clipper and OneNote Web Clipper. This post will address Evernote, but the steps are similar for OneNote!
Steps for Creating a Customized RSS Feed using Evernote:
NOTE: There will be a delay between when you clip an article and when it appears in your Announcement feed. Most of my tests are delayed a few hours, but I have seen shorter and longer!
Enjoy customizing your own RSS feed!!
Over the summer we gave the New Gradebook a good “tire kicking” before releasing it en masse this fall. Most of the changes are subtle; some are powerful but just under the surface. Others are quite informative, yet confusing until you look a little deeper. And there are some advanced features you may be excited to try. So, “What’s new?”
Let’s start with the cool stuff (cool on an Academic Technology level, anyway). I’m referring to the new filtering feature, which narrows the gradebook down to only the specific group of students you want to see. The new gradebook makes it possible to filter your list of students by assignment group, by module, by section, and by student group.
So, it might be used to find the:
Of course, you don’t have to drill down that far for it to be useful. The filter feature is super handy, but you have to pull it into view, or that functionality sits hidden below the surface.
To pull the filters into view, go to “View” > “Filters” in the Gradebook and click the one(s) you want to try.
In contrast, there are a few things that are immediately in view in the new gradebook that are, let’s say, not perfectly clear. One such item is the color-based “status” indication: by color, you can tell whether an assignment is late, missing, or excused. If the colors don’t work for you, you can change them.
The defaults are:
Blue: Late submission
Red: Missing submission
Green: Resubmitted assignment
Orange: Dropped grade
Yellow: Excused assignment
Also in the realm of informative-but-potentially-confusing are the icons that appear in the new gradebook. At a glance you can tell a lot about the current grade situation; however, you may not have a clue what the iconography means. I find it easiest to look up the key and instructions in the Canvas document specific to the icons and colors in the New Gradebook.
However, there is one icon that I need to call to your attention right now: the hidden symbol. It may appear at the top of a grading column, and it indicates that the manual grading policy is set for that column. It also means that at least one grade is not visible (posted) to students for that assignment. This icon shows up by default for those who used the “mute/unmute” feature in the previous version of the gradebook. That feature allowed you to hide (mute) the grades from view, enter the grades, and post (unmute) all of them at a later time. The change is that you can now start with all the grades hidden and, if you want, post (make visible to the student) one grade at a time. So, if you previously used mute/unmute, Canvas assumes that you want the manual grading policy. You do have a choice, though: you can set your grades to post automatically by changing the Grade Posting Policy, but that isn’t immediately obvious just from looking at the screen. You may want to look closer at this one in How do I use New Gradebook?
More functionality includes setting late policies and curving grades. Inserting zeros is also handy via the set default grades feature, and you can choose to override your final grades column for whatever reason.
I hope that helps give you a better idea what the New Gradebook is all about. If you want help with any of this, shoot us an email, or leave a message below.
It's the start of another school year, so what better time to start using a tool that makes communicating with your students much easier? Announcements is much better than just sending an email to your whole class. If you are not using the Canvas Inbox, you have to go through the trouble of collecting all your students' emails and making a group list in your email program. If you are using the Canvas Inbox, it's pretty easy to email the whole class, but with the flood of email students get from instructors and TAs at the start of a semester, it is really easy for a message to get lost.
Wouldn't it be better if you had a tool that:
Bonus points if that tool would allow you to:
This is why you should be using Announcements. Announcements has all of those features, including all the bonuses!
When you want to send an announcement to your entire class, choose to add an announcement (instructions).
To ensure your announcement shows at the top of the home page, you need to adjust the course settings.
This process is pretty easy. When you import your course content from one Canvas site into another, choose to shift the dates (instructions). Not only will the assignment dates shift, so will your announcements' delayed posting dates! Never again will you have to re-create that test reminder email!
Did I miss any Announcement functions? How do you use this tool in your courses?
"If you can't measure it, you can't improve it"* is the inspiration behind this blog post. In this post, I discuss why a way to measure contract cheating is necessary and propose a measurement metric.
The motivation behind this (and future) posts is to journal the process of building this cheating-measurement tool, collecting feedback, and getting some help along the way. So, if anyone has any thoughts or is interested in helping, please feel free to comment.
Okay, so, the question is...
Over the years, we've all seen interventions against contract cheating increase. And interventions come in many forms: technological (software), political (bans), and pedagogical (fewer writing assignments, raising awareness). While all such news is great, there is a larger question: how do we know the interventions are working? I feel this is a difficult, yet crucial, question to ask (and answer!).
A measurement tool is as necessary as the interventions themselves. Why? Because we will eventually need the measurement tool to gauge the efficacy of the detection/prevention tools. How else can we tell if any government policy/technology is really hurting the businesses of essay mills?
The next question then becomes...
Self-reporting seems like a sub-par way to measure contract cheating interventions, in my opinion. Since that approach is biased (and unverifiable), my tiny brain proposes the following: measure the popularity of contract cheating websites and essay mills. I mean, if cheating is decreasing, contract cheating websites will be less popular, and vice versa, right?
Since we obviously don't (and never will) have the actual data of students cheating, I think the popularity of contract cheating websites is the ideal proxy/stand-in to measure the cheating market.
The most straightforward (and reliable) data we can get on a website's popularity is its traffic/analytics data. But there are hundreds, if not thousands, of essay mill and contract cheating websites.
The next question then becomes...
Fortunately, other people have run into the same problem, and they solve it like this: they create an index. For example, there are 2,400 companies listed on the stock exchange, but the DJIA (Dow Jones Industrial Average) pools the data of only the 30 largest companies and monitors their prices over time. This 'average' then becomes a proxy for the entire stock market and, by extension, the economy, much like how our website traffic data will be the proxy for the entire cheating economy.
The next question then becomes...
I'm going to go out on a limb and call it the 'Contract Cheating Index' (CCI). But if you have any better names, please feel free to suggest them. Anyway, I feel we have something to build upon now.
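To make the DJIA analogy concrete, here is a minimal sketch of how a basket-average index like the proposed CCI could be computed. The site names and traffic numbers below are invented purely for illustration:

```python
# Toy sketch of a Contract Cheating Index (CCI): average the traffic of a
# fixed basket of tracked sites, the same way the DJIA averages a fixed
# basket of stocks. All site names and visit counts here are made up.

def cci(traffic_by_site):
    """Mean monthly visits across the tracked basket of sites."""
    return sum(traffic_by_site.values()) / len(traffic_by_site)

# Hypothetical snapshots of the same basket at two points in time.
baseline = {"mill-a.example": 120_000, "mill-b.example": 80_000, "mill-c.example": 40_000}
later    = {"mill-a.example": 90_000,  "mill-b.example": 60_000, "mill-c.example": 30_000}

print(cci(baseline))  # 80000.0
print(cci(later))     # 60000.0 -- a falling index would suggest interventions are biting
```

The choice of basket matters as much as the formula: like the DJIA's 30 companies, the index only needs the largest, most representative sites, not every essay mill in existence.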
Which begs the question...
The plan of action is:
In the next post I shall do tasks 1 and 2 and get a sense of the data. Just a heads-up: our traffic data will come from Alexa (not the speaker, the website), so if anyone can find the time to collaborate with me on this, that would be fun. Maybe Kona Jones, with your statistics experience?
For now, this journey has to stop here. I hope you enjoyed reading as much as I did writing. What's getting me excited is that in the next post I'll actually have some numbers to play with and data to share! Ain't that fun!
*I think the quote is attributed to Peter Drucker.
This CanvasTip actually came from one of my faculty and I thought it was definitely worth sharing.
In a user's account Notification Preferences, there's an option under Alerts called Content Link Error. I never paid much attention to it, but if you hover your mouse over it, it explains that this preference will notify an instructor of the location and content of a broken link that a student has interacted with inside a course. The default setting for this preference is Daily, which may be fine, but I suggest changing it to Right away ✅ instead.
Think about it: if you're teaching a course and a student tries to access something and is presented with an error, how do you think that student will feel? My guess is probably annoyed. If you were notified right away and could potentially fix it in a matter of minutes, you could help avoid any further headaches for your students.
Below is a link to the help guide in Google Doc form:
Please share if this is helpful!
New Quizzes will eventually replace the default Canvas quizzing tool, but in the meantime, there's still a lot of development needed to bring it to feature parity. Here's what led The Wharton School to start using New Quizzes sooner rather than later.
One of the largest core courses taken by all undergraduate students at Wharton is "Introduction to Operations, Information and Decisions" or OIDD 101. Depending on the term, this intro course will have up to 500 students enrolled. The bulk of the course grade comes from six online quizzes--each one has a mix of 10 multiple choice and numeric answer questions. Often, there is more than one way to interpret a question, resulting in the need to regrade quizzes after they are submitted and recalculate student scores.
In classic Quizzes, regrading is triggered by certain actions (e.g., changing the correct answer) and is only available for certain automatically graded question types. Unfortunately, classic Quizzes does not allow regrading for numeric question types. While infrequent, when the need to regrade a numeric question does arise, it's a pretty big headache. In the last instance of this course, even a small handful of regrades resulted in a few hours of manual regrading. And that's just for one course! Even as I was writing this blog post, I received a report of a manual regrade needed for a numeric question in a quiz taken by 240+ students . . .
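Conceptually, what a regrade tool automates is simple: when the answer key changes, every submission's total is recomputed against the new key. This toy sketch (not Canvas's actual implementation; all names and numbers are invented) shows the idea for numeric questions:

```python
# Toy model of a numeric-question regrade: after a second interpretation of
# a question is accepted, recompute all totals against the updated key.

def is_correct(response, accepted, tolerance=0.0):
    """A numeric response is correct if it matches any accepted value within tolerance."""
    return any(abs(response - value) <= tolerance for value in accepted)

def regrade(submissions, answer_key, points_each=1.0, tolerance=0.0):
    """Recompute each student's total score against the (possibly updated) key."""
    return {
        student: sum(
            points_each
            for question, response in answers.items()
            if is_correct(response, answer_key[question], tolerance)
        )
        for student, answers in submissions.items()
    }

submissions = {"alice": {"q1": 12.50}, "bob": {"q1": 12.49}}
old_key = {"q1": [12.50]}         # only one value accepted at first
new_key = {"q1": [12.50, 12.49]}  # a second interpretation accepted after review

print(regrade(submissions, old_key))  # bob scores 0.0
print(regrade(submissions, new_key))  # bob now scores 1.0
```

Doing this by hand for hundreds of students is exactly the "few hours of manual regrading" described above, which is why an automated regrade for numeric questions was the deciding feature.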
If you've reviewed the Quizzes.Next FAQ or Feature Comparison pages recently or even started exploring the tool yourself, you know that while there are a lot of new features and question types in New Quizzes, there are still several pending features for development. These include some fundamental features, such as the Preview tool, the ability to allow additional attempts, LockDown browser compatibility, Surveys, and downloadable student and item analysis reports. After weighing the pros and cons of the feature comparison chart, the promise of a more robust regrade tool won us over and generated interest in piloting the tool for OIDD 101.
We had hoped to start small, by migrating a few low-stakes practice quizzes to the new platform first. But when the faculty told us that practice quizzes would be given on paper this year and that New Quizzes would be used for the bulk of the course grades, we quickly went from dipping a toe into the pool to doing a full cannonball. Fortunately, we had the consolation of knowing that if anything did go wrong, we could always revert to classic Quizzes within the same course.
After securing faculty support (the lack of numeric regrade had been a major pain point for the three instructors, so they were eager to try something new), we enabled New Quizzes for a single sub-account and also enabled the "Quiz Log Auditing" feature option. This was key to accessing the View Logs, which were extremely helpful in troubleshooting issues later on. Two teaching assistants created the quizzes, after which we checked the settings thoroughly before the quizzes were published (our workaround for the lack of a Preview tool). Because the quizzes were named "Assignment 1," "Assignment 2," and so on, rather than "Quiz 1," "Quiz 2," students were able to find them easily on the "Assignments" page. Students said they liked the look of the new interface, while the TAs and instructors found it intuitive to build new quizzes and add images to questions. The regrade feature correctly recalculated grades for numeric answer questions (hooray!) and even handled multiple regrades of the same question (a problem with classic Quizzes). Based on this success alone, the faculty have already agreed to continue using New Quizzes in the Fall term.
1. No Auto-Submit with "Until" Date: Each quiz was available to students for an entire week and late submissions were not accepted. Expecting the same functionality as in classic Quizzes, faculty told students that any quiz not submitted by the "Available Until" date would be automatically submitted by Canvas. When this didn't happen as anticipated for Assignment 1 and 10-15 students were left with "In Progress" quizzes, faculty felt like they had lied to students. To fix this issue, we re-opened the quiz for the students with an "In Progress" status, masqueraded as them, and then submitted on their behalf the responses they had added as of the due date (found under "Moderate" > "Attempts in Progress" > "In Progress" log).
For the next quiz, faculty stressed the importance of manually clicking the "Submit" button in order for Canvas to process their quizzes. While there were still a few students each quiz who didn't deliberately click "Submit" (or assumed that clicking "Submit" once, without clicking "Submit" again when the Submission Confirmation message popped up, was sufficient), these incidences lessened over the course of the term.
2. No Quiz Log Data Saved: In a small handful of instances, students claimed to have answered all the questions, but their responses were not recorded in the quiz logs. After much troubleshooting, we realized that a specific behavior was causing the loss of data. Since these quizzes were available to students for a week at a time with no time limit, many students were leaving them open in their browsers for extended periods, sometimes several days, without refreshing or closing the page. In that time the Canvas session timed out, so by the time students went to input their responses, the data could not be pushed to the server. Unfortunately, when this happens, little information other than a timestamp for when the student began the quiz is recorded, even in Instructure's server logs. Students can avoid the problem by refreshing the page often or, preferably, closing out of the quiz any time they are not actively working on it.
3. On-Time Submissions Marked Late [FIXED]: If a student submitted a Quizzes.Next quiz within a few minutes of the due date/time, sometimes a processing lag in SpeedGrader resulted in the submission being marked late in the Gradebook. This bug could even happen for on-time submissions that were initially marked as on-time, but then manually graded after the due date! In our situation, the faculty were very understanding of this bug and knew that students weren't actually submitting quizzes late because of the availability dates. But for courses that have New Gradebook enabled and set to automatically deduct points for late submissions, this would be a more serious concern.
With only one course in the pilot and many more developments in the pipeline for New Quizzes, we still have a lot to learn. But we've also gained a lot of experience in this first go-round. Below are some things we've discovered along the way:
Thanks for reading about Wharton's initial experience with Quizzes.Next/New Quizzes! I'm looking forward to presenting about New Quizzes at InstructureCon 2019 and sharing follow-up blog posts as we continue this pilot. If you have used New Quizzes before and have other tips/tricks, or are holding off because of pending features, please comment below!
I'm usually not one to write too many blog posts, and I really debated the best place to put this. As Ally is an accessibility tool, it could certainly have gone in the accessibility group (and having begun my community college career in DSPS, I do have a soft spot for UDL and 508/ADA compliance, so important for student success), but this has more to do with implementation and the challenges regarding our processes and the complexity of getting a tool of this scope in place at a multi-college district with over 50k FTES. I believe this is more applicable to this Higher Education group, as there are specific challenges that we face in our environment that may not apply to some of the other sectors. Also, please forgive me, as I've left some of this intentionally vague so that I don't identify any specific folks at our district, as everyone is wonderful to work with here.
To begin, we had a subgroup that I was part of that was charged with analyzing which potential tools we could adopt in order to enhance accessibility for students, and after looking at a few options it was determined that our best path forward was to explore Blackboard Ally. We piloted Ally for a semester, and after positive feedback from the small testing group we then signed a 3 year contract. The thinking was that after using the tool in a somewhat limited capacity with that small group it was found to be valuable, and we could then begin an opt-in rollout to specific courses where faculty could use the tool the first semester (where we could provide additional training and use those experiences to develop additional resources), then roll it out to all courses the following semester.
The main complexity started when we began to look, as a district, at how the content that Blackboard Ally identified as needing remediation was actually going to be remediated. Looking at the sheer amount of content we need to remediate, it is a daunting task. As I mentioned above, we're a pretty large district, with four colleges and over 50k full-time equivalent students. Looking back at just one semester of content that Ally identifies, we can see almost 800,000 pieces of content. While the course numbers are a bit inflated because we create a course shell for every section, the content number fully reflects what's in Canvas.
This leads me to the challenge we are still facing, and why we have had to delay our rollout: we simply need a comprehensive plan for how this content is going to be remediated. Right now we have courses with content that is not fully accessible, and we can see that in the account-level reporting. We are not yet looking at or evaluating the course-specific accessibility reports, though they are available. The challenge is that the content was there before we implemented Ally just as it is there now; the only difference is that we can no longer plead ignorance or pass the buck when we have reporting that shows we have inaccessible content.
We are now having to come up with plans, somewhat on the fly, for how to help faculty remediate content. Many of our courses are fully online (and fully developed) and have been taught and continually evolved for years. When there are hundreds of pieces of content, each of which can take between minutes and hours to remediate, it is too large a burden to expect faculty members to fully remediate the content themselves in a timely manner. We are evaluating options such as hiring more faculty coordinators at each campus to help with remediation, hiring district-wide instructional designers to remediate content, offering faculty stipends for content remediation above their regular teaching load, etc. With four colleges, many decision makers needing to be consulted, and the ultimate decision needing to be negotiated with faculty, this process is not something that can be accomplished in a week or even a month. It is critical we get this done for students, as they need fully accessible content, but there are so many considerations to be made that it is quite the process.
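A back-of-envelope calculation shows why remediation cannot simply be added to faculty workloads. The per-item time below is an assumption chosen for illustration (real items range from minutes to hours, as noted above):

```python
# Rough workload estimate for the remediation backlog described in the post.
# ITEMS comes from the ~800,000 flagged pieces of content in one semester;
# MINUTES_PER_ITEM is an assumed average, not a measured figure.

ITEMS = 800_000
MINUTES_PER_ITEM = 15

total_hours = ITEMS * MINUTES_PER_ITEM / 60          # 200,000 hours
full_time_years = total_hours / 1_800                # ~1,800 working hours/person-year

print(f"{total_hours:,.0f} hours, roughly {full_time_years:.0f} person-years of work")
```

Even at an optimistic 15 minutes per item, the backlog amounts to on the order of a hundred person-years of effort, which is why dedicated coordinators, instructional designers, or stipended remediation time become necessary.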
In closing, the main reason for making this post was to inform others regarding the challenges that are presented once you begin identifying inaccessible content. Hopefully you have a good experience using whatever tool or solution that your institution chooses, and I just want to make sure that those charged with making those decisions consider the implications when they choose to implement their solution. Having a comprehensive plan regarding how to remediate content is very valuable.
Thanks for your time reading this.
Are you interested in university governance and academic leadership? Join our LEAD community!
The LEAD2 project is a Capacity Building in Higher Education project supported by the Erasmus+ programme. The overall objective of the LEAD2 project is to strengthen the capacity of higher education institutions in governance and academic leadership and build an online Knowledge Base and an EU-China Centre on university governance and academic leadership in the context of innovation and internationalisation of higher education. The project involves different European and Chinese universities.
We have created a space at Facebook.com/LEAD_community where teachers, academic leaders, managers, and administrators can exchange knowledge and ideas about university governance and academic leadership in higher education. We also post news and events from the LEAD2 project there.
Join us to find out more!!!
We also offer LEAD MOOCs on university governance and academic leadership in Canvas. Enroll for FREE!
LEAD2 project team
My name is Alan Kinsey, and I am the Instructional Design Specialist at Holmes Community College. We are doing research into how other community colleges and eLearning departments function and exist in their contexts. We have created a short survey to gather this information. This survey is geared to gather information regarding fully online courses, e.g. distance learning, eLearning, online learning, etc.
If you have a few moments, please fill out this survey to help us in our research efforts: https://goo.gl/forms/7keb5yjWX9sWdr4C2
Any information gathered in this survey will be used for research purposes only. Your responses will be kept confidential and will not be shared.
If you have any questions about this survey, please let me know. Have a great day, and thank you for your time!