Anyone using Power BI with Canvas Data? Looking to collaborate on building templates.
I've been doing just that. You can see a couple of examples here.
Examples look interesting. Have you tried pulling anything from AWS Canvas Data?
No, I haven't. All our Canvas data is surfaced in a data lake.
To expand on Daniel's response, we download daily from the Canvas Data Portal using the canvasDataCli tool. This is then uploaded into an Oracle database and presented as views in the data lake that Daniel refers to.
It seems like a lot of institutions are using the bulk downloads. We are querying Redshift via ODBC into Excel files and putting them in SharePoint. Our Power BI reports refresh from those SharePoint files, which auto-refresh through a program called Power Update or a PowerShell script. Reworking this via SQL Server is on the agenda but not crucial, as we are not dealing with huge amounts of data.
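The query-to-file step above can be sketched in a few lines of Python. This is just an illustration: it uses sqlite3 as a local stand-in for an ODBC connection to Redshift, and the table and column names in the usage example are made up, not real Canvas Data schema.

```python
import csv
import sqlite3  # stand-in here for a pyodbc/ODBC connection to Redshift


def export_query_to_csv(conn, sql, out_path):
    """Run a query and dump the result set to a file, roughly the
    Redshift-to-file step described in the post (the real setup writes
    Excel files to SharePoint; CSV keeps this sketch dependency-free)."""
    cur = conn.execute(sql)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur)
```

A scheduled task running something like this nightly would keep the files that Power BI refreshes from up to date.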
In the future I would like to develop an application which connects Power BI to Redshift via the API interface. All the data analysis and visuals would be created through Python scripts, and the managing program would be written in C++. This is still VERY far away though.
Our requests data is currently about 200 GB and growing at about 3 GB per day during the semester. What is your experience of performance when querying requests data via Redshift?
When downloading using the canvasDataCli sync process, only files which don't already exist in the local data store are downloaded, so it's not really a bulk download.
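The skip-what-you-have behaviour described above can be illustrated with a short sketch. This is not the tool's actual implementation, just the idea: compare the remote file list against what is already on disk and fetch only the difference.

```python
from pathlib import Path


def files_to_download(remote_files, local_dir):
    """Return the remote dump files not already present in local_dir,
    mirroring how `canvasDataCli sync` skips files it has already
    fetched. Illustrative only, not the CLI's real logic."""
    local = {p.name for p in Path(local_dir).iterdir() if p.is_file()}
    return sorted(f for f in remote_files if f not in local)
```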
There are only two reports I have created which rely on the Requests table. For both reports I am analyzing fairly specific things, so the queries I am running are not pulling in more than 500k rows. I also do not bring the tables into Excel; rather, I load them into the data model through Power Query. Usually, there is a good bit of filtering I can do within Power Query before that as well. For Power BI the data query process is very similar to Power Query. It does a good job of compressing the data, so you can get a good amount in before it starts to lag.
The only other thing that helps is having a computer with an i7, an SSD, a graphics card and a good amount of RAM. If I query an entire month's worth of data it takes a very long time to complete. However, if you know what you want from the table, it is not too bad.
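The narrowing-before-load step mentioned above can be sketched in Python. The row layout here (dicts with course_id / timestamp keys) is invented for the example; real requests rows carry many more columns, and in practice this filtering happens inside Power Query rather than a script.

```python
def prefilter_requests(rows, course_ids, start, end):
    """Keep only request rows for the courses and date window of
    interest, the same narrowing done in Power Query before loading
    the data model. Keys used here are illustrative."""
    wanted = set(course_ids)
    return [r for r in rows
            if r["course_id"] in wanted and start <= r["timestamp"] < end]
```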
I have started to transition some of our reports to Power BI, although most have been unrelated to academics thus far.
If Canvas BI developers want to collaborate and/or share, count me in.
Personally, I find all this to be complicated.
I have started. How are you building your Data Model?
Currently exploring and modelling data downloaded and stored on-site, using Power BI. Looking at broad visualizations as the system is only newly launched (logins by day, enrolments). One useful graph showed semester one assignments by deadline; our libraries and support staff in particular liked this one for more accurately pre-empting peak times. Hoping to expand this to identify submission type and location so we can look at load on printers, risk of Turnitin slowness, etc.
Next step is joining the data with some user and time dimension tables. User will allow for more granularity when looking at trends within school/programme etc.; time will allow for heatmapping of activities over the course of a day. We have a very supportive database/systems team, which is really helpful.
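The hour-of-day heatmapping idea can be sketched without the dimension tables: bucket request timestamps into (weekday, hour) cells and count. A real build would join requests to a proper time dimension; this stdlib version just shows the shape of the counts a heatmap visual would plot.

```python
from collections import Counter


def activity_heatmap(timestamps):
    """Bucket datetime objects into (weekday, hour) cells and count
    them, giving the matrix a heatmap visual would display. Sketch
    only; a production model would use a time dimension table."""
    return Counter((t.strftime("%a"), t.hour) for t in timestamps)
```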
Happy to share practice if people are interested.
Any idea how to generate a customized report to find out which Canvas tools have been used in each Canvas course?
Hi SC, I'd start by querying the Requests table (this is in our database as table_requests). You can join this with courses using a field common to both tables (course_id, perhaps).
Then look in table_requests at the column web_application_controller; this will allow you to filter by tool type. Simply remove tools you're not interested in and you should be able to query tool use in a particular course.
Hope this helps. Please let me know how you get on.
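The join described above might look something like this. SQLite stands in for the warehouse here so the sketch runs anywhere; the table names (table_requests, courses) follow the post, but the join key and the courses columns are assumptions, so check them against your own schema.

```python
import sqlite3

# Counts hits per tool per course, filtering happens by adding a
# WHERE clause on web_application_controller for the tools you want.
TOOL_USE_SQL = """
    SELECT c.name AS course_name,
           r.web_application_controller AS tool,
           COUNT(*) AS hits
    FROM table_requests AS r
    JOIN courses AS c ON c.id = r.course_id
    GROUP BY c.name, r.web_application_controller
    ORDER BY c.name, hits DESC
"""


def tool_use(conn):
    """Return (course_name, tool, hits) rows for tool use per course."""
    return conn.execute(TOOL_USE_SQL).fetchall()
```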