How to Use the Canvas Data Loader

    Official Canvas Document



Overview

A small Command Line Interface (CLI) tool for syncing data from the Canvas Data API directly into any of the tool's supported database types.

NOTE: This is currently an example application, but it is supported and welcomes contributions. Please report any bugs or issues you find!

 

Benefits of this tool compared to manually downloading files:

  • Pulls all Canvas Data flat files, and the schema, into a specified database type.
  • Handles schema version changes dynamically without breaking your database.
  • Can be automated to run daily to keep your database constantly up to date.
  • Can skip historical refreshes (the backload of all pageviews in the requests table).

Install

All of this needs to be done through your terminal (OSX) or command prompt (Windows).

The GitHub page for the project can be found here.

Prerequisites

This tool should work on Linux, OSX, and Windows. You may need full admin access to download and modify certain items during the configuration steps below.

Clone Canvas Data Loader From Github

git clone https://github.com/instructure/canvas-data-loader.git 

Directions for installing Git.

Configure

The Canvas Data Loader requires a configuration file with certain fields set; it uses a small TOML file for this purpose. To generate this configuration:

1. Run

cp ./config/default.toml ./config/local.toml

2. Navigate to local.toml

Navigate to the directory that the Canvas Data Loader was cloned to on your computer. Under the config folder, open the local.toml file with the text editor of your choice.

3. Edit 'Save Location'

Within the file, edit save_location (and the other location fields, such as rocksdb_location) to point to where you want the Canvas Data output files saved, using the same key = "value" format shown in the full example under step 6.

Example #1: save_location = "/Users/PandaUser/Desktop/dataFiles"

Example #2: save_location = "/Users/PandaUser/Documents/Canvas_Data_Ex/dataFiles"

 

4. Generate API Credentials

Click on the Canvas Data API Guide for reference on generating API credentials. Once you have a key and secret, you must do one of the following:

A. Hard Coding Credentials (easier, but less secure)

    1. Open your local.toml file from step 2

    2. Replace the api_key and api_secret values with the key and secret you generated from your Canvas Data instance, surrounded by double quotes, as in the sketch below.
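For example, the relevant section of local.toml would look like the following (the key and secret values here are placeholders; use the ones generated from your own Canvas Data account):

[canvasdataauth]
api_key = "your-api-key-here"
api_secret = "your-api-secret-here"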

5. Provide URL to database

Provide the authenticated URL to the database that you wish to import into in the local.toml file (example URLs are sketched below the list).

Supported databases at this time are:

  • Postgres - Help on finding the URL.
  • Mysql - The URL format should be mysql://<username>:<password>@<host>:<port>/<db_name>
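As a sketch, the Postgres URL follows the same pattern (postgres://<username>:<password>@<host>:<port>/<db_name>). With made-up credentials, the url line in local.toml might look like either of the following:

url = "postgres://pandauser:secret@localhost:5432/panda-db"
url = "mysql://pandauser:secret@localhost:3306/panda-db"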

6. Specify the database type

Within the local.toml file, specify the database type that you are using.

Supported databases at this time are:

  • Psql
  • Mysql

The end result should look like this:

# Local file and state locations, plus import options
save_location = "/tmp/cdl-save"
rocksdb_location = "/tmp/cdl-rocksdb"
skip_historical_imports = true

[canvasdataauth]
api_key = "12345"
api_secret = "6789"

[database]
url = "postgres://pandauser@localhost/panda-db"
# Valid Values are Psql, Mysql
db_type = "Psql"

 

7. Save

Save and close the local.toml file.

8. Install Rust

The Canvas Data Loader is built with the Rust programming language, so Rust must be installed in order to build and use it. You can install Rust by visiting the Rust website here.
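For reference, a common way to install Rust on OSX or Linux is the standard rustup installer from rust-lang.org (this command is not specific to this document); Windows users download and run rustup-init.exe instead:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh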

9. Add Rust to the PATH environment variable

When the Rust installer finishes, it prints instructions in your terminal/command prompt for adding Cargo's bin directory to your PATH environment variable.

For example, on Mac you will see:

Rust is installed now. Great!

To get started you need Cargo's bin directory ($HOME/.cargo/bin) in your PATH
environment variable. Next time you log in this will be done automatically.

To configure your current shell run "source $HOME/.cargo/env"

Copy source $HOME/.cargo/env onto the next line and press Enter. Ensure that you run this command within the root canvas-data-loader directory.

The exact command will differ on Windows, but the concept of adding Cargo to your PATH environment variable is the same.

 

10. Build a Release Version

Copy cargo build --release into your terminal and press Enter.

11. Run the Importing Process

 Navigate to the Canvas Data Loader Directory within your terminal/command prompt. 

 Example: cd ./canvas-data-loader

Once in the correct directory, run RUST_LOG=info ./target/release/cdl-runner

[OPTIONAL] Automate Downloading Into Database

It is beneficial to automate running the Canvas Data Loader so that you are always working with the most recent data set.

MAC

Information on crontab automation variables

Set up a crontab to run the importer every hour (a daily variant is sketched after this list):

  • crontab -e
  • Enter the following on its own line, replacing <my_cdl_location> with the path to your importer: 0 * * * * cd <my_cdl_location> && RUST_LOG=info ./target/release/cdl-runner > /var/log/cdl-log 2>&1
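To match the daily schedule mentioned in the benefits list instead of an hourly one, only the schedule field changes; for example, this entry (with the same placeholder path) runs the importer once a day at 2:00 AM:

0 2 * * * cd <my_cdl_location> && RUST_LOG=info ./target/release/cdl-runner > /var/log/cdl-log 2>&1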

WINDOWS

Create a scheduled task that sets RUST_LOG=info and runs ./target/release/cdl-runner (see the sketch below).

Information on creating scheduled tasks
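Note that the RUST_LOG=info prefix shown above is Unix shell syntax and is not understood by the Windows Command Prompt (a pitfall that comes up in the comments below). A minimal sketch of the equivalent commands for cmd.exe, assuming the default release build path, is:

cd <my_cdl_location>
set RUST_LOG=info
.\target\release\cdl-runner.exe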

Comments

I'm having some trouble with the last step and am getting this error when running cdl-runner. 

(screenshot of the error)

This sounds as though it is an improvement on the canvasDataCli tool. Is there a plan to extend support to other databases?

Deactivated user‌  is there any facility in this tool to skip the requests table given its size grows extremely quickly?

Actually, the Canvas Data Loader only keeps one day's worth of requests. It does not accumulate each day's worth of requests for this exact reason.

I'm getting a similar error:

INFO:cdl_runner: Setting up API Client...

thread 'main' panicked at 'Failed to get List of Dumps: Error(Hypererror(Io(Error { repr: Custom(Custom { kind: Other, error: ErrorImpl { code: Message("invalid type: map, expected a sequence"), line: 1, column: 1 } }) })), State { next_error: None, backtrace: None })', src/libcore/result.rs:906:4

note: Run with `RUST_BACKTRACE=1` for a backtrace.

Hi Stuart,

We are planning to add support for SQL server. If we can manage it, our long-term goal is to try and add support for Oracle as well.

Hi Sydney,

Thanks for the update. In my experience Oracle is the only database platform which can handle a dimensionally-modelled star schema well. It supports bitmap indexes for fast fact-dimension joins, implicit star schema transformation, partitioning and compression to handle data such as requests data. Oracle support for materialised views is also the best available, with fast refresh capability and with the latest release you can even get real-time data from a materialised view.

The ability of the Data Loader to upload directly to the database and to skip full historical requests dumps sounds very useful. However, I'm not clear how schema changes are handled; clearly, if a new table is added or an old one dropped, or a column added or dropped, the database will need to be changed accordingly.

Regards,

Stuart.

I ran into this error when I had the incorrect API keys in conf/local.toml.

Is there a timeline for when Oracle will be added to the supported list?

When I ran the CLI loader, it took 40 minutes to download but ~8 hours to unpack the files, because some files are extremely big. Is there a way we can make the process faster? 8 hours is not acceptable for business. I appreciate your thoughts and creative ideas.

I have been trying to compile the cdl-runner on a windows machine and get the following error:

liblibrocksdb_sys-8964c77fbce86bb9.rlib(compaction_job.o) : error LNK2001: unresolved external symbol "public: void __cdecl rocksdb::port::WindowsThread::join(void)" (?join@WindowsThread@port@rocksdb@@QEAAXXZ)
          liblibrocksdb_sys-8964c77fbce86bb9.rlib(backupable_db.o) : error LNK2001: unresolved external symbol "public: void __cdecl rocksdb::port::WindowsThread::join(void)" (?join@WindowsThread@port@rocksdb@@QEAAXXZ)
          liblibrocksdb_sys-8964c77fbce86bb9.rlib(env_win.o) : error LNK2001: unresolved external symbol "public: void __cdecl rocksdb::port::WindowsThread::join(void)" (?join@WindowsThread@port@rocksdb@@QEAAXXZ)
          liblibrocksdb_sys-8964c77fbce86bb9.rlib(delete_scheduler.o) : error LNK2001: unresolved external symbol "public: void __cdecl rocksdb::port::WindowsThread::join(void)" (?join@WindowsThread@port@rocksdb@@QEAAXXZ)
          liblibrocksdb_sys-8964c77fbce86bb9.rlib(threadpool_imp.o) : error LNK2001: unresolved external symbol "public: void __cdecl rocksdb::port::WindowsThread::join(void)" (?join@WindowsThread@port@rocksdb@@QEAAXXZ)
          liblibrocksdb_sys-8964c77fbce86bb9.rlib(threadpool_imp.o) : error LNK2019: unresolved external symbol "public: bool __cdecl rocksdb::port::WindowsThread::detach(void)" (?detach@WindowsThread@port@rocksdb@@QEAA_NXZ) referenced in function "public: void __cdecl rocksdb::ThreadPoolImpl::Impl::BGThread(unsigned __int64)" (?BGThread@Impl@ThreadPoolImpl@rocksdb@@QEAAX_K@Z)
          C:\Users\canvas-data-loader\target\release\deps\cdl_runner-ca016ca32e7d556d.exe : fatal error LNK1120: 6 unresolved externals

No other errors were encountered during the compile.

I have been running a powershell script to unpack and load the files on a Windows machine and it does not take very long - but to your point our files are not too big as we are newbies on Canvas.

I have been running this script successfully until yesterday. Presently getting this error:

INFO 2018-08-23T13:42:24Z: cdl_runner: Setting up API Client...

INFO 2018-08-23T13:42:24Z: cdl_runner: Connecting to RocksDB Store....
thread 'main' panicked at 'Failed to open RocksDB: Error { message: "IO error: No space left on deviceWhile appending to file: /tmp/cdl-rocksdb/000302.sst: No space left on device" }', /checkout/src/libcore/result.rs:916:5
note: Run with `RUST_BACKTRACE=1` for a backtrace.

Can you point me to what I need to do to fix this?

Jumoke Oladimeji wrote:

 message: "IO error: No space left on deviceWhile appending to file: /tmp/cdl-rocksdb/000302.sst: No space left on device" }', /

Is your Hard Drive full?

Thank you. That was helpful. How about this error: "Failed to turn connection into pool. This should never happen",

HAHA! That's an !excellent error. Not too helpful.

canvas-data-loader/db_client.rs at master · instructure/canvas-data-loader · GitHub

I will ask around and investigate a bit.

I was asking the engineers lurking in #canvas-lms. After a brief explanation, they essentially pointed me to your issue...

Failed to open RocksDB Error due to Space · Issue #10 · instructure/canvas-data-loader · GitHub 

Looks like you got it resolved via GitHub moments ago.

I'm late to the party.

:)

The support around Canvas and the Community deserves applause.

We have the cdl-runner working. It is able to download the files and push the data to a MySQL database. However, we are only getting these tables: 

account_dim
course_dim
course_ui_canvas_navigation_dim
course_ui_navigation_item_dim
course_ui_navigation_item_fact
requests

There are no error messages. The downloaded flat files only include files for those tables. I've searched through the Canvas Data documentation but I can't find anything. What do we need to do to get it to pull all the tables?

Hi abe@williamsburglearning.com‌,

The list of tables you're receiving suggests that your account is in Canvas Data Reserved Mode‌. I think your next set of files will be complete since you downloaded them today. But you may need to open the Canvas Data Portal and download a file.

This was a few days ago, but the CDL is still only pulling those same tables. We have logged into the Canvas Data Portal and manually downloaded some of the tables a couple times since your reply, but there is still no difference to what the CDL is pulling. 

Is there someone we should contact to get our account out of Canvas Data Reserved Mode?

Yes, it sounds like it's time to email canvasdatahelp@instructure.com

We have a couple weeks of backlog that the CDL needs to get through, but it doesn't seem to be doing much. I can see that the cdl-runner is running, but it is using next to no CPU, and nothing seems to be happening to the database. There are no errors. I checked this morning and found the cdl-runner using 0 CPU; it had been running for 12 hours and had done nothing. I killed the process and started it again. It acts like it's running, but it is barely using any CPU and nothing is being pushed into the database.

I looked at the source code, but I'm not familiar with Rust, so I don't really know what I'm looking for.

Is there anything else I can do?

It's been a bit slow in our instance too; it takes anywhere from 24 to 36 hours to complete. I do run the Canvas Data CLI to download the latest data files, though. If I get a breakthrough, or break it :), I will keep you informed.

Performance is definitely a problem with the Canvas Data Loader. I logged an issue: https://github.com/instructure/canvas-data-loader/issues/6

As Eric noted, the performance was adequate when initially released. I had it working late last year, but something changed early this year – my suspicion is something in a dependency. I spent quite a bit of time on it and gave up since we were only looking at it as an interim solution.

Depending on where you’re trying to load the data, there are some other options. For our purposes, data-loch has been adequate for loading to Redshift: https://github.com/ets-berkeley-edu/data-loch. Note that it’s subsequently been replaced by Nessie: https://github.com/ets-berkeley-edu/nessie

Jeff

To add to the list of alternatives, we've used scripts created and shared by james@richland.edu‌ with a couple of customizations to load data to MySQL from the flat files, which we download using CanvasDataCli's grab command (only the most recent day's files).

canvancement/canvas-data/mysql at master · jamesjonesmath/canvancement · GitHub 

How to Use the Canvas Data CLI Tool

Thank you for the response. This does not make sense to me... I'm sorry.

Joanne Purk

Adjunct Instructor - ENGL 075 - Co-Req Reading & Writing

Ivy Tech Community College Northeast

Email: jpurk@ivytech.edu

Changing Lives. Making Indiana Great.

Thank you for the response. I have received emails from other jiveon regarding the Canvas Data Loader. This makes no sense to me. I am sorry.


I have received emails from other jiveon consultants. I am sorry but this makes no sense to me. I don't understand.


jpurk@ivytech.edu‌, it sounds like you are receiving email notifications of comments on this document. Would you log in to the Canvas Community to check whether you are "following" this document, and/or the Canvas Data space itself? You can fine-tune and customize the email notifications you receive from Canvas Community updates by following the instructions in How do I follow people, places, or content in the Canvas Community? and https://community.canvaslms.com/docs/DOC-14908-75187841165 Thanks!

Please help, we are having a problem running cargo build --release. All 265 items were already built; it fails on this last build, cdl-runner. It gives us the following errors.

native=D:\CANVAS\DATA\canvas-data-loader\target\release\build\librocksdb-sys-65b72aa4e148cece\out`
error: linking with `link.exe` failed: exit code: 1120 

and 

note: liblibrocksdb_sys-b99843cdeaa55502.rlib(env_win.o) : error LNK2019: unresolved external symbol __imp__PathIsRelativeA@4 referenced in function "public: virtual class rocksdb::Status __thiscall rocksdb::port::WinEnvIO::GetAbsolutePath(class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > const &,class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > *)" (?GetAbsolutePath@WinEnvIO@port@rocksdb@@UAE?AVStatus@3@ABV?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@PAV56@@Z)
D:\CANVAS\DATA\canvas-data-loader\target\release\deps\cdl_runner-bfea684aa6e62554.exe : fatal error LNK1120: 1 unresolved externals


error: aborting due to previous error

error: Could not compile `cdl-runner`.

Thanks,

I'm glad I'm not the only one. Just trying to get this going today and I am also receiving this exact error on the cargo build.

Hi guys, did anybody solve this issue already?

I opened a ticket on the Github project and was told it was an issue with one of the upstream libraries running on Windows:

Could not compile `cdl-runner` · Issue #12 · instructure/canvas-data-loader · GitHub 

Recommendation was to run on Linux.

I'm working to get Canvas Data into an AWS database. When providing the URL for the mySQL db I need to know what "port" to enter. Do you have any guidance on this?

 

- - - - -

5. Provide URL to database

Provide the authenticated url in the local.toml file to your database that you wish to import into. 

 

Supported databases at this time are:

  • Mysql - Format of URL should be mysql://<username>:<password>@<host>:<port>/<db_name>

The default port for MySQL is 3306. To find the host and port in AWS, if you're using RDS:

Go to RDS > Databases and select the database. The endpoint is the <host> value.

(screenshot of the RDS console showing the endpoint and port)
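Putting that together, a hypothetical RDS MySQL URL for local.toml (the endpoint, credentials, and database name here are made up) would look like:

url = "mysql://pandauser:secret@mydb.abc123xyz.us-east-1.rds.amazonaws.com:3306/canvas_data"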
Hello, I have already tried downloading the data from the Canvas Data Portal, but I still get only the 5 tables listed. Is there any other solution for this? (screenshot of the Postgres table list)

An update on a cdl-runner alternative: I have been able to automate the Canvas Data load on a Windows machine using batch files.

Canvas Data CLI for downloading the files

PowerShell scripts to move the files to a particular folder

A custom Java process to unzip all the files

A PowerShell script to consolidate the data from multiple files

A PostgreSQL script to upload the data.

Do you have any step-by-step instructions for the scripts mentioned?

Before you start, you should have all the Canvas objects (dim/fact tables) created in the database.

Get the Canvas Data CLI: GitHub - instructure/canvas-data-cli: Command line tool to connect and download files from Canvas Da... . You will have to install Python; check the version.

Depending on your file structure and data, you will need a script to move all the files into a single folder, as the CLI creates a separate folder for each data file.

You will also need a bulk unzip utility - WinZip, etc. - that can unzip .gz files. I wrote a simple Java routine to unzip all the files in a folder.

Once unzipped, if there are multiple files for one Canvas object (e.g. requests, submission_dim, etc.), you will need a script to consolidate the data into one file.

Last step: a SQL script that imports the data from the unzipped text files into the tables in the database (a rough sketch of the middle steps follows).
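As a rough Unix-shell equivalent of the unzip and consolidate steps (the poster used Java and PowerShell; the paths and table name here are hypothetical):

# unzip every gzipped part for one table, then concatenate the parts into a single load file
gunzip ./dataFiles/requests/*.gz
cat ./dataFiles/requests/* > ./staging/requests.txt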

I'm trying to create a crontab for Windows according to the [OPTIONAL] Automate Downloading Into Database step, and I get the following error.

Do you have any idea how to generate the script with RUST_LOG?

(screenshot of the error)

Additionally, how could the script be generated, or do you have a link?

Hi. It seems you're on a Windows environment.

The solution to the failing link.exe is to install Visual Studio Community and select the ".NET desktop development" workload. Then restart the machine/server.

I'm currently debugging the error concerning RocksDB and will update when I find a solution.

Thanks for the very detailed documentation. When I was almost there, I got an error.

The following is an excerpt from my command prompt. Please let me know where it went wrong.

C:\Softwares\CanvasDataLoader\canvas-data-loader>cargo build --release
Updating crates.io index
Downloaded base64 v0.9.3
Downloaded error-chain v0.12.1
Downloaded config v0.9.3
Downloaded flate2 v1.0.7
Downloaded glob v0.2.11
Downloaded log v0.4.6
Downloaded env_logger v0.5.13
Downloaded r2d2 v0.8.5
Downloaded mysql v14.2.0
Downloaded rocksdb v0.10.1
Downloaded futures v0.1.27
Downloaded lazy_static v1.3.0
Downloaded r2d2_postgres v0.14.0
Downloaded serde_json v1.0.39
Downloaded ring v0.13.5
Downloaded tokio-core v0.1.17
Downloaded serde v1.0.92
Downloaded serde_derive v1.0.92
Downloaded rayon v1.0.3
Downloaded chrono v0.4.6
Downloaded postgres v0.15.2
Downloaded reqwest v0.9.18
Downloaded regex v1.0.6
Downloaded nom v4.2.3
Downloaded yaml-rust v0.4.3
Downloaded itoa v0.4.4
Downloaded atty v0.2.11
Downloaded syn v0.15.35
Downloaded backtrace v0.3.30
Downloaded cfg-if v0.1.9
Downloaded miniz-sys v0.1.12
Downloaded libz-sys v1.0.25
Downloaded named_pipe v0.3.0
Downloaded winapi v0.3.7
Downloaded mysql_common v0.12.0
Downloaded rayon-core v1.4.1
Downloaded cc v1.0.37
Downloaded iovec v0.1.2
Downloaded tokio-executor v0.1.7
Downloaded either v1.5.2
Downloaded rust-ini v0.13.0
Downloaded serde-hjson v0.8.2
Downloaded proc-macro2 v0.4.30
Downloaded quote v0.6.12
Downloaded ryu v0.2.8
Downloaded version_check v0.1.5
Downloaded miniz_oxide_c_api v0.2.1
Downloaded bit-vec v0.5.1
Downloaded librocksdb-sys v5.18.3
Downloaded url v1.7.2
Downloaded parking_lot v0.8.0
Downloaded crossbeam-deque v0.2.0
Downloaded bytes v0.4.12
Downloaded tokio-io v0.1.12
Downloaded tokio-timer v0.2.11
Downloaded safemem v0.3.0
Downloaded termcolor v1.0.5
Downloaded crc32fast v1.2.0
Downloaded byteorder v1.3.2
Downloaded untrusted v0.6.2
Downloaded bufstream v0.1.4
Downloaded twox-hash v1.3.0
Downloaded smallvec v0.6.10
Downloaded mio v0.6.19
Downloaded scoped-tls v0.1.2
Downloaded tokio-reactor v0.1.9
Downloaded tokio v0.1.21
Downloaded humantime v1.2.0
Downloaded fnv v1.0.6
Downloaded toml v0.4.10
Downloaded postgres-shared v0.4.2
Downloaded libc v0.2.58
Downloaded num-integer v0.1.41
Downloaded time v0.1.42
Downloaded num-traits v0.2.8
Downloaded memchr v2.2.0
Downloaded aho-corasick v0.6.10
Downloaded serde v0.8.23
Downloaded regex-syntax v0.6.7
Downloaded wincolor v1.0.1
Downloaded pkg-config v0.3.14
Downloaded autocfg v0.1.4
Downloaded quick-error v1.2.2
Downloaded crossbeam-utils v0.2.2
Downloaded crossbeam-epoch v0.3.1
Downloaded bitflags v1.1.0
Downloaded sha1 v0.6.0
Downloaded base64 v0.10.1
Downloaded sha2 v0.8.0
Downloaded matches v0.1.8
Downloaded percent-encoding v1.0.1
Downloaded uuid v0.7.4
Downloaded fallible-iterator v0.1.6
Downloaded parking_lot_core v0.5.0
Downloaded lock_api v0.2.0
Downloaded rand v0.6.5
Downloaded http v0.1.17
Downloaded native-tls v0.2.3
Downloaded tokio-threadpool v0.1.14
Downloaded tokio-sync v0.1.6
Downloaded miow v0.2.1
Downloaded parking_lot v0.7.1
Downloaded bindgen v0.47.3
Downloaded num-traits v0.1.43
Downloaded rustc-demangle v0.1.15
Downloaded tokio-udp v0.1.3
Downloaded num_cpus v1.10.1
Downloaded backtrace-sys v0.1.28
Downloaded kernel32-sys v0.2.2
Downloaded num-bigint v0.2.2
Downloaded idna v0.1.5
Downloaded cookie v0.12.0
Downloaded mime v0.3.13
Downloaded hyper-tls v0.3.2
Downloaded tokio-tcp v0.1.3
Downloaded tokio-current-thread v0.1.6
Downloaded utf8-ranges v1.0.3
Downloaded linked-hash-map v0.3.0
Downloaded rustc_version v0.2.3
Downloaded linked-hash-map v0.5.2
Downloaded crc v1.8.1
Downloaded vcpkg v0.2.6
Downloaded tokio-fs v0.1.6
Downloaded postgres-protocol v0.3.2
Downloaded cookie_store v0.7.0
Downloaded hyper v0.12.29
Downloaded tokio-trace-core v0.2.0
Downloaded tokio-codec v0.1.1
Downloaded checked v0.5.0
Downloaded miniz_oxide v0.2.1
Downloaded serde_urlencoded v0.5.5
Downloaded slab v0.4.2
Downloaded winapi v0.2.8
Downloaded atoi v0.2.4
Downloaded socket2 v0.3.9
Downloaded crossbeam-utils v0.6.5
Downloaded thread_local v0.3.6
Downloaded lazy_static v0.2.11
Downloaded mime_guess v2.0.0-alpha.6
Downloaded encoding_rs v0.8.17
Downloaded unicode-xid v0.1.0
Downloaded ucd-util v0.1.3
Downloaded serde_test v0.8.23
Downloaded arrayvec v0.4.10
Downloaded winapi-util v0.1.2
Downloaded scopeguard v0.3.3
Downloaded nodrop v0.1.13
Downloaded rand_hc v0.1.0
Downloaded rand_core v0.4.0
Downloaded scheduled-thread-pool v0.2.1
Downloaded rand_os v0.1.3
Downloaded rand_pcg v0.1.2
Downloaded fake-simd v0.1.2
Downloaded digest v0.8.0
Downloaded net2 v0.2.33
Downloaded unicode-bidi v0.3.4
Downloaded hmac v0.5.0
Downloaded rand v0.3.23
Downloaded md5 v0.3.8
Downloaded winapi-build v0.1.1
Downloaded memchr v1.0.2
Downloaded scopeguard v1.0.0
Downloaded hex v0.2.0
Downloaded try_from v0.3.2
Downloaded futures-cpupool v0.1.8
Downloaded tokio-buf v0.1.1
Downloaded want v0.0.6
Downloaded crossbeam-queue v0.1.2
Downloaded lock_api v0.1.5
Downloaded unicase v2.4.0
Downloaded schannel v0.1.15
Downloaded cexpr v0.3.5
Downloaded clang-sys v0.26.4
Downloaded hashbrown v0.1.8
Downloaded peeking_take_while v0.1.2
Downloaded build_const v0.2.1
Downloaded rand_chacha v0.1.1
Downloaded rand_isaac v0.1.1
Downloaded opaque-debug v0.2.2
Downloaded unicode-normalization v0.1.8
Downloaded sha2 v0.7.1
Downloaded stringprep v0.1.2
Downloaded publicsuffix v1.5.2
Downloaded http-body v0.1.0
Downloaded parking_lot_core v0.4.0
Downloaded phf_codegen v0.7.24
Downloaded httparse v1.3.3
Downloaded clap v2.33.0
Downloaded adler32 v1.0.3
Downloaded which v2.0.1
Downloaded memoffset v0.2.1
Downloaded rand_jitter v0.1.4
Downloaded semver v0.9.0
Downloaded h2 v0.1.23
Downloaded base64 v0.6.0
Downloaded crossbeam-deque v0.7.1
Downloaded ws2_32-sys v0.2.1
Downloaded env_logger v0.6.1
Downloaded dtoa v0.4.4
Downloaded failure v0.1.5
Downloaded phf v0.7.24
Downloaded unicase v1.4.2
Downloaded rand_xorshift v0.1.1
Downloaded generic-array v0.9.0
Downloaded block-buffer v0.7.3
Downloaded generic-array v0.12.0
Downloaded rand_core v0.3.1
Downloaded digest v0.7.6
Downloaded rand v0.4.6
Downloaded semver-parser v0.7.0
Downloaded indexmap v1.0.2
Downloaded phf_shared v0.7.24
Downloaded owning_ref v0.4.0
Downloaded strsim v0.8.0
Downloaded unicode-width v0.1.5
Downloaded string v0.2.0
Downloaded safemem v0.2.0
Downloaded crypto-mac v0.5.2
Downloaded failure_derive v0.1.5
Downloaded byte-tools v0.2.0
Downloaded phf_generator v0.7.24
Downloaded libloading v0.5.1
Downloaded crossbeam-epoch v0.7.1
Downloaded try-lock v0.2.2
Downloaded textwrap v0.11.0
Downloaded block-buffer v0.3.3
Downloaded block-padding v0.1.4
Downloaded vec_map v0.8.1
Downloaded siphasher v0.2.3
Downloaded constant_time_eq v0.1.3
Downloaded byte-tools v0.3.1
Downloaded typenum v1.10.0
Downloaded synstructure v0.10.2
Downloaded stable_deref_trait v1.1.1
Downloaded arrayref v0.3.5
Compiling arrayvec v0.4.10
Compiling nodrop v0.1.13
Compiling libc v0.2.58
Compiling cfg-if v0.1.9
Compiling lazy_static v1.3.0
Compiling memoffset v0.2.1
Compiling scopeguard v0.3.3
Compiling rayon-core v1.4.1
Compiling rayon v1.0.3
Compiling proc-macro2 v0.4.30
Compiling either v1.5.2
error: linker `link.exe` not found
|
= note: The system cannot find the file specified. (os error 2)

note: the msvc targets depend on the msvc linker but `link.exe` was not found

note: please ensure that VS 2013, VS 2015 or VS 2017 was installed with the Visual C++ option

error: aborting due to previous error

error: Could not compile `rayon-core`.
warning: build failed, waiting for other jobs to finish...
error: linker `link.exe` not found
|
= note: The system cannot find the file specified. (os error 2)

note: the msvc targets depend on the msvc linker but `link.exe` was not found

note: please ensure that VS 2013, VS 2015 or VS 2017 was installed with the Visual C++ option

error: aborting due to previous error

error: Could not compile `libc`.
warning: build failed, waiting for other jobs to finish...
error: linker `link.exe` not found
|
= note: The system cannot find the file specified. (os error 2)

note: the msvc targets depend on the msvc linker but `link.exe` was not found

note: please ensure that VS 2013, VS 2015 or VS 2017 was installed with the Visual C++ option

error: aborting due to previous error

error: Could not compile `arrayvec`.
warning: build failed, waiting for other jobs to finish...
error: linker `link.exe` not found
|
= note: The system cannot find the file specified. (os error 2)

note: the msvc targets depend on the msvc linker but `link.exe` was not found

note: please ensure that VS 2013, VS 2015 or VS 2017 was installed with the Visual C++ option

error: aborting due to previous error

error: Could not compile `rayon`.
warning: build failed, waiting for other jobs to finish...
error: linker `link.exe` not found
|
= note: The system cannot find the file specified. (os error 2)

note: the msvc targets depend on the msvc linker but `link.exe` was not found

note: please ensure that VS 2013, VS 2015 or VS 2017 was installed with the Visual C++ option

error: aborting due to previous error

error: Could not compile `proc-macro2`.

To learn more, run the command again with --verbose.

C:\Softwares\CanvasDataLoader\canvas-data-loader>RUST_LOG=info ./target/release/cdl-runner
'RUST_LOG' is not recognized as an internal or external command,
operable program or batch file.

Were VS 2013, VS 2015, or VS 2017 installed with the Visual C++ option?

Make sure those applications are installed.


Thanks, Tommy Reyes.

Now that I have VS applications installed, I am able to venture further to get this new error.

It says something about setting up some variables, but I don't know how. Am I missing some other installation that I am supposed to have?

[2019-06-12T19:35:34Z INFO  cargo::core::compiler::job_queue] start: librocksdb-sys v5.18.3 => Target(script)/Profile() => Host
   Compiling librocksdb-sys v5.18.3
[2019-06-12T19:35:34Z INFO  cargo::core::compiler::job_queue] start: cookie_store v0.7.0 => Target(lib)/Profile(release) => Host
[2019-06-12T19:35:34Z INFO  cargo::core::compiler::job_queue] end: cookie_store v0.7.0 => Target(lib)/Profile(release) => Host
[2019-06-12T19:35:34Z INFO  cargo::core::compiler::job_queue] start: reqwest v0.9.18 => Target(lib)/Profile(release) => Host
[2019-06-12T19:35:34Z INFO  cargo::core::compiler::job_queue] end: reqwest v0.9.18 => Target(lib)/Profile(release) => Host
[2019-06-12T19:35:34Z INFO  cargo::core::compiler::job_queue] end: librocksdb-sys v5.18.3 => Target(script)/Profile() => Host
error: failed to run custom build command for `librocksdb-sys v5.18.3`
process didn't exit successfully: `C:\Softwares\CanvasDataLoader\canvas-data-loader\target\release\build\librocksdb-sys-10a20a9d59bb2f10\build-script-build` (exit code: 101)
--- stdout
cargo:rerun-if-changed=build.rs
cargo:rerun-if-changed=rocksdb/
cargo:rerun-if-changed=snappy/
cargo:rerun-if-changed=lz4/
cargo:rerun-if-changed=zstd/
cargo:rerun-if-changed=zlib/
cargo:rerun-if-changed=bzip2/
--- stderr
thread 'main' panicked at 'Unable to find libclang: "couldn\'t find any valid shared libraries matching: [\'clang.dll\', \'libclang.dll\'], set the `LIBCLANG_PATH` environment variable to a path where one of these files can be found (invalid: [])"', src\libcore\result.rs:997:5
note: Run with `RUST_BACKTRACE=1` environment variable to display a backtrace.
C:\Softwares\CanvasDataLoader\canvas-data-loader>

It's looking for the Clang libraries. You will need to install Clang.
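For example (this is an assumption on my part, not something confirmed in this thread), installing LLVM for Windows and then pointing the LIBCLANG_PATH variable named in the error at its bin directory should satisfy the build, e.g. in cmd.exe:

set LIBCLANG_PATH=C:\Program Files\LLVM\bin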

Regards,

Tommy Reyes

Section Head - Application Development

Information Technology Services

Far Eastern University

Nicanor Reyes Street, Sampaloc, Manila, Philippines 1015

+632 8494000 or 7777338 Loc. 601

Email: treyes@feu.edu.ph


If you are running from a Windows Command Prompt, you have to be in the canvas-data-loader folder. From there you run the command using the path to cdl-runner.exe. On Windows you don't use the RUST_LOG=info prefix.

I posted my question here: Error importing data to Mysql but the gist is that the data loader hangs after connecting to the Mysql database and then produces the shown error.
