Hello,
I was poking around the DAP client documentation and noticed that there's pretty robust documentation for creating a dialect plugin.
Has anyone in the community developed an Oracle or Microsoft SQL dialect plugin for the DAP client yet?
The CD2 transition has snuck up on us, and being able to integrate directly with an Oracle or MS SQL database would save our organization a lot of effort and resources.
Hi-
I was looking at the documentation yesterday for creating database plugins (https://data-access-platform-api.s3.amazonaws.com/client/plugins.html) and I followed most of their instructions.
It looks like a lot of the heavy lifting is database-specific, and there are a lot of lines of MySQL- and Postgres-specific code. It looks tedious.
Instructure has stated in other posts here that they are working on a new version of DAP which will have a framework that is easier to write plugins for.
I am working on coming up with an Oracle solution but I'm a bit stalled at the moment. In the meantime I'm using Postgres.
I would love to know what solution you end up with!
Jason
That's correct.
The upcoming milestone of the DAP client library optimizes the data processing path and replaces SQLAlchemy with more direct function calls to database client libraries, which translates to improved insert/update/delete performance (up to 5x faster processing in our measurements). The new solution comes with PostgreSQL, Microsoft SQL Server and MySQL support out of the box, and requires far fewer lines of code to integrate other engines (e.g. Oracle).
Any sort of ballpark on how far out you think something like that would be?
Just wondering if there's any update on this for Microsoft SQL Server, or timeline?
The open-source library we use in DAP client library version 1.0 and later has code for synchronizing schema and data with PostgreSQL, MySQL, Oracle and Microsoft SQL Server databases. However, before any dialect becomes supported in the official DAP client library, it has to go through rigorous testing to uncover any potential integration challenges. Unfortunately, I can't yet share a release date for when (and if) these additional dialects will become available and supported in the DAP client library. (However, I am happy to raise this question with our product team.)
You may try:
After downloading the data, you need to create and configure a .ctl (control file) for use with the Oracle SQL*Loader, which allows you to load data from external files into tables in the Oracle database.
A control file (.ctl) is a text file that contains the instructions for SQL*Loader about how to map the input data to the database columns and various loading options.
Example of a .ctl file content:
LOAD DATA
INFILE 'example.tsv'
INTO TABLE your_table_name
-- X'09' is the tab character, for tab-separated (.tsv) input
FIELDS TERMINATED BY X'09' TRAILING NULLCOLS
(
column1,
column2,
column3 DATE "YYYY-MM-DD",
column4
)
Then use the loader:
sqlldr userid=your_username/your_password@your_database control=your_control_file.ctl log=load.log
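If it helps to automate this, the decompress-and-load steps could be scripted roughly as in the sketch below. The directory, control file, table and credentials are placeholders to adapt to your environment; this is just an illustration, not something from the DAP documentation.

import gzip
import shutil
import subprocess
from pathlib import Path

# Placeholders -- adjust to your own environment.
DOWNLOAD_DIR = Path("downloads/accounts")   # where the DAP client wrote the *.gz parts
CONTROL_FILE = "your_control_file.ctl"      # a control file like the example above
CONNECT = "your_username/your_password@your_database"

for gz_file in sorted(DOWNLOAD_DIR.glob("*.gz")):
    tsv_file = gz_file.with_suffix("")      # e.g. part-0000.tsv.gz -> part-0000.tsv
    with gzip.open(gz_file, "rb") as src, open(tsv_file, "wb") as dst:
        shutil.copyfileobj(src, dst)

    # The data= parameter overrides the INFILE named in the control file,
    # so one .ctl can be reused for every downloaded part.
    subprocess.run(
        [
            "sqlldr",
            f"userid={CONNECT}",
            f"control={CONTROL_FILE}",
            f"data={tsv_file}",
            f"log={tsv_file.with_suffix('.log')}",
        ],
        check=True,
    )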
Has anyone found a solution to using an MS SQL Server dialect yet? Or a workaround? I can't figure out how to automatically unpack the compressed files (like the original CLI did) and name them appropriately so that I can even use the BULK INSERT command on TXT or CSV files. I'd love to connect with anyone who may have solved the issue.
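For anyone hitting the same wall, the unpack-and-rename step could be scripted roughly like the sketch below, which also prints one BULK INSERT statement per file to run via sqlcmd or SSMS. The table name, paths and delimiters here are placeholders and assumptions, not anything confirmed by the DAP docs.

import gzip
import shutil
from pathlib import Path

# Placeholders -- adjust the table name and paths for your own environment.
DOWNLOAD_DIR = Path(r"C:\dap\downloads\accounts")   # where the compressed parts landed
TARGET_TABLE = "dbo.accounts"

for index, gz_file in enumerate(sorted(DOWNLOAD_DIR.glob("*.gz"))):
    # Give each decompressed part a predictable name, e.g. accounts_part_0000.txt
    txt_file = DOWNLOAD_DIR / f"accounts_part_{index:04d}.txt"
    with gzip.open(gz_file, "rb") as src, open(txt_file, "wb") as dst:
        shutil.copyfileobj(src, dst)

    # Assumes tab-separated data with a header row (hence FIRSTROW = 2);
    # run the printed statements with sqlcmd or SSMS.
    print(
        f"BULK INSERT {TARGET_TABLE} "
        f"FROM '{txt_file}' "
        "WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '0x0a', FIRSTROW = 2);"
    )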
Not sure if there are any enterprising folks out there willing to share if they got an MSSQL connection working?
@kmboltonblevins Adding MS SQL support is on our roadmap, but unfortunately I can't specify a release date yet.
1. Which version of MS SQL are you using or hope to get supported?
2. Are you using on-prem MS SQL or Azure?
We have an on-prem SQL Server 2019.
I'm happy to follow up on this forum topic: we have released DAP CLI 1.3.0, which adds support for MS SQL.
Please check it out and let us know how it works for you!