After pulling the data with the Canvas Data CLI, we use Talend Open Studio to push it into Oracle and use Tableau as the reporting engine.
At this point the data is relatively small, but as we move from pilot to production we anticipate much larger data sets coming our way.
I am curious about the workflows at other institutions, for example:
How do you make the data ready for reporting?
How often do you refresh the data?
Do you refresh it from scratch each time, or capture only the incremental changes?
How do you scale your infrastructure with the size of the data?
Do you remodel the data (i.e., stage the raw data and then model it for reporting needs), or work with the model Canvas provides?
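To make the full-refresh vs. incremental question concrete, here is a minimal sketch of the incremental (upsert-style) approach, using in-memory rows and a hypothetical `id` primary key; a real pipeline would do the equivalent with a staging table and a MERGE statement in Oracle:

```python
def incremental_merge(existing, updates, key="id"):
    """Merge updated rows into existing rows by primary key (upsert).

    Rows in `updates` whose key already exists replace the old row;
    rows with a new key are appended. This mimics capturing only the
    changed data instead of reloading everything from the beginning.
    """
    merged = {row[key]: row for row in existing}
    for row in updates:
        merged[row[key]] = row  # existing key -> update, new key -> insert
    return list(merged.values())

# Hypothetical sample data for illustration
existing = [{"id": 1, "score": 80}, {"id": 2, "score": 75}]
updates = [{"id": 2, "score": 90}, {"id": 3, "score": 60}]
print(incremental_merge(existing, updates))
# -> [{'id': 1, 'score': 80}, {'id': 2, 'score': 90}, {'id': 3, 'score': 60}]
```

The trade-off: a full refresh is simpler and self-healing but grows linearly with the warehouse, while incremental capture keeps each load proportional to the change volume, which matters once you scale past the pilot.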