
Access your training data in the business intelligence tool of your choice, allowing you to share training data with the rest of the organization.

About

We know your Learndot data is important to you for reporting, analytics, and much more. Our Data Pipeline App lets you access your data in several ways and define how you use it. To make our data structure and relationships easier to understand, we simplify our data model into a Logical Data Model. All data listed as part of the Logical Data Model is included with the Data Pipeline App.

Data is populated and refreshed through batch operations performed at intervals. NOTE: Our Data Pipeline App does not provide real-time data updates, only interval batch updates, which happen every 4 hours. For a shorter window, please reach out to us via our success portal. If you are looking for real-time data access, consider using our Learndot API.
Additional cost is associated with this App; please contact us via our success portal for details, or reach out to us at support@learndot.com.

App Aim

It is important to first understand the purpose of the integration and what problem it will solve. The purpose of this connector is to:

  • Allow Administrators to collect their Learndot data into their own personal database for running internal queries or generating reports.

Setting up Data Pipeline App

To set up the Data Pipeline App:

  1. Choose a Database type. Our Data Pipeline App can be connected to a variety of destinations, as listed below: 
    1. Customer-Hosted database
      You host a MySQL database and provide ServiceRocket read-write access to it. ServiceRocket will provide a script to create the necessary tables and columns in the database to match what is listed in the Logical Data Model. The full Logical Data Model will be available with regular data refreshes.

    2. Custom
      Want something different? Contact our Learndot Support Team to ask about connecting the Data Pipeline App to additional destinations or receiving data feeds in different formats.
  2. Send an email to Learndot Support requesting to be onboarded onto the Data Pipeline App, along with the type of database (Customer-Hosted or Custom) you choose to use.
  3. If you choose to use your own database, Support will reply with the Data Pipeline App server IP address, which your database administrator has to whitelist, and a database-creation SQL statement, which they have to run to create the database.
  4. After creating the database and whitelisting the server IP address, reply to Learndot Support with the database username, password, and URL (the path of the database).
  5. Our Learndot team will perform the initial push. Synchronization will then take place with regular data refreshes. 
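For a Customer-Hosted setup, steps 3–4 amount to database-side preparation by your DBA. The sketch below builds the kind of MySQL statements involved; every name in it is a placeholder, and the real creation script and server IP address come from Learndot Support:

```python
# Sketch of the DBA-side preparation for a Customer-Hosted destination.
# All values below are placeholders, NOT the real ones supplied by Support.

PIPELINE_IP = "203.0.113.10"      # example IP from the documentation range
DB_NAME = "learndot_pipeline"     # placeholder database name
DB_USER = "pipeline_user"         # placeholder account for the Data Pipeline App

def preparation_statements(ip: str, db: str, user: str, password: str) -> list[str]:
    """Build illustrative MySQL statements: create the database, create an
    account restricted to the whitelisted pipeline IP, and grant the
    read-write access the App needs to push data into the tables."""
    return [
        f"CREATE DATABASE IF NOT EXISTS {db};",
        f"CREATE USER '{user}'@'{ip}' IDENTIFIED BY '{password}';",
        f"GRANT SELECT, INSERT, UPDATE ON {db}.* TO '{user}'@'{ip}';",
    ]
```

Restricting the account to the pipeline's IP (`'user'@'ip'`) mirrors the whitelisting in step 3: only the Data Pipeline App server can authenticate with these credentials. Firewall rules for the same IP are handled outside MySQL.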

Logical Data Model

Our Logical Data Model is a simplified model of our underlying Learndot database. Click here to learn more about our Data Model. Feel free to reach out to our Learndot Support Team with questions or clarifications about any of the data attributes.

Record Deletion

Deletion is not supported at the moment. Therefore, if a record is deleted from Learndot, it will not be deleted from the destination database.
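Because deletions do not propagate, records removed in Learndot linger in the destination database. If you need to reconcile the two, one approach (a sketch, assuming you can obtain the current set of record IDs from Learndot, for example via the Learndot API) is a simple set difference:

```python
def orphaned_ids(source_ids, destination_ids):
    """Return IDs present in the destination database but no longer in
    Learndot; these correspond to records deleted upstream that the
    pipeline's batch refreshes left behind."""
    return set(destination_ids) - set(source_ids)

# Example: record 4 was deleted in Learndot but still exists downstream.
orphaned_ids([1, 2, 3], [1, 2, 3, 4])  # -> {4}
```

What to do with the orphans (delete, archive, or flag) is up to your own reporting requirements.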

Feature Implementation

Any custom feature request will be considered and added to the roadmap for implementation if deemed viable. 


Got a suggestion for a new app or integration? Contact our friendly Support Team.

Key Resources

Quick links to additional resources for Data Pipeline App:

  1. Logical Data Model for Data Pipeline App
  2. Release Notes for Data Pipeline
  3. ERD Diagram