Where’s my learning data?
The push to SAP’s SaaS Learning module has been successful for nearly everyone, but it does come with some downsides. By design (and to support stability of the platform), the SAP suite makes bulk exports of large amounts of data difficult, and in some cases impossible. There are very common and practical reasons for this in a shared computing environment like the SAP cloud, but knowing the reasons doesn’t solve the challenges.
Out-of-box solutions provide options, but they all have their drawbacks:
- The APIs allow users to retrieve data in a manner designed for integration (one or two records at a time, in response to specific API “questions”), not for bulk export. They are designed to answer questions like “show me this student’s learning plan,” not questions like “show me all the students who have this item on their learning plan.”
- Some presently available tools that would otherwise be candidates for bulk export, like Integration Center, do not yet support Learning data. While adding this data source is on the roadmap, the date is not yet firm.
- Some of the previous “cheat” methods, like using a custom report, are fraught with risk, as SAP will be sunsetting the PRD tool sometime soon. Even if you ignore the risk of the tool’s removal, reports can be clunky, slow, and fragile.
All of this makes it extraordinarily difficult to feed a hungry business intelligence tool, drive integration events from Learning, or simply get a large portion of the data out in any meaningful way not presently supported by the SAP Learning API.
Enter data services
SAP has a separate offering that can satisfy the need for large-scale data output from the Learning module. It’s called data services. Rather than trying to feed the data through a small pipe (like the APIs), data services are flat file exports of the table deltas from the Learning module data model. The data is delivered via Secure File Transfer Protocol (SFTP). This allows far more flexibility with the delivered size of the data and creates significantly less strain on the Learning module.
While you get access to a significantly larger volume of data here, the trade-off is that the client side is responsible for more of the effort. There needs to be a place for the data to land, and the client must build and maintain the process to consume the data in a meaningful manner. There’s no “pre-digestion” of the data: you get raw flat export files that you must manage yourself. With this responsibility, though, comes ultimate flexibility. Because you manage everything about the incoming data, you can shape it however you need to fit your consumption needs.
Important bits about Learning data services:
- The tables are bundled in three table groupings (SAP calls them “packages”). Package A is the most commonly used set of tables, Package B offers the somewhat less commonly used tables, and Package C the least commonly used. Pricing is based on the packages selected.
- This is an out-of-box offering from SAP. There is no real customization possible to the offering except the ability to select the table packages.
What do I get?
Once you’ve signed up, you will receive these files via SFTP delivery:
- A full extract of all requested tables in your selected packages. This will allow you to “prime the pump” on your receiving side to have a set of base data to work with.
- A set of delta files (typically delivered nightly) that are an extract of the data rows produced for each table in your selected packages over the course of the delta period.
- A set of key files (also typically delivered nightly) containing just the columns comprising the primary key of each table in your packages. These allow you to detect deletions, because delta files do not include action flags, so deletions would not otherwise be visible.
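As a concrete illustration, deletion detection with a key file can be as simple as a set difference. This is a hedged sketch: the file layout and column names below are illustrative, not the actual SAP delivery format.

```python
import csv

def load_keys(path, key_columns):
    """Read the primary-key columns from a delimited key file into a set of tuples."""
    with open(path, newline="", encoding="utf-8") as f:
        return {tuple(row[c] for c in key_columns) for row in csv.DictReader(f)}

def find_deletions(local_keys, key_file_keys):
    """Keys present locally but absent from the key file were deleted at source."""
    return local_keys - key_file_keys
```

On each run you would load the nightly key file, diff it against the keys held in your local copy of that table, and remove the resulting rows from your destination.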
How does it work?
SAP will conduct a mini-project with you to ensure that the data is delivered properly and that your consumption process is working. While they do not help with the development of the receipt side of the process, they are available to answer questions about the delivery aspects. Once you have signed off on their delivery you are ready to go into production:
- You will request a full extract of the tables you have paid for by raising a ticket with SAP support.
- Your SAP project team will enable nightly delta and key files for your feed from your production instance.
- Your data receipt process picks up the files from the SFTP delivery site and consumes them.
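A minimal sketch of that last step, assuming the files have already been mirrored from the SFTP site to a local landing directory. The date-stamped naming convention, the key column, and the in-memory store are all illustrative assumptions, not the actual delivery format.

```python
import csv
from pathlib import Path

def pending_deltas(landing_dir, table):
    """List delta files for a table, oldest first (date-stamped names sort lexically)."""
    return sorted(Path(landing_dir).glob(f"{table}_delta_*.csv"))

def apply_delta(path, store, key_column="ID"):
    """Upsert each delta row into a store keyed by the given column."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            store[row[key_column]] = row
```

A real consumer would write to a staging database rather than a dict, and would archive processed files rather than deleting them immediately so the feed can be replayed if something goes wrong.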
Why should I worry?
As always, there are risks to consider for any integration:
- Deltas can get out of sync with your main data destination. This can be remedied in different ways depending on your consumption process and its implementation, though you would be wise to retain the delta files for a week or two so you can catch up if necessary. The emergency fix is to raise another support ticket for a full extract, which lets you reset your baseline.
- Remember, your data will be delivered exactly as it is in the database. If your data contains special characters or carriage returns, they will be present in the extracts. Be sure your consumption process can handle them (and has robust error handling in general).
- In that same vein, full extracts are only delivered on demand via ticket request. They are typically an exception (or at least a very occasional regular occurrence) and usually need to be managed differently than your standard delta files.
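To make the special-character point concrete, here is a sketch of a defensive reader. It assumes a quoted, comma-delimited extract (the actual delivery format may differ); Python’s csv module copes with delimiters and line breaks embedded inside quoted fields, and rows with an unexpected column count are counted and skipped rather than crashing the load.

```python
import csv

def read_extract(path, expected_columns):
    """Parse a quoted flat-file extract; return (good_rows, skipped_count)."""
    good, skipped = [], 0
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.reader(f):
            if len(row) != expected_columns:
                skipped += 1  # malformed row: log it in a real pipeline
            else:
                good.append(row)
    return good, skipped
```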
Wrangling the large amount of data your Learning module produces can be difficult with the normal out-of-box tools. Using the Learning data services offering can be a reasonable answer to data-starved integrations, reporting tools, and other interfaces.
Based on my experience, working with a preferred and experienced vendor will help make your transition to Learning data services much easier and will help you to avoid some of the pitfalls. If you are interested in learning more about Learning data services or have more questions, visit https://www.gpstrategies.com/ or leave a comment below.