Ariba Analytics using SAP Analytics Cloud, Data Intelligence Cloud and HANA DocStore – Part 3

This is Part Three of a blog series on Ariba Analytics using SAP Analytics Cloud, Data Intelligence Cloud and HANA DocStore. If you would like to start with Part One, please click here.

SAP Analytics Cloud makes it easy for businesses to understand their data through its stories, dashboards and analytical applications. Our worked example uses SAP Ariba data to create an SAC Story that shows how much spend has been approved within Ariba Requisitions created in the last thirty days.

A simple SAC Story tracking Approved Requisitions

In Part One of our blog series we discussed how we can retrieve data from SAP Ariba’s APIs using SAP Data Intelligence Cloud. We stored this data as JSON Documents in the SAP HANA Document Store.

In Part Two of our blog series we built SQL and Calculation Views on top of our JSON Document Collection.

In this blog post we’ll use the Calculation View in SAP Analytics Cloud as a Live Data Model, which will provide the data to our SAP Analytics Cloud Story.

Viewing our HANA DocStore Collection data in an SAP Analytics Cloud Story

Before we can consume our Calculation View in SAP Analytics Cloud, we’ll need to connect SAC to our HDI Container. We can do this from within SAC itself.

Click on Connections

Click on Add Connection

Select SAP HANA under Connect to Live Data

Next, we’ll have to enter the host and credentials for our HDI Container. If you’re not sure where to find these, refer to Part One of this blog series, where we retrieved them for Data Intelligence Cloud (under the heading Creating Our Connections in Data Intelligence).

Choose HANA Cloud as the Connection Type, enter HDI details then click OK

Our HANA Cloud Connection has been created and now we’re ready to start creating in SAC.

Within SAP Analytics Cloud we’re going to use a Live Data Model to access our Calculation View in real time. This means that the data in our Document Store Collection will be available immediately after our Data Intelligence Pipeline updates it.

Another benefit of using a Live Data Model compared to creating a Model on Acquired Data is that data doesn’t need to be copied to SAP Analytics Cloud for use.
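Behind the scenes, a live connection means SAC pushes its queries down to HANA rather than copying the data up. As a rough sketch, the aggregation our Story needs boils down to a statement like the one built below. The schema and view names here are hypothetical placeholders – substitute the ones from your own HDI Container from Part Two.

```python
# Sketch only: "ARIBA_HDI" and "V_APPROVED_REQUISITIONS" are hypothetical
# placeholder names -- use your own HDI container schema and Calculation
# View name from Part Two.
def build_view_query(schema: str, view: str) -> str:
    """Build SQL that totals ReportingAmount per ReportingCurrency.

    The currency is left as a bind parameter (?) so it can be supplied
    safely at execution time rather than interpolated into the statement.
    """
    return (
        f'SELECT "ReportingCurrency", SUM("ReportingAmount") AS "TOTAL" '
        f'FROM "{schema}"."{view}" '
        'WHERE "ReportingCurrency" = ? '
        'GROUP BY "ReportingCurrency"'
    )

sql = build_view_query("ARIBA_HDI", "V_APPROVED_REQUISITIONS")
print(sql)
```

Executing it would require a connection to the HDI Container (for example with the hdbcli Python driver, passing the currency as the bind value), which is why this sketch only builds the statement.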

Click on Modeler, then Live Data Model

Select SAP HANA as the System Type, our AribaHDI Connection then click on the Input Help for Data Source

Click on our Calculation View

Click OK

Now we’re looking at the Live Data Model in the SAP Analytics Cloud Modeler. We can see our Calculated Measure, ReportingAmount.

Viewing the Measures for our Live Model

We can also check the Live Model’s Dimensions

Click on All Dimensions

Our Dimensions are all here

Click on Save

Enter a Name and Description then click Save

Now that we’ve got our Live Data Model, we’re ready to create our Story and visualize our Ariba Data

Stories within SAP Analytics Cloud let us visualize data in a number of ways, including charts, tables and images.

Click on Stories

Within a Story there are a number of different Page Types available. For our example we’re going to add a Responsive Page. A Responsive Page allows you to create layouts that resize and adapt when viewed on different screen sizes.

Click on Responsive Page

Leave Optimized Design Experience selected and click Create

First we’re going to give our Page a title – for example: Approved Requisitions (Past 30 Days)

Double Click then give your Page a title

Next, we’re going to attach the Live Data Model we created to our Story

Click on Add New Data

Click on Data from an existing dataset or model

Click on Select other model

Click on our Live Data Model

Now we’re able to use the data from our Live Data Model in our Story

Click on Chart in the Insert Toolbar

Click on our Chart, then on the Chart Type dropdown and select the Numeric Point

Now it’s time to give our chart some data

Click on Add Measure under Primary Values

Click the checkbox next to ReportingAmount

Now we can see a sum of all of our Approved Requisitions. However, we may (or rather, we probably will) have Requisitions in more than one currency. To separate these Requisitions we’ll need to use Chart Filters.

Click on our Numeric Point again to exit the Measure Selection

Click on Add Filters

Select the Column ReportingCurrency

From here we’ll be able to select which values of ReportingCurrency we’d like to see reflected in our ReportingAmount total. Given that it doesn’t make sense to sum totals in different currencies without first converting them, we’re going to select only a single currency.

The data in your system may have different currencies than mine, so feel free to adjust accordingly.
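To see why the per-currency filter matters, here is a minimal sketch (using invented sample data) of totalling ReportingAmount per ReportingCurrency. A single blind sum over all rows would add dollars to euros, which is exactly the meaningless figure our Chart Filter avoids.

```python
from collections import defaultdict

# Invented sample rows, shaped like the Dimensions/Measures of our
# Live Data Model -- purely illustrative.
requisitions = [
    {"ReportingCurrency": "USD", "ReportingAmount": 1200.0},
    {"ReportingCurrency": "USD", "ReportingAmount": 350.5},
    {"ReportingCurrency": "EUR", "ReportingAmount": 980.0},
]

# One running total per currency, never mixing them.
totals = defaultdict(float)
for row in requisitions:
    totals[row["ReportingCurrency"]] += row["ReportingAmount"]

print(dict(totals))  # {'USD': 1550.5, 'EUR': 980.0}
```

Each Numeric Point in our Story corresponds to one of these per-currency totals.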

Select your currency, unselect Allow viewer to modify selections and click OK

We now have our total ReportingAmount for our first currency

While it’s great to have our total, the average end user is not going to know what ReportingAmount means. It’s time to give our Numeric Point a meaningful label.

Click on Primary Value Labels under Show/Hide to turn off the lower ReportingAmount label

Double click on the Chart Title to edit

Type your Chart Title and use the Styling panel as desired

We can adjust the size of our Chart using the arrow in the corner

At this point, we have our Numeric Point set up and ready to go

Our finished Numeric Point

If your system only has the one currency, you can stop here. If your system has more than one currency, you can duplicate the Numeric Point, then change the Chart Filter and Chart Title using the same steps we just followed.

Click on Copy, then click Duplicate

Once we’ve finished with our currencies, it’s time to save

Click on Save

Enter a Name and Description then click on OK

Our Story is now finished and ready to be viewed.

Now we’ve created our Story, but we don’t want it to just sit inside our Files on SAP Analytics Cloud – we want people to use it. Let’s share our Story with our colleagues.

Click on Files

Click on the checkbox next to our Story, then click on Share under the Share menu

Click on Add Users or Teams

Select users you’d like to share the Story with then click OK

Click the checkbox to notify the users by email, then click Share

If we’d like to change the Access Type, we can do that here

Now we’ve shared our Story with users and decided what kind of access we’d like them to have. This isn’t the only type of sharing available in SAP Analytics Cloud – for example, you can learn about publishing to the Analytics Catalog here, and read about your other options in the Help Documentation.

The Analytics Catalog is a single access point for SAP Analytics Cloud content vetted by your own Content Creators

Now that our setup work is done and the users can view our SAP Analytics Cloud Story, we want to make sure the underlying data in our SAP HANA Document Store Collection is kept up to date on a regular basis.

Since our SAP Data Intelligence Pipeline is responsible for truncating the data and supplying a new batch of data, we want to schedule it to run automatically. We can do this from Data Intelligence Cloud itself.

Click on Monitoring

Click on Create Schedule

Write a description for our Schedule then choose our Pipeline under Graph Name

Choose how often our Pipeline will be run

Our Schedule has been created

When we create our first Schedule, we’ll see My Scheduler Status: Inactive. We don’t need to worry – our Scheduler’s Status is actually Active. To see it, we can click on Refresh.

Click Refresh

Our Scheduler Status is Active
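Whatever recurrence you pick, the Scheduler’s job boils down to answering “when is the next run?”. As a small illustration (this is not Data Intelligence’s actual implementation, just the idea), resolving a daily 02:00 schedule with the Python standard library looks like this:

```python
from datetime import datetime, timedelta

def next_daily_run(now: datetime, hour: int, minute: int = 0) -> datetime:
    """Next occurrence of a daily schedule, e.g. 02:00 every day
    (a cron-style '0 2 * * *'). Illustrative sketch only."""
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        # Today's slot has already passed, so run tomorrow.
        candidate += timedelta(days=1)
    return candidate

now = datetime(2023, 5, 1, 14, 30)
print(next_daily_run(now, hour=2))  # 2023-05-02 02:00:00
```

A nightly run like this works well here because the Story reports on a trailing thirty-day window rather than on intraday changes.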

You may remember back in our first Blog Post that our Pipeline waits twenty seconds between each call of the Ariba APIs. This is because each of Ariba’s APIs has rate limiting, and these rate limits are cumulative.

What that means for us is that for each Realm and API (for example MyCorp-Test Realm and Operational Reporting for Procurement – Synchronous API) we have a shared rate limit, no matter how it’s called.

The Pipeline we provided in Part One of this blog series is optimized for performance – i.e. it makes calls as fast as Ariba’s rate limiting will allow.

If there’s more than one instance of this Pipeline running at once, they will all receive a rate limit error and no data will be uploaded to our Document Store Collections.

Please keep this in mind when you plan the scheduling of these Pipelines, and refer to the Ariba API Documentation for up-to-date information on Ariba Rate Limits as well as how many Records are returned each time the API is called.
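The twenty-second wait can be generalized into a small pacing helper that enforces a minimum interval between calls. This is a sketch of the idea rather than the actual Part One pipeline code; the clock and sleep functions are injectable so the example runs instantly instead of really waiting:

```python
import time

class CallPacer:
    """Enforce at least `min_interval` seconds between successive calls,
    mirroring the 20-second spacing the Part One pipeline uses to stay
    inside Ariba's cumulative per-realm rate limit. Illustrative sketch."""

    def __init__(self, min_interval: float, clock=time.monotonic, sleep=time.sleep):
        self.min_interval = min_interval
        self._clock = clock
        self._sleep = sleep
        self._last_call = None

    def wait(self):
        """Block until it is safe to make the next API call."""
        now = self._clock()
        if self._last_call is not None:
            remaining = self.min_interval - (now - self._last_call)
            if remaining > 0:
                self._sleep(remaining)
        self._last_call = self._clock()

# Fake clock/sleep so the demonstration runs without real delays.
t = [0.0]
def fake_clock():
    return t[0]
def fake_sleep(seconds):
    t[0] += seconds

pacer = CallPacer(20.0, clock=fake_clock, sleep=fake_sleep)
for _ in range(3):
    pacer.wait()
    t[0] += 1.0  # simulate each API call taking about a second

print(t[0])  # 41.0 -- three calls spaced at least 20 seconds apart
```

Note that a pacer like this only protects a single Pipeline instance; because the rate limit is shared per Realm and API, two concurrently scheduled instances would still trip it, which is why the schedules themselves must not overlap.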

Throughout this blog series we’ve shown how we can set up a Pipeline that will get data from Ariba and persist it in the HANA Cloud Document Store, as well as how we can schedule it to run periodically.

We’ve also shown how we can create a Calculation View on top of these JSON Documents, and finally how we can create a Story in SAP Analytics Cloud that lets us visualize our Ariba Data.

The data and models in these blog posts have been rather simple by design; in a productive scenario we will likely want much more complex combinations and views. Hopefully, having followed along with our worked example, you’ll have a good foundation to start from when you build your own.

SAP Analytics Cloud | Introduction to SAP Analytics Cloud

SAP Analytics Cloud | Models Based on Live Data Connection to HANA Views

SAP Analytics Cloud | Sharing, Collaborating and Exporting

SAP Data Intelligence | Schedule Graph Executions

This blog series has had a lot of input from my colleagues – any errors are mine, not theirs. In particular, thanks go to the Cross Product Management – SAP HANA Database & Analytics team, Antonio Maradiaga, Bengt Mertens, Andrei Tipoe, Melanie de Wit and Shabana Samsudheen.

Note: While I am an employee of SAP, any views/thoughts are my own, and do not necessarily reflect those of my employer.