How To Connect SAP Cloud for Real Estate to SAP Data Warehouse Cloud

This post shares my experiences and guides you through the steps to connect the SAP Cloud for Real Estate API, published on the SAP API Business Hub, to the SAP Cloud for Real Estate standard content available in SAP Data Warehouse Cloud, using SAP Data Intelligence Cloud.

I am fully aware that there are many more options to push your SAP Cloud for Real Estate data to SAP Data Warehouse Cloud, but in this blog post I want to share a real-life customer scenario.

Prerequisites are:

  • You have already enabled the SAP Cloud for Real Estate API on SAP Business Technology Platform (SAP BTP) so that you can obtain your individual OAuth2 token (see the token sketch after this list).
  • You have a running instance of SAP Data Warehouse Cloud in your SAP BTP.
  • You have imported and activated the business content for SAP Cloud for Real Estate in SAP Data Warehouse Cloud via the Content Network.
  • You have created a database user in SAP Data Warehouse Cloud, which gives you your own database user and schema for writing to the tables in your entry layer.
  • You have a running instance of SAP Data Intelligence Cloud in your SAP BTP to create and run pipelines that initially create the tables in SAP Data Warehouse Cloud and push data from your SAP Cloud for Real Estate API.
  • You have basic experience with SAP Data Intelligence Cloud, Python, JSON, and consuming RESTful APIs.
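
As a quick check of the first prerequisite, here is a minimal sketch of fetching an OAuth2 token with the client-credentials flow. The token URL, client ID and client secret are placeholders taken from an assumed SAP BTP service key for the Cloud for Real Estate API; your values will differ.

```python
import requests

# Placeholders - replace with the entries from your own SAP BTP service key.
TOKEN_URL = "https://<your-subaccount>.authentication.eu10.hana.ondemand.com/oauth/token"
CLIENT_ID = "<client-id-from-service-key>"
CLIENT_SECRET = "<client-secret-from-service-key>"

def get_access_token():
    """Request an OAuth2 access token via the client-credentials flow."""
    response = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(CLIENT_ID, CLIENT_SECRET),
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["access_token"]

if __name__ == "__main__":
    token = get_access_token()
    print("Token received, length:", len(token))
```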

Steps are:

  1. Create an inbound connection in SAP Data Intelligence Cloud from your SAP Cloud for Real Estate API.
  2. Create an outbound connection in SAP Data Intelligence Cloud to your SAP Data Warehouse Cloud instance.
  3. Create the tables in the entry layer of your SAP Data Warehouse Cloud. You can do this via SAP Data Intelligence Cloud pipelines or directly on your database schema via the Database Explorer (see the table sketch after this list).
  4. Create a Python custom operator that gets and parses data from your API and pushes it to your SAP Data Warehouse Cloud instance, specifically into the SAP HANA Cloud schema of the database user created before (a sketch follows after this list).
  5. Create a pipeline for each entity you want to extract and push to the tables in your entry layer, using the Python custom operator created in step 4.
  6. Maintain the inbound and outbound connections of your custom operator in each pipeline.
  7. Run your pipelines and enjoy your data!
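
For step 3, here is a minimal sketch of creating one entry-layer table directly in the schema of your SAP Data Warehouse Cloud database user with the hdbcli Python driver. The host, credentials, table name and column list are assumptions; the real layout has to match the entity model of the standard content.

```python
from hdbcli import dbapi

# Connection details of the SAP Data Warehouse Cloud database user
# (placeholders - taken from the "Database Users" section of your space).
conn = dbapi.connect(
    address="<your-dwc-host>.hana.ondemand.com",
    port=443,
    user="<SPACE>#<DB_USER>",
    password="<password>",
    encrypt=True,
)

# Example entry-layer table for a Cloud for Real Estate entity;
# the column list is an assumption and must match your standard content model.
cursor = conn.cursor()
cursor.execute("""
    CREATE COLUMN TABLE "RE_BUILDINGS_ENTRY" (
        "BUILDING_ID"   NVARCHAR(36) PRIMARY KEY,
        "BUILDING_NAME" NVARCHAR(120),
        "VALID_FROM"    DATE,
        "VALID_TO"      DATE
    )
""")
cursor.close()
conn.close()
```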
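
For steps 4 and 5, here is a sketch of the script body of a Python custom operator in SAP Data Intelligence Cloud. It calls the Cloud for Real Estate API with an OAuth2 token, flattens the JSON payload into CSV rows and sends them to an output port that you wire to an SAP HANA Client operator writing into the entry-layer table. The endpoint, the JSON field names and the port name `output` are assumptions; `api` is the object SAP Data Intelligence Cloud injects into every Python operator.

```python
import csv
import io
import requests

# Placeholders - replace with your API endpoint and a valid OAuth2 token,
# e.g. fetched as in the token sketch above or via the connection management.
API_URL = "https://<your-c4re-host>/api/buildings/v1/buildings"
ACCESS_TOKEN = "<oauth2-access-token>"

def fetch_and_forward():
    """Read one entity from the Cloud for Real Estate API and emit it as CSV."""
    response = requests.get(
        API_URL,
        headers={"Authorization": "Bearer " + ACCESS_TOKEN},
        timeout=60,
    )
    response.raise_for_status()

    buffer = io.StringIO()
    writer = csv.writer(buffer)
    # The field names below are an assumed subset of the API payload.
    for record in response.json().get("value", []):
        writer.writerow([
            record.get("id"),
            record.get("name"),
            record.get("validFrom"),
            record.get("validTo"),
        ])

    # Send the CSV batch to the output port wired to the SAP HANA Client operator.
    api.send("output", buffer.getvalue())

# Register the function as a generator so it runs once when the graph starts.
api.add_generator(fetch_and_forward)
```

In the pipeline, place this operator in front of the SAP HANA Client operator, maintain the connections from steps 1 and 2 in the respective operator configurations, and reuse the same graph with a different endpoint and target table for each entity.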

We will present the SAP Data Intelligence Cloud pipelines that manage the full ETL process in detail soon! We will then add a link to this blog post pointing to an SAP Data Intelligence Cloud solution that shows you how to do it, but is provided without any support!

… to be continued soon!