Manual Migration of Catalog Objects From On-Premises Database to HANA Cloud

This blog covers the general concepts of manually migrating catalog objects from an on-premise database to SAP HANA Cloud. It begins with the available paths for migrating the objects and then explains, step by step, how to carry out the migration manually using different methods.

First, let’s introduce database migration: what it is, why we should do it, and how we can do it.

What is database migration?

Database migration is the process of transferring data from one database to another. Two databases are involved: the source database, from which you want to move the data, and the target database, the place where you want to move it. A migration can pull data from various kinds of sources, such as Excel sheets, on-premise databases, other relational databases, etc.

Why should we do the migration?

  • To move existing data to a new database
  • To reduce cost
  • To improve performance
  • To move from an on-premise database to a cloud-based database for better scalability and lower costs
  • To combine data from different databases into one database

How can we do the migration?

The right migration approach depends on several factors, such as:

  • Size of the database
  • The number of records present
  • The compatibility of the objects present in the DB

The steps to follow the migration:

  • Checking the details of the Source database
  • Creation of the target database
  • Authorizations
  • Migrating the objects
  • Deploying the objects
  • Data loading
  • UAT

In this blog, we’ll discuss migrating the objects. We can migrate the objects in different ways:

  • Using tools such as the XSC to XSA migration tool, the HMS tool, etc.
  • Manually migrating the objects.

This blog explains how to migrate the objects manually. There are three ways to do the database migration manually:

  1. Exporting/importing catalog objects using Database Explorer
  2. Importing objects and deploying them with a development environment (e.g., BAS)
  3. Directly executing the DDLs in the SQL console using Database Explorer

1. Exporting/importing catalog objects using Database Explorer:

To import the objects using Database Explorer, you first need the exported objects on your local system in TAR.GZ format. Go to Database Explorer and right-click on the HDI container where you want to place the objects. You’ll find the option ‘Import Catalog Objects’; select it.
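If you do not yet have such an archive, one way to produce it from the on-premise source is the EXPORT statement; this is only a sketch, and the schema name and server path are placeholders for your landscape:

```sql
-- Export all catalog objects of a schema from the on-premise source.
-- "MY_SCHEMA" and the path are placeholders; the exported directory
-- can then be packaged as a .tar.gz archive for the import step.
EXPORT "MY_SCHEMA"."*" AS BINARY INTO '/tmp/export' WITH REPLACE;
```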

    import_option

Import_window

Click the Browse button on the new screen and select the TAR.GZ file from your local computer. Once you have selected the tar file, you’ll see the screen below:

Tar_gz_file_extraction

Now select the objects you want to import and click the “Import” button. The files are imported into your HDI container, and a success message appears.

2. Import objects and deploy them with the development environment (e.g., BAS):

In this migration method, you first export the objects from the source system (or create files of the corresponding type) and package them into a zip file (you can also import them individually). You then import that zip file into the development tool and continue as in a normal integrated development environment.

First, you have to open the On-Premise DB source, select the files, choose the export option, and export them to your local system.

export_option_in_neo

Open the exported zip file. We need to make some changes to the files, because the export gives you the original files compatible with the source database, which causes errors when deploying them to the new HANA Cloud environment. A few important and common changes to make before we proceed:

  • Remove the schema reference in every object and check for invalid data types. If you find any CS_* keywords, remove them.

schema_reference_CS_data_types

  • For example, if you find data types suffixed with CS_ (CS_FIXED, CS_INT, CS_DOUBLE, CS_STRING), remove them from the files, save each file, and package all the files into a single .zip or .rar archive.
  • Once all the changes are done, upload the files into the development tool (BAS) and do the deployment as shown in the screenshots below.
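As a sketch of these changes (the table and column names are hypothetical), a DDL exported from the on-premise system and its cleaned HANA Cloud counterpart might look like this:

```sql
-- On-premise export (hypothetical table): contains a schema reference
-- and CS_* column-store keywords that HANA Cloud does not accept.
CREATE COLUMN TABLE "MY_SCHEMA"."EMPLOYEES" (
    "ID"     INTEGER       CS_INT NOT NULL,
    "NAME"   NVARCHAR(100) CS_STRING,
    "SALARY" DECIMAL(15,2) CS_FIXED,
    PRIMARY KEY ("ID")
);

-- After the changes: schema reference and CS_* keywords removed.
CREATE COLUMN TABLE "EMPLOYEES" (
    "ID"     INTEGER NOT NULL,
    "NAME"   NVARCHAR(100),
    "SALARY" DECIMAL(15,2),
    PRIMARY KEY ("ID")
);
```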

upload_option_in_BAS

Once deployed, your BAS terminal and the DB module look as in the screenshot below:

BAS_folder_structure

File extensions for the corresponding objects based upon their type: 

  • Table to .hdbtable 
  • View to .hdbview
  • Virtual table to .hdbvirtualtable
  • Table type to .hdbtabletype
  • Procedure to .hdbprocedure
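For illustration, a design-time .hdbtable file in BAS (the file and table names here are hypothetical) contains the table definition without the CREATE keyword and without a schema name:

```sql
-- EMPLOYEES.hdbtable (hypothetical): design-time syntax used by the
-- hdbtable plugin; note there is no CREATE keyword and no schema.
COLUMN TABLE "EMPLOYEES" (
    "ID"   INTEGER NOT NULL,
    "NAME" NVARCHAR(100),
    PRIMARY KEY ("ID")
)
```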

The Complete Guide to Changes in the Respective Objects and File Extension Type of Those Objects : 

https://help.sap.com/docs/SAP_HANA_PLATFORM/4505d0bdaf4948449b7f7379d24d0f0d/453d48e28f6747799546236b4b432e58.html 

Once the files are uploaded and deployed, you can find them in Database Explorer, in the Catalog folder of the corresponding HDI container, as shown in the screenshot below.

DB_catalog

tables_catalog

Open the Catalog folder and search for the objects you deployed to double-check that all of them exist in the destination database.

3. Directly executing the DDLs in the SQL console using Database Explorer (recommended for a few objects):

  1. Select the required schema from the catalog in the SAP HANA workbench of the on-premise DB. Shown below are the system schemas.

Neo_catalog

Here you will also find the schemas in your migration scope relevant to your landscape. Select the object you want under that schema and open it by double-clicking. Here we take a catalog object of type “Tables” for demo purposes.

  2. Right-click on the table and select Generate Create.

generate_create

  3. Once you click Generate Create, you’ll get the window below, which contains the DDL SQL statement.

Neo_SQL_console

  4. Select and copy the SQL statements in the window and paste them into a SQL console in Database Explorer by following the steps below.

HC_SQL_console

  5. Be careful when you open the SQL console: check whether the console is connected to the corresponding database.

DB_connection_check

Here in this picture, we are connected to the DBADMIN schema; in many setups the target is an HDI container instead.
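A quick way to double-check the connection from the console itself is to query the DUMMY table:

```sql
-- Shows which user and schema this SQL console is currently using.
SELECT CURRENT_USER, CURRENT_SCHEMA FROM DUMMY;
```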

  6. Paste the copied SQL statements (obtained via “Generate Create”) into the console.

Schema_name_and_CS_removing

  7. Some data types carry the CS_ suffix; delete those CS_ data types, because in HANA Cloud we do not have to mention the column store type, and also remove the previous schema name from the SQL code.
  8. Before executing, verify that the DDL SQL script no longer contains the previous schema name.

checking

  9. Execute the script (for execution, click the play button on the left side of the console). It creates the runtime object(s) in the target database, within an HDI container; you can check that in DB Explorer.
  10. Here we created a table as an example, so you can check it in DB Explorer, which also shows the table and its fields. Find the example table in the screenshot below.

table_checking_after_creation
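Besides browsing DB Explorer, you can verify the result from the SQL console. Assuming the example table is called “EMPLOYEES” (a hypothetical name), the system view TABLE_COLUMNS lists its fields:

```sql
-- List the columns and data types of the newly created table.
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE_NAME
FROM "SYS"."TABLE_COLUMNS"
WHERE TABLE_NAME = 'EMPLOYEES'
ORDER BY POSITION;
```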

For creating tables and views, we follow the procedure above. For the other objects, we follow the same steps, but you may need to edit some more syntax in the SQL script based on the type of object. The SQL console is quite interactive and helps you fix errors before you execute the SQL scripts against the target database.

  • As another example, here is the virtual table syntax, which is similar to that of tables:

CREATE VIRTUAL TABLE "<Schema Name>"."<Virtual Table Name>" AT "<Remote Source Name>"."<NULL>"."<Remote Schema Name>"."<Remote Object Name>";

  • You can generate more such examples by following the steps above for every other object type in your source (on-premise) database.
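Instantiating that syntax with hypothetical names (the remote source, schema, and table are placeholders for your landscape; “<NULL>” stands for the remote database part when the source has none):

```sql
-- Create a virtual table pointing at a table in the on-premise source.
-- All identifiers here are placeholders, not real objects.
CREATE VIRTUAL TABLE "V_EMPLOYEES"
    AT "ONPREM_SOURCE"."<NULL>"."MY_SCHEMA"."EMPLOYEES";
```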

Thank you for reading and following this blog. If you have any questions, updates, or feedback, please leave a comment.