Cloud Integration – Import and Export PGP Secret Key – Change PGP Secret Key Password

In SAP Cloud Integration, you can upload a PGP secring.gpg to use secret keys in the PGP decryption or encryption step of your integration flow. The SAP help documentation describes how to export and import PGP public keys with the help of the GNU Privacy Assistant, yet there are no details on how to export and import PGP secret keys. Here, we provide a step-by-step description of how to import and export PGP secret keys from a PGP secring deployed on your Cloud Integration tenant via the GPG command line tool. Furthermore, we also describe how to change the password of a secret key.

Export a PGP Secret Key

Download your PGP secret keyring from the Cloud Integration Web UI using the Administrator user. The downloaded file must have the name “secring.gpg”.

In case you got the PGP secret keyring “secring.gpg” from somewhere else (because you want to add the secret keys of this secring to the secring uploaded to your Cloud Integration tenant), you can also use the following steps to first export the secret keys, and then use the commands of the following chapter “Import PGP Secret Key” to import them.

  1. Place the downloaded file secring.gpg in an empty directory. In the following, we assume that the directory path is given by C:/source.
  2. List the content of the secring with the following gpg command:
    gpg --homedir=C:/source --list-secret-keys

    Example output of the command:

    ...
    C:/source/secring.gpg
    ---------------------
    sec 2048R/BBB29842 2021-01-17
    uid source <souce@source.com>
    ssb 2048R/64FACE2B 2021-01-17
  3. Find out the uid (user ID) of the secret key you want to export. You can see the uid(s) of the secret keys in the output of the previous command. In the example above, we only have one secret key with uid = “source <souce@source.com>”. We will use this uid in the following command.
  4. Export the secret key with this uid into a file using the following command (below we use the file name “my-secret-key.pem”):
    gpg --homedir=C:/source --armor --export-secret-key "source <souce@source.com>" > my-secret-key.pem

You can now use the exported file “my-secret-key.pem” in the following chapter to import the secret key into another PGP secring.

Import PGP Secret Key

You now need a secret key file that was produced in a similar way as shown in the previous chapter “Export a PGP Secret Key”. In the following, we assume that you have such a file with the name “my-secret-key.pem” in the directory where you execute the gpg commands. Execute the following commands to import a PGP secret key into a PGP secret keyring with the name “secring.gpg”. For example, you can get the secring.gpg by downloading this file from your Cloud Integration tenant.

  1. Store the PGP secret keyring with the name “secring.gpg” in the empty directory C:/target. You can also choose another directory, but keep in mind that the following commands assume that the directory name is C:/target.
  2. List the secret keys which are already contained in the PGP secret keyring:
    gpg --homedir=C:/target --list-secret-keys

    Example output of this command:

    C:/target/secring.gpg
    ---------------------
    sec 2048R/5D629E05 2021-01-17
    uid target <target@target.com>
    ssb 2048R/18A00B85 2021-01-17
  3. Import the secret key (as an example, we use the file “my-secret-key.pem” which contains the secret key):
    gpg --homedir=C:/target --import my-secret-key.pem

    Example output of this command:

    gpg: key BBB29842: secret key imported
    gpg: key BBB29842: public key "source <souce@source.com>" imported
    gpg: Total number processed: 1
    gpg: imported: 1 (RSA: 1)
    gpg: secret keys read: 1
    gpg: secret keys imported: 1
  4. List the secret keys of the PGP secret keyring to check whether the import was successful:
    gpg --homedir=C:/target --list-secret-keys

    Example output of the command:

    C:/target/secring.gpg
    ---------------------
    sec 2048R/5D629E05 2021-01-17
    uid target <target@target.com>
    ssb 2048R/18A00B85 2021-01-17
    sec 2048R/BBB29842 2021-01-17
    uid source <souce@source.com>
    ssb 2048R/64FACE2B 2021-01-17

Once you’ve imported all the secret keys you need, you can upload the PGP secret keyring file secring.gpg to the Cloud Integration tenant. Note: All secret keys in the secret keyring must have the same password. If this is not the case, read the following chapter on how to change the password of a secret key in a PGP secret keyring.

Change PGP Secret Key Password

In the following, we assume that you have a secret keyring with the name secring.gpg in the folder C:/target and that you want to change the password of a secret key in this keyring.

We use the tool GNU Privacy Assistant (GPA), which provides a UI.

Prerequisites:

Now execute the following steps:

  1. Execute run_gpa.bat in the folder C:/target. You will get the following UI:
  2. Select the private key for which you want to change the password and right-click it. From the menu, choose the option “Edit Private Key…“. The following pop-up appears:
  3. Now, select “Change passphrase” and follow the instructions.
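If you prefer the command line instead of the GNU Privacy Assistant UI, the passphrase can also be changed with gpg itself. A minimal sketch, assuming the same directory layout and uid as in the chapters above:

    gpg --homedir=C:/target --edit-key "source <souce@source.com>"
    gpg> passwd
    gpg> save

The passwd command asks for the old passphrase and then for the new one; save writes the change back to the keyring.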

After you have adapted the passwords of the secret keys you can upload the secring.gpg file to your Cloud Integration tenant.

OIF and GAF Floorplans: Changing elements in the Identification Region (IDR) at runtime


What is the IDR?

The IDR, found in the header area of OIF and GAF floorplans, is used to display information in label/value pairs that are relevant to the data processed in the application.

This blog describes how the content of label/value pairs can be changed at runtime via ABAP APIs.

Step-by-step: Changing the values

You can access the IDR API using the following code:

DATA: LO_IDR TYPE REF TO IF_FPM_IDR.
DATA: LO_FPM TYPE REF TO IF_FPM.
LO_FPM = CL_FPM=>GET_INSTANCE( ).
LO_IDR ?= LO_FPM->GET_SERVICE( IF_FPM_CONSTANTS=>IDR ).

Now that you have the API for the IDR, you can use it in the following way:

DATA: LT_ITEM_GROUP_KEYS TYPE T_ITEM_GROUP_KEYS.
LT_ITEM_GROUP_KEYS = LO_IDR->GET_ITEM_GROUP_KEYS( ).

The table LT_ITEM_GROUP_KEYS now contains all the keys of the available label/value pairs that exist in the IDR area.

What you can do next is use the GET_ITEM_GROUP method to see which label/value pair is related to a given key:

LO_IDR->GET_ITEM_GROUP( … ).

You can make changes to them as follows:

LO_IDR->CHANGE_ITEM_GROUP_BY_VAL( ) or
LO_IDR->CHANGE_NAVIGATION_ITEM( ) or
LO_IDR->CHANGE_IMAGE_ITEM( ).

Which method you use depends on the type of the label/value pair (texts, navigation links or icons).
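Putting the pieces together, a minimal sketch of how such a runtime change could look is shown below. The change call itself is only indicated in comments, because the parameter names of the CHANGE_* methods (for example IV_ITEM_GROUP_KEY) are assumptions here and need to be checked against the method signatures in your system:

DATA: LO_FPM  TYPE REF TO IF_FPM,
      LO_IDR  TYPE REF TO IF_FPM_IDR,
      LT_KEYS TYPE T_ITEM_GROUP_KEYS.

FIELD-SYMBOLS: <LV_KEY> TYPE ANY.

* Get hold of the IDR service as shown above
LO_FPM = CL_FPM=>GET_INSTANCE( ).
LO_IDR ?= LO_FPM->GET_SERVICE( IF_FPM_CONSTANTS=>IDR ).

* Inspect all keys of the available label/value pairs
LT_KEYS = LO_IDR->GET_ITEM_GROUP_KEYS( ).
LOOP AT LT_KEYS ASSIGNING <LV_KEY>.
* Use GET_ITEM_GROUP to see which pair belongs to <LV_KEY>; once the
* right key is found, change the pair, e.g. (parameter name assumed):
*   LO_IDR->CHANGE_ITEM_GROUP_BY_VAL( IV_ITEM_GROUP_KEY = <LV_KEY> ... ).
ENDLOOP.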

If a label/value pair has been created as a value reference to a Web Dynpro context element attribute, changing the context element automatically updates the value (that is, there is no need to use the IDR API for this).

Ulrich Miller

SAP Enterprise Support Academy: Join our newly developed Meet-the-Expert recordings

No time to retire! This is true for some of our SAP Enterprise Support Academy Expert-guided Implementation (EGI) classes. We continuously strive to provide you with the best learning quality and offerings possible. This led us to reshape some courses to make them more easily available and accessible to you.

We have converted four Expert-guided Implementations (EGIs) into new Meet-the-Expert (MTE) recordings. The sessions will provide you with expert knowledge, guidance, and answers to your questions, with the advantage that you can consume them whenever it fits into your schedule.

Please find below a summary of the newly converted MTE recordings.

*Before you can access the links below, a one-time registration in SAP Learning Hub, edition for SAP Enterprise Support, is required. A detailed step-by-step guide to registration can be found here.

Register to watch the sessions by following the links.

Previous EGI:

Change Request Management (ChaRM) Basic Functions

Successor item, three-module MTE recordings:

This content covers the implementation of the Change Request Management (ChaRM) scenario with SAP Solution Manager 7.2. The prerequisites, ChaRM configuration, TMS configuration, business partners, etc. are discussed.

This content covers the topics of cycle transaction, request for change transaction, standard change, normal change, and urgent change with respect to ChaRM.

This content covers the topics of administration change, general change transaction, and defect correction. The reporting options available in the administration cockpit are also discussed.

Incident and Problem Management (ITSM)

Successor items:

This MTE covers the topics related to understanding IT Service Management on SAP Solution Manager and gives an overview of the capabilities of the ITSM standard scenario. The template users and authorizations required are discussed.

This MTE covers the topics of master data management, support team determination, and multilevel categorization.

This MTE helps you understand service level agreements and the handling of processing times for ITSM.

Technical Upgrade to SAP Solution Manager 7.2

Successor item:

Transport Execution Analysis (TEA)

Successor item:

The course provides an overview of various proactive checks for objects in transport requests, which can predict transport-related errors before the requests are imported into a target system.

How to sign up for the support edition

  1. Navigate to the sign up page
  2. Click the Sign Up button to authenticate yourself with your s-user ID and password. Upon first access, the system will check your eligibility, create a new SAP Learning Hub user, and populate your learning catalog accordingly.
    You will then receive a registration confirmation via e-mail. Please note that the catalog assignment can take up to 2 hours.
  3. After you have received the confirmation e-mail, the registration process is complete.


Financial Accounting and Management Accounting Training on SAP S/4HANA 2020

Financial management requirements are constantly changing, and it’s more important than ever to keep your skills current. SAP Financial Accounting (FI) and SAP Controlling (CO) process and consultant training delivers the understanding and expertise professionals need for better management of finance accounting and reporting in enterprises.

At the beginning of 2021, we will successively convert the entire Financial Accounting and Management Accounting portfolio to the new SAP S/4HANA 2020 release.

Training courses will be held on SAP S/4HANA 2020 depending on availability.

Financial Accounting:

Learn how to configure, use, customize, and automate a broad range of basic to advanced SAP S/4HANA accounting functions, such as financial closing, asset and profit center accounting, and cash management.

Training for SAP S/4HANA in Financial Accounting

Management Accounting:

Learn more about how Management Accounting supports coordination, monitoring, and optimization of all the processes in an organization. SAP CO includes managing and configuring master data that covers cost and profit centers, internal orders, and other cost elements and functional areas.

Training for SAP S/4HANA in Management Accounting

Now Open: SAP Identity Management Customer Connection 2021

Yes, we did it again! We officially kicked off the 2021 SAP Identity Management Customer Connection on January 18. If you missed the actual kick-off, you can find recordings of both the English and German calls on the SAP Influence Platform.

What is a Customer Connection? This program allows SAP IDM customers and partners to submit feature and function requests for the IDM product. As outlined in the product scope, your requests should be centered around the following topics:

  • Functional and usability improvements related to IDM features such as the Developer Studio, administration and Web UIs, or the REST Interface Version 2
  • New on-premise connectors or extensions to existing connectors
  • Improvements for the on-premise IDM solution regarding hybrid identity management scenarios (integration of SAP Identity Management with the Identity Provisioning component of the SAP Cloud Identity service)

From now until the end of February, we’re opening the platform for your input. Make sure to register and follow the project to contribute your own ideas, vote for the submissions of your peers, and receive notifications and updates.  The SAP development team will evaluate all requests that have at least 10 votes. From the pool of those requests, we will select several for implementation. During the selection call on April 7, we will communicate our development plan for the remainder of the year. Finally, in December, we will present the product improvements that we created based on your input.

Interested in the result of the 2020 Customer Connection? You can find details in my previous blog, the latest Support Package, and in this description of the new SAP S/4HANA connector.

We look forward to your ideas!

Day 4 | Skybuffer Community Chatbot | How to Bring Your Own Bot to Hybrid Chats

Business Challenge:

It might happen that your organization has already developed a chatbot on the SAP Conversational AI platform and would like to use the Hybrid Chats solution to boost the support organization with Live Chat functionality, on-premise conversation logging, automatic conversation categorization, and many other nice features.

Solution:

Please follow these simple steps to extend your chatbot with a few skills from the Skybuffer Community Chatbot; they will allow you to connect your chatbot properly to Hybrid Chats in a couple of minutes.

Step 1: Navigate to the Skybuffer Community Skillset (please use the community bot in our organization because it undergoes regular updates):

https://cai.tools.sap/skybuffer-community/skybuffer-foundation-content/skills

Step 2: Select zxas-disambiguation skill to be forked to your chatbot:

Step 3: Select zxas-set-up-memory-parameters skill to be forked to your chatbot:

Step 4: Navigate to the ACTIONS section of zxas-set-up-memory-parameters in your chatbot (it is the second-to-last message group in the ACTIONS section) and change the FALLBACK skill name in case you re-named the standard one:

You can always check how the FALLBACK skill is named in your chatbot by simply looking it up in the BUILD section:

Step 5: Now you can follow Day 1 blog and connect your chatbot to Hybrid Chats solution.

Step 6: Once your chatbot is connected, you can enjoy chatting both with the chatbot:

and the operator:

NB: Please note that automatic categorization can also be added to your chatbot if you follow the steps in our blog series Day 1 and Day 2, so you can easily connect your chatbot skills to the categorization scheme in Hybrid Chats.

Conclusion

Now you have completed the Day 4 guidelines, and you know how to bring your own chatbot to the Hybrid Chats solution in just a few clicks: it takes only 1-2 minutes to connect your human expertise department to your chatbot users.

You see the power and advantages of Hybrid Chats and the Skybuffer Community organization, which converted your chatbot into a Customer Care Center in a few clicks – isn’t that just fantastic?

P.S. You can also find the entire list of our blog posts under the links below:

Day 1 | Skybuffer Enterprise-Level Conversational AI Content Made Public for SAP Community Development

Day 2 | Skybuffer Community Chatbot | How to Customize Your Chatbot “get-help” skill

Day 3 | Skybuffer Community Chatbot | How to Create Multilingual Question-and-Answers Scenario Fast and Easy

Day 4 | Skybuffer Community Chatbot | How to Bring Your Own Bot to Hybrid Chats

Want to learn more about the Incident Management topic? Join CIC for the 1st Webcast of 2021

Are you interested in learning more about the Incident Management topic?

If so, Customer Interaction Center would like to invite you to our first webcast of 2021.

It will take place on Wednesday, 20th January, 06:00 – 07:00 CET / 10:30 IST, and will last approximately one hour.

A replay will be available for download after the event; it can be found here.

In this webcast, we will highlight the best practices for incident creation and also enable you to get the best out of your support from SAP.

This webcast will help to improve the quality of incidents being created and generally enhance your support experience with SAP. The aim is that you will all understand the SAP processes that you are expected to follow to get the best results from your SAP engagement.

To register for the Webcast, please access here

To get more information on our webcasts please see here

Thanks for reading!

The New Application Interface Register for SAP Integration

All partners who complement SAP applications by extending their features are eligible for the SAP PE Build partnership. This is for SAP PE Build partners that use APIs to get and post data to/from SAP applications. A very important announcement targeted at SAP PE Build partners was made in November 2020. The announcement, made in the SAP Partner Portal, was to notify that from February 15th, 2021, a unique integration key provided by SAP must be used by all PE Build partner solutions that integrate or connect to SAP applications through APIs. This is also called the Application Interface Key. This unique key must be used in the API header.

How to do Application Interface Register for SAP Integration

To call any SAP LoB API as part of the Application Interface Register, the SAP PE Build partner is required to locate and use the Application Interface Key. The Application Interface Key is an 8-character alphanumeric key which is provided for every registered partner application. Partners can deploy qualified and compliant applications by providing the Application Interface Key when calling SAP APIs.

How to access the Application Interface Key

To access the Application Interface Key, SAP PE Build Partners need to go through the following:

  • Manage My Partnership Portal
  • Manage My Apps Portal
  • Manage My Apps Mobile Interface.

A Step-by-Step Guide of How to Locate the Application Interface Key:

Step 1 – Open and log in to the SAP Partner Portal – https://partneredge.sap.com/

Step 2 – Open the Partnership Dashboard – https://partneredge.sap.com/en/partnership/dashboard.html

Step 3 – Open SAP PartnerEdge Launchpad

Step 4 – Click on Manage My Apps

All existing Registered and Qualified Apps will be shown in Manage My Apps.

Step 5 – Click on the app to get its details; the API key available here can be used in the SAP LoB application’s API header.

Step 6 – If there are no apps listed, you can start Register New App from the Manage My Apps home page

Step 7 – The important fields and options under the Register New App section are as follows:

Important Notice:

Even if the Application Interface Key is not listed in the LoB solution page, the key must still be sent with every API call. The key header must be sent from wherever the API call to the SAP system is made; this can be either a UI or a backend process.

The key header should also be included when you are using a microservice architecture with multiple services calling SAP APIs.
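For illustration, passing the key with an API call could look like the sketch below. The header name ApplicationInterfaceKey is an assumption used as a placeholder here; check the SAP Partner Portal documentation for the exact header name required by the LoB API you are calling:

# Hypothetical example; header name, token, and endpoint are placeholders
curl -H "ApplicationInterfaceKey: <your-8-character-key>" \
     -H "Authorization: Bearer <access_token>" \
     "https://<sap-lob-api-host>/<api-endpoint>"

The same header has to be added by every service that calls SAP APIs, whether from a UI or a backend process.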

If you have any questions, contact your SAP Partner Experience Manager for more details.

With APPSeCONNECT, you can easily integrate your SAP ERP system with your Ecommerce store or CRM and automate the business process!

Weekly Mission & Badge Roundup // January 11 – January 17, 2021

You know that we love and very much appreciate it when you engage in SAP Community. That’s why I publish a weekly roundup of community members who have earned a mission the week prior. This blog post covers all missions earned Monday through Sunday (midnight to midnight, UTC).

The previous week was a special one, as we had our very first Random Day Surprise of 2021. For those of you who are new to SAP Community, we think it is great that you are actively participating. You might have just started in a particular topic and hence may not yet answer questions and/or publish blog posts, but we see that many of you engage with all the great content that is out there, and for engaging on a random day (the date is of course kept secret until that day) you receive the Random Day Surprise badge. Check out my other blog post in which I list all members who received a badge for their engagement.

Whoooo – I was literally counting with awe how many badges our member Peter Hartmann earned last week. Look at the reputation section in his profile (he was quite a busy bee last week). Peter is a good example to mention that whenever you complete tutorials in the Developer Center, the badges you earn there will be displayed in your community profile!

As another highlight I would like to point out our Avid Bloggers to you: Natalia Katunin, Nikhil Walsetwar, Ari Schapira, and Bare Said. Last week, Bare informed us about the Planned Release of SAP HANA 2.0 SPS 06. Follow him to not miss out on news in the world of SAP HANA.

Want to see all of the missions that are available for you to complete, as well as the ones you have already earned? Visit the Missions section in your profile. Want to know more about SAP Community missions and how to earn them? See the Missions and Badges overview page.

Now join me in congratulating the following Community members who completed missions and earned badges from January 11 – January 17, 2021:

Your SAP on Azure – Part 25 – SQL Server Transparent Data Encryption with Azure Key Vault

SAP systems contain the company’s most confidential data. Lists of orders, production plans, sales forecasts, and, most important of all, the key financial data are hosted on SAP. Therefore, each organization should ensure that this information is kept in a secure way that makes it impossible to read in case an unauthorized party gets access to database files or backup media. Microsoft does a great job in protecting customers’ data – all information stored in Azure is by default encrypted at rest, so even if attackers got access to the physical Azure hardware, they would not be able to make any use of it. For infrastructure workloads, customers can in addition decide to put another level of protection in place and encrypt virtual disks, which makes it impossible to read stored data without having an encryption key. But disk encryption at rest won’t help if the attacker gets access to your virtual machine and is able to make a copy of the database. Or if your database backup leaks. In such a case they can just restore the database on another server and access the data without any issue.

To protect your data against such challenges, you can enable Transparent Data Encryption, which provides database encryption at-rest at the file level. Without knowing the encryption key it won’t be possible to read the database content even with full access to the backup media or database files. The encryption is transparent for applications that connect to the database – to protect data-in-transit you should consider additional security precautions like enabling connection encryption between the application and the database. TDE performs real-time I/O encryption and decryption of the database files and protects you against unauthorized access in case of media loss or theft rather than unauthorized user access to the database due to leaked credentials.

All major databases that you use with SAP support Transparent Data Encryption. This technology is available with SAP HANA, SAP ASE, IBM DB2, Oracle, and Microsoft SQL Server. The documentation is widely available on the internet, so in this blog I would like to focus on the tight integration between Microsoft SQL Server and Azure, which makes Transparent Data Encryption even more secure by connecting your instance to the Azure Key Vault.

We used the Key Vault service a couple of times before in this blog series. It’s a service that uses Hardware Security Modules (HSM) to keep sensitive information like passwords, secrets, and encryption keys secure and prevent its disclosure. You keep full control over the access to the vault, but you don’t have to manage the whole infrastructure to keep your keys secure.

Whenever you insert or retrieve information from the encrypted database, SQL Server performs data encryption and decryption automatically. As I mentioned earlier, this process is fully transparent to your applications (including SAP), so all your queries will work without issue. There is a small performance overhead due to the encryption process, so before you activate it in your production system ensure you don’t have to allocate more resources to the database server. If you’d like to explore the performance topics in greater detail, please have a look at this great post available on Microsoft TechCommunity sites.

To protect your data, SQL Server uses the Database Encryption Key, which is embedded in the database and is itself encrypted using a Server Certificate. This causes a problem, as you have to create and maintain the Server Certificate, including backups and protection against unauthorized access. The whole management of the Server Certificate can be avoided if you enable integration between the database instance and the Azure Key Vault service. Microsoft developed a plug-in that uses the Extensible Key Management interface available in SQL Server to use keys stored in the vault instead of a Server Certificate. In such a configuration the database engine first calls the Key Vault service to unlock the Database Encryption Key, which is then used to encrypt and decrypt stored data. It’s worth remembering that the key stored in the Key Vault is not actually used to encrypt your data – it’s only used to decrypt the Database Encryption Key.

I hope this short introduction builds your appetite, so let’s have a closer look at how to implement Transparent Data Encryption using Key Vault. But before we start, let me just explain two main terms. In this post I’ll be using the word Key quite often, so to avoid any ambiguity please remember that whenever I mention the Database Encryption Key, I mean the key that is used for database encryption. To protect this information, we are using the Key Encryption Key, which is stored in the Azure Key Vault.

CREATE AZURE RESOURCES

Let’s start by creating required Azure resources. As described above you need a Key Vault to store the encryption keys, and in addition you need to create a Service Principal to establish communication between your SQL instance and the Azure Key Vault service. As always, there are two ways of deploying Azure resources – you can do it using Azure Portal or PowerShell. In this blog I take a longer route and show you how to deploy all resources using the Portal.

To create the Service Principal, navigate to the Azure Active Directory blade and select App Registrations from the menu. Then choose New Registration and provide the name of the Service Principal. You don’t have to change any other settings:

Confirm your choices by clicking on the Register button. Within a second you’ll be redirected to the Service Principal overview screen. Take a note of the Application (Client) ID:

Then, choose Certificates and Secrets from the menu and click New Client Secret. In the pop-up window, provide the Client Secret name and choose the validity period. You can also choose an expiration time, but please remember that if you set it, you’ll have to update the SQL Server configuration after the specified period. It is, however, a more secure approach.

A new entry appears under the Client Secrets. Copy the secret from the Value field as we’ll require it later. Once you navigate out of the Secret screen you won’t be able to display the value again.

Now let’s create Azure Key Vault. Open the Key Vault blade and choose Add. Provide the name of the Key Vault and choose the region where it should be deployed to. You can leave other settings as they are.

Click on the Review and Create button and confirm creation of the resources:

After you create the Key Vault, you need to grant access to the Service Principal defined in the previous step. Choose Access Policies from the Key Vault menu and choose Add Access Policy:

Grant four key permissions to the service principal: Get, List, Wrap and Unwrap. You don’t have to assign any Secret or Certificate permissions:

Then select the previously created Service Principal:

Click the Add button and then Save your settings. The Service Principal can now access keys stored in the Azure Key Vault. The final step is to create the Key Encryption Key that will be used to encrypt the Database Encryption Key. Choose Keys from the menu and click Generate / Import. Enter the name of the key and leave other settings as they are. Confirm the operation by clicking on the Create button.

SQL Server 2016 and 2017 do not support keys longer than 3456 bits, and depending on installed SQL Server updates it may not even trigger a warning, which results in a corrupted database. Please check KB4463125 for details.

You’ve now completed all preparation steps in Azure to enable Transparent Data Encryption in SQL Server. In the next section, I show how to configure the SQL instance and encrypt the database.

CONFIGURE SQL SERVER

If you perform the encryption on a system that is currently in use, this is the moment when I recommend stopping the SAP instance and ensuring there are no operations on the database. The encryption runs in online mode, which means in theory you could continue with normal operations, but in fact the process may cause a lot of blocking and locking. Therefore, I don’t recommend encrypting the database while your users still access the SAP system. To ensure you can revert changes if something goes wrong, take a full database and server backup. As the encryption generates a lot of transaction logs, change the database recovery mode to SIMPLE. Once the database is encrypted, change this setting back to FULL.
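As a sketch, the recovery model switch around the encryption could look as follows (the database name is a placeholder):

-- Reduce transaction log generation for the duration of the encryption
ALTER DATABASE <database_name> SET RECOVERY SIMPLE;
-- ... enable the encryption and wait until it completes ...
-- Then switch back to full logging
ALTER DATABASE <database_name> SET RECOVERY FULL;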

If you plan to encrypt the database as part of the migration to Azure there are two routes you can take. If you’re using the Backup / Restore approach you have to perform the encryption after you restore the database. It’s an additional step that you execute during the downtime window, so ensure you correctly measured it and that it is included in the cutover plan. Later in the post I included a section about performance optimization of the encryption process.

It gets more interesting if you’re using Export / Import for SAP system migration. Instead of waiting for the migration to complete and then run the database encryption, you can pre-create an empty database and enable the encryption before you actually migrate any data. This way you don’t require an additional step after the migration, as all data are encrypted during the Import step. It may cause, however, a small performance impact, so please verify the runtime before.

To enable connectivity between SQL Server instance and Azure Key Vault you need to download and install SQL Server Connector for Microsoft Azure Key Vault. The plug-in is available on the Microsoft site:

https://go.microsoft.com/fwlink/p/?LinkId=521700

The installation takes just a minute.

The database encryption can only be configured through SQL commands. There is no GUI available. In the first step you have to configure the instance to use the Extensible Key Management interface and load the SQL Server Connector library.

-- Enable advanced options.
USE master;
GO
EXEC sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
-- Enable EKM provider
EXEC sp_configure 'EKM provider enabled', 1;
GO
RECONFIGURE;
-- Load EKM provider libraries
CREATE CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM
FROM FILE = 'C:\Program Files\SQL Server Connector for Microsoft Azure Key Vault\Microsoft.AzureKeyVaultService.EKM.dll';
GO

The Key Vault connection information is stored in SQL Server as a Credential. It consists of:

  1. Identity – the name of the Azure Key Vault that you created. Please note that Azure private clouds (Azure Government, Azure China, or Azure Germany) require the full vault URI.
  2. Secret – a combination of the Service Principal Application ID (without hyphens) followed by the Client Secret.

You can use the following script to calculate the Secret based on your input:

DECLARE @ClientID VARCHAR(50)
DECLARE @ClientSecret VARCHAR(50)
SET @ClientID = '<Service_Principal_Client_ID>'
SET @ClientSecret = '<Service_Principal_Client_Secret>'
PRINT 'Credential secret: ' + REPLACE(@ClientID, '-', '') + @ClientSecret

Enter the credential secret as SECRET in the following script and execute it to create the credential.

USE master;
CREATE CREDENTIAL sysadmin_ekm_cred
WITH IDENTITY = '<KeyVault Name>', -- for public Azure
-- WITH IDENTITY = '<KeyVault Name>.vault.usgovcloudapi.net', -- for Azure Government
-- WITH IDENTITY = '<KeyVault Name>.vault.azure.cn', -- for Azure China 21Vianet
-- WITH IDENTITY = '<KeyVault Name>.vault.microsoftazure.de', -- for Azure Germany
SECRET = '<Client_Secret>'
FOR CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM;

The created credential allows you to connect to the Key Vault service. But to actually use it, you have to link it with a SQL Server login. Only a one-to-one relationship between the credential and a login is allowed. To support the configuration process, first assign the credential to your SQL Server login. Later on, we’ll re-assign it to a system login that will handle the encryption.

-- Add the credential to the SQL Server administrator's domain login
ALTER LOGIN [<domain>\<login>] ADD CREDENTIAL sysadmin_ekm_cred; 

The following command creates a SQL Server object that represents the key stored in the Azure Key Vault service. Note that we’re not creating a new key, but just referencing the existing Key Encryption Key. Replace the value of the parameter PROVIDER_KEY_NAME with the name of the key in the vault.

CREATE ASYMMETRIC KEY EKM_KEY
FROM PROVIDER [AzureKeyVault_EKM]
WITH PROVIDER_KEY_NAME = '<Key Vault Key name>',
CREATION_DISPOSITION = OPEN_EXISTING;

When you execute the above command, you actually reach out to the Key Vault service for the first time. If you made a mistake on the way, this is the moment of truth. I went through the process a bunch of times, and quite often this step reported an error.

The error message returned by the SQL engine doesn’t give too much information. To get a more precise explanation, open the Windows Event Viewer.

There are two main issues you can encounter:

  1. Wrong credentials
  2. Error accessing the registry

Wrong credentials – resolution:

If you accidentally provided a wrong Client ID or Client Secret, you must unassign the credential from your user, drop it, and then re-create it:

ALTER LOGIN [<domain>\<login>] DROP CREDENTIAL [sysadmin_ekm_cred]
GO
DROP CREDENTIAL [sysadmin_ekm_cred]

Error accessing the registry – resolution

This is a very interesting issue, which happens to me quite regularly, but I still haven’t discovered the root cause. If you see an event saying there is an “Error when accessing registry”, you have to manually create the missing entry.

Open the Registry Editor and navigate to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft. Create the key “SQL Server Cryptographic Provider” if it doesn’t exist.
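If you prefer to script this step, the key can also be created from an elevated command prompt; a sketch (granting the permissions described below still has to be done separately):

reg add "HKLM\SOFTWARE\Microsoft\SQL Server Cryptographic Provider"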

Then assign Full Permissions to this registry key to the account that runs the SQL Server instance:

Finally, the script should complete without any issues:

You can display the Key Encryption Key using the following command:

SELECT * FROM sys.asymmetric_keys

Create a new system login based on the created asymmetric key. You can’t use it to log in to the SQL Server; it’s a technical object to handle the encryption.

--Create a Login that will associate the asymmetric key to this login
CREATE LOGIN TDE_Login
FROM ASYMMETRIC KEY EKM_KEY;

As only one-to-one mapping between a login and credential is allowed, drop the credential from your user and assign it to the system Login you’ve just created.

ALTER LOGIN [<domain>\<login>]
DROP CREDENTIAL sysadmin_ekm_cred;

ALTER LOGIN TDE_Login
ADD CREDENTIAL sysadmin_ekm_cred;

Perfect! You completed all required SQL Server configuration steps.

DATABASE ENCRYPTION

To encrypt the SQL Server database, you need to create a Database Encryption Key, as I explained at the beginning of the blog. Reference the ASYMMETRIC KEY that represents the Key Encryption Key stored in the Azure Key Vault. Remember that the Key Encryption Key is not used to encrypt the database – the SQL Server engine uses it to encrypt the Database Encryption Key.

--Create an ENCRYPTION KEY using the ASYMMETRIC KEY (EKM_KEY)
USE <database name>
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER ASYMMETRIC KEY EKM_KEY;

The encryption starts after you execute the following command:

--Enable TDE by setting ENCRYPTION ON
ALTER DATABASE <database_name>
SET ENCRYPTION ON; 

The encryption process has started, so let’s see how you can monitor the progress.

ENCRYPTION MONITORING

You can use the command below to monitor the status of the encryption:

SELECT DB_NAME(database_id), encryption_state, key_algorithm, key_length, percent_complete
FROM sys.dm_database_encryption_keys

The overall status of the database encryption is displayed in the encryption_state column:

  1. Unencrypted
  2. Encryption in progress
  3. Encrypted
  4. DEK change in progress
  5. Decryption in progress

The current encryption progress is displayed in the percent_complete column. Once all data are encrypted, the encryption_state changes to 3 and percent_complete to 0.
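If you prefer a more readable output, the state values can be decoded directly in the query; a small variation of the command above:

SELECT DB_NAME(database_id) AS database_name,
       CASE encryption_state
            WHEN 1 THEN 'Unencrypted'
            WHEN 2 THEN 'Encryption in progress'
            WHEN 3 THEN 'Encrypted'
            WHEN 4 THEN 'DEK change in progress'
            WHEN 5 THEN 'Decryption in progress'
       END AS encryption_status,
       percent_complete
FROM sys.dm_database_encryption_keys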

When the encryption is completed, you can start the SAP system. You can confirm in DBACOCKPIT that Transparent Data Encryption is enabled.

PERFORMANCE MONITORING AND TUNING

When you start the database encryption process, you’ll notice an increase in CPU and disk utilization. I usually use Resource Monitor to watch how hardware resources are utilized:

Depending on the database size, provisioned hardware resources, and SQL Server configuration, the encryption can take anything from minutes up to a couple of days to complete. As you can see above, by default the encryption doesn’t consume too much processor power or disk throughput. Enabling TDE on a 300 GB database took over 12 hours:

Such performance may be OK if your database is small, but as the encryption process is very often part of your cutover plan, every minute counts!

There is an excellent blog available on TechCommunity that talks about the internals of TDE encryption, and it gives an idea of how to increase the encryption throughput. When you execute the “ALTER DATABASE <database_name> SET ENCRYPTION ON” command, SQL Server spawns multiple Asynchronous Disk Worker threads that perform the encryption – they load each database page into memory, mark it as dirty, and then flush it back to the disk. The encryption happens during the last step, when data are written to the disk. It is a resource-intensive operation, so the Encryption Worker spawns only a limited number of Disk Workers, which depends on the number of storage volumes that are used by the database.

In my current setup, I have four data files, but they are located on a single volume. Therefore only a single thread is created to perform the encryption. To decrease the processing time, you need to ensure multiple Disk Worker threads are running in parallel. Let’s see what happens if I distribute the database files across four partitions on the same disk.

You can see below that the encryption process consumes much more resources. The CPU utilization grew to 50% and the disk throughput is close to 100 MB/s, which is pretty much the maximum value I can get out of an E20 disk.

After making the change, the encryption completed much faster – instead of almost 12 hours, it only required a little more than 90 minutes.

There is one more thing that can influence the encryption time. As Cameron Gardiner wrote on his blog, an important aspect of TDE encryption is the Certificate Revocation List check. Too strict firewall rules can cause a slow LOGWRITE and therefore decrease the overall performance. The problem occurs on a regular basis – SQL Server runs slow for 180 seconds and then returns to normal. Please check the TechCommunity blog for details.

It’s important to understand the backend mechanics of the encryption process to correctly utilize hardware resources. Otherwise, the encryption can take a lot more time than actually required.

DATABASE BACKUP AND RESTORE

Once you activate the database encryption, the backup is also encrypted. Without a proper configuration of the SQL Server, you won’t even be able to restore an encrypted database. This aspect is important in disaster recovery, but you should also pay attention to it during a system refresh, when you take a copy of the production database and restore it in the development or test landscape. In both cases, you need to ensure you can access the key used to encrypt the database.

To restore an encrypted database, you don’t have to perform the full configuration again. As the backup already contains the Database Encryption Key, you don’t have to create another one. You only have to do the SQL Server configuration and link the database engine with the Key Vault by creating the logins and credentials.
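In other words, once the EKM provider, the credential, the asymmetric key, and the login exist on the target instance, a standard restore is all that’s needed; a sketch with placeholder names and paths:

RESTORE DATABASE <database_name>
FROM DISK = '<backup_path>\<database_name>.bak'
WITH RECOVERY;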

CHANGING THE KEY ENCRYPTION KEY

There are two use cases when you should consider changing the Key Encryption Key stored in the vault. If you specified a validity period, you have to replace the key after the selected period. You should also replace it when you refresh your development and test systems, as you shouldn’t tie production and non-production resources together.

Fortunately, changing the Key Encryption Key is a pretty simple operation. As I explained earlier, the key stored in the vault does not encrypt the database files. It only encrypts the Database Encryption Key. So, you don’t have to run the encryption process again. Instead, you follow the guide to create a new link between SQL Server and the Key Vault, and then you just issue one command to change the Key Encryption Key. As it doesn’t change any data, the whole operation takes just a second. Please refer to the script below, which does the whole operation at once:

USE master
CREATE CREDENTIAL <new_credential_name>
WITH IDENTITY = '<new_keyvault_name>',
SECRET = '<service_principal_and_secret>'
FOR CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM;

ALTER LOGIN [domain\username] ADD CREDENTIAL <new_credential_name>;

CREATE ASYMMETRIC KEY <new_ekm_key_name>
FROM PROVIDER [AzureKeyVault_EKM]
WITH PROVIDER_KEY_NAME = '<new_key_from_keyvault>',
CREATION_DISPOSITION = OPEN_EXISTING;
GO

CREATE LOGIN <new_login_name>
FROM ASYMMETRIC KEY <new_ekm_key_name>;
GO

ALTER LOGIN [domain\username]
DROP CREDENTIAL <new_credential_name>;

ALTER LOGIN <new_login_name>
ADD CREDENTIAL <new_credential_name>;

-- Commands below change the encryption key:
USE [dbname]
ALTER DATABASE ENCRYPTION KEY
ENCRYPTION BY SERVER ASYMMETRIC KEY <new_ekm_key_name>;

-- Once the database is configured to use the newly created objects,
-- you can clean up the previously created credential and login
USE master
ALTER LOGIN [old_login_name] DROP CREDENTIAL [old_credential_name]
GO
DROP LOGIN [old_login_name]
GO
DROP CREDENTIAL [old_credential_name]
I think it’s also worth mentioning that while I always recommend taking a backup of the Key Vault, you can’t use this option to create a copy of the key. The backup file is tied to the Azure subscription, and you won’t be able to restore it in another one. There is also no option to restore it under a different name. Therefore, changing the Key Encryption Key at the SQL level is the only way to change the Key Vault you’re using.

ALWAYS ON AVAILABILITY GROUP

The final topic I would like to cover is how to enable encryption when the target environment is highly available. Always On Availability Group is the technology available in SQL Server to protect your system against unexpected downtime in the Microsoft cloud. It’s based on log replication between two or more SQL Servers that are part of the Availability Group. As it only replicates the database content and not the SQL Server settings, you have to ensure you complete the configuration on all nodes.

An important question is when you should enable the encryption in a highly available environment. As the process is quite long and produces a lot of transaction logs, my recommendation is to break the log shipping for the duration of the encryption; taking such an approach will be much faster. During the migration to Azure, enabling the replication should be the last step.
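A possible way to take the database out of the Availability Group before the encryption (executed on the primary replica; names are placeholders):

ALTER AVAILABILITY GROUP [<availability_group_name>]
REMOVE DATABASE <database_name>;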

Currently the “Add database to Availability Group” wizard in SQL Server Management Studio doesn’t support encrypted databases, so you must enable the replication manually using SQL commands.

Once you complete the configuration on the other node and the Key Encryption Key is available, take the database and log backup of the already encrypted database. Then, add the database to the availability group:

USE master
GO
ALTER AVAILABILITY GROUP "<availability_group_name>" ADD DATABASE <database_name>
GO

You can verify that the database is part of the availability group in the dashboard. Don’t worry about errors and warnings related to the replication status; we’ll sort them out in a second:

Copy the database backup to the other node of the cluster and restore it selecting the “RESTORE WITH NORECOVERY” option:
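The same restore can also be scripted instead of using the Management Studio dialog; a sketch with placeholder paths:

RESTORE DATABASE <database_name>
FROM DISK = '<backup_path>\<database_name>.bak'
WITH NORECOVERY;

RESTORE LOG <database_name>
FROM DISK = '<backup_path>\<database_name>.trn'
WITH NORECOVERY;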

Finally, add the database replica to the same availability group to enable the synchronization:

USE master
GO
ALTER DATABASE <database_name> SET HADR AVAILABILITY GROUP = <availability_group_name>;

If everything went fine, you can refresh the AlwaysOn dashboard. This time you shouldn’t see any errors:

I know working with database encryption can sometimes be intimidating, so I hope this post provides a bit of clarity about the topic. Due to greater awareness and legislation changes, more and more customers are interested in adding an additional security layer to their data. I think that soon, having database encryption in place will be a default data protection mechanism, so why not use the advanced features that the Azure Key Vault offers.