SAP Analytics Cloud – Application Designer – Tab Strip + Timer Widgets:

SAP Analytics Cloud (SAC) has introduced new widgets in the Application Designer, such as Timer, Tab Strip, and Panel. In this blog, we will talk about the tab strip and timer widgets and their use case.

Tab Strip:

As the name implies, Tab Strip is a container widget that is used to view tabbed pages. We can add multiple tabs and place widgets within each tab. It acts as its own separate canvas. By default, the tab strip comes with 2 tabs, but we can add as many as we need from the builder panel and change the text for each tab.

The tab strip has an onSelect scripting event, which is executed whenever the end user changes the tab.

Timer:

The Timer object enables you to start a timer that triggers timing events. By leveraging a timer, you can realize different scenarios, such as:

  • create animations
  • send notifications to end users regularly
  • refresh your analytics application at a certain interval of time (see the sketch after this list)
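
As a quick illustration of the last scenario, here is a minimal sketch of a periodic refresh in a timer's onTimeout script; Chart_1 is an assumed chart name, and Timer_1 is the timer object we create later in this blog:

// Timer_1 – onTimeout
// Reload the chart's data source, then schedule the next refresh in 60 seconds.
Chart_1.getDataSource().refreshData();
Timer_1.start(60);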

In this blog, we will add a tab strip widget and create 4 tabs, adding charts to each tab. We will then build an application whose tabs/pages change automatically when the user clicks the Play button and stop whenever the user wants by clicking the Stop button.

In addition, we will add two helper buttons (right and left) for manually changing the tabs from left to right or from right to left.

Now, let's see how this can be done in SAC:

I hope you have a basic understanding of SAP Analytics Cloud, the application designer, and how applications can be created in it.

A prerequisite for learning SAP Analytics Designer is an understanding of JavaScript.

Step 1: Open SAC, select Create from the top menu, and then choose Analytic Application.

Step 2: Add a tab strip widget to the canvas area of the application and add charts to each of the tabs.

Note: I have used a shape to hide the header of the tab strip.

Step 3: I have added 4 images to the canvas:

  1. Play Button
  2. Stop Button
  3. Right Button
  4. Left Button

Step 4: Create the timer object Timer_1.

Step 5: Now create a script object (here called PageHelper) containing two functions, leftToRight() and rightToLeft(), which will be invoked when the buttons above are pressed and will change the tabs from left to right or from right to left.

function leftToRight() {
    var key = TabStrip_1.getSelectedKey();
    if (key === "Tab_1") {
        TabStrip_1.setSelectedKey("Tab_2");
    } else if (key === "Tab_2") {
        TabStrip_1.setSelectedKey("Tab_3");
    } else if (key === "Tab_3") {
        TabStrip_1.setSelectedKey("Tab_4");
    } else if (key === "Tab_4") {
        TabStrip_1.setSelectedKey("Tab_1");
    }
}

function rightToLeft() {
    var key = TabStrip_1.getSelectedKey();
    if (key === "Tab_4") {
        TabStrip_1.setSelectedKey("Tab_3");
    } else if (key === "Tab_3") {
        TabStrip_1.setSelectedKey("Tab_2");
    } else if (key === "Tab_2") {
        TabStrip_1.setSelectedKey("Tab_1");
    } else if (key === "Tab_1") {
        TabStrip_1.setSelectedKey("Tab_4");
    }
}

Step 6: Add scripts to each of the buttons to perform the actions:

Play Button :

Start the timer object; this will invoke Timer_1's onTimeout script.

Timer_1.start(0);

Stop Button :

Stops the timer.

Timer_1.stop();

Left Button :

Triggers the function leftToRight() to change the tabs manually from left to right.

PageHelper.leftToRight();

Right Button :

Triggers the function rightToLeft() to change the tabs manually from right to left.

PageHelper.rightToLeft();

Now, in the timer's onTimeout script, we advance to the next tab and define the interval at which the tabs change automatically, as shown in the sketch below.
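
Here is a minimal sketch of what this onTimeout script could look like, reusing the script object and functions created above; the 3-second interval is only an example:

// Timer_1 – onTimeout
// Switch to the next tab, then schedule the next switch in 3 seconds.
PageHelper.leftToRight();
Timer_1.start(3);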

Here is the demo :

Conclusion :

With SAP Analytics Cloud Analytics Designer you can create applications for data analysis to meet sophisticated business requirements. It provides a dedicated development environment with advanced scripting capabilities.

Serverless Extensions – Part 5: Bring it all together – Testing the extension end to end

Now that we have configured and connected all the services required for our scenario, let us test out the two additional functionalities we set out to achieve.

Scenario 1

Create a promotion applicable for all the customer groups.

The application sends a message to the ‘promotions’ queue we set up (refer to Part 1).

The message is picked up by the trigger of the ‘Master’ function (refer to Part 4). This function gets the contact numbers of the priority customers from the backend service and sends the WhatsApp message using Open Connectors (refer to Part 1).

 

Since this promotion applies to all customers, it gets published on Twitter as well.

The SCP Functions service records execution logs, which you can access from the dashboard.

 

Scenario 2

Let us try to create a promotion applicable only to the priority customers.

 

In this case, the priority customers receive the promotion information through WhatsApp. The promotion does not get published on Twitter.

I hope this blog series proves helpful to any of you planning to extend a legacy application using the services offered by SAP Cloud Platform.

SAP Data Intelligence 3.0 Walk Through

We recently released SAP Data Intelligence 3.0. My colleague Tobias Koebler presented new features included in this 3.0 release in his blog https://blogs.sap.com/2020/03/20/sap-data-intelligence-development-news-for-3.0/.

In this blog, I will demonstrate several aspects of SAP Data Intelligence 3.0 with a series of 7 videos, covering the Launchpad, connection management, metadata management, pipeline modeling, and the ML Scenario Manager.

SAP Data Intelligence Launchpad

In our first video, we show you SAP Data Intelligence Launchpad. SAP Data Intelligence Launchpad is a browser-based application that provides a single point of access to a range of user-facing applications. The applications displayed in the Launchpad vary for different users and depend on the user logon credentials, which are dependent on user roles or personas, such as data engineer, data scientist, business analyst, IT user, data steward, and so on. The Launchpad enables you to organize, personalize, and launch the SAP Data Intelligence applications including grouping applications, ordering applications, removing applications, creating quick links, etc.

SAP Data Intelligence Connection Management

Before you can do anything with SAP Data Intelligence, SAP Data Intelligence administrators or other business users with the necessary privileges use the SAP Data Intelligence Connection Management application to create and maintain connections. A connection represents an access point to a remote system or a remote data source. The connection management video shows connection management functionalities, including the list of connections available in your SAP Data Intelligence instance and the types of connections that SAP Data Intelligence supports. You can create a new connection, delete an existing connection, or check the status of a connection.

After your administrator(s) have created connections and given you access to the SAP Data Intelligence instance and to the connections, you can perform a series of operations; the order will differ based on your persona.

SAP Data Intelligence Metadata Explorer

As a data steward or a business analyst, you can use Metadata Explorer to perform the metadata management functionalities available in SAP Data Intelligence. Metadata Explorer supports three main use cases.

The first use case is data discovery and governance. It is not easy to find the dataset that you need for your purposes. Even after you find a potential dataset, you may not trust the data itself for several reasons. In many cases, you will need to evaluate the data quality based on the source of the data, you may need feedback from other users, or the vocabulary used to describe the attributes of the data may not conform to your understanding. With a metadata catalog, along with a set of services including a business glossary, search, fact sheets, lineage, rating, and tagging, we support data discovery and governance in SAP Data Intelligence.

The second use case is data quality monitoring. Easy-to-use business rules and scorecards allow users to monitor data quality continuously.

The final use case is self-service data preparation, where business users and business analysts can leverage the data pipelining capability in an easy-to-use spreadsheet UI to enhance and enrich datasets.

In the third video of this series, we navigate all of the functionalities to perform the above three use cases in Metadata Explorer.

Data Preparation within SAP Data Intelligence Metadata Explorer

We demonstrate data discovery and data preparation in this video: an easy-to-use search bar helps you find the dataset, and we filter the search results based on additional knowledge, acquire data from a structured source, and enhance the dataset. Then we further enrich the data with another dataset from a Hadoop data lake and perform an inner join without any coding.

SAP Data Intelligence Modeler

In our 5th video, we show how a data engineer can create a data pipeline that replicates data from a non-SAP SaaS application (here Salesforce, via SAP Open Connectors) to SAP HANA and to cloud storage, and also how to push data processed by data preparation to SAP Analytics Cloud for visualization.

Data Visualization with SAP Analytics Cloud

In our 6th video, we demonstrate how easy it is for a business user or a data engineer to use SAP Analytics Cloud to visualize data processed with SAP Data Intelligence.

ML Scenario Manager

We conclude the series with the 7th video, which shows how to navigate the ML Scenario Manager available in SAP Data Intelligence.

I hope you have a chance to discover the SAP Data Intelligence product with these 7 videos.

SAP BPC product suite extending maintenance

Following the announced maintenance extensions of the SAP Business Suite 7 and SAP S/4HANA and of SAP BW/4HANA, the maintenance of the respective SAP BPC products will be adapted as follows:

  • The maintenance of SAP BPC, version for NetWeaver 10.1 will continue until end of 2027, aligned with SAP BW 7.5
  • The maintenance of SAP BPC, add-on for S/4HANA will continue until end of 2027, aligned with SAP BPC, version for NetWeaver
  • The maintenance of SAP BPC, version for BW/4HANA 11.1 will continue until end of 2027, aligned with SAP BPC, version for NetWeaver

The maintenance of SAP BPC, version for Microsoft 10.1 will continue until end of 2024, following explicit customer requests

The end of mainstream maintenance dates for earlier SAP BPC product versions remain unchanged. Details can be found here: https://blogs.sap.com/2019/11/06/extend-sap-bpc-with-sap-analytics-cloud/

Since 2017, SAP BPC, version for NetWeaver, and SAP BPC, version for Microsoft, have been in maintenance mode and all feature investments in SAP BPC are being made exclusively for the version for BW/4HANA.

In 2015 we introduced SAP Analytics Cloud with planning capabilities designed for the cloud. SAP Analytics Cloud is rated a visionary (Gartner) and a leader (BARC), with strongly growing customer adoption. Future investments in planning will primarily go into SAP Analytics Cloud, further enhancing its rich feature set and its integration into the SAP product portfolio. Hence we strongly recommend that customers evaluate SAP Analytics Cloud for future investments in business planning. For the extension and transition of SAP BPC customers, we have a dedicated, tailored EXTEND program in place providing further facts and guidelines.

This plan confirms SAP’s continued commitment to our established on-premise planning products while guiding the future in the cloud, including a transition path.

Of Rockstars and Roadies, or why rock stars need a stage and “Trusted Data” is the basic requirement for every intelligent enterprise

Introduction:

Data Scientists are the obvious Rockstars of the digital transformation and are in great demand at the moment. Every Rockstar needs a rock-solid stage that he trusts and knows has been built out of high-quality, well-fitting material, and every Data Scientist needs Trusted Data to do his job.

Nobody would think of letting a Rockstar build his stage on his own. Curiously, this is exactly what happens in many companies when it comes to the provisioning of data. The Rockstar would need far too much time and would not be able to show his artistic talents. His wings would be clipped, so to speak. Similarly, nobody hires data scientists to prepare data, but to create and train AI models.

For good reason, there are the “Roadies”, the silent heroes of every rock concert. Transferred to the business world, they are the counterparts of data management specialists. It is their very own task to provide quality-assured data in a manner that is compatible with the legal framework.

Data-driven organizations are 23 times more likely to acquire customers, 6 times as likely to retain those customers, and 19 times as likely to be profitable as a result (Source: McKinsey & Company).

The reason for this is their already digitized business processes. This can only be achieved on the basis of Trusted Data, which enables better decisions, improved and automated business processes, increased productivity, and data monetization.

But why are so many companies struggling to create the data foundation for business agility that is so essential right now?

  1. The basic step of every data management activity is to identify the right and relevant data for the respective business goal.
  2. Isolated data from different data sources has to be connected and, where necessary, transformed in order to be able to establish correct relations.
  3. Data quality must be managed over the entire data life cycle, which requires continuous measurement.
  4. The Chief Data Officer has to define data governance guidelines in order to prevent data from being copied and moved in an uncontrolled way for individual purposes.
  5. No matter how great a digital process is, it will never make it to production if data privacy and protection are at risk. Personal data must be deleted if there is no longer a legal basis for keeping it and may have to be obfuscated depending on the respective use case.
  6. A clear structure of responsibility and defined data ownership are an essential prerequisite for treating data as a strategic asset. As with a real supply chain, the data supply chain must be managed end to end accordingly.
  7. Data monetization comprises identifying and marketing data or data-based products to generate monetary value.

Learn more in my SAP webinar about how to orchestrate, advocate, and accelerate with SAP Information Management:

Trustworthy data – elementary requirement for the intelligent company

  • How to use your data to make your company fit for all challenges
  • How to provide users with the right data in real time in the context of the task at hand
  • How to use trustworthy data as an innovation catalyst
  • How to ensure compliance with external data protection guidelines and internal governance guidelines

I look forward to seeing you … May 13, 2020, 10 a.m. CEST

Accompanying Sheet in SAP S/4HANA Cloud

The accompanying sheet functionality is part of the DMEE and DMEEX transactions. With the SAP S/4HANA Cloud 1908 release, the accompanying sheet functionality also became available in the Fiori app Map Payment Format Data.

The Accompanying Sheet section can be found in the detail of a format mapping under the ‘Accompanying Sheet’ tab. In this section, you can determine whether the data for the accompanying sheet is created during payment file generation and define the data with which the accompanying sheet is created.

In the Accompanying Sheet Type field, you can select one of the following values in the dropdown list:

  • No Accompanying Sheet
  • Simplified Accompanying Sheet
  • Accompanying Sheet with Subtotals

No Accompanying Sheet

No data for the accompanying sheet is collected when a file is generated.

Simplified Accompanying Sheet

An accompanying sheet is created during file generation without subtotals data. If you select this value, you can specify a name in the attributes of an individual node in the format mapping tree, under which the value of the node will appear in the generated file. Note that this reference works only if the node occurs only once in the entire file. For this reason, it is only possible to define such names for nodes that have format level 1.

Accompanying Sheet with Subtotals

An accompanying sheet is created during file generation with subtotals data. If you select this value, besides including the values of individual format mapping nodes (as described above), the system also performs subtotal calculations and prepares the result for the accompanying sheet.

When you select this value, the Key Fields for Accompanying Sheet Subtotals section appears, where you can select key fields for subtotal calculation. If you mark a field as a key field, during the generation of the accompanying sheet, the system calculates subtotals for the different values of this field and creates separate lines for them in the output. If you select several fields to be used as key fields, the system calculates subtotals for each different combination of their values and creates separate lines for them.

This solution is used e.g. in Brazil. You can find general documentation about the process of defining the accompanying sheet here.

See SAP Help Portal for more information about Map Payment Format Data.

To find out more about functionalities for payment formats, visit our Payment Formats blog. We have also started a new blog about functionalities for Treasury Correspondence Formats.

Transpose Row-set into Column Name – Dynamic Table Creation in SAP HANA 2.0 SPS04

Introduction

During the BI implementation of a project, there can be a reporting requirement where row data needs to be transposed into column names. In a typical BI landscape, data is ingested from sources, data cleansing/massaging is performed, and the data is then loaded into a table that feeds the reporting layer.

Traditionally in reporting, we transpose row values into column values or vice versa, but here rows need to be transposed into column names (not column values). For better understanding, let me demonstrate this requirement with a use case where machinery data is loaded into a table (Figure 1). In this data, for any ChassisNo value, EngineCapacity and Horsepower have 3 distinct value sets: 1100/2200/3500 and 74/163/270, respectively.

(Figure 1 shows the different variants per chassis number)

Let’s first list the facts about the above data:

  • For any ChassisNo, each combination of EngineCapacity and HorsePower represents a different variant (for example, CIN100001 has 3 different variants based on the combinations of EngineCapacity and Horsepower)
  • For every chassis, the same 3 variants exist (and this may grow to N variants in the future)
  • The columns RPM, BodyType, and Mileage represent attributes across the different variants

Each chassis has 3 variants, so for any chassis we see 3 rows; in the future, this data is expected to grow as new variants are launched across all chassis.

Requirement

The target representation is shown in Figure 2 below. Here, the data for each chassis is represented in a single row, and the columns RPM/BodyType/Mileage are merged with the transposed variant values.

(Figure 2 – Target requirement)

The transformation can be understood as follows:

  • The merged values of the columns EngineCapacity and HorsePower (1100_74, 2200_163, and 3500_270) are transposed and appended to the column names, producing columns from RPM_1100_74 to Mileage_3500_270 (listed in full below)
  • The attributes RPM/BodyType/Mileage are displayed across these target columns
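
Concretely, combining the three attributes with the three variants gives the following target columns:

ChassisNo,
RPM_1100_74, RPM_2200_163, RPM_3500_270,
BodyType_1100_74, BodyType_2200_163, BodyType_3500_270,
Mileage_1100_74, Mileage_2200_163, Mileage_3500_270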

Solution

As we can see, the target columns are generated primarily from the variant data, and this data is expected to grow in the future as new variants are added. To handle this dynamic data, we opt for a SQLScript write procedure that creates the structure of the target table, where the column names are built dynamically based on the number of variants and their values.

PSEUDO CODE

Step 1

Extract the column names and their data types (RPM, BodyType, and Mileage) from the base table into a variable using the system view TABLE_COLUMNS. For example, RPM and NVARCHAR(50).

Step 2

Extract the variant data, i.e. the (EngineCapacity and HorsePower) values for any one ChassisNo, which is used to build part of the target column names. For example, 1100_74.

Step 3

Loop over all columns extracted in Step 1 and store each column name concatenated with ‘_’ (like RPM_) and its data type concatenated with ‘,’ (like NVARCHAR(50),) in the respective variables.

Step 4

In an inner loop over the variant data extracted in Step 2, perform the following concatenations:

Concat(Column, VariantData), which gives RPM_1100_74

Concat(i_weight_tmp, DATATYPE), which gives RPM_1100_74 NVARCHAR(50),

The inner loop iterates over all variants for the column RPM extracted in Step 1; control then passes back to the outer loop, and similar iterations continue for the other columns, BodyType and Mileage.

Step 5

Hold this dynamically generated string (comprising the target column names) in a variable and perform the operations below.

v_weight_f := CONCAT('ChassisNo NVARCHAR(50),', <variable holding the string of target columns>)

v_weight := 'CREATE table TargetChasis (' || :v_weight_f || ')'

EXEC :v_weight

Using the EXEC statement (dynamic SQL), the table can be created on the fly with all of its metadata defined according to the data in the base table.

The detailed code used for creating the table dynamically can be seen in Figure 3:

(Figure 3. Sample code for creating table dynamically)
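
In case the figure is not visible here, below is a minimal SQLScript sketch of the same idea. The base table name (CHASSIS_DATA), the schema handling, and the fixed NVARCHAR(50) data type are simplifying assumptions for illustration only:

CREATE PROCEDURE CREATE_TARGET_CHASSIS()
LANGUAGE SQLSCRIPT SQL SECURITY INVOKER AS
BEGIN
    -- scalar variables must be declared at the top of the block
    DECLARE v_cols NVARCHAR(5000) := '';
    DECLARE v_ddl  NVARCHAR(5000);
    DECLARE n_attr INT;
    DECLARE n_var  INT;

    -- Step 1: attribute columns of the base table (assumed name: CHASSIS_DATA;
    -- column names assumed upper case in the catalog)
    attrs = SELECT COLUMN_NAME FROM SYS.TABLE_COLUMNS
             WHERE SCHEMA_NAME = CURRENT_SCHEMA
               AND TABLE_NAME  = 'CHASSIS_DATA'
               AND COLUMN_NAME IN ('RPM', 'BODYTYPE', 'MILEAGE');

    -- Step 2: distinct variant combinations (EngineCapacity, HorsePower)
    variants = SELECT DISTINCT TO_NVARCHAR(ENGINECAPACITY) AS EC,
                               TO_NVARCHAR(HORSEPOWER)     AS HP
                 FROM CHASSIS_DATA;

    n_attr := RECORD_COUNT(:attrs);
    n_var  := RECORD_COUNT(:variants);

    -- Steps 3 and 4: build '<attribute>_<EngineCapacity>_<HorsePower> NVARCHAR(50), '
    FOR i IN 1 .. :n_attr DO
        FOR j IN 1 .. :n_var DO
            v_cols := :v_cols || :attrs.COLUMN_NAME[:i] || '_' ||
                      :variants.EC[:j] || '_' || :variants.HP[:j] ||
                      ' NVARCHAR(50), ';
        END FOR;
    END FOR;

    -- Step 5: assemble the dynamic CREATE TABLE statement and execute it
    v_ddl := 'CREATE COLUMN TABLE TargetChasis (ChassisNo NVARCHAR(50), ' ||
             LEFT(:v_cols, LENGTH(:v_cols) - 2) || ')';
    EXEC :v_ddl;
END;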

Upon execution of the above procedure, the column table is created successfully.

Note: According to the existing system limitation mentioned in SAP Note 2154870, a column-store table in the SAP HANA database can accommodate a maximum of 64,000 columns, versus 1,000 for a row-store table, so this dynamic creation of the table is valid as long as the column count stays within this range.

At this point, the dynamically built table is ready and can be filled with data from the RPM, BodyType, and Mileage columns.

Conclusion

For similar reporting requirements where row data needs to be transposed into column names, this approach is a low-complexity solution that dynamically builds the table (using a HANA SQLScript procedure) based on the data in the base table. The data can be loaded into this table dynamically by another procedure and exposed directly to the reporting layer.

Sendgrid SMTP Relay Configuration Using Postfix in SAP S/4HANA

In this blog post, I explain how to configure Sendgrid as an SMTP relay for SAP S/4HANA using the Postfix mail package on Red Hat Linux.

Environment Details :

Linux Version : Red Hat Enterprise Linux Server 7.6

SAP S/4HANA 1909 FPS 00

HANA 2.0 SP04

Install Postfix in Redhat Linux

Check Postfix is installed

# rpm -qa | grep postfix

Install Postfix

# yum install -y postfix

# systemctl start postfix

# systemctl enable postfix

# systemctl status postfix

Postfix status should be active (running)

Steps to be carried out on the Linux server:

Take a backup of the following files in /etc/postfix:

  1. main.cf
  2. sasl_passwd (If already available)

# cd /etc/postfix

# cp main.cf main_bkp.cf

# cp sasl_passwd sasl_passwd_bkp

Add the lines below to the main.cf file:

# cd /etc/postfix

# vi main.cf

smtp_sasl_auth_enable = yes

smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd

smtp_sasl_security_options = noanonymous

smtp_sasl_tls_security_options = noanonymous

smtp_tls_security_level = encrypt

header_size_limit = 4096000

relayhost = [smtp.sendgrid.net]:587

— Save —

Edit sasl_passwd

Note: You should have sendgrid account details to proceed further.

# vi sasl_passwd

[smtp.sendgrid.net]:587 sendgridUsername:SendgridPassword

Note: The Sendgrid login account details are the username & password.

# sudo chmod 600 /etc/postfix/sasl_passwd

# sudo postmap /etc/postfix/sasl_passwd

# sudo systemctl restart postfix

Check postfix status,

# sudo systemctl status postfix

Test Using Unix Command:

Create file

# vi /tmp/test.txt

— Type any content in the text file —

Example :

SMTP Mail Configuration Using Sendgrid Completed Successfully !

— Save —

# mail -s "Sendgrid Configuration Mail" example@gmail.com < /tmp/test.txt

You should receive mail as below,

You can also test sending mail using the telnet command, as sketched below.
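
Here is a minimal sketch of such a telnet test, assuming Postfix is listening on localhost port 25; the sender and recipient addresses are placeholders:

# telnet localhost 25
EHLO localhost
MAIL FROM:<noreply@example.com>
RCPT TO:<example@gmail.com>
DATA
Test mail routed through the Sendgrid relay.
.
QUIT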

SMTP Configuration in SAP:

  1. Create SMTP_USER (Service User) in all working clients with the profiles SAP_ALL & S_A.SCON.
  2. Create JOBUSER (Service User) in all working clients with the profile SAP_ALL (optional step).

SCOT Configurations: (In all working clients)

1. SCOT

Settings -> Default Domain

Note: If the default domain is SAP.COM and the user name is TESTUSER, the mail address would be TESTUSER@SAP.COM. Decide on your default domain accordingly.

2. Create SMTP Node

Click on “Node in use” -> Save

3. Click on Settings “SMTP Configuration”

Type Sendgrid Login details: (Username & Password)

Save

4. Click on Set “Internet”

Save

Once saved, we should see the SCOT screen as below.

5. Click on Settings in the left pane and modify as shown in the screenshot below

Save

6. Click on Settings “Inbound Messages”, modify as below

7. Create a background job to trigger mail sending once every 10 minutes (all clients):

JobName :  Z_SAPCONNECT_<Client> (Any job name)

Background User : JOBUSER

Release the job.

8. T-Code : SICF

Double click on SAPCONNECT

Under the Logon Data tab, enter the user SMTP_USER and its password.

 

9. Activate the SAPCONNECT service from the SICF T-Code: right-click on the service & activate.

How to Test?

T-Code : SCOT

Check SOST (after 10 minutes). Once the Z_SAPCONNECT_<Client> job completes, the status should turn from waiting to sent.

I hope this blog post is useful; I have described, to the best of my knowledge, how to configure Sendgrid as an SMTP relay service for SAP S/4HANA.

Comments & Suggestions are always welcome 🙂

Keep Supporting !

Regards,

Ragav

Solution for Analysis for Office Reports not appearing in Folders created in SAP BPC Optimized for S/4 HANA 1610

Introduction:

In SAP BPC optimized for S/4HANA 1610, one of our customers asked us to display the Analysis for Office reports in specific folders. When we tried doing so, we were unable to attach the reports to folders due to certain issues. In this blog post, I describe the issues that we faced and the solution for them. We used Analysis for Office version 2.8.200.93367.

Main Problem:

We tried creating folders in BPC by referring to the blog post below, but we were still unable to view the reports in their respective folders. We carried out a few more steps in addition to that blog post in order to succeed in creating the folders in Analysis for Office in BPC.

https://blogs.sap.com/2018/01/23/analysis-office-role-folder-in-sap-netweaver-platform/

After following the steps in the above blog post, our Analysis for Office folders appeared as shown below, without containing any reports.

In the above screenshot you can see that the report did not appear in the folder.

Solution:

Follow the steps below to solve this issue:

1. First, make sure that the report you are trying to assign to a particular folder is collected in a transport request.

2. Open the report you want to add by using the Search tab, as shown below:

3. Click on File and select Analysis. Here, select Save Workbook and again select Save Workbook from the list that appears, as shown below.

4. Select the particular folder to which you need to assign the report and click on Save.

5. Again click on File and select Analysis. Click on Open Workbook and then select ‘Open a Workbook from the Business Warehouse Platform’. In the dialog box that appears, go to the Role tab and place your cursor on the folder to which the new report was added. Click on Refresh as shown in the screenshot below.

6. The report that you have added now appears in the selected folder.

Thus, our issue has been resolved.

Note: When opening AFO reports in BPC under the Role tab, a folder will appear only if a report is saved inside that specific folder.


Get to Know the New Spaces Concept for SAP Fiori Launchpad

Within the scope of the recent SAP Fiori 3 innovations, there is an important innovation for the SAP Fiori launchpad: the spaces concept. With this new concept, we introduced the possibility of defining the page layout for whole user groups. This allows users to consume the content of the launchpad in an easier and more structured way than before: no more overloaded home pages, in which users got lost because of too much content, and no more confusion because of a missing structure.

New Layout Elements

There are new layout elements to achieve this: spaces and pages.

  • Space: defines the navigation structure and provides the overall context for the displayed content
  • Page: contains apps for a business role, grouped in sections to guide users through work contexts

This helps increase flexibility in content structuring by providing several role-based pages to users instead of just one home page. For every role, one space is made available to users, where they can find a page with their most important entry points and information. Those spaces can be found in the menu bar at the top of the launchpad. Tiles in pages can be grouped into sections.

For the time being, spaces represent an alternative layout to the home page. This means that the home page is still available and used by default, and spaces can be used instead of it. At some point in time, spaces will completely replace the home page.

Where Do I Find It?

There is a new setting called “Spaces” in the user actions menu settings. If your administrator has activated the usage of spaces, you can access it by clicking the user actions menu icon on the right of the shell bar. When you then select “Settings”, you will find the new setting for enabling spaces. Just click the “Use Spaces:” switch so that it shows the green tick mark. And of course, do not forget to save your changes.

Now the launchpad will display the spaces and pages that your administrator has created upfront for your business roles, instead of groups.

You can find additional information, including a guided tour where you can try it out yourself, in the documentation.

Personalization

You may now ask yourself whether you are still able to adjust the layout and content of the pages so that they best suit your needs. The answer is: yes! You still have most of the personalization options that you already know from the home page. The App Finder even allows you to choose to which of your spaces you want to add the desired tile.

In the edit mode of the launchpad, you will not only find the personalization options that you already know, but you will also see that, as part of the SAP Fiori 3 enhancements, the overall design of the edit mode has been slightly changed for a better user experience.

With this first version of spaces in the launchpad, some personalization options you are used to from the classic home page are not available yet. Please have a look at the personalization documentation for further information.

Availability

The new spaces are currently available with SAP S/4HANA Cloud 2005. In this version, a space consists of one page. We plan to make spaces available in SAP S/4HANA 2020 (on-premise) and on SAP Cloud Platform later this year as well. In this context, we will offer multiple pages per space in the launchpad, which will also be part of SAP S/4HANA Cloud 2008. This will give you even more ways to organize your daily work within the overall context of a business role.

Further Reading

Soon you will find a blog post here about how to manage spaces and pages for SAP Fiori launchpad from an administration perspective.