Features in Project and Task Management in SAP Cloud ALM

This blog post gives you an overview of the capabilities and features offered by Project and Task Management in SAP Cloud ALM.

The features are clustered into the following areas. The What's New section will be updated regularly so you can easily find the latest additions:

Project Setup

Consumption of Activate Content

Task Generation

Task Attributes and Administration

Task Experience

What’s new

Topic: Project Setup
  • Create Project: You can create a project from the Project tile in the Fiori launchpad or from the Overview page.
  • Create multiple projects: You can create as many projects as you need from the Project tile or the Overview page.
  • Project timeboxing: You can assign dates to project phases; SAP Activate phases are shown as milestones.
  • Custom milestones: You can add custom milestones to a project and use them to assign due dates to tasks.
  • Integrate with Central Business Configuration: Once the setup is complete, you can select an integration scenario during project setup and link the project.
  • Project history: You can view the project history.
  • Team assignment: You can assign people to the available roles.
Topic: Consumption of SAP Activate Content
  • Selection of project template: You can select from a limited set of templates provided by the SAP Activate methodology.
  • Switching of project template: The template selected during project creation can be switched at a later stage. Please note that this deletes the template tasks provided by the SAP Activate methodology, so use it only if required.
  • Support for workstreams: A set of workstreams is provided by the SAP Activate methodology and is used for reporting and filtering. The same set of workstreams is also available when you create requirements in SAP Cloud ALM. Please note that the same task can have more than one workstream.
  • Support for roles: A set of roles is available for team assignment.
  • Content update: The content is updated regularly. The latest changes can be found in the history.
Topic: Task Generation
  • Tasks from SAP Cloud ALM setup: Even before you select a project template, SAP Cloud ALM offers some setup tasks to get you started.
  • Tasks from SAP Activate: After you select a template, the SAP Activate methodology content for that template is converted into an actionable task list.
  • Tasks from requirements: You can generate tasks from the requirements captured in SAP Cloud ALM to track them to completion.
  • Tasks created manually: You can supplement the task list by creating tasks manually.
  • Tasks created via mass upload: You can perform a mass upload of tasks and deliverables.
  • Tasks created via scoping: When you create scopes, a set of tasks is generated for each newly created scope.
Topic: Task Attributes and Administration
  • Task assignment: You can assign tasks to the available team members.
  • Task timeboxing: Due dates of tasks can be set to different phases and milestones.
  • Set task status: The task status can be set to Open, In Progress, Blocked, or Done.
  • Links as attachments: You can add links to tasks as attachments.
  • Task history: You can view the task history to see the latest changes.
  • Task comments: You can enter comments in a task. You can also edit or delete your own comments.
  • Support for deliverables: A deliverable is a group of tasks. The initial list of deliverables is provided by the template used in the project. You can add your own deliverables using the mass upload.
  • Last changed date: The task list shows the last changed date, which can be used in filtering to access recently updated tasks.
Topic: Task Experience
  • View upcoming tasks: A list of upcoming tasks assigned to you, or to roles you are assigned to, is available on the Overview page.
  • Task navigation: You can navigate from the upcoming tasks on the Overview page to the task list. In a task list, clicking a task takes you straight to the task details.
  • View task predecessors: Some tasks have predecessors, shown by a special icon in the task list.
  • Sort tasks: You can sort tasks by different criteria.
  • Group tasks: You can group tasks by different criteria.
  • Adapt task filters: You can extend the provided filter set by clicking the Adapt Filter icon.
  • Adapt result columns: You can add more attributes to the task list results via the settings icon and show or hide certain attributes in the results.
  • Switch project: You can switch the project easily using the dropdown. Please note that a project switch triggers a global context switch, and all applications then show the newly selected project.
  • Mass editing of tasks: You can select multiple tasks in the task list and click Mass Edit to update attributes such as status, assignments, and due dates.
  • View task progress: You can set the task progress as you proceed. Task completion directly affects the project progress shown on the Overview page.
Topic: What's New
  • Project timeboxing: You can assign dates to project phases or milestones. The label Phase in the task list is replaced by the label Timebox.
  • Custom milestones: You can add custom milestones to a project and use them to assign due dates to tasks.
  • Links as attachments: You can add links to tasks as attachments.
  • Last changed date: The task list shows the last changed date, which can be used in filtering to access recently updated tasks.
  • Blocked status: You can now set the status of a task to Blocked.

Did you find this list useful? Do you want to request a feature? Are you looking for more information on an existing feature?

Let's start a dialogue: feel free to leave a comment or give feedback.

You can visit the blog post below, which is the central place for Project and Task Management in SAP Cloud ALM:

Project and Task Management capabilities-Master List


SAP S/4HANA for advanced compliance reporting, Brazil option: Where should I start with the EFD-Reinf report?

If you follow our posts here on the SAP Community, you have seen that in recent months we have been talking a lot about the EFD-Reinf report in SAP S/4HANA for advanced compliance reporting, Brazil option – how to make initial settings, generate and send the events to legal authorities, among others.

We know that we have covered many topics. So whether you have followed all the posts or this is your first contact with the report and our product, we have compiled all the topics here in sequential order, which will help you understand where to start.

Let’s go?

What is SAP S/4HANA for advanced compliance reporting, Brazil option?

To meet Brazilian tax obligations, SAP offers three solutions: the SAP Tax Declaration Framework for Brazil and the localized version of the SAP S/4HANA for advanced compliance reporting framework, in cloud and on-premise versions.

You can find out more about our solution in SAP S/4HANA ACR, Brazil option: the global solution for fiscal reports is already localized for Brazil.

Now that you already know what our product is, let’s talk about how you can report EFD-Reinf on it.

File sequence of the EFD-Reinf report as of layout 1.4.0

Below we list all content related to the report in its procedural order, from the initial settings to the submission to the legal authorities.

What settings should I make?

To use the EFD-Reinf report, you need to make some initial settings that ensure the report works and sends data to the government. We cover the necessary report settings in How to perform the parameterization of the EFD-Reinf report.

In addition to these steps, it is also necessary to configure the sending of data through the SAP Localization Hub, advanced compliance reporting service for the Neo Environment. These settings are covered in the post How to configure the communication with legal authorities to send the EFD-Reinf events.

How does event generation work?

Once the necessary settings are in place, we can move on to event generation.

The table events are the first to be generated and sent to the Brazilian Federal Revenue Service, and we addressed this subject in How to generate the EFD-Reinf table file events.

In addition to the table events, you also need to generate the periodic events, a process that has been described in How to generate the EFD-Reinf periodic events.

How does event sending work?

When your event has been generated, it is time to send it to the legal authorities. If you haven’t yet configured SAP Localization Hub, advanced compliance reporting service for the Neo Environment, this is the moment.

Once generation and configuration are done, you just need to submit the events as discussed in How to send EFD-Reinf events to the legal authorities.

How can I consult the consolidated data?

After sending the events, it’s necessary to perform the closing of a period, a topic we’ve mentioned in How to generate the closing events of the EFD-Reinf report.

After the closing, you may also want to consult the consolidated data. To do this, simply follow the steps covered in How to consult the data consolidation of an EFD-Reinf period.

What if it’s necessary to correct an event?

This process is simple and is divided into three steps: reopening, correction and closing.

To reopen a period whose events have already been sent, check How to reopen a period of the EFD-Reinf report. To correct events, we explain the process in How to perform the correction of EFD-Reinf periodic events; for the closing, just follow the steps we've already mentioned above.

What if an event needs to be deleted?

For deletion, we wrote How to generate EFD-Reinf report exclusion events, which covers all the necessary steps.

How can I migrate from a legacy solution to the SAP S/4HANA for advanced compliance reporting, Brazil option?

If you do not already use SAP S/4HANA for advanced compliance reporting, Brazil option to report your company’s data, and want to know how data migration from legacy systems works, we explain the entire process in How to load EFD-Reinf legacy events.

We hope we have answered your questions about the framework, its localization for Brazil, and the EFD-Reinf report.

For more information about the report, please refer to our documentation on the SAP Help Portal:

We would love your feedback. Leave your comment below if you have any questions or suggestions for a next post. In addition to the comments, you can contact us through the Customer Influence platform. There you can propose ideas to improve our product, vote on other ideas already launched, and follow up on ideas in implementation. Also remember to follow the Advanced Compliance Reporting tag here in the SAP Community to stay tuned to the latest news about SAP S/4HANA for advanced compliance reporting, Brazil option.

See you next time!

Julia Forneck

ACR Development Team

#SAPGoGlobal #SAPLocalization #ACR

SAP Migration to Hyperscaler – Part 1

Introduction

Treating "Hyperscaler" as a single bucket would produce content that is diverse at best and miscellaneous at worst. This blog post therefore focuses mainly on customers planning to move an existing on-premise environment to a Hyperscaler as IaaS, and looks at the points to consider when planning such a migration from an infrastructure-oriented, technical perspective. Customers planning to adopt a Hyperscaler as IaaS for a new implementation will also find parts of it useful, and readers who are already experienced with Hyperscaler migrations may find it helpful as a recap.

Note that the target database is basically assumed to be SAP HANA.

This blog post is planned as a three-part series:

Part 1: Overview of migration to Hyperscalers and design approach (1)

Part 2: Design approach (2)

Part 3: Migration approach

Overview of migration to Hyperscalers

Hyperscalers already form the foundation of a wide variety of services, both private and public. For SAP systems, Hyperscaler IaaS is an architecture scenario that lets you bring a traditional on-premise application to the public cloud almost as-is, which is why we position it as a solution that makes the move from on-premise to cloud comparatively easy.

In the platform design services we have recently delivered through Premium Engagement contracts, the vast majority of customers are planning and implementing on Hyperscaler IaaS, which shows once again how rapidly Hyperscaler adoption is growing.

Compared with other cloud services, the division of responsibility between the customer and the service provider lies at the operating system layer.

The IaaS providers supported for SAP systems are listed in SAP Note 1380654. In addition to the well-known Hyperscalers such as Azure, AWS, and GCP, the list includes providers that are still less familiar in Japan, amounting to nine providers at the time of writing this blog post.

The technical criteria for adopting IaaS as the infrastructure for SAP systems include the following:

  • Location strategy: Does the IaaS provider offer services in the desired region, and does keeping confidential data in that region comply with the company's compliance requirements?
  • SLA, KPI, and HA/DR requirements: Do the SLAs, downtime, and performance figures offered by the IaaS provider meet the business requirements?
  • Hybrid or cloud first: Is there a clear policy on whether to pursue a hybrid model mixing on-premise and cloud, or a cloud-first model that moves everything on-premise to IaaS (or to PaaS/SaaS alternatives)? (This is an important factor when drawing up the migration roadmap.)
  • Security requirements: Do the security mechanisms the IaaS provider can offer, such as network and encryption technologies, meet the company's requirements?
  • SAP readiness: Is the SAP application certification status, including OS/DB types and versions, acceptable, and are the VM specifications adequate?
  • Build or buy services: Using IaaS requires purchasing and combining services provided by the Hyperscaler, such as DNS, load balancing, and NFS; does this conflict with the company's multi-vendor or multi-cloud strategy?

When defining the cloud migration roadmap, we define five states as a step-by-step approach, including the current state, as shown in the figure below. Depending on whether the policy is the hybrid approach mentioned above or moving everything to the cloud, not only the final state but also the intermediate states change significantly.

In other words, for each step, intermediate as well as final, you need to plan not only the technical architecture of the systems in question but also what is technically feasible for the landscape as a whole: system-to-system integration, change management, security, operations, and so on.

Design approach (1)

SAP defines the design approach and building blocks for building SAP systems on Hyperscalers, from a technical architecture perspective, as shown in the figure below. As mentioned earlier, operational elements such as system-to-system integration, change management, security, and monitoring are also part of the overall picture, but this blog post does not focus on them; we will leave them for another occasion.

A commonly cited benefit of Hyperscalers is the ability to freely scale resources up and down. SAP systems can certainly benefit from this as well; however, when a system is used as a mission-critical core system, being able to operate it stably within a more or less fixed range can, on balance, be the greater advantage.

SAP works with each Hyperscaler partner with the goal of ensuring appropriate performance and scalability for SAP use cases. For this reason, sizing continues to be an important part of the design.

The sizing elements themselves, memory, CPU, disk size, disk I/O, and network, are no different from on-premise. Since there are many possible patterns, we assume SAP HANA as the database. On every Hyperscaler, for certified machines you first select a VM instance from a predefined set of memory and CPU configurations.

These days, custom machine types make it possible to customize the number of vCPUs, but you must take into account the constraints agreed between SAP and each Hyperscaler, such as the memory-to-vCPU ratio. In addition, a configuration that is not predefined raises questions about SAP support, so we ask you to open a support ticket for confirmation. This blog post is based on the predefined sets.

One approach to memory sizing is to start small, without allowing much headroom for growth, and change the VM instance type over time. Another is to reserve a VM of fixed size for a fixed term (Reserved) and receive a substantial price discount. The latter is a benefit unique to Hyperscalers and well worth considering, but be careful: if your estimate is far off, cancellation fees and the like can in some cases make it counterproductive.
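A rough way to sanity-check the reserved-versus-on-demand decision is simple break-even arithmetic. The sketch below is illustrative only; the rates, the discount, and the cancellation fee are made-up inputs for the example, not actual Hyperscaler prices:

```java
// Illustrative only: break-even check between on-demand and reserved pricing.
// All numeric inputs here are assumptions, not real Hyperscaler rates.
public class ReservationMath {
    // Cost of running on demand for the given number of hours.
    public static double onDemandCost(double hourlyRate, int hours) {
        return hourlyRate * hours;
    }

    // Reserved capacity: discounted rate, but the full term is paid even if
    // the VM turns out to be oversized (plus a possible cancellation fee).
    public static double reservedCost(double hourlyRate, double discount,
                                      int termHours, double cancellationFee) {
        return hourlyRate * (1.0 - discount) * termHours + cancellationFee;
    }

    public static void main(String[] args) {
        double od = onDemandCost(2.0, 8760);            // one year on demand
        double rv = reservedCost(2.0, 0.40, 8760, 0.0); // one year reserved, 40% off
        System.out.println("on-demand=" + od + " reserved=" + rv);
    }
}
```

With these assumed numbers, the full-term reservation is clearly cheaper; a large cancellation fee or an oversized reservation can tip the balance the other way, which is exactly the risk noted above.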

Data tiering options remain useful on Hyperscalers as well, so combining them is an important point from the perspectives of both memory reduction and deployment.

As mentioned above, certified machines come as predefined sets of memory and CPU, so if you choose a machine based on memory alone, you may end up short on CPU. For a new implementation, you derive the SAPS value using the Quick Sizer; for a migration, you derive the approximate required SAPS from the current usage. Along with the CPU usage history, we recommend collecting the SAPS values determined when the current hardware was introduced and, if it has since been expanded, from the Quick Sizer runs presumably performed at each of those points.
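The migration sizing logic described above can be written down as a small illustrative formula. The target utilisation and growth buffer below are assumptions chosen for the example, not official SAP sizing guidance:

```java
// Illustrative arithmetic only: rough target SAPS for a migration, derived
// from the SAPS rating of the current hardware and its observed peak CPU
// utilisation. The target utilisation and growth buffer are example inputs.
public class SapsEstimate {
    public static double requiredSaps(double currentHardwareSaps,
                                      double peakCpuUtilisation, // observed, 0..1
                                      double targetUtilisation,  // e.g. 0.65
                                      double growthBuffer) {     // e.g. 0.20
        // SAPS actually consumed today, scaled up for growth and headroom.
        double consumedSaps = currentHardwareSaps * peakCpuUtilisation;
        return consumedSaps * (1.0 + growthBuffer) / targetUtilisation;
    }

    public static void main(String[] args) {
        // Hardware rated at 20,000 SAPS, peaking at 50% CPU today.
        System.out.println(requiredSaps(20000, 0.50, 0.65, 0.20));
    }
}
```

For hardware rated at 20,000 SAPS with an observed 50% peak, this yields a target of roughly 18,500 SAPS; the point is simply that the historical SAPS ratings and CPU history mentioned above are the inputs you need to collect.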

That is all for Part 1. In the next post, Part 2, we will continue with the second half of the sizing elements.

SAP Cloud Platform as Digital Integration Hub

What a year! 2020 was like one of those sci-fi movies becoming reality…

From my IT-job perspective, it felt like a kick in the anthill: suddenly, everyone realised the importance of digitalisation, especially for B2C use cases. While companies that already had a clear online retail strategy (Amazon, Zalando, Otto, Globus…) saw their results grow, others struggled to keep the lights on. And this is exactly where a Platform-as-a-Service like the SAP Cloud Platform can help execute a modern, scalable, omnichannel B2C strategy.

Through services like the “Integration Suite”, our customers can build integrations for B2B purposes, but also engagement platforms for their customers. A great example is “Globus”, a German retailer, who duplicated the blueprint of their “customer engagement platform” for retail partners and employees.

The issue is very often that our customers’ systems of record are not meant to scale for the foreseen usage, nor are they technologically capable of communicating via internet-based standards. Changing these core systems would also be a critical operation, which is sometimes not feasible at all, or at least not in the required timeframe.

A solution blueprint

What Gartner described in their “Digital Integration Hub” (DIH) concept in 2018 became a reality at some customers: an API platform to decouple your systems of record from the pace and requirements of modern applications. Over the years, Gartner has refined the architectural paradigm and refreshed the concept in 2020.

This blog will assume you are familiar with the DIH concepts and will explain how SAP Cloud Platform can be used as implementation environment for it.

An implementation blueprint

As you may have seen from our Integration Solution Advisor Methodology, we have provided an architecture based on SAP Cloud Platform that implements the DIH blueprint.

It is important to keep in mind that the picture above shows an illustrative example that needs to be adapted to the specific customer context and requirements. It serves as a starting point for customers to model their hybrid integration use cases.

As a Customer Advisor at SAP, I am interested in both concepts and details. So let’s have a look under the hood to see how the SAP Cloud Platform implements the DIH blueprint.

Multi-experience applications

These are the applications “fueled” by the APIs. In an interconnected world where APIs are the de-facto standard for communication, a multi-experience application could be a web portal, a smartphone app, a giant touch-screen kiosk, a chat-bot, a smart-watch, and so on. But it could also be a SaaS application you want to do synchronous, near real-time integration with.

The SAP Cloud Platform provides many tools and frameworks to build or configure such applications, be it Business Application Studio, SAPUI5, mobile SDKs (Android and Apple), Conversational AI, and more. This is one of the reasons we are seen as one of the leaders in that space.

Hybrid integration platform

This is where we typically find the “SAP Cloud Platform Integration Suite“. The dedicated and specialised services work together to model, adapt and manage APIs, and to perform modern application integration.

The front-end API services need to be fully managed from a security, performance, and analytical point of view, typically in an API Gateway. But they may also need to be modelled first if you want to implement a new API.
All this can be achieved using the “SAP Cloud Platform API Management” service, which lets you:

  • model APIs first: define a contract between the API consumer and the API implementation, based on open standards like OpenAPI or OData.
  • secure APIs: mediating internet-oriented security concepts with on-premises security concepts is especially important, but protecting back-end systems from unforeseeable traffic is an important task too.
  • increase performance of APIs: through concepts like caching and purpose-oriented, lightweight refactoring, responsiveness can be greatly increased.
  • steer your API platform: monitoring (i.e. analytics) is an important aspect of a digital strategy, so you can define project priorities based on API adoption.
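To make the caching point concrete, here is a minimal, generic read-through cache of the kind a gateway puts in front of a slow back end. This is a plain-Java illustration of the concept only; `ResponseCache` and its methods are hypothetical names, not part of the SAP API Management service:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Illustrative sketch: a tiny read-through cache, the same idea an API
// gateway applies when it caches back-end responses for front-end APIs.
public class ResponseCache {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Function<String, String> backend; // the slow back-end call
    private int backendCalls = 0;

    public ResponseCache(Function<String, String> backend) {
        this.backend = backend;
    }

    public synchronized String get(String key) {
        // Only cache misses reach the back end; hits are served from memory.
        return cache.computeIfAbsent(key, k -> {
            backendCalls++;
            return backend.apply(k);
        });
    }

    public int backendCalls() { return backendCalls; }
}
```

On repeated requests for the same resource, only the first call reaches the back end; the rest are served from memory, which is how caching improves API responsiveness (real gateways add expiry and invalidation on top).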

The front-end APIs may be implemented in different places, depending on the use case:

  • in the “SAP API Management” service: this is where mash-ups and filtering can occur, to do lightweight API composition.
  • in the “SAP Cloud Platform Integration” service: this is where REST APIs can be graphically implemented in order to orchestrate complex logic and connect to heterogeneous systems.
  • in a headless microservice that can be developed with the “SAP Cloud Application Programming Model” (CAP) and run on various services of the SAP Cloud Platform, such as the Kyma or Cloud Foundry runtime.
  • in the HANA in-memory database: HANA features native capabilities to expose its data through OData REST APIs and provides lightning-fast response times.

Obviously there are more ways to implement front-end APIs but these are the main ones.

You may rightfully ask yourself: “What about eventing?”. This is also where the hybrid integration layer comes into play. One functional aspect of eventing is to have the integration layer notified of changes in the back end. Using “SAP Cloud Platform Enterprise Messaging”, this becomes perfectly possible.
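As a sketch of that idea, the toy code below shows a back end publishing change events and a listener keeping the hub's data store current. `EventBus`, `ChangeEvent`, and `DataStore` are hypothetical names for illustration; this is not the SAP Cloud Platform Enterprise Messaging API:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative eventing sketch: back-end changes are published as events,
// and the integration layer reacts by refreshing the hub's data store.
public class EventingSketch {
    record ChangeEvent(String objectId, String newValue) {}

    interface Listener { void onEvent(ChangeEvent e); }

    static class EventBus {
        private final List<Listener> listeners = new ArrayList<>();
        void subscribe(Listener l) { listeners.add(l); }
        void publish(ChangeEvent e) { listeners.forEach(l -> l.onEvent(e)); }
    }

    static class DataStore {
        final Map<String, String> rows = new HashMap<>();
    }

    public static DataStore wire(EventBus bus) {
        DataStore store = new DataStore();
        // The hub keeps its copy current by applying each back-end change.
        bus.subscribe(e -> store.rows.put(e.objectId(), e.newValue()));
        return store;
    }
}
```

The point of the pattern is the decoupling: the back end only publishes; the hub decides how to keep its in-memory copy fresh.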

Systems of records

The aforementioned “systems of record” can really be anything. However, you may want to make a mental separation between databases and applications: on the one hand, you handle communication with a database through typical Data Integration concepts (high volume, batch, table-level, ETL, …), whereas communication with applications is based on Process Integration concepts (object-level, near real-time, …).

Here is how the Integration Solution Advisor Methodology (ISA-M) compares the 2 integration styles:

In case you want to learn more about ISA-M, you may want to look up the upcoming (or current, or past, depending on your temporal location) course on OpenSAP.

With this in mind, there are multiple ways to implement communication with the systems of record:

  • Applications – Process Integration: through the “SAP Cloud Platform Cloud Connector”, the SAP Cloud Platform establishes a secure tunnel to on-premises systems. This connection is then available to any of the services, including the “Integration Service” (aka CPI, formerly HCI, part of the Integration Suite). The integration service enables connectivity to virtually any application.
    Typically, the integration service is used to execute real-time transactions against systems, or to update the central in-memory data store when changes occur.
  • Databases – Data Integration: through “SAP Smart Data Integration” (SDI), the HANA database is capable of communicating and synchronising with many different non-SAP databases.
    Typically, SDI is used to transfer, or virtualise, relevant data between the DIH in-memory database and the on-premises databases.

Obviously, it is possible to use other tools and services to perform the synchronisation of data between the in-memory database and the system of records. For instance, one could use “SAP Data Intelligence” to perform data integration.

Conclusion

In a nutshell, the “Digital Integration Hub” pattern is very good guidance for companies looking to execute a digitalisation strategy, especially in view of the lessons learned over the past years.
The DIH can be fully implemented with SAP solutions: the illustration above is only an example of an implementation and should be refined and adapted to specific requirements.

One thing is for sure: every company will – one day or another – run into the requirement of providing omnichannel access to its functions and data from systems of records. The question is only “How will they do it?”.

Volunteering for TM Podcast transcript, anyone?

You might be aware of the TM podcast. We think it is already good, but it could of course be improved, for example by making transcripts available, which would make the podcast much more accessible and inclusive. And meanwhile, the infrastructure is there. A nice example can be found here.

As a result, we would have the text in sync with the audio, which looks something like this:


Example of Transcript in action

Of course, this requires a good transcript. There is a well-established format for this; it looks something like this (the example below is the raw version as returned by automated speech recognition, more funny than helpful):

Basically, per timestamp and speaker, the text is written down.

So far we have the separated audio tracks, and sometimes also the “funny” version of the WebVTT file.

And now we need you: creating those transcripts in good quality is of course quite some work, and as our main focus is delivering software, this is currently not possible on our side. But maybe there is somebody in our audience willing to help us out here?

If so, just contact us and we can see how to proceed. Looking forward to hearing from you!

User CPU time – ‘us’ time in top

CPU consumption in Unix/Linux operating systems is studied using 8 different metrics: user CPU time, system CPU time, nice CPU time, idle CPU time, waiting CPU time, hardware interrupt CPU time, software interrupt CPU time, and stolen CPU time. In this article, let us study ‘user’ CPU time.

What is ‘user’ CPU time?

In order to understand ‘user’ CPU time, one should understand ‘system’ CPU time as well, since they go hand in hand. User CPU time is the amount of time the processor spends running your application code. System CPU time is the amount of time the processor spends running the operating system (i.e., kernel) functions connected to your application. Say your application is manipulating the elements in an array; this is accounted as ‘user’ CPU time. Say your application is making network calls to external applications. To make network calls, it has to read/write data into socket buffers, which is part of the operating system code. This is accounted as ‘system’ CPU time.
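If you want to observe this split from inside a Java application, the JDK's ThreadMXBean exposes per-thread user and total CPU time. Here is a minimal probe (whether thread CPU time measurement is available depends on the JVM and OS, hence the guard):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class CpuTimeProbe {
    private static final ThreadMXBean BEAN = ManagementFactory.getThreadMXBean();

    // User CPU time consumed by the current thread, in nanoseconds
    // (-1 if the JVM/OS does not support or has disabled the measurement).
    public static long userTimeNanos() {
        return BEAN.isCurrentThreadCpuTimeSupported()
                ? BEAN.getCurrentThreadUserTime() : -1L;
    }

    // Total CPU time (user + system) for the current thread, in nanoseconds.
    public static long totalCpuTimeNanos() {
        return BEAN.isCurrentThreadCpuTimeSupported()
                ? BEAN.getCurrentThreadCpuTime() : -1L;
    }

    public static void main(String[] args) {
        // Burn some user CPU with pure computation, then read both counters.
        long x = 0;
        for (int i = 0; i < 50_000_000; i++) x += i;
        System.out.println("user=" + userTimeNanos() + "ns"
                + " total=" + totalCpuTimeNanos() + "ns (x=" + x + ")");
    }
}
```

The difference between the total and the user time approximates the system (kernel) share for that thread.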

How to find ‘user’ CPU time?

User CPU time can be found from the following sources:

a. You can use web-based root cause analysis tools like yCrash to report ‘user’ CPU time. The tool is capable of generating alerts if ‘user’ CPU time goes beyond a threshold.

b. ‘user’ CPU time is also reported by the Unix/Linux command-line tool ‘top’ in the field ‘us’, as highlighted in the image below.

Fig: ‘user’ time in top

How to simulate high ‘user’ CPU time?

To simulate high ‘user’ CPU reporting, let’s use BuggyApp. BuggyApp is an open-source Java project which can simulate various sorts of performance problems. When you launch BuggyApp with the following argument, it will cause ‘user’ CPU consumption on the host to spike:

 java -jar buggyApp.jar PROBLEM_CPU

Fig: You can see the user CPU time spike up to 99%

Now let’s see the source code in the BuggyApp which is causing the ‘user’ CPU time to spike up.

public class CPUSpikeDemo {
    public static void start() {
        new CPUSpikerThread().start();
        new CPUSpikerThread().start();
        new CPUSpikerThread().start();
        new CPUSpikerThread().start();
        new CPUSpikerThread().start();
        new CPUSpikerThread().start();
        System.out.println("6 threads launched!");
    }
}

public class CPUSpikerThread extends Thread {
    @Override
    public void run() {
        while (true) {
            doSomething();
        }
    }

    public static void doSomething() {
        // Does nothing
    }
}

Here you can see that BuggyApp launches six ‘CPUSpikerThread’ threads in the ‘CPUSpikeDemo’ class. Notice that ‘CPUSpikerThread’ runs an infinite while loop, invoking the doSomething() method again and again, and this method doesn’t do anything.

Thus, if threads keep looping infinitely in application-level code, ‘user’ CPU time will spike.

How to resolve high ‘user’ CPU time?

  1. The short-term/tactical solution is to restart the process consuming high CPU.
  2. The strategic/permanent solution is to identify the lines of code in the application causing the CPU to spike, using root cause analysis tools like yCrash, and fix them.
  3. If CPU consumption is consistently (i.e., 24 x 7) high, it could indicate that you are running short of compute capacity. In such circumstances, you can do a couple of things:

a. Upgrade the capacity of your device; for example, move to a better EC2 instance (if you are in a cloud environment).

b. Reduce the number of processes running on that device.

c. Add more devices (i.e., EC2 instances) to your application pool, so that your workload can be distributed.
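Before restarting or resizing anything, it helps to know which threads inside the JVM are burning the CPU. As a minimal sketch of the strategic approach above, the standard java.lang.management API can report per-thread CPU time from within the same JVM; the class name ThreadCpuReporter below is our own illustration, not part of BuggyApp or yCrash.

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

// Illustrative helper: prints the CPU time consumed by each live thread,
// so a thread stuck in a tight loop (like CPUSpikerThread) stands out.
public class ThreadCpuReporter {
    public static void report() {
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();
        if (!bean.isThreadCpuTimeSupported()) {
            System.out.println("Thread CPU time is not supported on this JVM");
            return;
        }
        for (long id : bean.getAllThreadIds()) {
            ThreadInfo info = bean.getThreadInfo(id);
            long cpuNanos = bean.getThreadCpuTime(id); // -1 if unavailable
            if (info != null && cpuNanos > 0) {
                System.out.printf("%-30s %8d ms%n",
                        info.getThreadName(), cpuNanos / 1_000_000);
            }
        }
    }

    public static void main(String[] args) {
        report();
    }
}
```

Running this (or taking thread dumps with `jstack`) while the CPU is spiking points you at the offending thread names and, via their stack traces, at the looping code to fix.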

Blog Series – SAP S/4 HANA Supply Chain for Transportation Management – 06 – Integrated consignment process and consolidation opportunities

One of the key advantages of the professional shipping functionality of SAP S/4HANA Supply Chain for Transportation Management (embedded TM) is the ability to consolidate inbound and outbound movements. The previous blog focused on the opportunity and benefits of consolidating inbound and outbound movements in the MM subcontracting process. Similarly, there are processes in the SD (sales and distribution) area where we can consolidate/combine inbound and outbound movements. This blog focuses on one such scenario: the integrated customer consignment process with embedded TM.

Customer consignment is the process of keeping stock at the customer’s premises; until the customer consumes or sells it, it remains the property of the seller. The customer is not obliged to pay for these goods until they are removed from the consignment stock. Since consignment stock still forms part of valuated stock, it must be managed in the system. However, consignment stock must be:

  • Managed separately from the rest of your stock, so that we are aware of the stock which is stored at the customer location
  • Managed separately for each customer
Advantages – Shipper
  • Introduces new/unproven products within current/new sales channels
  • Potentially ensures long-term business with a retailer
  • Helps judge what levels of inventory are turned over in particular time periods

Advantages – Customer
  • Reduced inventory holding costs
  • Only pays for what is consumed
  • Generally avoids lag times between using stock and new orders arriving
  • Opportunity to establish a more traditional bulk order system in realistic order quantities
  • Access to a wider range of inventory could draw new business

Disadvantages – Shipper
  • High investment in logistics for large amounts of new inventory
  • If the stock does not sell, the supplier faces a loss because they are still the owner
  • With no monetary risk, the customer may not be motivated to aggressively promote the inventory

Disadvantages – Customer
  • If the inventory fails to be consumed, the customer could waste floor/shelf space
  • Must manage the logistics of consignment stock alongside other stock
  • The supplier owns the stock, but the customer is responsible as keeper

The consignment process is used in most industries and is very common in the consumer goods industry, where there is a requirement to display products at store fronts. Additionally, the consignment process can act as a catalyst for a demand-driven supply chain: a company can use the pull method, as the consignment issue process (consumption) represents the actual sales of the product. Most inventory management systems do not handle consignment inventory very well. However, SAP has a robust solution for the consignment process, and it integrates seamlessly with SAP Transportation Management to provide a strong end-to-end solution for various SAP clients.

SAP Solution:

The consignment process in SAP consists of four small processes:

  1. Consignment fill-up: Send materials to customer consignment stock
  2. Consignment issue: Issue materials from customer consignment stock to the customer
  3. Consignment return: Return materials (from customer ownership) to customer consignment stock
  4. Consignment pickup: Pickup consignment stock and move it to plant stock

In ‘Consignment fill-up’ and ‘Consignment pickup’, invoices are not generated, since there is no change of ownership for the materials. SAP provides various sales and delivery order types, item categories, and movement types for these four processes.

Scenario | SO Type | SO Item Category | Sch. Line Category | Delivery Type | Del. Item Category | Movt. Type | Ownership (after the process)
Fill Up  | KB      | KBN              | E0                 | LF            | KBN                | 631        | Shipper
Issue    | KE      | KEN              | C0                 | LF            | KEN                | 633        | Customer
Return   | KR      | KRN              | D0                 | LR            | KRN                | 634        | Shipper
Pick Up  | KA      | KAN              | F0                 | LR            | KAN                | 632        | Shipper

The framework for the standard consignment process which is regularly used in the industry is detailed below.

In this scenario, as part of ‘Consignment Fill-Up’, the shipper moves the material to a warehouse at the customer location. Once the customer uses the materials, the ‘Consignment Issue’ process is executed to remove the stock from the shipper’s inventory. Moreover, the customer can return material to the consignment stock using the ‘Consignment Return’ process. ‘Consignment Issue’ and ‘Consignment Return’ are not relevant for transportation, as there is no physical movement outside the warehouse. If the shipper has to take the material back from the customer warehouse to the shipper’s plant/warehouse, the ‘Consignment Pickup’ process can be used; this is a TM-relevant process. The majority of consignment scenarios follow the above process, and the integrated process flow is detailed below.

Integrated SD-TM process flow (for order based planning scenario):

*  A generic process flow is available in SAP Best Practice Explorer for Customer Consignment (1IU). However, this process does not detail the integrated process with SAP TM. https://rapid.sap.com/bp/#/browse/categories/sap_s%254hana/areas/on-premise/packageversions/BP_OP_ENTPR/S4HANA/1909/US/6/EN/scopeitems/1IU

TM set up to activate

The setup to activate consignment orders for transportation management is the same as for any other scenario. Using integration profile and control key settings, the various consignment orders and deliveries can be integrated, depending on whether planning is order-based or delivery-based. It is crucial to activate the correct order or delivery type for the scenario. For example, for the above scenario, the control key activation setting for order-based planning is shown below.

Only KA – Consignment Pickup and KB – Consignment Fill-Up are activated for transportation planning and execution. Other settings for planning, execution, and settlement are similar to a standard TM process. For the Consignment Return and Consignment Issue processes, TM planning and execution are not relevant; hence the control key activation is not set up for these order types.

Inbound and outbound consolidation opportunities:

If you analyze the above processes, it is evident that Consignment Fill-Up and Consignment Pickup orders can be combined. Generally, there will be many independent orders moving from the shipper’s plant/warehouse to the customer warehouse (Fill-Up). Similarly, some orders will require materials to be brought back to the shipper’s plant/warehouse. If we combine these two types of movements in a single cockpit profile, these orders can be consolidated using the optimizer (or manually). This significantly reduces empty moves and transportation cost (depending on the transportation volume).

The integrated process flow and the stages of a consolidated freight order combining Consignment Pickup and Fill-Up are shown below.

Other Scenarios:

Even though the above scenario is the most common process in the industry, there are some variations. To cater to requirements such as Just-In-Time delivery, certain shippers set up exclusive warehouses near large customers’ plants/warehouses.

In certain situations, the customer expects the warehouse to hold only consignment stock. Also, the movement from this consignment warehouse can be under the ownership of the shipper. In this case, all four consignment processes are transportation relevant, as there is physical movement from the shipper’s consignment warehouse to the customer plant (detailed in the above diagram). So, the activation of orders/deliveries using integration profiles and control keys should incorporate this aspect. From a consolidation perspective, Issue and Return can be consolidated in addition to Fill-Up and Pickup.

Another variation of the consignment process is when the consignment stock is kept at the shipper’s own plant or warehouse. The customer may request this to make sure that the stock is reserved for them.

In this case, only the Issue and Return processes are relevant for transportation. Fill-Up and Pickup are not TM relevant, as the movement happens within the warehouse (detailed above). The control key activation has to be maintained accordingly, and consolidation opportunities are available for Issue and Return orders.

In addition to these variations, other processes are also possible. For example, the customer may pick up the material from the shipper’s warehouse, and the shipper may not own the transportation. Similarly, some returns can be handled by the customer. These processes may not be transportation relevant, and the control key/integration profile activation has to be set up to incorporate such variations. Moreover, some shippers/customers may not use Consignment Return and Pickup to return material to the shipper’s plant; they can use a regular return order as a one-step process to return the issued material to the shipper. SAP can support all these processes, and if we design and activate the TM relevance of orders and deliveries as per the requirement, we can realize the integrated process with Transportation Management.

Conclusion:

In short, the consignment process can have many variations. SAP provides standard settings that cater to most scenarios. Additionally, the seamless integration of SAP SD with SAP TM is one of the biggest advantages compared to other ERP products. As part of an implementation, it is critical to understand all these variations and set up the transportation management system accordingly. Similarly, we can model the vendor consignment process for inbound scenarios. Hence, there are many opportunities in the consignment process where inbound and outbound shipments can be consolidated.

In SAP LE-TRA or S/4HANA basic shipping, it is not possible to combine inbound and outbound movements. Planning inbound and outbound freight units (FUs) in the FU worklist (or transportation cockpit) is considered professional shipping functionality. Combining and consolidating inbound and outbound movements (using the TM optimizer or cockpit) can reduce empty moves, improve vehicle utilization, and help reduce the carbon footprint. This can significantly reduce cost when freight volume and spend are high. Hence, professional shipping in ‘S/4HANA Supply Chain for TM (embedded TM)’ can be a compelling business case and is critical to achieving an efficient transportation process for many shippers/manufacturers.