Here is one more blog sharing knowledge and experience with you; let's follow the topics below as you read.
Amazon S3 Bucket service
Scenario and Integration Perspective
Challenge in this integration – in detail
Integration Flow in SAP CPI using AmazonWebService Adapter
- Scenario one: Send the data from S/4HANA (SAP DMS) to the Amazon S3 Bucket service.
- Scenario two: Send the payload (JSON format) with the details of the file that has been sent to the Amazon S3 bucket to an external API.
SOAP UI Test
In this blog I would like to share how to integrate with the Amazon S3 Bucket service using the new adapter released by SAP.
In case you do not have this adapter yet, you can take a look at my previous blog, where I explain how to generate the AWS Signature Version 4 header via a Groovy script and the HTTPS adapter.
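For reference, the signing-key derivation at the heart of AWS Signature Version 4 (the part that Groovy script has to implement) can be sketched in Python. This is only a minimal illustration of the algorithm as documented by AWS, not the adapter's or the Groovy script's actual code:

```python
import hashlib
import hmac


def _sign(key: bytes, msg: str) -> bytes:
    """One HMAC-SHA256 step of the SigV4 key derivation."""
    return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()


def derive_signing_key(secret_key: str, date_stamp: str,
                       region: str, service: str) -> bytes:
    """Derive the AWS SigV4 signing key (kSigning).

    date_stamp is YYYYMMDD, region e.g. 'us-east-1', service e.g. 's3'.
    The chain is: kDate -> kRegion -> kService -> kSigning.
    """
    k_date = _sign(("AWS4" + secret_key).encode("utf-8"), date_stamp)
    k_region = _sign(k_date, region)
    k_service = _sign(k_region, service)
    return _sign(k_service, "aws4_request")
```

The resulting key is then used to HMAC-sign the string-to-sign built from the canonical request; see the AWS documentation for those remaining steps.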
What will I not cover in this blog?
- The whole setup of the Amazon S3 service; you can easily look that up.
- How to deploy the new adapter released by SAP.
- The ABAP code to extract the binary file from SAP DMS.
- The ABAP code to send the file details from SAP MDG.
- Further details on how the external system checks the files in the Amazon S3 service against the JSON file received in the API service.
- How the exceptions are stored.
Amazon S3 Bucket service
An Amazon S3 bucket is a public cloud storage resource available in Amazon Web Services’ (AWS) Simple Storage Service (S3), an object storage offering. Amazon S3 buckets, which are similar to file folders, store objects, which consist of data and its descriptive metadata.
Basically, the S3 bucket acts as a cloud repository.
S3 Bucket vs traditional SFTP:
I recommend reading about the advantages of going to the cloud in your business, and not only from a storage perspective; please take a look at the link below for the main differences between an S3 bucket (cloud) and a traditional on-premise SFTP server.
It is important to mention that Amazon also provides SFTP in the cloud; we are not going to explore that topic in this blog.
Scenario and Integration perspective:
The scenario is an integration with S/4HANA: once a material changes state, the synchronization with SAP MDG happens automatically, and the related documents (PDFs, images and others) are replicated and stored in SAP DMS.
- Scenario 1: – Green Line
- The material documents (PDFs, images and others) are stored in SAP DMS; a custom ABAP function was developed to extract them as Base64-encoded binary and send them to SAP CPI.
- SAP CPI receives the data via SOAP-RM (asynchronous), performs some transformations, and forwards it to AWS S3.
- SAP CPI receives the reply: if it is HTTP 200, it returns success to SAP DMS; if an exception occurs or the HTTP code is different from 200, it returns failure.
- Scenario 2: – Black Line
- After receiving the success status from scenario 1, SAP MDG uses a custom function to extract the details from an internal table and sends a custom XML to SAP CPI.
- SAP CPI receives it via SOAP-RM (asynchronous), performs some transformations, and sends the JSON file to the API.
- Scenario 3: – Yellow Line.
- The external system is responsible for checking the file and comparing it with what is stored in Amazon S3.
Scenario one: Send the data from S/4HANA (SAP DMS) to the Amazon S3 Bucket service.
Following the description above, I will present the Scenario 1 iFlow in SAP CPI.
I will not cover the local integration process, because it is related to the system responsible for storing the success and exception logs.
Custom XML Model from SAP DMS:
As you can see, the XML can contain more than one file, the file type, the folder where it should be stored, and many <LINE> elements with the Base64 content of the file.
Challenge in this integration
Send one single file per document, as a string, to Amazon S3.
I needed to split the files, concatenate all those lines into one single document, decode it, and send it to the Amazon S3 bucket.
To solve the challenge I used:
- Iterating Splitter (to generate a new call per <item>)
- Content Modifier to generate the properties dynamically that will be used per call
- XSLT mapping to exclude <MATERIAL>, <PASTA>, <NOME_ARQUIVO> and <CONTENT_TYPE>, keep only <item><LINE>, and concatenate all the content
- The standard Base64 Decoder function
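The intent of those CPI steps can be sketched in plain Python with the standard library. This is only an illustration, not what runs inside CPI; the element names come from the custom XML above, and the root wrapper and the Base64 values below are made up for the example:

```python
import base64
import xml.etree.ElementTree as ET

# Hypothetical wrapper root and dummy Base64 chunks, for illustration only.
SAMPLE = """
<items>
  <item>
    <MATERIAL>6057099A</MATERIAL>
    <PASTA>documentacaoPDF</PASTA>
    <NOME_ARQUIVO>TEST_BLOG_SAP.pdf</NOME_ARQUIVO>
    <CONTENT_TYPE>application/pdf</CONTENT_TYPE>
    <CONTENT>
      <item><LINE>SGVsbG8g</LINE></item>
      <item><LINE>d29ybGQ=</LINE></item>
    </CONTENT>
  </item>
</items>
"""


def split_and_decode(xml_text: str):
    """Per <item> (the Iterating Splitter), concatenate every <LINE>
    (the XSLT mapping) and decode the result (the Base64 Decoder)."""
    for item in ET.fromstring(xml_text).findall("item"):
        b64 = "".join(line.text or "" for line in item.findall(".//LINE"))
        yield item.findtext("NOME_ARQUIVO"), base64.b64decode(b64)
```

Note that concatenating Base64 chunks before decoding only works when each non-final chunk's length is a multiple of four with no padding, which is how the DMS extraction splits the content.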
Iterating Splitter – Configuration:
Content-Modifier – Configuration:
XSLT – Code:
This XSLT code removes the metadata tags and concatenates the values of all <LINE> tags into one continuous string.
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:output method="text"/>
    <xsl:strip-space elements="*"/>
    <!-- Drop the metadata tags; only the <LINE> content is kept -->
    <xsl:template match="//item/MATERIAL"/>
    <xsl:template match="//item/PASTA"/>
    <xsl:template match="//item/NOME_ARQUIVO"/>
    <xsl:template match="//item/CONTENT_TYPE"/>
    <!-- Output each <LINE> value; the text output method concatenates them -->
    <xsl:template match="//item/LINE">
        <xsl:value-of select="."/>
    </xsl:template>
</xsl:stylesheet>
Base64 Decoder – no configuration needed; check the flow in detail below:
The result of these steps is that each message is sent separately, per file type, as a string to Amazon S3; you will see more details in the test sample further down in this blog.
You can check the configuration of the Amazon adapter for SAP CPI in my previous blog, but here I will present the details extracted by the second step, the Content Modifier.
The Content Modifier dynamically generates the directory, file name and Content-Type per call generated by the Iterating Splitter.
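As an illustration of what the Content Modifier computes per call, the three values can be read from each <item> and combined into the S3 object key. This is a Python sketch under the assumption that the property names below (directory, fileName, contentType, objectKey) are just labels for the example, while the element names come from the custom XML:

```python
import xml.etree.ElementTree as ET

# One <item> as produced by the Iterating Splitter (sample values).
ITEM_XML = """
<item>
  <MATERIAL>6057099A</MATERIAL>
  <PASTA>imagemPNG</PASTA>
  <NOME_ARQUIVO>TEST_BLOG_SAP.png</NOME_ARQUIVO>
  <CONTENT_TYPE>image/png</CONTENT_TYPE>
</item>
"""


def s3_properties(item_xml: str) -> dict:
    """Mimic the Content Modifier: directory, file name and Content-Type
    become per-call properties; the S3 object key is directory/file name."""
    item = ET.fromstring(item_xml)
    directory = item.findtext("PASTA")
    filename = item.findtext("NOME_ARQUIVO")
    return {
        "directory": directory,
        "fileName": filename,
        "contentType": item.findtext("CONTENT_TYPE"),
        "objectKey": f"{directory}/{filename}",
    }
```

In CPI itself these are XPath expressions in the Content Modifier; the sketch only shows the resulting values.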
I will not describe this flow in detail because the focus is on the first scenario, but basically there are four local integration processes – two for logging and two for calling the API with a dynamic URL based on the XML input; the routing step is responsible for this.
SOAP UI Test
In this section I will explain the test case, give just an overview of Scenario 2, and say nothing about Scenario 3, because it is the responsibility of the external system.
In the call I will use the same XML sample shown in the image above; I can't post the full payload here because the content of <LINE> is really big.
Check the details of both files that will be stored dynamically in the folders – documentacaoPDF / imagemPNG – with the names TEST_BLOG_SAP.pdf and TEST_BLOG_SAP.png.
<item>
   <MATERIAL>6057099A</MATERIAL>
   <PASTA>documentacaoPDF</PASTA>
   <NOME_ARQUIVO>TEST_BLOG_SAP.pdf</NOME_ARQUIVO>
   <CONTENT_TYPE>application/pdf</CONTENT_TYPE>
   <CONTENT>
      <item>
         <LINE>XXXXXXXXXXX</LINE>
      </item>
   </CONTENT>
</item>
<item>
   <MATERIAL>6057099A</MATERIAL>
   <PASTA>imagemPNG</PASTA>
   <NOME_ARQUIVO>TEST_BLOG_SAP.png</NOME_ARQUIVO>
   <CONTENT_TYPE>image/png</CONTENT_TYPE>
   <CONTENT>
      <item>
         <LINE>XXXXXXXXXXX</LINE>
      </item>
   </CONTENT>
</item>
Trace in SAP CPI:
Iterating Splitter step (2x):
Base64 Decoder step (2x):
The result is the concatenation of all the Base64 content from each <LINE> inside <CONTENT>:
S3 Bucket without folders:
S3 bucket after the SAP CPI call, with two folders – documentacaoPDF / imagemPNG:
Let's check those files – the PDF first – TEST_BLOG_SAP.pdf:
Now the PNG – TEST_BLOG_SAP.png:
As you can see, the PDF and image extracted from S/4HANA (DMS) are stored successfully in the Amazon S3 bucket.
Now the Scenario 2:
Based on the XML input from the S/4HANA (MDG) custom function, the message is routed to different API addresses depending on the data in the XML:
I will present a short version with a simple explanation of this scenario.
<Id>794290</Id>
<Name>TEST_BLOG_SAP.pdf</Name>
<Version>AA</Version>
<Part>000</Part>
<Path>/documentacaoPDF</Path>
<EvBrand>SAP</EvBrand>
<EvName>TEST_BLOG_SAP</EvName>
<EvType>TEST</EvType>
The values of <EvBrand> and <EvType> will be used to generate the dynamic URL for the API call, and the values of <Name>, <Path> and <EvName> will be used by the external company to check in the bucket whether everything is OK with the file.
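To illustrate the routing, here is a sketch of how such a dynamic URL could be assembled from <EvBrand> and <EvType>. The base URL, the path pattern and the <file> wrapper root are all made up for the example; the real addresses depend on the external API:

```python
import xml.etree.ElementTree as ET

# Hypothetical wrapper root around the fields shown above.
PAYLOAD = """
<file>
  <Name>TEST_BLOG_SAP.pdf</Name>
  <Path>/documentacaoPDF</Path>
  <EvBrand>SAP</EvBrand>
  <EvName>TEST_BLOG_SAP</EvName>
  <EvType>TEST</EvType>
</file>
"""


def build_api_url(xml_text: str,
                  base_url: str = "https://api.example.com") -> str:
    """Route the call by <EvBrand> and <EvType> (illustrative pattern only)."""
    doc = ET.fromstring(xml_text)
    brand = doc.findtext("EvBrand").lower()
    ev_type = doc.findtext("EvType").lower()
    return f"{base_url}/{brand}/{ev_type}/files"
```

In CPI this corresponds to the routing step plus a dynamically configured receiver address; the sketch only shows how the two fields drive the target URL.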
I hope you enjoyed the read and what I presented in this blog.