SAP R/3 IDoc Cookbook for EDI and Interfaces

This book is an in-depth discussion and cookbook for IDoc development in R/3 for EDI and eCommerce

1.1 Communication
1.2 Psychology of Communication
1.3 Phantom SAP Standards and a Calculation
1.4 Strategy
1.5 Who Is on Duty?
1.6 Marcus T. Cicero
2.1 What are IDocs?
2.2 Exploring a Typical Scenario
3.1 Get a Feeling for IDocs
3.2 The IDoc Control Record
3.3 The IDoc Data
3.4 Interpreting an IDoc Segment Info
3.5 IDoc Base - Database Tables Used to Store IDocs
4.1 Quickly Setting up an Example
4.2 Example: The IDoc Type MATMAS01
4.3 Example: The IDoc Type ORDERS01
5.1 Sample Processing Routines
5.2 Sample Outbound Routines
5.3 Sample Inbound Routines
6.1 Basic Terms
6.2 Terminology
7.1 Basic Customising Settings
7.2 Creating an IDoc Segment WE31
7.3 Defining the Message Type (EDMSG)
7.4 Define Valid Combination of Message and IDoc Types
7.5 Assigning a Processing Function (Table EDIFCT)
7.6 Processing Codes
7.7 Inbound Processing Code
8.1 Individual ABAP
8.2 NAST Messages Based Outbound IDocs
8.3 The RSNAST00 ABAP
8.4 Sending IDocs Via RSNASTED
8.5 Sending IDocs Via RSNAST00
8.6 Workflow Based Outbound IDocs
8.7 Workflow Event From Change Document
8.8 ALE Change Pointers
8.9 Activation of Change Pointer Update
8.10 Dispatching ALE IDocs for Change Pointers
9.1 How the IDoc Engine Works
9.2 How SAP Standard Processes Inbound IDocs
9.3 How to Create the IDoc Data
9.4 Interface Structure of IDoc Processing Functions
9.5 Recipe to Develop an Outbound IDoc Function
9.6 Converting Data into IDoc Segment Format
10.1 How the IDoc Engine Works
10.2 How SAP Standard Processes Inbound IDocs
10.3 How to Create the IDoc Data
10.4 Interface Structure of IDoc Processing Functions
10.5 Recipe to Develop an Outbound IDoc Function
10.6 Converting Data into IDoc Segment Format
11.1 IDoc Type and Message Type
11.2 Partner Profiles
11.3 Defining the Partner Profile (WE20)
11.4 Data Ports (WE21)
12.1 What Is Remote Function Call RFC?
12.2 RFC in R/3
12.3 Teleport Text Documents With RFC
12.4 Calling a Command Line Via RFC?
13.1 Workflow in R/3 and Its Use for Development
13.2 Event Coupling (Event Linkage)
13.3 Workflow from Change Documents
13.4 Trigger a Workflow from Messaging
13.5 Example: How to Create a Sample Workflow Handler
14.1 A Distribution Scenario Based on IDocs
14.2 Example ALE Distribution Scenario
14.3 ALE Distribution Scenario
14.4 Useful ALE Transaction Codes
14.5 ALE Customizing SALE
14.6 Basic Settings SALE
14.7 Define the Distribution Model (The "Scenario") BD64
14.8 Generating Partner Profiles WE20
14.9 Creating IDocs and ALE Interface from BAPI SDBG
14.10 Defining Filter Rules
15.1 R/3 RFC from MS Office Via Visual Basic
15.2 Call Transaction From Visual Basic for WORD 97
15.3 R/3 RFC from JavaScript
15.4 R/3 RFC/OLE Troubleshooting
16.1 Recording a Transaction With SHDB
16.2 How to Use the Recorder Efficiently
16.3 Include ZZBDCRECXX to Replace BDCRECXX
16.4 ZZBRCRECXX_FB_GEN: Generate a Function from Recording
17.1 EDI and International Standards
17.2 Characteristics of the Standards
17.3 XML
17.4 ANSI X.12
18.1 Converter

 

SAP R/3 made an old dream come true: enter your business data once into your computer, trigger all subsequent activities automatically, and send the data to another computer without typing it in again.

Facts, know-how and recipes - that is what you can expect from this book.

·         Establish EDI communication with your clients or suppliers

·         communicate in real-time with your legacy and satellite systems

·         send and receive data to and from your production machine

·         integrate PC or UNIX applications directly in R/3 with RFC

·         automate your business from order entry to invoicing

·         monitor your purchase orders from goods receipt to payment clearing

The authors know that nobody will believe them, but it is that simple and easy to set up an automated business scenario with IDocs, ALE and Workflow in R/3.

This book teaches you how SAP R/3 approaches electronic data exchange with another computer as one coherent concept, and it gives you the know-how to do the same.

Paper is out - EDI is in

No modern globally operating company will allow its suppliers to deliver their order, delivery, transport and invoice information on paper any more. They require electronic data interchange.

Read in this book, why

·         EDI will be inevitable in future global business

·         EDI projects with R/3 often cost five to ten times as much as necessary

·         IDocs and ALE are the ideal bridge between R/3 and legacy systems

·         IDocs are the framework for a fully automated business workflow

In the technical part of the book you will learn how to

·         customize the R/3 IDoc engine

·         interface IDocs with standard converters for X.12, EDIFACT, VDA etc.

·         design your own IDoc structures

·         write IDoc handler programs in an hour

·         trigger IDocs from any R/3 application via messages or workflow

·         set up automated workflow based on IDocs

·         set up ALE scenarios for automated data replications

 

 


Preface

Proper Know-How Saves Costs

We have always believed what has been confirmed over and over again in countless projects: the main key to cutting project costs is proper education of the team. Giving the team members the same book to read homogenizes the knowledge and sharpens a common sense within the group.

A Frequently Given Answers Book

This book is the result of thousands of hours of discussion and work with R/3 consultants, developers and clients about interface development from and to R/3. When we started a big new project in autumn 1998 at the Polar Circle, which involved a large number of interfaces, I observed with curiosity that my developers were ordering numerous books, all related to EDI.

Well, those books did not say a word about R/3, and it was obvious that they were not very helpful for our team. I consequently searched the directories for books on R/3 IDocs, but there was nothing. So I started to compile my material on IDocs and ALE with the intent to publish it on the WWW. Since I submitted the site http://idocs.de to some search engines, I have received an astonishing number of hits. Emails asked for a written version of the material on the web. So - here it is.

Mystery EDI Unveiled

EDI and e-commerce are miracle words in today's IT world. Like any other mystery, they draw their magic from the ignorance of the potential users. It is true that there are many fortune-making companies in the IT world that specialize in EDI. They sell software and know-how for giant sums of money. Looking behind the scenes reveals that the whole EDI business can simply be reduced to writing some conversion programs. This is not trivial, but the secret of the EDI companies is that the so-called standards are sold for a lot of money. As soon as you get hold of the documentation, things turn out to be easy.

IDocs, A Universal Tool for Interface Programming

Although R/3 IDocs were introduced as a tool to implement EDI solutions for R/3, they are now accepted as a helpful tool for any kind of interface programming. While this is not taught clearly in SAP's training courses, we put our focus on writing an interface quickly and easily.

http://idocs.de

We praise cutting-edge technology, so this book takes advantage of the modern multimedia hype. The latest updates, corrections and more sophisticated and detailed examples can be found on our web site.

Axel Angeli in December 1999

Logos! Informatik GmbH


About The Authors

Axel Angeli,

was born in 1961. He is a top-level SAP R/3 consultant and R/3 cross-application development coach. He specializes in coaching large multi-national, multi-language development teams and in troubleshooting development projects.

His job description is also known as computer logistics, a delicate discipline that methodically awakens the synergetic effects in a team to accelerate and mediate IT projects.

He is a trained cybernetics scientist (a field also known as artificial intelligence) in the tradition of Marvin Minsky [The Society of Mind] and the Synergetics group of Hermann Haken and Maria Krell. His competence in computer science is based on the works of Donald Knuth [The Art of Computer Programming], Niklaus Wirth (the creator of the PASCAL language), the object-oriented approach as described and developed during the XEROX PARC project (where the mouse and window-style GUIs were invented in the early 1970s) and the Borland languages.

Before his life as an SAP consultant, he made a living as a computer scientist for medical biometry and as a specialist for high-precision industrial robots. He now concentrates on big international projects. He speaks several popular languages fluently, including German, English, French and Slavic languages.

Email: axela@logosworld.de

Robi Gonfalonieri,

born in 1964, is a senior ABAP IV developer and R/3 consultant for SD and MM. He is a trained economist turned ABAP IV developer. He specializes in international, multi-language projects, both as a developer and as an SD consultant. He speaks several languages fluently, including German, French, English and Italian.

Email: robig@logosworld.de

Ulrich Streit,

born in 1974, is an ABAP IV developer and interface specialist. He has developed a series of legacy system interfaces and interface monitors for several clients in the process industry. Email: ulis@logosworld.de

logosworld.com

is a group of loosely related freelance R/3 consultants and consulting companies. Current members of the logosworld.com group are the following fine companies:

Logos! Informatik GmbH, Brühl, Germany: R/3 technical troubleshooting

OSCo GmbH, Mannheim, Germany: SAP R/3 implementation partner

UNILAN Corp., Texas: ORACLE implementation competence

For true international R/3 competence and enthusiastic consultants,

email us at info@logosworld.de

or visit http://idocs.de


 

For Doris, Paul, Mini und Maxi


 

Danke, Thank You, Graçias, Tack så mycket, Merci, Bedankt, Grazie, Danjawad, Nandri, Se-Se 

I owe special thanks to a variety of people, clients, partners and friends. Their insistence on finding a solution and their way of asking the right questions made this book possible in the first place.

I especially want to honour Francis Bettendorf, who was exactly the kind of knowledgeable and experienced IT professional I had in mind when writing this book. A man who understood an algorithm when he saw it, without being too proud to ask precise and well-prepared questions. He used to see me every day with the same phrase on his lips: "Every day one question." He heavily influenced my writing style when I tried to write down the answers to his questions, and he often gave the impulse to write the answers down at all. At the age of 52, he joyfully left work on the evening of Tuesday, 23 March 1999, after another fruitful discussion with me. He entered immortality the following Wednesday morning. We will all keep his memory in our hearts.

Thanks to Detlef and Ingmar Streit for doing the great cartoons.

Thanks also to Pete Kellogg of UNILAN Corp., Texas, and to Juergen Olbricht, Wolfgang Seehaus and his team at OSCo, Mannheim, for continuously forming such perfect project teams. It is a joy working with them.

Plans are fundamentally ineffective because the "circumstances of our actions are never fully anticipated and are continuously changing around us". Suchman does not deny the existence or use of plans but implies that deciding what to do next in the pursuit of some goal is a far more dynamic and context-dependent activity than the traditional notion of planning might suggest.

Wendy Suchman, Xerox PARC http://innovate.bt.com/showcase/wearables/

 

Who Would Read This Book?

This book was written for the experienced R/3 consultant who wants to know more about interface programming and data migration. It is mainly a compilation of scripts and answers that arose during my daily work as an R/3 coach.

Quid – What is that book about?

The R/3 Guide is a Frequently Given Answers book. It is a collection of answers regarding EDI that I have given over and over again to developers, consultants and clients' technical staff. It focuses on the technical aspects of SAP R/3 IDoc technology. It is not a tutorial, but a supplement to the R/3 documentation and training courses.

Quis – Who should read the book?

The R/3 Guide has been written with the experienced consultant or ABAP developer in mind. It does not expect any special knowledge about EDI; however, you should be familiar with ABAP IV and the R/3 repository.

Quo modo – how do you benefit from the book?

Well, this book is a "how to" book, or a "know-how" book. The R/3 Guide has its value as a compendium. It is not a novel to read at a stretch, but a book where you look up the answer when you have a question.

Quo (Ubi) – Where would you use the book?

You would most likely use the book when working on a project that involves data interfaces, not necessarily a pure EDI project. IDocs are also helpful in data migration.

Quando – when should you read the book

The R/3 Guide is not a tutorial. You should be familiar with the general concept of IDocs; the book is meant to be used after you have attended an R/3 course on IDocs, ALE or similar. Instead of attending a course you may alternatively read one of the R/3 IDoc tutorials on the market.

Cur – Why should you read the book

Because you always wanted to know the technical aspects of IDoc development, which you cannot find in any of the publicly accessible R/3 documentation.



 

EDI projects can quickly become very expensive. However, when analysing the reasons for the high costs, one quickly finds that it is not the technical implementation of the EDI project that makes the total costs explode.

Summary

Most of the implementation time and costs are lost in agreeing on common standards and establishing formalities between the sender and the receiver.

A successful EDI project requires that the developers on both ends sit together, face to face.

Sticking to a phantom "SAP standard" for IDocs, which does not actually exist in R/3, lets the costs of the project soar.

 

Just make a plan,                               Mach nur einen Plan,

And let your spirit hail.                        Sei ein großes Licht,

Then you make another plan,               Dann mach noch einen zweiten Plan

And both will fail.                                Gehen tun sie beide nicht.

Bertolt Brecht and Kurt Weill, The Threepenny Opera

 

More than 80% of the time of an EDI project is lost in waiting for answers, trying to understand proposals and retrieving data nobody actually needs.

A common language

EDI means to exchange information between a sender and a receiver. Both communication partners need to speak the same language to understand each other.

The language for EDI is comprised of the file formats and description languages used in the EDI data files. In the simple case of exchanging plain data files, the partners need to agree on a common file format.

The time spent on finding agreement on a common format wastes a great deal of money. Consider a common scenario:

The receiving party defines a file structure in which it would like to receive the data. This is usually an image of the data structures of the receiving computer installation.

This is a good approach for the beginning, because you have to start somewhere. But now the disaster takes its course.

The proposal is sent to the other end via email. The developer of the sender system takes a look at it and remains quiet. Then he starts programming and tries to squeeze his own data into the structure.

Waiting for a response

If it becomes too tedious, a first humble attempt is made to convince the other party to change the initial file format. Again it is sent via email, and the answer comes some days later. Dead time - but the consultant is paid.

Badly described meaning of a field

It can be even worse: one party proposes a format and the other party does not understand the meaning of some of the fields.

Echoing

Another field cannot be filled, because the sender does not have the information. Looking closer, you find out that the information originated from the receiving partner anyway. The programmer who proposed the format wanted it filled just for his personal convenience. This is known as echoing, and it is always only a "nice to have" feature.

Using the same term for different objects

A real disaster happens if both parties use the same expression for different things. A classic case is the term "delivery": what is known as an SD transport in R/3 is known as a delivery in many legacy systems.

There are many other situations where one thing always happens: time is wasted. And time is money.

Face to face

The solution is quite simple: bring the people together. Developers of both parties need to sit together, physically face to face. If each can see what the other person does, they understand each other.

Bringing developers together accelerates every project. Especially when both parties are so much dependent on each other as in an EDI project, the partners need to communicate without pause.

There is a negative psychological aspect to the communication process if the parties on both ends do not know each other or reduce communication with each other to the absolute minimum.

Sporadic communication leads to latent aggression on both sides, while spending time together builds up mutual tolerance. Communicating directly and regularly has a positive effect on mutual respect. Once the parties accept each other's competence, they accept each other's requirements more readily.

Send them over the ocean.

What if the people sit at two ends of the world, one in America, the other in Europe? The answer is absolutely clear: get them a business class flight and send them over the ocean.

Travel cost will be refunded by the saved time

The time you save when people sit together compensates for the travel costs many times over. So do not think twice.

Sitting together also enhances the comprehension of the total system. An EDI communication forms a logical entity. But if your left hand does not know what your right hand does, you will never handle things firmly and securely.

See the business on both ends

Another effect is thus mutual learning: learning how the business is executed on both sides. Seeing the similarities and the differences allows flexibility. And it allows for correct decision-making without needing to ask the communication partner.

SAP R/3 delivers a series of predefined EDI programs. Many project administrators see them as standards which must not be manipulated or modified. The truth is that these IDoc processing functions are recommendations and example routines, which can be replaced by your own routines in customizing.

Predefined not standard

SAP R/3 is delivered with a series of predefined IDoc types and corresponding handler function modules.

Some of the handler programs have been designed with user-exits where a developer can implement some data post-processing or add additional information to an IDoc.

You should always see those programs as examples of IDoc handling. If the programs already do what you want, that is just fine. But you should never stick to those programs for too long if you need different data to be sent.

R/3 IDocs were primarily designed for the automotive industry

The R/3 standard IDoc programs were designed - consciously or not - with the German association of automobile manufacturers (VDA) in mind. The VDA is a committee which defines EDI standards for its members, e.g. Volkswagen, BMW and Daimler-Benz-Chrysler. Not every car manufacturer uses these recommendations, e.g. Ford does not. Other industries define their own standards, which are not present in R/3.

If a file exchange format already exists for your company or your industry, you may want to use that one. This means typing in the file format, writing the program that fills the structure and customising the new IDoc and message types.

A simple calculation:

Calculation

Discussing the solutions                        5 days

Typing in the file formats                      1/2 day

Writing the program to fill the segments        1 day

Adjusting the customizing                       1/2 day

Testing and correcting everything               3 days

Travel time                                     2 days

Total                                          12 days

This is not an optimistic calculation. You will notice that eight out of the twelve days are accounted for by non-IT-related tasks like discussing solutions, educating each other and testing.

If a project takes longer than that, it simply means that unanticipated time was spent discussing and adapting solutions, because things changed or turned out to be different than initially planned.

Do not lose your time on plans. Have prototypes developed and take them as a basis.

You cannot predict all eventualities

Do not stick to the illusion that a proper design in the beginning will lead to a good result. It is the age-old error of trusting the theorem of Laplace:

Laplace

“Tell me all the facts of the world about the present and I will predict the future for you.”

Heisenberg and uncertainty

Leave aside the fact that modern physics, since Heisenberg and his uncertainty principle, has proven that even knowing everything about the present does not allow one to predict the future deterministically.

You do not know the premises before

If you want to know all the eventualities of a project beforehand, you have to have gone through similar projects. It is only your experience that allows you to make a good plan. However, you usually do a project only once, unless you are a consultant.

The question is: If you have never been through an EDI project, how will you obtain the necessary experience?

Prototypes

The answer is: make a prototype, a little project. Do not lose your time writing plans and detailed development requests. Rather start writing a tiny prototype. Introduce this prototype and maintain your solution. Listen to the arguments and improve the prototype steadily.

This is how you learn.

This is how you succeed.

Writing interface programs is much like translating languages. The same rules apply.

Writing interface programs is like translating a language. You have information distributed by one system and you have to translate this information into a format that the other system understands.

A translation should always be done by a native speaker of the target language. This applies to interface programs as well.

If data needs to be converted, do this always in the target system. If in doubt let the source system send everything it can. If the target does not need the information it can ignore it.

Some may have learned it in school: the basic rules of rhetoric according to Cicero. You will know the answers when your program is finished - so why not ask the questions at the beginning? Ask the right questions, and you will know.

When starting a new task, you always have to answer the magic "Q"s of rhetoric. It is a systematic way to get the answers you need to know anyway.

Quid – What

What is the subject you are dealing with? Make the context clear and make sure that all parties are talking about the same thing.

Quis – Who

Who is involved in the business? Get the names and make sure that the people know each other before the project enters its hot phase.

Quo modo – how

How do you want to achieve your goal? Make sure all participants choose the same methods. And how do you name things? Agree on a common terminology!

Quo (Ubi) – where

Where do things take place? Decide on a common place to work. Decide on the platform where the individual parts of the programs should run.

Quando - when

When do you expect a result? Define milestones and discuss why a milestone was missed. You should always check why your initial estimate was wrong, even if you are faster than planned.

Cur – Why

Why do you want to install a certain solution? Isn’t there a better alternative?

IDocs are SAP’s file format to exchange data with a foreign system. This chapter is intended as an introduction to the concept.

Summary

IDocs are an ASCII file format to exchange data between computers; the format is chosen arbitrarily

IDocs are similar to segmented files; they are not a description language like ANSI X.12, EDIFACT or XML

The IDoc contents are processed by function modules, which can be assigned in customizing

IDocs are structured ASCII files (or a virtual  equivalent). They are the file format used by SAP R/3 to exchange data with foreign systems.

IDocs are SAP's implementation of structured text files

IDocs are simple ASCII data streams. When they are stored to a disk file, the IDocs are simple flat files with lines of text, where the lines are structured into data fields. The typical structured file has records, each record starting with a leading string that identifies the record type. Their specification is stored in the data dictionary.

Electronic Interchange Document

IDoc is the acronym for Interchange Document. It indicates a set of (electronic) information which forms a logical entity. An IDoc is, for example, all the data of a single customer in your customer master data file, or all the data of a single invoice.

Data Is transmitted in ASCII format, i.e. human readable form

IDoc data is usually exchanged between systems and partners that are completely independent. Therefore, the data should be transmitted in a format that can easily be corrected by the computer operators. It is therefore mandatory to post the data in a human readable form.

Nowadays, this means that data is coded in ASCII format, including numbers which are sent as a string of figures 0 to 9. Such data can easily be read with any text editor on any computer, be it a PC, Macintosh, UNIX System, S/390 or any internet browser.

IDocs exchange messages

The information which is exchanged by IDocs is called a message, and the IDoc is the physical representation of such a message. The name "message" for the information sent via IDocs is used in the same way as in other EDI standards.

IDocs are used like classical interface files

Everybody who has ever dealt with interface programming will find IDocs very much like the hierarchical data files used in traditional data exchange.

International standards like the ODETTE or VDA formats are designed in the same way as IDocs are.

XML, ANSI X:12 or EDIFACT use a description language

Other EDI standards like XML, ANSI X.12 or EDIFACT/UN are based on a data description language. They differ principally from the IDocs concept, because they use a programming language syntax (e.g. like Postscript or HTML) to embed the data.

The IDoc process is a straightforward communication scenario. A communication is requested, then data is retrieved, wrapped and sent to the destination in a predefined format and envelope.

Figure 1:        A typical EDI scenario from the viewpoint of R/3 - the IDoc document as a structured ASCII file

The illustration above shows a sketch of a typical IDoc communication scenario. The steps are just the same as in every communication scenario: there is a requesting application, a request handler and a target.

The sketch shows communication outbound from R/3: data is leaving the R/3 system.

R/3 application creates data

An R/3 application creates data and updates the database appropriately. An application can be a transaction, a stand-alone ABAP Report or any tool that can update a database within R/3.

IDoc engine picks up the request

If the application decides that data needs to be distributed to a foreign system, it triggers the IDoc mechanism, usually by leaving a descriptive message record in the message table NAST.

The application then either calls the IDoc engine directly, or a collector job eventually picks up all due IDoc messages and determines what to do with them.
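The following is a minimal sketch of such a collector step, assuming the standard NAST fields KAPPL (application), KSCHL (message type), OBJKY (object key) and VSTAT (processing status); the selection values and the message type ZEDI are purely illustrative.

* Minimal sketch: pick up unprocessed NAST message records,
* similar to what a collector job such as RSNAST00 does.
DATA: tnast TYPE STANDARD TABLE OF nast WITH HEADER LINE.

SELECT * FROM nast INTO TABLE tnast
       WHERE kappl = 'V1'            " application, e.g. sales
         AND kschl = 'ZEDI'          " hypothetical message/output type
         AND vstat = '0'.            " not yet processed

LOOP AT tnast.
* here the IDoc engine or your own handler would be called
* for every due message record
  WRITE: / tnast-objky, tnast-kschl.
ENDLOOP.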

IDoc engine determines a handler function from customising

If the engine decides that data is ready to be sent to a partner system, it determines the function module which can collect the required data and wrap it into an IDoc.

In IDoc customising, you specify the name of the function module to use. This can either be one predefined by the R/3 standard or a user-written one.

IDoc is backed up in R/3 and sent out

When the IDoc is created it is stored in an R/3 table and from there it is sent to the foreign system.

Conversion to standards is done by external program

If the foreign system requires a special conversion, e.g. to XML, EDIFACT or X.12, then this job needs to be done by an external converter, such as the Seeburger ELKE™ system. These converters are not part of R/3.

If you have to decide on a converter solution, we strongly recommend using a plain PC-based solution. Conversion usually requires a lot of fine-tuning, which stands and falls with the quality of the provided tools.

IDocs are relatively simple to understand. But, like most simple things they are difficult to explain. In this chapter we want to look at some IDocs and describe their elements, so that you can get a feeling for them.

Summary

The first record in an IDoc is a control record describing the content of the data

All but the first record are data records with the same formal record structure

Every record is tagged with the segment type and followed by the segment data

The interpretation of the segment is done by the IDoc application

Both sent and received IDocs are logged in R/3 tables for further reference and archiving purposes

To begin with, we want to give you a feeling for what IDocs are and how they may look when you receive them as plain text files.

IDocs are plain ASCII files (or a virtual equivalent)

IDocs are basically a small number of records in ASCII format, building a logical entity. It makes sense to see an IDoc as a plain and simple ASCII text file, even if it might be transported via other means.

Control record plus many data records = 1 IDoc

Any IDoc consists of two sections:

the control record

which is always the first line of the file and provides the administrative information.

the data records

which contain the application-dependent data - in our example below, the material master data.

We will discuss the exchange of the material master IDoc MATMAS in the paragraphs that follow.

IDoc types are defined in WE30

The definition of the IDoc structure MATMAS01 is deposited in the data dictionary and can be viewed with WE30.

IDOC Number Sender  Receiver   Port Message Type IDoc Type

0000123456  R3PARIS R3MUENCHEN FILE ORDERS       ORDERS01

Figure 2:        Simplified example of an IDoc control record for sales orders

SegmentType Sold-To Ship-To Value      Deldate  User

ORDERHEADER 1088    1089    12500,50   24121998 Micky Maus

Figure 3:        Simplified example of an IDoc data record for sales orders


EDI_DC40  043000000000001234540B 3012  MATMAS03    MATMAS   DEVCLNT100  PROCLNT100            
E2MARAM001                    043000000000001234500000100000002005TESTMAT1          19980303ANGELI       19981027SAPOSS 
E2MAKTM001                    043000000000001234500000300000103005EEnglish Name for TEST Material 1        EN
E2MAKTM001                    043000000000001234500000400000103005FFrench Name for TEST Material 1         FR
E2MARCM001                    0430000000000012345000005000001030050100DEAVB              901       PD9010  0   0.00 EXX  0.000
E2MARDM001                    0430000000000012345000006000005040051000D                  0.000         0.000                  
E2MARDM001                    0430000000000012345000007000005040051200D                  0.000         0.000                  
E2MARMM                       043000000000001234500000900000103005KGM1    1                        0.000         0.000        



Part of the content of an IDoc file for IDoc type MATMAS01




      0000000000012345 DEVCLNT100 PROCLNT100 19991103 210102                                                             
          E1MARAM                     005 TESTMAT1           19980303 ANGELI      19981027SAPOSS       KDEAVCB         
               E1MAKTM                     005 D German  Name for TEST Material 1          DE                              
               E1MAKTM                     005 E English Name for TEST Material 1          EN                              
               E1MAKTM                     005 F French  Name for TEST Material 1          FR                              
               E1MARCM                     005 0100 DEAVB                   901                                           
               E1MARCM                     005 0150 DEAVB                   901                                           
                  E1MARDM                     005 1000 D                      0.000          0.000             
                  E1MARDM                     005 1200 D                      0.000          0.000             
               E1MARMM                     005 KGM 1     1                           0.000          0.000          0.000  
               E1MARMM                     005 PCE 1     1                           0.000          0.000          0.000  


The same IDoc in a formatted representation

The very first record of an IDoc package is always a control record. The structure of this control record is the DDic structure EDIDC, and it describes the contents of the data contained in the package.

Control record serves as cover slip for the transport

The control record carries all the administrative information of the IDoc, such as its origin,  its destination and a categorical description of the contents and context of the attached IDoc data. This is very much like the envelope or cover sheet that would accompany any paper document sent via postal mail.

Control record is used by the receiver to determine the processing algorithm

For R/3 inbound processing, the control record is used by the standard IDoc processing mechanism to determine the method for processing the IDoc. This method is usually a function module but may be a business object as well. The processing method can be fully customised.

Control record not necessary to process the IDoc Data

Once the IDoc data is handed over to a processing function module, you will no longer need the control record information. The function modules are aware of the individual structure of the IDoc type and the meaning of the data. In other words: for every context and syntax of an IDoc, you would write an individual function module or business object (note: a business object is also a function module in R/3) to deal with it.

Control Record structure is defined as EDIDC in DDic

The control record has a fixed, pre-defined structure, which is defined in the data dictionary as EDIDC and can be viewed with SE11 in the R/3 data dictionary. The header of our example tells us that the IDoc has been received from a sender with the name PROCLNT100 and sent to the system with the name DEVCLNT100. It further tells us that the IDoc is to be interpreted according to the IDoc definition called MATMAS01.

    MATMAS01 ... DEVCLNT100 PROCLNT100 ...

Figure 4:        Schematic example of an IDoc control record

Sender

The sender's identification PROCLNT100 tells the receiver who sent the IDoc. This serves the purpose of filtering unwanted data and also provides  the opportunity to process IDocs differently with respect to the sender.

Receiver

The receiver's identification DEVCLNT100 should be included in the IDoc header to make sure that the data has reached the intended recipient.

IDoc Type

The name of the IDoc type MATMAS01 is the key information for the IDoc processor. It is used to interpret the data in the IDoc records, which otherwise would be nothing more than a sequence of meaningless characters.
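To relate these fields to the ABAP view of the control record, here is a minimal sketch that fills the corresponding EDIDC fields in a work area. The partner type 'LS' (logical system) and the partner names are taken from the example above; in a real scenario most of these fields are filled by the standard processing routines.

* Minimal sketch: the control record fields discussed above
DATA: ctrl LIKE edidc.

CLEAR ctrl.
ctrl-idoctp = 'MATMAS01'.        " IDoc type
ctrl-mestyp = 'MATMAS'.          " message type
ctrl-sndprt = 'LS'.              " sender partner type (logical system)
ctrl-sndprn = 'PROCLNT100'.      " sender partner number
ctrl-rcvprt = 'LS'.              " receiver partner type
ctrl-rcvprn = 'DEVCLNT100'.      " receiver partner number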

All records in an IDoc which come after the control record are the IDoc data. They are all structured alike, with a segment information part and a data part of 1000 characters which fills the rest of the line.

All IDoc data records have a segment info part and 1000 characters for data

All records of an IDoc are structured the same way, regardless of their actual content. They are records with a fixed-length segment info part to the left, followed by the segment data, which is always 1000 characters long.

IDoc type definition can be edited with WE30

We will examine an IDoc of type MATMAS01. The IDoc type MATMAS01 is used for transferring material master data via ALE. You can view the definition of any IDoc data structure directly within R/3 with transaction WE30.

 

Segment Info                       Segment Data

...E1MARAM ....00000001234567…     Material base segment

...E1MARCM ....PL01…               Plant segment

...E1MARDM ....SL01                Storage location data

...E1MARDM ....SL02                Another storage location

...E1MARCM ....PL02                Another plant

 

Data and segment info are stored in  EDID4

Regardless of the IDoc type used, all IDocs are stored in the same database table: EDID4 for release 4.x and EDID3 for releases 2.x and 3.x. The two release formats differ slightly with respect to the lengths of some fields. Please read the chapter on port types for details.

Depending on the R/3 release, the IDoc data records are formatted according to either the DDic structure EDID4 or EDID3. The difference between the two structures mainly reflects the changes in the R/3 repository, which allows longer names starting from release 4.x.

All IDoc data records are exchanged in a fixed format, regardless of the segment type. The segment’s true structure is stored in R/3’s repository as a DDic structure of the same name.

R/3 is only interested in the segment name

The segment info tells the IDoc processor how the current segment data is structured and should be interpreted. The piece of information which is usually of sole interest is the name of the segment, EDID4-SEGNAM.

Segment name tells the data structure

The segment name corresponds to a data dictionary structure with the same name, which has been created automatically when defining the IDoc segment definition with transaction WE31 .

Remaining information is only for foreign systems

For most applications, the remaining information in the segment info can be ignored as being redundant. Some older, non-SAP-compliant partners may require it. E.g. the IDoc segment info will also store the unique segment number for systems, which require numeric segment identification.

To prepare a segment for processing in ABAP, it is usually wise to move the segment data into a structure which matches the segment definition.

For a segment of type e1maram the following coding is commonly used:

Data in EDID4-SDATA

TABLES: e1maram.
   . . .
MOVE edidd-sdata TO e1maram.

Then you can access the fields of the IDoc segment EDIDD-SDATA as fields of the structure e1maram  .

Data in EDID4-SDATA

WRITE: e1maram-matnr.

Sample coding

The following coding sample shows how you may read a MATMAS IDoc and extract the data of the MARA and MARC segments into internal variables and tables.

DATA: xmara LIKE e1maram.

DATA: tmarc TYPE STANDARD TABLE OF e1marcm

            WITH HEADER LINE.

LOOP AT edidd.

   CASE edidd-segnam.

         WHEN 'E1MARAM'.

          MOVE edidd-sdata TO xmara.

         WHEN 'E1MARCM'.

           MOVE edidd-sdata TO tmarc.

          APPEND tmarc.

   ENDCASE.

ENDLOOP.

* now do something with xmara and tmarc.

When R/3 processes an IDoc via the standard inbound or outbound mechanism, the IDoc is stored in the tables. The control record goes to table EDIDC and the data goes to table EDID4.

All inbound and outbound Documents are stored in EDID4

All IDocs, whether sent or received, are stored in the table EDID4. The corresponding control record goes into EDIDC.

There are standard programs that read and write the data to and from the IDoc base. These programs and transactions depend heavily on the customising, where rules are defined which tell how the IDocs are to be processed.
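If you only want to inspect an IDoc from within your own report, a minimal sketch like the following will do; it reads the control record from EDIDC and the data records from EDID4 (release 4.x). The IDoc number is only an example.

* Minimal sketch: reading one IDoc directly from the IDoc base
DATA: ctrl  LIKE edidc,
      tdata LIKE edid4 OCCURS 0 WITH HEADER LINE.

SELECT SINGLE * FROM edidc INTO ctrl
       WHERE docnum = '0000000000012345'.

SELECT * FROM edid4 INTO TABLE tdata
       WHERE docnum = ctrl-docnum.

LOOP AT tdata.
  WRITE: / tdata-segnam.           " segment type of each data record
ENDLOOP.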

Avoid reinventing the wheel

Of course, as IDocs are nothing more than structured ASCII data, you could always process them directly with an ABAP. This is certainly the quick and dirty solution, bypassing all the internal checks and processing mechanisms. We will not reinvent the wheel here.

Customising is done from the central menu WEDI

To make these customising settings, go to the central menu WEDI and see the menu points dealing with ports, partner profiles and everything under IDoc development.

 

Figure 5:        Tables used to store the IDoc within R/3

 

The best way to learn is by doing. This chapter tells you how to set up your R/3 system so that it can send IDocs to itself. When sending IDocs to your own system you can test the procedures without the need for a second client or installation.

Summary

Define a new internal RFC destination INTERNAL

Explore both the transactions WEDI and SALE and adjust the settings as necessary

Use transaction BALE to generate an arbitrary IDoc

If you have a naked system, you cannot send IDocs immediately. This chapter will guide you through the minimum steps to see how the IDoc engine works.

You can access most of the transactions used in the example below in the menu WEDI and SALE.

Check EDID4 with SE16

We will assume that we want to send material master data from the current system to a remote system. To simulate this scenario we do not need a second system. With a little trick, we can set up the system to send an IDoc back to the sending client.

We will set up the system to use an RFC call to itself. Therefore we need to define an RFC remote destination, which points back to our own client. There is a virtual RFC destination called NONE which always refers to the calling client.

Declare the RFC destination to receive the IDoc

RFC destinations are installed with the transaction SM59. Create a new R/3 destination of type "L" (Logical destination) with the name INTERNAL and the destination NONE.

Note: Do not use RFC type internal. Although you could create them manually, they are reserved for being automatically generated. However, there is the internal connection "NONE" or "BACK" which would do the same job as the destination we are creating now.

Define a data port for INTERNAL

The next step is defining a data port, which is referenced by the IDoc sending mechanism to send the IDoc through. Declaring the port is done by transaction WE21.

Declare a new ALE model with SALE .

We will now declare an ALE connection from our client to the partner INTERNAL. ALE uses IDocs to send data to a remote system. There is a convenient transaction to send material master data as IDocs via the ALE.

Declare MATMAS01 as a valid ALE object to be sent to INTERNAL

The set-up is done in transaction SALE. You first create a new ALE model, to avoid interfering with any existing definitions. Then you simply add the IDoc message MATMAS as a valid path from your client to INTERNAL.

Send the IDoc with transaction BALE.

In order to send the IDoc, you call the transaction BALE and choose the distribution of material master data (BD10). Choose a material, enter INTERNAL as receiver and go.

Display IDocs with WE05

To see which IDocs have been sent, you can use transaction WE05. If you did everything as described above, you will find the IDocs with an error status of 29, meaning that there is no valid partner profile. This is correct, because we have not defined one yet.

To sharpen your understanding, we will show you an example of an IDoc of type MATMAS01, which contains material master data.

Note: You can check with transaction WE05 whether there are already any IDocs in your system.

IDoc structure can be seen with WE30

You can call transaction WE30 to display the structure of the IDoc type of the IDoc found.

Below is the display of an IDoc of type MATMAS01.

 

Figure 6:        Structure of the MATMAS01 IDoc type

 

MATMAS01 largely mirrors the structure of R/3's material master entity.

Content of IDoc file

If this IDoc had been written to a file, the file content would have looked similar to this:

...MATMAS01 DEVCLNT100 INTERNAL...

...E1MARAM ...and here the data

...E1MARCM ...and here the data

...E1MARDM ...and here the data

To allow a comparison, here is a sample of the IDoc type ORDERS01, which is used for purchase orders and sales orders.

ORDERS01 is used for purchasing and sales order data

Purchasing and sales orders naturally share the same IDoc type, because what is a purchase order on the sender side becomes a sales order on the receiver side.

Unlike MATMAS01, the IDoc type ORDERS01 does not reflect the structure of the underlying RDB entity, neither that of SD (VA01) nor that of MM (ME21). The structure is rather derived from the EDI standards used in the automobile industry. Unfortunately, this does not make it easier to read.

Note: With transaction WE05 you can monitor whether there are already any IDocs in your system.

IDoc structure can be seen with WE30

You can call transaction WE30 to display the structure of the IDoc type of the IDoc found.

Content of IDoc file

If this IDoc had been written to a file, the file content would have looked similar to this:

...ORDERS01 DEVCLNT100 INTERNAL...

...E1EDKA1 ....and here the data

...E1EDKA2 ....and here the data

...E1EDP19 ....and here the data

Figure 7:        Structure of the ORDERS01 IDoc type

This chapter demonstrates how an IDoc is prepared in R/3 for outbound and how a receiving R/3 system processes the IDoc.

     Keep

     It

     Simple and

     Smart

Creating and processing IDocs is primarily a mechanical task, which is certainly true for most interface programming. We will show a short example that packs SAP R/3 SAPscript standard text elements into IDocs and stores them.

Outbound function

Outbound IDocs from R/3 are usually created by a function module. This function module is dynamically called by the IDoc engine. A sophisticated customising  defines the conditions and parameters to find the correct function module.

The interface parameters of the processing function need to be compatible with a well-defined standard, because the function module will be called from within another program.

Inbound function

IDoc inbound functions are function modules with a standard interface, which will interpret the received IDoc data and prepare it for processing.

The received IDoc data is processed record by record and interpreted according to the segment information provided with each record. The prepared data can then be processed by an application, a function module,  or a self-written program.

The example programs in the following chapters will show you how texts from the text pool can be converted into an IDoc and processed by an inbound routine to be stored into another system.

The following will give you the basics to understand the example:

Text from READ_TEXT

SAP R/3 allows the creation of text elements, e.g. with transaction SO10. Each standard text element has a control record which is stored in table STXH. The text lines themselves are stored in a special cluster table. To retrieve the text from the cluster, you use the standard function module READ_TEXT. We will read such a text and pack it into an IDoc. That is what the following simple function module does.

If there is no convenient routine to process the data, the easiest way to hand the data over to an application is to record the corresponding transaction with transaction SHDB and create a simple processing function module from that recording.
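Here is a minimal sketch of that approach, assuming a recording of a hypothetical transaction ZXXX; the module pool name, screen number and field name are purely illustrative and would come from your SHDB recording.

* Minimal sketch: replay a recorded transaction to hand data
* over to the application
DATA: bdcdata LIKE bdcdata    OCCURS 0 WITH HEADER LINE,
      msgs    LIKE bdcmsgcoll OCCURS 0 WITH HEADER LINE.

CLEAR bdcdata.                         " first screen of the recording
bdcdata-program  = 'SAPMZXXX'.         " hypothetical module pool
bdcdata-dynpro   = '0100'.
bdcdata-dynbegin = 'X'.
APPEND bdcdata.

CLEAR bdcdata.                         " one screen field with its value
bdcdata-fnam = 'ZSTRUCT-FIELD1'.       " hypothetical screen field
bdcdata-fval = 'some value'.
APPEND bdcdata.

CALL TRANSACTION 'ZXXX' USING bdcdata
                        MODE 'N'       " no screen display
                        UPDATE 'S'     " synchronous update
                        MESSAGES INTO msgs.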

Outbound is triggered by the application

Outbound routines are called by the triggering application, e.g. the RSNAST00 program.
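For a quick test, a minimal sketch is to submit RSNAST00 from your own ABAP; in production you would rather schedule RSNAST00 as a periodic batch job with the appropriate selection parameters.

* Minimal sketch: trigger the dispatch of due NAST messages manually
SUBMIT rsnast00 VIA SELECTION-SCREEN AND RETURN.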

Inbound is triggered by an external event

Inbound processing is triggered by the central IDoc inbound handler, which is usually the function module IDOC_INPUT . This function is usually activated by the gatekeeper who receives the IDoc.

The most difficult work when creating outbound IDocs is the retrieval of the application data which needs to be sent. Once the data is retrieved, it only needs to be converted to IDoc format.

FUNCTION

*"----------------------------------------------------------------------

*"*"Lokale Schnittstelle:

*"       IMPORTING

*"             VALUE(I_TDOBJECT) LIKE  THEAD-TDOBJECT DEFAULT 'TEXT'

*"             VALUE(I_TDID) LIKE  THEAD-TDID DEFAULT 'ST'

*"             VALUE(I_TDNAME) LIKE  THEAD-TDNAME

*"             VALUE(I_TDSPRAS) LIKE  THEAD-TDSPRAS DEFAULT SY-LANGU

*"       EXPORTING

*"             VALUE(E_THEAD) LIKE  THEAD STRUCTURE  THEAD

*"       TABLES

*"              IDOC_DATA STRUCTURE  EDIDD OPTIONAL

*"              IDOC_CONTRL STRUCTURE  EDIDC OPTIONAL

*"              TLINES STRUCTURE  TLINE OPTIONAL

*"----------------------------------------------------------------------

* *** --- Reading the application Data --- ****

  CALL FUNCTION 'READ_TEXT'

       EXPORTING

            ID                      = I_TDID

            LANGUAGE                = I_TDSPRAS

            NAME                    = I_TDNAME

            OBJECT                  = I_TDOBJECT

       IMPORTING

            HEADER                  = E_THEAD

       TABLES

            LINES                   = TLINES.

* *** --- Packing the application data into IDoc

     MOVE E_THEAD TO IDOC_DATA-SDATA.

     MOVE 'YAXX_THEAD' TO IDOC_DATA-SEGNAM.

     APPEND IDOC_DATA.

     LOOP AT TLINES.

         MOVE TLINES TO IDOC_DATA-SDATA.

* ***            -- we still need to fill more segment info

         MOVE 'YAXX_TLINE' TO IDOC_DATA-SEGNAM.

        APPEND IDOC_DATA.

     ENDLOOP.

* *** --- Packing the IDoc control record --- ****

  CLEAR IDOC_CONTRL.

  IDOC_CONTRL-IDOCTP = 'YAXX_TEXT'.

* *** -- we still should fill more control record info

  APPEND IDOC_CONTRL.

ENDFUNCTION.

Figure 1:        Sample IDoc outbound function module

 


 

We will show a short example that packs SAP R/3 SapScript standard text elements into IDocs and stores them back to texts in a second routine. The text elements can be edited with SO10.

Text from READ_TEXT

Each R/3 standard text element has a header record which is stored in table STXH. The text lines themselves are stored in a special cluster table. To retrieve the text from the cluster, you will use the standard function module function READ_TEXT.

Outbound processing

The program below will retrieve a text document from the text pool, convert the text lines into IDoc format, and create the necessary control information.

The first step is reading the data from the application database by calling the function module READ_TEXT.

* *** --- Reading the application Data --- ****

  CALL FUNCTION 'READ_TEXT'

       EXPORTING

            ID                      = I_TDID

            LANGUAGE                = I_TDSPRAS

            NAME                    = I_TDNAME

            OBJECT                  = I_TDOBJECT

       IMPORTING

            HEADER                  = E_THEAD

       TABLES

            LINES                   = TLINES.

Figure 2:        Reading data

Our next duty is to pack the data into the IDoc record. This means moving the application data to the data part of the IDoc record structure EDIDD and filling the corresponding segment information.

* *** --- Packing the application data into Idoc

     MOVE E_THEAD TO IDOC_DATA-SDATA.

*     the receiver needs the segment name

*     in order to interpret the segment

     MOVE 'YAXX_THEAD' TO IDOC_DATA-SEGNAM.

     APPEND IDOC_DATA.

     LOOP AT TLINES.

         MOVE TLINES TO IDOC_DATA-SDATA.

* ***            -- we still need to fill more segment info

         MOVE 'YAXX_TLINE' TO IDOC_DATA-SEGNAM.

        APPEND IDOC_DATA.

     ENDLOOP.                   

Figure 3:        Converting application data into IDoc format

Finally, we have to provide a correctly filled control record for this IDoc. If the IDoc routine is used in a standard automated environment, it is usually sufficient to fill the field EDIDC-IDOCTP with the IDoc type, EDIDC-MESTYP with the corresponding message type, and the receiver name. The remaining fields are filled automatically by the standard processing routines where applicable.

* *** --- Packing the Idoc control record --- ****

  CLEAR IDOC_CONTRL.

  IDOC_CONTRL-IDOCTP = 'YAXX_TEXT'.

* ***            -- we still need to fill more control rec info

  APPEND IDOC_CONTRL.

Figure 4:        Filling control record information

Inbound processing is basically the reverse of outbound processing: the received IDoc has to be unpacked, interpreted and transferred to an application for further processing.

FUNCTION

*"----------------------------------------------------------------------

*"*"Lokale Schnittstelle:

*"       IMPORTING

*"             VALUE(INPUT_METHOD) LIKE  BDWFAP_PAR-INPUTMETHD

*"             VALUE(MASS_PROCESSING) LIKE  BDWFAP_PAR-MASS_PROC

*"       EXPORTING

*"             VALUE(WORKFLOW_RESULT) LIKE  BDWFAP_PAR-RESULT

*"             VALUE(APPLICATION_VARIABLE) LIKE  BDWFAP_PAR-APPL_VAR

*"             VALUE(IN_UPDATE_TASK) LIKE  BDWFAP_PAR-UPDATETASK

*"             VALUE(CALL_TRANSACTION_DONE) LIKE  BDWFAP_PAR-CALLTRANS

*"       TABLES

*"              IDOC_CONTRL STRUCTURE  EDIDC

*"              IDOC_DATA STRUCTURE  EDIDD

*"              IDOC_STATUS STRUCTURE  BDIDOCSTAT

*"              RETURN_VARIABLES STRUCTURE  BDWFRETVAR

*"              SERIALIZATION_INFO STRUCTURE  BDI_SER

*"----------------------------------------------------------------------

  DATA: XTHEAD LIKE THEAD.

  DATA: TLINES LIKE TLINE OCCURS 0 WITH HEADER LINE.

  DATA: OK LIKE SY-SUBRC.

  CLEAR XTHEAD.

  REFRESH TLINES.

* *** --- Unpacking the IDoc --- ***

  LOOP AT IDOC_DATA.

     CASE IDOC_DATA-SEGNAM.

       WHEN 'YAXX_THEAD'.

            MOVE IDOC_DATA-SDATA TO XTHEAD.

       WHEN 'YAXX_TLINE'.

            MOVE IDOC_DATA-SDATA TO TLINES.

     ENDCASE.

  ENDLOOP.

* *** --- Calling the application to process the received data --- ***

  CALL FUNCTION 'SAVE_TEXT'

       EXPORTING

            HEADER          = XTHEAD

            SAVEMODE_DIRECT = 'X'

       TABLES

            LINES           = TLINES.

    ADD SY-SUBRC TO OK.

* fill IDOC_STATUS

    IDOC_STATUS-DOCNUM = IDOC_CONTRL-DOCNUM.

    IDOC_STATUS-MSGV1  = IDOC_CONTRL-IDOCTP.

    IDOC_STATUS-MSGV1  = XTHEAD-TDNAME.

    IDOC_STATUS-MSGID  = '38'.

    IDOC_STATUS-MSGNO  = '000'.

    IF OK NE 0.

      IDOC_STATUS-STATUS = '51'.

      IDOC_STATUS-MSGTY  = 'E'.

    ELSE.

      IDOC_STATUS-STATUS = '53'.

      IDOC_STATUS-MSGTY  = 'S'.

      CALL_TRANSACTION_DONE = 'X'.

    ENDIF.

    APPEND IDOC_STATUS.

ENDFUNCTION.

Figure 5:        Sample IDoc inbound function module

Inbound processing function module

This example of a simple inbound function module expects as input an IDoc with rows of plain text as created in the outbound example above. The procedure will extract the text name and the text line from the IDoc and hand over the text data to the function module SAVE_TEXT which will store the text in the text pool.

Unpacking the IDoc data

The received IDoc data is processed record by record and data is sorted out according to the segment type.

* *** --- Unpacking the IDoc --- ***

  LOOP AT IDOC_DATA.

     CASE IDOC_DATA-SEGNAM.

      WHEN 'YAXX_THEAD'.

            PERFORM UNPACK_IDOC TABLES IDOC_DATA USING XTHEAD.

      WHEN 'YAXX_TLINE'.

            PERFORM UNPACK_TAB  TABLES IDOC_DATA TLINES.

       ENDCASE.

  ENDLOOP.
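The forms UNPACK_IDOC and UNPACK_TAB are helper routines of this example, not standard SAP routines. A minimal sketch, assuming they only copy the 1000 character SDATA field into the target work area or internal table, could look like this:

FORM UNPACK_IDOC TABLES IDOC_DATA STRUCTURE EDIDD
                 USING  TARGET.
* copy the segment data of the current record into the work area
  MOVE IDOC_DATA-SDATA TO TARGET.
ENDFORM.

FORM UNPACK_TAB TABLES IDOC_DATA  STRUCTURE EDIDD
                       TARGET_TAB STRUCTURE TLINE.
* append the segment data of the current record to the text line table
  MOVE IDOC_DATA-SDATA TO TARGET_TAB.
  APPEND TARGET_TAB.
ENDFORM.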

 

When the IDoc is unpacked, the data is passed to the application.

* *** --- Calling the application to process the received data --- ***

  CALL FUNCTION 'SAVE_TEXT'

       EXPORTING

            HEADER          = XTHEAD

       TABLES

            LINES           = TLINES.

Figure 6:        Storing data

Finally the processing routine needs to pass a status record to the IDoc processor. This status indicates successful or unsuccessful processing and will be added as a log entry to the table EDIDS.

* fill IDOC_Status

    IF OK NE 0.

      IDOC_STATUS-STATUS = '51'.

*      IDOC_STATUS-.. = . fill the other fields to log information

    ELSE.

      IDOC_STATUS-STATUS = '53'.

    ENDIF.

    APPEND IDOC_STATUS.

Figure 7:        Writing a status log

The status value '51' indicates a general error during application processing and the status '53' indicates everything is OK.

This chapter addresses common expressions used in the context of IDocs. You should be familiar with them. Some are also used in a non-IDoc context with a completely different meaning, e.g. the term message, so be careful to avoid misunderstandings. Many fights in project teams arise from different interpretations of the same expression.

There are several common expressions and methods that you need to know when dealing with IDocs.


Message Type

The message type defines the semantic context of an IDoc. The message type tells the processing routines how the message has to be interpreted.

The same IDoc data can be sent with different message types, e.g. the same IDoc structure which is used for a purchase order can also be used for transmitting a sales order. Imagine the situation that you receive a sales order from your clients and, in addition, receive copies of sales orders sent by a subsidiary of your company.

IDoc Type

An IDoc type defines the syntax of the IDoc data. It tells which segments are found in an IDoc and which fields the segments are made of.

Processing Code

The processing code is a logical name that determines the processing routine. This usually points to a function module, but the processing routine can also be a workflow or an event.

The use of a logical processing code makes it easy to modify the processing routine for a series of partner profiles at once.

Partner profile

Every sender-receiver relationship needs a profile defined. This profile determines

·         the processing code,

·         the processing times and conditions, and,

·         in the case of outbound IDocs,

·                the port (medium) used to send the IDoc and

·                the triggers used to send the IDoc.

Partner Type

The IDoc partners are classified in logical groups. Up to release 4.5 there were the following standard partner types defined: LS, KU, LI.

LS - Logical Systems

The logical system is meant to be a different computer and was primarily introduced for use with the ALE functionality. You would use a partner type of LS, when linking with a different computer system, e.g. a legacy or subsystem.

KU - Customer [ger.: Kunde]

The partner type customer is used in classical EDI transmission to designate a partner that requires a service from your company or is in the role of a debtor with respect to your company, e.g. the payer, sold-to party or ship-to party.

LI - Supplier [Ger.: Lieferant]

The partner type supplier is used in classical EDI transmission to designate a partner that delivers a service to your company. This is typically the supplier in a purchase order. In SD orders you also find LI type partners, e.g. the shipping agent.

Message Type – How to Know What the Data Means

Data exchanged by an IDoc via EDI is known as a message. Messages of the same kind belong to the same message type.

 

Define the semantic context

The message type defines the semantic context of an IDoc. The message type tells the receiver how the message has to be interpreted.

Messages are information for a foreign partner

The term message is commonly used in communication, be it EDI or telecommunication. Any stream of data sent to a receiver with well-defined information in it is known as a message. EDIFACT, ANSI/X.12, XML and others use the term message the same way.

The term message is also used for R/3’s internal communication between applications

Unfortunately, the term message is used in many contexts other than EDI as well. Even R/3 uses the word message for the internal communication between applications. While this is totally OK from the abstract point of view of data modelling, it may sometimes cause confusion if it is unclear whether we are referring to IDoc messages or internal messages.

The specification of the message type along with the sent IDoc package is especially important when the physical IDoc type (the data structure of the IDoc file) is used for different purposes.

A classical ambiguity arises in communication with customs via EDI. They usually set up a universal file format for an arbitrary kind of declaration, e.g. Intrastat, Extrastat, Export declarations, monthly reports etc. Depending on the message type, only applicable fields are filled with valid data. The message type tells the receiver which fields are of interest at all.

Partner Profiles – How to Know the Format of the Partner

Different partners may speak different languages. While the information remains the same, different receivers may require completely different file formats and communication protocols. This information is stored in a partner profile.

Partner profiles are the catalogue of active EDI connections from and to R/3

In a partner profile you will specify the names of the partners which are allowed to exchange IDocs with your system. For each partner you have to list the message types that the partner may send.

Partner profiles store the IDoc type to use

For any such message type, the profile tells the IDoc type, which the partner expects for that kind of message.

Outbound customising defines how data is electronically exchanged

For outbound processing, the partner profile also sets the media to transport the data to its receiver, e.g.

·         an operating system file

·         automated FTP

·         XML or EDIFACT transmission via a broker/converter

·         internet

·         direct remote function call

The means of transport depends on the receiving partner, the IDoc type and message type (context).

Different partners, different profiles

Therefore,  you may choose to send the same data as a file to your vendor and via FTP to your remote plant.

Also you may decide to exchange purchase data with a vendor via FTP but send payment notes to the same vendor in a file.

Inbound customising determines the processing routine

For inbound processing, the partner profile customising will also determine a processing code which can handle the received data.

The partner profile may tell you the following:

 

 

Partner type    Partner      Message            Port       IDoc type   Processing code

Supplier        MAK_CO       SHIPPING_ADVISE    INTERNET   SHPADV01    SHIPMENTLEFT

Sales agent     LOWSELL      SALESORDERS        RFCLINK    ORDERS01    CUSTOMERORDER

Sales agent     SUPERSELL    SALESORDERS        RFCLINK    ORDERS01    AGENTORDER

 

IDoc Type – The Structure of the IDoc File

The IDoc type is the name of the data structure used to describe the file format of a specific IDoc.

IDoc type defines the structure of the segments

An IDoc is a segmented data file. It typically has several segments. The segments are usually structured into fields; however, different segments use different fields.

The IDoc type is defined with transaction WE30; the respective segments are defined with transaction WE31.

The processing code is a pointer to an algorithm to process an IDoc. It is used to allow more flexibility in assigning the processing function to an IDoc message.

The logical processing code determines the algorithm in R/3 used to process the IDoc

The processing code is a logical name for the algorithm used to process the IDoc. The processing code points itself to a method or function, which is capable of processing the IDoc data.

 

A processing code can point to an SAP predefined or a self-written business object or function module as long as they comply with certain interface standards.

Allows changing the algorithm easily

The processing codes allow you to easily change the processing algorithm. Because the process code can be used for more than one partner profile, the algorithm can be easily changed for every concerned IDoc.

The processing code defines a method or function to process an IDoc

The IDoc engine will call a function module or a business object which is expected to perform the application processing for the received IDoc data. The function module must provide exactly the interface parameters which are needed to call it from the IDoc engine.

                                                  

In addition to writing the processing function modules, IDoc development requires the definition of the segment structures and a series of customising settings to control the flow of the IDoc engine.

Summary

·         Customise basic installation parameters

·         Define segment structures

·         Define message types and processing codes

Segments define the structure of the records in an IDoc. They are defined with transaction WE31.

Check first whether the client you are working with already has a logical system name assigned.

T000 – name of own logical system

The logical system name is stored in table T000 as T000‑LOGSYS. This is the table of installed clients.
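The logical system name of the current client can be read in ABAP with a simple lookup; a minimal sketch:

* read the own logical system name from the client table T000
TABLES: T000.

SELECT SINGLE * FROM T000 WHERE MANDT EQ SY-MANDT.
WRITE: / 'Own logical system:', T000-LOGSYS.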

TBDLS – list of known logical destinations

If there is no name defined, you need to create a logical system name. This simply means adding a line to table TBDLS. You can edit the table directly or access it from transaction SALE.

Naming conventions:
DEVCLNT100
PROCLNT123
TSTCLNT999

The recommended naming convention is

sysid + "CLNT" + client

If your system is DEV and client is 100, then the logical system name should be: DEVCLNT100.

System PRO with client 123 would be PROCLNT123 etc.
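Assuming you follow this convention, the name can also be derived from the system fields; a minimal sketch:

* derive the recommended logical system name: sysid + 'CLNT' + client
DATA: LOGSYS LIKE T000-LOGSYS.

CONCATENATE SY-SYSID 'CLNT' SY-MANDT INTO LOGSYS.
* e.g. system DEV, client 100 gives DEVCLNT100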

 SM59 – define physical destination and characteristics of a logical system

The logical system also needs to  be defined as a target within the R/3 network. Those definitions are done with transaction SM59 and are usually part of the work of the R/3 basis team.

Figure 8:        Step to customise outbound IDoc processing

Figure 9:        Elements that influence IDoc processing

Segments define the structure of the records in an IDoc. They are defined with transaction WE31. We will define a structure to send a text from the text database.

Define a DDic structure with WE31

Transaction WE31 calls the IDoc segment editor. The editor defines the fields of a single segment structure. The IDoc segment defined this way is then created as a data dictionary structure. You can view the created structure with SE11 and use it in an ABAP like any other TABLES declaration.

Example:

To demonstrate the use of the IDoc segment editor we will set up an example, which allows you to send a single text from the text pool (tables STXH and STXL) as an IDoc. These are the texts that you can see with SO10 or edit from within many applications.

We will show the steps to define an IDoc segment YAXX_THEAD with the DDic structure of THEAD.

Figure 10:     WE31, Defining the IDoc segment

  

Figure 11:     Naming the segment

Figure 12:     Selecting a template

Copy the segment structure from a DDic object

To facilitate our work, we will use the "copy-from-template" tool, which reads the definition of a DDIC structure and inserts the fields and the matching definitions as rows in the IDoc editor. You could, of course, define the structure completely manually, but using the template makes it easier.

Figure 13:     Now select it really

The tool in release 4.0B lets you use either a DDIC structure or another IDoc segment definition as a template.

Figure 14:     Created structure

The definition automatically creates a corresponding DDIC structure

The thus created structure can be edited any time. When saving, it will create a data dictionary structure based on the definition in WE31. The DDIC structure will retain the same name. You can view the structure as a table definition with SE11 and use it in an ABAP the same way.
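A minimal sketch of using the generated structure in an ABAP (YAXX_THEAD is the segment created above; the text name is a made-up value):

* the generated segment can be used like any other DDIC structure
TABLES: YAXX_THEAD.

DATA: WA_SEGMENT LIKE YAXX_THEAD.

WA_SEGMENT-TDNAME = 'COOKBOOK_SAMPLE'.   "hypothetical text name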

The message type defines the context under which an IDoc is transferred to its destination. It allows for using the same IDoc file format for several different applications.

Sales order becomes purchase order for receiver

Imagine the situation of sending a purchase order to a supplier. When the IDoc with the purchase order reaches the supplier, it will be interpreted as a sales order received from a customer, namely you.

Sales order can be forwarded and remains a sales order

Simultaneously you want to send the IDoc data to the supplier's warehouse to inform it that a purchase order has been issued and is on the way.

Both IDoc receivers will receive the same IDoc format; however, the IDoc will be tagged with a different message type. While the IDoc to the supplier will be flagged as a purchase order (in SAP R/3 standard: message type = ORDERS), the same IDoc sent to the warehouse should be flagged differently, so that the warehouse can recognize the order as a mere informational copy and process it differently than a true purchase order.

Message type plus IDoc type determine processing algorithm

The message type together with the IDoc type determine the processing function.

EDMSG

The message types are stored in table EDMSG.

WEDI

Defining the message type can be done from the transaction WEDI.

Figure 15:     EDMSG: Defining the message type (1)

EDMSG used as check table

The entry is only a base entry which tells the system that the message type is allowed. Other transactions will use that table as a check table to validate the entry.

Figure 16:     EDMSG: Defining the message type (2)

The valid combinations of message type and IDoc type are stored in table EDIMSG.

Used for validation

The declaration of valid combinations is done to allow validation of whether the system can handle a certain combination.

Figure 17:     EDIMSG: Define valid combination of message and IDoc types

The combination of message type and IDoc type determines the processing algorithm. This is usually a function module with a well-defined interface or an SAP business object, and it is set up in table EDIFCT.

The entry made here points to a function module which will be called when the IDoc is to be processed.

The entries for message code and message function are usually left blank. They can be used to derive sub types of messages together with the partner profile used.

Figure 18:     Assign a handler function to a message/message type

The definition for inbound and outbound IDocs is analogous. Of course, the function module will be different.

R/3 uses logical process codes to decouple the IDoc processing from the processing function module. A process code assigns a logical name to the function instead of specifying the physical function name.

Logical pointer to a processing method

The IDoc functions are often used for a series of message type/IDoc type combinations. It is sometimes necessary to replace the processing function with a different one, e.g. when you make a copy of a standard function to avoid modifying the standard.

Easily replacing  the processing method

The combination of message type and IDoc type determines the logical processing code, which itself points to a function. If the function changes, only the definition of the processing code needs to be changed, and the new function is immediately effective for all IDocs associated with the process code.

For inbound processing codes you have to specify the method to use for the determination of the inbound function.

Figure 19:     Assign an outbound processing code (Step 1)

Processing with ALE

This is the option you would usually choose. It allows processing via the ALE scenarios.

Figure 20:     Associate a processing code with a message type

Validate allowed message types

After defining the processing code you have to assign it to one or several logical message types. This declaration is used to validate, if a message can be handled by the receiving system.

The inbound processing code is assigned analogously. The processing code is a pointer to a function module which can handle the inbound request for the specified IDoc and message type.

The definition of the processing code identifies the handler routine and assigns a series of processing options.

Processing with ALE

You need to click "Processing with ALE", if your function can be used via the ALE engine. This is the option you would usually choose. It allows processing via the ALE scenarios.

Associate a function module with a process code

Table TBD51 to define if visible BTCI is allowed

For inbound processing you need to indicate whether the function will be capable of dialog processing. This is meant for those functions which process the inbound data via call transaction. Those functions can be replayed in visible batch input mode to check why the processing might have failed.

Figure 21:     Define if the processing can be done in dialog via call transaction

Validate allowed message types

After defining the processing code, you have to assign it to one or several logical message types. This declaration is used to validate, if a message can be handled by the receiving system.

Figure 22:     Associate a processing code with a message type

The examples above show only the association with a function module. You can also define business objects with transaction SWO1 and define them as a handler. For those familiar with the object model of R/3, it may be a design decision. In this book, we will deal with the function modules only.

IDocs should be sent out at certain events; therefore you have to define a trigger. A lot of consideration is required to determine the correct moment to send out the IDoc. The IDoc can be triggered at a certain time or when an event is raised. R/3 uses several completely different methods to determine the trigger point: there are messages to tell the system that there is an IDoc waiting for dispatching, there are log files which may be evaluated to see if IDocs are due to be sent, and a workflow chain can be triggered which includes the sending of the IDoc.

 

Figure 23:     General Process logic of IDoc outbound

 

The simplest way to create IDocs is to write an ABAP. The individual ABAP can either be a triggering ABAP which runs at certain events, e.g. every night, or it can be an ABAP which does the complete IDoc creation from scratch.

Triggering ABAP

A triggering ABAP would simply try to determine which IDocs need sending and call the appropriate IDoc creation routines.

ABAP creates the whole IDoc

You may also let the ABAP do the whole job. As this is mostly reinventing the wheel, it is not really recommended and should be reserved for situations where the other solutions do not provide an appropriate means.

You can use the R/3 message concept to trigger IDocs the same way as you trigger SAPscript printing.

One of the key tables in R/3 is the table NAST. This table records reminders written by applications. Those reminders are called messages.

Applications write messages to NAST, which will be processed by a message handler

Every time an application sees the necessity to pass information to a third party, a message is written to NAST. A message handler will eventually check the entries in the table and trigger an appropriate action.

EDI uses the same mechanism as printing

The concept of NAST messages has originally been designed for triggering SAPscript printing. The very same mechanism is used for IDocs, where the IDoc processor replaces the print task, as an IDoc is only the paperless form of a printed document.

Condition technique can mostly be used

The messages are usually created using the condition technique, a mechanism available to all major R/3 applications.

Printing, EDI and ALE use the same trigger

The conditions are set up the same way for any output media. So you may define a condition for printing a document and then just change the output media from printer to IDoc/EDI or ALE.

Figure 24:     Communicating with message via table NAST

NAST messages are created by application by calling function module MESSAGING

Creating NAST messages is a standard functionality in most of the SAP core applications. Those applications - e.g. VA01, ME21 - perform calls to the central function module MESSAGING of group V61B. The  function module uses customizing entries, mainly those of the tables T681* to T685*.

NAST contains object key, sender and receiver

A NAST output message is stored as a single record in the table NAST. The record stores all information that is necessary to create an IDoc. This mainly includes an object key, which identifies the processed object and the application to the message handler, as well as the sender and receiver information.

Programs RSNAST00 and RSNASTED provide versatile subroutines for NAST processing

The messages are typically processed by

·         FORM ENTRY in PROGRAM RSNAST00 if we are dealing with printing or faxing,

·         FORM EDI_PROCESSING in PROGRAM RSNASTED if we are dealing with IDocs, and

·         FORM ALE_PROCESSING in PROGRAM RSNASTED if we are dealing with ALE.

The following piece of code does in principle the same thing as RSNAST00 and makes full use of all customizing settings for message handling.

FORM einzelnachricht IN PROGRAM RSNAST00

TABLES: NAST.
DATA: subrc LIKE sy-subrc.

SELECT * FROM NAST WHERE ...
  PERFORM einzelnachricht IN PROGRAM RSNAST00 USING subrc.
ENDSELECT.

Programs are customized in table TNAPR

The processing routine for the respective media and message is customized in the table TNAPR. This table records the name of a FORM routine, which processes the message for the chosen media and the name of an ABAP where this FORM is found.

The ABAP RSNAST00 is the standard ABAP, which is used to collect unprocessed NAST message and to execute the assigned action.

RSNAST00 is the standard batch collector for messages

RSNAST00 can be executed as a collector batch run, that eventually looks for unprocessed IDocs. The usual way of doing that is to define a batch-run job with transaction SM37. This job has to be set for periodic processing and start a program that triggers the IDoc re-sending.

RSNAST00 processes only messages of a certain status

Caution! RSNAST00 will only look for NAST messages which are set to NAST-VSZTP = '1' or '2' (time of processing). VSZTP = '3' or '4' is ignored by RSNAST00.

For batch execution a selection variant is required

Start RSNAST00 in the foreground first and find the parameters that match your required selection criteria. Save them as a VARIANT and then define the periodic batch job using the variant.

If RSNAST00 does not meet your needs 100%, you can create your own program similar to RSNAST00. The only requirement for this program are the following two steps:

* Read the NAST entry to process into the structure NAST
TABLES: NAST.
DATA: subrc LIKE sy-subrc.
SELECT * FROM NAST WHERE ...
* then call FORM einzelnachricht(RSNAST00) to process the record
  PERFORM einzelnachricht(RSNAST00) USING subrc.
ENDSELECT.

Standard R/3 provides you with powerful routines, to trigger, prepare and send out IDocs in a controlled way. There are only a few rare cases, where you do not want to send IDocs the standard way.

The ABAP RSNAST00 is the standard routine to send IDocs from entries in the message control. This program can be called directly or from a batch routine with a variant, or you can call the FORM einzelnachricht_screen(RSNAST00) from any other program, provided the structure NAST is correctly filled with all necessary information.

RSNAST00 determines if it is IDoc or SAPscript etc.

If there is an entry in table NAST, RSNAST00 looks up the associated processing routine in table TNAPR. If an IDoc is to be sent by standard means, this will usually be the routine EDI_PROCESSING(RSNASTED), or ALE_PROCESSING(RSNASTED) in the case of ALE distribution.

RSNASTED processes IDocs

RSNASTED itself determines the associated IDoc outbound function module, executes it to fill the EDIDx tables and passes the prepared IDoc to the port.

You can call the standard processing routines from any ABAP by executing the following call to the routine. You only have to make sure that the structure NAST is declared with the TABLES statement in the calling program and that you fill at least the key part and the routine (TNAPR) information beforehand.

TABLES NAST.

NAST-MANDT = SY-MANDT.

NAST-KSCHL = 'ZEDIK'.

NAST-KAPPL = 'V1'.

NAST-OBJKY = '0012345678'.

NAST-PARNR = 'D012345678'.

PERFORM einzelnachricht_screen(RSNAST00).

Calling einzelnachricht_screen determines how the message is processed. If you want to force the IDoc-processing you can call it directly:

TNAPR-PROGN = ''.

TNAPR-ROUTN = 'ENTRY'.

PERFORM edi_processing(RSNASTED).

Here is the principal flow of how RSNAST00 processes messages for IDocs.

Figure 25:     Process logic of RSNAST00 ABAP

Unfortunately, there are applications that do not create messages. This is especially true for master data applications. However, most applications fire a workflow event during update, which can easily be used to trigger the IDoc distribution.

SWE_EVENT_CREATE

Many SAP R/3 applications issue a call to the function SWE_EVENT_CREATE during update. This function module ignites a simple workflow event.

Workflow is a call to a function module

Technically a workflow event is a timed call to a function module, which takes the issuing event as the key to process a subsequent action.

Applications with change documents always trigger workflow events

If an application writes regular change documents (German: Änderungsbelege) to the database, it will automatically issue a workflow event. This event is triggered from within the function CHANGEDOCUMENT_CLOSE. The change document workflow event is always triggered, independent of whether a change document is actually written.

Workflow coupling can be done by utility functions

In order to make use of the workflow for IDoc processing, you do not have to go through the cumbersome workflow design procedure as described in the workflow documentation. For the purpose mentioned, you can register the workflow handler via the menu entry Event Coupling in transaction BALD.

Workflow cannot easily be restarted

Triggering the IDoc from a workflow event has a disadvantage: if the IDoc has to be repeated for some reason, the event cannot easily be repeated. This is due to the nature of a workflow event, which is usually triggered from a preceding action. Therefore you have to find your own way to make sure that the IDoc is actually generated, even in the case of an error. In practice this is not a very big problem for IDocs. In most cases the creation of the IDoc will take place anyway. If there is a problem, the IDoc is stored in the IDoc base with a corresponding status; it will be shown in transaction WE05 and can be resent from there.

Instead of waiting for a polling job to create IDocs, they can also be created immediately after a transaction finishes. This can be done by assigning an action to a workflow event.

Workflow events are usually fired from an update routine

Most applications fire a workflow event from the update routine by calling the function

FUNCTION swe_event_create
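A minimal sketch of such a call is shown below. The object type and event name are made-up placeholders, and the parameter names should be verified against the interface of SWE_EVENT_CREATE in your release before relying on them.

* a sketch only: raise a workflow event for a processed object
* ('ZSAMPLE' and 'CREATED' are placeholders; check the interface in SE37)
DATA: OBJKEY(70) TYPE C.

OBJKEY = '0012345678'.                  "key of the processed object

CALL FUNCTION 'SWE_EVENT_CREATE'
     EXPORTING
          objtype = 'ZSAMPLE'
          objkey  = OBJKEY
          event   = 'CREATED'
     EXCEPTIONS
          OTHERS  = 1.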

SWLD lets you install and log workflows

You can check if an application fires events by activating the event log from transaction SWLD. Calling and saving a transaction will write the event’s name and circumstances into the log file.

If an application does not fire workflow events directly, there is still another chance that a workflow may be used without touching the R/3 original programs.

Workflow Events are also fired from change document

Every application that writes change documents triggers a workflow event from within the function module CHANGEDOCUMENT_CLOSE, which is called from the update processing upon writing the change document. This will call the workflow processor

FUNCTION swe_event_create_changedocument

Both workflow types are not compatible with each other with respect to the function modules used to handle the event.

The workflow types are incompatible but work according to the same principle

Both will call a function module whose name they find in the workflow linkage tables. swe_event_create will look in table SWETYPECOU while swe_event_create_changedocument would look in SWECDOBJ for the name of the function module.

The workflow handler will be called dynamically

If a name is found, the function module is then called dynamically. That is essentially all there is to the linkage of the workflow.

The dynamic call looks like the following.

CALL FUNCTION swecdobj-objtypefb
     EXPORTING
          changedocument_header   = changedocument_header
          objecttype              = swecdobj-objtype
     IMPORTING
          objecttype              = swecdobj-objtype
     TABLES
          changedocument_position = changedocument_position.

Applications which write change documents will also try to write change pointers for ALE operations. These are log entries to remember all modified data records relevant for ALE.

Most applications write change documents. These are primarily log entries in the tables CDHDR and CDPOS.

Change docs remember changes in transaction

Change documents remember the modified fields made to the database by an application. They also remember the user name and the time when the modification took place.

Data elements are marked to be relevant for change documents

The decision whether a field modification is relevant for a change document is triggered by a flag of the modified field’s data element. You can set the flag with SE11 by modifying the data element.

ALE may need other triggers

For the purpose of distributing data via ALE to other systems, you may want to choose other fields, which shall be regarded relevant for triggering a distribution.

Therefore R/3 introduced the concept of change pointers, which are nothing else than a second log file specially designed for writing the change pointers which are meant to trigger IDoc distribution via ALE.

Change pointers remember key of the document

So the change pointers will remember the key of the document every time when a relevant field has changed.

An ABAP creates the IDocs

Change pointers are then evaluated by an ABAP which calls the IDoc creation, for every modified document found in the change pointers.

Change pointers are written when change documents have been written

The change pointers are written from the routine CHANGEDOCUMENT_CLOSE when the generated change document is saved. So change pointers are written automatically whenever a relevant document changes.

The following function is called from within CHANGEDOCUMENT_CLOSE in order to write the change pointers.

CALL FUNCTION 'CHANGE_POINTERS_CREATE'
EXPORTING
change_document_header = cdhdr
TABLES
change_document_position = ins_cdpos.

Change pointers are log entries to table BDCP which are written every time a transaction modifies certain fields. The change pointers are designed for ALE distribution and written by the function CHANGEDOCUMENT_CLOSE.

Change pointers are written for use with ALE. There are ABAPs like RBDMIDOC which can read the change pointers and trigger an IDoc for ALE distribution.

The change pointers are largely the same as change documents. They can, however, be set up differently, so the fields which trigger change documents are not necessarily the same ones that cause change pointers to be written.

In order to work with change pointers there are two steps to be performed

·         Turn on change pointer update generally

·         Decide which message types shall be included for change pointer update

Activate Change Pointer Generally

R/3 allows you to activate or deactivate the change pointer update. For this purpose it maintains the table TBDA1. The decision whether the change pointer update is active is made by the function ALE_COMPONENT_CHECK.

 

Function Ale_Component_Check

Currently (release 4.0B) this check does nothing more than check whether this table has an entry or not. If there is an entry in TBDA1, the ALE change pointers are generally active. If the table is empty, change pointers are turned off for everybody and everything, regardless of the other settings.

The two points read as if you had the choice between turning change pointers on generally or selectively. This is not the case: you always turn them on selectively. The general switch is meant to activate or deactivate the whole mechanism.

reading the change pointers which are not yet processed

The change pointers which have not been processed yet can be read with a function module.

Call Function 'CHANGE_POINTERS_READ'
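A sketch of such a call is shown below; the parameter names are given as commonly used for this function, but verify them in SE37 for your release before relying on them. The message type MATMAS is just an example.

* a sketch only: read the unprocessed change pointers for one message type
DATA: T_CHGPTRS LIKE BDCP OCCURS 0 WITH HEADER LINE.

CALL FUNCTION 'CHANGE_POINTERS_READ'
     EXPORTING
          message_type    = 'MATMAS'
     TABLES
          change_pointers = T_CHGPTRS.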

RBDMIDOC

The ABAP RBDMIDOC will process all open change pointers and distribute the matching IDocs.

Use Change Documents Instead Of Change Pointers

When you want to send out an IDoc unconditionally every time a transaction updates, you had better use the workflow triggered from the change documents.

Change pointers must be processed by an ABAP, e.g. RBDMIDOC.

RBDMIDOC processes change pointers and sends the IDocs

The actual distribution of documents from change pointers must be done by an ABAP which reads the change pointers and processes them. The standard ABAP for that is RBDMIDOC. For recurring execution it can be submitted as a scheduled job using SM37.

Function module defined in table TBDME

It then calls dynamically a function module whose name is stored in table TBDME for each message type.

CALL FUNCTION tbdme-idocfbname
     EXPORTING
          message_type       = mestyp
          creation_date_high = date
          creation_time_high = time
     EXCEPTIONS
          error_code_1       = 1.

Example

A complex example for a function module, which collects the change pointers, can be examined in:

MASTERIDOC_CREATE_SMD_DEBMAS .

This one reads change pointers for debtors (customer masters). During the processing, it calls the actual IDoc creating module MASTERIDOC_CREATE_DEBMAS .

To summarize the change pointer concept

·         Change pointers record relevant updates of transaction data

·         Change pointers are written separately from the change documents, but at the same time as the change documents

·         Change pointers are evaluated by a collector run

BDCPS

Change pointer: Status

BDCP

Change pointer

BDCPV

A view with BDCP and BDCPS combined: Change pointer with status

TBDA2

Activate message types for change pointers with view V_TBDA2, with transaction BD50, or via
SALE -> Activate change pointers for message types

TBD62

The view V_TBD62 defines those fields which are relevant for change pointer creation. The table is evaluated by the CHANGEDOCUMENT_CLOSE function. The object is the same as the one used by the change document. To find out the object name, look for CHANGEDOCUMENT_CLOSE in the transaction you are inspecting or see table CDHDR for traces.

Figure 26:     Tables involved in change pointers processing

Sample content of view V_TBD62

Object     Table name     Field
DEBI       KNA1           NAME3
DEBI       KNA1           ORT01
DEBI       KNA1           REGIO

Figure 27:     Sample content of view V_TBD62

This chapter will show you how an IDoc function is principally designed and how R/3 processes the IDocs. I cannot stop repeating that writing IDoc processing routines is a pretty simple task. With a number of recipes on hand, you can easily build your own processors.

IDocs are usually created in a four step process: retrieving the data, converting it to IDoc format, adding a control record, and delivering the IDoc to a port.

Collect data from R/3 database

This is the single most important task in outbound processing. You have to identify the database tables and data dependencies which are needed in the IDoc to be sent. The smartest way is usually to select the data from the database into an internal table using SELECT * FROM dbtable INTO itab ... WHERE ... (a minimal sketch follows below).
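As a minimal sketch of this step, the following selects the text headers from STXH (one of the tables used in the text example of this book) into an internal table:

* collect the application data into an internal table
TABLES: STXH.

DATA: T_STXH LIKE STXH OCCURS 0 WITH HEADER LINE.

SELECT * FROM STXH INTO TABLE T_STXH
       WHERE TDOBJECT EQ 'TEXT'
       AND   TDID     EQ 'ST'.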

Wrap data in IDoc format

The collected data must be transformed into ASCII data and filled into the predefined IDoc segment structures. The segment definitions are done with transaction WE31 and the segments allowed in an IDoc type are set up in transaction WE30. Segments defined with WE31 are automatically created as SAP DDIC structures. They can be viewed with SE11, however, they cannot be edited.

Create the IDoc control record

Every IDoc must be accompanied by a control record which must contain at least the IDoc type, to identify the syntactical structure of the data, and the name and role of the sender and the receiver. This header information is checked against the outbound partner definitions. Only if a matching partner definition exists can the IDoc be sent. Partner definitions are set up with transaction WE20.

Send data to port

When the partner profile check succeeds, the IDoc is forwarded to a logical port, which is also assigned in the partner profile. This port is set up with transaction WE21 and defines the medium used to transport the IDoc, e.g. file or RFC. The RFC destinations are set up with transaction SM59 and must also be entered in table TBDLS via an SM31 view. Directories for outbound file locations are set up with transaction FILE and directly in WE21. WE21 also allows the use of a function module which generates file names; standard functions for that purpose have names like EDI_FILE*.

When you receive an IDoc the standard way, the data is stored in the IDoc base and a function module is called, which decides how to process the received information.

EDID4 - Data

Data is stored in table EDID4 (EDID3 up to release 3.xx, EDIDD up to release 2.xx)

EDIDC - Control Record

An accompanying control record with important context and administrative information is stored in table EDIDC.

Event signals readiness

After the data is stored in the IDoc base tables, an event is fired to signal that there is an IDoc waiting for processing. This event is consumed by the IDoc handler, which decides, whether to process the IDoc immediately, postpone processing, or decline activity for whatever reason.

EDIFCT - Processing function

When the IDoc processor thinks it is time to process the IDoc it will search the  table EDIFCT , where it should find the name of a function module which will be called to process the IDoc data.

This function module is the heart of all inbound processing. The IDoc processor will call this routine and pass the IDoc data from EDID4 and the control record from EDIDC for the respective IDoc.

Function has a fixed interface

Because this routine is called dynamically, it must adhere to a strict convention. All function interface parameters must exactly match the calling convention. For exact specifications see "Interface Structure of IDoc Processing Functions" later in this chapter.

EDIDS - Status log

The processing steps and their respective status results are stored in table EDIDS.

Status must be logged properly

In addition, the routine has to properly determine  the next status of the IDoc in table EDIDS; usually it will be EDIDS-STATU = 53 for OK or 51 for error.

R/3 provides a sophisticated IDoc processing framework. This framework determines a function module which is responsible for creating or processing the IDoc.

Function module to generate the IDoc

The kernel of the IDoc processing is always a distinct function module. For the outbound processing, the function module creates the IDoc and leaves it in an internal table, which is passed as an interface parameter.

During inbound processing the function module receives the IDoc via an interface parameter table. It would interpret the IDoc data and typically update the database either directly or via a call transaction.

Function are called dynamically

The function modules are called dynamically from a standard routine. Therefore, the function must adhere to a well-defined interface.

Function group EDIN with useful routines

You may want to investigate the function group EDIN, which contains a number of IDoc handler routines and would call the customised function.

Copy and modify existing routines

The easiest way to start the development of an outbound IDoc function module is to copy an existing one. There are many samples in the standard R/3 repository; most are named IDOC_OUTBOUND* or IDOC_OUTPUT*.

Outbound sample functions are named like IDOC_OUTPUT*

FUNCTION IDOC_OUTPUT_ORDERS01

 

 

Inbound sample functions are named like IDOC_INPUT*

FUNCTION IDOC_INPUT_ORDERS01

 

 

Outbound sample functions for master data are named like MASTERIDOC_CREATE*

FUNCTION MASTERIDOC_CREATE_MATMAS

 

 

Figure 28:     Schematic of an IDoc outbound process

To use the standard IDoc processing mechanism, the processing function module must have certain interface parameters because the function is called dynamically from a standard routine.

The automated IDoc processor will call your function module from within the program RSNASTED, usually either from the FORM ALE_PROCESSING or EDI_PROCESSING.

In order to be compatible with this automated call, the interface of the function module must be compliant.

FUNCTION Z_IDOC_OUTBOUND_SAMPLE.

*"       IMPORTING

*"             VALUE(FL_TEST) LIKE  RS38L-OPTIONAL DEFAULT 'X'

*"             VALUE(FL_COMMIT) LIKE  RS38L-OPTIONAL DEFAULT SPACE

*"       EXPORTING

*"             VALUE(F_IDOC_HEADER) LIKE  EDIDC STRUCTURE  EDIDC

*"       TABLES

*"              T_IDOC_CONTRL STRUCTURE  EDIDC

*"              T_IDOC_DATA STRUCTURE  EDIDD

*"       CHANGING

*"             VALUE(CONTROL_RECORD_IN) LIKE  EDIDC STRUCTURE  EDIDC

*"             VALUE(OBJECT) LIKE  NAST STRUCTURE  NAST

*"       EXCEPTIONS

*"              ERROR_IN_IDOC_CONTROL

*"              ERROR_WRITING_IDOC_STATUS

*"              ERROR_IN_IDOC_DATA

*"              SENDING_LOGICAL_SYSTEM_UNKNOWN

*"              UNKNOWN_ERROR

Figure 29:     Interface structure of an NAST compatible function module

Inbound functions are also called via a standard mechanism.

FUNCTION IDOC_INPUT_SOMETHING.

*"       IMPORTING

*"             VALUE(INPUT_METHOD) LIKE  BDWFAP_PAR-INPUTMETHD

*"             VALUE(MASS_PROCESSING) LIKE  BDWFAP_PAR-MASS_PROC

*"       EXPORTING

*"             VALUE(WORKFLOW_RESULT) LIKE  BDWFAP_PAR-RESULT

*"             VALUE(APPLICATION_VARIABLE) LIKE  BDWFAP_PAR-APPL_VAR

*"             VALUE(IN_UPDATE_TASK) LIKE  BDWFAP_PAR-UPDATETASK

*"             VALUE(CALL_TRANSACTION_DONE) LIKE  BDWFAP_PAR-CALLTRANS

*"       TABLES

*"              IDOC_CONTRL STRUCTURE  EDIDC

*"              IDOC_DATA STRUCTURE  EDIDD

*"              IDOC_STATUS STRUCTURE  BDIDOCSTAT

*"              RETURN_VARIABLES STRUCTURE  BDWFRETVAR

*"              SERIALIZATION_INFO STRUCTURE  BDI_SER

Figure 30:     Interface structure of an IDoc inbound function

This is an individual coding part where you need to retrieve the information from the database and prepare it in the form in which the recipient of the IDoc expects the data.

Read data to send

The first step is reading the data that you want to send from the database.

FUNCTION Y_AXX_COOKBOOK_TEXT_IDOC_OUTB.

*"----------------------------------------------------------------------

*"*"Lokale Schnittstelle:

*"       IMPORTING

*"             VALUE(I_TDOBJECT) LIKE  THEAD-TDOBJECT DEFAULT 'TEXT'

*"             VALUE(I_TDID) LIKE  THEAD-TDID DEFAULT 'ST'

*"             VALUE(I_TDNAME) LIKE  THEAD-TDNAME

*"             VALUE(I_TDSPRAS) LIKE  THEAD-TDSPRAS DEFAULT SY-LANGU

*"       EXPORTING

*"             VALUE(E_THEAD) LIKE  THEAD STRUCTURE  THEAD

*"       TABLES

*"              IDOC_DATA STRUCTURE  EDIDD OPTIONAL

*"              IDOC_CONTRL STRUCTURE  EDIDC OPTIONAL

*"              TLINES STRUCTURE  TLINE OPTIONAL

*"       EXCEPTIONS

*"              FUNCTION_NOT_EXIST

*"              VERSION_NOT_FOUND

*"----------------------------------------------------------------------

  CALL FUNCTION 'READ_TEXT'
       EXPORTING
            ID       = I_TDID
            LANGUAGE = I_TDSPRAS
            NAME     = I_TDNAME
            OBJECT   = I_TDOBJECT
       IMPORTING
            HEADER   = E_THEAD
       TABLES
            LINES    = TLINES.

* now stuff the data into the IDoc record format
  PERFORM PACK_LINE TABLES IDOC_DATA USING 'YAXX_THEAD' E_THEAD.

  LOOP AT TLINES.
    PERFORM PACK_LINE TABLES IDOC_DATA USING 'YAXX_TLINE' TLINES.
  ENDLOOP.

ENDFUNCTION.

The physical format of the IDoc records is always the same. Therefore, the application data must be converted into a 1000-character string.

Fill the data segments which make up the IDoc

An IDoc is a file with a rigid formal structure. This allows the correspondents to correctly interpret the IDoc information. If it were only for data exchange between SAP systems, the IDoc segments could simply be structured like the corresponding DDIC structures of the tables whose data is sent.

However, IDocs are usually transported to a variety of legacy systems which do not run SAP. Both correspondents therefore would agree on an IDoc structure which is known to the sending and the receiving processes.

Transfer the whole IDoc to an internal table, having the structure of EDIDD

All data needs to be compiled in an internal table with the structure of the standard SAP table EDIDD. The records for EDIDD are principally made up of a header string describing the segment and a variable length character field (called SDATA) which will contain the actual segment data.

FORM PACK_LINE TABLES IDOC_DATA STRUCTURE EDIDD
               USING  SEGNAM DATA_LINE.

* copy the application data into the 1000 character data part
  MOVE DATA_LINE TO IDOC_DATA-SDATA.

* the receiver needs the segment name in order to interpret the data
  MOVE SEGNAM TO IDOC_DATA-SEGNAM.

  APPEND IDOC_DATA.

ENDFORM.

Figure 31:     Routine to convert the application data into IDoc format

Fill control record

Finally, the control record has to be filled with meaningful data, especially telling the IDoc type and message type.

  CLEAR IDOC_CONTRL.

  IDOC_CONTRL-IDOCTP = 'YAXX_TEXT'.

  IF IDOC_CONTRL-SNDPRN IS INITIAL.

    SELECT SINGLE * FROM T000 WHERE MANDT EQ SY-MANDT.

    MOVE T000-LOGSYS TO IDOC_CONTRL-SNDPRN.

  ENDIF.

  IDOC_CONTRL-SNDPRT = 'LS'.

* The outbound options in transaction WE20 must be set accordingly:

* 2  = Transfer IDoc immediately

* 4  = Collect IDocs

  IDOC_CONTRL-OUTMOD = '2'.     "1 = immediately, start subsystem

  APPEND IDOC_CONTRL.

Figure 32:     Fill the essential information of an IDoc control record
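If the IDoc is created from an individual ABAP rather than via the NAST mechanism, the filled data and control record tables can be handed over to the ALE layer. A minimal sketch, assuming ALE distribution is appropriate for your scenario, uses the standard function MASTER_IDOC_DISTRIBUTE:

* a sketch only: hand the prepared IDoc over to the ALE layer
DATA: T_COMM_CONTROL LIKE EDIDC OCCURS 0 WITH HEADER LINE.

CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
     EXPORTING
          master_idoc_control        = IDOC_CONTRL
     TABLES
          communication_idoc_control = T_COMM_CONTROL
          master_idoc_data           = IDOC_DATA.

COMMIT WORK.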

 

This chapter will show you how an IDoc function is principally designed and how R/3 processes the IDocs. I cannot stop repeating that writing IDoc processing routines is a pretty simple task. With a number of recipes on hand, you can easily build your own processors.

IDocs are usually created in a four step process: retrieving the data, converting it to IDoc format, adding a control record, and delivering the IDoc to a port.

Collect data from R/3 database

This is the single most important  task in outbound processing. You have to identify the database tables and data dependencies which are needed in the IDoc to be sent. The smartest way is usually to select the data from the database into an internal table using SELECT * FROM dbtable INTO itab  ... WHERE ...

Wrap data in IDoc format

The collected data must be transformed into ASCII data and filled into the predefined IDoc segment structures. The segment definitions are done with transaction WE31 and the segments allowed in an IDoc type are set up in transaction WE30. Segments defined with WE31 are automatically created as SAP DDIC structures. They can be viewed with SE11, however, they cannot be edited.

Create the IDoc control record

Every IDoc must be accompanied by a control record which must contain at least the IDoc type, to identify the syntactical structure of the data, and the name and role of the sender and the receiver. This header information is checked against the outbound partner definitions. Only if a matching partner definition exists can the IDoc be sent. Partner definitions are set up with transaction WE20.

Send data to port

When the partner profile check succeeds, the IDoc is forwarded to a logical port, which is also assigned in the partner profile. This port is set up with transaction WE21 and defines the medium used to transport the IDoc, e.g. file or RFC. The RFC destinations are set up with transaction SM59 and must also be entered in table TBDLS via an SM31 view. Directories for outbound file locations are set up with transaction FILE and directly in WE21. WE21 also allows the use of a function module which generates file names; standard functions for that purpose have names like EDI_FILE*.

When you receive an IDoc the standard way, the data is stored in the IDoc base and a function module is called, which decides how to process the received information.

EDID4 - Data

Data is stored in table EDID4 (EDID3 up to release 3.xx, EDIDD up to release 2.xx)

EDIDC - Control Record

An accompanying control record with important context and administrative information is stored in table EDIDC.

Event signals readiness

After the data is stored in the IDoc base tables, an event is fired to signal that there is an IDoc waiting for processing. This event is consumed by the IDoc handler, which decides, whether to process the IDoc immediately, postpone processing, or decline activity for whatever reason.

EDIFCT - Processing function

When the IDoc processor thinks it is time to process the IDoc it will search the  table EDIFCT , where it should find the name of a function module which will be called to process the IDoc data.

This function module is the heart of all inbound processing. The IDoc processor will call this routine and pass the IDoc data from EDID4 and the control record from EDIDC for the respective IDoc.

Function has a fixed interface

Because this routine is called dynamically, it must adhere to a strict convention. All function interface parameters must exactly match the calling convention. For exact specifications see "Interface Structure of IDoc Processing Functions" later in this chapter.

EDIDS - Status log

The processing steps and their respective status results are stored in table EDIDS.

Status must be logged properly

In addition, the routine has to properly determine  the next status of the IDoc in table EDIDS; usually it will be EDIDS-STATU = 53 for OK or 51 for error.

R/3 provides a sophisticated IDoc processing framework. This framework determines a function module which is responsible for creating or processing the IDoc.

Function module to generate the IDoc

The kernel of the IDoc processing is always a distinct function module. For the outbound processing, the function module creates the IDoc and leaves it in an internal table, which is passed as an interface parameter.

During inbound processing the function module receives the IDoc via an interface parameter table. It would interpret the IDoc data and typically update the database either directly or via a call transaction.

Function are called dynamically

The function modules are called dynamically from a standard routine. Therefore, the function must adhere to a well-defined interface.

Function group EDIN with useful routines

You may want to investigate the function group EDIN, which contains a number of IDoc handler routines and would call the customised function.

Copy and modify existing routines

The easiest way to start the development of an outbound IDoc function module is to copy an existing one. There are many samples in the standard R/3 repository; most are named IDOC_OUTBOUND* or IDOC_OUTPUT*.

Outbound sample functions are named like IDOC_OUTPUT*

FUNCTION IDOC_OUTPUT_ORDERS01

 

 

Inbound sample functions are named like IDOC_INPUT*

FUNCTION IDOC_INPUT_ORDERS01

 

 

Outbound sample functions for master data are named like MASTERIDOC_CREATE*

FUNCTION MASTERIDOC_CREATE_MATMAS

 

 

Figure 33:     Schematic of an IDoc outbound process

To use the standard IDoc processing mechanism, the processing function module must have certain interface parameters because the function is called dynamically from a standard routine.

The automated IDoc processor will call your function module from within the program RSNASTED, usually either from the FORM ALE_PROCESSING or EDI_PROCESSING.

In order to be compatible with this automated call, the interface of the function module must be compliant.

FUNCTION Z_IDOC_OUTBOUND_SAMPLE.

*"       IMPORTING

*"             VALUE(FL_TEST) LIKE  RS38L-OPTIONAL DEFAULT 'X'

*"             VALUE(FL_COMMIT) LIKE  RS38L-OPTIONAL DEFAULT SPACE

*"       EXPORTING

*"             VALUE(F_IDOC_HEADER) LIKE  EDIDC STRUCTURE  EDIDC

*"       TABLES

*"              T_IDOC_CONTRL STRUCTURE  EDIDC

*"              T_IDOC_DATA STRUCTURE  EDIDD

*"       CHANGING

*"             VALUE(CONTROL_RECORD_IN) LIKE  EDIDC STRUCTURE  EDIDC

*"             VALUE(OBJECT) LIKE  NAST STRUCTURE  NAST

*"       EXCEPTIONS

*"              ERROR_IN_IDOC_CONTROL

*"              ERROR_WRITING_IDOC_STATUS

*"              ERROR_IN_IDOC_DATA

*"              SENDING_LOGICAL_SYSTEM_UNKNOWN

*"              UNKNOWN_ERROR

Figure 34:     Interface structure of an NAST compatible function module

Inbound functions are also called via a standard mechanism.

FUNCTION IDOC_INPUT_SOMETHING.

*"       IMPORTING

*"             VALUE(INPUT_METHOD) LIKE  BDWFAP_PAR-INPUTMETHD

*"             VALUE(MASS_PROCESSING) LIKE  BDWFAP_PAR-MASS_PROC

*"       EXPORTING

*"             VALUE(WORKFLOW_RESULT) LIKE  BDWFAP_PAR-RESULT

*"             VALUE(APPLICATION_VARIABLE) LIKE  BDWFAP_PAR-APPL_VAR

*"             VALUE(IN_UPDATE_TASK) LIKE  BDWFAP_PAR-UPDATETASK

*"             VALUE(CALL_TRANSACTION_DONE) LIKE  BDWFAP_PAR-CALLTRANS

*"       TABLES

*"              IDOC_CONTRL STRUCTURE  EDIDC

*"              IDOC_DATA STRUCTURE  EDIDD

*"              IDOC_STATUS STRUCTURE  BDIDOCSTAT

*"              RETURN_VARIABLES STRUCTURE  BDWFRETVAR

*"              SERIALIZATION_INFO STRUCTURE  BDI_SER

Figure 35:     Interface structure of an IDoc inbound function
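To give an impression of what the body of such an inbound function typically does, here is a minimal sketch that loops over the received IDocs, unpacks the data records and reports the result back to the IDoc engine; the segment name Z1THEAD and its work area are illustrative assumptions, not part of the SAP standard.

  TABLES: Z1THEAD.                            " illustrative custom segment structure

  LOOP AT IDOC_CONTRL.
*   collect all data records that belong to the current IDoc
    LOOP AT IDOC_DATA WHERE DOCNUM = IDOC_CONTRL-DOCNUM.
      CASE IDOC_DATA-SEGNAM.
        WHEN 'Z1THEAD'.                       " illustrative custom segment
          MOVE IDOC_DATA-SDATA TO Z1THEAD.    " unpack the 1000 character string
*         ... post the data, e.g. via CALL TRANSACTION or a direct update ...
        WHEN OTHERS.
*         ignore unknown segments or flag them as an error
      ENDCASE.
    ENDLOOP.
*   tell the IDoc engine which IDoc has been processed
    RETURN_VARIABLES-WF_PARAM   = 'Processed_IDOCs'.
    RETURN_VARIABLES-DOC_NUMBER = IDOC_CONTRL-DOCNUM.
    APPEND RETURN_VARIABLES.
*   append IDOC_STATUS with status 53 or 51 as shown before
  ENDLOOP.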

This is the individual coding part, where you retrieve the information from the database and prepare it in the form in which the recipient of the IDoc expects the data.

Read data to send

The first step is reading the data that you want to send from the database.

FUNCTION Y_AXX_COOKBOOK_TEXT_IDOC_OUTB.

*"----------------------------------------------------------------------

*"*"Lokale Schnittstelle:

*"       IMPORTING

*"             VALUE(I_TDOBJECT) LIKE  THEAD-TDOBJECT DEFAULT 'TEXT'

*"             VALUE(I_TDID) LIKE  THEAD-TDID DEFAULT 'ST'

*"             VALUE(I_TDNAME) LIKE  THEAD-TDNAME

*"             VALUE(I_TDSPRAS) LIKE  THEAD-TDSPRAS DEFAULT SY-LANGU

*"       EXPORTING

*"             VALUE(E_THEAD) LIKE  THEAD STRUCTURE  THEAD

*"       TABLES

*"              IDOC_DATA STRUCTURE  EDIDD OPTIONAL

*"              IDOC_CONTRL STRUCTURE  EDIDC OPTIONAL

*"              TLINES STRUCTURE  TLINE OPTIONAL

*"       EXCEPTIONS

*"              FUNCTION_NOT_EXIST

*"              VERSION_NOT_FOUND

*"----------------------------------------------------------------------

  CALL FUNCTION 'READ_TEXT'
       EXPORTING
            ID       = I_TDID
            LANGUAGE = I_TDSPRAS
            NAME     = I_TDNAME
            OBJECT   = I_TDOBJECT
       IMPORTING
            HEADER   = E_THEAD
       TABLES
            LINES    = TLINES.

* now stuff the data into the IDoc record format
* Z1THEAD/Z1TLINE: custom segments for text header and text lines
* (Z1TLINE is an assumed name for the line segment)
  PERFORM PACK_LINE TABLES IDOC_DATA USING 'Z1THEAD' E_THEAD.

  LOOP AT TLINES.
    PERFORM PACK_LINE TABLES IDOC_DATA USING 'Z1TLINE' TLINES.
  ENDLOOP.

ENDFUNCTION.

The physical format of the IDoc records is always the same. Therefore, the application data must be converted into a 1000 character string.

Fill the data segments which make up the IDoc

An IDoc is a file with a rigid formal structure. This allows the correspondents to correctly interpret the IDoc information. If data were exchanged between SAP systems only, the IDoc segments could simply be structured like the corresponding DDIC structures of the tables whose data is sent.

However, IDocs are usually transported to a variety of legacy systems which do not run SAP. Both correspondents therefore agree on an IDoc structure that is known to both the sending and the receiving processes.

Transfer the whole IDoc to an internal table, having the structure of EDIDD

All data needs to be compiled in an internal table with the structure of the standard SAP table EDIDD. The records for EDIDD are principally made up of a header string describing the segment and a variable length character field (called SDATA) which will contain the actual segment data.

FORM PACK_LINE TABLES IDOC_DATA STRUCTURE EDIDD
               USING  SEGNAM
                      RECORD.

* If the segment structure (e.g. Z1THEAD) differs from the application
* structure, MOVE-CORRESPONDING the record into a segment work area first.
  CLEAR IDOC_DATA.
  MOVE SEGNAM TO IDOC_DATA-SEGNAM.
  MOVE RECORD TO IDOC_DATA-SDATA.
  APPEND IDOC_DATA.

ENDFORM.

Figure 36:     Routine to move application data into the IDoc record format

Fill control record

Finally, the control record has to be filled with meaningful data, especially telling the IDoc type and message type.

  IF IDOC_CONTRL-SNDPRN IS INITIAL.
    SELECT SINGLE * FROM T000 WHERE MANDT EQ SY-MANDT.
    MOVE T000-LOGSYS TO IDOC_CONTRL-SNDPRN.
  ENDIF.

  IDOC_CONTRL-SNDPRT = 'LS'.

* Transaction WE20 -> outbound options must be set accordingly:
* 2 = transfer IDoc immediately
* 4 = collect IDocs
  IDOC_CONTRL-OUTMOD = '2'.             " 2 = transfer immediately

  IDOC_CONTRL-IDOCTP = 'YAXX_TEXT'.
  APPEND IDOC_CONTRL.

Figure 37:     Fill the essential information of an IDoc control record

R/3 defines partner profiles for every EDI partner. The profiles are used to declare the communication channels, schedule, and conditions of processing.

Summary

Partner profiles declare the communication medium to be used with a partner.

Ports define the physical characteristics of a communication channel.

If you define an ALE scenario for your IDoc partners, you can use the ALE automated partner profile generation (→ ALE).

An IDoc file requires a minimum of accompanying information to make sense of it. This is the message type together with the IDoc type. While the IDoc type tells you about the fields and segments of the IDoc file, the message type flags the context in which the IDoc was sent.

IDoc type signals syntactical structure

A receiver of an IDoc must  know the exact syntactical structure of the data package received. Naturally, the receiver only sees a text file with lines of characters. In order to interpret it, it is necessary to know which segment types the file may contain and how a segment is structured into fields. SAP sends the name of the IDoc type in the communication header.

IDoc type (WE30)

The IDoc type describes the file structure. The IDoc type is defined and viewable with transaction WE30.

Examples:

Examples of IDoc  types are MATMAS01, ORDERS01, COND_A01 or  CLSMAS01.

Message type signals the semantic context

The message type is an identifier that tags the IDoc to tell the receiver how the IDoc is meant to be interpreted. It is therefore the tag for the semantic content of the IDoc.

Examples

Examples of message types are MATMAS, ORDERS, COND_A or  CLSMAS.

For any combination of message type and receiving partner, a profile is maintained

The combination of IDoc type and message type gives the IDoc the full meaning. Theoretically, you could define only a single IDoc type for every IDoc you send. Then, all IDocs would have the same segments and the segments would always have  the same field structure. According to the context some of the record fields are filled; others are simply void. Many antiquated interfaces are still working that way.

Typical combinations of IDoc and message types are the following:

 

                              Message Type   IDoc Type
Sales order, older format     ORDERS         ORDERS01
Sales order, newer format     ORDERS         ORDERS02
Purchase requisition          PURREQ         ORDERS01

 

The example shows you that sales orders can be exchanged in different file formats. There may be some customers who accept the latest IDoc format ORDERS02, while others still insist on receiving the old format ORDERS01.

The IDoc format for sales orders would also be used to transfer a purchase requisition. While the format remains the same, the different message type signals that it is not an actual order but a request.

Partner profiles play an important role in EDI communications. They are parameter files which store the EDI partner dependent information.

Partner profiles define the type of data and the communication paths of data to be exchanged between partners

When data is exchanged between partners, it is important that sender and receiver agree on  the exact syntax and semantics of the data to be exchanged. This agreement is called a partner profile and tells the receiver the structure of the sent file and how its content is to be interpreted.

 The following information is defined with the partner profile.

For any combination of message type and receiving partner, a profile is maintained

·         IDoc type and message type as key identifier of the partner profile

·         Names of sender and receiver to exchange the IDoc information for the respective IDoc and message type 

·         Logical port name via which sender and receiver will communicate

The communication medium is assigned via the profile

If you exchange, for example, sales orders with several partners, you may do this via a different medium for each customer. One customer may communicate with you via TCP/IP (the Internet) while another still insists on receiving diskette files.

Profiles cannot be transported

They must be defined for every R/3 client individually. They cannot be transported using the R/3 transport management system. This is because the profile contains the name of the sending system, which is naturally different for consolidation and production systems.

Profiles define the allowed EDI connections

The profiles allow you to open and close EDI connection with individual partners and specify in detail which IDocs are to be exchanged via the interface.

Profiles can also be used to block an EDI communication

The profile is also the place to permanently or temporarily lock the IDoc communication with an EDI partner. With the profile you can thus shut the gate for external communication.

The transaction WE20 is used to set up the partner profile.

WE20

The profiles are defined with transaction WE20, which is also found in the EDI master menu WEDI. From there you need to specify partner and partner type and whether you define a profile for inbound or outbound. Additionally, you may assign the profile to a NAST message type.

Partner type, e.g.
LI=Supplier
KU=Customer
LS=Logical system

The partner type defines from which master data set the partner number originates. The partner types are the ones which are used in the standard applications for SD, MM or FI. The most important types for EDI are LI (=Lieferant, supplier), KU (customer) and LS (logical system). The logical system is of special interest when you exchange data with computer subsystems via ALE or other RFC means.

Inbound and outbound definitions

For every partner and every direction of communication, whether you receive or send IDocs, a different profile is maintained. The inbound profile defines the processing routine. The outbound profile mainly defines the target where the data is to be sent.

Link message type to outbound profile

If you send IDocs out of an application’s messaging, i.e. a communication via the NAST table, then you have to link the message type with an IDoc profile. This is also done in transaction WE20.

Inbound profiles determine the processing logic

The processing code is a logical name for the processing function module or object method. The processing code is used to uniquely determine a function module that will process the received IDoc data. The inbound profile will point to a processing code.

IDoc data can be sent and received through a multitude of different media. In order to decouple the definition of the media characteristics from the application using it, the media is accessed via ports.

A port is a logical name to access a physical input/output device

A port is a logical name for an input/output device. A program talks to a port which is presented to it with a common standard interface. The port takes care of the translation between the standard interface format and the device dependent format.

Communication media is defined via a port definition

Instead of defining the communication path directly in the partner profile, a port number is assigned. The port number then designates the actual medium. This allows you to define the characteristics of a port individually and use that port in multiple profiles. Changes to the port are then automatically reflected in all profiles without touching them.

Typical ports for data exchange :

Communication media

·         Disk file with a fixed name

·         Disk file with dynamic names

·         Disk file with trigger of a batch routine

·         Standard RFC connection via TCP/IP

·         A network channel

·         TCP/IP FTP destination (The Internet)

·         Call to an individual program, e.g. an EDI converter

Every program should communicate with other computers via the ports only

Every application should send or receive its data via the logical ports only. This allows you to easily change the hardware and software used to make the physical I/O connection without interfering with the program itself.

The transactions used to define the ports are

WE21 defines the port; SM59 sets up media

·         WE21         to create the port and assign a logical name, and

·         SM59         to define the physical characteristics of the I/O device used.

There are different port versions for the respective R/3 releases as shown in the matrix below:

Port types

Port Type   DDic Format   Release
1           not used      not used
2           EDID3         2.x, 3.x
3           EDID4         4.x

Figure 38:     R/3 port types by release

Port versions differ in length of fields

The difference between the port types is mainly the length of some fields. For example, port type 3 allows segment names of up to 30 characters, while port type 2 is constrained to a maximum segment name length of 8 characters.

A remote function call (RFC) enables a computer to execute a program on a different computer within the same LAN, WAN or Internet network. RFC is a common UNIX feature which is also found in other operating systems. R/3 provides special DLLs and libraries for Windows, NT and UNIX to allow RFC calls from and to R/3.

Summary

RFC can link two systems together.

RFC function modules are like standard function modules with only a few limitations.

RFC can also call programs on a non-R/3 system.

There's a story about some frogs that teaches us all a valuable lesson about life.

The story goes like this :

         A group of frogs were travelling through the woods. Two of them fell into a deep pit. All the other frogs gathered  around the pit. When they saw how deep the pit was they told the two frogs that they were as good as dead. The two frogs ignored the comments  and tried to jump up out of the pit with all of their might. The other frogs kept telling them to stop, saying that they were as good as dead. Finally, one of the frogs took heed of what the other frogs were saying and gave up. He fell down and died.

         The other frog continued to jump as hard as he could. Once again, the crowd of frogs yelled at him to stop the pain and just die. He jumped even harder and finally made it out. When he got out, the other frogs said, "Did not you hear us?" The frog explained to them that he was deaf. He thought they were encouraging him the entire time.

This story teaches us two lessons. There is power of life and death in the tongue. An encouraging word to someone who is down can lift him up and help him make it through difficult times. A destructive word to someone who is down can be what it takes to kill him.

So let's be careful what we say. Let us speak life to those who cross our path. Words are so powerful, it's sometimes hard to understand that an encouraging word can go such a long way. Keeping this in mind, let's always be careful and think about what we have to say.

Received as a SPAM (“send phenomenal amount of mail”) e-mail from unknown

A Remote Function Call enables a computer to execute a program on another computer. The called program is executed locally on the remote computer using the remote computer’s environment, CPU and data storage.

RFC allows executing subroutines on a remote computer

Remote function call is one of the great achievements of TCP/IP networks. Every computer within the network can accept an RFC-call and decides whether it wants to execute the request. Every modern FTP server implementation includes the RFC calling feature.

Classical networking loads the program to the client computer

A classical network server stores the program code in a central location. When the program is called, the code will be transported via the network to the calling computer workstation and executed on the calling computer, consuming the caller’s resources of CPU, memory and disk.

RFC executes the program on the server

An RFC calls the program on the remote computer. It is just like stepping over to the remote computer, typing in the program command line with all parameters and waiting for the result to be reported back to the calling computer. The calling computer does not provide any resources other than the parameters specified with the call.

Here is again what an RFC does.

·         It calls the program on a remote computer and specifies parameters if and as necessary.

·         The remote computer decides whether to fulfil the request and execute the program.

·         Every manipulation done by the called program is effective in the same way as if the program had  started on the remote system.

·         The calling program task waits meanwhile for the called program to terminate.

·         When the RFC program terminates, it returns result values if applicable.

·         The called program need not be present on the calling computer.

·         The called program can run under a completely different operating system, so you can call a Windows program from UNIX and vice versa.

The internet is a typical RFC application

A typical RFC example is the internet with a web browser as the RFC client and the web server as the RFC server. Executing a server applet e.g. via CGI or a JAVA or JAVASCRIPT server side applet is actually a remote function call from the web browser to the HTTP server.

If R/3 is doing RFC calls into another system, then it does exactly what a browser does when performing a request on the HTTP or FTP server.

RFC provides interface stubs for different operating systems and platforms, which supply the communication APIs for doing RFC from and to R/3.

SAP R/3 is designed as a multiserver architecture. Therefore, R/3 is equipped with a communication architecture that allows data exchange and communication between individual R/3 application and database servers. This communication channel also enables R/3 to execute programs running on a remotely connected server using RFC technology.

SAP R/3 provides special routines to enable RFC from and to R/3 for several operating systems. For NT and Windows the DLLs are delivered with the SAPGUI.

Non-R/3 programs can access function modules in R/3 by calling an SAP-provided interface stub. Interfaces exist for UNIX, Windows and IBM S/390 platforms.

R/3 systems which are tied together via TCP/IP are always RFC capable. One R/3 system can call function modules in a remote RFC system, just as if the function were part of the calling system itself.

A function module can be called via RFC if the function is flagged as RFC-enabled. This is a simple flag on the interface screen of the function.

Enabling RFC for a function does not change the function. The only difference between RFC-enabled and standard functions is that RFC functions have some restrictions; most notably, they cannot have untyped parameters.
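A caller addresses such an RFC-enabled function with the DESTINATION addition and should handle the two system exceptions that every remote call can raise. The following is a minimal sketch; the function name and the destination are hypothetical placeholders.

  DATA: MSG_TEXT(80) TYPE C.

  CALL FUNCTION 'Z_SOME_RFC_FUNCTION'             " hypothetical RFC-enabled function
       DESTINATION 'OTHERSYSTEM'                  " hypothetical destination from SM59
       EXCEPTIONS
            COMMUNICATION_FAILURE = 1 MESSAGE MSG_TEXT
            SYSTEM_FAILURE        = 2 MESSAGE MSG_TEXT.

  IF SY-SUBRC <> 0.
    WRITE: / 'RFC call failed:', MSG_TEXT.
  ENDIF.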

This example demonstrates the use of RFC functions to send data from one SAP system to a remote destination. The example is a simple demonstration of how to efficiently and quickly use RFC in your installation.

A text in SAP is an ordinary document, not a customizing or development object. Therefore, texts are never automatically transported from a development system to a production system. This example helps to copy text into a remote system.

Step 1: Reading the text documents in the sending system

The ABAP Z_RFC_COPYTEXT selects texts from the text databases STXH and STXL. The ABAP reads the STXH database only to retrieve the names of the text documents that match the selection screen. The text itself is read using the standard SAP function module READ_TEXT.

Step 2: Sending the text and saving it in the destination system

Then the ABAP calls the function module Z_RFC_SAVE_TEXT remotely in the destination system. The function runs completely on the other computer. The function need not exist in the calling system.

FUNCTION Z_RFC_SAVE_TEXT.

*"----------------------------------------------------------------------

*"*"Lokale Schnittstelle:

*"       IMPORTING

*"             VALUE(CLIENT) LIKE  SY-MANDT DEFAULT SY-MANDT

*"             VALUE(HEADER) LIKE  THEAD STRUCTURE  THEAD

*"       EXPORTING

*"             VALUE(NEWHEADER) LIKE  THEAD STRUCTURE  THEAD

*"       TABLES

*"              LINES STRUCTURE  TLINE

*"       EXCEPTIONS

*"              ID

*"              LANGUAGE

*"              NAME

*"              OBJECT

*"----------------------------------------------------------------------

  CALL FUNCTION 'SAVE_TEXT'

       EXPORTING

*           CLIENT          = SY-MANDT

            HEADER          = HEADER

*           INSERT          = ' '

            SAVEMODE_DIRECT = 'X'

*           OWNER_SPECIFIED = ' '

       IMPORTING

*           FUNCTION        =

            NEWHEADER       = NEWHEADER

       TABLES

            LINES           = LINES.

ENDFUNCTION.

Figure 39:     Z_RFC_SAVE_TEXT, an RFC-enabled wrapper around the standard function SAVE_TEXT

 

REPORT Z_RFC_COPYTEXT.

TABLES: THEAD, STXH, RSSCE.

SELECT-OPTIONS: TDNAME   FOR RSSCE-TDNAME   MEMORY ID TNA OBLIGATORY.

SELECT-OPTIONS: TDOBJECT FOR RSSCE-TDOBJECT MEMORY ID TOB.

SELECT-OPTIONS: TDID     FOR RSSCE-TDID     MEMORY ID TID.

PARAMETERS:     RCVSYS  LIKE T000-LOGSYS    MEMORY ID LOG OBLIGATORY.

DATA: THEADS LIKE STXH  OCCURS 0 WITH HEADER LINE.

DATA: TLINES LIKE TLINE OCCURS 0 WITH HEADER LINE.

DATA: XTEST(1) TYPE C VALUE 'X'.

START-OF-SELECTION.

************************************************************************

* Get all the matching text modules                                    *

************************************************************************

SELECT * FROM STXH INTO TABLE THEADS

                    WHERE TDOBJECT IN TDOBJECT

                      AND TDID     IN TDID

                      AND TDNAME   IN TDNAME.

************************************************************************

* Process all found text modules                                       *

************************************************************************

LOOP AT THEADS.

************************************************************************

* Read the text from pool                                              *

************************************************************************

  CALL FUNCTION 'READ_TEXT'

       EXPORTING

            ID                      = THEADS-TDID

            LANGUAGE                = THEADS-TDSPRAS

            NAME                    = THEADS-TDNAME

            OBJECT                  = THEADS-TDOBJECT

       IMPORTING

            HEADER                  = THEAD

       TABLES

            LINES                   = TLINES

       EXCEPTIONS

            OTHERS                  = 8.

************************************************************************

* RFC call to function in partner system that stores the text there    *

************************************************************************

  CALL FUNCTION 'Z_RFC_SAVE_TEXT'
       DESTINATION RCVSYS              " receiving system from the selection screen
       EXPORTING
            HEADER = THEAD
       TABLES
            LINES  = TLINES
       EXCEPTIONS
            OTHERS = 5.

Figure 40:     Program to copy text modules into a remote system via RFC

R/3 RFC is not limited to communication between R/3 systems. Every computer providing support for the RFC protocol can be called from R/3 via RFC. SAP provides the necessary API libraries for all operating systems which support R/3 and for many major programming languages, e.g. C++, Visual Basic or Delphi.

RFC does not know the physical nature of the remote system

Calling a program via RFC on a PC or a UNIX system is very much like calling it in another R/3 system. Indeed, the calling system will not even be able to recognize whether the called program runs on another R/3 or on a PC.

RFC server must be active on remote computer

To make a system RFC compliant, you have to run an RFC server program on the remote computer. This program has to have a calling interface which is well defined by SAP. In order to create such a server program, SAP delivers an RFC development kit along with the SAPGUI.

The RFC call to Windows follows the OLE/ACTIVE-X standard, while UNIX is connected via TCP/IP RFC which is a standard in all TCP-compliant systems.

For most purposes you might be satisfied to execute a command line program and catch the program result in a table. For that purpose you can use the program RFCEXEC, which comes with the examples of the RFC development kit both for UNIX and Windows. Search for it in the SAPGUI directory. This program will call the operating system's command line interpreter with an arbitrary string that you may pass as a parameter.

RFCEXEC must be defined as RFC destination with SM59

In order to call rfcexec, it has to be defined as a TCP/IP destination in SM59. R/3 comes with two predefined destinations: SERVER_EXEC, which calls rfcexec on the R/3 application server, and LOCAL_EXEC, which calls it on the front end. By specifying another computer name you can redirect the call to rfcexec to the named computer. Of course, the target computer needs to be accessible from the R/3 application server (not from the workstation) and have rfcexec installed.

The interface of rfcexec supports only two functions, which are called as remote function calls from R/3.

rfc_remote_exec

rfc_remote_exec will call RFCEXEC and execute the command interpreter with the parameter string. No results will be returned besides a possible error code.

CALL FUNCTION 'RFC_REMOTE_EXEC'
   DESTINATION 'RFC_EXEC'
   EXPORTING  COMMAND = 'copy c:\config.sys c:\temp'.

The example call above would execute the following when run on a DOS system.

command.com /c copy c:\config.sys c:\temp

rfc_remote_pipe

rfc_remote_pipe will call RFCEXEC, execute the command line interpreter with the parameter string and catch the output into an internal table.

CALL FUNCTION 'RFC_REMOTE_PIPE'
   DESTINATION 'RFC_EXEC'
   EXPORTING  COMMAND = 'dir c:\sapgui >input'.

The example call above would execute the following when run on a DOS system,

command.com /c dir c:\sapgui >input

while the file input is caught by rfc_remote_pipe and returned to the calling system.

Process incoming files

A common application for the use of rfc_remote_pipe is to automatically check a file system for newly arrived files and process them. For that purpose, you would create three directories, e.g. the following.

x:\incoming

x:\work

x:\processed

The following statement retrieves the file list via rfc_remote_pipe into an R/3 internal table (see the ABAP sketch after these steps).

dir x:\incoming /b

Then the files are moved into a working directory.

move x:\incoming\file x:\work

Finally the files are processed and moved into an archive directory.

move x:\work x:\processed
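The following sketch shows how the directory listing of the incoming folder could be retrieved into an internal table via rfc_remote_pipe. The name of the table parameter (PIPEDATA) is taken from the rfcexec sample of the RFC development kit and should be verified against your version.

  DATA: PIPEDATA(80) OCCURS 0 WITH HEADER LINE.   " lines of command output

  CALL FUNCTION 'RFC_REMOTE_PIPE'
       DESTINATION 'RFC_EXEC'                     " TCP/IP destination from SM59
       EXPORTING
            COMMAND  = 'dir x:\incoming /b'
       TABLES
            PIPEDATA = PIPEDATA
       EXCEPTIONS
            COMMUNICATION_FAILURE = 1
            SYSTEM_FAILURE        = 2.

  LOOP AT PIPEDATA.
    WRITE: / PIPEDATA.                            " one file name per line
  ENDLOOP.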

There are two faces of workflow in R/3. One is the business oriented workflow design as it is taught in universities. This is implemented by the SAP Business Workflow™. However, the workflow is also a tool to link transactions easily. It can be used to easily define execution chains of transactions or to trigger user actions without the need to modify the SAP standard code. This can even be achieved without laboriously customising the HR related workflow settings.

Summary

Workflow event linkage allows the execution of another program when a transaction finishes.

The workflow event linkage mechanism can be easily used without customising the full workflow scenarios.

This way we use the workflow engine to chain the execution of transactions and circumvent the setup of the SAP Business Workflow™.

There are several independent ways to trigger the workflow event linkage.

Americans work hard because they are optimists.

Germans work hard because they fear the future.

SAP R/3 provides a mechanism called Workflow that allows conditional and unconditional triggering of subsequent transactions from another transaction. This allows you to build up automatic processing sequences without the need to modify the SAP standard transactions.

Workflow as business method

The SAP Business Workflow was originally designed to model business workflows according to the scientific theories of the same name. This is mainly a modelling tool that uses graphical means, e.g. flow charting, to sketch the flow of events in a system needed to achieve the required result. SAP allows you to transcribe these event models into customizing entries, which are then executed by the SAP workflow mechanism.

Transaction SWO1

The transaction to enter the graphical model, to define the events and objects, and to develop the necessary triggering and processing objects is SWO1 (the letter O, not a zero).

SAP approach unnecessarily complex and formal

I will not even try to describe how to design workflows in SAP. I believe that the way workflows are realized in SAP is far too complicated and unnecessarily complex; it would fill a separate book.

Workflow events can be used for own developments

Fortunately, the underlying mechanism for workflows is less complex than the formal overhead. Most major transactions trigger the workflow via SWE_EVENT_CREATE. This makes a call to a workflow handler routine, whose name can usually be customised dynamically and which is implemented as a function module.

Contrary to what you mostly hear about R/3 workflow, it is relatively easy and mechanical to define a function module as a consecutive action after another routine has raised a workflow event. For example, this can be used to trigger the execution of a transaction after another one has finished.

Every workflow enabled transaction will call SWE_EVENT_CREATE

The whole workflow mechanism is based on a very simple principle. Every workflow enabled transaction will call the function module SWE_EVENT_CREATE directly or indirectly during the update phase.
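You can also raise such an event from your own program. The following is a minimal sketch; object type, object key and event name are illustrative values that would have to exist in your system.

  DATA: OBJKEY          LIKE SWEINSTCOU-OBJKEY,
        EVENT_CONTAINER LIKE SWCONT OCCURS 0 WITH HEADER LINE.

  OBJKEY = '0000004711'.                       " illustrative object key

  CALL FUNCTION 'SWE_EVENT_CREATE'
       EXPORTING
            OBJTYPE         = 'ZMYOBJ'         " illustrative object type
            OBJKEY          = OBJKEY
            EVENT           = 'CREATED'        " illustrative event name
       TABLES
            EVENT_CONTAINER = EVENT_CONTAINER
       EXCEPTIONS
            OTHERS          = 1.

  COMMIT WORK.                                 " the receivers are started with the commit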

SWE_EVENT_CREATE looks up a table, e.g. SWETYPECOU, to get the name of the following action

The function module SWE_EVENT_CREATE will then consult a customising table. For a simple workflow coupling, the information is found in the table SWETYPECOU. The table names the subsequent program to call, either a function module or an object method.

This way of defining the subsequent action is called type coupling because the action depends on the object type of the calling event.

The call to the following event is done with a dynamic function call. This requires that the called function module has a well-defined interface definition. Here you see the call as it is found in SWE_EVENT_CREATE .

CALL FUNCTION typecou-recgetfb        " call receiver_type_get_fb
     EXPORTING
          objtype         = typecou-objtype
          objkey          = objkey
          event           = event
          generic_rectype = typecou-rectype
     IMPORTING
          rectype         = typecou-rectype
     TABLES
          event_container = event_container
     EXCEPTIONS
          OTHERS          = 1.

Figure 41:     This is the call of the type coupled event in release 40B

Reading the change pointers which are not yet processed

CALL FUNCTION 'CHANGE_POINTERS_READ'

RBDMIDOC

The ABAP RBDMIDOC will process all open change pointers and distribute the matching IDocs.
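A hedged sketch of how RBDMIDOC can be triggered from your own program; the selection parameter name MESSTYP is an assumption and should be checked against the selection screen of RBDMIDOC in your release.

* dispatch IDocs for all open change pointers of one message type
  SUBMIT RBDMIDOC WITH MESSTYP = 'MATMAS' AND RETURN.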

Every time a change document is written, a workflow event for the change document object is triggered. This can be used to unconditionally chain an action to a transaction.

CHANGEDOCUMENT_CLOSE

The most interesting chaining point for workflow events is the creation of the change document. Nearly every transaction writes change documents to the database. This document is committed to the database with the function module CHANGEDOCUMENT_CLOSE. This function will also trigger a workflow event.

The workflow handler triggered by an event which is fired from change documents is defined in table SWECDOBJ. For every change document type, a different event handler can be assigned. This is usually a function module and the call for it is the following:

CALL FUNCTION swecdobj-objtypefb
    EXPORTING
       changedocument_header = changedocument_header
       objecttype = swecdobj-objtype
    IMPORTING
       objecttype = swecdobj-objtype
    TABLES
       changedocument_position = changedocument_position.

Figure 42:     This is the call of the change doc event in release 40B

In addition, change pointers for ALE are written

Change pointers are created by calling FUNCTION CHANGEDOCUMENT_CLOSE, which writes the usual change documents into the tables CDHDR and CDPOS. This function then calls the routine CHANGE_POINTERS_CREATE, which creates the change pointers.

CALL FUNCTION 'CHANGE_POINTERS_CREATE'
     EXPORTING
        change_document_header = cdhdr
     TABLES
        change_document_position = ins_cdpos.

Figure 43:     The call that creates the ALE change pointers in release 40B

The third common way to trigger a workflow is doing it from messaging.

Define a message for condition technique

When the R/3 messaging creates a message and processes it immediately, it actually triggers a workflow. You can use this to set up conditional workflow triggers, by defining a message with the message finding (condition technique) and linking the message to a workflow.

Assign media W or 8

You define the message the usual way for your application, just as you would define a message for SAPscript etc. As a processing medium you can assign either type W for workflow or type 8 for special processing.

The medium type W for workflow would require defining an object in the object repository. We will only show how you can trigger the workflow with a standard ABAP using medium type 8.

Form routine requires two parameters

You need to assign a program and a form routine to the message in table TNAPR. The form routine you specify needs exactly two USING-parameters as in the example below.

REPORT ZSNASTWF.

TABLES: NAST.

FORM ENTRY USING RETURN_CODE US_SCREEN.

*    Here you go: call your workflow action

  RETURN_CODE = 0.

  SY-MSGID = '38'.

  SY-MSGNO = '000'.

  SY-MSGTY = 'I'.

  SY-MSGV1 = 'Workflow called via NAST'.

  CALL FUNCTION 'NAST_PROTOCOL_UPDATE'

       EXPORTING

            MSG_ARBGB = SYST-MSGID

            MSG_NR    = SYST-MSGNO

            MSG_TY    = SYST-MSGTY

            MSG_V1    = SYST-MSGV1

            MSG_V2    = SYST-MSGV2

            MSG_V3    = SYST-MSGV3

            MSG_V4    = SYST-MSGV4

       EXCEPTIONS

            OTHERS    = 1.

ENDFORM.

NAST must be declared public in the called program

In addition, you need to declare the table NAST with a TABLES statement in the ABAP where the form routine resides. When the form is called, the variable NAST is filled with the values of the calling NAST message.

Let us show you a function module which is suitable to serve as a workflow handler, and how to define the linkage.

Create a function module that will be triggered by a workflow event

We want to create a very simple function module that will be triggered upon a workflow event. This function is called from within function SWE_EVENT_CREATE. The parameters must comply with the calling standard as shown below.

CALL FUNCTION typecou-recgetfb
     EXPORTING
          objtype         = typecou-objtype
          objkey          = objkey
          event           = event
          generic_rectype = typecou-rectype
     IMPORTING
          rectype         = typecou-rectype
     TABLES
          event_container = event_container
     EXCEPTIONS
          OTHERS          = 1.

Listing 1:       Call of the type coupled event in release 40B

 

Template for workflow handler

Release 40B provides the function module WF_EQUI_CHANGE_AFTER_ASSET which can be used as a template for the interface. So we will copy it and put our own coding in instead.

FUNCTION Z_WORKFLOW_HANDLER.

*"*"Lokale Schnittstelle:

*"       IMPORTING

*"             VALUE(OBJKEY) LIKE  SWEINSTCOU-OBJKEY

*"             VALUE(EVENT) LIKE  SWETYPECOU-EVENT

*"             VALUE(RECTYPE) LIKE  SWETYPECOU-RECTYPE

*"             VALUE(OBJTYPE) LIKE  SWETYPECOU-OBJTYPE

*"       TABLES

*"              EVENT_CONTAINER STRUCTURE  SWCONT

*"       EXCEPTIONS

*"              NO_WORKFLOW

  DATA: RECEIVERS     LIKE SOMLRECI1 OCCURS 0 WITH HEADER LINE.
  DATA: CONTENT       LIKE SOLISTI1  OCCURS 0 WITH HEADER LINE.
  DATA: DOCUMENT_DATA LIKE SODOCCHGI1.

* send the mail to the user who triggered the event
  RECEIVERS-EXPRESS  = ' '.
  RECEIVERS-RECEIVER = SY-UNAME.
  APPEND RECEIVERS.

* use the object key as document title and as content line
  DOCUMENT_DATA-OBJ_DESCR = OBJKEY.
  CONTENT = OBJKEY.
  APPEND CONTENT.

  CALL FUNCTION 'SO_NEW_DOCUMENT_SEND_API1'
       EXPORTING  DOCUMENT_DATA  = DOCUMENT_DATA
       TABLES     OBJECT_CONTENT = CONTENT
                  RECEIVERS      = RECEIVERS.

ENDFUNCTION.

Listing 2:       A workflow handler that sends an SAP office mail

Link handler to caller

The function can be registered as a handler for an event. This is done with transaction SWLD.

Event logging

If you do not know the object type that will trigger the event, you can use the event log. You have to activate it from SWLD and then execute the event firing transaction. When the event has been fired, it will be traced in the event log.

Figure 44:     Transaction SWLD to define event linkage and see event log

All workflow handlers are called via RFC to a dummy destination WORKFLOW_LOCAL_000 where 000 is to be replaced by the client number.

Most errors are caused by one of the following reasons:

Hit list of common errors

·         You forgot to set the RFC flag in the interface definition of your event handling function module.

·         There is a syntax error in your function module (check with generate function group).

·         You mistyped something when defining the coupling.

·         The internal workflow destination WORKFLOW_LOCAL_000 is not defined.

SM58 to display what happened to your event

If you think your handler did not execute at all, you can check the list of pending background tasks with transaction SM58. If your event is not there, it has either never been triggered (so your tables SWETYPEENA and SSWETYPEOBJ may have the wrong entries) or your event handler did execute and probably did something other than you expected. Ergo: your mistake.

Read carefully the help for CALL FUNCTION .. IN BACKGROUND TASK

Your event handler function is called IN BACKGROUND TASK. You may want to read carefully the help on this topic in the SAP help. (help for “call function” from the editor command line)
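For illustration, this is roughly how such a call looks; the handler name is taken from Listing 2, the destination corresponds to client 100, and the parameter values are assumed to be filled as in the handler interface. The call is only executed when the calling program issues a COMMIT WORK.

  CALL FUNCTION 'Z_WORKFLOW_HANDLER'
       IN BACKGROUND TASK
       DESTINATION 'WORKFLOW_LOCAL_100'       " client 100; adjust to your client
       EXPORTING
            OBJKEY  = OBJKEY
            EVENT   = EVENT
            RECTYPE = RECTYPE
            OBJTYPE = OBJTYPE
       TABLES
            EVENT_CONTAINER = EVENT_CONTAINER.

  COMMIT WORK.                                " the background task starts at COMMIT WORK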


 

FUNCTION YAXXWF_MAIL_ON_EVENT.

*"       IMPORTING
*"             VALUE(OBJKEY) LIKE  SWEINSTCOU-OBJKEY
*"             VALUE(EVENT) LIKE  SWETYPECOU-EVENT
*"             VALUE(RECTYPE) LIKE  SWETYPECOU-RECTYPE
*"             VALUE(OBJTYPE) LIKE  SWETYPECOU-OBJTYPE
*"       TABLES
*"              EVENT_CONTAINER STRUCTURE  SWCONT

* This example sends a mail to the calling user and tells
* about the circumstances when the event was fired.
* Just for fun, it also lists all current enqueue locks.

  DATA: ENQ    LIKE SEQG3  OCCURS 0 WITH HEADER LINE.

  DATA: DOC_DATA LIKE SODOCCHGI1.

  DATA: MAIL LIKE STANDARD TABLE OF SOLISTI1 WITH HEADER LINE.

  DATA: RECLIST LIKE STANDARD TABLE OF SOMLRECI1 WITH HEADER LINE.

  MAIL-LINE = 'Event fired by user: &'.
  REPLACE '&' WITH SY-UNAME INTO MAIL-LINE.
  APPEND MAIL.

*----------------------------------------------------------------------*

  MAIL-LINE = 'Object Key: &'.
  REPLACE '&' WITH OBJKEY INTO MAIL-LINE.
  APPEND MAIL.

*----------------------------------------