Extract program
Hi all,
Please provide me with a basic program using extracts, so that we can start working with extracts.
Thanks and Regards
rahul singh
An extract is a sequential dataset in the memory area of the program. You can only address the entries in the dataset within a special loop. The index or key access permitted with internal tables is not allowed. You may only create one extract in any ABAP program. The size of an extract dataset is, in principle, unlimited. Extracts larger than 500 KB are stored in operating system files. The practical size of an extract is up to 2 GB, as long as there is enough space in the file system.
An extract dataset consists of a sequence of records of a predefined structure. However, the structure need not be identical for all records. In one extract dataset, you can store records of different length and structure one after the other. You need not create an individual dataset for each different structure you want to store. This fact reduces the maintenance effort considerably.
In contrast to internal tables, the system partly compresses extract datasets when storing them. This reduces the storage space required. In addition, you need not specify the structure of an extract dataset at the beginning of the program, but you can determine it dynamically during the flow of the program.
Have a look at the link below:
[Extracts|http://help.sap.com/saphelp_nw70/helpdata/en/9f/db9f3935c111d1829f0000e829fbfe/frameset.htm]
REPORT demo_extract_extract.

NODES: spfli, sflight.

FIELD-GROUPS: header, flight_info, flight_date.

INSERT: spfli-carrid spfli-connid sflight-fldate
          INTO header,
        spfli-cityfrom spfli-cityto
          INTO flight_info.

START-OF-SELECTION.

GET spfli.
  EXTRACT flight_info.

GET sflight.
  EXTRACT flight_date.
There are three field groups. The INSERT statement assigns fields to two of the field groups. During the GET events, the system fills the extract dataset with two different record types. The records of the field group flight_info consist of five fields: spfli-carrid, spfli-connid, sflight-fldate, spfli-cityfrom, and spfli-cityto.
The first three fields belong to the field group header, which is prefixed to every record. The records of the field group flight_date consist only of the three fields of the header field group.
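As a hedged sketch of how such an extract could then be processed (continuing the demo above; extracts are read with a plain LOOP ... ENDLOOP without a table operand, and AT statements distinguish the record types):

```abap
END-OF-SELECTION.

* Sort the extract by the fields of the header field group.
  SORT.

  LOOP.
*   A flight_info record immediately followed by a flight_date record.
    AT flight_info WITH flight_date.
      WRITE: / spfli-carrid, spfli-connid,
               spfli-cityfrom, spfli-cityto.
    ENDAT.
*   Every flight_date record; only the header fields are available here.
    AT flight_date.
      WRITE: / spfli-carrid, spfli-connid, sflight-fldate.
    ENDAT.
  ENDLOOP.
```

Note that once the LOOP over the extract has been entered, no further EXTRACT statements are allowed in the program.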
You will find examples in the link above as well.
I hope it helps.
Similar Messages
-
Flat file issue in extract program
Hi All
I have developed an extract program which downloads data from SAP and dumps it into a flat file in the desired format.
Flat file contains various rows.
In one row, I want to keep an 80-character blank area at the end.
Ex:
HDR2345 c 20060125 0r42z1005.5USD
Now after "USD", I want to keep 80 characters as blank spaces.
If I code it as v_record+33(80) = ' '. it doesn't work.
Please advise.
Regards
Pavan
Hi Pavan,
Please try this code.
DATA: L_SPACE TYPE X VALUE '20'. " X'20' is the ASCII space character
MOVE L_SPACE TO V_RECORD+33(80).
OR
MOVE L_SPACE TO V_RECORD+79(80).
This should work.
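If the file is written with OPEN DATASET and TRANSFER, another option worth trying (a sketch under that assumption; the file path and the total record length of 113 are hypothetical, taken from the +33(80) offset above) is to force the output length explicitly so that trailing blanks are not truncated:

```abap
DATA: v_record(113) TYPE c,
      v_file TYPE string VALUE '/tmp/extract.txt'. " hypothetical path

OPEN DATASET v_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
* TRANSFER ... LENGTH writes the full 113 characters, so positions
* 34 to 113 remain as blanks instead of being trimmed on output.
TRANSFER v_record TO v_file LENGTH 113.
CLOSE DATASET v_file.
```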
Regards,
Ferry Lianto -
Error 8 when starting the extraction program
HI all
I have the following problem when I start an extraction: "Error 8 when starting the extraction program". I tested the extraction program on the R/3 system (source system) and it works; it displays records. On the target system, in the InfoPackage under the step-by-step analysis, it gives the error. It does not give an OSS note number; it only suggests that you look for an OSS note. We have BW 3.5 installed with ECC 5.0.
Also, on the source system in transaction WE02, for the load period it displays the IDoc with a red light and the same error. It shows the partner, basic type and port; the error is marked next to inbound (inbox).
Regards
Alex
Alex,
Check the following OSS notes
947690
511475
23538
Even though note 947690 is for process chains, the steps mentioned in it could cause Error 8 when scheduling a package for extraction. I think in your case the server name or the host name is incorrect. Check the host and server names in transaction SM51, and ask Basis to look for the same in the RFC entries using SM59 as well.
Abdul -
'Error 8 occurred when starting the data extraction program'
Hello Experts,
I am trying to pull master data (full upload) for an attribute. I am getting an error on the BW side, i.e. 'The error occurred in Service API'. So I checked in the source system and found that an IDoc processing failure had occurred. The failure shows 'Error 8 occurred when starting the data extraction program'.
But when I check the extractor through RSA3, it looks fine.
Can someone explain what might be the reason for the IDoc processing failure and how it can be avoided in future? The same problem kept occurring later as well.
Thanks
Regards,
KP
Hi,
Check whether the IDocs shown in SM58 of the source system are being processed into your target (BI) system.
In SM58, enter * for all users, specify the target destination as your BI system, execute, and check whether any entries are pending there.
Regards,
Krishna Rao -
Error 8 when starting the extraction program (BI7.x)
I am receiving the following message when loading from the ODS to the data targets:
Error 8 when starting the extraction program
Message no. R3019
Diagnosis
Error 8 occurred when starting the data extraction program.
System response
The data extraction is terminated.
Procedure
Check for relevant OSS Notes, or create a problem message of your own.
I have searched the forums and OSS, and can't find anything that matches my situation exactly. I was originally getting an error message that the infosource (80FIA_DS11) didn't exist in the source system. I restored the BIP client and all transfer structures were reactivated. Now when I try to load, I get this error 8. Has anyone experienced this issue?
Thanks,
Pam
Josko,
I am also using BI 7.x and had this problem loading from a DSO to an InfoCube. What are the entries in table ROIDOCPRMS? What host name is displayed when you run transaction SM51 in the BI system?
Pam -
Error 4 when starting the extraction program
Hello Gurus,
Yesterday I got the error "Error occurred in the source system", together with "Error 4 when starting the extraction program" and "error in the source system". It has affected all the loads. Please, can anyone give me a solution? It is urgent.
Please find the error message below.
Error message from the source system
Diagnosis
An error occurred in the source system.
System response
An error occurred in Service API .
Check the error message.
Procedure
The procedure for removing errors takes place depending on
the error message:
Note
If the source system is a Client Workstation, then it is possible that the file that was to be loaded was being edited at the time of the data request. Make sure that the file is in the specified directory and is
Thanks,
Muddu Reddy.
Hi Sekhar,
You can try this: right-click on the DataSource and choose 'Display in source system'. This takes you to the R/3 system; enter the user ID and password, and you can now see the DataSource in R/3. Now press Back (F3) so that you are back in BI, then replicate the DataSource and try to load the data again. I think it should now work fine.
Tell me if you still face any problems.
Thanks
Ram -
2LIS_02_ITM extraction program on R3 has a bug!
We deleted some purchasing document items on R3, e.g.,
doc 1
item 1 deleted
item 2
item 3
item 4 deleted
item 5
doc 2
item 1 deleted
item 2 deleted
Run RSA3 on this DataSource and you find that the field "Indicator: Cancel Data Record" is null for all items of doc 1, even though item 1 and item 4 have been deleted in R/3. The field values are correct for doc 2, though. It looks like the field shows the correct value 'R' only if all of a document's items are deleted; if only some items of a document are deleted, the field does not indicate the deletion. The EKPO table shows that item 1 and item 4 do get deleted, but the extraction program fails to populate the field "Indicator: Cancel Data Record" with the correct value 'R'!
Any idea? We have searched OSS Notes, but found only ones for BW rather than for R/3. The problem starts in the R/3 extraction program!
Thanks
Is there a concept like 'a document gets cancelled in R/3 only when all its items are cancelled'?
If yes, could it be that this indicator indicates the 'intention of cancelling' the document rather than the document's items?
Just giving my thoughts..
cheers,
Vishvesh -
The extraction program does not support object 0MAT_PLANT
Folks, has anyone run into this problem before?
I am trying to run a delta for the object 0MAT_PLANT, but it always returns this error.
I have already done a new init, but the error persists.
The extraction program does not support object 0MAT_PLANT
Message no. R3009
Diagnosis
The application program for the extraction of data was called up using the invalid InfoSource/invalid InfoObject 0MAT_PLANT.
System Response
The data extraction is terminated.
Procedure
Check the SAP Support Portal for the appropriate Notes and create a customer message if necessary.
Hi Eduard,
You can try the following steps to solve that
1. Delete the previous delta init for the InfoObject.
If you have access to the R/3 side you can follow these steps; otherwise follow the BW steps.
From the R/3 side: go to RSA7, then select and delete the init request for the InfoObject.
From the BW side: InfoPackage -> Init for Source System -> select and delete the init request.
2. Execute the init InfoPackage again.
3. After successful completion, execute the delta InfoPackage.
Hope it solves your problem, if not pls let me know.
Thanks & Regards,
Chandran Ganesan
SAP Business Intelligence -
The extraction program does not support object 0JOB_ATTR
HI all,
I'm getting the following error when executing the 0JOB_ATTR extractor in the R/3 system (via RSA3):
The extraction program does not support object 0JOB_ATTR
Message no. R3009
Diagnosis
The application program for the extraction of data was called up using the invalid InfoSource/invalid InfoObject 0JOB_ATTR.
System Response
The data extraction is terminated.
Procedure
Check the SAP Support Portal for the appropriate Notes and create a customer message if necessary.
I've found various threads with this error (or similar ones), but none of them are solved (or no solution was published). I haven't found any SAP Notes yet either, so any input would be very helpful.
Thank you.
G.
0JOB_ATTR is not in use; SAP replaced it with 0EC_CJOB_ATTR. You have to use 0EC_CJOB_ATTR to load master data for 0JOB.
I faced the same problem and found the following info in one of the SDN threads; we are using 0EC_CJOB_ATTR (make sure you have data in table HRP5050).
I sent a message to SAP asking for a solution. Below is the reply they gave.
"This error appears because of naming conventions we have to switch the name of the Data Source from 0JOB_ATTR to 0EC_CJOB_ATTR. Please use this data source to extract the compensation job attributes to BW.
Note also that data source 0EC_CJOB_ATTR extracts data from table HRP5050.
If no data is being extracted, please do the following:
If you have data in HRP5050 and still do not get any data, please try the following:
1) Take over the DataSource from the business content (RSA5) again and check it in RSA3.
2) Replicate the DataSource to BW.
3) Activate the InfoSource again.
4) Delete any previous delta initializations for the DataSource.
5) Initialize the delta again.
6) Try to load data". -
Need help for writing extract program
Hi,
I need help writing an extract program to retrieve data from a legacy system.
I have already developed BDC programs for ME31K and ME21.
My requirement is to write extract programs for those transaction codes, to retrieve data from the legacy system and store it in a flat file.
I need help with a Java program. It is a program that allows the user to enter a student's GPA, number of extracurricular activities, and number of service activities. The user cannot enter a GPA above 4.0 or below 0. The user cannot enter a negative number for either kind of activity. If the student meets the following criteria: 1) GPA of 3.8 or above and at least one extracurricular activity and one service activity, 2) GPA below 3.8 but at least 3.4 and a total of at least three extracurricular and service activities, or 3) GPA below 3.4 but at least 3.0 and at least two extracurricular activities and three service activities, the message "Scholarship candidate" should display. If the student does not meet the criteria above, then the message "not a candidate" should display. Can you help me, please?
You haven't posted ANY 'java program' for us to help with.
The forum is NOT a coding service. It is to help you with YOUR code.
Post the code you have written and SHOW us (don't just tell us) how you compile and execute it and the results you get. Then we can help you with any problems you are having.
If you need help understanding just what the program should be doing you need to ask your instructor to clarify the assignment. -
Message no. R3009 - error in extraction program for the DSO
Dear All,
I get the following error when I execute the DataSource: when I select the monitor button in the window, the process is red, so I couldn't get the report in the ODS. Please help me to clear the following error:
The extraction program does not support object ZSHIPTRACKND
Message no. R3009
Diagnosis
The application program for the extraction of data was called up using the invalid InfoSource/invalid InfoObject ZSHIPTRACKND.
System Response
The data extraction is terminated.
Procedure
Check the SAP Support Portal for the appropriate Notes and create a customer message if necessary.
with Regards,
Jerald
What release are you at?
Have you searched SAP Notes and the SDN forums for this? A quick search turns up 9 SAP Notes and 15 SDN posts regarding this error message. Unless we know what release level you are at, we are unable to offer viable solutions.
Where used of a table field in BI extraction program
Hi,
How can I find out whether a particular table or table field is used in any BI extraction program? We are modifying the structure of a big Z table and its custom developments, so I also want to analyse and list the BI extraction programs that will be affected by this change. Please let me know how I can find this. I am an R/3 Logistics functional consultant with little knowledge of ABAP & BI...
Thanks in advance.
Regards,
CHS.
Hi,
this sounds like a question for the BW area.
regards
Ingo Hilgefort. -
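For the technical side of the question, a hedged sketch: the cross-reference table D010TAB records which programs use which tables, so a quick report can list candidate programs (ZMYTABLE below is a placeholder for your Z table; field-level usage is not covered by this table):

```abap
REPORT z_where_used_sketch.

* D010TAB: MASTER = program name, TABNAME = table used by it.
DATA: lt_progs TYPE STANDARD TABLE OF d010tab-master,
      lv_prog  TYPE d010tab-master.

SELECT master FROM d010tab
       INTO TABLE lt_progs
       WHERE tabname = 'ZMYTABLE'.

LOOP AT lt_progs INTO lv_prog.
  WRITE: / lv_prog.
ENDLOOP.
```

The where-used list in SE11 (Utilities -> Where-Used List) gives the same information interactively; a report like this is only useful if you need to check many tables at once.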
Extraction program for EH&S (EH&S to XI scenario)
Dear Experts,
I am from the XI background and am new to EH&S. I would like to know whether there exists an extraction program to extract substance data from EH&S (ECC) and send it to XI. If yes, what is it? The EH&S team here doesn't know of any; they have only used change pointers to extract data from one client to another.
Please help.
Thanks,
Merrilly
Hello,
How to set up ALE for substances is described in the IMG / Customizing for EH&S.
For EH&S 2.7B on R/3 4.6C this is located under:
Environment, Health & Safety -> Product Safety -> Interfaces -> EH&S Application Link Enabling (ALE) -> ALE for Specification Management.
All necessary information should be found in 'Set up Distribution of Specification Data':
Set Up Distribution of Specification Data
In this IMG activity you set up the distribution of data on
specifications (see also Information About the Concept in EH&S
Application Link Enabling (ALE)).
Requirements
1. Setting the Active Code Page
To transfer the data using ALE, you must ensure that the active code
pages are the same in the receiving system and sending system as
follows:
- You must select a code page for transfer (for example, SAP(ISO)
code page 1100) that you set up on all SAP Systems that belong
to your ALE distribution model.
- In Customizing for Product Safety in the IMG activity Specify
Environment Parameters using the environment parameter
ALE_TRANSFER_LANGUAGE, you must specify a language of the
previously selected code page as the transfer language in the
sending system. This language controls which code page is active
in the sending system during data transfer.
- The RFC destinations of the target systems must be defined with
the logon language that corresponds to the specified code page.
- If data is to be transferred to different SAP(ISO) code pages,
the operating systems of the sending and receiving systems must
use the same character sets (ASCII or EBCDIC):
Sending system Receiving system Transfer possible
AS400 AS400 Yes
UNIX UNIX Yes
NT NT Yes
NT UNIX Yes
UNIX NT Yes
AS400 NT No
AS400 UNIX No
NT AS400 No
UNIX AS400 No
For more information, see:
- The IMG activity Specify Environment Parameters
- The IMG activity Set Up EH&S Native Language Support
- The Product Safety documentation in the section EH&S Native
Language Support
2. Settings in Customizing for Distribution (ALE)
You have made the necessary settings in Customizing for Distribution
(ALE).
Also maintain, for example, the sections
- Basic Settings
- Communication
3. Settings in the Product Safety Component
Make sure that you have fulfilled the following prerequisites:
a) Maintain the environment variables for serialization
Serialization collects the IDocs and makes sure that they are
processed in the correct order. For more information, see
section Serialized Distribution in Customizing for Distribution
(ALE).
You specify the serialization number for the sending logical
system in the IMG activity Specify Environment Parameters using
the environment parameter ALE_SERIAL_ID. You specify one unique
channel per logical system.
b) Specify specifications to be distributed
If you want to distribute specification data manually, choose
the specification directly from the hit list in specification
management.
For automatic distribution, the specifications must appear in a
set of hits that is assigned to the distribution method. If a
set of hits has not been assigned, all of the specifications are
distributed, providing that you have not defined filters.
Apart from standard filters (see below), you can use the
following customer exits to define further restrictions and
filters:
- Specify Additional Table and Parameter Filter (1)
- Specify Additional Table and Parameter Filter (2)
c) Ensure unique identification of specifications
The specification object must have a unique specification key.
In the standard system, identification is supported by the
specification key.
You can use the customer exit Develop Enhancement for
Identification to enhance the identification, for example, to
link with one or more identifiers.
d) Check authorizations
For manual distribution and automatic scheduling, you must have
the read authorization for all the specification data to be
distributed.
You also need the appropriate authorizations for inbound
processing in the target system.
Activities
1. Maintain the Distribution Model in Distribution (ALE) Customizing
In Customizing for Distribution, call the IMG activity Maintain
Distribution Model.
For more information, see the documentation for the IMG activity.
To guarantee communication between systems during distribution, you
must make the following entries in the IMG activity Maintain
Distribution Model by choosing Add method:
Field Entry
Sender/client: <Key for EH&S system>
Receiver/server: <Key for target system>, for example,
Sales and Distribution system (SD), on which EH&S
is installed.
Object name/interface: Substance (substance(specification),
BUS1077)
Method: SavRepMul (save replicated specifications)
Note:
Message type SUBMAS is supported.
You can set the following filters:
- Reducing specifications by determining recipients:
You can reduce the specifications to be distributed by defining
the following filters:
- Specification type
- Authorization group
- Substance nature
- Set of hits (external key of group object)
You can enter several values for a filter field. Individual
values are linked with OR, whereas the filter groups are linked
with AND.
If no filters are entered, all specifications are distributed.
- Reducing specifications using 'IDENTHEADER' filtering
You define the identification category and identification type
whose specifications are to be distributed.
- Reducing data using 'PROPHEADER' filtering
You specify the value assignment types for which specification
data is to be distributed.
- In Distribution (ALE) Customizing, you can use the IMG activity
Set Up Segment Filtering to exclude further tables from
distribution, for example:
- Material assignment
- Regulatory list assignment
- Reference specification assignment
- Usage
- Assessment
Then in Distribution (ALE) Customizing, you maintain the IMG
activity Generate Partner Profiles.
2. Maintain Settings in the Sending and Recipient Systems
The following tables must be maintained in the same way in the
sending and recipient systems:
- Value assignment type TCG01 and description TCG02 (system
tables)
- Table of names for the DDIC objects TCG03 (system table)
- Table of names for the child DDIC objects TCG04 (system table)
- Specify Value Assignment Types
Value assignment type TCG11 and description TCG12
Value assignment type - specification type assignment TCG13
- Identification category TCG21 and description TCG22 (system
tables)
- Check Identification Types
Identification type TCG23 and description TCG24
- Check Identification Listing
Identification listings TCG26 and description TCG27
Definition of identification listings TCG28
- Assign Characteristic-Specific Identification
Overriding identification list definitions TCG29
- Specify Specification Types
Specification type TCG31 and description TCG32
- Specify Authorization Groups
Authorization object TCG36 and description TCG37
- Specify Types for User-Defined Texts
Value assignment text type TCG41 and description TCG42
- Create Sources
Source TCG46
- Specify Source Types
Source type TCG47 and description TCG48
- Set Up Property Trees
Property tree TCG51 and description TCG52
Property tree - value assignment type assignment TCG53
- Specify Data Origin
Data origin TCG56
- Specify Phrase Libraries and Phrase Groups
Phrase library TCG61 and description TCG62
Phrase group TCG63 and description TCG64
- Specify Language Selection
Phrase language (languages supported in phrase library) TCG65
- Value assignment type class characteristic TCG66
System table, filled using master data matchup
- Check Value Assignments
Value assignment assessment TCG71 and description TCG72
- Specify Component Types for Compositions
Component type TCG76 and description
- Specify Regulatory Lists
Regulatory list TCG81 and description TCG82
- Specify Ratings
Value assignment rating TCG86 and description TCG87
- Specify Validity Areas
Validity area TCG91 and description TCG92
- Specify Usage Profiles
Usage profile TCG96 and description TCG97
Usage profile - rating - validity area assignment TCG98
- Specify Exchange Profiles
Exchange profile TCGC3 and description TCGC4
- Specify Environment Parameters
General environment parameters TCGENV
- Protect Characteristics Within Namespace
Namespaces for characteristics TCGK1
- Manage User Exits
User exit parameters from user exit management TCGUEEN
User exits from user exit management with function module
assignment TCGUEFU
Language-dependent user exit names from user exit management
TCGUENA
User exit categories from user exit management TCGUETY
3. Check Master Data to Be Distributed
For the ALE process, the following master data must be distributed
to all the relevant systems:
- Phrases
- Phrase sets (for systems, in which data is created)
- Classes and characteristics
- Materials (all material data that is required for
material-specification assignment)
- Change numbers
Note:
Classes and characteristics are distributed via export and
import. The help texts and phrase sets are also transferred to
other systems along with the classes and characteristics.
Classes and characteristics can also be distributed using ALE.
Note:
The required data providers must have been created manually in the
Product Safety component under Tools -> Edit addresses -> Data
provider before data is distributed in the recipient system. The
data providers must be unique with regard to the following three
fields:
- Name (NAME1)
- City (CITY1)
- Country (COUNTRY)
During distribution, the data providers of the specification to be
sent are read and also distributed. When writing to the recipient
system, the SAP System determines the corresponding address number
for the data provider in the recipient system by comparing the three
fields Name, City and Country for the address numbers that were
sent, and then transfers this address number that was determined to
the Data provider field (OWNID).
You can determine the address number of the data provider in
Customizing for Product Safety in the IMG activity Specify
Authorization Groups.
To do this, call the input help for the Data prov. field in the IMG
activity. You will find the value you require in the Addr. no.
field.
The address number is not displayed in address management in the
Product Safety component.
4. Check Control Data to Be Distributed
See above: "Maintain Settings in the Sending and Recipient Systems"
5. Check Consistency
A consistency check can be performed for the settings in the
distribution model and the partner profiles.
To do this, in Distribution (ALE) Customizing, call the IMG activity
Check Partner Profiles and Model Settings.
The distribution model must have been distributed and the partner
profiles must have been maintained in all relevant systems.
6. Error Handling
As soon as an error occurs when an IDoc is processed, the whole IDoc
is not updated. You can use a workflow to correct errors. IDocs can
be modified manually (you can change the identifier, for example)
and updated retrospectively.
General Procedure
1. In a customer reference model you define which data is to be
distributed to which system. You use sets of hits to define the
specifications that are to be distributed and specify filters with
regard to specifications or specification data as required.
2. The first time you distribute specifications to the target systems,
you do it manually, using the method REPLICATE according to the
distribution model. Serialization must be switched off.
3. You activate serialization and switch on delta distribution as
follows:
Activating Serialization
a) In Customizing for Product Safety, in the IMG activity Specify
Environment Parameters specify the channel for the parameter
ALE_SERIAL_ID through which the ALE data is to be distributed.
b) In Customizing for Distribution (ALE) choose Set Up Data
Distribution -> Serialized Distribution -> Serialized
Distribution Using Object Types -> Activate Inbound Object Types
and specify the inbound object types for which serialization is
to be performed.
c) Schedule a job (RBDAPP01) that posts the IDocs that arrive in
series in the recipient system.
Switching On Delta Distribution
a) Activating change pointers for a message type
Changes to master data objects are logged in the form of change
pointers during master data distribution. To activate the
writing of change pointers, in Customizing for Distribution
(ALE) choose Set Up Data Distribution -> Master Data
Distribution -> Activating Change Pointers -> Activate Change
Pointers for Message Types and set the Active indicator for the
message type for which you want to realize delta distribution.
b) Activating change pointers for each field
From the SAP R/3 screen, choose Tools -> Business Framework ->
ALE -> Development -> Master data -> Activate change pointer for
each field and enter the message type for which you want to
determine fields, for which the SAP System writes change
pointers. All relevant data fields are delivered. If necessary,
adjust the table to your requirements.
c) Activating change pointers generally
To generally activate master data distribution using change
pointers, in Customizing for Distribution (ALE), choose Set Up
Data Distribution -> Master Data Distribution -> Activating
Change Pointers -> Activate Change Pointers (generally) and set
the active indicator.
d) Scheduling delta distribution as a job
You can perform delta distribution manually or schedule it as a
job.
To perform delta distribution manually, from the SAP R/3 screen
choose Tools -> Business Framework -> ALE -> Administration ->
Periodic processing -> Process change pointers and enter the
message type you require and choose Execute.
To schedule delta distribution as a job, in Customizing for
Distribution (ALE) choose Scheduling Periodic Processing ->
Scheduling Generation of IDocs from Change Pointers -> Define
Variants and create a variant. Then in the IMG activity Schedule
Jobs create a job (RBDMIDOC) for the variant. You can set the
time at which distribution is performed, for example,
immediately after a change or periodically.
The following applies when transferring data:
o If a specification is not found in the target system, it is created
with the specification key that is transferred.
o If a specification is available in the target system, its data is
updated.
o When complete specification data is being sent, the specification is
locked and no changes can be made to it. If this lock cannot be set,
the specification cannot be processed. The IDoc is given the status
with error and a work item is created.
Note:
In manual distribution, change pointers are not taken into account.
In other words, the entire data record is distributed and delta
distribution is not performed.
Note on Executing the Report Program RC1PHDEL:
You must NOT schedule the report program RC1PHDEL (physical deletion
of data) to run in the source system or manually run it between
initial distribution and delta distribution, because the keys of
logically deleted data records can no longer be read and distributed
by delta distribution after the report program has been executed.
Before initial distribution you can execute the report in the source
system.
In the target system you can execute the report program independent
of distribution, as long as the target system does not serve as the
source system for further distribution.
If you want to execute the report program after the first delta
distribution, you must first make sure that all change pointers
created have been fully processed by the delta distribution and, if
possible, that all IDocs (intermediate documents) created were also
successfully posted in the target system. Otherwise, there is no
guarantee that the source and target systems will be consistent.
Example
The report program RC1PHDEL was executed in the source system so
that the deletions of an identifier of the identification category
NUM and the identification type CAS and a phrase item were not
distributed.
Then an identifier of the identification category NUM and
identification type CAS with the identical value that was previously
deleted and a phrase item for the same phrase in the previously
deleted language are created. The following errors occur in the
subsequent delta distribution:
- The identifier cannot be created because the NUM-CAS value
already exists. The IDoc cannot therefore be posted.
Notes:
If you attempt to create duplicate identifiers in the dialog,
the same error messages will be displayed as when posting using
ALE.
If the NUM-CAS identifier was not identical to the previously
deleted identifier, the IDoc would be posted successfully and
the identifier would be created in addition.
- The phrase item cannot be created, because only one phrase item
is permitted for a phrase in each language. The old phrase item
must be deleted first before the IDoc can be posted.
The next section describes the solution for similar error cases. You
should, however, try to avoid the need for this procedure by not
executing the report program.
e) In the target system in the dialog, delete the EH&S objects
(specifications, phrases, reports), for which you have already
run delta distribution. These objects are internally marked with
a deletion indicator.
Caution:
When you distribute reports, the report bodies are distributed.
To delete these, you simply need to delete all report reference
specifications. Deleting report bodies is not necessary and also
not possible. If you cannot delete objects (such as
specifications or phrases) owing to their where-used-list, it is
sufficient to delete the 'deleted' detail data in the target
system that has not been distributed. This may require more
processing effort than if you delete all the objects using the
hit list and distribute all the data again in full.
f) In the target system, set the 'Set missing deletion indicators'
and 'Delete physically' indicators in the RC1PHDEL report program
and execute the report program in the target system.
This triggers the following actions:
- The deletion indicator is set in the tables that depend on the
header data to be deleted.
- All data with deletion indicators is deleted physically.
- The corresponding entries in the table ESTALE (conversion
table for ALE) are deleted physically.
g) Deactivate the active serialization and the writing of change
pointers and start initial distribution of all objects again.
h) You can now activate the writing of change pointers and
serialization again and use delta distribution as usual.
Note:
If you still have IDocs in the source system or target system that
were created before the deletion in your target system, but were not
distributed or posted, ignore these IDocs and do not post them. You
can do this by changing the channel, using the environment parameter
ALE_SERIAL_ID, before repeating the initial distribution (for more
information on the environment parameter, see the IMG activity
Set Up Distribution of Specification Data).
Hope this helps
Mark -
Where used list of a Z field in BI extraction programs
Hi,
How can I find out whether a particular table or table field is used in any BI extraction program? We are modifying the structure of a large Z table and the related custom developments, so I want to analyse and list the BI extraction programs that will be affected by this change. Please let me know how I can find this. I am an R/3 Logistics functional consultant and do not have much experience in BI...
Thanks in advance.
Regards,
CHS.
Hi,
this sounds like a question for the BW area.
regards
Ingo Hilgefort. -
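As a starting point on the R/3 side, you can look up which programs reference a given table via the where-used index. A minimal sketch, assuming the standard cross-reference table D010TAB (program-to-table usage index); the report name ZFIND_TABLE_USAGE and the default table name ZMYTABLE are placeholders, and field-level usage would still need the SE11 where-used list:

```abap
* Hedged sketch: list programs that reference a given Z table,
* using the standard program/table cross-reference index D010TAB.
* ZMYTABLE is a placeholder - replace it with your table name.
REPORT zfind_table_usage.

PARAMETERS p_tab TYPE tabname DEFAULT 'ZMYTABLE'.

DATA lt_progs TYPE STANDARD TABLE OF d010tab-master.
DATA lv_prog  TYPE d010tab-master.

SELECT master FROM d010tab
  INTO TABLE lt_progs
  WHERE tabname = p_tab.

LOOP AT lt_progs INTO lv_prog.
  WRITE: / lv_prog.
ENDLOOP.
```

For the BW side (transformations, update rules, extract structures), the where-used list in the BW system itself is still the authoritative check, as Ingo suggests.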
Incorrect call-up of the extraction program
Hi,
I am trying to extract data from R/3 into BW through a function module (function module based extraction).
I get the error message while trying to extract in RSA6 as
"Incorrect call-up of the extraction program".
Could you please let me know if its the problem in any settings or any other issue.
Thanks
Regds
Gautam
Hi Gautam,
If possible, send the code for your FM. I think the error is likely the name of the DataSource: you need to mention the name of the DataSource in your code (i_datasource).
If this helps, please assign points.
Regards,
Pawan. -
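The check Pawan describes can be sketched as follows. A minimal sketch modelled on the standard template function module RSAX_BIW_GET_DATA_SIMPLE; the function name Z_BIW_GET_DATA and the DataSource name ZBW_MY_DATASOURCE are placeholders, and the full parameter interface is abbreviated in comments:

```abap
* Hedged sketch of the DataSource check in a function-module-based
* extractor, modelled on the template RSAX_BIW_GET_DATA_SIMPLE.
* Z_BIW_GET_DATA and ZBW_MY_DATASOURCE are placeholder names.
FUNCTION z_biw_get_data.
* IMPORTING  VALUE(I_DSOURCE) ... (DataSource name)
*            VALUE(I_INITFLAG) / I_MAXSIZE / I_T_SELECT / I_T_FIELDS ...
* EXCEPTIONS NO_MORE_DATA
*            ERROR_PASSED_TO_MESS_HANDLER

* Verify that this extractor was called for the DataSource it
* actually implements; an unexpected name is a typical cause of
* "Incorrect call-up of the extraction program".
  CASE i_dsource.
    WHEN 'ZBW_MY_DATASOURCE'.
      " Expected DataSource - continue with initialization/extraction.
    WHEN OTHERS.
      " Unknown DataSource name passed by the Service API.
      RAISE error_passed_to_mess_handler.
  ENDCASE.

* ... remainder of initialization and data fetch logic ...
ENDFUNCTION.
```

If the DataSource name maintained in RSO2 does not match the name the function module expects, the Service API reports exactly this kind of call-up error.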
Datasource problem - error 8 when starting extraction program
Hello,
we are using a DataSource (data mart) to write data from one cube to another.
When running the InfoPackage we get the two errors below. When testing with the extractor checker, everything was fine. The DataSource was replicated and the transfer rules are active.
1. Message no. R3019 - Error 8 when starting extraction program
2. Message no. RSM340 - Errors in source system
more error info...
" Error message from the source system
Diagnosis
An error occurred in the source system.
System response
Caller 09 contains an error message.
Further analysis:
The error occurred in Service API ."
Has anybody any ideas?
Thank you,
Ruairi
Hi Friend,
Check the following OSS notes
947690
511475
23538
Even though note 947690 is about process chains, the steps mentioned in it can also cause "Error 8" when scheduling a package for extraction. In your case, I think the server name or the host name is incorrect. Check the host and server names in transaction SM51, and ask Basis to check the RFC entries in SM59 as well.
Regards,
varun.