Data load - InfoPackage
Hi guys,
Can anyone explain the difference between the options 'Start Data Load Immediately' and 'Start Later in Background' with 'Immediate Start' in an InfoPackage?
Thank You,
Troy
hi Troy,
If you select the option 'Start Data Load Immediately', the data load starts right away and you can follow it in the load monitor.
The advantage of selecting 'Start Later in the Background' with 'Immediate Start' is that the data load is assigned a job name for the background process, so you can track the load by that job name in SM37. I use it because I can look the job up in SM37 and see the status of the data load.
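For reference, InfoPackage background loads typically get job names of the form BIREQU_*; a rough sketch for listing them programmatically against the standard batch job table TBTCO follows (the naming pattern is an assumption here, so check your own job names in SM37 first):

```abap
* Sketch: list recent BW load request jobs from the batch job
* overview table TBTCO. The BIREQU_* naming pattern is assumed.
DATA: LT_JOBS TYPE STANDARD TABLE OF TBTCO,
      LS_JOB  TYPE TBTCO.

SELECT * FROM TBTCO INTO TABLE LT_JOBS
       WHERE JOBNAME LIKE 'BIREQU_%'
       ORDER BY STRTDATE DESCENDING STRTTIME DESCENDING.

LOOP AT LT_JOBS INTO LS_JOB.
  WRITE: / LS_JOB-JOBNAME, LS_JOB-STATUS, LS_JOB-STRTDATE.
ENDLOOP.
```

This only reads the job overview; the same information is visible interactively in SM37 by filtering on the job name.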
Hope it helps.
Similar Messages
-
Data mart data load InfoPackage gets short dumps
This is related to the solution Vijay provided in the link What is the functionality of 0FYTLFP [Cumulated to Last Fiscal Year/Period
I have encountered the problem again with a Data Mart load: I created different initial-load InfoPackages with different data selections and ran them separately, the initial data packets got mixed up, and now whenever I try to create a new InfoPackage I get short dumps. RSA7 on the BW system itself does not show the faulty entry.
I tried to use the program RSSM_OLTP_INIT_DELTA_UPDATE you provided; it has three parameters:
LOGSYS (required)
DATASOUR (required)
ALWAYS (not required)
I fill in LOGSYS with our BW system's source system name from the InfoPackage and DATASOUR with the DataSource name 80PUR_C01, but nothing happens when I click the execute button!
Then I tried another option you suggested by checking the entries in the following three tables:
ROOSPRMS Control Parameters Per DataSource
ROOSPRMSC Control Parameter Per DataSource Channel
ROOSPRMSF Control Parameters Per DataSource
I find there is no entry in the first table for DataSource 80PUR_C01, but two entries in each of the second and third tables. Should I go ahead and delete these two entries from those two tables?
Thanks
Kevin,
Sorry, I didn't follow your problem/question, but be careful when you want to modify the content of these tables!
There is a high risk of inconsistencies... (Why don't you ask for SAP support via OSS for this situation?)
Hope it helps!
Bye,
Roberto -
Data load InfoPackage gets short dumps
This is related to the solution Vijay provided in the link What is the functionality of 0FYTLFP [Cumulated to Last Fiscal Year/Period
I have encountered the problem again with a Data Mart load: I created different initial-load InfoPackages with different data selections and ran them separately, the initial data packets got mixed up, and now whenever I try to create a new InfoPackage I get short dumps. RSA7 on the BW system itself does not show the faulty entry.
I tried to use the program RSSM_OLTP_INIT_DELTA_UPDATE you provided; it has three parameters:
LOGSYS (required)
DATASOUR (required)
ALWAYS (not required)
I fill in LOGSYS with our BW system's source system name from the InfoPackage and DATASOUR with the DataSource name 80PUR_C01, but nothing happens when I click the execute button!
Then I tried another option you suggested by checking the entries in the following three tables:
ROOSPRMS Control Parameters Per DataSource
ROOSPRMSC Control Parameter Per DataSource Channel
ROOSPRMSF Control Parameters Per DataSource
I find there is no entry in the first table for DataSource 80PUR_C01, but two entries in each of the second and third tables. Should I go ahead and delete these two entries from those two tables?
Thanks
Hey Paolo,
I tried RSSM_OLTP_INIT_DELTA_UPDATE, but there are two required input fields:
LOGSYS (required)
DATASOUR (required)
I fill in LOGSYS with our BW system's source system name from the InfoPackage and DATASOUR with the DataSource name 80PUR_C01, but nothing happens when I click the execute button! Is the LOGSYS value I entered correct, i.e. the BW source system name from the InfoPackage? I also tried running the program several times; things got a little better, but the state is still abnormal. Maybe I need to try deleting the last two control table entries?
There is a problem with giving reward points on this website right now; I will give you the reward points later.
Thanks -
BI 7.0 data load issue: InfoPackage can only load data to PSA?
BI 7.0 backend extraction gurus,
We created a generic DataSource on R3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
After the transformation between the InfoSource and the ODS was created on this BI system, a new folder called "Data Transfer Process" also appeared under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab, we picked 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. We clicked this button (note: at this point we had not created an InfoPackage yet), which seemed to run the data load, but we found no data available even though every status showed green (we do have a couple of records in the R3 table).
Then we tried to create an InfoPackage. On the Processing tab, the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! On the Data Target tab, the ODS cannot be selected as a target! There are also some new columns on this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under the column 'DTP(S) are active and load to this target' there is an inactive icon, which is odd since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records being brought in, but since 'Only PSA' is checked on the Processing tab with all the other options dimmed, no data reaches the ODS! Why can 'Only PSA' be checked in BI 7.0 with everything else dimmed?
Many new features in BI 7.0! Anyone's ideas or experience on how to load data in BI 7.0 would be greatly appreciated!
You don't have to select anything.
Once the data is loaded to the PSA, the DTP gives you the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the most recent load in the PSA.
Go through these links for lucid explanations:
Infopackage -
http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
Creating DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
Pre-requisite:
You have used transformations to define the data flow between the source and target object.
Creating transformations-
http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
Hope it Helps
Chetan
@CP.. -
Loading data from infopackage via application server
Hi Gurus,
I have a requirement where I need to write the data in an internal table to a CSV file on the application server (AL11) via OPEN DATASET, then read the file back from the application server via an InfoPackage (routine) and load it to the PSA.
Now I have created a custom program to load the data to the AL11 application server, using the code below.
DATA: BEGIN OF XX,
        NODE_ID    TYPE N LENGTH 8,
        INFOOBJECT TYPE C LENGTH 30,
        NODENAME   TYPE C LENGTH 60,
        PARENT_ID  TYPE N LENGTH 8,
      END OF XX.
DATA: I_TAB LIKE STANDARD TABLE OF XX.
DATA: FILE_NAME TYPE RLGRAP-FILENAME.

FILE_NAME = './SIMMA2.CSV'.
OPEN DATASET FILE_NAME FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

XX-NODE_ID = '5'.
XX-INFOOBJECT = 'ZEMP_H'.
XX-NODENAME = '5'.
XX-PARENT_ID = '1'.
APPEND XX TO I_TAB.

XX-NODE_ID = '6'.
XX-INFOOBJECT = 'ZEMP_H'.
XX-NODENAME = '6'.
XX-PARENT_ID = '1'.
APPEND XX TO I_TAB.

LOOP AT I_TAB INTO XX.
  TRANSFER XX TO FILE_NAME.
ENDLOOP.
CLOSE DATASET FILE_NAME.
now i can see the data in the application server AL11.
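Note that TRANSFER of the flat structure writes a fixed-width line, not a delimited one. If the InfoPackage is to read the file with file type 'CSV file', the fields would typically be concatenated with an explicit separator first. A minimal sketch, assuming ',' as the data separator and reusing the names from the program above:

```abap
* Sketch: write each record as a comma-separated line so that an
* InfoPackage with file type 'CSV file' and data separator ',' can
* parse it. Assumes XX, I_TAB and FILE_NAME as declared above.
DATA: LV_LINE TYPE STRING.

LOOP AT I_TAB INTO XX.
  CONCATENATE XX-NODE_ID XX-INFOOBJECT XX-NODENAME XX-PARENT_ID
         INTO LV_LINE SEPARATED BY ','.
  TRANSFER LV_LINE TO FILE_NAME.
ENDLOOP.
CLOSE DATASET FILE_NAME.
```

The separator character chosen here must match whatever is entered in the 'data separator' field of the InfoPackage.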
Then in my infopackage i have the following code,
form compute_flat_file_filename
using p_infopackage type rslogdpid
changing p_filename like rsldpsel-filename
p_subrc like sy-subrc.
* Insert source code to current selection field
*$*$ begin of routine - insert your code only below this line -*$*$
P_FILENAME = './SIMMA2.CSV'.
DATA : BEGIN OF XX,
NODE_ID TYPE N LENGTH 8,
INFOOBJECT TYPE C LENGTH 30,
NODENAME TYPE C LENGTH 60,
PARENT_ID TYPE N LENGTH 8,
END OF XX.
DATA : I_TAB LIKE STANDARD TABLE OF XX.
OPEN DATASET P_FILENAME FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF SY-SUBRC = 0.
  DO.
    READ DATASET P_FILENAME INTO XX.
    IF SY-SUBRC <> 0.
      EXIT.
    ELSE.
      APPEND XX TO I_TAB.
    ENDIF.
  ENDDO.
ENDIF.
CLOSE DATASET P_FILENAME.
P_SUBRC = 0.
I have the following doubts:
While writing the data from the internal table to the application server, do I need to add any "data separator" and "escape sign" characters?
Also, at the InfoPackage level I will select the file type 'CSV file'; what characters do I need to enter in the 'data separator' and 'escape sign' boxes? Please point me to a clear tutorial for this. Also, can we use a process chain to load data for an InfoPackage using a file from the application server? This is a 3.x DataSource where we are loading a hierarchy via a flat file on the application server.
Edited by: Raghavendraprasad.N on Sep 6, 2011 4:24 PM
Hi,
Correct me if my understanding is wrong... I think you are trying to load data to the initial ODS, and from that ODS the data goes to two targets through the PSA (a cube and an ODS)...
I think you are working on the 3.x version right now; make sure of the following process in your process chain:
Start process
Load to initial ODS
Activation of the initial ODS
Further update through the PSA (which updates both the ODS and the cube).
Make sure that you have proper update rules and an init for both targets from the lower ODS, and then load the data.
Thanks -
Error while starting data loading on InfoPackage
Hi everybody,
I'm new to SAP BW and I'm working through the "Step-By-Step: From Data Model to the BI Application in the Web" document from SAP.
I'm having a problem at Chapter 9, item c (Starting Data Load Immediately).
If anyone can help me:
Thanks,
Thiago
Below is a copy of the error from my SAP GUI.
<><><><><><><><><><<><><><><><><><><><><><><><><><><><><><><><>
Runtime Errors MESSAGE_TYPE_X
Date and Time 19.01.2009 14:41:22
Short text
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact you SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
Short text of error message:
Batch - Manager for BW Processes ***********
Long text of error message:
Technical information about the message:
Message class....... "RSBATCH"
Number.............. 000
Variable 1.......... " "
Variable 2.......... " "
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
If the error occures in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"MESSAGE_TYPE_X" " "
"SAPLRSBATCH" or "LRSBATCHU01"
"RSBATCH_START_PROCESS"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a problem of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
System environment
SAP-Release 701
Application server... "sun"
Network address...... "174.16.5.194"
Operating system..... "Windows NT"
Release.............. "5.1"
Hardware type........ "2x Intel 801586"
Character length.... 8 Bits
Pointer length....... 32 Bits
Work process number.. 2
Shortdump setting.... "full"
Database server... "localhost"
Database type..... "ADABAS D"
Database name..... "NSP"
Database user ID.. "SAPNSP"
Terminal.......... "sun"
Char.set.... "English_United State"
SAP kernel....... 701
created (date)... "Jul 16 2008 23:09:09"
create on........ "NT 5.2 3790 Service Pack 1 x86 MS VC++ 14.00"
Database version. "SQLDBC 7.6.4.014 CL 188347 "
Patch level. 7
Patch text.. " "
Database............. "MaxDB 7.6, MaxDB 7.7"
SAP database version. 701
Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
NT 6.0"
Memory consumption
Roll.... 8112
EM...... 11498256
Heap.... 0
Page.... 65536
MM Used. 6229800
MM Free. 1085264
User and Transaction
Client.............. 001
User................ "THIAGO"
Language key........ "E"
Transaction......... "RSA1 "
Transactions ID..... "CD47E6DDD55EF199B4E6001B782D539C"
Program............. "SAPLRSBATCH"
Screen.............. "SAPLRSS1 2500"
Screen line......... 7
Information on where terminated
Termination occurred in the ABAP program "SAPLRSBATCH" - in
"RSBATCH_START_PROCESS".
The main program was "RSAWBN_START ".
In the source code you have the termination point in line 340
of the (Include) program "LRSBATCHU01".
Source Code Extract
Line SourceCde
310 endif.
311 l_lnr_callstack = l_lnr_callstack - 1.
312 endloop. " at l_t_callstack
313 endif.
314
315 *---- Eintrag für RSBATCHHEADER -
316 l_s_rsbatchheader-batch_id = i_batch_id.
317 call function 'GET_JOB_RUNTIME_INFO'
318 importing
319 jobcount = l_s_rsbatchheader-jobcount
320 jobname = l_s_rsbatchheader-jobname
321 exceptions
322 no_runtime_info = 1
323 others = 2.
324 call function 'TH_SERVER_LIST'
325 tables
326 list = l_t_server
327 exceptions
328 no_server_list = 1
329 others = 2.
330 data: l_myname type msname2.
331 call 'C_SAPGPARAM' id 'NAME' field 'rdisp/myname'
332 id 'VALUE' field l_myname.
333 read table l_t_server with key
334 name = l_myname.
335 if sy-subrc = 0.
336 l_s_rsbatchheader-host = l_t_server-host.
337 l_s_rsbatchheader-server = l_myname.
338 refresh l_t_server.
339 else.
>>>>> message x000.
341 endif.
342 data: l_wp_index type i.
343 call function 'TH_GET_OWN_WP_NO'
344 importing
345 subrc = l_subrc
346 wp_index = l_wp_index
347 wp_pid = l_s_rsbatchheader-wp_pid.
348 if l_subrc <> 0.
349 message x000.
350 endif.
351 l_s_rsbatchheader-wp_no = l_wp_index.
352 l_s_rsbatchheader-ts_start = l_tstamps.
353 l_s_rsbatchheader-uname = sy-uname.
354 l_s_rsbatchheader-module_name = l_module_name.
355 l_s_rsbatchheader-module_type = l_module_type.
356 l_s_rsbatchheader-pc_variant = i_pc_variant.
357 l_s_rsbatchheader-pc_instance = i_pc_instance.
358 l_s_rsbatchheader-pc_logid = i_pc_logid.
359 l_s_rsbatchheader-pc_callback = i_pc_callback_at_end.
Hi,
I am also getting a related issue; kindly see the short dump description below.
Short text
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact you SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
Short text of error message:
Variant RSPROCESS0000000000705 does not exist
Long text of error message:
Diagnosis
You selected variant 00000000705 for program RSPROCESS.
This variant does not exist.
System Response
Procedure
Correct the entry.
Technical information about the message:
Message class....... "DB"
Number.............. 612
Variable 1.......... "&0000000000705"
Variable 2.......... "RSPROCESS"
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
If the error occures in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"MESSAGE_TYPE_X" " "
"SAPLRSPC_BACKEND" or "LRSPC_BACKENDU05"
"RSPC_PROCESS_FINISH"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a problem of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
System environment
SAP-Release 701
Application server... "CMCBIPRD"
Network address...... "192.168.50.12"
Operating system..... "Windows NT"
Release.............. "6.1"
Hardware type........ "16x AMD64 Level"
Character length.... 16 Bits
Pointer length....... 64 Bits
Work process number.. 0
Shortdump setting.... "full"
Database server... "CMCBIPRD"
Database type..... "MSSQL"
Database name..... "BIP"
Database user ID.. "bip"
Terminal.......... "CMCBIPRD"
Char.set.... "C"
SAP kernel....... 701
created (date)... "Sep 9 2012 23:43:54"
create on........ "NT 5.2 3790 Service Pack 2 x86 MS VC++ 14.00"
Database version. "SQL_Server_8.00 "
Patch level. 196
Patch text.. " "
Database............. "MSSQL 9.00.2047 or higher"
SAP database version. 701
Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
NT 6.0, Windows NT 6.1, Windows NT 6.2"
Memory consumption
Roll.... 16192
EM...... 4189840
Heap.... 0
Page.... 16384
MM Used. 2143680
MM Free. 2043536
User and Transaction
Client.............. 001
User................ "BWREMOTE"
Language Key........ "E"
Transaction......... " "
Transactions ID..... "9C109BE2C9FBF18BBD4BE61F13CE9693"
Program............. "SAPLRSPC_BACKEND"
Screen.............. "SAPMSSY1 3004"
Screen Line......... 2
Information on caller of Remote Function Call (RFC):
System.............. "BIP"
Database Release.... 701
Kernel Release...... 701
Connection Type..... 3 (2=R/2, 3=ABAP System, E=Ext., R=Reg. Ext.)
Call Type........... "asynchron without reply and transactional (emode 0, imode
0)"
Inbound TID.........." "
Inbound Queue Name..." "
Outbound TID........." "
Outbound Queue Name.." "
Information on where terminated
Termination occurred in the ABAP program "SAPLRSPC_BACKEND" - in
"RSPC_PROCESS_FINISH".
The main program was "SAPMSSY1 ".
In the source code you have the termination point in line 75
of the (Include) program "LRSPC_BACKENDU05".
Source Code Extract
Line SourceCde
45 l_t_info TYPE rs_t_rscedst,
46 l_s_info TYPE rscedst,
47 l_s_mon TYPE rsmonpc,
48 l_synchronous TYPE rs_bool,
49 l_sync_debug TYPE rs_bool,
50 l_eventp TYPE btcevtparm,
51 l_eventno TYPE rspc_eventno,
52 l_t_recipients TYPE rsra_t_recipient,
53 l_s_recipients TYPE rsra_s_recipient,
54 l_sms TYPE rs_bool,
55 l_t_text TYPE rspc_t_text.
56
57 IF i_dump_at_error = rs_c_true.
58 * ==== Dump at error? => Recursive Call catching errors ====
59 CALL FUNCTION 'RSPC_PROCESS_FINISH'
60 EXPORTING
61 i_logid = i_logid
62 i_chain = i_chain
63 i_type = i_type
64 i_variant = i_variant
65 i_instance = i_instance
66 i_state = i_state
67 i_eventno = i_eventno
68 i_hold = i_hold
69 i_job_count = i_job_count
70 i_batchdate = i_batchdate
71 i_batchtime = i_batchtime
72 EXCEPTIONS
73 error_message = 1.
74 IF sy-subrc <> 0.
>>> MESSAGE ID sy-msgid TYPE 'X' NUMBER sy-msgno
76 WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
77 ELSE.
78 EXIT.
79 ENDIF.
80 ENDIF.
81 * ==== Cleanup ====
82 COMMIT WORK.
83 * ==== Get Chain ====
84 IF i_chain IS INITIAL.
85 SELECT SINGLE chain_id FROM rspclogchain INTO l_chain
86 WHERE log_id = i_logid.
87 ELSE.
88 l_chain = i_chain.
89 ENDIF.
90 * ==== Lock ====
91 * ---- Lock process ----
92 DO.
93 CALL FUNCTION 'ENQUEUE_ERSPCPROCESS'
94 EXPORTING
If we do this:
use table RSSDLINIT, enter the name of the DataSource in OLTPSOURCE and the name of the source system in LOGSYS, and delete the entry for that InfoPackage - will the process chain then run successfully and the short dump no longer occur, or what? Kindly give a detailed explanation of this RSSDLINIT table.
Regards,
poluru -
Loading Infopackages one at a time (i.e. one by one)
Dear All BI Experts,
We are implementing SCM5.1 with BI7.0
We load a cube in BI with daily forecast data from APO, but unfortunately we have spotted that one of the dates is wrong, so the data in the cube has been summing up incorrectly.
That has now been fixed in the transformation, so what I want to do is reload the cube, but one InfoPackage at a time (not all eight that are queued up now).
Does anyone know how to load packages one by one?
Any help will be gratefully received.
Nigel Blanchard
Hi Nigel Blanchard,
There is an old-school solution - I am sorry if my mind works like an old-school guy's - but let me share my thoughts here.
As suggested, just run the DTP between the DataSource and the cube with the 'Get delta request by request' option, following the process below.
Say you have 18 requests in the PSA. When you run the DTP the first time it tries to fetch the requests one by one up to the 18th, so you can manually set the process to red, delete the unwanted requests, change the date-related logic in the transformation, and then run the DTP again. This time it will fetch requests 2-18, so after the second request has loaded, break the load by changing the QM status of the DTP; if that is not possible, go to the target cube's Manage screen, stop the data load there, and remove every request other than 2 (i.e. remove 3-18).
Then repeat the process: load request 3, break the load, delete requests 4-18; load request 4, and so on.
Repeat until all the requests are finished, and finally cross-verify whether all the requests were loaded.
Hope it's a little clearer..!
Thanks
K M R
***Even if you have nothing, you can get anything.
But your attitude & approach should be positive..!****
>
Nigel Blanchard wrote:
> Dear Respondants so far,
>
> At least KMR is the closest!
>
> Geo - Your comment doesn't really help me.
>
> Jagadish - Yes you can move data from a data source or PSA straight into a Cube. You do not need a DSO. So I do mean infopackages.
>
> KMR - You seem to have got the idea of what I am doing.
> Forecast data is created in APO and saved into a DSO.
> We have built an extract datasource in BI that fetches this data daily to a datasource on BI.
> We then load this data directly into a cube. We have no need for a DSO as no transformation takes place (the DSO in APO essentially becomes our corporate memory anyway).
>
> So the problem is this;
>
> Initially, moving the data from the datasource to the cube has gone wrong due to the date problem. We have now fixed this in the transformation between the cube and the datasource.
> To reload the data back into the cube, I need to load one package at a time. After each package I need to change the transformation to reset this date. So, that is what I mean when I say I need to load one package at a time.
> The process will look something like; Set date on transformation->Load First package->reset date on transformation->Load Second package->reset date on transformation->Load Third package and so on until I catch up with today's load (about 8 days worth). After that I can just use the 'Current Date' function to automatically fill in the correct date. So the sooner I do it the less manual intervention required!
>
> You are right about using the DTP to delta the data into the cube request by request, but this does not give me the break between requests that I need to manually reset the date in the transformation. This option runs all of them one after the other without stopping. Is there no way I can just specify the request I want to load by request number, or an option to simply load one request, so that I get the break between loads that I need?
>
> Many thanks for your time and help so far.
>
> Nigel.
Edited by: K M R on Feb 16, 2009 4:30 PM -
Data load problem - BW and Source System on the same AS
Hi experts,
I'm starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
BW runs on client 001 while SRM is on client 100, and I want to load data from SRM into BW.
I've configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the DataSources from this source system, and everything worked fine.
Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I've created an InfoPackage on one standard metadata DataSource (which has data, checked through RSA3 on the client 100 source system). I've started the data load process, but the monitor says that no IDocs arrived from the source system and keeps the status yellow forever.
Additional information:
BW Monitor Status:
Request still running
Diagnosis
No errors could be found. The current process has probably not finished yet.
System Response
The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
and/or
the maximum wait time for this request has not yet run out
and/or
the batch job in the source system has not yet ended.
Current status
No Idocs arrived from the source system.
BW Monitor Details:
0 from 0 records
but there are 2 records on RSA3 for this data source
Overall status: Missing messages or warnings
- Requests (messages): Everything OK
o Data request arranged
o Confirmed with: OK
- Extraction (messages): Missing messages
o Missing message: Request received
o Missing message: Number of sent records
o Missing message: Selection completed
- Transfer (IDocs and TRFC): Missing messages or warnings
o Request IDoc: sent, not arrived ; Data passed to port OK
- Processing (data packet): No data
Transactional RFC (SM58):
Function Module: IDOC_INBOUND_ASYNCHRONOUS
Target System: SRMCLNT100
Date Time: 08.03.2006 14:55:56
Status text: No service for system SAPSRM, client 001 in Integration Directory
Transaction ID: C8C415C718DC440F1AAC064E
Host: srm
Program: SAPMSSY1
Client: 001
Rpts: 0000
System Log (SM21):
14:55:56 DIA 000 100 BWREMOTE D0 1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
Documentation for system log message D0 1 :
The transaction has been terminated. This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction. The actual reason for the termination is indicated by the T100 message and the parameters.
Additional documentation for message IDOC_ADAPTER 601 No service for system &1, client &2 in Integration Directory No documentation exists for message ID601
RFC Destinations (SM59):
Both RFC destinations look fine, with connection and authorization tests successful.
RFC Users (SU01):
BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
Could someone help?
Thanks,
Guilherme
Guilherme,
I don't see any reason why it isn't bringing the data in. Are you doing a full extraction or a delta? For a delta extraction, please check whether the extractor is delta-enabled or not; sometimes this can cause problems.
Also check this weblog on basic checks for data load errors; it may help:
/people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
Thanks
Sat -
Data load from DSO to Cube in BI7?
Hi All,
We just migrated a dataflow from 3.5 to 7 in development and moved it to production. Until now, the data loads in production have run using InfoPackages:
a. Infopackage1 from datasource to ODS and
b. Infopackage2 from ODS to the CUBE.
Now after we transported the migrated dataflow to production, to load the same infoproviders I use
1. Infopackage to load PSA.
2. DTP1 to load from PSA to DSO.
3. DTP2 to load from DSO to CUBE.
Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. However, when I try step b (above), it loads the cube fine using the InfoPackage. So I cannot understand why the DTP fails while the InfoPackage load is successful. Before using the DTP for the first time, do we need to do any cleanup? Please let me know if you have any suggestions.
Please note that the DSO already has data loaded via InfoPackage. (Is this causing the problem?)
Thanks,
Sirish.
Hi Naveen,
Thanks for the Reply. The creation of DTP is not possible without a transformation.
The transformation has been moved to production successfully. -
Hi guys...
Suppose I have two DataSources that are mapped to an InfoSource, and this InfoSource is mapped to one DSO (all objects up to the DSO are emulated from 3.x to 7.x). When I load data, I assume I have to use two InfoPackages, so I get data into the DSO in two requests. I have a few questions about this, assuming I have only these two requests in my DSO:
1. When I tried to create a query directly on the DSO in Query Designer, I could not find the InfoObject 0REQUESTID. How can I see the data request by request rather than all together?
2. Suppose the DSO gets data like below:
Fields in DSO: X1, X2, Y1, Y2, Y3 [X1, X2 are characteristics and also keys; Y1, Y2, Y3 are key figures]
Data fed by DataSource 1: X1 X2 Y1
a b 10
Data fed by DataSource 2: X1 X2 Y2 Y3
a b 20 30
So when I load the data, I load it in two requests, and these are the only two requests in my DSO. How will the data look in the DSO - is it stored in two separate rows or a single row, and how is it shown in a query result?
If the keys do not match, how will the data be shown for key figures that were not loaded by that request?
3. I know that in a DSO we have two options, Overwrite/Addition. How will the data load behave in the following situation:
DataSource 1 feeds this in request 1:
X1 X2 Y1
a b 10
DataSource 2 feeds this in request 2:
X1 X2 Y1 Y2 Y3
a b 30 40 50
How will the result look with the two options, Addition and Overwrite? Will request 2 overwrite or add up the data in Y1?
Thanks.Hi guys...
Suppose I have two Datasources that are mapped to a infosource and this infosource is mapped to one dso(all objects until DSO are emulated from 3.x to 7.x)...when I load data,I assume that I have to use two infopackages and I get data into DSO in two requests.I have few questions about this,assuming I have only these two requests in my DSO:
1.When I tried to create a query directly on DSO in query designer... I couldnot find the infoobject 0REQUESTID in query designer...then how can I do if I want to see data request by request rather than all together?
Request-ID is only a part of the new data table - after activation of your data your request will get lost. If you want to see whats happening, load you data request by request and activate your data after each request
2. Suppose the DSO gets data like below:
Fields in DSO: X1, X2, Y1, Y2, Y3 (X1, X2 are characteristics and also keys; Y1, Y2, Y3 are key figures)
Data fed by DataSource 1: X1 X2 Y1
a b 10
Data fed by DataSource 2: X1 X2 Y2 Y3
a b 20 30
So when I load, the data arrives in two requests, and these are the only two requests in my DSO. How will the data look in the DSO? Is it stored in two separate rows or in a single row? How is it shown in a query result?
If the keys are equal, you will have only one record in your DSO.
If the keys do not match, how will the data be shown for the key figures that are not loaded by that request?
Then you will have two records in your DSO.
3. I know that in a DSO we have two options, Overwrite and Addition. How will the data load behave in the following situation?
DataSource 1 feeds this in Request 1:
X1 X2 Y1
a b 10
DataSource 2 feeds this in Request 2:
X1 X2 Y1 Y2 Y3
a b 30 40 50
How will the result look with the two options, Addition and Overwrite? Will Request 2 overwrite or add up the data in Y1?
If you choose Overwrite, you will get 30; if you choose Addition, you will get 40.
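The merge behaviour described above can be sketched with a small Python simulation (illustrative only; this is not real BW activation logic, just a model of how the two update options treat the key figures on matching keys):

```python
# Illustrative simulation of how a DSO merges incoming requests on its
# key fields during activation. "overwrite"/"addition" mirror the two
# update options for key figures.

def activate_request(dso, request, keys, mode="overwrite"):
    """Merge one request (a list of records) into the DSO table."""
    for record in request:
        key = tuple(record[k] for k in keys)
        if key not in dso:
            # New key: take whatever key figures this request supplies
            dso[key] = {f: v for f, v in record.items() if f not in keys}
        else:
            for field, value in record.items():
                if field in keys:
                    continue
                if mode == "addition":
                    dso[key][field] = dso[key].get(field, 0) + value
                else:  # overwrite
                    dso[key][field] = value
    return dso

keys = ("X1", "X2")
req1 = [{"X1": "a", "X2": "b", "Y1": 10}]
req2 = [{"X1": "a", "X2": "b", "Y1": 30, "Y2": 40, "Y3": 50}]

dso = activate_request({}, req1, keys)
dso = activate_request(dso, req2, keys, mode="overwrite")
print(dso)   # Y1 ends up as 30

dso2 = activate_request({}, req1, keys)
dso2 = activate_request(dso2, req2, keys, mode="addition")
print(dso2)  # Y1 ends up as 10 + 30 = 40
```

Because the keys (a, b) match, both requests collapse into a single record; only the treatment of Y1 differs between the two modes.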
Thanks. -
Data loading after field enhancement.
Dear all,
We are using BI 7.00, and in one of our DataSources a new field has to be enabled. Our people are under the impression that, without downtime, the previous data available in the target and the PSA can also receive values for the new field.
I could not see how this is possible, so I would appreciate the experts' suggestions. Can you kindly answer the following questions?
1) Can the enhancement be done to the DataSource without deleting the setup table?
2) Can the delta queue stay as it is, without stopping the delta pull process (i.e., the process chain and the background jobs)?
3) If the field is enhanced, can values for the field be loaded for all the data previously loaded to the PSA and the target?
I request the experts to provide an apt solution so that the field enhancement can take place without disturbing any of the data loads.
I went through the forum posts and found something about export DataSources and loop-back principles; these suggest that my requirement is possible.
I do not know the process. Can the experts provide step-by-step suggestions for my query?
Regards,
M.M

Hello Magesh,
1) The enhancement cannot be done if there are records in the setup tables.
2) When an enhancement is done, the delta queue also needs to be empty, so you will have to stop the collective run jobs, lock the system, and empty the delta queue by scheduling the delta package twice. Only then will the transports to production be successful.
3) Until you fill the setup tables again and do historical loads, the old records will not have values for the newly added field.
If you just do an init without data transfer and schedule new delta loads, the newly added fields will contain values from that day onwards (and subsequent changes); previously loaded values in BW will remain as they are. To get historical values for the newly added fields, you need to load the history through full repair loads after filling the setup tables.
Follow these steps to load only the new values for the added fields:
1) Lock the system.
2) Schedule the collective update job through job control so that all the records are in the delta queue and no records or LUWs are left in LBWQ for that DataSource.
3) Schedule the delta InfoPackage twice so that even the queue for repeat delta is empty.
4) Do the transports, then delete the old init and do a new init without data transfer.
5) Schedule the normal delta.
To have history for the added fields:
1) Lock the system.
2) Delete the old init and clear the LUWs from LBWQ.
3) Do the transports.
4) Fill the setup tables and do an init without data transfer for the DataSource.
5) Unlock the system.
6) Do the full repair loads to the BW data targets.
7) Schedule the delta loads.
Thanks
Ajeet -
Query performance and data loading performance issues
What are the query performance issues we need to take care of? Please explain and let me know the transaction codes.
What are the data loading performance issues we need to take care of? Please explain and let me know the transaction codes.
Will reward full points.
REGARDS
Guru

BW back end:
Some Tips -
1)Identify long-running extraction processes on the source system. Extraction processes are performed by several extraction jobs running on the source system. The run-time of these jobs affects the performance. Use transaction code SM37 Background Processing Job Management to analyze the run-times of these jobs. If the run-time of data collection jobs lasts for several hours, schedule these jobs to run more frequently. This way, less data is written into update tables for each run and extraction performance increases.
2)Identify high run-times for ABAP code, especially for user exits. The quality of any custom ABAP programs used in data extraction affects the extraction performance. Use transaction code SE30 ABAP/4 Run-time Analysis and then run the analysis for the transaction code RSA3 Extractor Checker. The system then records the activities of the extraction program so you can review them to identify time-consuming activities. Eliminate those long-running activities or substitute them with alternative program logic.
3)Identify expensive SQL statements. If database run-time is high for extraction jobs, use transaction code ST05 Performance Trace. On this screen, select ALEREMOTE user and then select SQL trace to record the SQL statements. Identify the time-consuming sections from the results. If the data-selection times are high on a particular SQL statement, index the DataSource tables to increase the performance of selection (see no. 6 below). While using ST05, make sure that no other extraction job is running with ALEREMOTE user.
4)Balance loads by distributing processes onto different servers if possible. If your site uses more than one BW application server, distribute the extraction processes to different servers using transaction code SM59 Maintain RFC Destination. Load balancing is possible only if the extraction program allows the option.
5)Set optimum parameters for data-packet size. Packet size affects the number of data requests to the database. Set the data-packet size to optimum values for an efficient data-extraction mechanism. To find the optimum value, start with a packet size in the range of 50,000 to 100,000 and gradually increase it. At some point, you will reach the threshold at which increasing packet size further does not provide any performance increase. To set the packet size, use transaction code SBIW BW IMG Menu on the source system. To set the data load parameters for flat-file uploads, use transaction code RSCUSTV6 in BW.
6)Build indexes on DataSource tables based on selection criteria. Indexing DataSource tables improves the extraction performance, because it reduces the read times of those tables.
7)Execute collection jobs in parallel. Like the Business Content extractors, generic extractors have a number of collection jobs to retrieve relevant data from DataSource tables. Scheduling these collection jobs to run in parallel reduces the total extraction time, and they can be scheduled via transaction code SM37 in the source system.
8)Break up your data selections for InfoPackages and schedule the portions to run in parallel. This parallel upload mechanism sends different portions of the data to BW at the same time, and as a result the total upload time is reduced. You can schedule InfoPackages in the Administrator Workbench.
You can upload data from a data target (InfoCube or ODS) to another data target within the BW system. While uploading, you can schedule more than one InfoPackage, with different selection options in each one. For example, fiscal year or fiscal year/period can be used as selection options. Avoid using parallel uploads for high volumes of data if hardware resources are constrained. Each InfoPackage uses one background process (if scheduled to run in the background) or one dialog process (if scheduled to run online) of the application server, and too many processes could overwhelm a slow server.
9)Build secondary indexes on the tables for the selection fields to optimize these tables for reading and reduce extraction time. If your selection fields are not key fields on the table, primary indexes are not much help when accessing data. In this case it is better to create secondary indexes with the selection fields on the associated table, using the ABAP Dictionary, to improve selection performance.
10)Analyze upload times to the PSA and identify long-running uploads. When you extract the data using the PSA method, data is written into PSA tables in the BW system. If your data is on the order of tens of millions of records, consider partitioning these PSA tables for better performance, but pay attention to the partition sizes. Partitioning PSA tables improves data-load performance because it is faster to insert data into smaller database tables. Partitioning also improves performance for maintenance of PSA tables; for example, you can delete a portion of data faster. You can set the size of each partition in the PSA parameters screen, in transaction code SPRO or RSCUSTV6, so that BW creates a new partition automatically when a threshold value is reached.
11)Debug any routines in the transfer and update rules and eliminate single selects from the routines. Using single selects in custom ABAP routines for selecting data from database tables reduces performance considerably. It is better to use buffers and array operations. When you use buffers or array operations, the system reads data from the database tables and stores it in the memory for manipulation, improving performance. If you do not use buffers or array operations, the whole reading process is performed on the database with many table accesses, and performance deteriorates. Also, extensive use of library transformations in the ABAP code reduces performance; since these transformations are not compiled in advance, they are carried out during run-time.
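The advice in point 11 (avoid per-record single selects in favour of one buffered bulk read) can be sketched in Python. This is illustrative only: the real routines would be ABAP, and the "database" here is just a dict standing in for a lookup table; the material numbers and the query counter are made up for the demonstration.

```python
# Contrast a per-record lookup (one database access per row) with a
# buffered lookup (one bulk read, then in-memory access).

database = {f"MAT{i:04d}": f"Group {i % 5}" for i in range(10_000)}
query_count = 0

def select_single(material):
    """Anti-pattern: one 'database' access per record."""
    global query_count
    query_count += 1
    return database.get(material)

def load_buffer(materials):
    """Better: read all needed rows once into an internal buffer."""
    global query_count
    query_count += 1               # a single bulk read
    return {m: database[m] for m in materials if m in database}

records = [f"MAT{i:04d}" for i in range(1000)]

# Per-record lookups: one query per record
query_count = 0
groups = [select_single(m) for m in records]
per_record_queries = query_count

# Buffered: one query, then cheap dict lookups in memory
query_count = 0
buffer = load_buffer(records)
groups2 = [buffer.get(m) for m in records]
buffered_queries = query_count

print(per_record_queries, buffered_queries)  # 1000 1
```

Both approaches produce the same result, but the buffered version touches the database once instead of once per record, which is the whole point of the tip.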
12)Before uploading a high volume of transaction data into InfoCubes, activate the number-range buffer for dimension IDs. The number-range buffer is a parameter that identifies the number of sequential dimension IDs stored in the memory. If you increase the number range before high-volume data upload, you reduce the number of reads from the dimension tables and hence increase the upload performance. Do not forget to set the number-range values back to their original values after the upload. Use transaction code SNRO to maintain the number range buffer values for InfoCubes.
13)Drop the indexes before uploading high-volume data into InfoCubes. Regenerate them after the upload. Indexes on InfoCubes are optimized for reading data from the InfoCubes. If the indexes exist during the upload, BW reads the indexes and tries to insert the records according to the indexes, resulting in poor upload performance. You can automate the dropping and regeneration of the indexes through InfoPackage scheduling. You can drop indexes in the Manage InfoCube screen in the Administrator Workbench.
14)IDoc (intermediate document) archiving improves the extraction and loading performance and can be applied on both BW and R/3 systems. In addition to IDoc archiving, data archiving is available for InfoCubes and ODS objects.
Hope it Helps
Chetan
@CP.. -
Data Load : Number of records count
Hi Experts,
I want to document the number of records transferred to BW during an InfoPackage execution.
I want to automate the process by running a report in the background which will fetch data from SAP tables about the number of records transferred by all my InfoPackages.
I would like to know how I should proceed.
I want to know the system tables that contain the same data that transaction RSMO displays.
Kindly help with valuable replies.

Hi,
In order to get the record-count report, you need to create a report based on the tables below:
RSSELDONE, RSREQDONE, RSLDPIOT, RSMONFACT
Check the link below, which explains this in detail, with the report code as well.
[Data load Quick Stats|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/90215bba-9a46-2a10-07a7-c14e97bdb764]
This doc also explains how to trigger a mail with the details to all.
Regards
KP -
Error in data loading from a 3rd-party source system with DBConnect
Hi,
We have just finished an upgrade of SAP BW 3.10 to SAP NW 7.0 EHP1.
After the upgrade, we are facing a problem with data loads from a third party Oracle source system using DBConnect.
The connection is working OK and we can see the tables in the source system. But we cannot load the data.
The error in the monitor is as follows:
'Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Extractor .
Refer to the error message.'
But, unfortunately, the error message has no further information.
If we look at the job log in sm37, the job finished with the following log -
27.10.2009 12:14:19 Job started 00 516 S
27.10.2009 12:14:19 Step 001 started (program RSBATCH1, variant &0000000000119, user ID RXSAHA) 00 550 S
27.10.2009 12:14:23 Start InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG RSM1 797 S
27.10.2009 12:14:24 Element NOAUTHORITYCHECK is not available in the container OL 356 S
27.10.2009 12:14:24 InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG created request REQU_4FMXSQ6TLSK5CYLXPBOGKF31G RSM1 796 S
27.10.2009 12:14:24 Job finished 00 517 S
In a BW 3.10 system, there is no message related to element NOAUTHORITYCHECK. So, I am wondering if this is something new in NW 7.0.
Thanks in advance,
Rajib

There are several things that can cause errors like this:
1. RFC connection failed.
2. Check the source system.
3. Check with the Oracle consultants whether they are filling up the loads; if so, tell them to stop.
4. Check IDoc processing.
5. Memory issues.
6. Catch the DataSource first, change it, then activate it and run the load again.
Also check the RFC connection in SM59. If it is OK, then
check SAP Note 692195 for authorization.
Santosh -
How to debug a transfer rule during data load?
I am conducting a flat file (Excel sheet saved as a CSV file) data load. The flat file contains a date field with the value '12/18/1988'. In the transfer rule for this field, I use a function call to convert this value to '19881218', which corresponds to the BW DATS format, but the InfoPackage monitor shows a red error:
"Value '1981218' of characteristic 0DATE is not a number with 000008 spaces".
Somehow, the last character of the year 1988 was cut off, so the year picked up is 198 rather than 1988. The function code is as follows:
FUNCTION zdm_convert_date.
*"Local Interface:
*"  IMPORTING
*"    REFERENCE(CHARDATE) TYPE STRING
*"  EXPORTING
*"    REFERENCE(DATE) TYPE D
  DATA:
    c_date(2)          TYPE c,
    c_month(2)         TYPE c,
    c_year(4)          TYPE c,
    c_date_combined(8) TYPE c.
  DATA: text(10).

  text = chardate.
* If the month is a single digit ('1/18/1988'), pad it with a leading
* zero so that text has the fixed layout MM/DD/YYYY.
* (Note: a single-digit day is not handled by this logic.)
  SEARCH text FOR '/'.
  IF sy-fdpos = 1.
    CONCATENATE '0' text INTO text.
  ENDIF.
* Slice the components out of MM/DD/YYYY.
  c_month = text(2).
  c_date  = text+3(2).
  c_year  = text+6(4).
* Reassemble as YYYYMMDD, the internal DATS layout.
  CONCATENATE c_year c_month c_date INTO c_date_combined.
  date = c_date_combined.
ENDFUNCTION.
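As a sanity check, the intended conversion can be sketched in Python (a hypothetical equivalent of the ABAP function above; note that splitting on '/' also handles a single-digit day, which the ABAP version does not):

```python
def convert_date(chardate: str) -> str:
    """Convert 'MM/DD/YYYY' (month/day may be unpadded) to 'YYYYMMDD' (DATS)."""
    month, day, year = chardate.split("/")
    return f"{year}{int(month):02d}{int(day):02d}"

print(convert_date("12/18/1988"))  # 19881218
print(convert_date("1/5/1988"))    # 19880105
```

With a full 10-character input the ABAP logic produces the same result, so the truncation reported in the error must happen before the function sees the value.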
Could the experts here tell me what is wrong, and also how to debug a transfer rule during a data load?
Thanks

Hey Bhanu/AHP,
I found the reason. Originally I set the character length for the date InfoObject ZCHARDAT1 to 9, and then noticed that the date field value (12/18/1988) has length 10. I changed the InfoObject ZCHARDAT1 length from 9 to 10 and activated it. But when defining the transfer rule for this field, before the code screen, I click the radio button "Selected Fields", pick the field /BIC/ZCHARDAT1, and continue to the transfer rule code screen, where the declaration lines for the InfoObject /BIC/ZCHARDAT1 still read:
InfoObject ZCHARDAT1: CHAR - 000009
/BIC/ZCHARDAT1(000009) TYPE C,
That means that even though I modified the length to 10 and activated the InfoObject, the transfer rule code screen still uses the old length 9. Any idea how to get it to pick up length 10 in the transfer rule code screen definition?
Thanks