InfoPackage issues
Hi all,
I looked at the monitor to check the 4 IPs (InfoPackages for updating from the ODS to the cube) under the InfoSource <b>8xxxx</b> under the data mart. I found that the init, full, delta, and my own delta-load IPs all show <b>0 from 0 records</b>.
The monitor shows them as successful with a green light. Why were 0 records loaded? It is supposed to be 50,000 records (which is what the IP for updating from the InfoSource into the ODS shows).
Shouldn't the same number of data records be loaded by the ODS IP (for updating to the ODS) and the cube IP (for updating to the cube)?
Thank you
Hi Eugene, Ashwin, MM,
My question was whether you can tell me why about 50,000 data records were inserted into the cube (from the ODS into the cube), yet I don't see any data when I check the InfoPackages, which show: 0 data inserted from 0 data records (init load, delta load, and full load from the 8xxx DataSource to the 8xxx InfoSource).
Yes, Eugene, I mean that in the Manage InfoCube screen my request from the InfoSource (8xxxxx) shows 0 transferred and 0 inserted records when I run the delta data load.
Yes, Ashwin, your explanation matches what the monitor screen shows for the delta load. But the same appears in the init-load monitor screen, even though I did have data in the cube. Is that correct? I was also wondering whether it is normal for the business to have no new transaction data for a few days.
MM, sorry, I am not sure what you mean; can you clarify? The init load was good, and then I did the delta load, which was good too. I checked both loads in the monitor: both show a green light, but 0 data records inserted from 0 records, even though I have data in the cube. I also created a process chain for the delta load.
Thank you
Similar Messages
-
Hi All,
I have an issue loading the InfoPackage. Initially the load failed in R/3 with the error SAPSQL_SQLS_INVALID_CURSOR.
So we did an init without data transfer and then a 3-day full request, which did retrieve data.
After this we ran the delta load, but the job keeps running for hours in R/3 with no records retrieved!
Could anyone please advise on this?
Many Thanks!
Regards,
Syed

Hi Ashish/Binu,
Thanks a lot for your replies!
As you said, the 'raise no more data' exception was already there.
So we found the solution: there is a custom table that maintains the last date of the delta update, but in production the entry was missing. The date has now been entered correctly.
The issue has been fixed!
Many Thanks!
Regards,
Syed -
BI 7.0 data load issue: InfoPackage can only load data to PSA?
BI 7.0 backend extraction gurus,
We created a generic DataSource on R/3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
After the transformation between the InfoSource and the ODS was created on this BI system, a new folder called "Data Transfer Process" appeared under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab, we picked 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. Clicking it (note: we had not created any InfoPackage yet) appears to run the data load, but no data shows up even though every status is green (we do have a couple of records in the R/3 table).
Then we tried to create an InfoPackage. On the Processing tab, the 'Only PSA' radio button is checked and all the others, such as 'PSA and then into Data Targets (Package by Package)', are dimmed! On the Data Target tab, the ODS can't be selected as a target! There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under the column 'DTP(S) are active and load to this target' there is an inactive icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in; but since 'Only PSA' is checked on the Processing tab with everything else dimmed, no data reaches this ODS! Why can 'Only PSA' be checked in BI 7.0 with all the other options dimmed?
Many new features in BI 7.0! Anyone's ideas or experience on how to load data in BI 7.0 are greatly appreciated.

You don't have to select anything.
Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the latest PSA request that has not yet been transferred.
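As a rough illustration (a toy model in Python, not SAP code; all class and method names here are invented), the difference between a full and a delta DTP load from the PSA can be sketched like this:

```python
class Psa:
    """Toy PSA: holds numbered requests of rows (illustrative only)."""
    def __init__(self):
        self.requests = []                      # list of (request_id, rows)

    def add_request(self, rows):
        self.requests.append((len(self.requests) + 1, rows))


class Dtp:
    """Toy DTP: 'full' reads every PSA request, 'delta' only the new ones."""
    def __init__(self, psa):
        self.psa = psa
        self.last_loaded = 0                    # highest request id already transferred

    def load(self, mode):
        if mode == "full":
            picked = self.psa.requests
        else:                                   # "delta"
            picked = [r for r in self.psa.requests if r[0] > self.last_loaded]
        if picked:
            self.last_loaded = picked[-1][0]    # advance the delta pointer
        return [row for _, rows in picked for row in rows]
```

A delta load after new requests arrive picks up only those requests, while a full load always rereads the whole PSA.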
Go through these links for lucid explanations:
Infopackage -
http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
Creating DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
<b>Pre-requisite-</b>
You have used transformations to define the data flow between the source and target object.
Creating transformations-
http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
Hope it Helps
Chetan
@CP.. -
Error while Scheduling InfoPackage
When I schedule a full load in the InfoPackage to load data to the PSA, I get a DBIF_RSQL_INVALID_REQUEST error. The description is: "An invalid request was made to the SAP database interface in a statement in which the table "EDID4" was accessed".
Can anyone help me overcome this and resolve the issue, please.
Thanks
Neha

Check OSS Note 89384 to see if it is relevant to your issue.
-
InfoPackage IDocs in status 2 - could not find code page for receiver system
Hi,
We just migrated our production system from BW 7.01 non unicode to BW 7.4 on HANA.
We now encounter issues with IDocs while loading data into BW from our ECC5 source. When we analyze the IDocs in the source system, they show the message "could not find code page for receiver system".
One weird thing is that the IDoc seems to have been created before we started the InfoPackage in BW. We checked the system time and the application server time, and everything seems OK.
We did not encounter this issue in our previous migration test runs.
Hope someone can help
Christophe

Hi,
Thanks for responding. We finally found out what the problem was.
We have two applications servers on our ECC with 2 different OS. One of them could not reach the new BW HANA server.
Regards
Christophe -
Query performance and data-loading performance issues
What query performance issues do we need to take care of? Please explain and let me know the transaction codes. It's urgent.
What data-loading performance issues do we need to take care of? Please explain and let me know the transaction codes.
Will reward full points.
Regards
Guru

BW back end
Some tips:
1)Identify long-running extraction processes on the source system. Extraction processes are performed by several extraction jobs running on the source system. The run-time of these jobs affects the performance. Use transaction code SM37 Background Processing Job Management to analyze the run-times of these jobs. If the run-time of data collection jobs lasts for several hours, schedule these jobs to run more frequently. This way, less data is written into update tables for each run and extraction performance increases.
2)Identify high run-times for ABAP code, especially for user exits. The quality of any custom ABAP programs used in data extraction affects the extraction performance. Use transaction code SE30 ABAP/4 Run-time Analysis and then run the analysis for the transaction code RSA3 Extractor Checker. The system then records the activities of the extraction program so you can review them to identify time-consuming activities. Eliminate those long-running activities or substitute them with alternative program logic.
3)Identify expensive SQL statements. If database run-time is high for extraction jobs, use transaction code ST05 Performance Trace. On this screen, select ALEREMOTE user and then select SQL trace to record the SQL statements. Identify the time-consuming sections from the results. If the data-selection times are high on a particular SQL statement, index the DataSource tables to increase the performance of selection (see no. 6 below). While using ST05, make sure that no other extraction job is running with ALEREMOTE user.
4)Balance loads by distributing processes onto different servers if possible. If your site uses more than one BW application server, distribute the extraction processes to different servers using transaction code SM59 Maintain RFC Destination. Load balancing is possible only if the extraction program allows the option.
5)Set optimum parameters for data-packet size. Packet size affects the number of data requests to the database. Set the data-packet size to optimum values for an efficient data-extraction mechanism. To find the optimum value, start with a packet size in the range of 50,000 to 100,000 and gradually increase it. At some point, you will reach the threshold at which increasing packet size further does not provide any performance increase. To set the packet size, use transaction code SBIW BW IMG Menu on the source system. To set the data load parameters for flat-file uploads, use transaction code RSCUSTV6 in BW.
6)Build indexes on DataSource tables based on selection criteria. Indexing DataSource tables improves the extraction performance, because it reduces the read times of those tables.
7)Execute collection jobs in parallel. Like the Business Content extractors, generic extractors have a number of collection jobs to retrieve relevant data from DataSource tables. Scheduling these collection jobs to run in parallel reduces the total extraction time, and they can be scheduled via transaction code SM37 in the source system.
8). Break up your data selections for InfoPackages and schedule the portions to run in parallel. This parallel upload mechanism sends different portions of the data to BW at the same time, and as a result the total upload time is reduced. You can schedule InfoPackages in the Administrator Workbench.
You can upload data from a data target (InfoCube or ODS) to another data target within the BW system. While uploading, you can schedule more than one InfoPackage, each with different selection options; for example, fiscal year or fiscal year/period can be used as selections. Avoid parallel uploads for high volumes of data if hardware resources are constrained. Each InfoPackage uses one background process (if scheduled to run in the background) or dialog process (if run online) of the application server, and too many processes could overwhelm a slow server.
9). Building secondary indexes on the tables for the selection fields optimizes these tables for reading and reduces extraction time. If your selection fields are not key fields on the table, primary indexes are not much help when accessing data. In this case it is better to create secondary indexes on the selection fields of the associated table using the ABAP Dictionary to improve selection performance.
10)Analyze upload times to the PSA and identify long-running uploads. When you extract the data using PSA method, data is written into PSA tables in the BW system. If your data is on the order of tens of millions, consider partitioning these PSA tables for better performance, but pay attention to the partition sizes. Partitioning PSA tables improves data-load performance because it's faster to insert data into smaller database tables. Partitioning also provides increased performance for maintenance of PSA tables for example, you can delete a portion of data faster. You can set the size of each partition in the PSA parameters screen, in transaction code SPRO or RSCUSTV6, so that BW creates a new partition automatically when a threshold value is reached.
11)Debug any routines in the transfer and update rules and eliminate single selects from the routines. Using single selects in custom ABAP routines for selecting data from database tables reduces performance considerably. It is better to use buffers and array operations. When you use buffers or array operations, the system reads data from the database tables and stores it in the memory for manipulation, improving performance. If you do not use buffers or array operations, the whole reading process is performed on the database with many table accesses, and performance deteriorates. Also, extensive use of library transformations in the ABAP code reduces performance; since these transformations are not compiled in advance, they are carried out during run-time.
12)Before uploading a high volume of transaction data into InfoCubes, activate the number-range buffer for dimension IDs. The number-range buffer is a parameter that identifies the number of sequential dimension IDs stored in the memory. If you increase the number range before high-volume data upload, you reduce the number of reads from the dimension tables and hence increase the upload performance. Do not forget to set the number-range values back to their original values after the upload. Use transaction code SNRO to maintain the number range buffer values for InfoCubes.
13)Drop the indexes before uploading high-volume data into InfoCubes. Regenerate them after the upload. Indexes on InfoCubes are optimized for reading data from the InfoCubes. If the indexes exist during the upload, BW reads the indexes and tries to insert the records according to the indexes, resulting in poor upload performance. You can automate the dropping and regeneration of the indexes through InfoPackage scheduling. You can drop indexes in the Manage InfoCube screen in the Administrator Workbench.
14)IDoc (intermediate document) archiving improves the extraction and loading performance and can be applied on both BW and R/3 systems. In addition to IDoc archiving, data archiving is available for InfoCubes and ODS objects.
Hope it Helps
Chetan
@CP.. -
Exception currencies TCURX table, R3 - BW loading issue
Hello Friends,
I am facing an issue with the report values for the 'amount in local currency' key figure. Its value does not match the R/3 values: the decimal places are shifted in the report.
We are on BI7 (RSDS data source).
There is data coming from R/3 into one DSO from several different DataSources (Sales, Billing, Purchasing, etc.), which is then loaded to a cube for reporting.
A few DataSources are standard ones such as LIS, and a few are Z (custom) DataSources.
This issue must be related to the exception currencies maintained in table TCURX, because we only get issues for those currencies.
For example, see how the R/3 and report values differ:
R/3 value: 71.230,00 KWD
Cube value: 71.230,00 KWD
Report value: 7.123,000 KWD
BW report is showing discrepancy for KWD, CLP (TCURX) currencies for Actual amount in LC key figure.
Can you please explain how to fix this?
I have heard that for ECC-to-BW loads the respective customization needs to be taken into account in the source system itself.
So can we ask our R/3 team to fix this issue for the exception currencies on the R/3 side? And is there any particular information we will have to share with the R/3 team to get this corrected?
Please share your inputs on this.
Thank you.

Hi P.K,
We too faced the same issue in our last project; we followed the procedure I attach below.
Please go through the attached document and understand.
Issue description: SAP stores all amount values, in any currency, with a fixed interpretation of two decimal places. There are a few exceptions for currencies for which decimal places have no meaning, such as the Chilean Peso, Japanese Yen, South Korean Won, and Paraguayan Guarani, among others discussed in the sections below. A few currencies also need their fractions stored with more than 2 decimals. This document describes the procedures to follow to get correct reporting.
Analysis provided: Kenya sales KPI values are displayed multiplied by 100.
This document describes the procedure to handle the currencies which has the decimal places other than 2 in the table TCURX.
About TCURX table:
Currencies which do not have two decimal places must be defined in table TCURX (decimal places for currency codes). The table determines the number of decimal places in the output according to the currency key. If the contents of currency exist in table TCURX as currency key CURRKEY, the system sets the number of decimal places according to the entry for the field CURRDEC in TCURX. You can maintain this table via the R/3 Implementation Guide (Global Settings -> Currencies, Transaction OY04).
If a currency is not defined in Table TCURX (decimal places for currency codes), this currency is regarded as currency with two decimal places.
Impact of TCURX in BW on data:
Currently the following currencies are listed in the TCURX table in my system:
[Details of the currencies]
Some of the currencies in the table above are obsolete.
Data load from Source to BW:
In the Source I have the data like below:
For one company, the total net sales for the period Jan '12 is 675772,43 CLP (Chilean Peso) in the source system.
When the data is loaded into BW:
The data in the Source and in the Cube is matching each other.
Data reported in the query looks like below:
The data in the query gets multiplied by 100, resulting in 67577243.00 CLP. This is because, when we execute the query, it looks up the currency in table TCURX. If the entry there is 0, it multiplies by 100, since SAP normally stores all currencies with two decimal places.
Corrective measures / steps which user should take before raising an issue:
This behavior is due to the fact that the Query always looks at the TCURX table while execution and if there is an entry in TCURX then it will be considered as below:
[Amount Value] * [10 ** (2- (CURRDEC entry in TCURX table))]
In present case as the entry is marked as 0 in TCURX table the result will be like below:
= Amount*10**(2-0) = Amount*10**2 = Amount * 100
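The arithmetic above can be checked with a short sketch (Python with Decimal, purely illustrative; the function names are made up and are not SAP code):

```python
from decimal import Decimal

def report_value(stored, currdec):
    """Query output: stored amount multiplied by 10**(2 - CURRDEC)."""
    return stored * Decimal(10) ** (2 - currdec)

def stored_value(external, currdec):
    """Load in 'external format': divide by the same factor before storing."""
    return external / Decimal(10) ** (2 - currdec)

# CLP has CURRDEC = 0: a stored 675772.43 is shown as 67577243.00 in the report.
# KWD has CURRDEC = 3: a stored 71230.00 is shown as 7123.000.
```

This is the same factor in both directions, which is why a value loaded in external format survives the report-time multiplication unchanged.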
This will be corrected as below:
All amount-related entries at the DataSource level should be set to 'External Format'.
Earlier the same was like below with the internal format.
In 3.x we have the option to choose the same at the InfoPackage level with 'Currency Conversion for External Systems':
The above change in the data source will provide the below results:
During data loads: When we load the data into BW having amount fields in it, it checks the TCURX table for the currency entry. If the currency entry is present in TCURX table, it divides the amount value by 10** (2- (CURRDEC entry in TCURX table)).
Query Output: The exactly reverse happens while displaying such values in the report output. The amount value will be multiplied by 10** (2- (CURRDEC entry in TCURX table)) while displaying in the report output.
One of below example shows the results as below from the flat file when the format is changed to External.
From Flat file: Sample content from Flat file
In the Cube: Output from the Cube for the same entry
Values are divided by 100 because for currency IDR the number of decimal places is maintained as zero in the TCURX table.
In the query, the output looks like below:
The source value matches the report value, because the values from the cube get multiplied by 100 again.
Points to be noted: this procedure applies only to amounts with the CURR data type.
If the amounts are stored with the FLTP type, this issue is avoided.
All the currencies having entry other than 2 in the field CURRDEC of table TCURX will be affected by this phenomenon.
The division is carried out at load time, while the multiplication is carried out when the report output is displayed.
This mainly needs to be considered for flat-file data loads that contain such currencies; for ECC-to-BW loads, the respective customization needs to be taken into account on the source system side.
Regards,
Vijay -
Issue with a load from R3 to BW 3.5
Hi Guys,
We are having an issue here with a load from R/3 to BW 3.5 into an ODS and a transactional InfoCube.
On a daily basis we run loads to BW from R/3 InfoSets, and all of them but one load fine.
The one having problems is actually the one that loads the most data, so its InfoPackage is divided into two InfoPackages.
In the update rule of the first ODS we run an initial routine in which we perform RSDRD_SEL_DELETION (selective deletion) of the data to be loaded to both the ODS and the InfoCube, and this is actually where we get the core dump.
Our first assumption was that a yellow request in the transactional InfoCube might be blocking the selective deletion, but we have ruled this out.
We think that, since the only failing load is the one divided into two InfoPackages, the problem might be that while the first one is trying to delete, the second one is loading data into the ODS, and that is where we get the dump.
The question here is: could this be the problem? How could we work around it if this is the case?
Thanks for any suggestion.

Hi,
Be careful with ODS activation: if you're using two InfoPackages and/or deleting data from an ODS that is locked for activation, you'll get a dump.
Check SM12 / ST22 / SM21 and the job logs in SM37.
And if the problem still occurs for a load, debug it in the start routine.
Regards, -
Infopackage in process chain taking long time to run
HI Experts,
One of the elements (an InfoPackage) in the daily process chain is taking much longer than usual (6 hours and still running, when it generally takes 15-20 minutes). The status is yellow, and it keeps running in the process monitor without giving any clear picture of an error. Manually we set the status to red and update from the PSA, and then it completes in the expected time.
The flow is from PSA to data target (DSO) in series.
For the last week we have been facing the same issue, and I would like to mention that we don't have access to SM37 or SM12 to look at the logs and locks.
Without this, I need to investigate the root cause.
with regards,
murali

Hi,
please find the job log:
Date        Time      Message                                                                        MsgID/No./Ty
13.12.2010  21:45:30  Job started                                                                    00/516/S
13.12.2010  21:45:30  Step 001 started (program SBIE0001, variant &0000000109368, user ID RFCUSER)   00/550/S
13.12.2010  21:45:30  Asynchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)       R3/413/S
13.12.2010  21:45:30  DATASOURCE = 2LIS_17_I3HDR                                                     R3/299/S
13.12.2010  21:45:30  RLOGSYS = PBACLNT200                                                           R3/299/S
13.12.2010  21:45:30  REQUNR = REQU_D9GSSV08BTS93Z6Y7XWP1ZEUJ                                        R3/299/S
13.12.2010  21:45:30  UPDMODE = D                                                                    R3/299/S
13.12.2010  21:45:30  LANGUAGES = *                                                                  R3/299/S
13.12.2010  21:45:30  Current Values for Selected Profile Parameters *                               R8/049/S
13.12.2010  21:45:30  abap/heap_area_nondia......... 0 *                                             R8/050/S
13.12.2010  21:45:30  abap/heap_area_total.......... 25500319744 *                                   R8/050/S
13.12.2010  21:45:30  abap/heaplimit................ 40000000 *                                      R8/050/S
13.12.2010  21:45:30  zcsa/installed_languages...... ED *                                            R8/050/S
13.12.2010  21:45:30  zcsa/system_language.......... E *                                             R8/050/S
13.12.2010  21:45:30  ztta/max_memreq_MB............ 2047 *                                          R8/050/S
13.12.2010  21:45:30  ztta/roll_area................ 3000320 *                                       R8/050/S
13.12.2010  21:45:30  ztta/roll_extension........... 2000000000 *                                    R8/050/S
13.12.2010  21:45:31  70 LUWs confirmed and 70 LUWs to be deleted with function module RSC2_QOUT_CONFIRM_DATA  RSQU/036/S
13.12.2010  21:45:33  Call customer enhancement BW_BTE_CALL_BW204010_E (BTE) with 9,895 records      R3/407/S
13.12.2010  21:45:33  Result of customer enhancement: 9,895 records                                  R3/408/S
13.12.2010  21:45:33  Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 9,895 records          R3/407/S
13.12.2010  21:45:33  Result of customer enhancement: 9,895 records                                  R3/408/S
13.12.2010  21:45:33  PSA=0 USING & STARTING SAPI SCHEDULER                                          R3/299/S
13.12.2010  21:45:33  Asynchronous send of data package 1 in task 0002 (1 parallel tasks)            R3/409/S
13.12.2010  21:45:34  IDOC: Info IDoc 2, IDoc No. 5136702, Duration 00:00:00                         R3/088/S
13.12.2010  21:45:34  IDoc: Start = 13.12.2010 21:45:30, End = 13.12.2010 21:45:30                   R3/089/S
13.12.2010  21:45:35  Asynchronous transmission of info IDoc 3 in task 0003 (1 parallel tasks)       R3/413/S
13.12.2010  21:45:35  Altogether, 0 records were filtered out through selection conditions           RSQU/037/S
13.12.2010  21:45:35  IDOC: Info IDoc 3, IDoc No. 5136703, Duration 00:00:00                         R3/088/S
13.12.2010  21:45:35  IDoc: Start = 13.12.2010 21:45:35, End = 13.12.2010 21:45:35                   R3/089/S
13.12.2010  21:55:37  tRFC: Data Package = 1, TID = 0AF0842B00C84D0694000F00, Duration = 00:10:03, ARFCSTATE = SYSFAIL  R3/038/S
13.12.2010  21:55:37  tRFC: Start = 13.12.2010 21:45:34, End = 13.12.2010 21:55:37                   R3/039/S
13.12.2010  21:55:37  Synchronized transmission of info IDoc 4 (0 parallel tasks)                    R3/414/S
13.12.2010  21:55:37  IDOC: Info IDoc 4, IDoc No. 5136717, Duration 00:00:00                         R3/088/S
13.12.2010  21:55:37  IDoc: Start = 13.12.2010 21:55:37, End = 13.12.2010 21:55:37                   R3/089/S
13.12.2010  21:55:37  Job finished                                                                   00/517/S
-
Function module not extracting records when the InfoPackage is executed
Hi,
I'm using a function module for extracting data. The DataSource works fine in RSA3 with a selection criterion, but when I schedule the InfoPackage with the same selections on the data selection tab, no records come into BW. On the Details tab I get the message "Error occurred in the data selection", so I guess there is a problem with the data selection. The code I'm using for extraction is pasted below; can anyone please let me know if there is any issue with it?
Also, how is the value of I_MAXSIZE passed into the function module (the remaining parameters get their values from BW)? And can anyone please explain the concept of s_counter_datapakid: when should it be incremented, and where should it sit in the code?
Thanks,
AM
* Auxiliary selection criteria structure
  DATA: l_s_select TYPE srsc_s_select.

* Maximum number of lines for DB table
  STATICS: s_s_if TYPE srsc_s_if_simple,
* counter
           s_counter_datapakid LIKE sy-tabix,
* cursor
           s_cursor TYPE cursor.

* Initialization mode (first call by SAPI) or data transfer mode
* (following calls)?
  IF i_initflag = sbiwa_c_flag_on.

* Initialization: check input parameters
*                 buffer input parameters
*                 prepare data selection

* Check DataSource validity
    CASE i_dsource.
      WHEN 'XXXXX'.
      WHEN OTHERS.
*       This is a typical log call. Please write every error message like this
        log_write 'E'         "message type
                  'R3'        "message class
                  '009'       "message number
                  i_dsource   "message variable 1
                  ' '.        "message variable 2
        RAISE error_passed_to_mess_handler.
    ENDCASE.

    APPEND LINES OF i_t_select TO s_s_if-t_select.

* Fill parameter buffer for data extraction calls
    s_s_if-requnr  = i_requnr.
    s_s_if-dsource = i_dsource.
    s_s_if-maxsize = i_maxsize.

* Fill field list table for an optimized select statement
* (in case there is no 1:1 relation between InfoSource fields
* and database table fields this may be far from being trivial)
    APPEND LINES OF i_t_fields TO s_s_if-t_fields.

  ELSE.
* Data transfer: first call       OPEN CURSOR + FETCH
*                following calls  FETCH only

* First data package -> OPEN CURSOR
    IF s_counter_datapakid = 0.
      g_first = 'X'.

* Fill range tables. BW will only pass down simple selection criteria
* of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
      LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'BWLVS'.
        IF l_s_select-low <> '101' AND
           l_s_select-low <> '109'.
          RAISE invalid_movement_type.
        ELSE.
          MOVE-CORRESPONDING l_s_select TO r_im_bwlvs.
          APPEND r_im_bwlvs.
        ENDIF.
      ENDLOOP.

* Reorder the incoming date selections into internal YYYYMMDD format
      LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'BEGDA'.
        MOVE-CORRESPONDING l_s_select TO r_im_begda.
        r_im_begda-low+0(4) = l_s_select-low+6(4).
        r_im_begda-low+4(2) = l_s_select-low+0(2).
        r_im_begda-low+6(2) = l_s_select-low+3(2).
        APPEND r_im_begda.
      ENDLOOP.

      LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'ENDDA'.
        MOVE-CORRESPONDING l_s_select TO r_im_endda.
        r_im_endda-low+0(4) = l_s_select-low+6(4).
        r_im_endda-low+4(2) = l_s_select-low+0(2).
        r_im_endda-low+6(2) = l_s_select-low+3(2).
        APPEND r_im_endda.
      ENDLOOP.

* Select from table LTAK into internal table i_ltak the following fields.
* Condition: where bwlvs = import bwlvs and endat >= import begda and
* endat <= import endda.
      READ TABLE r_im_begda INDEX 1.
      IF sy-subrc NE 0.
        RAISE blank_date_invalid.
      ELSE.
        IF r_im_begda-low IS INITIAL.
          RAISE blank_date_invalid.
        ENDIF.
      ENDIF.

      READ TABLE r_im_endda INDEX 1.
      IF sy-subrc NE 0.
        RAISE blank_date_invalid.
      ELSE.
        IF r_im_endda-low IS INITIAL.
          RAISE blank_date_invalid.
        ENDIF.
      ENDIF.

      IF NOT r_im_endda-low IS INITIAL
         AND r_im_begda-low > r_im_endda-low.
        RAISE invalid_date.
      ENDIF.

      r_endat-sign   = 'I'.
      r_endat-option = 'BT'.
      r_endat-low    = r_im_begda-low.
      r_endat-high   = r_im_endda-low.
      APPEND r_endat.

*-- Get data from the tables
      PERFORM get_data.
*-- Populate the extract structure with data
      PERFORM write_data.
*-- Fill export table with the extracted data
      e_t_data[] = i_zbipicprd2[].

    ENDIF. "First data package?

    IF g_first EQ 'X'.
      CLEAR g_first.
    ELSE.
*-- This is important to prevent an endless loop.
      RAISE no_more_data.
    ENDIF.

    s_counter_datapakid = s_counter_datapakid + 1.

  ENDIF. "Initialization mode or data extraction?

ENDFUNCTION.

The load is running successfully, but it's not fetching any records into BW.
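On the s_counter_datapakid / no_more_data question: the SAPI scheduler calls the extractor function module once per data package until the extractor raises the no-more-data exception. A toy model of that loop, in Python rather than ABAP (all names here are invented, not SAP code), looks like this:

```python
class NoMoreData(Exception):
    """Stands in for ABAP's RAISE no_more_data."""

def make_extractor(rows, maxsize):
    """Return a fetch() the scheduler calls repeatedly, one packet per call."""
    state = {"counter": 0}                     # plays the role of s_counter_datapakid
    def fetch():
        start = state["counter"] * maxsize     # I_MAXSIZE caps each packet's size
        packet = rows[start:start + maxsize]
        if not packet:
            raise NoMoreData                   # otherwise the caller loops forever
        state["counter"] += 1                  # incremented once per delivered packet
        return packet
    return fetch

def run_load(fetch):
    """Mimics the SAPI loop: keep calling fetch() until NoMoreData."""
    packets = []
    while True:
        try:
            packets.append(fetch())
        except NoMoreData:
            return packets
```

In the posted ABAP, all data is returned in the first package and no_more_data is raised on the second call via the g_first flag; without raising that exception, the SAPI loop would never end.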
On the Status tab, under Step-by-Step Analysis, the following three steps are red:
Does selectable data exist in the source system? (Red)
Data selection successfully started ? (Red)
Data selection successfully finished ? (Red)
and below is the analysis:
Diagnosis
The data request was a full update. In this case, the corresponding table in the source system does not contain any data.
System Response
Info IDoc received with status 8.
Procedure
Check the data basis in the source system.
But I have data in R/3, and I'm able to extract it using RSA3. -
Error While loading into BW using infoPackage
Hi All,
While following directions posted at the following wiki page https://wiki.sdn.sap.com/wiki/display/BOBJ/ConfiguretheLoadJobinBW%28DS+3.2%29,
we get an error while loading data into BW using an InfoPackage via Data Services 3.2.
On the SAP BI side the error says: "Error while executing the following command: -sJO return code:220".
On the DS side, in the RFC server log, we get the error: "Error while executing the following command -sJOB_SQL_TO_DBI -Ubbu -P;(encrypted password) -A (encrypted information) -VMODE=F".
The issue is still open with support, as we still cannot execute the job from the InfoPackage; it executes successfully from the Designer, the admin console, and a batch file.
Any insights/help would be appreciated.
Thanks.
Edited by: rana_accn on Jan 26, 2010 6:48 PM

Sure I will.
One thing we have noticed in the server_eventlog is that the command-line parameters in the "Starting job with command line" entry are different:
When the job is executed from the DS side, the parameters -Ct -Cm -Ca -Cj -Cp -S -N -Q -U -P -G -r -T -Locale are used.
When the job is executed from the BW side, the parameters -Ct -Cm -Ca -Cj -Cp -s -N -Q are used.
Also, when the job is executed from BW, the log files start getting created, but then the trace and error files show an error that the database cannot be opened, and that points to the DSCentral repository. This repository is not even mapped to the job server, and the job was exported from the correct repository, since command-line execution of the job works perfectly fine.
Thnx
Just an update: we were never able to figure out why this happened in the existing environment. One explanation offered was that the commands executed from the BW side were somehow getting corrupted. We had to build a new environment, and the issue is gone there; all jobs can now be executed from BW. The only difference between the two environments is that we are now on 12.2.1.3, whereas earlier we were on 12.2.0.0.
Edited by: rana_accn on Mar 23, 2010 10:21 PM
Edited by: rana_accn on Apr 29, 2010 9:46 PM -
Error while monitoring the InfoPackage load (URGENT)
Hi All,
I am monitoring one process chain in which one InfoPackage load has been running for 1h 45m. I have checked for short dumps, but there is no timeout issue, and when I check the job status on the R/3 side in SM37 it says no job matches this selection criterion.
Can anyone please help me out with this issue?
Thanks
Karuna
Hello,
Check the background jobs (SM37) by giving a date range starting one day earlier on the BW side. Also go to RSMO -> Details tab and check whether IDocs are still being received, and in the Status tab check the waiting time of the job before cancellation.
[Thanks|http://chandranonline.blogspot.com/]
[Chandran|http://chandranonline.blogspot.com/] -
Error when scheduling infopackage
hi all,
when I am trying to schedule an InfoPackage it gives an error like:
Runtime Errors DBIF_RSQL_SQL_ERROR
Except. CX_SY_OPEN_SQL_DB
SQL ERROR IN DATABASE WHEN ACCESSING A TABLE.
This error is occurring in the production system. Please guide me on how to resolve this issue.
regards
Vamshi D Krishna
Hi,
this type of error usually occurs when a tablespace or disk space is not sufficient.
Please check your Oracle database tablespaces; try increasing the space and run your process again.
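As a rough illustration of the check above, free space per tablespace on an Oracle-based system can be listed from ABAP with a native SQL cursor. This is only a sketch: DB02 (or BRTOOLS) is the standard tool, the report name is made up, and reading DBA_FREE_SPACE requires appropriate database privileges.

```abap
REPORT z_check_tablespace.
* Sketch only: list free space (MB) per Oracle tablespace.
* DB02 is the usual way to do this; shown here purely for illustration.
DATA: lv_tsname  TYPE c LENGTH 30,
      lv_free_mb TYPE p DECIMALS 0.

EXEC SQL.
  OPEN c1 FOR
    SELECT tablespace_name, ROUND( SUM( bytes ) / 1048576 )
      FROM dba_free_space
     GROUP BY tablespace_name
ENDEXEC.

DO.
  EXEC SQL.
    FETCH NEXT c1 INTO :lv_tsname, :lv_free_mb
  ENDEXEC.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  WRITE: / lv_tsname, lv_free_mb, 'MB free'.
ENDDO.

EXEC SQL.
  CLOSE c1
ENDEXEC.
```

If a tablespace shows very little free space, extending it (via BRTOOLS or the DBA) is what usually resolves the DBIF_RSQL_SQL_ERROR dump during scheduling.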
Thanks. -
Error while opening the Infopackage
Hi All,
When I try to open the InfoPackage, it throws a short dump - "MESSAGE TYPE X".
I have tried deleting the IP and creating a new one, but I still face the same issue.
Thanks,
Ramanathan.R
Hi,
Are you trying to open an init package?
When an init data load fails for an InfoPackage, you can sometimes get short dumps when running it.
When the init fails and you have to reset the init conditions, the symptom is a Message Type X dump when trying to open the InfoPackage, with the dump occurring in RSM1_CHECK_FOR_DELTAUPD.
Do the following:
In the BW system:
Tables RSSDLINIT and RSSDLINITSEL have the init conditions stored. Delete these records from BW.
In SAP R/3:
Tables ROOSPRMSF and ROOSPRMSC have the init conditions - delete these entries.
Now you can open your InfoPackage.
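For illustration only, the BW-side deletions described above could be sketched in ABAP as below. Treat this as a sketch, not a production script: the selection field names (LOGSYS, OLTPSOURCE) and the sample values are assumptions that should be verified in SE11, and the tables should be backed up before any deletion.

```abap
* Sketch (BW side): remove stored init conditions for one DataSource.
* Field names LOGSYS / OLTPSOURCE are assumptions - verify in SE11 first,
* and back up the table entries before deleting anything.
DATA: lv_source TYPE c LENGTH 30 VALUE '8XXXX',      " sample DataSource
      lv_logsys TYPE c LENGTH 10 VALUE 'R3CLNT100'.  " sample source system

DELETE FROM rssdlinit
  WHERE logsys     = lv_logsys
    AND oltpsource = lv_source.
DELETE FROM rssdlinitsel
  WHERE logsys     = lv_logsys
    AND oltpsource = lv_source.
COMMIT WORK.

* On the R/3 side the corresponding entries live in ROOSPRMSF and
* ROOSPRMSC and would be removed the same way.
```

After both sides are cleaned up, the InfoPackage should open again and a fresh init can be scheduled.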
Regards,
Kams -
PSA deletion variant issue for the BW DataSource 80CUST_SALEST
Hi all,
I am trying to create a PSA deletion variant for the DataSource 80CUST_SALEST, which is a BW export DataSource.
We are on BI EHP1 7.01 SP3. I am getting an error like "Object name 0CUST_SALEST after the prefix start with ABC...XYZ", and it is not letting me save the PSA variant.
I also had a similar issue when creating variants for 1_COPA DataSources, which was resolved by OSS note 1309619.
Can you please suggest or advise me of any OSS notes/remedies for this issue.
Can't you create the variant referring to the InfoPackage instead of the PSA table?
M.