PSA Performance
Hi All,
We have one particular load that pulls from our APO system into the PSA and finally pushes to a cube. The pull from APO and the load to the PSA take the expected time, but the push from the PSA to the cube takes up to 16 hours. When I view the process in SM66 I see it is stuck on a sequential read of table /BIC/B0000250000. I'm assuming this is a partition of the PSA.
Anyone have an idea why it would take so long to read from the PSA for just this load? We have other loads that do the same thing from the same system that run in a normal amount of time. For some reason it looks like the partition in the PSA gets screwed up for this load. I'm assuming that it is missing an index or the stats are not up to date. Can anyone let me know how to check those for a partition? And why would the indexes and stats not get updated for just this load when others work fine?
Thanks in advance.
Hi Jason,
Check whether the database statistics are current for that PSA table in transaction DB20; you can bring them up to date with 'Create Statistics' (the blank-page icon).
If that doesn't help, consult with Basis and check the profile parameter rsdb/max_in_blocking_factor (default value = 1000; try 100?).
Also make sure the InfoCube indexes are deleted before data loading and re-created after loading.
OSS Note 567747 may be useful - Composite note BW 3.x performance: Extraction & loading.
Hope this helps.
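If Basis does lower rsdb/max_in_blocking_factor, the rough effect is that BW splits long key lists into smaller SQL IN-lists when reading from the PSA. A hypothetical sketch of that chunking (the parameter's exact semantics are database-dependent, so treat this purely as an illustration):

```python
def in_list_chunks(values, blocking_factor):
    """Split a list of key values into IN-list chunks of at most
    blocking_factor entries, one chunk per generated SQL statement."""
    return [values[i:i + blocking_factor]
            for i in range(0, len(values), blocking_factor)]

# 2,500 keys with the default factor of 1000 -> 3 IN-lists (1000 / 1000 / 500)
chunks = in_list_chunks(list(range(2500)), 1000)
```

A smaller factor means more, shorter statements, which can help the optimizer pick an index instead of a full scan.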
Similar Messages
-
Performance Appraisal - Add PA, PSA, EG and ESG to PHAP_ADMIN_PA
Hello
There is a requirement to add Personnel Area, Personnel Subarea, Employee Group, Employee subgroup to the output of the report generated by PHAP_ADMIN_PA.
What steps do we need to take in order to meet this business need? Also, it would be great if you could comment on the level of effort in man-hours, if possible.
Many thanks in advance.
Hi Jothi,
I appreciate your feedback. However, just to clarify, I am looking to add the functionality to PHAP_ADMIN_PA on the R/3 side (not on the portal side). The transaction code generates the output for the following columns by default. I would like to add <b>PA, PSA, EG, ESG</b> and <b>Appraisal Substatus</b> to the output for meeting our business needs.
Number
Appraisal Document Name
Appraisal Document Type
Appraiser Type
Appraiser Name
Appraisee Type
Appraisee Name
Appraisal Status
Period
To
Objective Setting Date
Last Changed On
Last Change Time
Last Changed by User
I hope that makes more sense now. Please advise on the best approach to achieve this.
Also, the implementation PERSONNEL_HEADER_DATA for the BAdI HRHAP00_ADD_HEADER has been included in our template and brings in the data as expected.
Many thanks in advance. -
Bad loading performance CO-PA Extractor (PSA) to InfoCube
Hello Experts,
I activated a CO-PA Extractor for one Controlling Area. I activated all VV* key figures (approx. 200 KFs) and around 30 InfoObjects.
I have loaded the Fiscal Year 2010 from the Source System into BW. When looking at the PSA the Package Size of each Package is around 5000 Data Records (in total appr. 14 000 000 Data Records).
When I load the data into the InfoCube via a 1:1 transformation and DTP, the load takes more than 14 hours, because the DTP loads the data package by package (around 2,800 packages) into the InfoCube.
I have already increased the number of batch processes and deleted the InfoCube indexes, but the load still takes more than 12 hours.
I guess the solution would be to increase the number of data records per package, but when loading from the DataSource into the InfoCube you cannot choose the package size.
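The arithmetic behind the long runtime can be sketched with a simple cost model (the per-record and per-package costs below are illustrative assumptions, not measured values): with ~14,000,000 records at ~5,000 per package you get ~2,800 packages, and a fixed per-package overhead quickly dominates.

```python
# Rough runtime model for the DTP load: per-record cost plus a fixed
# per-package overhead (commits, index maintenance, dictionary lookups).
# Both cost constants are made-up assumptions for illustration only.
records = 14_000_000

def runtime_hours(package_size, per_record_s=0.002, per_package_s=15.0):
    packages = -(-records // package_size)  # ceiling division
    return (records * per_record_s + packages * per_package_s) / 3600.0

small_packages = runtime_hours(5_000)    # ~2,800 packages
large_packages = runtime_hours(50_000)   # ~280 packages
```

Under these assumed constants, fewer and larger packages cut the fixed overhead substantially, which is why increasing the package size (where the DTP settings allow it) is the usual lever.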
Can you help me, please?
Thank you
Stefan
Hi,
refer below thread
For Cost Center accounting, the most important datasources are 0CO_OM_CCA_9 (actual costs - line items) [http://help.sap.com/saphelp_nw04s/helpdata/en/8f/d93c00e39d5b4fb6eaa3bf16e6fb09/frameset.htm]
and 0CO_OM_CCA_1 (plan and actual costs - totals per fiscal period) [http://help.sap.com/saphelp_nw04s/helpdata/en/ae/8876724e671341a91ff3ba711bacd0/frameset.htm]
Thanks. -
Data not received in PSA Table
Hi guys,
I'm facing the below issue while loading master data attributes from the source system to the BI system.
I'm loading data from 0VENDOR_ATTR to the 0VENDOR object.
Only half of the records are processed, and then the load hangs.
In the monitor (Administrator Workbench) it shows 60,434 of 157,912 records.
below are the details of the issue:
Data not received in PSA Table
Diagnosis
Data has not been updated in PSA Table . The request is probably still running or there was a short dump.
Procedure
In the short dump overview in BI, look for the short dump that belongs to your data request. Make sure the correct date and time are specified in the selection screen.
You can use the wizard to get to the short dump list, or follow the menu path "Environment -> Short dump -> In Data Warehouse".
Removing errors
Follow the instructions in the short dump.
please guide with your expertise.
cheerz,
raps.
Hi Rajeev,
Check the job in the background (SM37); you can analyze it from there.
Check for short dumps in ST22, and check system performance as well.
If you are facing this problem for the first time, cancel the background job and set the request to red.
Then repeat the InfoPackage; it should complete successfully.
This is probably due to a performance issue or an RFC problem in your system.
Regards,
Venkatesh
Edited by: Venky1903 on Aug 15, 2011 2:09 AM -
Data not received in PSA Table(urgent)
hello,
While loading a master data full load of 457,070 records two days back, only 400,000 records were received in the PSA; the remaining 57,070 were missing, with the error <b>Data not received in PSA Table</b>.
The next day we loaded 500,000 records and got the same error, <b>Data not received in PSA Table</b>; except for 57,070 records, everything reached the PSA.
Today we tried with 550,000 records: same error, and again all but 57,070 records reached the PSA.
I need your help to solve this.
I will thank you by awarding points.
Regards
PSC
Hi,
Check whether there is a tRFC hanging in SM58.
Could it be that your DB cannot extend its space for this amount of data?
How do you perform the load?
Package by package (in series or in parallel)? Could it be that the next stage (into the InfoObject) is failing, thus stopping the load into the PSA?
Can you try to perform the same load to the PSA only, and then enable the "subsequent update in data targets"?
As already mentioned, it could also be that an IDoc is hanging...
please let us know
Olivier. -
Data not received in InfoCube / PSA
Hi Everybody,
Today all the loads failed with the error "Data not received in InfoCube / PSA object / InfoObject". Could you please help me solve the problem?
Regards
BW
Hi,
First, check that your ODS is active.
Then ensure it has activated data for the selection you want to extract.
What kind of load are you performing: a full or a delta?
In any case, when you fire your InfoPackage from the ODS to the cube, a job named BI_REQ* will be kicked off. Check this job in SM37. If you can open an InfoPackage with your ODS as DataSource, then the DataSource is fine and the transfer program is activated.
Monitor this job and see what happens.
If the job is cancelled for any reason, look if there is a dump (ST22), a system message in the log (SM21 on all appserv) and if there is a TRFC transaction with error in BW (SM58).
Please let me know
regards,
Olivier. -
Start Routine code error of Perform routine 9998
Hi,
I installed the standard content and then migrated the whole modeling flow to 7.0.
But after loading data into the PSA, I found the following problem during activation (error in routine):
In PERFORM or CALL FUNCTION ROUTINE_9998, the actual parameter SOURCE_PACKAGE is incompatible with the formal parameter DATA_PACKAGE.
Please suggest a solution.
PROGRAM trans_routine.
CLASS routine DEFINITION
CLASS lcl_transform DEFINITION.
PUBLIC SECTION.
Attributs
DATA:
p_check_master_data_exist
TYPE RSODSOCHECKONLY READ-ONLY,
*- Instance for getting request runtime attributs;
Available information: Refer to methods of
interface 'if_rsbk_request_admintab_view'
p_r_request
TYPE REF TO if_rsbk_request_admintab_view READ-ONLY.
PRIVATE SECTION.
TYPE-POOLS: rsd, rstr.
Rule specific types
TYPES:
BEGIN OF tys_SC_1,
InfoObject: 0MPOINT Measuring point.
MPOINT TYPE /BI0/OIMPOINT,
InfoObject: 0MDOC Measurement Document.
MDOC TYPE /BI0/OIMDOC,
InfoObject: 0MPOBJ Object Number of Measuring Point Object.
MPOBJ TYPE /BI0/OIMPOBJ,
InfoObject: 0MPTYP Type of Measuring Point.
MPTYP TYPE /BI0/OIMPTYP,
InfoObject: 0PSORT Position Number of Measuring Point in Object (
*Sort Field).
PSORT TYPE /BI0/OIPSORT,
InfoObject: 0ASSEMBLY Assembly.
ASSEMBLY TYPE /BI0/OIASSEMBLY,
InfoObject: 0ATINN Internal Characteristic Number.
ATINN TYPE /BI0/OIATINN,
InfoObject: 0MDATE Date of Measurement.
MDATE TYPE /BI0/OIMDATE,
InfoObject: 0TIME Time.
TIME TYPE /BI0/OITIME,
InfoObject: 0DESIRSI Target Value for Measuring Point in SI Unit.
DESIRSI TYPE /BI0/OIDESIRSI,
InfoObject: 0TVIND Target Value for Measuring Point Is Defined.
TVIND TYPE /BI0/OITVIND,
InfoObject: 0MRMINI Lower Measurement Area Limit Is Defined.
MRMINI TYPE /BI0/OIMRMINI,
InfoObject: 0MRMINSI Lower Meas. Limit or Min. Total Counter Read
*ing in SI Unit.
MRMINSI TYPE /BI0/OIMRMINSI,
InfoObject: 0MRMAXI Upper Measurement Area Limit Is Defined.
MRMAXI TYPE /BI0/OIMRMAXI,
InfoObject: 0SIUNIT SI unit.
SIUNIT TYPE /BI0/OISIUNIT,
InfoObject: 0CJUMPSI Counter Overflow Reading in SI Unit.
CJUMPSI TYPE /BI0/OICJUMPSI,
InfoObject: 0PYEARSI Annual Performance in SI Unit.
PYEARSI TYPE /BI0/OIPYEARSI,
InfoObject: 0CREATED_BY Created By.
CREATED_BY TYPE /BI0/OICREATED_BY,
InfoObject: 0READSI Measured Value/Total Counter Reading in SI Un
*it.
READSI TYPE /BI0/OIREADSI,
InfoObject: 0READEU Measured Value in Document Entry Unit.
READEU TYPE /BI0/OIREADEU,
InfoObject: 0RECDU Unit of Measurement for Document Entry.
RECDU TYPE /BI0/OIRECDU,
InfoObject: 0CNTRRSI Counter Reading in SI Unit.
CNTRRSI TYPE /BI0/OICNTRRSI,
InfoObject: 0CDIFFSI Counter Reading Difference in SI Unit.
CDIFFSI TYPE /BI0/OICDIFFSI,
InfoObject: 0CAT_TYPE Catalog.
CAT_TYPE TYPE /BI0/OICAT_TYPE,
InfoObject: 0CAT_GROUP Code group.
CAT_GROUP TYPE /BI0/OICAT_GROUP,
InfoObject: 0FI_STATUS Processing Status.
FI_STATUS TYPE /BI0/OIFI_STATUS,
InfoObject: 0DIM_ID Dimension Key.
DIM_ID TYPE /BI0/OIDIM_ID,
InfoObject: 0CNTIND Measuring Point is Counter.
CNTIND TYPE /BI0/OICNTIND,
InfoObject: 0EQUIPMENT Equipment Number.
EQUIPMENT TYPE /BI0/OIEQUIPMENT,
InfoObject: 0FUNCT_LOC Functional Location.
FUNCT_LOC TYPE /BI0/OIFUNCT_LOC,
InfoObject: 0CAT_CODE Code.
CAT_CODE TYPE /BI0/OICAT_CODE,
InfoObject: 0MRMAXSI Upper Meas. Limit or Max. Total Counter Read
*ing in SI Unit.
MRMAXSI TYPE /BI0/OIMRMAXSI,
InfoObject: 0DOCAF Measurement Document Included for Associated T
*ask.
DOCAF TYPE /BI0/OIDOCAF,
InfoObject: 0RECORDMODE BW Delta Process: Update Mode.
RECORDMODE TYPE RODMUPDMOD,
InfoObject: ZSTEXT Object Number Text.
/BIC/ZSTEXT TYPE /BIC/OIZSTEXT,
Field: RECORD.
RECORD TYPE RSARECORD,
END OF tys_SC_1.
TYPES:
tyt_SC_1 TYPE STANDARD TABLE OF tys_SC_1
WITH NON-UNIQUE DEFAULT KEY.
*$*$ begin of global - insert your declaration only below this line -
* ... "insert your code here
*$*$ end of global - insert your declaration only before this line -
METHODS
start_routine
IMPORTING
request type rsrequest
datapackid type rsdatapid
EXPORTING
monitor type rstr_ty_t_monitors
CHANGING
SOURCE_PACKAGE type tyt_SC_1
RAISING
cx_rsrout_abort.
METHODS
inverse_start_routine
IMPORTING
i_th_fields_outbound TYPE rstran_t_field_inv
i_r_selset_outbound TYPE REF TO cl_rsmds_set
i_is_main_selection TYPE rs_bool
i_r_selset_outbound_complete TYPE REF TO cl_rsmds_set
i_r_universe_inbound TYPE REF TO cl_rsmds_universe
CHANGING
c_th_fields_inbound TYPE rstran_t_field_inv
c_r_selset_inbound TYPE REF TO cl_rsmds_set
c_exact TYPE rs_bool.
ENDCLASS. "routine DEFINITION
*$*$ begin of 2nd part global - insert your code only below this line *
TYPES:
BEGIN OF tys_SC_1_full,
InfoObject: 0MPOINT Measuring point.
MPOINT TYPE /BI0/OIMPOINT,
InfoObject: 0MDOC Measurement Document.
MDOC TYPE /BI0/OIMDOC,
InfoObject: 0MPOBJ Object Number of Measuring Point Object.
MPOBJ TYPE /BI0/OIMPOBJ,
InfoObject: 0MPTYP Type of Measuring Point.
MPTYP TYPE /BI0/OIMPTYP,
InfoObject: 0PSORT Position Number of Measuring Point in Object (
*Sort Field).
PSORT TYPE /BI0/OIPSORT,
InfoObject: 0ASSEMBLY Assembly.
ASSEMBLY TYPE /BI0/OIASSEMBLY,
InfoObject: 0ATINN Internal Characteristic Number.
ATINN TYPE /BI0/OIATINN,
InfoObject: 0MDATE Date of Measurement.
MDATE TYPE /BI0/OIMDATE,
InfoObject: 0TIME Time.
TIME TYPE /BI0/OITIME,
InfoObject: 0DESIRSI Target Value for Measuring Point in SI Unit.
DESIRSI TYPE /BI0/OIDESIRSI,
InfoObject: 0TVIND Target Value for Measuring Point Is Defined.
TVIND TYPE /BI0/OITVIND,
InfoObject: 0MRMINI Lower Measurement Area Limit Is Defined.
MRMINI TYPE /BI0/OIMRMINI,
InfoObject: 0MRMINSI Lower Meas. Limit or Min. Total Counter Read
*ing in SI Unit.
MRMINSI TYPE /BI0/OIMRMINSI,
InfoObject: 0MRMAXI Upper Measurement Area Limit Is Defined.
MRMAXI TYPE /BI0/OIMRMAXI,
InfoObject: 0SIUNIT SI unit.
SIUNIT TYPE /BI0/OISIUNIT,
InfoObject: 0CJUMPSI Counter Overflow Reading in SI Unit.
CJUMPSI TYPE /BI0/OICJUMPSI,
InfoObject: 0PYEARSI Annual Performance in SI Unit.
PYEARSI TYPE /BI0/OIPYEARSI,
InfoObject: 0CREATED_BY Created By.
CREATED_BY TYPE /BI0/OICREATED_BY,
InfoObject: 0READSI Measured Value/Total Counter Reading in SI Un
*it.
READSI TYPE /BI0/OIREADSI,
InfoObject: 0READEU Measured Value in Document Entry Unit.
READEU TYPE /BI0/OIREADEU,
InfoObject: 0RECDU Unit of Measurement for Document Entry.
RECDU TYPE /BI0/OIRECDU,
InfoObject: 0CNTRRSI Counter Reading in SI Unit.
CNTRRSI TYPE /BI0/OICNTRRSI,
InfoObject: 0CDIFFSI Counter Reading Difference in SI Unit.
CDIFFSI TYPE /BI0/OICDIFFSI,
InfoObject: 0CAT_TYPE Catalog.
CAT_TYPE TYPE /BI0/OICAT_TYPE,
InfoObject: 0CAT_GROUP Code group.
CAT_GROUP TYPE /BI0/OICAT_GROUP,
InfoObject: 0FI_STATUS Processing Status.
FI_STATUS TYPE /BI0/OIFI_STATUS,
InfoObject: 0DIM_ID Dimension Key.
DIM_ID TYPE /BI0/OIDIM_ID,
InfoObject: 0CNTIND Measuring Point is Counter.
CNTIND TYPE /BI0/OICNTIND,
InfoObject: 0EQUIPMENT Equipment Number.
EQUIPMENT TYPE /BI0/OIEQUIPMENT,
InfoObject: 0FUNCT_LOC Functional Location.
FUNCT_LOC TYPE /BI0/OIFUNCT_LOC,
InfoObject: 0CAT_CODE Code.
CAT_CODE TYPE /BI0/OICAT_CODE,
InfoObject: 0MRMAXSI Upper Meas. Limit or Max. Total Counter Read
*ing in SI Unit.
MRMAXSI TYPE /BI0/OIMRMAXSI,
InfoObject: 0DOCAF Measurement Document Included for Associated T
*ask.
DOCAF TYPE /BI0/OIDOCAF,
InfoObject: 0RECORDMODE BW Delta Process: Record Mode.
RECORDMODE TYPE RODMUPDMOD,
Field: RECNO Data record number.
RECNO TYPE RSARECORD,
END OF tys_SC_1_full.
TYPES:
tyt_SC_1_full TYPE STANDARD TABLE OF tys_SC_1_full
WITH NON-UNIQUE DEFAULT KEY.
TYPES:
BEGIN OF tys_TG_1_full,
Field: SID Master data ID.
SID TYPE RSSID,
Field: DATAPAKID Data packet number.
DATAPAKID TYPE RSDATAPID,
Field: RECORD Data record number.
RECORD TYPE RSARECORD,
InfoObject: 0MPOINT Measuring point.
MPOINT TYPE /BI0/OIMPOINT,
InfoObject: 0MDOC Measurement Document.
MDOC TYPE /BI0/OIMDOC,
InfoObject: 0MRMINSI Lower Meas. Limit or Min. Total Counter Read
*ing in SI Unit.
MRMINSI TYPE /BI0/OIMRMINSI,
InfoObject: 0FI_STATUS Processing Status.
FI_STATUS TYPE /BI0/OIFI_STATUS,
InfoObject: 0DIM_ID Dimension Key.
DIM_ID TYPE /BI0/OIDIM_ID,
InfoObject: 0CNTIND Measuring Point is Counter.
CNTIND TYPE /BI0/OICNTIND,
InfoObject: 0EQUIPMENT Equipment Number.
EQUIPMENT TYPE /BI0/OIEQUIPMENT,
InfoObject: 0FUNCT_LOC Functional Location.
FUNCT_LOC TYPE /BI0/OIFUNCT_LOC,
InfoObject: 0CAT_CODE Code.
CAT_CODE TYPE /BI0/OICAT_CODE,
InfoObject: 0CAT_TYPE Catalog.
CAT_TYPE TYPE /BI0/OICAT_TYPE,
InfoObject: 0CAT_GROUP Code group.
CAT_GROUP TYPE /BI0/OICAT_GROUP,
InfoObject: 0CREATED_BY Created By.
CREATED_BY TYPE /BI0/OICREATED_BY,
InfoObject: 0MPOBJ Object Number of Measuring Point Object.
MPOBJ TYPE /BI0/OIMPOBJ,
InfoObject: 0READEU Measured Value in Document Entry Unit.
READEU TYPE /BI0/OIREADEU,
InfoObject: 0CJUMPSI Counter Overflow Reading in SI Unit.
CJUMPSI TYPE /BI0/OICJUMPSI,
InfoObject: 0PYEARSI Annual Performance in SI Unit.
PYEARSI TYPE /BI0/OIPYEARSI,
InfoObject: 0PSORT Position Number of Measuring Point in Object (
*Sort Field).
PSORT TYPE /BI0/OIPSORT,
InfoObject: 0ASSEMBLY Assembly.
ASSEMBLY TYPE /BI0/OIASSEMBLY,
InfoObject: 0ATINN Internal Characteristic Number.
ATINN TYPE /BI0/OIATINN,
InfoObject: 0MDATE Date of Measurement.
MDATE TYPE /BI0/OIMDATE,
InfoObject: 0TIME Time.
TIME TYPE /BI0/OITIME,
InfoObject: 0TVIND Target Value for Measuring Point Is Defined.
TVIND TYPE /BI0/OITVIND,
InfoObject: 0MRMINI Lower Measurement Area Limit Is Defined.
MRMINI TYPE /BI0/OIMRMINI,
InfoObject: 0DESIRSI Target Value for Measuring Point in SI Unit.
DESIRSI TYPE /BI0/OIDESIRSI,
InfoObject: 0SIUNIT SI unit.
SIUNIT TYPE /BI0/OISIUNIT,
InfoObject: 0DESIREU Target Value for Measuring Point in Document
Entry Unit.
DESIREU TYPE /BI0/OIDESIREU,
InfoObject: 0CNTRRSI Counter Reading in SI Unit.
CNTRRSI TYPE /BI0/OICNTRRSI,
InfoObject: 0CDIFFSI Counter Reading Difference in SI Unit.
CDIFFSI TYPE /BI0/OICDIFFSI,
InfoObject: 0RECDU Unit of Measurement for Document Entry.
RECDU TYPE /BI0/OIRECDU,
InfoObject: 0MRMINEU Lower Meas. Limit or Min. Total Counter Read
*ing Document UoM.
MRMINEU TYPE /BI0/OIMRMINEU,
InfoObject: 0MRMAXEU Upper Meas. Limit or Max. Total Counter Read
*ing in Doc. M.U..
MRMAXEU TYPE /BI0/OIMRMAXEU,
InfoObject: 0MRMAXI Upper Measurement Area Limit Is Defined.
MRMAXI TYPE /BI0/OIMRMAXI,
InfoObject: 0CNTRREU Counter Reading in Document Entry Unit.
CNTRREU TYPE /BI0/OICNTRREU,
InfoObject: 0CDIFFEU Counter Reading Difference in Document Entry
Unit.
CDIFFEU TYPE /BI0/OICDIFFEU,
InfoObject: 0RECORDMODE BW Delta Process: Record Mode.
RECORDMODE TYPE RODMUPDMOD,
InfoObject: 0MRMAXSI Upper Meas. Limit or Max. Total Counter Read
*ing in SI Unit.
MRMAXSI TYPE /BI0/OIMRMAXSI,
InfoObject: 0READSI Measured Value/Total Counter Reading in SI Un
*it.
READSI TYPE /BI0/OIREADSI,
END OF tys_TG_1_full.
TYPES:
BEGIN OF tys_SC_1__RULE_34,
InfoObject: 0DESIRSI Target Value for Measuring Point in SI Unit.
DESIRSI TYPE /BI0/OIDESIRSI,
InfoObject: 0TVIND Target Value for Measuring Point Is Defined.
TVIND TYPE /BI0/OITVIND,
InfoObject: 0SIUNIT SI unit.
SIUNIT TYPE /BI0/OISIUNIT,
InfoObject: 0RECDU Unit of Measurement for Document Entry.
RECDU TYPE /BI0/OIRECDU,
Field: RECORD.
RECORD TYPE RSARECORD,
END OF tys_SC_1__RULE_34.
TYPES:
BEGIN OF tys_SC_1__RULE_35,
InfoObject: 0MRMINI Lower Measurement Area Limit Is Defined.
MRMINI TYPE /BI0/OIMRMINI,
InfoObject: 0MRMINSI Lower Meas. Limit or Min. Total Counter Read
*ing in SI Unit.
MRMINSI TYPE /BI0/OIMRMINSI,
InfoObject: 0SIUNIT SI unit.
SIUNIT TYPE /BI0/OISIUNIT,
InfoObject: 0RECDU Unit of Measurement for Document Entry.
RECDU TYPE /BI0/OIRECDU,
Field: RECORD.
RECORD TYPE RSARECORD,
END OF tys_SC_1__RULE_35.
TYPES:
BEGIN OF tys_SC_1__RULE_36,
InfoObject: 0MRMAXI Upper Measurement Area Limit Is Defined.
MRMAXI TYPE /BI0/OIMRMAXI,
InfoObject: 0SIUNIT SI unit.
SIUNIT TYPE /BI0/OISIUNIT,
InfoObject: 0RECDU Unit of Measurement for Document Entry.
RECDU TYPE /BI0/OIRECDU,
InfoObject: 0MRMAXSI Upper Meas. Limit or Max. Total Counter Read
*ing in SI Unit.
MRMAXSI TYPE /BI0/OIMRMAXSI,
Field: RECORD.
RECORD TYPE RSARECORD,
END OF tys_SC_1__RULE_36.
TYPES:
BEGIN OF tys_SC_1__RULE_37,
InfoObject: 0SIUNIT SI unit.
SIUNIT TYPE /BI0/OISIUNIT,
InfoObject: 0RECDU Unit of Measurement for Document Entry.
RECDU TYPE /BI0/OIRECDU,
InfoObject: 0CNTRRSI Counter Reading in SI Unit.
CNTRRSI TYPE /BI0/OICNTRRSI,
InfoObject: 0CNTIND Measuring Point is Counter.
CNTIND TYPE /BI0/OICNTIND,
Field: RECORD.
RECORD TYPE RSARECORD,
END OF tys_SC_1__RULE_37.
TYPES:
BEGIN OF tys_SC_1__RULE_38,
InfoObject: 0SIUNIT SI unit.
SIUNIT TYPE /BI0/OISIUNIT,
InfoObject: 0RECDU Unit of Measurement for Document Entry.
RECDU TYPE /BI0/OIRECDU,
InfoObject: 0CDIFFSI Counter Reading Difference in SI Unit.
CDIFFSI TYPE /BI0/OICDIFFSI,
InfoObject: 0CNTIND Measuring Point is Counter.
CNTIND TYPE /BI0/OICNTIND,
Field: RECORD.
RECORD TYPE RSARECORD,
END OF tys_SC_1__RULE_38.
* Additional types for start routine interface
TYPES:
data_package_structure type tys_SC_1_full.
* Additional declaration for update rule interface
DATA:
MONITOR type standard table of rsmonitor WITH HEADER LINE,
MONITOR_RECNO type standard table of rsmonitors WITH HEADER LINE,
RECORD_NO LIKE SY-TABIX,
RECORD_ALL LIKE SY-TABIX,
SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS.
* global definitions from update rules
* TABLES: ...
* DATA: ...
*******DATA: G_T_0EQUIPMENT TYPE STANDARD TABLE OF /BI0/MEQUIPMENT.
data: lt_data type ref to data.
data: ls_data type ref to data.
field-symbols: <G_T_0EQUIPMENT> type standard table.
field-symbols: <G_S_0EQUIPMENT> type any.
field-symbols: <FUNCT_LOC> type any.
field-symbols: <DATEFROM> type any.
field-symbols: <DATETO> type any.
constants: c_tabname type rstlogotab value '/BI0/MEQUIPMENT'.
*& Form routine_9998
* text
* -->DATA_PACKAGE text
* -->ABORT text
* -->RAISING text
* -->CX_SY_ARITHMETIC_ERROR text
* -->CX_SY_CONVERSION_ERROR text
FORM routine_9998
TABLES DATA_PACKAGE TYPE tyt_SC_1_full
CHANGING
ABORT LIKE sy-subrc
RAISING
cx_sy_arithmetic_error
cx_sy_conversion_error.
* init variables
* fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
* to make monitor entries
* In this start routine a missing functional location for
* an equipment is added to a Data Package record, if it is available
* in the master data of the equipment
create data lt_data type table of (c_tabname).
assign lt_data->* to <G_T_0EQUIPMENT>.
create data ls_data type (c_tabname).
assign ls_data->* to <G_S_0EQUIPMENT>.
assign component 'FUNCT_LOC' of structure <G_S_0EQUIPMENT> to
<FUNCT_LOC>.
assign component 'DATE_FROM' of structure <G_S_0EQUIPMENT> to
<DATEFROM>.
assign component 'DATE_TO' of structure <G_S_0EQUIPMENT> to
<DATETO>.
CALL FUNCTION 'RSAU_READ_MASTER_DATA'
EXPORTING
I_IOBJNM = '0EQUIPMENT'
I_FLG_WHOLE_TABLE = 'X'
IMPORTING
E_TABLE = <G_T_0EQUIPMENT>
EXCEPTIONS
READ_ERROR = 1
NO_SUCH_ATTRIBUTE = 2
WRONG_IMPORT_PARAMETERS = 3
CHAVL_NOT_FOUND = 4
OTHERS = 5.
CASE SY-SUBRC.
WHEN 0.
LOOP AT <G_T_0EQUIPMENT> INTO <G_S_0EQUIPMENT>.
IF <FUNCT_LOC> IS INITIAL.
DELETE <G_T_0EQUIPMENT>.
ENDIF.
* master data collected and condensed
ENDLOOP.
LOOP AT DATA_PACKAGE.
IF ( DATA_PACKAGE-FUNCT_LOC IS INITIAL ) AND
( DATA_PACKAGE-EQUIPMENT IS NOT INITIAL ).
READ TABLE <G_T_0EQUIPMENT> WITH KEY
('EQUIPMENT') = DATA_PACKAGE-EQUIPMENT INTO <G_S_0EQUIPMENT>.
IF SY-SUBRC = 0.
IF ( DATA_PACKAGE-MDATE >= <DATEFROM> )
AND ( DATA_PACKAGE-MDATE <= <DATETO> ).
* record for equipment exists and is valid
DATA_PACKAGE-FUNCT_LOC = <FUNCT_LOC>.
MODIFY DATA_PACKAGE.
ENDIF.
ENDIF.
ENDIF.
ENDLOOP.
WHEN 4.
* master data is not (yet) loaded
WHEN OTHERS.
* problem occurred, output as warning
MONITOR-MSGID = SY-MSGID.
MONITOR-MSGTY = 'W'.
MONITOR-MSGNO = SY-MSGNO.
MONITOR-MSGV1 = SY-MSGV1.
MONITOR-MSGV2 = SY-MSGV2.
MONITOR-MSGV3 = SY-MSGV3.
MONITOR-MSGV4 = SY-MSGV4.
APPEND MONITOR.
ENDCASE.
* if abort is not equal zero, the update process will be canceled
ABORT = 0.
ENDFORM. "routine_9998
*& Form routine_0001
* text
* -->COMM_STRUCTURE text
* -->RESULT text
* -->RETURNCODE text
* -->ABORT text
* -->RAISING text
* -->CX_SY_ARITHMETIC_ERROR text
* -->CX_SY_CONVERSION_ERROR text
FORM routine_0001
USING
COMM_STRUCTURE type tys_SC_1__RULE_34
CHANGING
RESULT TYPE tys_TG_1_full-DESIREU
RETURNCODE LIKE sy-subrc
ABORT LIKE sy-subrc
RAISING
cx_sy_arithmetic_error
cx_sy_conversion_error.
* init variables
* fill the internal table "MONITOR", to make monitor entries
CLEAR RESULT.
IF COMM_STRUCTURE-TVIND = 'X'.
CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
EXPORTING
INPUT = COMM_STRUCTURE-DESIRSI
NO_TYPE_CHECK = 'X'
ROUND_SIGN = ' '
UNIT_IN = COMM_STRUCTURE-SIUNIT
UNIT_OUT = COMM_STRUCTURE-RECDU
IMPORTING
* ADD_CONST =
* DECIMALS =
* DENOMINATOR =
* NUMERATOR =
OUTPUT = RESULT
EXCEPTIONS
CONVERSION_NOT_FOUND = 1
DIVISION_BY_ZERO = 2
INPUT_INVALID = 3
OUTPUT_INVALID = 4
OVERFLOW = 5
TYPE_INVALID = 6
UNITS_MISSING = 7
UNIT_IN_NOT_FOUND = 8
UNIT_OUT_NOT_FOUND = 9
OTHERS = 10.
IF SY-SUBRC <> 0.
MONITOR-MSGID = SY-MSGID.
MONITOR-MSGTY = 'W'.
MONITOR-MSGNO = SY-MSGNO.
MONITOR-MSGV1 = SY-MSGV1.
MONITOR-MSGV2 = SY-MSGV2.
MONITOR-MSGV3 = SY-MSGV3.
MONITOR-MSGV4 = SY-MSGV4.
APPEND MONITOR.
ENDIF.
ENDIF.
* if abort is not equal zero, the update process will be canceled
ABORT = 0.
ENDFORM. "routine_0001
*& Form routine_0002
* text
* -->COMM_STRUCTURE text
* -->RESULT text
* -->RETURNCODE text
* -->ABORT text
* -->RAISING text
* -->CX_SY_ARITHMETIC_ERROR text
* -->CX_SY_CONVERSION_ERROR text
FORM routine_0002
USING
COMM_STRUCTURE type tys_SC_1__RULE_35
CHANGING
RESULT TYPE tys_TG_1_full-MRMINEU
RETURNCODE LIKE sy-subrc
ABORT LIKE sy-subrc
RAISING
cx_sy_arithmetic_error
cx_sy_conversion_error.
* init variables
* fill the internal table "MONITOR", to make monitor entries
CLEAR RESULT.
IF COMM_STRUCTURE-MRMINI = 'X'.
CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
EXPORTING
INPUT = COMM_STRUCTURE-MRMINSI
NO_TYPE_CHECK = 'X'
ROUND_SIGN = ' '
UNIT_IN = COMM_STRUCTURE-SIUNIT
UNIT_OUT = COMM_STRUCTURE-RECDU
IMPORTING
* ADD_CONST =
* DECIMALS =
* DENOMINATOR =
* NUMERATOR =
OUTPUT = RESULT
EXCEPTIONS
CONVERSION_NOT_FOUND = 1
DIVISION_BY_ZERO = 2
INPUT_INVALID = 3
OUTPUT_INVALID = 4
OVERFLOW = 5
TYPE_INVALID = 6
UNITS_MISSING = 7
UNIT_IN_NOT_FOUND = 8
UNIT_OUT_NOT_FOUND = 9
OTHERS = 10.
IF SY-SUBRC <> 0.
MONITOR-MSGID = SY-MSGID.
MONITOR-MSGTY = 'W'.
MONITOR-MSGNO = SY-MSGNO.
MONITOR-MSGV1 = SY-MSGV1.
MONITOR-MSGV2 = SY-MSGV2.
MONITOR-MSGV3 = SY-MSGV3.
MONITOR-MSGV4 = SY-MSGV4.
APPEND MONITOR.
ENDIF.
ENDIF.
* if abort is not equal zero, the update process will be canceled
ABORT = 0.
ENDFORM. "routine_0002
*& Form routine_0003
* text
* -->COMM_STRUCTURE text
* -->RESULT text
* -->RETURNCODE text
* -->ABORT text
* -->RAISING text
* -->CX_SY_ARITHMETIC_ERROR text
* -->CX_SY_CONVERSION_ERROR text
FORM routine_0003
USING
COMM_STRUCTURE type tys_SC_1__RULE_36
CHANGING
RESULT TYPE tys_TG_1_full-MRMAXEU
RETURNCODE LIKE sy-subrc
ABORT LIKE sy-subrc
RAISING
cx_sy_arithmetic_error
cx_sy_conversion_error.
* init variables
* fill the internal table "MONITOR", to make monitor entries
CLEAR RESULT.
IF COMM_STRUCTURE-MRMAXI = 'X'.
CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
EXPORTING
INPUT = COMM_STRUCTURE-MRMAXSI
NO_TYPE_CHECK = 'X'
ROUND_SIGN = ' '
UNIT_IN = COMM_STRUCTURE-SIUNIT
UNIT_OUT = COMM_STRUCTURE-RECDU
IMPORTING
* ADD_CONST =
* DECIMALS =
* DENOMINATOR =
* NUMERATOR =
OUTPUT = RESULT
EXCEPTIONS
CONVERSION_NOT_FOUND = 1
DIVISION_BY_ZERO = 2
INPUT_INVALID = 3
OUTPUT_INVALID = 4
OVERFLOW = 5
TYPE_INVALID = 6
UNITS_MISSING = 7
UNIT_IN_NOT_FOUND = 8
UNIT_OUT_NOT_FOUND = 9
OTHERS = 10.
IF SY-SUBRC <> 0.
MONITOR-MSGID = SY-MSGID.
MONITOR-MSGTY = 'W'.
MONITOR-MSGNO = SY-MSGNO.
MONITOR-MSGV1 = SY-MSGV1.
MONITOR-MSGV2 = SY-MSGV2.
MONITOR-MSGV3 = SY-MSGV3.
MONITOR-MSGV4 = SY-MSGV4.
APPEND MONITOR.
ENDIF.
ENDIF.
* if abort is not equal zero, the update process will be canceled
ABORT = 0.
ENDFORM. "routine_0003
*& Form routine_0004
* text
* -->COMM_STRUCTURE text
* -->RESULT text
* -->RETURNCODE text
* -->ABORT text
* -->RAISING text
* -->CX_SY_ARITHMETIC_ERROR text
* -->CX_SY_CONVERSION_ERROR text
FORM routine_0004
USING
COMM_STRUCTURE type tys_SC_1__RULE_37
CHANGING
RESULT TYPE tys_TG_1_full-CNTRREU
RETURNCODE LIKE sy-subrc
ABORT LIKE sy-subrc
RAISING
cx_sy_arithmetic_error
cx_sy_conversion_error.
* init variables
* fill the internal table "MONITOR", to make monitor entries
CLEAR RESULT.
IF COMM_STRUCTURE-CNTIND = 'X'.
CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
EXPORTING
INPUT = COMM_STRUCTURE-CNTRRSI
NO_TYPE_CHECK = 'X'
ROUND_SIGN = ' '
UNIT_IN = COMM_STRUCTURE-SIUNIT
UNIT_OUT = COMM_STRUCTURE-RECDU
IMPORTING
* ADD_CONST =
* DECIMALS =
* DENOMINATOR =
* NUMERATOR =
OUTPUT = RESULT
EXCEPTIONS
CONVERSION_NOT_FOUND = 1
DIVISION_BY_ZERO = 2
INPUT_INVALID = 3
OUTPUT_INVALID = 4
OVERFLOW = 5
TYPE_INVALID = 6
UNITS_MISSING = 7
UNIT_IN_NOT_FOUND = 8
UNIT_OUT_NOT_FOUND = 9
OTHERS = 10.
IF SY-SUBRC <> 0.
MONITOR-MSGID = SY-MSGID.
MONITOR-MSGTY = 'W'.
MONITOR-MSGNO = SY-MSGNO.
MONITOR-MSGV1 = SY-MSGV1.
MONITOR-MSGV2 = SY-MSGV2.
MONITOR-MSGV3 = SY-MSGV3.
MONITOR-MSGV4 = SY-MSGV4.
APPEND MONITOR.
ENDIF.
ENDIF.
* if abort is not equal zero, the update process will be canceled
ABORT = 0.
ENDFORM. "routine_0004
*& Form routine_0005
* text
* -->COMM_STRUCTURE text
* -->RESULT text
* -->RETURNCODE text
* -->ABORT text
* -->RAISING text
* -->CX_SY_ARITHMETIC_ERROR text
* -->CX_SY_CONVERSION_ERROR text
FORM routine_0005
USING
COMM_STRUCTURE type tys_SC_1__RULE_38
CHANGING
RESULT TYPE tys_TG_1_full-CDIFFEU
RETURNCODE LIKE sy-subrc
ABORT LIKE sy-subrc
RAISING
cx_sy_arithmetic_error
cx_sy_conversion_error.
* init variables
* fill the internal table "MONITOR", to make monitor entries
CLEAR RESULT.
IF COMM_STRUCTURE-CNTIND = 'X'.
CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
EXPORTING
INPUT = COMM_STRUCTURE-CDIFFSI
NO_TYPE_CHECK = 'X'
ROUND_SIGN = ' '
UNIT_IN = COMM_STRUCTURE-SIUNIT
UNIT_OUT = COMM_STRUCTURE-RECDU
IMPORTING
* ADD_CONST =
* DECIMALS =
* DENOMINATOR =
* NUMERATOR =
OUTPUT = RESULT
EXCEPTIONS
CONVERSION_NOT_FOUND = 1
DIVISION_BY_ZERO = 2
INPUT_INVALID = 3
OUTPUT_INVALID = 4
OVERFLOW = 5
TYPE_INVALID = 6
UNITS_MISSING = 7
UNIT_IN_NOT_FOUND = 8
UNIT_OUT_NOT_FOUND = 9
OTHERS = 10.
IF SY-SUBRC <> 0.
MONITOR-MSGID = SY-MSGID.
MONITOR-MSGTY = 'W'.
MONITOR-MSGNO = SY-MSGNO.
MONITOR-MSGV1 = SY-MSGV1.
MONITOR-MSGV2 = SY-MSGV2.
MONITOR-MSGV3 = SY-MSGV3.
MONITOR-MSGV4 = SY-MSGV4.
APPEND MONITOR.
ENDIF.
ENDIF.
* if abort is not equal zero, the update process will be canceled
ABORT = 0.
ENDFORM. "routine_0005
TYPES: BEGIN OF TY_/BI0/PMPOINT,
MPOINT TYPE /BI0/PMPOINT-MPOINT,
MRNGU TYPE /BI0/PMPOINT-MRNGU,
END of TY_/BI0/PMPOINT.
DATA: /BI0/PMPOINT_TEMP TYPE STANDARD TABLE OF
TY_/BI0/PMPOINT,
W_/BI0/PMPOINT_TEMP like line of /BI0/PMPOINT_TEMP.
*$*$ end of 2nd part global - insert your code only before this line *
* CLASS routine IMPLEMENTATION
CLASS lcl_transform IMPLEMENTATION.
* Method start_routine
* Calculation of source package via start routine
* <-> source package
METHOD start_routine.
*=== Segments ===
FIELD-SYMBOLS:
<SOURCE_FIELDS> TYPE tys_SC_1.
DATA:
MONITOR_REC TYPE rstmonitor.
*$*$ begin of routine - insert your code only below this line *-*
Data:
l_subrc type sy-tabix,
l_abort type sy-tabix,
ls_monitor TYPE rsmonitor,
ls_monitor_recno TYPE rsmonitors.
REFRESH:
MONITOR,
MONITOR_RECNO.
* Runtime attributes
SOURCE_SYSTEM = p_r_request->get_logsys( ).
* Migrated update rule call
Perform routine_9998
TABLES
SOURCE_PACKAGE
CHANGING
l_abort.
*-- Convert Messages in Transformation format
LOOP AT MONITOR_RECNO INTO ls_monitor_recno.
move-CORRESPONDING ls_monitor_recno to MONITOR_REC.
append monitor_rec to MONITOR.
ENDLOOP.
LOOP AT MONITOR INTO ls_monitor.
move-CORRESPONDING ls_monitor to MONITOR_REC.
append monitor_rec to MONITOR.
ENDLOOP.
IF l_abort <> 0.
RAISE EXCEPTION TYPE CX_RSROUT_ABORT.
ENDIF.
*$*$ end of routine - insert your code only before this line -
ENDMETHOD. "start_routine
* Method inverse_start_routine
* This subroutine needs to be implemented only for direct access
* (for better performance) and for the Report/Report Interface
* (drill through).
* The inverse routine should transform a projection and
* a selection for the target to a projection and a selection
* for the source, respectively.
* If the implementation remains empty all fields are filled and
* all values are selected.
METHOD inverse_start_routine.
*$*$ begin of inverse routine - insert your code only below this line-
* ... "insert your code here
*$*$ end of inverse routine - insert your code only before this line -
ENDMETHOD. "inverse_start_routine
ENDCLASS. "routine IMPLEMENTATION -
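For reference, the intent of the migrated routine_9998 above, filling an empty functional location from the 0EQUIPMENT master data when the measurement date falls inside the validity interval, can be sketched outside ABAP. The field names follow the routine; the master-data rows below are invented sample values, not real data:

```python
# Sketch of the routine_9998 logic: for records that have an equipment
# but no functional location, take FUNCT_LOC from the 0EQUIPMENT master
# data when MDATE lies inside the DATE_FROM..DATE_TO validity interval.
master = {
    "EQ1": {"FUNCT_LOC": "PLANT-A", "DATE_FROM": "20100101", "DATE_TO": "99991231"},
}

def fill_funct_loc(package):
    for rec in package:
        if not rec["FUNCT_LOC"] and rec["EQUIPMENT"]:
            md = master.get(rec["EQUIPMENT"])
            # YYYYMMDD strings compare correctly lexicographically
            if md and md["DATE_FROM"] <= rec["MDATE"] <= md["DATE_TO"]:
                rec["FUNCT_LOC"] = md["FUNCT_LOC"]
    return package
```

Records whose measurement date falls outside the validity interval are deliberately left untouched, matching the IF around the MODIFY in the ABAP.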
DB Connect DataSource PSA records and DSO records are not matching...
Dear All,
I'm working with SAP NetWeaver BW 7.3, and for the first time I have loaded from a DB Connect source system. I created a DataSource, pulled all records, and found 8,136,559 records in the PSA. When I designed a DSO with key fields 0CALDAY, Item No and Company Code, it transferred about 8,136,559 records but added only about 12,534. Similarly, the following InfoCube has about 12,534 records in its fact table. When I tried to reconcile the data with the source DBMS for one month, the records did not match.
1. What could be the reason behind this issue? Why am I unable to load the records correctly?
2. Have I defined the key fields of the DSO incorrectly?
3. Is it possible to load records into a DSO without specifying any field as a key field?
4. How should I resolve this issue?
5. Could this be caused by the DSO overwrite and summation functions? If yes, how do I use them?
Many thanks,
Tariq Ashraf

Dear Tariq,
1. What could be the reason behind the issue? why I am unable to load the records/data correctly?
Ans: Check the transformation. Is there a start routine, or only direct assignments? What DTP settings have you made?
Check the messages in the DTP monitor; you will surely find a clue there. Also check whether any duplicate records are being detected, and whether you are using semantic keys in your DTP.
2. Have I not mentioned the Key Fields of DSO in a correct manner?
Ans: Are the transformation key and the DSO key the same in your case?
What kind of DSO is it? For a sales order DSO, for example, you would take the order number as a key field. So the key fields have to be defined according to the business semantics, I suppose. Do you agree?
3. Is it possible to load the records/data into DSO without giving any field as Key Fields?
Ans: I don't think so; the key fields you define are what ensure unique data records, aren't they?
4. How should I resolve this issue?
Ans: Please check the points under answer 1 and share your observations.
5. Is it could be of DSO Overwrite and summation function utilization if yes, then how to utilize it?
Ans: Overwriting key figures in a DSO is useful when full loads are in the picture. Are you always going to perform full loads?
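To make the overwrite and summation behavior concrete, here is a minimal Python sketch (a conceptual illustration only, not SAP code; the field names calday/item/comp and the key figure amount are invented) of how records sharing the same DSO key collapse into one active record. This is why 8,136,559 transferred records can become only 12,534 added records when the key is just (0CALDAY, Item No, Company Code):

```python
def activate_dso(records, key_fields, mode="overwrite"):
    """Collapse records by DSO key: 'overwrite' keeps the last record
    per key, 'summation' adds up the key figure 'amount'."""
    active = {}
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key not in active:
            active[key] = dict(rec)
        elif mode == "overwrite":
            active[key].update(rec)      # last record per key wins
        else:                            # mode == "summation"
            active[key]["amount"] += rec["amount"]
    return list(active.values())

rows = [
    {"calday": "20120101", "item": "10", "comp": "1000", "amount": 5.0},
    {"calday": "20120101", "item": "10", "comp": "1000", "amount": 7.0},  # same key
    {"calday": "20120102", "item": "10", "comp": "1000", "amount": 3.0},
]
# 3 transferred records -> 2 added records, because two rows share a key
print(len(activate_dso(rows, ("calday", "item", "comp"))))
```

If many transaction rows legitimately share the same day/item/company combination, overwrite silently discards all but the last one per key; summation, or a finer-grained key (for example one that includes the document number), keeps the totals reconcilable with the source.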
For reference would you like to check this thread: Data fileds and key fields in DSO
Lets see what experts give their inputs.
Thank You... -
Amount of records loaded to dso is not same as in psa
I performed a load from the PSA to a DSO. There are two DataSources under this DSO, and the number of records loaded from the PSA for these two DataSources to the DSO is not consistent. The PSA for the first DataSource has 3k records and the second has 5k records; when I load both DataSources to the DSO, there are fewer records. Does anyone know why this is so?
hi,
The DSO has the overwrite option, and hence you have fewer records.
Check whether you have enough key fields in the DSO, so that you can reduce the number of records getting overwritten.
Ramesh -
Query performance and data loading performance issues
What are the query performance issues we need to take care of? Please explain and let me know the transaction codes.
What are the data-loading performance issues we need to take care of? Please explain and let me know the transaction codes.
Will reward full points.
Regards,
Guru

BW back end
Some tips -
1)Identify long-running extraction processes on the source system. Extraction processes are performed by several extraction jobs running on the source system. The run-time of these jobs affects the performance. Use transaction code SM37 Background Processing Job Management to analyze the run-times of these jobs. If the run-time of data collection jobs lasts for several hours, schedule these jobs to run more frequently. This way, less data is written into update tables for each run and extraction performance increases.
2)Identify high run-times for ABAP code, especially for user exits. The quality of any custom ABAP programs used in data extraction affects the extraction performance. Use transaction code SE30 ABAP/4 Run-time Analysis and then run the analysis for the transaction code RSA3 Extractor Checker. The system then records the activities of the extraction program so you can review them to identify time-consuming activities. Eliminate those long-running activities or substitute them with alternative program logic.
3)Identify expensive SQL statements. If database run-time is high for extraction jobs, use transaction code ST05 Performance Trace. On this screen, select ALEREMOTE user and then select SQL trace to record the SQL statements. Identify the time-consuming sections from the results. If the data-selection times are high on a particular SQL statement, index the DataSource tables to increase the performance of selection (see no. 6 below). While using ST05, make sure that no other extraction job is running with ALEREMOTE user.
4)Balance loads by distributing processes onto different servers if possible. If your site uses more than one BW application server, distribute the extraction processes to different servers using transaction code SM59 Maintain RFC Destination. Load balancing is possible only if the extraction program allows this option.
5)Set optimum parameters for data-packet size. Packet size affects the number of data requests to the database. Set the data-packet size to optimum values for an efficient data-extraction mechanism. To find the optimum value, start with a packet size in the range of 50,000 to 100,000 and gradually increase it. At some point, you will reach the threshold at which increasing packet size further does not provide any performance increase. To set the packet size, use transaction code SBIW BW IMG Menu on the source system. To set the data load parameters for flat-file uploads, use transaction code RSCUSTV6 in BW.
6)Build indexes on DataSource tables based on selection criteria. Indexing DataSource tables improves the extraction performance, because it reduces the read times of those tables.
7)Execute collection jobs in parallel. Like the Business Content extractors, generic extractors have a number of collection jobs to retrieve relevant data from DataSource tables. Scheduling these collection jobs to run in parallel reduces the total extraction time, and they can be scheduled via transaction code SM37 in the source system.
8). Break up your data selections for InfoPackages and schedule the portions to run in parallel. This parallel upload mechanism sends different portions of the data to BW at the same time, and as a result the total upload time is reduced. You can schedule InfoPackages in the Administrator Workbench.
You can upload data from a data target (InfoCube and ODS) to another data target within the BW system. While uploading, you can schedule more than one InfoPackage with different selection options in each one. For example, fiscal year or fiscal year period can be used as selection options. Avoid using parallel uploads for high volumes of data if hardware resources are constrained. Each InfoPackage uses one background process (if scheduled to run in the background) or dialog process (if scheduled to run online) of the application server, and too many processes could overwhelm a slow server.
9). Building secondary indexes on the tables for the selection fields optimizes these tables for reading, reducing extraction time. If your selection fields are not key fields on the table, primary indexes are not much of a help when accessing data. In this case it is better to create secondary indexes with selection fields on the associated table using the ABAP Dictionary to improve selection performance.
10)Analyze upload times to the PSA and identify long-running uploads. When you extract the data using the PSA method, data is written into PSA tables in the BW system. If your data is on the order of tens of millions of records, consider partitioning these PSA tables for better performance, but pay attention to the partition sizes. Partitioning PSA tables improves data-load performance because it's faster to insert data into smaller database tables. Partitioning also provides increased performance for maintenance of PSA tables; for example, you can delete a portion of data faster. You can set the size of each partition in the PSA parameters screen, in transaction code SPRO or RSCUSTV6, so that BW creates a new partition automatically when a threshold value is reached.
11)Debug any routines in the transfer and update rules and eliminate single selects from the routines. Using single selects in custom ABAP routines for selecting data from database tables reduces performance considerably. It is better to use buffers and array operations. When you use buffers or array operations, the system reads data from the database tables and stores it in the memory for manipulation, improving performance. If you do not use buffers or array operations, the whole reading process is performed on the database with many table accesses, and performance deteriorates. Also, extensive use of library transformations in the ABAP code reduces performance; since these transformations are not compiled in advance, they are carried out during run-time.
12)Before uploading a high volume of transaction data into InfoCubes, activate the number-range buffer for dimension IDs. The number-range buffer is a parameter that identifies the number of sequential dimension IDs stored in the memory. If you increase the number range before high-volume data upload, you reduce the number of reads from the dimension tables and hence increase the upload performance. Do not forget to set the number-range values back to their original values after the upload. Use transaction code SNRO to maintain the number range buffer values for InfoCubes.
13)Drop the indexes before uploading high-volume data into InfoCubes. Regenerate them after the upload. Indexes on InfoCubes are optimized for reading data from the InfoCubes. If the indexes exist during the upload, BW reads the indexes and tries to insert the records according to the indexes, resulting in poor upload performance. You can automate the dropping and regeneration of the indexes through InfoPackage scheduling. You can drop indexes in the Manage InfoCube screen in the Administrator Workbench.
14)IDoc (intermediate document) archiving improves the extraction and loading performance and can be applied on both BW and R/3 systems. In addition to IDoc archiving, data archiving is available for InfoCubes and ODS objects.
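The selection-splitting idea in tip 8 can be sketched conceptually (in Python, purely for illustration; in the real system you maintain the selections on the InfoPackage data selection tab): split the overall fiscal-period range into roughly equal chunks and give each parallel InfoPackage one chunk.

```python
def split_selection(periods, n_packages):
    """Split a list of fiscal periods into roughly equal selection
    ranges, one per InfoPackage, so the loads can run in parallel."""
    size = -(-len(periods) // n_packages)  # ceiling division
    return [periods[i:i + size] for i in range(0, len(periods), size)]

periods = ["2011%03d" % m for m in range(1, 13)]  # 2011001 .. 2011012
chunks = split_selection(periods, 4)
# 4 InfoPackages, each selecting 3 consecutive fiscal periods
```

Each chunk then maps to one InfoPackage selection (for example a from/to interval on the fiscal period); keep in mind the caveat in tip 8 about each InfoPackage consuming one background or dialog process when choosing the degree of parallelism.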
Hope it Helps
Chetan
-
Hello BW Experts,
1) Is it advisable to use remote cubes (not for end users, but for internal data reconciliation purposes)?
2) The tables we are looking at are VBAK, VBAP, VBRK, and VBRP from SAP, comparing them to the base sales and billing ODS to produce a reconciliation report. Since the data in the source tables VBAP and VBRP is huge, it gives memory dumps. Is there a better way of doing this? Any suggestions to solve the memory dumps?
3) Are there any measures to improve remote cube performance?
4) Other than the remote cube, is there any other way of doing the reconciliation for the above data?
Any suggestions appreciated.
Thanks,
BWer

Hi BWer,
Remote cube performance is really an issue.
Instead, you can load data up to the PSA and compare it there.
There is a "How to... Validate InfoCube Data By Comparing it with PSA Data":
http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/biw/g-i/how to validate infocube data by comparing it with psa data
You can also quickly build in R/3 a query upon those tables using SQ01 tcode and Quick Viewer button for comparison.
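One way to sidestep the memory dumps is to never pull the raw VBAP/VBRP rows at all: aggregate a key figure per key on each side first, and compare only the totals. A hypothetical Python sketch of that idea (the field names doc and amt are invented for illustration):

```python
from collections import defaultdict

def aggregate(rows, key_fields, measure):
    """Sum one key figure per key combination."""
    totals = defaultdict(float)
    for r in rows:
        totals[tuple(r[f] for f in key_fields)] += r[measure]
    return dict(totals)

def reconcile(source_rows, target_rows, key_fields, measure):
    """Return the keys whose aggregated totals differ between the two sides."""
    src = aggregate(source_rows, key_fields, measure)
    tgt = aggregate(target_rows, key_fields, measure)
    return {k: (src.get(k, 0.0), tgt.get(k, 0.0))
            for k in set(src) | set(tgt)
            if abs(src.get(k, 0.0) - tgt.get(k, 0.0)) > 1e-6}

src_rows = [{"doc": "1", "amt": 10.0}, {"doc": "2", "amt": 5.0}]
tgt_rows = [{"doc": "1", "amt": 10.0}, {"doc": "2", "amt": 4.0}]
diffs = reconcile(src_rows, tgt_rows, ("doc",), "amt")  # only doc 2 differs
```

On the R/3 side the aggregation would ideally be pushed down to the database (a SELECT with SUM and GROUP BY, for example via an SQ01 query), so only the small totals table ever has to be held in memory.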
Best regards,
Eugene -
How can I delete data in the PSA, and how can I partition the PSA?
hi friends,
Can I delete data from the PSA, and how can I partition the PSA?
Thank you,
Suneel

Hi Suneel,
http://help.sap.com/saphelp_nw70/helpdata/en/7d/724d3caa70ea6fe10000000a114084/frameset.htm
Using partitioning, you can separate the dataset of a PSA table into several smaller, physically independent, and redundancy-free units. This separation can mean improved performance when you update data from the PSA. In the <b>Implementation Guide with SAP NetWeaver > Business Intelligence > Connections to Other Systems > Maintain Control Parameters for Data Transfer</b> you define the number of data records needed to create a new partition. Only data records from a complete request are stored in a partition. The specified value is a threshold value.
transaction SPRO to go implementation guide
at database level
http://help.sap.com/saphelp_nw70/helpdata/en/33/dc2038aa3bcd23e10000009b38f8cf/frameset.htm
If you are using this database you can also partition PSA tables. You first have to activate this function using RSADMIN parameter DB4_PSA_PARTITIONING = 'X'. SAP Note 815186 includes more comprehensive information on this.
hope this helps. -
Long time to load data from PSA to DSO - Sequential read RSBKDATA_V
Hi ,
It is taking a long time to load data from the PSA to the DSO. It is doing a sequential read on RSBKDATA_V, and the table contains no data.
We are at SAPKW70105. It started yesterday. There are no changes to the system parameters.
Please advise.
Thanks
Nilesh

Hi Nilesh,
I guess the following SAP Note will help you in this situation.
[1476842 - Performance: read RSBKDATA only when needed|https://websmp107.sap-ag.de/sap/support/notes/1476842]
Just note that the reference to Support Packages is wrong. It is also included in SAP_BW 701 06. -
Hi All,
We are trying to delete PSA requests manually, but for a few PSA tables we are not able to delete the requests from the DataSource.
The PSA Table name is also like this /BIC/B0000535001.
When I try to delete one request from the PSA, it throws the error RSDU_TABLE_DROP_PARTITION_MSS: Error while creating the clone table /BIC/B0000535001.
Can you let us know what the clone table is and how to solve this issue ASAP?
Thanks
Sravani

Hi Sravani,
Please try the steps below; I hope this solves your problem.
There are two main steps you must perform:
(1) fix the reason why the table drop fails and then
(2) remove the shadow tables so processing can continue.
To determine why the partition drop failed, we recommend checking:
1. The SQL Server Error Log - this will tell you if the transaction log ran out of space. The log can be seen via SQL Server tools or transactions ST04/DBACOCKPIT.
2. The Blocking Lock Monitor - as long as you have the monitor job active, this transaction will show if there is blocking occurring. This monitor is in transactions ST04/DBACOCKPIT.
3. The Deadlock Monitor - will show if there was a deadlock at the time of the partition drop. This monitor is in transactions ST04/DBACOCKPIT.
After you have determined the root cause, you must correct it before removing the shadow tables. If the log ran out of space then you must move it to a drive with enough space or add another log file on a different drive with enough space. If the root cause is blocking or deadlocks then first search SAP Notes for known solutions or, secondly, open a customer message in BC-SYS-DB-MSS if you need help. In all other cases, open a BC-SYS-DB-MSS customer message so the development support team can assist you in finding the cause.
After you have corrected the root cause, you must remove the shadow tables. You can do this with the latest version of the SAP ABAP program RSDD_MSSQL_CUBEANALYZE (install the latest version which is attached to SAP Note 991014). After the program is installed, execute it with SE38 and choose Settings -> Expert Mode. Acknowledge the warning, press the Detail button under the Determine All Shadow Objects item and press Start Check. Then select each shadow table and press Delete Clone Table.
For more information regarding SAP BW's use of SQL Server table partitioning, read the white paper titled "Table Partitioning in SAP BI on Microsoft SQL Server 2005/2008". The paper is currently available for download at:
http://www.microsoft.com/isv/sap/technology/interop/bi.aspx
Regards
KP -
Performance problem in loading the master data attributes 0Equipment_attr
Hi Experts,
We have a performance problem loading the master data attributes 0Equipment_attr. It runs as a pseudo delta (full update), and the same InfoPackage runs with different selections. The problem we are facing is that the load runs 2 to 4 hrs during US mornings, but during US nights it runs for 12-22 hrs before finishing successfully, even though it pulls fewer records (which are OK).
When I checked the R/3-side job log (SM37), the job is running late there too. It shows the first and second IDocs arriving quickly, but the third and fourth IDocs come to BW only after a 5-7 hr gap, are saved to the PSA, and then go to the InfoObject.
We have user exits for the DataSource and ABAP routines, but they run quickly and the code is not very complex either.
Can you please explain and suggest the steps on the R/3 side and the BW side? How can I fix this performance issue?
Thanks,
dp

Hi,
check this link for data load performance. Under "Extraction Performance" you will find many useful hints.
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3a699d90-0201-0010-bc99-d5c0e3a2c87b
Regards
Andreas