Steps to load data from R/3 to BW
Hi all,
1. I am loading data from DataSource 2LIS_12_VCITM to a cube. Can somebody explain what steps we need to follow in R/3 and then in BW?
2. Also, I have a Z extract structure for 2LIS_13_VDITM, which has additional fields. How or where should I link 2LIS_13_VDITM with the Z extract structure, and how do the newly added fields get their values?
Thanks,
Gowri
Hi Gowri,
LO Cockpit Step By Step
LO EXTRACTION
- Go to Transaction LBWE (LO Customizing Cockpit)
1). Select Logistics Application
SD Sales BW
Extract Structures
2). Select the desired Extract Structure and deactivate it first.
3). Give the Transport Request number and continue
4). Click on 'Maintenance' to maintain the Extract Structure
Select the fields of your choice and continue
Maintain DataSource if needed
5). Activate the extract structure
6). Give the Transport Request number and continue
- Next step is to Delete the setup tables
7). Go to T-Code SBIW
8). Select Business Information Warehouse
i. Setting for Application-Specific Datasources
ii. Logistics
iii. Managing Extract Structures
iv. Initialization
v. Delete the content of Setup tables (T-Code LBWG)
vi. Select the application (01 – Sales & Distribution) and Execute
- Now, Fill the Setup tables
9). Select Business Information Warehouse
i. Setting for Application-Specific Datasources
ii. Logistics
iii. Managing Extract Structures
iv. Initialization
v. Filling the Setup tables
vi. Application-Specific Setup of statistical data
vii. SD Sales Orders – Perform Setup (T-Code OLI7BW)
Specify a run name, time, and date (use a future date)
Execute
- Check the data in Setup tables at RSA3
- Replicate the DataSource
Use of setup tables:
You fill the setup tables in the R/3 system (the setup transactions are under SBIW) and extract that data to BW. After that, you can run delta extractions by initializing the extractor.
Full loads are always taken from the setup tables.
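As a conceptual sketch (this is not SAP code; all names and structures here are illustrative only), the relationship between the setup tables, the delta queue, and the two load modes can be modeled like this:

```python
# Toy model of LO extraction. Names and structures are illustrative only.
setup_table = []   # filled by a setup run (e.g. OLI7BW); feeds full/init loads
delta_queue = []   # filled by new postings once a delta init has been done

def setup_run(documents):
    """Snapshot the existing documents into the setup tables."""
    setup_table.extend(documents)

def post_document(doc, delta_initialized):
    """A new posting reaches the delta queue only after delta initialization."""
    if delta_initialized:
        delta_queue.append(doc)

def extract(mode):
    """Full and init loads read the setup tables; delta loads drain the queue."""
    if mode in ("full", "init"):
        return list(setup_table)
    records = list(delta_queue)
    delta_queue.clear()
    return records
```

In this model, a full load run after new postings still returns only the setup-table snapshot, which is why the setup tables must be refilled before another full load picks up newer documents.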
Similar Messages
-
CIF STEPS to load data from ECC to APO from Process chain
Hi,
I am trying to create, on the ECC system, a process chain that runs the CIF programs to update the APO material/plant master data in delta mode.
I am not sure the procedure is correct. At the moment the chain contains:
- Step 1: program RIMODGEN with a specific variant
- Step 2: program RIMODACT2 with a specific variant
Is a third step with RIMODACT2 necessary in order to deactivate the integration model?
On the APO side, do we have to insert any other standard step to complete the delta update?
Does anyone have experience with a similar situation?
Thanks,
Veronica
You do not need a third step. You do not need to deactivate the model. In fact, it is not recommended to do so. You do not need any step in APO.
If your CIF setup is correct, the first two steps should be enough.
If by 'delta' you mean field-value changes to materials/plants in an existing integration model, you don't even need to run CFM1/CFM2 if you have set up transaction CFC9 for BTE (online transfer).
Rishi Menon -
Loading data from SAP BW standard table to master data text
Hi,
Can anyone please tell me the steps to load data from SAP BW standard table to master data text?
How should the data source be created?
Thanks in advance.
Hi Aparajith,
Have you loaded the master data attributes yet? Load the attributes first, then the texts.
ECC steps:
- Create a generic DataSource on the standard table or on a Z table (a table/view or a function module will work); there you can select the fields you require.
- Check the data in RSA3.
BI steps:
- Check the source system connection.
- Replicate the DataSource.
- Create or find the master data object for texts (including attributes).
- Map the transformation.
- Create and run the InfoPackage.
- Create the DTP and run it.
- Check the data in the target object.
Thanks,
Phani. -
Step by Step details on how to load data from a flat file
Hi, can anyone explain how to load data from a flat file? Please give me step-by-step details. Thanks.
Hi Sonam,
Loading data from a flat file is very easy compared with the other extraction methods. Here are the steps to load transaction data from a flat file:
Step 1: Create the flat file.
Step 2: Log on to SAP BW (T-code RSA1) and note the flat file source system icon (the PC icon).
Step 3: Create the required InfoObjects.
3.1: Create an InfoArea (Modeling > InfoObjects (root node) > context menu > Create InfoArea).
3.2: Create characteristic and key figure InfoObject catalogs (select the InfoArea > context menu > Create InfoObject Catalog).
3.3: Create the characteristic and key figure InfoObjects according to your requirements and activate them.
Step 4: Create an InfoSource for transaction data, create the transfer structure, and maintain the communication structure.
4.1: First create an application component (select InfoSources under Modeling > context menu on the root node > Create Application Component).
4.2: Create an InfoSource for transaction data (select the application component > context menu > Create InfoSource).
> Select 'Flexible Update' and give the InfoSource a name.
> Continue.
4.3: (Important) Assign the DataSource.
(Expand the application component > expand your InfoSource > context menu > Assign DataSource.)
> Source system: browse and choose your flat file source system (the PC icon).
> Continue.
4.4: Define the DataSource/transfer structure for the InfoSource.
> Select the Transfer Structure tab.
> Fill in the InfoObject fields with the necessary InfoObjects, in the order of the flat file structure.
4.5: Assign transfer rules.
> Select the Transfer Rules tab and click the 'Propose Transfer Rules' icon.
(If the data target is an ODS object, include 0RECORDMODE in the communication structure.)
> Activate.
Step 5: Create the data target (InfoCube/ODS object).
5.1: Create an InfoCube (select your InfoArea > context menu > Create InfoCube).
5.2: Create the InfoCube structure.
> Fill the structure pane with the required InfoObjects (select the InfoSource icon, find your InfoSource, double-click it, and select 'Yes' for the InfoObject assignment).
> Maintain at least one time characteristic (e.g. 0CALDAY).
5.3: Define and assign dimensions for your characteristics.
> Activate.
Step 6: Create update rules for the InfoCube using the InfoSource.
> Select your InfoCube > context menu > Create Update Rules.
> Data source: enter your InfoSource and press the ENTER key.
> Activate your update rules.
Now have a look at the data flow.
Step 7: Schedule and load the data.
7.1: Create an InfoPackage.
> Select InfoSources under Modeling > expand your application component > expand your InfoSource > select the DataSource assignment icon > context menu > Create InfoPackage.
> Give the InfoPackage a description.
> Select your DataSource and press Continue.
> Select the External Data tab.
> Select Client Workstation or Application Server; give the file name, file type, and data separator.
> Select the Processing tab: PSA and then into data targets.
> Data Targets tab: select the data targets and tick the 'Update data target' checkbox.
> Update tab: Full update.
> Schedule tab: select 'Start data load immediately' and click the Start button.
Step 8: Monitor the data.
> Check the data in the PSA.
> Check the data in the data targets.
I hope this helps. -
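The load in step 7 expects the CSV columns in exactly the order of the transfer structure. A minimal sketch of producing such a file with Python's csv module; the field order, values, and file name are made up for illustration:

```python
import csv

# Hypothetical transfer-structure order: customer, material, 0CALDAY, quantity.
# The CSV columns must appear in exactly this order.
rows = [
    ["C100", "M200", "20240101", "5"],
    ["C101", "M201", "20240102", "3"],
]

# newline="" stops the csv module from writing blank lines on Windows
with open("trans_data.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```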
Help - Step by Step Material on How to load Data from flatfile to BI 7.0
Hi BI Experts,
Can anyone send me a document or PDF describing, step by step, how to load data from a flat file into BI 7.0?
Thanks in Advance
Regards
Ramakrishna Kamurthy
1. On logging on, you are placed in the SAP Easy Access screen (Strategic Management / Business Analytics).
1. You see the screen divided in two segments.
2. Follow the left segment.
3. Click on the Business Information Warehouse.
4. Sub Menu → BW Administration.
5. You find various entries in it.
6. Now double click on UG_BW_RSA1-Administrator Workbench
2. Creation of Info Area
1. Click on InfoObjects.
2. Now in the right hand side screen select the InfoObjects (First item on the top). Right click the mouse and select Create InfoArea.
3. A dialog box pops up.
4. Give the InfoArea as ZIVY_Iarea.
5. Give the Long Description as ZIVY Info Area.
3. Create a Catalog:
1. Select the InfoArea just created ZIVY_Iarea.
2. Right click the mouse pointer and select the option Create InfoObject Catalog
3. A dialog box pops up.
4. Give the InfoObjCat = ZIVY_CHCAT
i. Description = ZIVY Character Catalog
ii. InfoObjectType: Check the Character. ( As we are creating the master data, check the Character)
iii. Now at the bottom of this dialog box you see an icon called CREATE. Click this icon.
1. The next screen in front of you is the Edit InfoObject Catalog.
2. Simply click the activate icon present on the Application Toolbar.
2. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_CUST and
i. Description Customer Info Object
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_CUST: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Check the With Master Data and With Texts.
2. Use suitable formats in the fields.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
iv. In the Attributes Tab:
1. We need to create the attribute InfoObjects. These can be done directly from here, or from RSA1 → InfoObjects → ZIVY_Iarea → ZIVY_CHCAT. This is dealt with in a forthcoming step.
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate ZIVY_CUST by clicking the ACTIVATE icon.
2. Creation of InfoObject: ZIVY_CITY
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_CITY and
i. Description: Customer City Name.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_CITY: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 25
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate the InfoObject ZIVY_CITY by clicking the ACTIVATE icon.
2. Creation of InfoObject: ZIVY_STATE
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_STATE and
i. Description: Customer State Name.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_STATE: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 25
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate the InfoObject ZIVY_STATE by clicking the ACTIVATE icon.
2. Creation of InfoObject: ZIVY_CTRY
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_CTRY and
i. Description Customer Country Name.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_CTRY: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data
2. Check the With Texts and make suitable Text Table Properties.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate the InfoObject ZIVY_CTRY by clicking the ACTIVATE icon.
1. Creation of Attributes and InfoObject from the ZIVY_CUST InfoObject:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. In this ZIVY_CHCAT, you see the InfoObject ZIVY_CUST. Double click the InfoObject ZIVY_CUST.
6. You find the screen Change Characteristic ZIVY_CUST:Detail.
7. Click on the tab Attributes
8. In the Character Attribute box, type ZIVY_CITY and press the ENTER key.
9. You see the relevant information regarding this being inserted.
10. Similarly, type ZIVY_STATE, ZIVY_CTRY.
11. Now, we will create another InfoObject ZIVY_TEL, from here.
12. So type ZIVY_TEL and press the ENTER key.
13. A dialog box pops up asking if you want to create an InfoObject.
14. Select Create Attribute As Characteristic and click the check mark.
15. Another window pops up. Give the following:
i. In the General Tab:
1. Long Description: ZIVY Telephone Number
2. Short Description: ZIVY Telephone Number
3. DataType : NUMC
4. Length : 10
5. Uncheck Attribute Only
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies
iv. Now click the TICK MARK
1. Click the ACTIVATE
2. A window pops up.
i. Select activate dependent InfoObjects.
ii. Click the Check Mark.
1. Creation of InfoObject: ZIVY_DATE
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_DATE and
i. Description ZIVY Date.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_DATE: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: DATS
2. Length: 8
ii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate the InfoObject ZIVY_DATE by clicking the ACTIVATE icon.
2. Create ZIVY Application Component:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Right click the mouse pointer and select the entry Create Application Component
5. A window pops up.
i. Application Component: ZIVY_APPCOMP
ii. Long Description : ZIVY Application Component
iii. Click the check mark.
1. Create InfoSource:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Right click the mouse pointer and select the entry Create InfoSource
6. A window pops up:
i. Select the Direct Update of Master Data
ii. InfoObject : ZIVY_CUST
iii. Click the CHECK MARK.
1. Now beneath ZIVY Application Component you see the Customer Info Object.
2. Select the Customer Info Object mentioned above.
3. Right click the mouse pointer, and select the entry Assign DataSource.
4. A window pops up.
i. InfoSource: ZIVY_CUST
ii. Source System: FLATFILE
iii. Check the TICK MARK.
iv. Just click yes for the coming window pop ups.
1. Key thing
i. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First field is ZIVY_CUST,
3. Second field is ZIVY_CITY.
4. And so on.
5. This order is very important: the data is retrieved from a CSV file, and the CSV file must use the same field order.
ii. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
iii. Click the Activate button. By clicking it, you have just activated the ZIVY_CUST master data.
iv. Now in DataSource : select the text.
v. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First Field : 0LANGUAGE
3. Second field is ZIVY_CUST,
4. Third field is 0TXTMD.
5. Click the Transfer Rule adjacent to 0LANGUAGE
6. A window pops up.
a. Check the CONSTANT and give the value as EN
b. Click the TICK MARK.
7. This order is very important: the CSV file must use the same field order. Ignore the first field, as it is a constant.
vi. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
vii. Click the Activate button.
1. Create the DATA required for this MASTER DATA
1. Open EXCEL.
2. Type the following data:
1. Save the file as ZIVY_CUST_MASTER.CSV.
2. The file should be saved in COMMA SEPARATED FORMAT.
3. Open EXCEL.
4. Type the following data:
5. Save the file as ZIVY_CUST_TEXT.CSV.
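The two files above can also be generated programmatically. A sketch using Python's csv module; the sample rows are invented, and the column order follows the transfer rules described earlier (attributes: ZIVY_CUST, ZIVY_CITY, ZIVY_STATE, ZIVY_CTRY, ZIVY_TEL; texts: ZIVY_CUST, 0TXTMD, since 0LANGUAGE is supplied by a constant transfer rule and is not part of the file):

```python
import csv

# Attribute file columns: ZIVY_CUST, ZIVY_CITY, ZIVY_STATE, ZIVY_CTRY, ZIVY_TEL.
# All values below are invented sample data.
master = [
    ["CUST01", "Chennai", "Tamil Nadu", "IN", "9840012345"],
    ["CUST02", "Mumbai", "Maharashtra", "IN", "9820067890"],
]

# Text file columns: ZIVY_CUST, 0TXTMD (0LANGUAGE comes from the constant 'EN').
texts = [
    ["CUST01", "Ivy Traders Chennai"],
    ["CUST02", "Ivy Traders Mumbai"],
]

with open("ZIVY_CUST_MASTER.CSV", "w", newline="") as f:
    csv.writer(f).writerows(master)
with open("ZIVY_CUST_TEXT.CSV", "w", newline="") as f:
    csv.writer(f).writerows(texts)
```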
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_CUST alias Customer Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_CUST_ATTR.
ii. Select ZIVY_CUST_ATTR
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_CUST_MASTER.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_CUST alias Customer Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_CUST_TEXT.
ii. Select ZIVY_CUST_TEXT
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_CUST_TEXT.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_MAT and
i. Description ZIVY Material Info Object
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_MAT: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Check the With Master Data and With Texts.
2. Use suitable formats in the fields.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
iv. In the Attributes Tab:
1. We need to create the attribute InfoObjects. These can be done directly from here, or from RSA1 → InfoObjects → ZIVY_Iarea → ZIVY_CHCAT. This is dealt with in the forthcoming steps (STEP 17).
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate ZIVY_MAT by clicking the ACTIVATE icon.
2. Creation of InfoObject: ZIVY_GROUP
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_GROUP and
i. Description ZIVY Material Group.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_GROUP: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data
2. Check the With Texts and make suitable Text Table Properties.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate the InfoObject ZIVY_GROUP by clicking the ACTIVATE icon.
2. Creation of InfoObject: ZIVY_COLOR
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_COLOR and
i. Description ZIVY Material Color.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_COLOR: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate the InfoObject ZIVY_COLOR by clicking the ACTIVATE icon.
2. Creation of InfoObject from the ZIVY_MAT InfoObject:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. In this ZIVY_CHCAT, you see the InfoObject ZIVY_MAT. Double click the InfoObject ZIVY_MAT.
6. You find the screen Change Characteristic ZIVY_MAT:Detail.
7. Click on the tab Attributes
8. In the Character Attribute box, type ZIVY_GROUP and press the ENTER key.
9. You see the relevant information regarding this being inserted.
10. Similarly, type ZIVY_COLOR.
11. Click the ACTIVATE
3. Create InfoSource:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Right click the mouse pointer and select the entry Create InfoSource
6. A window pops up:
i. Select the Direct Update of Master Data
ii. InfoObject : ZIVY_MAT
iii. Click the CHECK MARK.
1. Now beneath ZIVY Application Component you see the ZIVY Material Info Object.
2. Select the ZIVY Material Info Object mentioned above.
3. Right click the mouse pointer, and select the entry Assign DataSource.
4. A window pops up.
i. InfoSource: ZIVY_MAT
ii. Source System: FLATFILE
iii. Check the TICK MARK.
iv. Just click yes for the coming window pop ups.
1. Key thing
i. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First field is ZIVY_MAT,
3. Second field is ZIVY_GROUP.
4. Third field is ZIVY_COLOR.
5. Check the mapping. If you don't see it:
6. Click on the empty row in the 'Assign InfoObject-Field' group (the box to the right).
7. Type ZIVY_GROUP and ZIVY_COLOR.
8. Check the transfer rules.
9. This order is very important: the data is retrieved from a CSV file, and the CSV file must use the same field order.
ii. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
iii. Click the Activate button. By clicking it, you have just activated the ZIVY_MAT master data.
iv. Now in DataSource : select the text.
v. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First Field : 0LANGUAGE
3. Second field is ZIVY_MAT,
4. Third field is 0TXTSH.
5. This order is very important: the CSV file must use the same field order. Ignore the first field, as it is a constant.
vi. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
vii. Click the Activate button.
1. Create the DATA required for this MASTER DATA
1. Open EXCEL.
2. Type the following data:
1. Save the file as ZIVY_Material_MASTER.CSV.
2. The file should be saved in COMMA SEPARATED FORMAT.
3. Open EXCEL.
4. Type the following data:
5. Save the file as ZIVY_Material_TEXT.CSV.
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_MAT alias ZIVY Material Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_MAT_ATTR.
ii. Select ZIVY_MAT_ATTR
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_Material_MASTER.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_MAT alias ZIVY Material Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_MAT_TEXT.
ii. Select ZIVY_MAT_TEXT
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_MATERIAL_TEXT.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_SPER and
i. Description ZIVY Sales Person Info Object
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_SPER: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 20
ii. In the Master Data/ Texts Tab:
1. Check the With Master Data and With Texts.
2. Use suitable formats in the fields.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
iv. In the Attributes Tab:
1. We need to create the attribute InfoObjects. These can be done directly from here, or from RSA1 → InfoObjects → ZIVY_Iarea → ZIVY_CHCAT. This is dealt with in the forthcoming steps (STEP 24).
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate ZIVY_SPER by clicking the ACTIVATE icon.
2. Creation of InfoObject: ZIVY_SREG
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_SREG and
i. Description ZIVY Sales Region.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_SREG: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 20
ii. In the Master Data/ Texts Tab:
1. Check the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate the InfoObject ZIVY_SREG by clicking the ACTIVATE icon.
2. Creation of Attributes from the ZIVY_SPER InfoObject:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. In this ZIVY_CHCAT, you see the InfoObject ZIVY_SPER. Double click the InfoObject ZIVY_SPER.
6. You find the screen Change Characteristic ZIVY_SPER:Detail.
7. Click on the tab Attributes
8. In the Character Attribute box, type ZIVY_SREG and press the ENTER key.
9. You see the relevant information regarding this being inserted.
10. Click the ACTIVATE
3. Create InfoSource:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Right click the mouse pointer and select the entry Create InfoSource
6. A window pops up:
i. Select the Direct Update of Master Data
ii. InfoObject : ZIVY_SPER
iii. Click the CHECK MARK.
1. Now beneath ZIVY Application Component you see the ZIVY Sales Person Info Object.
2. Select the ZIVY Sales Person Info Object mentioned above.
3. Right click the mouse pointer, and select the entry Assign DataSource.
4. A window pops up.
i. InfoSource: ZIVY_SPER
ii. Source System: FLATFILE
iii. Check the TICK MARK.
iv. Just click yes for the coming window pop ups.
1. Key thing
i. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First field is ZIVY_SPER,
3. Second field is ZIVY_SREG.
4. Check the mapping. If you don't see it:
5. Click on the empty row in the 'Assign InfoObject-Field' group (the box to the right).
6. Check the transfer rules.
7. This order is very important: the data is retrieved from a CSV file, and the CSV file must use the same field order.
ii. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
iii. Click the Activate Button. By clicking the ACTIVATE BUTTON you just activated the ZIVY_SREP MASTER DATA.
iv. Now in DataSource : select the text.
v. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First field: 0LANGUAGE
3. Second field: ZIVY_SPER
4. Third field: 0TXTSH
5. This order is very important, as we retrieve the data from a CSV file. The same order must be used in the CSV file. Ignore the first field, as it is a constant.
vi. Click on the tab DataSource/Transport Structure
1. Check that the field order here also matches the CSV file.
2. If the order is different, just shuffle the rows.
vii. Click the Activate Button. By clicking the ACTIVATE BUTTON you just activated the text DataSource.
1. Create the DATA required for this MASTER DATA
1. Open EXCEL.
2. Type the following data:
1. Save as ZIVY_SREP_MASTER.CSV
2. The file should be saved in COMMA SEPARATED FORMAT.
3. Open EXCEL
4. Type the following data:
5. Save the file as ZIVY_SREP_TEXT.CSV
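The two CSV files above can also be generated with a short script instead of Excel. A minimal sketch, with hypothetical sample values; only the column order is taken from the transfer rules described earlier (ZIVY_SPER, ZIVY_SREG for the attributes file; 0LANGUAGE, ZIVY_SPER, 0TXTSH for the text file):

```python
import csv

# Hypothetical sample rows; only the COLUMN ORDER is prescribed by the
# transfer structure: attributes file = (ZIVY_SPER, ZIVY_SREG),
# text file = (0LANGUAGE, ZIVY_SPER, 0TXTSH).
master_rows = [("SP01", "NORTH"), ("SP02", "SOUTH")]
text_rows = [("EN", "SP01", "Alice"), ("EN", "SP02", "Bob")]

# newline="" keeps csv from inserting extra blank lines on Windows
with open("ZIVY_SREP_MASTER.CSV", "w", newline="") as f:
    csv.writer(f).writerows(master_rows)

with open("ZIVY_SREP_TEXT.CSV", "w", newline="") as f:
    csv.writer(f).writerows(text_rows)
```

Because the flat-file upload maps fields positionally, the preview in the scheduler should then show the columns in the same order as the transfer structure.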
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_SPER alias ZIVY Sales Person Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_SPER_ATTR.
ii. Select ZIVY_SPER_ATTR
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_SPER_MASTER.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_SPER alias ZIVY Sales Person Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_SPER_TEXT.
ii. Select ZIVY_SPER_TEXT
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_SPER_TEXT.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Creation of InfoObject Catalog with Key Figures:
1. Select the InfoArea just created ZIVY_Iarea.
2. Right click the mouse pointer and select the option Create InfoObject Catalog
3. A dialog box pops up.
4. Give the InfoObjCat = ZIVY_KEYCAT
i. Description = ZIVY Key Figures Catalog
ii. InfoObjectType: Check the Key Figures.
iii. Now at the bottom of this dialog box you see an icon called CREATE. Click this icon.
1. The next screen in front of you is the Edit InfoObject Catalog.
2. Simply click the activate icon present on the Application Toolbar.
2. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_KEYCAT, which is the ZIVY Key Figures Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the key figure ZIVY_AMT and
i. Description ZIVY Amount Info Object
ii. Click the TICK mark.
1. The next screen you see is Create Key Figure ZIVY_AMT: Detail.
2. In this screen the following fields have to be modified:
i. In the Type/ Unit Tab
1. DataType: Amount
2. Unit/ Currency = 0Currency
1. Now save your work. Click the SAVE icon on the command toolbar.
2. Activate the ZIVY_AMT, by clicking the icon ACTIVATE.
2. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. -
Unable to load data from an ODS to a Cube
Hi all,
I am trying to load data from an ODS to a cube, and I'm getting the following message at the bottom of the monitor screen: "No corresponding requests were found for request REQU_BZUH9ZPYNL5XLQNB7K8IQDKFS in the ODS/Cube" Message no. RSBM043.
I am unable to load the data. The QM status is yellow. When the process starts, the left-hand side of the monitor shows "0 of 84408 records", which is OK. I have loaded data with this package before and it worked well.
Can you tell me what I have to do? I tried waiting, but nothing changed: no progress and no data in the cube.
Thank you very much and kind regards,
MM.
May be this helps...
When I look at the status, it says that I have a short dump at BW. It was CALL_FUNCTION_REMOTE_ERROR and the short text is "The function module "SUBST_TRANSFER_UPGEVAL" cannot be used for 'remote'". This short dump occurs very much before I shoot the upload.
Thanks again.Hello MM,
Can you do the following..
Make the total status red, then delete the request from the cube.
Go to the ODS -> remove the Data Mart status -> try loading it again.
The error message that you get is common when we try to run an InfoPackage while the source object has no new request (i.e. all the requests available in the source have already been moved to the targets). So please try the steps given above.
Hope it will help! -
To load data from a cube in SCM(APO) system to a cube in BI system.
Experts,
Please let me know whether it is possible to load data from a cube in the SCM (APO) system to a cube in the BI system. If so, explain the steps to perform.
Thanks,
Meera
Hi,
Think of it this way:
To load data from any source we need a DataSource. You can generate an export DataSource for the cube in APO, then use that DataSource for BW extraction. I think it will work; in my case I am directly loading data from APO to BW using the DataSources generated on the planning area.
Why do you need to take data from the APO cube? Is there any condition for that? If it is not mandatory, you can use the same DataSource and load the data to BW. If conditions are applied while loading the data into the APO cube, check whether the same is possible in BW; if it is, use the DataSource and do the same calculation in BW directly.
Thanks
Reddy -
Dump when loading Data from DSO to Cube
Hi,
I get a short dump DBIF_REPO_PART_NOT_FOUND when trying to load data from a DSO to a cube. From DSO to DSO it works fine.
Any ideas?
thx!
Dominik
The dump occurs in the last step of the last data package in the DTP monitor.
I checked the OSS note, but I don't know how to check the kernel version.
Now I used another DSO for testing and got another dump with a different name.
After that I logged in again and it works with my test DSO data...
But it is still not working with the original one...
@Kazmi: What details do you mean exactly?
Automatically trigger the event to load data from Planning cube to Standard Cube
Hello,
We have a below set up in our system..
1. A planning BEx query with which the user makes certain entries and writes data back to the planning cube.
2. An actual reporting cube which gets data from the planning cube above.
Now, what we want to do is to automate the data load from Planning cube to Reporting cube.
This involves 2 things..
1. Change the setting "Change real time load behaviour" of the planning cube to Planning.
2. Trigger the DTP which loads data from Planning cube to reporting cube.
We want to automate the above two steps...
I have tried few things to achieve the same..
1. Created an event in SM64,
2. In the planning cube "Manage" screen, clicked on "Subsequent Processing" and provided the event details (not sure if this is the correct place to provide the event details).
3. Wrote an ABAP program which changes the setting of the planning cube ("Change real time load behaviour" to Loading).
4. Created a process chain, where we have used the event as a start variant, used the ABAP program as a next step, DTP run as the last step..
This, I hoped, would trigger the event as soon as a new request arrives in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube..
This is not working. I don't think the event is triggering, and even if it does, I am not sure whether it will start the process chain automatically. Any ideas please?
hi,
try to do the transformation directly in the input cube by using a characteristic relationship (CR) of type exit; more details:
http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
hope it helps. -
Problem in Loading Data from SQL Server 2000 to Oracle 10g
Hi All,
I am a university student using ODI for my final project on real-time data warehousing. I am trying to load data from MS SQL Server 2000 into an Oracle 10g target table. Everything goes fine until I execute the interface for the initial load. When I choose the CKM Oracle (Create unique index on the I$ table) KM, the following step fails:
21 - Integration - Prj_Dim_RegionInterface - Create Unique Index on flow table
Where Prj_Dim_Region is the name of my target table in Oracle.
The error message is:
955 : 42000 : java.sql.SQLException: ORA-00955: name is already used by an existing object
java.sql.SQLException: ORA-00955: name is already used by an existing object
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:282)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:639)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:185)
at oracle.jdbc.driver.T4CPreparedStatement.execute_for_rows(T4CPreparedStatement.java:633)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1086)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:2984)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3057)
at com.sunopsis.sql.SnpsQuery.executeUpdate(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execStdOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
I am using a surrogate key column in my target table along with the natural key. The natural key is populated by the primary key of my source table, but for the surrogate key, I have created a sequence in my Oracle schema (where the target table exists) and have used the following code for the mapping:
<%=snpRef.getObjectName( "L" , "SQ_PRJ_DIM_REGION" , "D" )%>.nextval
I have chosen to execute this code on target.
Among my attempts to solve this problem was to set the Create Index option of the CKM Oracle (Create Index for the I$ Table) to No so that it won't create any index on the flow table. I also tried to use the simple CKM Oracle KM. Both solutions allowed the interface to execute successfully without any errors, but the data was not loaded into the target table.
When I right-click on the Prj_Dim_Region data store and choose Data, it shows empty. Pressing the SQL button in this data store opens a "New Query" dialog box where I see this query:
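As a side note on the ORA-00955 above: the error simply means the CREATE UNIQUE INDEX statement in the KM step reuses a name that already exists in the target schema, typically a leftover index or I$ work table from a previous failed session. A small sketch of the name-collision check (the object names below are hypothetical, not taken from the ODI repository):

```python
# Sketch: ORA-00955 is raised when a CREATE statement reuses a name already
# present in the schema. Before re-running the KM step, one can compare the
# objects the KM wants to create against what already exists (e.g. the
# contents of USER_OBJECTS). The naming convention here is an assumption
# for illustration only.

def find_conflicts(new_objects, existing_objects):
    """Return the names a CREATE would fail on with ORA-00955.

    Oracle stores unquoted identifiers in upper case, so the
    comparison is done case-insensitively.
    """
    existing = {name.upper() for name in existing_objects}
    return sorted(o for o in new_objects if o.upper() in existing)

# Simulated schema contents and the object the KM wants to create:
existing = ["PRJ_DIM_REGION", "I$_PRJ_DIM_REGION", "IDX_I$_PRJ_DIM_REGION"]
wanted = ["IDX_I$_PRJ_DIM_REGION"]

print(find_conflicts(wanted, existing))  # → ['IDX_I$_PRJ_DIM_REGION']
```

If such a leftover object is found, dropping it (or enabling the KM option that drops the work table at the end of the session) usually lets the "Create Unique Index on flow table" step succeed again.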
select * from NOVELTYFURNITUREDW.PRJ_DIM_REGION
But when i press OK to run it, I get this error message:
java.lang.IllegalArgumentException: Row index out of range
at javax.swing.JTable.boundRow(Unknown Source)
at javax.swing.JTable.setRowSelectionInterval(Unknown Source)
at com.borland.dbswing.JdbTable.accessChange(JdbTable.java:2959)
at com.borland.dx.dataset.AccessEvent.dispatch(Unknown Source)
at com.borland.jb.util.EventMulticaster.dispatch(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.open(Unknown Source)
at com.borland.dx.dataset.StorageDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.executeQuery(Unknown Source)
at com.sunopsis.graphical.frame.a.cg.actionPerformed(cg.java)
at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
at javax.swing.AbstractButton$ForwardActionEvents.actionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source)
at java.awt.AWTEventMulticaster.mouseReleased(Unknown Source)
at java.awt.Component.processMouseEvent(Unknown Source)
at java.awt.Component.processEvent(Unknown Source)
at java.awt.Container.processEvent(Unknown Source)
at java.awt.Component.dispatchEventImpl(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Window.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.EventQueue.dispatchEvent(Unknown Source)
at java.awt.EventDispatchThread.pumpOneEventForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.run(Unknown Source)
I do not understand what the problem is and have wasted days trying to figure it out. Any help will be highly appreciated, as the deadline for this project is very close.
Thank you so much in advance.
Neel
Hi Cezar,
Can you please help me with this scenario?
I have one Oracle data model with 19 source tables and one SQL Server data model with 10 target tables. I have created 10 interfaces which use JournalizedDataOnly on one of the tables in the interface; e.g. in the interface for the DimCustomer target table I have 2 tables, namely Customer and Address, but the journalizing filter appears only on the Customer table and this option is disabled for Address automatically.
Now I want to create a package using OdiWaitForLog event detection. Is it possible to put all these 10 interfaces in just one package to populate the target tables? It works fine when I have only one interface and use the name of one table in the interface for the Table Name parameter of the OdiWaitForLogData event, but when I try a comma-separated list of table names [Customer, Address], this error happens:
java.sql.SQLException: ORA-00942: table or view does not exist
and if I use this method <%=odiRef.getObjectName("L","model_code","logical_schema","D")%>, I get this error
"-CDC_SET_NAME=Exception getObjectName("L", "model_code", "logical_schema", "D") : SnpLSchema.getLSchemaByName : SnpLschema does not exist"
Please let me know how to make it work?
Do I need to create separate data models each including only those tables which appear in their corresponding interface and package? Or do I need to create multiple packages each with only one journalized interface to populate only one target table?
Thank you for your time in advance.
Regards,
Neel -
Unable to load data from FDMEE staging table to HFM tables
Hi,
We have installed EPM 11.1.2.3 with all latest related products (ODI/FDMEE) in our development environment
We are in the process of loading data from EBS R12 to HFM using ERPi (Data Management in EPM 11.1.2.3). We can import and validate the data, but when we try to export the data to HFM, the process keeps running for hours (it neither gives an error nor completes).
When we check the process details in the ODI work repository, the following processes appear:
COMM_LOAD_BALANCES - Running (still running after 1 day)
EBS_GL_LOAD_BALANCES_DATA - Successful
COMM_LOAD_BALANCES - Successful
We can load data into the staging table of the FDMEE database schema, and we are even able to drill through to the source system (EBS R12) from the Data Load Workbench, but we are not able to load the data into the HFM application.
Log details from the process are below.
2013-11-05 17:04:59,747 INFO [AIF]: FDMEE Process Start, Process ID: 31
2013-11-05 17:04:59,747 INFO [AIF]: FDMEE Logging Level: 4
2013-11-05 17:04:59,748 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_31.log
2013-11-05 17:04:59,748 INFO [AIF]: User:admin
2013-11-05 17:04:59,748 INFO [AIF]: Location:VisionLoc (Partitionkey:1)
2013-11-05 17:04:59,749 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-05 17:04:59,749 INFO [AIF]: Category Name:VisionCat (Category key:3)
2013-11-05 17:04:59,749 INFO [AIF]: Rule Name:VisionRule (Rule ID:2)
2013-11-05 17:05:00,844 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-05 17:05:00,844 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-05 17:05:02,910 INFO [AIF]: -------START IMPORT STEP-------
2013-11-05 17:05:02,953 INFO [AIF]: -------END IMPORT STEP-------
2013-11-05 17:05:03,030 INFO [AIF]: -------START EXPORT STEP-------
2013-11-05 17:05:03,108 INFO [AIF]:
Move Data for Period 'JAN'
Any help on above is much appreciated.
Thank you
Regards
Praneeth
Hi,
I have followed steps 1 & 2 above. Now the log shows something like below:
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Process Start, Process ID: 22
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Logging Level: 4
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
2013-11-05 09:47:31,180 INFO [AIF]: User:admin
2013-11-05 09:47:31,180 INFO [AIF]: Location:VisionLoc (Partitionkey:1)
2013-11-05 09:47:31,180 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-05 09:47:31,180 INFO [AIF]: Category Name:VisionCat (Category key:3)
2013-11-05 09:47:31,181 INFO [AIF]: Rule Name:VisionRule (Rule ID:2)
2013-11-05 09:47:32,378 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-05 09:47:32,378 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-05 09:47:34,652 INFO [AIF]: -------START IMPORT STEP-------
2013-11-05 09:47:34,698 INFO [AIF]: -------END IMPORT STEP-------
2013-11-05 09:47:34,744 INFO [AIF]: -------START EXPORT STEP-------
2013-11-05 09:47:34,828 INFO [AIF]:
Move Data for Period 'JAN'
2013-11-08 11:49:10,478 INFO [AIF]: FDMEE Process Start, Process ID: 22
2013-11-08 11:49:10,493 INFO [AIF]: FDMEE Logging Level: 5
2013-11-08 11:49:10,493 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
2013-11-08 11:49:10,494 INFO [AIF]: User:admin
2013-11-08 11:49:10,494 INFO [AIF]: Location:VISIONLOC (Partitionkey:1)
2013-11-08 11:49:10,494 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-08 11:49:10,495 INFO [AIF]: Category Name:VISIONCAT (Category key:1)
2013-11-08 11:49:10,495 INFO [AIF]: Rule Name:VISIONRULE (Rule ID:1)
2013-11-08 11:49:11,903 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-08 11:49:11,909 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-08 11:49:14,037 INFO [AIF]: -------START IMPORT STEP-------
2013-11-08 11:49:14,105 INFO [AIF]: -------END IMPORT STEP-------
2013-11-08 11:49:14,152 INFO [AIF]: -------START EXPORT STEP-------
2013-11-08 11:49:14,178 DEBUG [AIF]: CommData.exportData - START
2013-11-08 11:49:14,183 DEBUG [AIF]: CommData.getRuleInfo - START
2013-11-08 11:49:14,188 DEBUG [AIF]:
SELECT brl.RULE_ID
,br.RULE_NAME
,brl.PARTITIONKEY
,brl.CATKEY
,part.PARTVALGROUP
,br.SOURCE_SYSTEM_ID
,ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,app.APPLICATION_ID
,app.TARGET_APPLICATION_NAME
,app.TARGET_APPLICATION_TYPE
,app.DATA_LOAD_METHOD
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,br.INCL_ZERO_BALANCE_FLAG
,br.PERIOD_MAPPING_TYPE
,br.INCLUDE_ADJ_PERIODS_FLAG
,br.BALANCE_TYPE ACTUAL_FLAG
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.BALANCE_METHOD_CODE
,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
,br.CURRENCY_CODE
,br.BAL_SEG_VALUE_OPTION_CODE
,brl.EXECUTION_MODE
,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
,CASE
WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
ELSE 'NONE'
END LEDGER_GROUP_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
,br.LEDGER_GROUP
,(SELECT brd.DETAIL_CODE
FROM AIF_BAL_RULE_DETAILS brd
WHERE brd.RULE_ID = br.RULE_ID
AND brd.DETAIL_TYPE = 'LEDGER'
) PS_LEDGER
,CASE lg.LEDGER_TEMPLATE
WHEN 'COMMITMENT' THEN 'Y'
ELSE 'N'
END KK_FLAG
,p.LAST_UPDATED_BY
,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
,p.EPM_ORACLE_INSTANCE
FROM AIF_PROCESSES p
INNER JOIN AIF_BAL_RULE_LOADS brl
ON brl.LOADID = p.PROCESS_ID
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = brl.APPLICATION_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = br.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = br.LEDGER_GROUP
WHERE p.PROCESS_ID = 22
2013-11-08 11:49:14,195 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 2
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 1
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-11-08 11:49:14,197 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': 
{'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
2013-11-08 11:49:14,197 DEBUG [AIF]: CommData.getRuleInfo - END
2013-11-08 11:49:14,224 DEBUG [AIF]: CommData.insertPeriods - START
2013-11-08 11:49:14,228 DEBUG [AIF]: CommData.getLedgerListAndMap - START
2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - START
2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - END
2013-11-08 11:49:14,229 DEBUG [AIF]:
SELECT l.SOURCE_LEDGER_ID
,l.SOURCE_LEDGER_NAME
,l.SOURCE_COA_ID
,l.CALENDAR_ID
,'0' SETID
,l.PERIOD_TYPE
,NULL LEDGER_TABLE_NAME
FROM AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE br.RULE_ID = 1
AND l.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = br.SOURCE_LEDGER_ID
2013-11-08 11:49:14,230 DEBUG [AIF]: CommData.getLedgerListAndMap - END
2013-11-08 11:49:14,232 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
)
SELECT DISTINCT brl.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,prd.PERIOD_ID
,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
,COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) GL_PERIOD_YEAR
,COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) GL_PERIOD_NUM
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') GL_PERIOD_NAME
,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) AS VARCHAR(38)),'0') GL_PERIOD_CODE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
,COALESCE(ppa.PERIODTARGET, pp.PERIODTARGET) PERIODTARGET
,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) IMP_ENTITY_ID
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') IMP_ENTITY_NAME
,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) TRANS_ENTITY_ID
,pp.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,2202 SOURCE_LEDGER_ID
FROM (
AIF_BAL_RULE_LOADS brl
INNER JOIN TPOVCATEGORY pc
ON pc.CATKEY = brl.CATKEY
INNER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODFREQ = pc.CATFREQ
AND pp.PERIODKEY >= brl.START_PERIODKEY
AND pp.PERIODKEY <= brl.END_PERIODKEY
LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
ON ppa.PERIODKEY = pp.PERIODKEY
AND ppa.PERIODFREQ = pp.PERIODFREQ
AND ppa.INTSYSTEMKEY = 'OASIS'
INNER JOIN TPOVPERIODSOURCE ppsrc
ON ppsrc.PERIODKEY = pp.PERIODKEY
AND ppsrc.MAPPING_TYPE = 'EXPLICIT'
AND ppsrc.SOURCE_SYSTEM_ID = 2
AND ppsrc.CALENDAR_ID IN ('29067')
LEFT OUTER JOIN AIF_GL_PERIODS_STG prd
ON prd.PERIOD_ID = ppsrc.PERIOD_ID
AND prd.SOURCE_SYSTEM_ID = ppsrc.SOURCE_SYSTEM_ID
AND prd.CALENDAR_ID = ppsrc.CALENDAR_ID
AND prd.SETID = '0'
AND prd.PERIOD_TYPE = '507'
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
WHERE brl.LOADID = 22
ORDER BY pp.PERIODKEY
,GL_EFFECTIVE_PERIOD_NUM
2013-11-08 11:49:14,235 DEBUG [AIF]: CommData.insertPeriods - END
2013-11-08 11:49:14,240 DEBUG [AIF]: CommData.moveData - START
2013-11-08 11:49:14,242 DEBUG [AIF]: CommData.getPovList - START
2013-11-08 11:49:14,242 DEBUG [AIF]:
SELECT PARTITIONKEY
,PARTNAME
,CATKEY
,CATNAME
,PERIODKEY
,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
,RULE_ID
,RULE_NAME
FROM (
SELECT DISTINCT brl.PARTITIONKEY
,part.PARTNAME
,brl.CATKEY
,cat.CATNAME
,pprd.PERIODKEY
,pp.PERIODDESC
,brl.RULE_ID
,br.RULE_NAME
FROM AIF_BAL_RULE_LOADS brl
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = brl.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = brl.CATKEY
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = brl.LOADID
LEFT OUTER JOIN TPOVPERIOD pp
ON pp.PERIODKEY = pprd.PERIODKEY
WHERE brl.LOADID = 22
) q
ORDER BY PARTITIONKEY
,CATKEY
,PERIODKEY
,RULE_ID
2013-11-08 11:49:14,244 DEBUG [AIF]: CommData.getPovList - END
2013-11-08 11:49:14,245 INFO [AIF]:
Move Data for Period 'JAN'
2013-11-08 11:49:14,246 DEBUG [AIF]:
UPDATE TDATASEG
SET LOADID = 22
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,320 DEBUG [AIF]: Number of Rows updated on TDATASEG: 1842
2013-11-08 11:49:14,320 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_AUDIT (
LOADID
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,SEGMENT_FILTER_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,EXPORT_TO_TARGET_FLAG
)
SELECT DISTINCT 22
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,SEGMENT_FILTER_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,'N'
FROM AIF_APPL_LOAD_AUDIT
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,321 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_AUDIT: 1
2013-11-08 11:49:14,322 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_PRD_AUDIT (
LOADID
,GL_PERIOD_ID
,GL_PERIOD_YEAR
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
)
SELECT DISTINCT 22
,GL_PERIOD_ID
,GL_PERIOD_YEAR
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
FROM AIF_APPL_LOAD_PRD_AUDIT
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,323 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_PRD_AUDIT: 1
2013-11-08 11:49:14,325 DEBUG [AIF]: CommData.moveData - END
2013-11-08 11:49:14,332 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,332 DEBUG [AIF]:
SELECT tlp.PROCESSSTATUS
,tlps.PROCESSSTATUSDESC
,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
FROM TLOGPROCESS tlp
,TLOGPROCESSSTATES tlps
WHERE tlp.PARTITIONKEY = 1
AND tlp.CATKEY = 1
AND tlp.PERIODKEY = '2012-01-01'
AND tlp.RULE_ID = 1
AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
2013-11-08 11:49:14,336 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 20
,PROCESSEXP = 0
,PROCESSENTLOAD = 0
,PROCESSENTVAL = 0
,PROCESSEXPNOTE = NULL
,PROCESSENTLOADNOTE = NULL
,PROCESSENTVALNOTE = NULL
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,338 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,339 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - START
2013-11-08 11:49:14,341 DEBUG [AIF]:
DELETE FROM TDATASEG
WHERE LOADID = 22
AND (
PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
AND VALID_FLAG = 'N'
)
2013-11-08 11:49:14,342 DEBUG [AIF]: Number of Rows deleted from TDATASEG: 0
2013-11-08 11:49:14,342 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - END
2013-11-08 11:49:14,344 DEBUG [AIF]: CommData.updateAppLoadAudit - START
2013-11-08 11:49:14,344 DEBUG [AIF]:
UPDATE AIF_APPL_LOAD_AUDIT
SET EXPORT_TO_TARGET_FLAG = 'Y'
WHERE LOADID = 22
AND PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY= '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,345 DEBUG [AIF]: Number of Rows updated on AIF_APPL_LOAD_AUDIT: 1
2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateAppLoadAudit - END
2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,346 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 21
,PROCESSEXP = 1
,PROCESSEXPNOTE = 'Export Successful'
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.exportData - END
2013-11-08 11:49:14,404 DEBUG [AIF]: HfmData.loadData - START
2013-11-08 11:49:14,404 DEBUG [AIF]: CommData.getRuleInfo - START
2013-11-08 11:49:14,404 DEBUG [AIF]:
SELECT brl.RULE_ID
,br.RULE_NAME
,brl.PARTITIONKEY
,brl.CATKEY
,part.PARTVALGROUP
,br.SOURCE_SYSTEM_ID
,ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,app.APPLICATION_ID
,app.TARGET_APPLICATION_NAME
,app.TARGET_APPLICATION_TYPE
,app.DATA_LOAD_METHOD
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,br.INCL_ZERO_BALANCE_FLAG
,br.PERIOD_MAPPING_TYPE
,br.INCLUDE_ADJ_PERIODS_FLAG
,br.BALANCE_TYPE ACTUAL_FLAG
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.BALANCE_METHOD_CODE
,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
,br.CURRENCY_CODE
,br.BAL_SEG_VALUE_OPTION_CODE
,brl.EXECUTION_MODE
,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
,CASE
WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
ELSE 'NONE'
END LEDGER_GROUP_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
,br.LEDGER_GROUP
,(SELECT brd.DETAIL_CODE
FROM AIF_BAL_RULE_DETAILS brd
WHERE brd.RULE_ID = br.RULE_ID
AND brd.DETAIL_TYPE = 'LEDGER'
) PS_LEDGER
,CASE lg.LEDGER_TEMPLATE
WHEN 'COMMITMENT' THEN 'Y'
ELSE 'N'
END KK_FLAG
,p.LAST_UPDATED_BY
,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
,p.EPM_ORACLE_INSTANCE
FROM AIF_PROCESSES p
INNER JOIN AIF_BAL_RULE_LOADS brl
ON brl.LOADID = p.PROCESS_ID
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = brl.APPLICATION_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = br.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = br.LEDGER_GROUP
WHERE p.PROCESS_ID = 22
2013-11-08 11:49:14,406 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 2
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 1
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-11-08 11:49:14,407 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': 
{'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getRuleInfo - END
2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getPovList - START
2013-11-08 11:49:14,407 DEBUG [AIF]:
SELECT PARTITIONKEY
,PARTNAME
,CATKEY
,CATNAME
,PERIODKEY
,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
,RULE_ID
,RULE_NAME
FROM (
SELECT DISTINCT brl.PARTITIONKEY
,part.PARTNAME
,brl.CATKEY
,cat.CATNAME
,pprd.PERIODKEY
,pp.PERIODDESC
,brl.RULE_ID
,br.RULE_NAME
FROM AIF_BAL_RULE_LOADS brl
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = brl.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = brl.CATKEY
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = brl.LOADID
LEFT OUTER JOIN TPOVPERIOD pp
ON pp.PERIODKEY = pprd.PERIODKEY
WHERE brl.LOADID = 22
) q
ORDER BY PARTITIONKEY
,CATKEY
,PERIODKEY
,RULE_ID
2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.getPovList - END
2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,409 DEBUG [AIF]:
SELECT tlp.PROCESSSTATUS
,tlps.PROCESSSTATUSDESC
,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
FROM TLOGPROCESS tlp
,TLOGPROCESSSTATES tlps
WHERE tlp.PARTITIONKEY = 1
AND tlp.CATKEY = 1
AND tlp.PERIODKEY = '2012-01-01'
AND tlp.RULE_ID = 1
AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
2013-11-08 11:49:14,410 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 30
,PROCESSENTLOAD = 0
,PROCESSENTVAL = 0
,PROCESSENTLOADNOTE = NULL
,PROCESSENTVALNOTE = NULL
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,411 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,412 DEBUG [AIF]:
SELECT COALESCE(usr.PROFILE_OPTION_VALUE, app.PROFILE_OPTION_VALUE, site.PROFILE_OPTION_VALUE) PROFILE_OPTION_VALUE
FROM AIF_PROFILE_OPTIONS po
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES site
ON site.PROFILE_OPTION_NAME = po.PROFILE_OPTION_NAME
AND site.LEVEL_ID = 1000
AND site.LEVEL_VALUE = 0
AND site.LEVEL_ID <= po.MAX_LEVEL_ID
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES app
ON app.PROFILE_OPTION_NAME = site.PROFILE_OPTION_NAME
AND app.LEVEL_ID = 1005
AND app.LEVEL_VALUE = NULL
AND app.LEVEL_ID <= po.MAX_LEVEL_ID
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES usr
ON usr.PROFILE_OPTION_NAME = usr.PROFILE_OPTION_NAME
AND usr.LEVEL_ID = 1010
AND usr.LEVEL_VALUE = NULL
AND usr.LEVEL_ID <= po.MAX_LEVEL_ID
WHERE po.PROFILE_OPTION_NAME = 'JAVA_HOME'
2013-11-08 11:49:14,413 DEBUG [AIF]: HFM Load command:
%EPM_ORACLE_HOME%/products/FinancialDataQuality/bin/HFM_LOAD.vbs "22" "a9E3uvSJNhkFhEQTmuUFFUElfdQgKJKHrb1EsjRsL6yZJlXsOFcVPbGWHhpOQzl9zvHoo3s%2Bdq6R4yJhp0GMNWIKMTcizp3%2F8HASkA7rVufXDWEpAKAK%2BvcFmj4zLTI3rtrKHlVEYrOLMY453J2lXk6Cy771mNSD8X114CqaWSdUKGbKTRGNpgE3BfRGlEd1wZ3cra4ee0jUbT2aTaiqSN26oVe6dyxM3zolc%2BOPkjiDNk1MqwNr43tT3JsZz4qEQGF9d39DRN3CDjUuZRPt4SEKSSL35upncRJiw2uBOtV%2FvSuGLNpZ2J79v1%2Ba1Oo9c4Xhig7SFYbE6Jwk1yXRJLTSw0DKFu%2FEpcdjpOnx%2F6YawMBNIa5iu5L637S91jT1Xd3EGmxZFq%2Bi6bHdCJAC8g%3D%3D" "C%3A%5COracle%5CMiddleware%5Cuser_projects%5Cepmsystem1" "%25EPM_ORACLE_HOME%25%2F..%2Fjdk160_35"
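For reference, the arguments on that HFM_LOAD.vbs command line are URL-encoded (and the second argument is additionally encrypted, so it cannot be decoded this way). As a small illustration, the two path arguments can be recovered with a few lines of Python; the encoded strings below are copied from the command line itself:

```python
from urllib.parse import unquote

# Third argument of the HFM_LOAD.vbs call: the EPM Oracle instance path.
instance = unquote("C%3A%5COracle%5CMiddleware%5Cuser_projects%5Cepmsystem1")
print(instance)  # C:\Oracle\Middleware\user_projects\epmsystem1

# Fourth argument: the JDK location, still containing %EPM_ORACLE_HOME%,
# which the shell expands at run time.
jdk = unquote("%25EPM_ORACLE_HOME%25%2F..%2Fjdk160_35")
print(jdk)  # %EPM_ORACLE_HOME%/../jdk160_35
```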
Is there anywhere we need to set the EPM home? Also, please let me know what PSU1 is.
Thanks in advance.
Praneeth -
Loading data from flat file into 0EMPLOYEE
Dear,
I want to see the 'monthly change in the number of employees' of my company in a BW report.
I want to do this by loading data from a dummy flat file into 0EMPLOYEE and afterwards into cube 0PAPA_C02.
I've searched the SAP Help and Service websites, but nowhere could I find a useful user guide.
Could you explain to me how this job needs to be done? Are there certain steps that need to be taken?
Here are a few of my questions:
1: What fields and data do I have to fill in my flat file?
2: How does the time-dependency of the employee master data work?
3: Why do you only have to map 0CALDAY and 0EMPLOYEE from InfoSource 0HR_PA_PA_1 to the corresponding fields of cube 0PAPA_C02?
4: How is the key figure 0HDCNT_VC filled automatically?
(The activation of cube 0PAPA_C02 and its data flow is already done.)
Or maybe you know a user guide that explains the whole picture; could you send me its link?
Thanks a lot in advance,
Julie de Meyer

Hi,
0EMPLOYEE is master data: you need to extract data every month. Once the extraction is done, you need to activate the master data to see the results; otherwise it will display the old results.
You can also extract data for the 0HR_PA_0 and 0HR_PA_1 cubes. After that you can do the reporting based on them.
You can also try some SAP standard BI reports.
Headcount is calculated based on the employment percentage maintained in infotype 0007 (Planned Working Time). If it is 100, it is considered as a headcount of 1.
You can check:
http://help.sap.com/saphelp_nw2004s/helpdata/en/63/351e3c6a2fc036e10000000a114084/frameset.htm
Regards
Nilesh -
How to load data from PSA to cube &amp; DSO at the same time using DTP in BI 7?
HI all,
I am new to BI 7. How do I load data to a DSO and an InfoCube at the same time using a DTP?
Please provide the steps to load, and specify which update mode (full or delta) I should use and which one is best.
Please advise.
Thanks & Regards,
Kiran m.
Message was edited by: kiran manyam

Below are the basic steps we follow in any BI 2004s system:
1) Create the DataSource. Here you can set/check the source system fields.
2) Create a transformation for that DataSource (no more update rules/transfer rules).
2.1) While creating the transformation for the DataSource, it will ask you for the data target name, so just assign where you want to update your data.
DataSource -> Transformation -> Data Target
Now, if you want to load data into a data target from a source system DataSource:
1) Create an InfoPackage for that DataSource. If you are creating an InfoPackage for a new DataSource, it will only allow you to update up to the PSA; you will see all other options as disabled.
2) Now create a DTP (Data Transfer Process) for that DataSource.
3) Now schedule the InfoPackage; once the data is loaded to the PSA, you can execute your DTP, which will load the data to the data target.
If you are loading data from one data target to another, there is no need to use the PSA; you can directly execute the DTP in that case.
Data Source -> Transformation (IP/DTP) -> Data Target1 -> DTP ->Data Target 2
Use the below link for detailed example:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/fc61e12d-0a01-0010-2883-e2fc63ef729b
Infosources are no more mandatory with BI 7.0, below is the link to scenarios where we use infosources:
http://help.sap.com/saphelp_nw04s/helpdata/en/44/0243dd8ae1603ae10000000a1553f6/content.htm
Full or delta depends on your requirement.
Check the thread below to understand the difference better:
difference between the various loads
Hope it helps.
Message was edited by: sriram viswanathan -
Error while loading data from write optimized ODS to cube
Hi All,
I am loading data from a write-optimized ODS to a cube.
I have generated the export DataSource and
scheduled the InfoPackage with one selection for a full load.
It then gave me the following error in Transfer IDocs &amp; TRFC:
Info IDOC 1: IDOC with errors added
Info IDOC 2: IDOC with errors added
Info IDOC 3: IDOC with errors added
Info IDOC 4: IDOC with errors added
Data package 1 arrived in BW; Processing: selected number does not agree with transferred number
The Processing step below is green and
shows an update of 4 new records to data package 1.
Please provide inputs for resolving this.
Thanks & Regards,
Rashmi.

Please let me know: what more details do you need?
If I press F1 for error details, I get the following message:
Messages from source system
see also Processing Steps Request
These messages are sent by IDoc from the source system. Both the extractor itself and the service API can send messages. When errors occur, several messages are usually sent together.
From the source system, there are several types of messages that can be differentiated by the so-called Info IDoc status. The IDoc with status 2 plays a particular role here: it describes the number of records that have been extracted in the source system and sent to BI. The number of records received in BI is checked against this information.
Thanks & Regards,
Rashmi. -
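The check described in that help text (the record count announced by the status-2 Info IDoc versus the count actually received in BI) amounts to a simple count comparison per data package. A minimal sketch, with made-up record counts purely for illustration:

```python
def check_data_package(selected: int, transferred: int) -> str:
    """Compare the record count announced by the status-2 Info IDoc
    (selected in the source system) with the count received in BI."""
    if selected == transferred:
        return "OK"
    return ("Error: selected number (%d) does not agree with "
            "transferred number (%d)" % (selected, transferred))

# Example: the source system announced 4 records, but the request only
# transferred 0 - this is the situation that turns the request red.
print(check_data_package(4, 0))
```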
Error While Loading data from HFM to Essbase
Hi,
I am trying to load data from an HFM application to an Essbase application. The LKM I am using is "HFM Data to SQL" and the IKM is "SQL to Hyperion Essbase (DATA)". I am also using a couple of mapping expressions like {CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")}, {CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")}, etc.
The error I am getting looks like this:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 23, in ?
com.hyperion.odi.essbase.ODIEssbaseException: Invalid column type specified for Data column [Account]
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
at org.python.core.PyMethod.__call__(PyMethod.java)
at org.python.core.PyObject.__call__(PyObject.java)
at org.python.core.PyInstance.invoke(PyInstance.java)
I am mapping the Account dimension of HFM to the Account dimension of Essbase. Both the source and target column types are "string" with length 80. Instead of keeping the target as the Essbase application, if I use an Oracle table whose columns are exactly like the Essbase application, ODI loads the data perfectly with proper conversion.
Can anybody please help me out?
Many thanks.
N. Laha.

Hi John,
Thank you very much for your response. Pasting here the SQL generated at step "8. Integration - Load data into Essbase" in the Operator (Description tab), for your perusal:
from com.hyperion.odi.common import ODIConstants
from com.hyperion.odi.connection import HypAppConnectionFactory
from java.lang import Class
from java.lang import Boolean
from java.sql import *
from java.util import HashMap
# Get the select statement on the staging area:
sql= """select CASEWHEN(C3_ACCOUNT = '3110', 'PL10100', C3_ACCOUNT) "Account",CASEWHEN(C4_PERIOD = 'January', 'Jan', C4_PERIOD) "TimePeriod",'HSP_Inputvalue' "HSP_Rates",C8_SCENARIO "Scenario",CASEWHEN(C2_YEAR = '2007', 'FY09', C2_YEAR) "Years",CASEWHEN(C1_CUSTOM1 = '2012', 'Atacand', C1_CUSTOM1) "Product",CASEWHEN(C7_VIEW = 'YTD', 'Input', C7_VIEW) "Version",CASEWHEN(C6_VALUE = 'USD', 'Local', C6_VALUE) "Currency",C9_ENTITY "Entity",C5_DATAVALUE "Data" from "C$_0AZGRP_PLData" where (1=1) """
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
srcFetchSize=30
stmt.setFetchSize(srcFetchSize)
rs = stmt.executeQuery(sql)
#load the data
stats = pWriter.loadData(rs)
#close the database result set, connection
rs.close()
stmt.close()
In the SQL you can find the CASEWHEN expressions. Yes, I added these in the interface mapping section with the help of the ODI expression editor. The following are the expressions I used, one statement per mapping. This is just to make the HFM data compatible with Essbase. For example, the account number in HFM for which I am extracting the data is 3110, but there is no account '3110' in the Essbase application, so Essbase would not be able to load the data. To make the HFM data compatible with Essbase, these CASEWHEN statements have been used. I hope it is fine to use such statements in a mapping.
CASEWHEN(HFMDATA."Year" = '2007', 'FY09', HFMDATA."Year")
CASEWHEN(HFMDATA."View" = 'YTD', 'Input', HFMDATA."View")
CASEWHEN(HFMDATA."Entity" = '100106_LC', '1030GB', HFMDATA."Entity")
CASEWHEN(HFMDATA."Value" = 'USD', 'Local', HFMDATA."Value")
CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")
CASEWHEN(HFMDATA."Custom1" = '2012', 'Atacand', HFMDATA."Custom1")
CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")
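Each of those CASEWHEN expressions remaps one source member to a target member and passes everything else through unchanged. The same logic can be sketched outside ODI as a plain lookup table; the member names below are the ones from the expressions, while the function and the sample row are hypothetical:

```python
# Member remapping mirroring the CASEWHEN mappings: fall back to the
# original value when no Essbase equivalent is defined for a member.
MEMBER_MAP = {
    "Year":    {"2007": "FY09"},
    "View":    {"YTD": "Input"},
    "Entity":  {"100106_LC": "1030GB"},
    "Value":   {"USD": "Local"},
    "Account": {"3110": "PL10100"},
    "Custom1": {"2012": "Atacand"},
    "Period":  {"January": "Jan"},
}

def remap(dimension: str, member: str) -> str:
    """Return the Essbase member for an HFM member, or the member unchanged."""
    return MEMBER_MAP.get(dimension, {}).get(member, member)

row = {"Account": "3110", "Period": "January", "Entity": "OTHER"}
print({dim: remap(dim, m) for dim, m in row.items()})
# {'Account': 'PL10100', 'Period': 'Jan', 'Entity': 'OTHER'}
```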
I hope you can get some idea from the SQL and help me out!
Many thanks.
N. Laha
Edited by: user8732338 on Sep 28, 2009 9:45 PM