Steps to selectively load data from R/3 to BW
Hi all,
I have a requirement to load data selectively into BW. I want to run the init load multiple times for the DataSources below.
2LIS_12_VCITM
2LIS_13_VDITM
2LIS_13_VDKON
2LIS_03_BF
2LIS_02_ITM
2LIS_02_SCL
Now my doubt is: from which tables should I consider the data for the above DataSources, and how do I run the statistical setup? I want the steps, and also the transaction codes to run the statistical setup for each of the above DataSources.
Niranjan
[email protected]
Hi,
These are the transaction codes to fill the setup tables (statistical setup) on the R/3 side, by application component (02, 03, 12, 13):
- Purchasing (02): OLI3BW
- Inventory (03): OLI1BW for material movements, OLIZBW for invoice verification
- Deliveries (12): OLI8BW
- Billing (13): OLI9BW
regards
Harikrishna N
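The component-to-transaction mapping above can be captured as a small lookup table. This is an illustrative Python sketch (the dictionary structure and function name are made up; the transaction codes are the ones listed above):

```python
# Setup-table fill (statistical setup) transactions per application area,
# as listed above. Inventory (03) has two transactions depending on the data.
SETUP_TCODES = {
    "02": {"purchasing": "OLI3BW"},
    "03": {"material movements": "OLI1BW", "invoice verification": "OLIZBW"},
    "12": {"deliveries": "OLI8BW"},
    "13": {"billing": "OLI9BW"},
}

def setup_tcode_for(datasource: str) -> dict:
    """Return the setup transactions for a DataSource such as 2LIS_12_VCITM.
    The application area is the two digits after '2LIS_'."""
    area = datasource.split("_")[1]
    return SETUP_TCODES[area]

print(setup_tcode_for("2LIS_12_VCITM"))  # {'deliveries': 'OLI8BW'}
```

So for the DataSources in the question: 2LIS_02_ITM and 2LIS_02_SCL use OLI3BW, 2LIS_03_BF uses OLI1BW, 2LIS_12_VCITM uses OLI8BW, and 2LIS_13_VDITM and 2LIS_13_VDKON use OLI9BW.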
Similar Messages
-
Loading data from R/3 systems to BI infocubes
Hi all BI gurus,
Can anyone please explain in simple steps how to load data from SAP R/3 systems into an InfoCube?
Thanks in advance,
Karthik
Hi,
Which module do you want? If it is SD, for example, the steps are:
1. In the AWB, go to Business Content.
2. Go to InfoProvider.
3. Under the InfoArea, select the SD cubes.
4. Drag the related cubes and ODS objects to the right panel.
5. Set the grouping option to "In Data Flow Before and Afterwards".
6. Install the collected objects.
Go to R/3:
7. Use transaction RSA5 to transfer all DataSources belonging to the SD module.
Go to BW:
8. Right-click the source system and choose "Replicate DataSources".
[DataSources|http://help.sap.com/saphelp_nw70/helpdata/en/3c/7b88408bc0bb4de10000000a1550b0/frameset.htm]
Edited by: Obily on Jul 10, 2008 8:36 AM -
Loading data from flat file to infocube
Hi, can anyone give me the steps involved in loading data from a flat file to an InfoCube?
Hi,
Data loading from a flat file in BI 7.0 works like this:
1. First create a cube or DSO with the same structure that you have in the flat file, and activate it.
2. Go to the DataSource tab and create a DataSource. Select the type of data (for example, transactional data), mention your flat file name and file type in the Extraction tab, and enter your InfoObject names in the Fields tab. Load the preview data and activate.
3. Select your DataSource, create an InfoPackage, and schedule it. Your data is now loaded up to PSA level.
4. Go to the InfoProvider tree, select your cube, right-click it, create a transformation, and activate it.
5. Create a DTP, activate it, and execute it.
In more detail:
1. Create the DataSource. Here you can set/check the source system fields.
2. Create a transformation for that DataSource (there are no more update rules/transfer rules). While creating the transformation, it asks for the data target name, so assign the target where you want to update your data.
DataSource -> Transformation -> (DTP) -> Data Target
Now, if you want to load data into a data target from a source system DataSource:
1. Create an InfoPackage for that DataSource. If you are creating an InfoPackage for a new DataSource, it only allows updating up to the PSA; all the other options appear disabled.
2. Create a DTP (Data Transfer Process) for that DataSource.
3. Schedule the InfoPackage; once the data is loaded to the PSA, execute your DTP, which loads the data into the data target.
The Data Transfer Process (DTP) is used to load data along the dataflow created by the transformation. Here is how the DTP data load works:
1. The InfoPackage is loaded.
2. The data gets loaded into the PSA (hence why only PSA is selectable).
3. The DTP gets executed.
4. The data gets loaded from the PSA into the data target once the DTP has executed.
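The sequence above (InfoPackage fills only the PSA; only the DTP moves data onward) can be sketched as a toy simulation. The class and method names here are invented for illustration and are not SAP APIs:

```python
# Toy model of the BI 7.0 dataflow: the InfoPackage only fills the PSA,
# and only the DTP moves data from the PSA into the data target.
class Dataflow:
    def __init__(self):
        self.psa = []          # Persistent Staging Area
        self.data_target = []  # cube / DSO

    def run_infopackage(self, source_rows):
        # Steps 1-2: the InfoPackage extracts from the source into the PSA only.
        self.psa.extend(source_rows)

    def execute_dtp(self):
        # Steps 3-4: the DTP applies the transformation and updates the target.
        transformed = [dict(row, loaded=True) for row in self.psa]
        self.data_target.extend(transformed)

flow = Dataflow()
flow.run_infopackage([{"doc": "4711"}, {"doc": "4712"}])
assert flow.data_target == []  # nothing reaches the target until the DTP runs
flow.execute_dtp()
print(len(flow.data_target))   # 2
```

The point of the split is visible in the sketch: after `run_infopackage` the target is still empty, which mirrors why a new DataSource's InfoPackage only offers "PSA" as an update option.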
regards
Harikrishna N -
Step by Step details on how to load data from a flat file
Hi, can anyone explain how to load data from a flat file? Please give me the step-by-step details. Thanks.
Hi Sonam,
It is very easy to load data from a flat file compared with the other extraction methods. Here are the steps to load transaction data from a flat file:
Step 1: Create the flat file.
Step 2: Log on to SAP BW (transaction RSA1 or RSA13) and note the flat file source system icon (the PC icon).
Step 3: Create the required InfoObjects.
3.1: Create an InfoArea (Modeling > InfoObjects (root node) > context menu > Create InfoArea).
3.2: Create characteristic and key figure InfoObject catalogs (select the InfoArea > context menu > Create InfoObject Catalog).
3.3: Create the characteristic and key figure InfoObjects according to your requirement, and activate them.
Step 4: Create an InfoSource for transaction data, create the transfer structure, and maintain the communication structure.
4.1: First create an application component (select InfoSources (root node) under Modeling > context menu > Create Application Component).
4.2: Create an InfoSource for transaction data (select the application component > context menu > Create InfoSource). Select "Flexible Update", give the InfoSource a name, and continue.
4.3 (important): Assign the DataSource (expand the application component > expand your InfoSource > context menu > Assign DataSource). For the source system, browse and choose your flat file source system (the PC icon), then continue.
4.4: Define the DataSource/transfer structure for the InfoSource: select the Transfer Structure tab and fill the InfoObject field with the necessary InfoObjects in the order of the flat file structure.
4.5: Assign the transfer rules: select the Transfer Rules tab and choose the "Propose Transfer Rules" (spindle-like) icon. If the data target is an ODS, include 0RECORDMODE in the communication structure. Activate.
Step 5: Create the data target (InfoCube/ODS object).
5.1: Create the InfoCube (select your InfoArea > context menu > Create InfoCube).
5.2: Create the InfoCube structure: fill the structure pane with the required InfoObjects (select the InfoSource icon > find your InfoSource > double-click > select "Yes" for the InfoObject assignment). Maintain at least one time characteristic (for example, 0CALDAY).
5.3: Define and assign dimensions for your characteristics, then activate.
Step 6: Create update rules for the InfoCube using the InfoSource (select your InfoCube > context menu > Create Update Rules > enter your InfoSource and press Enter > activate the update rules). Now have a look at the data flow.
Step 7: Schedule and load the data.
7.1: Create an InfoPackage (select InfoSources under Modeling > expand your application component > expand your InfoSource > select the DataSource assignment icon > context menu > Create InfoPackage). Give the InfoPackage a description, select your DataSource, and continue.
7.2: On the External Data tab, select Client Workstation or Application Server, and give the file name, file type, and data separator.
7.3: On the Processing tab, select "PSA and then into Data Targets".
7.4: On the Data Targets tab, select the data targets and check the "Update Data Target" checkbox.
7.5: On the Update tab, select Full Update.
7.6: On the Schedule tab, select "Start Data Load Immediately" and click the Start button.
Step 8: Monitor the data: check the data in the PSA and in the data targets.
I hope this will help you. -
Help - Step by Step Material on How to load Data from flatfile to BI 7.0
Hi BI Experts,
Can anyone send the document or a PDF file on how to load data from a flat file to BI 7.0, in a step-by-step process?
Thanks in Advance
Regards
Ramakrishna Kamurthy
+91-9963101073
1. On logon you are placed in the SAP Easy Access Strategic Management / Business Analytics screen.
1. You see the screen divided in two segments.
2. Follow the left segment.
3. Click on the Business Information Warehouse.
4. Sub Menu > BW Administration.
5. You find various entries in it.
6. Now double click on UG_BW_RSA1-Administrator Workbench
2. Creation of Info Area
1. Click on InfoObjects.
2. Now in the right hand side screen select the InfoObjects (First item on the top). Right click the mouse and select Create InfoArea.
3. A dialog box pops up.
4. Give the InfoArea as ZIVY_Iarea.
5. Give the Long Description as ZIVY Info Area.
3. Create a Catalog:
1. Select the InfoArea just created ZIVY_Iarea.
2. Right click the mouse pointer and select the option Create InfoObject Catalog
3. A dialog box pops up.
4. Give the InfoObjCat = ZIVY_CHCAT
i. Description = ZIVY Character Catalog
ii. InfoObjectType: Check the Character. ( As we are creating the master data, check the Character)
iii. Now at the bottom of this dialog box you see an icon called CREATE. Click this icon.
1. The next screen in front of you is the Edit InfoObject Catalog.
2. Simply click the activate icon present on the Application Toolbar.
2. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_CUST and
i. Description Customer Info Object
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_CUST: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Check the With Master Data and With Texts.
2. Use suitable formats in the fields.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
iv. In the Attributes Tab:
1. We need to create the attribute InfoObjects. This can be done directly from here, or from RSA1 > InfoObjects > ZIVY_Iarea > ZIVY_CHCAT. This will be dealt with in the forthcoming steps.
1. Now save your work: click the Save icon on the toolbar.
2. Activate ZIVY_CUST by clicking the Activate icon.
2. Creation of InfoObject: ZIVY_CITY
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_CITY and
i. Description: Customer City Name.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_CITY: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 25
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work: click the Save icon on the toolbar.
2. Activate the InfoObject ZIVY_CITY by clicking the Activate icon.
2. Creation of InfoObject: ZIVY_STATE
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_STATE and
i. Description: Customer State Name.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_STATE: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 25
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work: click the Save icon on the toolbar.
2. Activate the InfoObject ZIVY_STATE by clicking the Activate icon.
2. Creation of InfoObject: ZIVY_CTRY
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_CTRY and
i. Description Customer Country Name.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_CTRY: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data
2. Check the With Texts and make suitable Text Table Properties.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work: click the Save icon on the toolbar.
2. Activate the InfoObject ZIVY_CTRY by clicking the Activate icon.
1. Creation of Attributes and InfoObject from the ZIVY_CUST InfoObject:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. In this ZIVY_CHCAT, you see the InfoObject ZIVY_CUST. Double click the InfoObject ZIVY_CUST.
6. You find the screen Change Characteristic ZIVY_CUST:Detail.
7. Click on the tab Attributes
8. In the Character Attribute Box, type ZIVY_CITY and enter the ENTER KEY.
9. You see the relevant information regarding this being inserted.
10. Similarly, type ZIVY_STATE, ZIVY_CTRY.
11. Now, we will create another InfoObject ZIVY_TEL, from here.
12. So type ZIVY_Tel and enter the ENTER KEY.
13. A dialog box pops up, and request you If you want to Create an InfoObject
14. Select Create Attribute As Characteristic and click the check mark.
15. Another window pops up. Give the following:
i. In the General Tab:
1. Long Description: ZIVY Telephone Number
2. Short Description: ZIVY Telephone Number
3. DataType : NUMC
4. Length : 10
5. Uncheck Attribute Only
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies
iv. Now click the TICK MARK
1. Click the Activate icon.
2. A window pops up.
i. Select activate dependent InfoObjects.
ii. Click the Check Mark.
1. Creation of InfoObject: ZIVY_DATE
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_DATE and
i. Description ZIVY Date.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_DATE: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: DATS
2. Length: 8
ii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work: click the Save icon on the toolbar.
2. Activate the InfoObject ZIVY_DATE by clicking the Activate icon.
2. Create ZIVY Application Component:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Right click the mouse pointer and select the entry Create Application Component
5. A window pops up.
i. Application Component: ZIVY_APPCOMP
ii. Long Description : ZIVY Application Component
iii. Click the check mark.
1. Create InfoSource:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Right click the mouse pointer and select the entry Create InfoSource
6. A window pops up:
i. Select the Direct Update of Master Data
ii. InfoObject : ZIVY_CUST
iii. Click the CHECK MARK.
1. Now beneath ZIVY Application Component you see the Customer Info Object.
2. Select the Customer Info Object mentioned above.
3. Right click the mouse pointer, and select the entry Assign DataSource.
4. A window pops up.
i. InfoSource: ZIVY_CUST
ii. Source System: FLATFILE
iii. Check the TICK MARK.
iv. Just click yes for the coming window pop ups.
1. Key thing
i. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First field is ZIVY_CUST,
3. Second field is ZIVY_CITY.
4. And so on.
5. This order is very important: we retrieve the data from a CSV file, and the CSV file must use the same field order.
ii. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
iii. Click the Activate Button. By clicking the ACTIVATE BUTTON you just activated the ZIVY_CUST MASTER DATA.
iv. Now in DataSource : select the text.
v. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First Field : 0LANGUAGE
3. Second field is ZIVY_CUST,
4. Third field is 0TXTMD.
5. Click the Transfer Rule adjacent to 0LANGUAGE
6. A window pops up.
a. Check the CONSTANT and give the value as EN
b. Click the TICK MARK.
7. This order is very important: we retrieve the data from a CSV file, and the CSV file must use the same field order. Ignore the first field, as it is a constant.
vi. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
vii. Click the Activate button.
1. Create the DATA required for this MASTER DATA
1. Open EXCEL.
2. Type the following data:
1. Save as ZIVY_CUST_MASTER.CSV
2. The file should be saved in COMMA SEPARATED FORMAT.
3. Open EXCEL
4. Type the following data:
5. Save the file as ZIVY_CUST_TEXT.CSV
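Since the transfer-rule mapping above fixes the CSV column order (ZIVY_CUST, ZIVY_CITY, ZIVY_STATE, ZIVY_CTRY, ZIVY_TEL for the attributes; the texts start after the constant 0LANGUAGE with ZIVY_CUST and 0TXTMD), the two files can also be generated with a short script instead of Excel. The sample rows here are invented for illustration:

```python
import csv

# Attribute file: columns must follow the transfer structure order shown above:
# ZIVY_CUST, ZIVY_CITY, ZIVY_STATE, ZIVY_CTRY, ZIVY_TEL  (sample data)
with open("ZIVY_CUST_MASTER.CSV", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["C001", "Hyderabad", "Telangana", "IN", "9876543210"])
    w.writerow(["C002", "Mumbai", "Maharashtra", "IN", "9812345678"])

# Text file: 0LANGUAGE is a constant ('EN') in the transfer rules, so it is
# not in the file; the CSV starts with the customer key, then the medium text.
with open("ZIVY_CUST_TEXT.CSV", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["C001", "Ivy Customer One"])
    w.writerow(["C002", "Ivy Customer Two"])
```

Whatever tool you use, the point is the same as in the steps above: the column order in the file must match the transfer structure exactly.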
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_CUST alias Customer Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_CUST_ATTR.
ii. Select ZIVY_CUST_ATTR
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_CUST_MASTER.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_CUST alias Customer Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_CUST_TEXT.
ii. Select ZIVY_CUST_TEXT
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_CUST_TEXT.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_MAT and
i. Description ZIVY Material Info Object
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_MAT: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Check the With Master Data and With Texts.
2. Use suitable formats in the fields.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
iv. In the Attributes Tab:
1. We need to create the attribute InfoObjects. This can be done directly from here, or from RSA1 > InfoObjects > ZIVY_Iarea > ZIVY_CHCAT. This will be dealt with in the forthcoming steps (STEP 17).
1. Now save your work: click the Save icon on the toolbar.
2. Activate ZIVY_MAT by clicking the Activate icon.
2. Creation of InfoObject: ZIVY_GROUP
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_GROUP and
i. Description ZIVY Material Group.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_GROUP: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data
2. Check the With Texts and make suitable Text Table Properties.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work: click the Save icon on the toolbar.
2. Activate the InfoObject ZIVY_GROUP by clicking the Activate icon.
2. Creation of InfoObject: ZIVY_COLOR
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_COLOR and
i. Description ZIVY Material Color.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_COLOR: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 10
ii. In the Master Data/ Texts Tab:
1. Uncheck the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work: click the Save icon on the toolbar.
2. Activate the InfoObject ZIVY_COLOR by clicking the Activate icon.
2. Creation of InfoObject from the ZIVY_MAT InfoObject:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. In this ZIVY_CHCAT, you see the InfoObject ZIVY_MAT. Double click the InfoObject ZIVY_MAT.
6. You find the screen Change Characteristic ZIVY_MAT:Detail.
7. Click on the tab Attributes
8. In the Character Attribute Box, type ZIVY_GROUP and enter the ENTER KEY.
9. You see the relevant information regarding this being inserted.
10. Similarly, type ZIVY_COLOR.
11. Click the Activate icon.
3. Create InfoSource:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Right click the mouse pointer and select the entry Create InfoSource
6. A window pops up:
i. Select the Direct Update of Master Data
ii. InfoObject : ZIVY_MAT
iii. Click the CHECK MARK.
1. Now beneath ZIVY Application Component you see the ZIVY Material Info Object.
2. Select the ZIVY Material Info Object mentioned above.
3. Right click the mouse pointer, and select the entry Assign DataSource.
4. A window pops up.
i. InfoSource: ZIVY_MAT
ii. Source System: FLATFILE
iii. Check the TICK MARK.
iv. Just click yes for the coming window pop ups.
1. Key thing
i. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First field is ZIVY_MAT,
3. Second field is ZIVY_GROUP.
4. Third field is ZIVY_COLOR.
5. Check the mapping; if you do not see the mapping:
6. Click on the empty row in the group Assign InfoObjectField, the box to the right.
7. Type ZIVY_GROUP, ZIVY_COLOR.
8. Check the transfer rules.
9. This order is very important: we retrieve the data from a CSV file, and the CSV file must use the same field order.
ii. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
iii. Click the Activate button. By clicking it you have just activated the ZIVY_MAT master data.
iv. Now in DataSource : select the text.
v. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First Field : 0LANGUAGE
3. Second field is ZIVY_MAT,
4. Third field is 0TXTSH.
5. This order is very important: we retrieve the data from a CSV file, and the CSV file must use the same field order. Ignore the first field, as it is a constant.
vi. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
vii. Click the Activate button.
1. Create the DATA required for this MASTER DATA
1. Open EXCEL.
2. Type the following data:
1. Save as ZIVY_Material_MASTER.CSV
2. The file should be saved in COMMA SEPARATED FORMAT.
3. Open EXCEL
4. Type the following data:
5. Save the file as ZIVY_Material_TEXT.CSV
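The material files must likewise match the transfer structure order described above (ZIVY_MAT, ZIVY_GROUP, ZIVY_COLOR), so a small check before loading can catch a shuffled or ragged CSV early. This is an illustrative sketch with invented sample data, not an SAP tool:

```python
import csv

EXPECTED_WIDTH = 3  # ZIVY_MAT, ZIVY_GROUP, ZIVY_COLOR

def check_material_csv(path: str) -> int:
    """Verify every row has exactly the three expected columns and a
    non-empty material key; return the number of data rows."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    for i, row in enumerate(rows, start=1):
        if len(row) != EXPECTED_WIDTH:
            raise ValueError(f"row {i}: expected {EXPECTED_WIDTH} fields, got {len(row)}")
        if not row[0].strip():
            raise ValueError(f"row {i}: empty ZIVY_MAT key")
    return len(rows)

# Build a tiny sample file and check it (the values are invented).
with open("ZIVY_Material_MASTER.CSV", "w", newline="") as f:
    csv.writer(f).writerows([["M001", "G01", "RED"], ["M002", "G01", "BLUE"]])
print(check_material_csv("ZIVY_Material_MASTER.CSV"))  # 2
```

A check like this only validates the shape of the file; the semantic order of the columns still has to match what you see on the Transfer Rules tab.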
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_MAT alias ZIVY Material Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_MAT_ATTR.
ii. Select ZIVY_MAT_ATTR
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_Material_MASTER.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_MAT alias ZIVY Material Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_MAT_TEXT.
ii. Select ZIVY_MAT_TEXT
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_MATERIAL_TEXT.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_SPER and
i. Description ZIVY Sales Person Info Object
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_SPER: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 20
ii. In the Master Data/ Texts Tab:
1. Check the With Master Data and With Texts.
2. Use suitable formats in the fields.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
iv. In the Attributes Tab:
1. We need to create the attribute InfoObjects. This can be done directly from here, or from RSA1 > InfoObjects > ZIVY_Iarea > ZIVY_CHCAT. This will be dealt with in the forthcoming steps (STEP 24).
1. Now save your work: click the Save icon on the toolbar.
2. Activate ZIVY_SPER by clicking the Activate icon.
2. Creation of InfoObject: ZIVY_SREG
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Character ZIVY_SREG and
i. Description ZIVY Sales Region.
ii. Click the TICK mark.
1. The next screen you see is Change Characteristic ZIVY_SREG: Detail.
2. In this screen the following fields have to be modified:
i. In the General Tab
1. DataType: CHAR
2. Length: 20
ii. In the Master Data/ Texts Tab:
1. Check the With Master Data and With Texts.
iii. In the Hierarchy Tab:
1. Uncheck the With Hierarchies.
1. Now save your work: click the Save icon on the toolbar.
2. Activate the InfoObject ZIVY_SREG by clicking the Activate icon.
2. Creation of Attributes from the ZIVY_SPER InfoObject:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_CHCAT,which is ZIVY Character Catalog.
5. In this ZIVY_CHCAT, you see the InfoObject ZIVY_SPER. Double click the InfoObject ZIVY_SPER.
6. You find the screen Change Characteristic ZIVY_SPER:Detail.
7. Click on the tab Attributes
8. In the Character Attribute Box, type ZIVY_SREG and enter the ENTER KEY.
9. You see the relevant information regarding this being inserted.
10. Click the Activate icon.
3. Create InfoSource:
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Right click the mouse pointer and select the entry Create InfoSource
6. A window pops up:
i. Select the Direct Update of Master Data
ii. InfoObject : ZIVY_SPER
iii. Click the CHECK MARK.
1. Now beneath ZIVY Application Component you see the ZIVY Sales Person Info Object.
2. Select the ZIVY Sales Person Info Object mentioned above.
3. Right click the mouse pointer, and select the entry Assign DataSource.
4. A window pops up.
i. InfoSource: ZIVY_SPER
ii. Source System: FLATFILE
iii. Check the TICK MARK.
iv. Just click yes for the coming window pop ups.
1. Key thing
i. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First field is ZIVY_SPER,
3. Second field is ZIVY_SREG.
4. Check the mapping; if you do not see the mapping:
5. Click on the empty row in the group Assign InfoObjectField, the box to the right.
6. Check the transfer rules.
7. This order is very important: we retrieve the data from a CSV file, and the CSV file must use the same field order.
ii. Click on the tab DataSource/Transfer Structure.
1. The field order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
iii. Click the Activate Button. By clicking the ACTIVATE BUTTON you just activated the ZIVY_SREP MASTER DATA.
iv. Now in DataSource : select the text.
v. Click on the tab Transfer Rules
1. Have a look at the data mapping here.
2. First Field : 0LANGUAGE
3. Second field is ZIVY_SPER,
4. Third field is 0TXTSH.
5. This order is very important, as we retrieve the data from a CSV file. The same order must be in the CSV file. Ignore the first field, as it is a constant.
vi. Click on the tab DataSource/Transport Structure
1. The order here must also match the CSV file.
2. If the order is different, just shuffle the rows.
vii. Click the Activate Button. This activates the text data as well.
1. Create the DATA required for this MASTER DATA
1. Open EXCEL.
2. Type the following data:
1. Save as ZIVY_SREP_MASTER.CSV
2. The file should be saved in COMMA SEPARATED FORMAT.
3. Open EXCEL
4. Type the following data:
5. Save the file as ZIVY_SREP_TEXT.CSV, also in COMMA SEPARATED FORMAT.
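Putting the ordering rules above together, the two flat files can be sketched as below (a Python sketch; the salesperson and region values SP01, NORTH, etc. are invented sample data — only the column order comes from the transfer rules described above):

```python
# Sketch of the two CSV files the transfer rules expect.
# Column order must match the transfer structure exactly:
#   attribute file: ZIVY_SPER, ZIVY_SREG
#   text file:      0LANGUAGE, ZIVY_SPER, 0TXTSH
# Sample values are invented for illustration.
import csv
import os
import tempfile

def write_master_files(directory):
    attr_path = os.path.join(directory, "ZIVY_SREP_MASTER.csv")
    text_path = os.path.join(directory, "ZIVY_SREP_TEXT.csv")
    attr_rows = [["SP01", "NORTH"], ["SP02", "SOUTH"]]
    text_rows = [["EN", "SP01", "John Doe"], ["EN", "SP02", "Jane Roe"]]
    for path, rows in ((attr_path, attr_rows), (text_path, text_rows)):
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(rows)  # plain comma-separated, no header row
    return attr_path, text_path

def load_rows(path):
    with open(path, newline="") as f:
        return list(csv.reader(f))

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        attr_path, text_path = write_master_files(d)
        print(load_rows(attr_path))  # [['SP01', 'NORTH'], ['SP02', 'SOUTH']]
```

If the file's columns end up in a different order than the transfer structure, shuffle the rows in the DataSource/Transport Structure tab as described above rather than editing the file.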
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_SPER alias ZIVY Sales Person Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_SPER_ATTR.
ii. Select ZIVY_SPER_ATTR
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_SPER_MASTER.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Create InfoPackage
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoSource in the left hand segment.
3. In the right hand segment, select InfoSources.
4. Select the application component you created ZIVY_APPCOMP alias ZIVY Application Component.
5. Beneath this, you see ZIVY_SPER alias ZIVY Sales Person Info Object
6. Beneath this you see a FLAT FILE.
7. Select this FLAT FILE, right click the mouse pointer and select the option Create InfoPackage.
8. A window pops up.
i. InfoPackage Description: Info package for ZIVY_SPER_TEXT.
ii. Select ZIVY_SPER_TEXT
iii. Click the CHECK MARK
1. You see a new screen Scheduler (Maintain InfoPackage).
2. Click the tab External Data
i. Name of file: C:\Personal\SAP\MyWork\MyBW\ZIVY_SPER_TEXT.csv
ii. Check the CSV file.
iii. Click the Preview Button.
iv. A window pops up, click the CHECK MARK.
v. You see the data present in the EXCEL File.
1. Click the tab Update
i. Have a look at the entries.
1. Click the tab: Scheduler:
i. Click the button Start
ii. You now click on MONITOR BUTTON present on the tool bar.
iii. Browse the data.
1. Creation of InfoObject Catalog with Key Figures:
1. Select the InfoArea just created ZIVY_Iarea.
2. Right click the mouse pointer and select the option Create InfoObject Catalog
3. A dialog box pops up.
4. Give the InfoObjCat = ZIVY_KEYCAT
i. Description = ZIVY Key Figures Catalog
ii. InfoObjectType: Check the Key Figures.
iii. Now at the bottom of this dialog box you see an icon called CREATE. Click this icon.
1. The next screen in front of you is the Edit InfoObject Catalog.
2. Simply click the activate icon present on the Application Toolbar.
2. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3. Look for the InfoArea you created ZIVY_Iarea.
4. Select the catalog you created ZIVY_KEYCAT, which is the ZIVY Key Figures Catalog.
5. Right click your mouse pointer, and select Create InfoObject.
6. A dialog box pops up.
7. Give the Key Figure ZIVY_AMT and
i. Description ZIVY Amount Info Object
ii. Click the TICK mark.
1. The next screen you see is Create Key Figure ZIVY_AMT: Detail.
2. In this screen the following fields have to be modified:
i. In the Type/ Unit Tab
1. DataType: Amount
2. Unit/ Currency = 0Currency
1. Now save your work. Click on the SAVE icon on the COMMAND ToolBar.
2. Activate the ZIVY_AMT, by clicking the icon ACTIVATE.
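The reason the Unit/Currency field is mandatory: an Amount key figure only carries meaning together with its currency (here 0CURRENCY). A minimal sketch of that pairing — the Amount class and sample values are hypothetical illustrations, not SAP code:

```python
# Toy model of an Amount key figure: the value always travels with a
# currency, and aggregation across different currencies is rejected
# unless a conversion is applied first.
from dataclasses import dataclass

@dataclass(frozen=True)
class Amount:
    value: float
    currency: str  # plays the role of 0CURRENCY

    def __add__(self, other: "Amount") -> "Amount":
        if self.currency != other.currency:
            raise ValueError("cannot aggregate across currencies without conversion")
        return Amount(self.value + other.value, self.currency)

total = Amount(100.0, "USD") + Amount(50.0, "USD")
print(total)  # Amount(value=150.0, currency='USD')
```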
2. Create InfoObject :
1. Go to the UG_BW_RSA1-Administrator Workbench.
2. Click on InfoObjects.
3.
-
Problem in Loading Data from SQL Server 2000 to Oracle 10g
Hi All,
I am a university student and using ODI for my final project on real-time data warehousing. I am trying to load data from MS SQL Server 2000 into an Oracle 10g target table. Everything goes fine until I execute the interface for the initial load. When I choose the CKM Oracle (Create unique index on the I$ table) KM, the following step fails:
21 - Integration - Prj_Dim_RegionInterface - Create Unique Index on flow table
Where Prj_Dim_Region is the name of my target table in Oracle.
The error message is:
955 : 42000 : java.sql.SQLException: ORA-00955: name is already used by an existing object
java.sql.SQLException: ORA-00955: name is already used by an existing object
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:282)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:639)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:185)
at oracle.jdbc.driver.T4CPreparedStatement.execute_for_rows(T4CPreparedStatement.java:633)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1086)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:2984)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3057)
at com.sunopsis.sql.SnpsQuery.executeUpdate(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execStdOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
I am using a surrogate key column in my target table along with the natural key. The natural key is populated by the primary key of my source table, but for the surrogate key, I have created a sequence in my Oracle schema where the target table exists and have used the following code for mapping:
<%=snpRef.getObjectName( "L" , "SQ_PRJ_DIM_REGION" , "D" )%>.nextval
I have chosen to execute this code on target.
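The surrogate-plus-natural-key pattern described here can be sketched outside ODI as well (Python; `region_seq` and the sample rows are assumed stand-ins for the Oracle sequence SQ_PRJ_DIM_REGION and the source data):

```python
# Each incoming row keeps its natural key from the source primary key
# and receives the next sequence value as its surrogate key, mimicking
# SQ_PRJ_DIM_REGION.nextval executed on the target.
import itertools

region_seq = itertools.count(start=1)  # stand-in for the Oracle sequence

def load_dim_region(source_rows):
    """source_rows: iterable of (natural_key, region_name) tuples."""
    dim = []
    for natural_key, region_name in source_rows:
        dim.append({
            "REGION_KEY": next(region_seq),  # surrogate key <- sequence.nextval
            "REGION_NK": natural_key,        # natural key from the source PK
            "REGION_NAME": region_name,
        })
    return dim

rows = load_dim_region([(101, "North"), (102, "South")])
print([r["REGION_KEY"] for r in rows])  # [1, 2]
```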
Among my attempts to solve this problem was to set the Create Index option of the CKM Oracle (Create Index for the I$ Table) to No so that it won't create any index on the flow table. I also tried the simple CKM Oracle KM. Both solutions allowed the interface to execute successfully without any errors, but the data was not loaded into the target table.
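For context, ORA-00955 means the name generated for the unique index on the I$ flow table already belongs to another object in the schema — it may be a leftover object from an earlier failed run, in which case dropping or renaming it usually clears the step. The collision can be reproduced in miniature with sqlite3 (the table and index names here are hypothetical):

```python
# A second CREATE INDEX with an already-used name fails, just as ODI's
# "Create Unique Index on flow table" step does when the generated
# index name is taken by an existing object.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE prj_dim_region (region_key INTEGER, region_nk TEXT)")
con.execute("CREATE UNIQUE INDEX i_prj_dim_region ON prj_dim_region (region_nk)")
try:
    con.execute("CREATE UNIQUE INDEX i_prj_dim_region ON prj_dim_region (region_key)")
except sqlite3.OperationalError as exc:
    print("duplicate name rejected:", exc)
```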
When I right-click on the Prj_Dim_Region data store and choose Data, it shows empty. Pressing the SQL button in this data store shows a "New Query" dialog box where I see this query:
select * from NOVELTYFURNITUREDW.PRJ_DIM_REGION
But when I press OK to run it, I get this error message:
java.lang.IllegalArgumentException: Row index out of range
at javax.swing.JTable.boundRow(Unknown Source)
at javax.swing.JTable.setRowSelectionInterval(Unknown Source)
at com.borland.dbswing.JdbTable.accessChange(JdbTable.java:2959)
at com.borland.dx.dataset.AccessEvent.dispatch(Unknown Source)
at com.borland.jb.util.EventMulticaster.dispatch(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.open(Unknown Source)
at com.borland.dx.dataset.StorageDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.executeQuery(Unknown Source)
at com.sunopsis.graphical.frame.a.cg.actionPerformed(cg.java)
at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
at javax.swing.AbstractButton$ForwardActionEvents.actionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source)
at java.awt.AWTEventMulticaster.mouseReleased(Unknown Source)
at java.awt.Component.processMouseEvent(Unknown Source)
at java.awt.Component.processEvent(Unknown Source)
at java.awt.Container.processEvent(Unknown Source)
at java.awt.Component.dispatchEventImpl(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Window.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.EventQueue.dispatchEvent(Unknown Source)
at java.awt.EventDispatchThread.pumpOneEventForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.run(Unknown Source)
I do not understand what the problem is and have wasted days trying to figure it out. Any help will be highly appreciated, as my deadline for this project is very close.
Thank you so much in advance.
Neel
Hi Cezar,
Can you please help me with this scenario?
I have one Oracle data model with 19 source tables and one SQL Server data model with 10 target tables. I have created 10 interfaces which use JournalizedDataOnly on one of the tables in the interface; e.g. in the interface for the DimCustomer target table, I have 2 tables, namely Customer and Address, but the journalizing filter appears only on the Customer table and this option is disabled for Address automatically.
Now I want to create a package using OdiWaitForLog event detection. Is it possible to put all these 10 interfaces in just one package to populate the target tables? It works fine when I have only one interface and I use the name of one table in the interface for the Table Name parameter of the OdiWaitForLogData event, but when I try a comma-separated list of table names [Customer, Address] this error happens:
java.sql.SQLException: ORA-00942: table or view does not exist
and if I use this method <%=odiRef.getObjectName("L","model_code","logical_schema","D")%>, I get this error
"-CDC_SET_NAME=Exception getObjectName("L", "model_code", "logical_schema", "D") : SnpLSchema.getLSchemaByName : SnpLschema does not exist" "
Please let me know how to make this work.
Do I need to create separate data models each including only those tables which appear in their corresponding interface and package? Or do I need to create multiple packages each with only one journalized interface to populate only one target table?
Thank you for your time in advance.
Regards,
Neel -
Unable to load data from FDMEE staging table to HFM tables
Hi,
We have installed EPM 11.1.2.3 with all the latest related products (ODI/FDMEE) in our development environment.
We are in the process of loading data from EBS R12 to HFM using ERPI (Data Management in EPM 11.1.2.3). We could import and validate the data, but when we try to export the data to HFM, the process keeps running for hours (it neither gives any error nor completes).
When we check the process details in the ODI work repository, the processes show the following statuses:
COMM_LOAD_BALANCES - Running ......... (still running after 1 day)
EBS_GL_LOAD_BALANCES_DATA - Successful
COMM_LOAD_BALANCES - Successful
We could load data into the staging table of the FDMEE database schema, and we can even drill through to the source system (EBS R12) from the Data Load Workbench, but we are not able to load the data into the HFM application.
Log details from the process are below.
2013-11-05 17:04:59,747 INFO [AIF]: FDMEE Process Start, Process ID: 31
2013-11-05 17:04:59,747 INFO [AIF]: FDMEE Logging Level: 4
2013-11-05 17:04:59,748 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_31.log
2013-11-05 17:04:59,748 INFO [AIF]: User:admin
2013-11-05 17:04:59,748 INFO [AIF]: Location:VisionLoc (Partitionkey:1)
2013-11-05 17:04:59,749 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-05 17:04:59,749 INFO [AIF]: Category Name:VisionCat (Category key:3)
2013-11-05 17:04:59,749 INFO [AIF]: Rule Name:VisionRule (Rule ID:2)
2013-11-05 17:05:00,844 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-05 17:05:00,844 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-05 17:05:02,910 INFO [AIF]: -------START IMPORT STEP-------
2013-11-05 17:05:02,953 INFO [AIF]: -------END IMPORT STEP-------
2013-11-05 17:05:03,030 INFO [AIF]: -------START EXPORT STEP-------
2013-11-05 17:05:03,108 INFO [AIF]:
Move Data for Period 'JAN'
Any help on above is much appreciated.
Thank you
Regards
Praneeth
Hi,
I have followed the steps 1 & 2 above. Now the log shows something like below:
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Process Start, Process ID: 22
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Logging Level: 4
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
2013-11-05 09:47:31,180 INFO [AIF]: User:admin
2013-11-05 09:47:31,180 INFO [AIF]: Location:VisionLoc (Partitionkey:1)
2013-11-05 09:47:31,180 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-05 09:47:31,180 INFO [AIF]: Category Name:VisionCat (Category key:3)
2013-11-05 09:47:31,181 INFO [AIF]: Rule Name:VisionRule (Rule ID:2)
2013-11-05 09:47:32,378 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-05 09:47:32,378 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-05 09:47:34,652 INFO [AIF]: -------START IMPORT STEP-------
2013-11-05 09:47:34,698 INFO [AIF]: -------END IMPORT STEP-------
2013-11-05 09:47:34,744 INFO [AIF]: -------START EXPORT STEP-------
2013-11-05 09:47:34,828 INFO [AIF]:
Move Data for Period 'JAN'
2013-11-08 11:49:10,478 INFO [AIF]: FDMEE Process Start, Process ID: 22
2013-11-08 11:49:10,493 INFO [AIF]: FDMEE Logging Level: 5
2013-11-08 11:49:10,493 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
2013-11-08 11:49:10,494 INFO [AIF]: User:admin
2013-11-08 11:49:10,494 INFO [AIF]: Location:VISIONLOC (Partitionkey:1)
2013-11-08 11:49:10,494 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-08 11:49:10,495 INFO [AIF]: Category Name:VISIONCAT (Category key:1)
2013-11-08 11:49:10,495 INFO [AIF]: Rule Name:VISIONRULE (Rule ID:1)
2013-11-08 11:49:11,903 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-08 11:49:11,909 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-08 11:49:14,037 INFO [AIF]: -------START IMPORT STEP-------
2013-11-08 11:49:14,105 INFO [AIF]: -------END IMPORT STEP-------
2013-11-08 11:49:14,152 INFO [AIF]: -------START EXPORT STEP-------
2013-11-08 11:49:14,178 DEBUG [AIF]: CommData.exportData - START
2013-11-08 11:49:14,183 DEBUG [AIF]: CommData.getRuleInfo - START
2013-11-08 11:49:14,188 DEBUG [AIF]:
SELECT brl.RULE_ID
,br.RULE_NAME
,brl.PARTITIONKEY
,brl.CATKEY
,part.PARTVALGROUP
,br.SOURCE_SYSTEM_ID
,ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,app.APPLICATION_ID
,app.TARGET_APPLICATION_NAME
,app.TARGET_APPLICATION_TYPE
,app.DATA_LOAD_METHOD
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,br.INCL_ZERO_BALANCE_FLAG
,br.PERIOD_MAPPING_TYPE
,br.INCLUDE_ADJ_PERIODS_FLAG
,br.BALANCE_TYPE ACTUAL_FLAG
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.BALANCE_METHOD_CODE
,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
,br.CURRENCY_CODE
,br.BAL_SEG_VALUE_OPTION_CODE
,brl.EXECUTION_MODE
,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
,CASE
WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
ELSE 'NONE'
END LEDGER_GROUP_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
,br.LEDGER_GROUP
,(SELECT brd.DETAIL_CODE
FROM AIF_BAL_RULE_DETAILS brd
WHERE brd.RULE_ID = br.RULE_ID
AND brd.DETAIL_TYPE = 'LEDGER'
) PS_LEDGER
,CASE lg.LEDGER_TEMPLATE
WHEN 'COMMITMENT' THEN 'Y'
ELSE 'N'
END KK_FLAG
,p.LAST_UPDATED_BY
,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
,p.EPM_ORACLE_INSTANCE
FROM AIF_PROCESSES p
INNER JOIN AIF_BAL_RULE_LOADS brl
ON brl.LOADID = p.PROCESS_ID
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = brl.APPLICATION_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = br.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = br.LEDGER_GROUP
WHERE p.PROCESS_ID = 22
2013-11-08 11:49:14,195 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 2
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 1
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-11-08 11:49:14,197 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': 
{'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
2013-11-08 11:49:14,197 DEBUG [AIF]: CommData.getRuleInfo - END
2013-11-08 11:49:14,224 DEBUG [AIF]: CommData.insertPeriods - START
2013-11-08 11:49:14,228 DEBUG [AIF]: CommData.getLedgerListAndMap - START
2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - START
2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - END
2013-11-08 11:49:14,229 DEBUG [AIF]:
SELECT l.SOURCE_LEDGER_ID
,l.SOURCE_LEDGER_NAME
,l.SOURCE_COA_ID
,l.CALENDAR_ID
,'0' SETID
,l.PERIOD_TYPE
,NULL LEDGER_TABLE_NAME
FROM AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE br.RULE_ID = 1
AND l.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = br.SOURCE_LEDGER_ID
2013-11-08 11:49:14,230 DEBUG [AIF]: CommData.getLedgerListAndMap - END
2013-11-08 11:49:14,232 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
)
SELECT DISTINCT brl.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,prd.PERIOD_ID
,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
,COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) GL_PERIOD_YEAR
,COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) GL_PERIOD_NUM
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') GL_PERIOD_NAME
,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) AS VARCHAR(38)),'0') GL_PERIOD_CODE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
,COALESCE(ppa.PERIODTARGET, pp.PERIODTARGET) PERIODTARGET
,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) IMP_ENTITY_ID
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') IMP_ENTITY_NAME
,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) TRANS_ENTITY_ID
,pp.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,2202 SOURCE_LEDGER_ID
FROM (
AIF_BAL_RULE_LOADS brl
INNER JOIN TPOVCATEGORY pc
ON pc.CATKEY = brl.CATKEY
INNER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODFREQ = pc.CATFREQ
AND pp.PERIODKEY >= brl.START_PERIODKEY
AND pp.PERIODKEY <= brl.END_PERIODKEY
LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
ON ppa.PERIODKEY = pp.PERIODKEY
AND ppa.PERIODFREQ = pp.PERIODFREQ
AND ppa.INTSYSTEMKEY = 'OASIS'
INNER JOIN TPOVPERIODSOURCE ppsrc
ON ppsrc.PERIODKEY = pp.PERIODKEY
AND ppsrc.MAPPING_TYPE = 'EXPLICIT'
AND ppsrc.SOURCE_SYSTEM_ID = 2
AND ppsrc.CALENDAR_ID IN ('29067')
LEFT OUTER JOIN AIF_GL_PERIODS_STG prd
ON prd.PERIOD_ID = ppsrc.PERIOD_ID
AND prd.SOURCE_SYSTEM_ID = ppsrc.SOURCE_SYSTEM_ID
AND prd.CALENDAR_ID = ppsrc.CALENDAR_ID
AND prd.SETID = '0'
AND prd.PERIOD_TYPE = '507'
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
WHERE brl.LOADID = 22
ORDER BY pp.PERIODKEY
,GL_EFFECTIVE_PERIOD_NUM
2013-11-08 11:49:14,235 DEBUG [AIF]: CommData.insertPeriods - END
2013-11-08 11:49:14,240 DEBUG [AIF]: CommData.moveData - START
2013-11-08 11:49:14,242 DEBUG [AIF]: CommData.getPovList - START
2013-11-08 11:49:14,242 DEBUG [AIF]:
SELECT PARTITIONKEY
,PARTNAME
,CATKEY
,CATNAME
,PERIODKEY
,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
,RULE_ID
,RULE_NAME
FROM (
SELECT DISTINCT brl.PARTITIONKEY
,part.PARTNAME
,brl.CATKEY
,cat.CATNAME
,pprd.PERIODKEY
,pp.PERIODDESC
,brl.RULE_ID
,br.RULE_NAME
FROM AIF_BAL_RULE_LOADS brl
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = brl.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = brl.CATKEY
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = brl.LOADID
LEFT OUTER JOIN TPOVPERIOD pp
ON pp.PERIODKEY = pprd.PERIODKEY
WHERE brl.LOADID = 22
) q
ORDER BY PARTITIONKEY
,CATKEY
,PERIODKEY
,RULE_ID
2013-11-08 11:49:14,244 DEBUG [AIF]: CommData.getPovList - END
2013-11-08 11:49:14,245 INFO [AIF]:
Move Data for Period 'JAN'
2013-11-08 11:49:14,246 DEBUG [AIF]:
UPDATE TDATASEG
SET LOADID = 22
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,320 DEBUG [AIF]: Number of Rows updated on TDATASEG: 1842
2013-11-08 11:49:14,320 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_AUDIT (
LOADID
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,SEGMENT_FILTER_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,EXPORT_TO_TARGET_FLAG
)
SELECT DISTINCT 22
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,SEGMENT_FILTER_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,'N'
FROM AIF_APPL_LOAD_AUDIT
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,321 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_AUDIT: 1
2013-11-08 11:49:14,322 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_PRD_AUDIT (
LOADID
,GL_PERIOD_ID
,GL_PERIOD_YEAR
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
)
SELECT DISTINCT 22
,GL_PERIOD_ID
,GL_PERIOD_YEAR
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
FROM AIF_APPL_LOAD_PRD_AUDIT
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,323 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_PRD_AUDIT: 1
2013-11-08 11:49:14,325 DEBUG [AIF]: CommData.moveData - END
2013-11-08 11:49:14,332 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,332 DEBUG [AIF]:
SELECT tlp.PROCESSSTATUS
,tlps.PROCESSSTATUSDESC
,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
FROM TLOGPROCESS tlp
,TLOGPROCESSSTATES tlps
WHERE tlp.PARTITIONKEY = 1
AND tlp.CATKEY = 1
AND tlp.PERIODKEY = '2012-01-01'
AND tlp.RULE_ID = 1
AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
2013-11-08 11:49:14,336 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 20
,PROCESSEXP = 0
,PROCESSENTLOAD = 0
,PROCESSENTVAL = 0
,PROCESSEXPNOTE = NULL
,PROCESSENTLOADNOTE = NULL
,PROCESSENTVALNOTE = NULL
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,338 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,339 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - START
2013-11-08 11:49:14,341 DEBUG [AIF]:
DELETE FROM TDATASEG
WHERE LOADID = 22
AND (
PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
AND VALID_FLAG = 'N'
)
2013-11-08 11:49:14,342 DEBUG [AIF]: Number of Rows deleted from TDATASEG: 0
2013-11-08 11:49:14,342 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - END
2013-11-08 11:49:14,344 DEBUG [AIF]: CommData.updateAppLoadAudit - START
2013-11-08 11:49:14,344 DEBUG [AIF]:
UPDATE AIF_APPL_LOAD_AUDIT
SET EXPORT_TO_TARGET_FLAG = 'Y'
WHERE LOADID = 22
AND PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY= '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,345 DEBUG [AIF]: Number of Rows updated on AIF_APPL_LOAD_AUDIT: 1
2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateAppLoadAudit - END
2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,346 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 21
,PROCESSEXP = 1
,PROCESSEXPNOTE = 'Export Successful'
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.exportData - END
2013-11-08 11:49:14,404 DEBUG [AIF]: HfmData.loadData - START
2013-11-08 11:49:14,404 DEBUG [AIF]: CommData.getRuleInfo - START
2013-11-08 11:49:14,404 DEBUG [AIF]:
SELECT brl.RULE_ID
,br.RULE_NAME
,brl.PARTITIONKEY
,brl.CATKEY
,part.PARTVALGROUP
,br.SOURCE_SYSTEM_ID
,ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,app.APPLICATION_ID
,app.TARGET_APPLICATION_NAME
,app.TARGET_APPLICATION_TYPE
,app.DATA_LOAD_METHOD
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,br.INCL_ZERO_BALANCE_FLAG
,br.PERIOD_MAPPING_TYPE
,br.INCLUDE_ADJ_PERIODS_FLAG
,br.BALANCE_TYPE ACTUAL_FLAG
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.BALANCE_METHOD_CODE
,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
,br.CURRENCY_CODE
,br.BAL_SEG_VALUE_OPTION_CODE
,brl.EXECUTION_MODE
,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
,CASE
WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
ELSE 'NONE'
END LEDGER_GROUP_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
,br.LEDGER_GROUP
,(SELECT brd.DETAIL_CODE
FROM AIF_BAL_RULE_DETAILS brd
WHERE brd.RULE_ID = br.RULE_ID
AND brd.DETAIL_TYPE = 'LEDGER'
) PS_LEDGER
,CASE lg.LEDGER_TEMPLATE
WHEN 'COMMITMENT' THEN 'Y'
ELSE 'N'
END KK_FLAG
,p.LAST_UPDATED_BY
,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
,p.EPM_ORACLE_INSTANCE
FROM AIF_PROCESSES p
INNER JOIN AIF_BAL_RULE_LOADS brl
ON brl.LOADID = p.PROCESS_ID
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = brl.APPLICATION_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = br.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = br.LEDGER_GROUP
WHERE p.PROCESS_ID = 22
2013-11-08 11:49:14,406 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 2
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 1
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-11-08 11:49:14,407 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': 
{'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getRuleInfo - END
2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getPovList - START
2013-11-08 11:49:14,407 DEBUG [AIF]:
SELECT PARTITIONKEY
,PARTNAME
,CATKEY
,CATNAME
,PERIODKEY
,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
,RULE_ID
,RULE_NAME
FROM (
SELECT DISTINCT brl.PARTITIONKEY
,part.PARTNAME
,brl.CATKEY
,cat.CATNAME
,pprd.PERIODKEY
,pp.PERIODDESC
,brl.RULE_ID
,br.RULE_NAME
FROM AIF_BAL_RULE_LOADS brl
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = brl.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = brl.CATKEY
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = brl.LOADID
LEFT OUTER JOIN TPOVPERIOD pp
ON pp.PERIODKEY = pprd.PERIODKEY
WHERE brl.LOADID = 22
) q
ORDER BY PARTITIONKEY
,CATKEY
,PERIODKEY
,RULE_ID
2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.getPovList - END
2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,409 DEBUG [AIF]:
SELECT tlp.PROCESSSTATUS
,tlps.PROCESSSTATUSDESC
,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
FROM TLOGPROCESS tlp
,TLOGPROCESSSTATES tlps
WHERE tlp.PARTITIONKEY = 1
AND tlp.CATKEY = 1
AND tlp.PERIODKEY = '2012-01-01'
AND tlp.RULE_ID = 1
AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
2013-11-08 11:49:14,410 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 30
,PROCESSENTLOAD = 0
,PROCESSENTVAL = 0
,PROCESSENTLOADNOTE = NULL
,PROCESSENTVALNOTE = NULL
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,411 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,412 DEBUG [AIF]:
SELECT COALESCE(usr.PROFILE_OPTION_VALUE, app.PROFILE_OPTION_VALUE, site.PROFILE_OPTION_VALUE) PROFILE_OPTION_VALUE
FROM AIF_PROFILE_OPTIONS po
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES site
ON site.PROFILE_OPTION_NAME = po.PROFILE_OPTION_NAME
AND site.LEVEL_ID = 1000
AND site.LEVEL_VALUE = 0
AND site.LEVEL_ID <= po.MAX_LEVEL_ID
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES app
ON app.PROFILE_OPTION_NAME = site.PROFILE_OPTION_NAME
AND app.LEVEL_ID = 1005
AND app.LEVEL_VALUE = NULL
AND app.LEVEL_ID <= po.MAX_LEVEL_ID
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES usr
ON usr.PROFILE_OPTION_NAME = usr.PROFILE_OPTION_NAME
AND usr.LEVEL_ID = 1010
AND usr.LEVEL_VALUE = NULL
AND usr.LEVEL_ID <= po.MAX_LEVEL_ID
WHERE po.PROFILE_OPTION_NAME = 'JAVA_HOME'
2013-11-08 11:49:14,413 DEBUG [AIF]: HFM Load command:
%EPM_ORACLE_HOME%/products/FinancialDataQuality/bin/HFM_LOAD.vbs "22" "a9E3uvSJNhkFhEQTmuUFFUElfdQgKJKHrb1EsjRsL6yZJlXsOFcVPbGWHhpOQzl9zvHoo3s%2Bdq6R4yJhp0GMNWIKMTcizp3%2F8HASkA7rVufXDWEpAKAK%2BvcFmj4zLTI3rtrKHlVEYrOLMY453J2lXk6Cy771mNSD8X114CqaWSdUKGbKTRGNpgE3BfRGlEd1wZ3cra4ee0jUbT2aTaiqSN26oVe6dyxM3zolc%2BOPkjiDNk1MqwNr43tT3JsZz4qEQGF9d39DRN3CDjUuZRPt4SEKSSL35upncRJiw2uBOtV%2FvSuGLNpZ2J79v1%2Ba1Oo9c4Xhig7SFYbE6Jwk1yXRJLTSw0DKFu%2FEpcdjpOnx%2F6YawMBNIa5iu5L637S91jT1Xd3EGmxZFq%2Bi6bHdCJAC8g%3D%3D" "C%3A%5COracle%5CMiddleware%5Cuser_projects%5Cepmsystem1" "%25EPM_ORACLE_HOME%25%2F..%2Fjdk160_35"
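A side note on the quoted load command: its arguments are URL-encoded. If you want to inspect what is actually being passed (the Oracle instance path, the JDK location), Python's standard library can decode them. The two path arguments below are copied from the command above; the decoding step is purely illustrative:

```python
from urllib.parse import unquote

# The HFM_LOAD.vbs arguments in the log above are URL-encoded.
# Decoding them (token argument aside) shows the actual paths passed in.
args = [
    "C%3A%5COracle%5CMiddleware%5Cuser_projects%5Cepmsystem1",
    "%25EPM_ORACLE_HOME%25%2F..%2Fjdk160_35",
]
for a in args:
    print(unquote(a))
# → C:\Oracle\Middleware\user_projects\epmsystem1
# → %EPM_ORACLE_HOME%/../jdk160_35
```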
Is there anywhere we need to set the EPM home? Also, could someone let me know what PSU1 is?
Thanks in advance.
Praneeth -
Error while loading data from write optimized ODS to cube
Hi All,
I am loading data from a write-optimized ODS to a cube.
I have done Generate Export DataSource and
scheduled the InfoPackage with one selection for a full load;
then it gave me the following error in Transfer IDocs & tRFC:
Info IDOC 1: IDOC with errors added
Info IDOC 2: IDOC with errors added
Info IDOC 3: IDOC with errors added
Info IDOC 4: IDOC with errors added
Data package 1 arrived in BW Processing : selected number does not agree with transferred number
Processing below that is green and
shows an update of 4 new records for data package 1.
Please provide inputs for the resolution
Thanks & Regards,
Rashmi.
Please let me know what more details you need.
If I click F1 for error details i get following message
Messages from source system
see also Processing Steps Request
These messages are sent by IDoc from the source system. Both the extractor itself as well as the service API can send messages. When errors occur, several messages are usually sent together.
From the source system, there are several types of messages that can be differentiated by the so-called Info-IDoc-Status. The IDoc with status 2 plays a particular role here; it describes the number of records that have been extracted in a source system and sent to BI. The number of the records received in BI is checked against this information.
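The check described in the help text (comparing the record count reported per Info-IDoc status 2 with the records actually received in BI) can be sketched as a simple per-package reconciliation (illustrative Python, not SAP code):

```python
def reconcile(packages):
    """Compare records sent (per Info-IDoc status 2) with records received in BI.

    `packages` is a list of (sent, received) tuples, one per data package.
    Returns the package numbers whose counts disagree.
    """
    return [i for i, (sent, received) in enumerate(packages, start=1)
            if sent != received]

# Two packages out of three lost records somewhere between the
# source system and BI, so they are flagged for investigation.
print(reconcile([(25000, 0), (100, 100), (15000, 0)]))  # → [1, 3]
```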
Thanks & Regards,
Rashmi. -
Error While Loading data from HFM to Essbase
Hi,
I am trying to load data from an HFM application to an Essbase application. The LKM I am using is "HFM Data to SQL" and the IKM is "SQL to Hyperion Essbase (DATA)". I am also using a couple of mapping expressions such as {CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")}, {CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")}, etc.
The error I am getting looks like this:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 23, in ?
com.hyperion.odi.essbase.ODIEssbaseException: Invalid column type specified for Data column [Account]
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
at org.python.core.PyMethod.__call__(PyMethod.java)
at org.python.core.PyObject.__call__(PyObject.java)
at org.python.core.PyInstance.invoke(PyInstance.java)
I am mapping the Account dimension of HFM to the Account dimension of Essbase. Both the source and target column types are "string" with length 80. If, instead of keeping the Essbase application as the target, I keep an Oracle table whose columns are exactly like the Essbase application, ODI loads the data perfectly with proper conversion.
Can anybody please help me out?
Many thanks.
N. Laha.
Hi John,
Thank you very much for your response. Pasting here, for your perusal, the SQL generated at step "8. Integration - Load data into Essbase" in the Operator, in the Description tab:
from com.hyperion.odi.common import ODIConstants
from com.hyperion.odi.connection import HypAppConnectionFactory
from java.lang import Class
from java.lang import Boolean
from java.sql import *
from java.util import HashMap
# Get the select statement on the staging area:
sql= """select CASEWHEN(C3_ACCOUNT = '3110', 'PL10100', C3_ACCOUNT) "Account",CASEWHEN(C4_PERIOD = 'January', 'Jan', C4_PERIOD) "TimePeriod",'HSP_Inputvalue' "HSP_Rates",C8_SCENARIO "Scenario",CASEWHEN(C2_YEAR = '2007', 'FY09', C2_YEAR) "Years",CASEWHEN(C1_CUSTOM1 = '2012', 'Atacand', C1_CUSTOM1) "Product",CASEWHEN(C7_VIEW = 'YTD', 'Input', C7_VIEW) "Version",CASEWHEN(C6_VALUE = 'USD', 'Local', C6_VALUE) "Currency",C9_ENTITY "Entity",C5_DATAVALUE "Data" from "C$_0AZGRP_PLData" where (1=1) """
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
srcFetchSize=30
stmt.setFetchSize(srcFetchSize)
rs = stmt.executeQuery(sql)
#load the data
stats = pWriter.loadData(rs)
#close the database result set, connection
rs.close()
stmt.close()
From the SQL, you can see the CASEWHEN expressions. Yes, I have added these in the interface mapping section, with the help of the ODI expression editor. The following are the expressions I have used, one statement per mapping. This is just to make the HFM data compatible with Essbase. For example, the account number in HFM from which I am extracting the data is 3110, but there is no account '3110' in the Essbase application, so the Essbase application would not be able to load the data. To make the HFM data compatible with Essbase, these CASEWHEN statements have been used. I hope it's fine to use such statements in a mapping.
CASEWHEN(HFMDATA."Year" = '2007', 'FY09', HFMDATA."Year")
CASEWHEN(HFMDATA."View" = 'YTD', 'Input', HFMDATA."View")
CASEWHEN(HFMDATA."Entity" = '100106_LC', '1030GB', HFMDATA."Entity")
CASEWHEN(HFMDATA."Value" = 'USD', 'Local', HFMDATA."Value")
CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")
CASEWHEN(HFMDATA."Custom1" = '2012', 'Atacand', HFMDATA."Custom1")
CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")
I hope you can get some idea from the SQL and help me out!
Many thanks.
N. Laha
Edited by: user8732338 on Sep 28, 2009 9:45 PM -
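The CASEWHEN mappings in this thread are, in effect, a lookup table with a pass-through default. A minimal Python sketch of the same remapping logic (the member names are the ones quoted in the post; the dict-based helper is a hypothetical illustration, not ODI code):

```python
# Map HFM members to Essbase members; unmapped values pass through unchanged.
# These particular source/target pairs are the ones quoted in the post above.
MEMBER_MAP = {
    "Account": {"3110": "PL10100"},
    "Period": {"January": "Jan"},
    "Year": {"2007": "FY09"},
    "Value": {"USD": "Local"},
}

def remap(dimension, member):
    """CASEWHEN(x = a, b, x) expressed as a dict lookup with a default."""
    return MEMBER_MAP.get(dimension, {}).get(member, member)

print(remap("Account", "3110"))  # → PL10100
print(remap("Account", "4000"))  # → 4000 (pass-through, like the ELSE branch)
```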
Error While Loading data from ODS to Infocube
I am trying to load data from an ODS to an InfoCube through "Update ODS data into data target". My requirement was to take a small subset of fields from the ODS, design the InfoCube, and load the data.
My load fails at the extraction step: I get 0 records out of the total number of records sent in each package. Please let me know if you need more information.
Please advise.
Thanks,
R R
In the Details tab of the monitor, in the extraction step:
Extraction(messages): Errors occured
Green Light for Data Request Received
Green Light for Data Selection Scheduled
Yellow for 25000 Records sent(0 records received)
Yellow for 25000 Records sent(0 records received)
Yellow for 15000 Records sent(0 records received)
Green Light for Data Selection Ended
Please let me know, If you need more information.
Thanks,
R R -
Using FDM to load data from oracle table (Integration Import Script)
Hi,
I am using an Integration Import Script to load data from an Oracle table into the work tables in FDM.
I am getting the following error while running the script:
Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done
Attaching the full error report:
ERROR:
Code............................................. -2147217887
Description...................................... Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.
At line: 22
Procedure........................................ clsImpProcessMgr.fLoadAndProcessFile
Component........................................ upsWObjectsDM
Version.......................................... 1112
Thread........................................... 6260
IDENTIFICATION:
User............................................. ******
Computer Name.................................... *******
App Name......................................... FDMAPP
Client App....................................... WebClient
CONNECTION:
Provider......................................... ORAOLEDB.ORACLE
Data Server......................................
Database Name.................................... DBNAME
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... SCRTEST
Location ID...................................... 750
Location Seg..................................... 4
Category......................................... FDM ACTUAL
Category ID...................................... 13
Period........................................... Jun - 2011
Period ID........................................ 6/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
I am using the following script
Function ImpScrTest(strLoc, lngCatKey, dblPerKey, strWorkTableName)
'Oracle Hyperion FDM Integration Import Script:
'Created By: Dhananjay
'Date Created: 1/17/2012 10:29:53 AM
'Purpose:A test script to import data from Oracle EBS tables
Dim cnSS 'ADODB.Connection
Dim strSQL 'SQL string
Dim rs 'Recordset
Dim rsAppend 'tTB table append rs object
'Initialize objects
Set cnSS = CreateObject("ADODB.Connection")
Set rs = CreateObject("ADODB.Recordset")
Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)
'Connect to the Oracle database
cnss.open "Provider=OraOLEDB.Oracle.1;Data Source= +server+;Initial Catalog= +catalog+;User ID= +uid+;Password= +pass+"
'Create query string
strSQL = "Select AMOUNT,DESCRIPTION,ACCOUNT,ENTITY FROM +catalog+.TEST_TMP"
'Get data
rs.Open strSQL, cnSS
'Check for data
If rs.bof And rs.eof Then
RES.PlngActionType = 2
RES.PstrActionValue = "No Records to load!"
Exit Function
End If
'Loop through records and append to tTB table in location’s DB
If Not rs.bof And Not rs.eof Then
Do While Not rs.eof
rsAppend.AddNew
rsAppend.Fields("PartitionKey") = RES.PlngLocKey
rsAppend.Fields("CatKey") = RES.PlngCatKey
rsAppend.Fields("PeriodKey") = RES.PdtePerKey
rsAppend.Fields("DataView") = "YTD"
rsAppend.Fields("CalcAcctType") = 9
rsAppend.Fields("Amount") = rs.fields("Amount").Value
rsAppend.Fields("Desc1") = rs.fields("Description").Value
rsAppend.Fields("Account") = rs.fields("Account").Value
rsAppend.Fields("Entity") = rs.fields("Entity").Value
rsAppend.Update
rs.movenext
Loop
End If
'Records loaded
RES.PlngActionType = 6
RES.PstrActionValue = "Import successful!"
'Assign return value (must be assigned to the function name in VBScript)
ImpScrTest = True
End Function
Please help me on this
Thanks,
Dhananjay
Edited by: DBS on Feb 9, 2012 10:21 PM
Hi,
I found the problem. It was because of the connection string; the format is different for Oracle tables.
Please find below the format:
*cnss.open"Provider=OraOLEDB.Oracle.1;Data Source= servername:port/SID;Database= DB;User Id=aaaa;Password=aaaa;"*
And thanks *SH* for quick response.
So closing the thread......
Thanks,
Dhananjay -
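For what it's worth, the import-script pattern above (read the source rows, stamp each one with the POV keys, append to the work table) can be sketched in plain Python, with an in-memory SQLite database standing in for both the EBS source table and the FDM work table. Table and column names follow the post; this is an illustration of the pattern, not FDM API code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-ins for the EBS source table and the FDM work table.
cur.execute("CREATE TABLE TEST_TMP (AMOUNT REAL, DESCRIPTION TEXT, ACCOUNT TEXT, ENTITY TEXT)")
cur.execute("""CREATE TABLE tW (PartitionKey INT, CatKey INT, PeriodKey TEXT,
               DataView TEXT, CalcAcctType INT, Amount REAL, Desc1 TEXT,
               Account TEXT, Entity TEXT)""")
cur.execute("INSERT INTO TEST_TMP VALUES (100.0, 'Cash', '1000', '100')")

# POV keys the script reads from RES.PlngLocKey, RES.PlngCatKey, RES.PdtePerKey.
pov = {"PartitionKey": 750, "CatKey": 13, "PeriodKey": "2011-06-30"}

# Loop through the source records and append them to the work table,
# stamping each row with the POV keys, DataView and CalcAcctType.
for amount, desc, account, entity in cur.execute(
        "SELECT AMOUNT, DESCRIPTION, ACCOUNT, ENTITY FROM TEST_TMP").fetchall():
    cur.execute("INSERT INTO tW VALUES (?, ?, ?, 'YTD', 9, ?, ?, ?, ?)",
                (pov["PartitionKey"], pov["CatKey"], pov["PeriodKey"],
                 amount, desc, account, entity))
conn.commit()
print(cur.execute("SELECT COUNT(*) FROM tW").fetchone()[0])  # → 1
```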
Steps required to extract data from a flat file to BI 7.0
Hi all,
I need the steps to extract the data from a flat file to BI 7.0.
Please provide me the steps rather than just a link to SAP Help.
I need the steps only.
Thanks,
Gautam
BW 7.0
Uploading of master data
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, as the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually during the transformation in the InfoProvider.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load data to the PSA (make sure the flat file is closed during loading).
3. Creation of data targets
In left panel select info provider
Select created info area and right click to select Insert Characteristics as info provider
Select required info object ( Ex : Employee ID)
Under that info object select attributes
Right click on attributes and select create transformation.
In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
Activate created transformation
Create Data transfer process (DTP) by right clicking the master data attributes
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
4. Monitor
Right Click data targets and select manage and in contents tab select contents to view the loaded data. Alternatively monitor icon can be used.
BW 7.0
Uploading of Transaction data
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
5. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
6. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( Transaction data )
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, as the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually during the transformation in the InfoProvider.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load data to the PSA (make sure the flat file is closed during loading).
7. Creation of data targets
In left panel select info provider
Select the created InfoArea and right-click to create an ODS (DataStore object) or a cube.
Specify a name for the ODS or cube and click Create.
From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
Click Activate.
Right click on ODS or Cube and select create transformation.
In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
Activate created transformation
Create Data transfer process (DTP) by right clicking the master data attributes
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
8. Monitor
Right Click data targets and select manage and in contents tab select contents to view the loaded data. There are two tables in ODS new table and active table to load data from new table to active table you have to activate after selecting the loaded data . Alternatively monitor icon can be used
cheers
sunil -
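The Extraction-tab settings in the steps above (header rows to be ignored, CSV data format, comma separator) correspond directly to plain CSV parsing. A small sketch with a made-up employee flat file (the file contents and field names are invented for illustration):

```python
import csv
import io

# A made-up master-data flat file: one header row to ignore, comma-separated,
# matching the Extraction-tab settings described in the steps above.
flat_file = io.StringIO(
    "EMPLOYEE_ID,FIRST_NAME,COUNTRY\n"
    "1001,Anna,DE\n"
    "1002,Ravi,IN\n"
)

header_rows_to_ignore = 1
reader = csv.reader(flat_file, delimiter=",")
for _ in range(header_rows_to_ignore):
    next(reader)                     # skip the header rows, as configured

records = [row for row in reader]
print(records)  # → [['1001', 'Anna', 'DE'], ['1002', 'Ravi', 'IN']]
```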
How to load data from PSA in BI 7.0
Hi,
Gurus
I'm trying to load data from the PSA to my InfoObject master data, but I can't get into change mode, so I changed the processing to the data target. Right now it's in the PSA only.
I will reward points.
Regards
Alex
Hi,
I am just a beginner in 7.0. I had the same issue: I loaded the master data into the PSA but couldn't load it into the data target, i.e. the InfoObject (correct me if my understanding of your problem is wrong).
Try these steps:
1) Create and activate Transformation.
2) Create and activate the DTP. Then, in the DTP itself, in the <b>Execute tab</b>, click "Execute". Go to the InfoObject: right-click on the InfoObject you have created and select "Maintain InfoObject"; the data will be loaded into the InfoObject.
*Please assign points if useful. -
Loading data from SAP BW standard table to master data text
Hi,
Can anyone please tell me the steps to load data from SAP BW standard table to master data text?
How should the data source be created?
Thanks in advance.
Hi Aparajith,
Have you loaded the master data attr ?
ECC steps:
Then load the attribute text:
Create the Z-table from the standard table and a generic DataSource on it (using a table/view or function module); there you can select the fields you require.
Check the data in RSA3.
BI steps:
Check the source system connection.
Replicate the DataSource.
Create or find the master data object for the text (including attributes).
Map the transformation.
Run the InfoPackage.
Create the DTP.
Check the data in the target object.
Thanks,
Phani. -
Loading data from Z table to an ODS in BW/BI
Hello Gurus,
Can someone guide me on how to load data from a Z-table, which exists in the same BI system, into an ODS/DSO? I'm working on a 04s system.
Your help is highly appreciated.
Thanks & Regards,
Prashanth
Hi Prashanth,
You are using the generic extraction method to load data from R/3 to the BW server.
You can use transaction SBIW or RSO2 to create a generic DataSource.
Step 1 - Log on to the R/3 system.
Step 2 - Check the data in the table:
- Use transaction SE11 and enter the DB table name ZXXXXX.
- Select the "Display" button.
- Select "Contents" (Shift+Ctrl+F11) and execute.
Step 3 - Create a generic DataSource for transaction data:
- Enter transaction RSO2.
- Select transaction data and enter ZXXXXX (specify your DataSource name to create a new one).
- Select the Create icon.
- Application component: browse and choose your application component (e.g. SD).
- Choose "Extraction from DB View" and give your Z-table name as the table/view.
- Text: give a short, medium, and long description for your DataSource.
- Select the "Generic Delta" option in the toolbar.
- Give the delta-specific field: field name (e.g. PID) and select one of the radio buttons (e.g. numeric pointer).
- In the settings, select the "Additive delta" radio button (for delta loads from the ODS to the InfoCube).
- Save; enter a package name; save and continue.
- Customize the DataSource by selecting the selection checkboxes for the fields, then save.
- Message: the DataSource has been created.

SAP BW side:
Step 1:
- Enter transaction RSA13.
- Identify the R/3 source system icon and double-click the R/3 source system.
- Expand BW DataSources, then the SAP application components, and select your application component (e.g. SD).
- Context menu: "Replicate DataSources"; refresh the tree once the replication is complete.
- Find your DataSource and double-click the DataSource icon (this implies the DataSource is not yet assigned).
- Context menu: "Assign InfoSource". In the InfoSource assignment, select the "Others" radio button and the Create option.
- Choose flexible update, give a description, and continue.
- Create InfoObjects with reference to the R/3 source system fields and assign the InfoObjects to the fields respectively.
- Enter 0RECORDMODE in the communication structure and activate.
- Create the ODS object, create its structure, and activate.
- Create update rules for the ODS object with reference to the InfoSource and activate.
- Create an InfoPackage, schedule the data load, and monitor the data in the PSA and the ODS object tables.
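The generic delta with a numeric pointer described above works by remembering the highest value of the delta field from the previous load and extracting only the rows beyond it. A minimal sketch (the table rows and the PID field are invented for illustration):

```python
# Simulated Z-table rows keyed by a numeric-pointer delta field (e.g. PID).
ztable = [
    {"PID": 1, "VALUE": "A"},
    {"PID": 2, "VALUE": "B"},
    {"PID": 3, "VALUE": "C"},
]

def delta_extract(rows, pointer):
    """Return rows with PID greater than the stored pointer, plus the new pointer."""
    new_rows = [r for r in rows if r["PID"] > pointer]
    new_pointer = max((r["PID"] for r in new_rows), default=pointer)
    return new_rows, new_pointer

# Init load: the pointer starts at 0, so everything is extracted.
rows, pointer = delta_extract(ztable, 0)
print(len(rows), pointer)            # → 3 3

# Next delta load: nothing new was added, so nothing is extracted.
rows, pointer = delta_extract(ztable, pointer)
print(len(rows), pointer)            # → 0 3
```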