Load data from R3 to BI for a Business Content InfoCube

Hello
How can I create InfoPackages and DTPs to load data from R3 to BI (PSA) and from the PSA to an InfoCube?
And what are the necessary InfoPackages for InfoCube 0PT_C01?
Thank you

Hi
While installing the Business Content for the InfoCube, select the grouping "In Data Flow Before".
http://help.sap.com/saphelp_em70/helpdata/en/c2/a5cfa183be443b8184aec085cf9a85/frameset.htm
The above link covers all the Business Content related to Personal Time Management.
Please refer to the link below for the cube's Business Content installation.
http://help.sap.com/bp_bw370/BBLibrary/documentation/D13_BB_ConfigGuide_EN_DE.doc
Regards,
Venkatesh

Similar Messages

  • How to load data from XML DOM into tables using Business Components

    Hi,
    I need to upload an XML file (that's not a problem) and load the data (DOM tree) from this file into relational tables. The file looks like this:
    <Departments>
       <Department>
          <DepartmentName>OPERATIONS</DepartmentName>
          <Localization>BOSTON</Localization>
          <Employees>
             <Employee>
                <LastName>TURNER</LastName>
                <Job>SALESMAN</Job>
                <Manager>7698</Manager>
                <HireDate>1981-09-08</HireDate>
                <Salary>1500</Salary>
                <Commerce>0</Commerce>
             </Employee>
          </Employees>
       </Department>
    </Departments>
    Is there any Business Components support to obtain this? What about primary and foreign key values (there are none in the XML file)? How do I place this XML data in the appropriate tables?
    Kuba

    Please post details of your exact OS and database versions, along with a sample of the XML file and a description of the tables. What have you tried so far?
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#i1005614
    HTH
    Srini
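
    Business Components aside, one common pure-SQL route on recent Oracle versions is XMLTABLE, with sequences supplying the surrogate keys that the XML lacks. A minimal sketch, assuming hypothetical tables DEPT_TAB/EMP_TAB, sequences DEPT_SEQ/EMP_SEQ, and a bind variable :xml_doc holding the document:

    -- Parents: shred each <Department> into one row; the sequence supplies the surrogate key.
    INSERT INTO dept_tab (dept_id, dept_name, localization)
    SELECT dept_seq.NEXTVAL, x.dept_name, x.localization
    FROM XMLTABLE('/Departments/Department'
           PASSING XMLTYPE(:xml_doc)
           COLUMNS dept_name    VARCHAR2(30) PATH 'DepartmentName',
                   localization VARCHAR2(30) PATH 'Localization') x;

    -- Children: chain a second XMLTABLE over the <Employees> fragment and join back
    -- on the natural key (DepartmentName) to pick up the surrogate foreign key.
    INSERT INTO emp_tab (emp_id, dept_id, last_name, job, mgr, hire_date, salary)
    SELECT emp_seq.NEXTVAL, d.dept_id, e.last_name, e.job, e.mgr,
           TO_DATE(e.hire_date, 'YYYY-MM-DD'), e.salary
    FROM XMLTABLE('/Departments/Department'
           PASSING XMLTYPE(:xml_doc)
           COLUMNS dept_name VARCHAR2(30) PATH 'DepartmentName',
                   emps      XMLTYPE      PATH 'Employees') dx,
         XMLTABLE('/Employees/Employee'
           PASSING dx.emps
           COLUMNS last_name VARCHAR2(30) PATH 'LastName',
                   job       VARCHAR2(20) PATH 'Job',
                   mgr       NUMBER       PATH 'Manager',
                   hire_date VARCHAR2(10) PATH 'HireDate',
                   salary    NUMBER       PATH 'Salary') e,
         dept_tab d
    WHERE d.dept_name = dx.dept_name;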

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contain no requests at all. The cube uses "Delta Update". First the DSO was loaded with 138,300 records successfully; then I activated the DSO. But when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reasons for this situation?
    Thank you so much!

    Hi BI User,
    To load a delta update into the data target, there must be an initialization request in the data target with the same selection criteria.
    So first do the initialization, and then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • SOS!!!!----Error for Loading data from Oracle 11g to Essbase using ODI

    Hi all.
    I want to load data from an Oracle database to Essbase using ODI.
    I successfully configured the physical and logical Hyperion Essbase connections in Topology Manager and retrieved the Essbase structure of the BASIC database of the DEMO application.
    The problems are:
    1. When I try to view data by right-clicking on the Essbase table, I get:
    java.sql.SQLException: Driver must be specified
         at com.sunopsis.sql.SnpsConnection.a(SnpsConnection.java)
         at com.sunopsis.sql.SnpsConnection.testConnection(SnpsConnection.java)
         at com.sunopsis.sql.SnpsConnection.testConnection(SnpsConnection.java)
         at com.sunopsis.graphical.frame.b.jc.bE(jc.java)
         at com.sunopsis.graphical.frame.bo.bA(bo.java)
         at com.sunopsis.graphical.frame.b.ja.dl(ja.java)
         at com.sunopsis.graphical.frame.b.ja.<init>(ja.java)
         at com.sunopsis.graphical.frame.b.jc.<init>(jc.java)
    I got an answer from Oracle Support: it's OK, just ignore it. Then the second problem appeared.
    2. I created an interface between the Oracle database and Essbase, checked the option "Staging area different from target" (meaning the staging area is created on the Oracle database), used IKM SQL to Hyperion Essbase (METADATA), and executed the interface:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 61, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Invalid value specified [RULES_FILE] for Load option [null]
         at com.hyperion.odi.essbase.ODIEssbaseMetaWriter.validateLoadOptions(Unknown Source)
         at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx1.f$0(<string>:61)
         at org.python.pycode._pyx1.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.Py.exec(Py.java)
    I'm very confused by this. Can anybody give me a solution or point me to related docs?
    Ethan.

    Hi Ethan,
    You always need a driver present inside your <ODI installation directory>\drivers folder, for both the target and the source. Because ojdbc14.jar, the driver for Oracle, is already present inside the drivers folder, we don't have to do anything for it. But for Essbase you need a driver as well. If you haven't already done this, try it.
    Hope it helps...
    Regards
    Susane

  • R12 interface tables for loading data from a 3rd party payroll system

    Hi All,
    Can anyone help me with a list of, and detailed technical information on, the available interface tables in Oracle R12 for importing/loading data from a third-party payroll system? And what would be the best way of importing the data: should it be loaded first to AP and then to GL, or loaded directly to GL?
    Any help much appreciated. Thanks.
    Cyrus

    Hi Cyrus,
    Can you please let us know the business requirements of this integration, i.e. what the business wants to achieve from it?
    It depends on what your business requirements are. If you only need to send accounting information from the payroll system to your Oracle GL, then you can integrate the payroll system with Oracle GL directly by sending the accounting information from your payroll. If your requirement is to create payroll-related invoices in AP, make the payments in Oracle AP, and then pass the accounting information to GL, then integrate your payroll with AP.
    Regards,
    Madhav

  • Unable to load data from an ODS to a Cube

    Hi all,
    I am trying to load data from an ODS to a cube, and I'm getting the following message at the bottom of the monitor screen: "No corresponding requests were found for request REQU_BZUH9ZPYNL5XLQNB7K8IQDKFS in the ODS/Cube" (message no. RSBM043).
    I am unable to load the data. The QM status is yellow. When the process starts, the left-hand side of the monitor shows "0 of 84408 records", which is OK. I have loaded with this package before and it worked well.
    Can you tell me what I have to do? I tried waiting, but it was the same: no progress and no data in the cube.
    Thank you very much and kind regards,
    MM.
    Maybe this helps...
    When I look at the status, it says that I have a short dump in BW: CALL_FUNCTION_REMOTE_ERROR, with the short text "The function module "SUBST_TRANSFER_UPGEVAL" cannot be used for 'remote'". This short dump occurs well before I start the upload.
    Thanks again.

    Hello MM,
    Can you do the following:
    Make the total status red and delete the request from the cube.
    Go to the ODS -> remove the Data Mart status -> try loading it again.
    The error message that you get is common when we try to run an InfoPackage whose source object has no new requests (meaning all the requests available in the source have already been moved to the targets). So please try the steps given above.
    Hope it will help!

  • To load data from a cube in SCM(APO) system to a cube in BI system.

    Experts,
    Please let me know whether it is possible to load data from a cube in an SCM (APO) system to a cube in a BI system. If so, please explain the steps to perform.
    Thanks,
    Meera

    Hi,
    Think of it this way: to load data from any source we need a DataSource, OK? You can generate an export DataSource for the cube in APO, then use that DataSource for BW extraction. Try it like this; I think it will work. In my case I'm loading data directly from APO to BW using the DataSources that are generated on the Planning Area.
    Why do you need to take the data from the APO cube? Is there any condition attached to that? If it is not mandatory, you can use the same DataSource and load the data to BW. If any conditions are applied while loading the data from APO into the APO cube, check whether the same is possible in BW or not. If it is, use the DataSource and do the same calculation directly in BW.
    Thanks
    Reddy

  • DTP load data from DSO to CUBE can't debug

    Hi, all experts.
    I met a problem yesterday.
    When I load data from the DataSource to the DSO and debug the start routine, everything is OK. But when I load data from the DSO to the cube and debug the routine, I get an error like this:
    inconsistent input parameter (parameter: <unknown>, value <unknown>)
    message no. RS_EXCEPTION 101.
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative way to do this is to put a small piece of code with an infinite loop in your start routine. Then, when you start the load, go to SM50 and debug the work process that is executing the load; from there you can break out of the infinite loop and carry on with the debugging. Please let me know if you require further information.
    Thanks,
    Arminder

  • Dump when loading Data from DSO to Cube

    Hi,
    I get a short dump DBIF_REPO_PART_NOT_FOUND when trying to load data from a DSO to a cube. From DSO to DSO it works fine.
    Any ideas?
    Thanks!
    Dominik

    The dump occurs in the last step of the last data package in the DTP monitor.
    I checked the OSS note, but I don't know how to check the kernel version.
    Now I have used another DSO for testing and got another dump with a different name.
    After that I logged in again, and it works with my test DSO data...
    But it is still not working with the original one...
    @Kazmi: Which details do you mean exactly?

  • How to load data from one Infocube to another request by request using DTP

    Hi All,
    I have a scenario where we maintain a backup InfoCube B for InfoCube A. The user loads data from a flat file many times a day in different requests. We need to maintain the backup of InfoCube A on a weekly basis, i.e. the data is to be refreshed with a delta update from cube A to cube B. There are situations where the user randomly deletes some of the requests from InfoCube A after use. Will this change be reflected in backup cube B when the data refresh from cube A to cube B is performed? This functionality is similar to reconstruct in BW 3.5. We are running on BI 7.0 SP 9.
    Can anyone answer this ASAP?
    Many Thanks,
    Ravi

    You cannot pick and choose requests: a "Get Data by Request" DTP loads request by request on a first-in-first-out basis. You can run some pseudo/fake DTPs if you don't want to load data from a particular request.
    If the user deletes a request from cube A, it won't be loaded to cube B. But if it has already been loaded into cube B and the user later deletes the request from cube A, you have to delete the request from cube B as well. In order to monitor request by request, run the DTP with "Get Data by Request" set.

  • Loading data from Cube to Planning area

    Hi,
    If I am loading data from a cube to a planning area using transaction TSCUBE, does the system load data into the planning area only for the combinations that exist in the cube, or does it load for all CVCs?
    For example, my CVC is Plant, Material, Customer.
    Suppose there are 4 CVCs in the POS that were previously generated:
    Plant  Material  Customer
    01     M1        C1
    01     M2        C3
    01     M2        C2
    01     M4        C5
    and the cube has data like this (it does not have the last combination):
    Plant  Material  Customer  Qty.
    01     M1        C1        10
    01     M2        C3        20
    01     M2        C2        5
    If I then use transaction TSCUBE to load data from this cube into the planning area, is the data loaded as
    Plant  Material  Customer  Qty.
    01     M1        C1        10
    01     M2        C3        20
    01     M2        C2        5
    i.e. only for the 3 combinations that exist in the cube, with nothing loaded for the last one,
    OR is the data loaded as
    Plant  Material  Customer  Qty.
    01     M1        C1        10
    01     M2        C3        20
    01     M2        C2        5
    01     M4        C5        0
    i.e. all 4 combinations, with 0 sent because the cube does not have the last combination?
    Hope I am clear on this question.
    Thanks.

    Thanks a lot Vinod, Srinivas and Harish. The reason why I am asking is that we have a scenario where we run into this situation.
    We initially get data from R/3 to BW to APO like the below:
    Plant  Material  Customer  Qty.
    01     M1        C1        10
    Later, when the customer is changed or bought out by somebody, C1 is changed to C2. Sometimes, when the business does not initially know who the customer is, they just put C1 as a dummy and after some time replace it with C2. Then the new record coming in is as follows:
    Plant  Material  Customer  Qty.
    01     M1        C2        10
    BW can identify changes in transaction data but not in master data. What I mean by this is that when Qty. 10 changes from 10 to 20, the system can identify it in the deltas. If the customer (master data) changes from C1 to C2, the system thinks it is a new record altogether, so if I use delta loads I get the following:
    Plant  Material  Customer  Qty.
    01     M1        C1        10
    01     M1        C2        10
    If I am looking at the Plant and Material level, my data is doubled.
    So we are planning to do a full load that works like this:
    1. Initial data like the below:
    Plant  Material  Customer  Qty.
    01     M1        C1        10
    The CVC is created and the planning area has Qty. 10.
    2. Then we delete the contents of the cube and do a full load into the cube with the changed customer:
    Plant  Material  Customer  Qty.
    01     M1        C2        10
    This time a new CVC is created, and another 10 is loaded into the planning area.
    If the system loads all CVCs, then it would send:
    Plant  Material  Customer  Qty.
    01     M1        C1        0
    01     M1        C2        10
    If the system loads only the combinations in the cube, then it loads:
    Plant  Material  Customer  Qty.
    01     M1        C2        10
    but the system already has another 10 for customer C1, duplicating the values.
    We are in trouble in the second case.
    We had to go for this solution instead of realignment because our business has no way of knowing that C1 was replaced by C2.
    Hope I am clear.
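
    To make the doubling concrete, here is a tiny illustration of the delta case above (hypothetical throwaway table; plain Oracle SQL):

    -- The two delta records described above end up side by side:
    CREATE TABLE demo_load (plant VARCHAR2(2), material VARCHAR2(4), customer VARCHAR2(4), qty NUMBER);
    INSERT INTO demo_load VALUES ('01', 'M1', 'C1', 10);  -- original record with the dummy customer
    INSERT INTO demo_load VALUES ('01', 'M1', 'C2', 10);  -- same quantity again under the corrected customer
    -- Aggregated at the Plant/Material level, the quantity doubles to 20 instead of 10:
    SELECT plant, material, SUM(qty) AS qty
    FROM demo_load
    GROUP BY plant, material;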

  • Help needed with loading data from ODS to cube

    Hi
    I am loading data from an ODS to a cube in QA. The update rules are active and the definition seems right. I am getting the following error in the update rules:
    "Error in the unit conversion. Error 110 occurred"
    Help.
    Thanks.

    Hi Chintai,
    You can see the record number where the error occurred in the monitor (RSMO) for this data load: go to the Details tab and open up the Processing area (+ sign). Try it out...
    Also, ignoring the error record and uploading the rest is possible if you have set Error Handling in your InfoPackage (Update tab), but this has to be done before the load starts, not after the error happens.
    Hope this helps...
    And since you thanked me twice, also please see here:-) https://www.sdn.sap.com/irj/sdn?rid=/webcontent/uuid/7201c12f-0701-0010-f2a6-de5f8ea81a9e [original link is broken]

  • Problem in Loading Data from SQL Server 2000 to Oracle 10g

    Hi All,
    I am a university student and am using ODI for my final project on real-time data warehousing. I am trying to load data from MS SQL Server 2000 into an Oracle 10g target table. Everything goes fine until I execute the interface for the initial load. When I choose the CKM Oracle (Create unique index on the I$ table) KM, the following step fails:
    21 - Integration - Prj_Dim_RegionInterface - Create Unique Index on flow table
    Where Prj_Dim_Region is the name of my target table in Oracle.
    The error message is:
    955 : 42000 : java.sql.SQLException: ORA-00955: name is already used by an existing object
    java.sql.SQLException: ORA-00955: name is already used by an existing object
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:282)
         at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:639)
         at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:185)
         at oracle.jdbc.driver.T4CPreparedStatement.execute_for_rows(T4CPreparedStatement.java:633)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1086)
         at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:2984)
         at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3057)
         at com.sunopsis.sql.SnpsQuery.executeUpdate(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execStdOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    I am using a surrogate key column in my target table along with the natural key. The natural key is populated by the primary key of my source table, but for the surrogate key I have created a sequence in the Oracle schema where the target table exists, and I have used the following code for the mapping:
    <%=snpRef.getObjectName( "L" , "SQ_PRJ_DIM_REGION" , "D" )%>.nextval
    I have chosen to execute this code on target.
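
    (For what it's worth, getObjectName("L", ..., "D") resolves at runtime to the schema-qualified name of the object, so the sequence itself must already exist in the Oracle target schema. A minimal sketch of that one-time DDL, using the name from the mapping above:)

    CREATE SEQUENCE SQ_PRJ_DIM_REGION START WITH 1 INCREMENT BY 1;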
    One of my attempts to solve this problem was to set the Create Index option of the CKM Oracle (Create Index for the I$ Table) to No so that it would not create any index on the flow table. I also tried to use the simple CKM Oracle KM. Both solutions allowed the interface to execute successfully without any errors, but the data was not loaded into the target table.
    When I right-click on the Prj_Dim_Region data store and choose Data, it shows empty. Pressing the SQL button in this data store opens a "New Query" dialog box where I see this query:
    select * from NOVELTYFURNITUREDW.PRJ_DIM_REGION
    But when I press OK to run it, I get this error message:
    java.lang.IllegalArgumentException: Row index out of range
         at javax.swing.JTable.boundRow(Unknown Source)
         at javax.swing.JTable.setRowSelectionInterval(Unknown Source)
         at com.borland.dbswing.JdbTable.accessChange(JdbTable.java:2959)
         at com.borland.dx.dataset.AccessEvent.dispatch(Unknown Source)
         at com.borland.jb.util.EventMulticaster.dispatch(Unknown Source)
         at com.borland.dx.dataset.DataSet.a(Unknown Source)
         at com.borland.dx.dataset.DataSet.a(Unknown Source)
         at com.borland.dx.dataset.DataSet.a(Unknown Source)
         at com.borland.dx.dataset.DataSet.open(Unknown Source)
         at com.borland.dx.dataset.StorageDataSet.refresh(Unknown Source)
         at com.borland.dx.sql.dataset.QueryDataSet.refresh(Unknown Source)
         at com.borland.dx.sql.dataset.QueryDataSet.executeQuery(Unknown Source)
         at com.sunopsis.graphical.frame.a.cg.actionPerformed(cg.java)
         at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
         at javax.swing.AbstractButton$ForwardActionEvents.actionPerformed(Unknown Source)
         at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
         at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
         at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source)
         at java.awt.AWTEventMulticaster.mouseReleased(Unknown Source)
         at java.awt.Component.processMouseEvent(Unknown Source)
         at java.awt.Component.processEvent(Unknown Source)
         at java.awt.Container.processEvent(Unknown Source)
         at java.awt.Component.dispatchEventImpl(Unknown Source)
         at java.awt.Container.dispatchEventImpl(Unknown Source)
         at java.awt.Component.dispatchEvent(Unknown Source)
         at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
         at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
         at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
         at java.awt.Container.dispatchEventImpl(Unknown Source)
         at java.awt.Window.dispatchEventImpl(Unknown Source)
         at java.awt.Component.dispatchEvent(Unknown Source)
         at java.awt.EventQueue.dispatchEvent(Unknown Source)
         at java.awt.EventDispatchThread.pumpOneEventForHierarchy(Unknown Source)
         at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
         at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
         at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
         at java.awt.EventDispatchThread.run(Unknown Source)
    I do not understand what the problem is and have wasted days trying to figure it out. Any help will be highly appreciated, as the deadline for this project is very close.
    Thank you so much in advance.
    Neel
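
    Regarding the ORA-00955 above: that error simply means the KM tried to create an object (here the unique index on the I$ flow table) under a name that already exists in the schema, often a leftover from an earlier failed run. A hedged diagnostic sketch, using a SQL*Plus-style substitution variable for whatever name the failing "Create Unique Index" step reports:

    -- Find who owns the colliding name; paste the name from the failing KM step.
    SELECT owner, object_name, object_type
    FROM all_objects
    WHERE object_name = UPPER('&name_from_failing_step');
    -- If it is a leftover I$/E$ work object from a prior run, dropping it clears the collision.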

    Hi Cezar,
    Can you please help me with this scenario?
    I have one Oracle data model with 19 source tables and one SQL Server data model with 10 target tables. I have created 10 interfaces which use JournalizedDataOnly on one of the tables in the interface; e.g. in the interface for the DimCustomer target table I have 2 tables, namely Customer and Address, but the journalizing filter appears only on the Customer table, and this option is disabled for Address automatically.
    Now I want to create a package using OdiWaitForLogData event detection. Is it possible to put all these 10 interfaces in just one package to populate the target tables? It works fine when I have only one interface and use the name of one table from the interface for the Table Name parameter of the OdiWaitForLogData event, but when I try a comma-separated list of table names [Customer, Address], this error happens:
    java.sql.SQLException: ORA-00942: table or view does not exist
    and if I use this method, <%=odiRef.getObjectName("L","model_code","logical_schema","D")%>, I get this error:
    "-CDC_SET_NAME=Exception getObjectName("L", "model_code", "logical_schema", "D") : SnpLSchema.getLSchemaByName : SnpLschema does not exist" "
    Please let me know how to make this work. Do I need to create separate data models, each including only those tables which appear in the corresponding interface and package? Or do I need to create multiple packages, each with only one journalized interface, to populate only one target table?
    Thank you for your time in advance.
    Regards,
    Neel

  • Unable to load data from FDMEE staging table to HFM tables

    Hi,
    We have installed EPM 11.1.2.3 with all the latest related products (ODI/FDMEE) in our development environment.
    We are in the process of loading data from EBS R12 to HFM using ERPI (Data Management in EPM 11.1.2.3). We can import and validate the data, but when we try to export the data to HFM, the process keeps running for hours (it neither gives any error nor completes).
    When we check the process details in the ODI work repository, the statuses are as follows:
    COMM_LOAD_BALANCES - Running ......... (for the past 1 day, still running)
    EBS_GL_LOAD_BALANCES_DATA - Successful
    COMM_LOAD_BALANCES - Successful
    We can load data into the staging table of the FDMEE database schema, and we are even able to drill through to the source system (EBS R12) from the Data Load Workbench, but we are not able to load the data into the HFM application.
    Log details from the process are below.
    2013-11-05 17:04:59,747 INFO  [AIF]: FDMEE Process Start, Process ID: 31
    2013-11-05 17:04:59,747 INFO  [AIF]: FDMEE Logging Level: 4
    2013-11-05 17:04:59,748 INFO  [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_31.log
    2013-11-05 17:04:59,748 INFO  [AIF]: User:admin
    2013-11-05 17:04:59,748 INFO  [AIF]: Location:VisionLoc (Partitionkey:1)
    2013-11-05 17:04:59,749 INFO  [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
    2013-11-05 17:04:59,749 INFO  [AIF]: Category Name:VisionCat (Category key:3)
    2013-11-05 17:04:59,749 INFO  [AIF]: Rule Name:VisionRule (Rule ID:2)
    2013-11-05 17:05:00,844 INFO  [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
    [Oracle JRockit(R) (Oracle Corporation)]
    2013-11-05 17:05:00,844 INFO  [AIF]: Java Platform: java1.6.0_37
    2013-11-05 17:05:02,910 INFO  [AIF]: -------START IMPORT STEP-------
    2013-11-05 17:05:02,953 INFO  [AIF]: -------END IMPORT STEP-------
    2013-11-05 17:05:03,030 INFO  [AIF]: -------START EXPORT STEP-------
    2013-11-05 17:05:03,108 INFO  [AIF]:
    Move Data for Period 'JAN'
    Any help on above is much appreciated.
    Thank you
    Regards
    Praneeth

    Hi,
    I have followed steps 1 & 2 above. Now the log shows something like the below:
    2013-11-05 09:47:31,179 INFO  [AIF]: FDMEE Process Start, Process ID: 22
    2013-11-05 09:47:31,179 INFO  [AIF]: FDMEE Logging Level: 4
    2013-11-05 09:47:31,179 INFO  [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
    2013-11-05 09:47:31,180 INFO  [AIF]: User:admin
    2013-11-05 09:47:31,180 INFO  [AIF]: Location:VisionLoc (Partitionkey:1)
    2013-11-05 09:47:31,180 INFO  [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
    2013-11-05 09:47:31,180 INFO  [AIF]: Category Name:VisionCat (Category key:3)
    2013-11-05 09:47:31,181 INFO  [AIF]: Rule Name:VisionRule (Rule ID:2)
    2013-11-05 09:47:32,378 INFO  [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
    [Oracle JRockit(R) (Oracle Corporation)]
    2013-11-05 09:47:32,378 INFO  [AIF]: Java Platform: java1.6.0_37
    2013-11-05 09:47:34,652 INFO  [AIF]: -------START IMPORT STEP-------
    2013-11-05 09:47:34,698 INFO  [AIF]: -------END IMPORT STEP-------
    2013-11-05 09:47:34,744 INFO  [AIF]: -------START EXPORT STEP-------
    2013-11-05 09:47:34,828 INFO  [AIF]:
    Move Data for Period 'JAN'
    2013-11-08 11:49:10,478 INFO  [AIF]: FDMEE Process Start, Process ID: 22
    2013-11-08 11:49:10,493 INFO  [AIF]: FDMEE Logging Level: 5
    2013-11-08 11:49:10,493 INFO  [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
    2013-11-08 11:49:10,494 INFO  [AIF]: User:admin
    2013-11-08 11:49:10,494 INFO  [AIF]: Location:VISIONLOC (Partitionkey:1)
    2013-11-08 11:49:10,494 INFO  [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
    2013-11-08 11:49:10,495 INFO  [AIF]: Category Name:VISIONCAT (Category key:1)
    2013-11-08 11:49:10,495 INFO  [AIF]: Rule Name:VISIONRULE (Rule ID:1)
    2013-11-08 11:49:11,903 INFO  [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
    [Oracle JRockit(R) (Oracle Corporation)]
    2013-11-08 11:49:11,909 INFO  [AIF]: Java Platform: java1.6.0_37
    2013-11-08 11:49:14,037 INFO  [AIF]: -------START IMPORT STEP-------
    2013-11-08 11:49:14,105 INFO  [AIF]: -------END IMPORT STEP-------
    2013-11-08 11:49:14,152 INFO  [AIF]: -------START EXPORT STEP-------
    2013-11-08 11:49:14,178 DEBUG [AIF]: CommData.exportData - START
    2013-11-08 11:49:14,183 DEBUG [AIF]: CommData.getRuleInfo - START
    2013-11-08 11:49:14,188 DEBUG [AIF]:
            SELECT brl.RULE_ID
            ,br.RULE_NAME
            ,brl.PARTITIONKEY
            ,brl.CATKEY
            ,part.PARTVALGROUP
            ,br.SOURCE_SYSTEM_ID
            ,ss.SOURCE_SYSTEM_TYPE
            ,CASE
              WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
              WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
              WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
              WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
              ELSE 'Y'
            END SOURCE_ADAPTER_FLAG
            ,app.APPLICATION_ID
            ,app.TARGET_APPLICATION_NAME
            ,app.TARGET_APPLICATION_TYPE
            ,app.DATA_LOAD_METHOD
            ,brl.PLAN_TYPE
            ,CASE brl.PLAN_TYPE
              WHEN 'PLAN1' THEN 1
              WHEN 'PLAN2' THEN 2
              WHEN 'PLAN3' THEN 3
              WHEN 'PLAN4' THEN 4
              WHEN 'PLAN5' THEN 5
              ELSE 0
            END PLAN_NUMBER
            ,br.INCL_ZERO_BALANCE_FLAG
            ,br.PERIOD_MAPPING_TYPE
            ,br.INCLUDE_ADJ_PERIODS_FLAG
            ,br.BALANCE_TYPE ACTUAL_FLAG
            ,br.AMOUNT_TYPE
            ,br.BALANCE_SELECTION
            ,br.BALANCE_METHOD_CODE
            ,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
            ,br.CURRENCY_CODE
            ,br.BAL_SEG_VALUE_OPTION_CODE
            ,brl.EXECUTION_MODE
            ,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
            ,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
            ,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
            ,CASE
              WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
              WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
              ELSE 'NONE'
            END LEDGER_GROUP_CODE
            ,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
            ,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
            ,br.LEDGER_GROUP
            ,(SELECT brd.DETAIL_CODE
              FROM AIF_BAL_RULE_DETAILS brd
              WHERE brd.RULE_ID = br.RULE_ID
              AND brd.DETAIL_TYPE = 'LEDGER'       
            ) PS_LEDGER
            ,CASE lg.LEDGER_TEMPLATE
              WHEN 'COMMITMENT' THEN 'Y'
              ELSE 'N'
            END KK_FLAG
            ,p.LAST_UPDATED_BY
            ,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
            ,p.EPM_ORACLE_INSTANCE
            FROM AIF_PROCESSES p
            INNER JOIN AIF_BAL_RULE_LOADS brl
              ON brl.LOADID = p.PROCESS_ID
            INNER JOIN AIF_BALANCE_RULES br
              ON br.RULE_ID = brl.RULE_ID
            INNER JOIN AIF_SOURCE_SYSTEMS ss
              ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
            INNER JOIN AIF_TARGET_APPLICATIONS app
              ON app.APPLICATION_ID = brl.APPLICATION_ID
            INNER JOIN TPOVPARTITION part
              ON part.PARTITIONKEY = br.PARTITIONKEY
            INNER JOIN TBHVIMPGROUP imp
              ON imp.IMPGROUPKEY = part.PARTIMPGROUP
            LEFT OUTER JOIN AIF_COA_LEDGERS l
              ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
              AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
            LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
              ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
              AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
              AND scr.RECNAME = 'LED_GRP_TBL'
            LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
              ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
              AND lg.SETID = scr.SETID
              AND lg.LEDGER_GROUP = br.LEDGER_GROUP
            WHERE p.PROCESS_ID = 22
    2013-11-08 11:49:14,195 DEBUG [AIF]:
          SELECT adim.BALANCE_COLUMN_NAME DIMNAME
          ,adim.DIMENSION_ID
          ,dim.TARGET_DIMENSION_CLASS_NAME
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
          ) COA_SEGMENT_NAME1
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
          ) COA_SEGMENT_NAME2
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
          ) COA_SEGMENT_NAME3
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
          ) COA_SEGMENT_NAME4
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
          ) COA_SEGMENT_NAME5
          ,(SELECT CASE mdd.ORPHAN_OPTION_CODE
              WHEN 'CHILD' THEN 'N'
              WHEN 'ROOT' THEN 'N'
              ELSE 'Y'
            END DIMENSION_FILTER_FLAG
            FROM AIF_MAP_DIM_DETAILS_V mdd
            ,AIF_MAPPING_RULES mr
            WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
            AND mdd.RULE_ID = mr.RULE_ID
            AND mdd.DIMENSION_ID = adim.DIMENSION_ID
          ) DIMENSION_FILTER_FLAG
          ,tiie.IMPCONCATCHAR
          FROM TPOVPARTITION tpp
          INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
            ON adim.APPLICATION_ID = 2
          INNER JOIN AIF_DIMENSIONS dim
            ON dim.DIMENSION_ID = adim.DIMENSION_ID
          LEFT OUTER JOIN TBHVIMPITEMERPI tiie
            ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
            AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
            AND tiie.IMPMAPTYPE = 'ERP'
          WHERE tpp.PARTITIONKEY = 1
          AND adim.BALANCE_COLUMN_NAME IS NOT NULL
          ORDER BY adim.BALANCE_COLUMN_NAME
    2013-11-08 11:49:14,197 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
    2013-11-08 11:49:14,197 DEBUG [AIF]: CommData.getRuleInfo - END
    2013-11-08 11:49:14,224 DEBUG [AIF]: CommData.insertPeriods - START
    2013-11-08 11:49:14,228 DEBUG [AIF]: CommData.getLedgerListAndMap - START
    2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - START
    2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - END
    2013-11-08 11:49:14,229 DEBUG [AIF]:
              SELECT l.SOURCE_LEDGER_ID
              ,l.SOURCE_LEDGER_NAME
              ,l.SOURCE_COA_ID
              ,l.CALENDAR_ID
              ,'0' SETID
              ,l.PERIOD_TYPE
              ,NULL LEDGER_TABLE_NAME
              FROM AIF_BALANCE_RULES br
              ,AIF_COA_LEDGERS l
              WHERE br.RULE_ID = 1
              AND l.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
              AND l.SOURCE_LEDGER_ID = br.SOURCE_LEDGER_ID
    2013-11-08 11:49:14,230 DEBUG [AIF]: CommData.getLedgerListAndMap - END
    2013-11-08 11:49:14,232 DEBUG [AIF]:
            INSERT INTO AIF_PROCESS_PERIODS (
              PROCESS_ID
              ,PERIODKEY
              ,PERIOD_ID
              ,ADJUSTMENT_PERIOD_FLAG
              ,GL_PERIOD_YEAR
              ,GL_PERIOD_NUM
              ,GL_PERIOD_NAME
              ,GL_PERIOD_CODE
              ,GL_EFFECTIVE_PERIOD_NUM
              ,YEARTARGET
              ,PERIODTARGET
              ,IMP_ENTITY_TYPE
              ,IMP_ENTITY_ID
              ,IMP_ENTITY_NAME
              ,TRANS_ENTITY_TYPE
              ,TRANS_ENTITY_ID
              ,TRANS_ENTITY_NAME
              ,PRIOR_PERIOD_FLAG
              ,SOURCE_LEDGER_ID
                    SELECT DISTINCT brl.LOADID PROCESS_ID
                    ,pp.PERIODKEY PERIODKEY
                    ,prd.PERIOD_ID
                    ,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
                    ,COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) GL_PERIOD_YEAR
                    ,COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) GL_PERIOD_NUM
                    ,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') GL_PERIOD_NAME
                    ,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) AS VARCHAR(38)),'0') GL_PERIOD_CODE
                    ,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
                    ,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
                    ,COALESCE(ppa.PERIODTARGET, pp.PERIODTARGET) PERIODTARGET
                    ,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
                    ,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) IMP_ENTITY_ID
                    ,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') IMP_ENTITY_NAME
                    ,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
                    ,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) TRANS_ENTITY_ID
                    ,pp.PERIODDESC TRANS_ENTITY_NAME
                    ,'N' PRIOR_PERIOD_FLAG
                    ,2202 SOURCE_LEDGER_ID
                    FROM (
                      AIF_BAL_RULE_LOADS brl
                      INNER JOIN TPOVCATEGORY pc
                        ON pc.CATKEY = brl.CATKEY
                      INNER JOIN TPOVPERIOD_FLAT_V pp
                        ON pp.PERIODFREQ = pc.CATFREQ
                        AND pp.PERIODKEY >= brl.START_PERIODKEY
                        AND pp.PERIODKEY <= brl.END_PERIODKEY
                      LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
                        ON ppa.PERIODKEY = pp.PERIODKEY
                        AND ppa.PERIODFREQ = pp.PERIODFREQ
                        AND ppa.INTSYSTEMKEY = 'OASIS'
                    INNER JOIN TPOVPERIODSOURCE ppsrc
                      ON ppsrc.PERIODKEY = pp.PERIODKEY
                      AND ppsrc.MAPPING_TYPE = 'EXPLICIT'
                      AND ppsrc.SOURCE_SYSTEM_ID = 2
                      AND ppsrc.CALENDAR_ID IN ('29067')
                    LEFT OUTER JOIN AIF_GL_PERIODS_STG prd
                      ON prd.PERIOD_ID = ppsrc.PERIOD_ID
                      AND prd.SOURCE_SYSTEM_ID = ppsrc.SOURCE_SYSTEM_ID
                      AND prd.CALENDAR_ID = ppsrc.CALENDAR_ID
              AND prd.SETID = '0'
              AND prd.PERIOD_TYPE = '507'
                      AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
                    WHERE brl.LOADID = 22
                    ORDER BY pp.PERIODKEY
                    ,GL_EFFECTIVE_PERIOD_NUM
    2013-11-08 11:49:14,235 DEBUG [AIF]: CommData.insertPeriods - END
    2013-11-08 11:49:14,240 DEBUG [AIF]: CommData.moveData - START
    2013-11-08 11:49:14,242 DEBUG [AIF]: CommData.getPovList - START
    2013-11-08 11:49:14,242 DEBUG [AIF]:
            SELECT PARTITIONKEY
            ,PARTNAME
            ,CATKEY
            ,CATNAME
            ,PERIODKEY
            ,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
            ,RULE_ID
            ,RULE_NAME
            FROM (
              SELECT DISTINCT brl.PARTITIONKEY
              ,part.PARTNAME
              ,brl.CATKEY
              ,cat.CATNAME
              ,pprd.PERIODKEY
              ,pp.PERIODDESC
              ,brl.RULE_ID
              ,br.RULE_NAME
              FROM AIF_BAL_RULE_LOADS brl
              INNER JOIN AIF_BALANCE_RULES br
                ON br.RULE_ID = brl.RULE_ID
              INNER JOIN TPOVPARTITION part
                ON part.PARTITIONKEY = brl.PARTITIONKEY
              INNER JOIN TPOVCATEGORY cat
                ON cat.CATKEY = brl.CATKEY
              INNER JOIN AIF_PROCESS_PERIODS pprd
                ON pprd.PROCESS_ID = brl.LOADID
              LEFT OUTER JOIN TPOVPERIOD pp
                ON pp.PERIODKEY = pprd.PERIODKEY
              WHERE brl.LOADID = 22
            ) q
            ORDER BY PARTITIONKEY
            ,CATKEY
            ,PERIODKEY
            ,RULE_ID
    2013-11-08 11:49:14,244 DEBUG [AIF]: CommData.getPovList - END
    2013-11-08 11:49:14,245 INFO  [AIF]:
    Move Data for Period 'JAN'
    2013-11-08 11:49:14,246 DEBUG [AIF]:
              UPDATE TDATASEG
              SET LOADID = 22
              WHERE PARTITIONKEY = 1
              AND CATKEY = 1
              AND RULE_ID = 1
              AND LOADID < 22
                AND PERIODKEY = '2012-01-01'
    2013-11-08 11:49:14,320 DEBUG [AIF]: Number of Rows updated on TDATASEG: 1842
    2013-11-08 11:49:14,320 DEBUG [AIF]:
            INSERT INTO AIF_APPL_LOAD_AUDIT (
              LOADID
              ,TARGET_APPLICATION_TYPE
              ,TARGET_APPLICATION_NAME
              ,PLAN_TYPE
              ,SOURCE_LEDGER_ID
              ,EPM_YEAR
              ,EPM_PERIOD
              ,SNAPSHOT_FLAG
              ,SEGMENT_FILTER_FLAG
              ,PARTITIONKEY
              ,CATKEY
              ,RULE_ID
              ,PERIODKEY
              ,EXPORT_TO_TARGET_FLAG
            SELECT DISTINCT 22
            ,TARGET_APPLICATION_TYPE
            ,TARGET_APPLICATION_NAME
            ,PLAN_TYPE
            ,SOURCE_LEDGER_ID
            ,EPM_YEAR
            ,EPM_PERIOD
            ,SNAPSHOT_FLAG
            ,SEGMENT_FILTER_FLAG
            ,PARTITIONKEY
            ,CATKEY
            ,RULE_ID
            ,PERIODKEY
            ,'N'
            FROM AIF_APPL_LOAD_AUDIT
            WHERE PARTITIONKEY = 1
            AND CATKEY = 1
            AND RULE_ID = 1
            AND LOADID < 22
                AND PERIODKEY = '2012-01-01'
    2013-11-08 11:49:14,321 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_AUDIT: 1
    2013-11-08 11:49:14,322 DEBUG [AIF]:
            INSERT INTO AIF_APPL_LOAD_PRD_AUDIT (
              LOADID
              ,GL_PERIOD_ID
              ,GL_PERIOD_YEAR
              ,DELTA_RUN_ID
              ,PARTITIONKEY
              ,CATKEY
              ,RULE_ID
              ,PERIODKEY
            SELECT DISTINCT 22
            ,GL_PERIOD_ID
            ,GL_PERIOD_YEAR
            ,DELTA_RUN_ID
            ,PARTITIONKEY
            ,CATKEY
            ,RULE_ID
            ,PERIODKEY
            FROM AIF_APPL_LOAD_PRD_AUDIT
            WHERE PARTITIONKEY = 1
            AND CATKEY = 1
            AND RULE_ID = 1
            AND LOADID < 22
                AND PERIODKEY = '2012-01-01'
    2013-11-08 11:49:14,323 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_PRD_AUDIT: 1
    2013-11-08 11:49:14,325 DEBUG [AIF]: CommData.moveData - END
    2013-11-08 11:49:14,332 DEBUG [AIF]: CommData.updateWorkflow - START
    2013-11-08 11:49:14,332 DEBUG [AIF]:
        SELECT tlp.PROCESSSTATUS
        ,tlps.PROCESSSTATUSDESC
        ,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
        FROM TLOGPROCESS tlp
        ,TLOGPROCESSSTATES tlps
        WHERE tlp.PARTITIONKEY = 1
        AND tlp.CATKEY = 1
        AND tlp.PERIODKEY = '2012-01-01'
        AND tlp.RULE_ID = 1
        AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
    2013-11-08 11:49:14,336 DEBUG [AIF]:
            UPDATE TLOGPROCESS
            SET PROCESSENDTIME = CURRENT_TIMESTAMP
            ,PROCESSSTATUS = 20
              ,PROCESSEXP = 0
              ,PROCESSENTLOAD = 0
              ,PROCESSENTVAL = 0
              ,PROCESSEXPNOTE = NULL
              ,PROCESSENTLOADNOTE = NULL
              ,PROCESSENTVALNOTE = NULL
            WHERE PARTITIONKEY = 1
            AND CATKEY = 1
            AND PERIODKEY = '2012-01-01'
            AND RULE_ID = 1
    2013-11-08 11:49:14,338 DEBUG [AIF]: CommData.updateWorkflow - END
    2013-11-08 11:49:14,339 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - START
    2013-11-08 11:49:14,341 DEBUG [AIF]:
            DELETE FROM TDATASEG
            WHERE LOADID = 22
              AND (
            PARTITIONKEY = 1
            AND CATKEY = 1
            AND PERIODKEY = '2012-01-01'
            AND RULE_ID = 1
            AND VALID_FLAG = 'N'
    2013-11-08 11:49:14,342 DEBUG [AIF]: Number of Rows deleted from TDATASEG: 0
    2013-11-08 11:49:14,342 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - END
    2013-11-08 11:49:14,344 DEBUG [AIF]: CommData.updateAppLoadAudit - START
    2013-11-08 11:49:14,344 DEBUG [AIF]:
            UPDATE AIF_APPL_LOAD_AUDIT
            SET EXPORT_TO_TARGET_FLAG = 'Y'
            WHERE LOADID = 22
            AND PARTITIONKEY = 1
            AND CATKEY = 1
            AND PERIODKEY= '2012-01-01'
            AND RULE_ID = 1
    2013-11-08 11:49:14,345 DEBUG [AIF]: Number of Rows updated on AIF_APPL_LOAD_AUDIT: 1
    2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateAppLoadAudit - END
    2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateWorkflow - START
    2013-11-08 11:49:14,346 DEBUG [AIF]:
            UPDATE TLOGPROCESS
            SET PROCESSENDTIME = CURRENT_TIMESTAMP
            ,PROCESSSTATUS = 21
              ,PROCESSEXP = 1
              ,PROCESSEXPNOTE = 'Export Successful'
            WHERE PARTITIONKEY = 1
            AND CATKEY = 1
            AND PERIODKEY = '2012-01-01'
            AND RULE_ID = 1
    2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.updateWorkflow - END
    2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.exportData - END
    2013-11-08 11:49:14,404 DEBUG [AIF]: HfmData.loadData - START
    2013-11-08 11:49:14,404 DEBUG [AIF]: CommData.getRuleInfo - START
    2013-11-08 11:49:14,404 DEBUG [AIF]:
            SELECT brl.RULE_ID
            ,br.RULE_NAME
            ,brl.PARTITIONKEY
            ,brl.CATKEY
            ,part.PARTVALGROUP
            ,br.SOURCE_SYSTEM_ID
            ,ss.SOURCE_SYSTEM_TYPE
            ,CASE
              WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
              WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
              WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
              WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
              ELSE 'Y'
            END SOURCE_ADAPTER_FLAG
            ,app.APPLICATION_ID
            ,app.TARGET_APPLICATION_NAME
            ,app.TARGET_APPLICATION_TYPE
            ,app.DATA_LOAD_METHOD
            ,brl.PLAN_TYPE
            ,CASE brl.PLAN_TYPE
              WHEN 'PLAN1' THEN 1
              WHEN 'PLAN2' THEN 2
              WHEN 'PLAN3' THEN 3
              WHEN 'PLAN4' THEN 4
              WHEN 'PLAN5' THEN 5
              ELSE 0
            END PLAN_NUMBER
            ,br.INCL_ZERO_BALANCE_FLAG
            ,br.PERIOD_MAPPING_TYPE
            ,br.INCLUDE_ADJ_PERIODS_FLAG
            ,br.BALANCE_TYPE ACTUAL_FLAG
            ,br.AMOUNT_TYPE
            ,br.BALANCE_SELECTION
            ,br.BALANCE_METHOD_CODE
            ,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
            ,br.CURRENCY_CODE
            ,br.BAL_SEG_VALUE_OPTION_CODE
            ,brl.EXECUTION_MODE
            ,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
            ,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
            ,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
            ,CASE
              WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
              WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
              ELSE 'NONE'
            END LEDGER_GROUP_CODE
            ,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
            ,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
            ,br.LEDGER_GROUP
            ,(SELECT brd.DETAIL_CODE
              FROM AIF_BAL_RULE_DETAILS brd
              WHERE brd.RULE_ID = br.RULE_ID
              AND brd.DETAIL_TYPE = 'LEDGER'       
            ) PS_LEDGER
            ,CASE lg.LEDGER_TEMPLATE
              WHEN 'COMMITMENT' THEN 'Y'
              ELSE 'N'
            END KK_FLAG
            ,p.LAST_UPDATED_BY
            ,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
            ,p.EPM_ORACLE_INSTANCE
            FROM AIF_PROCESSES p
            INNER JOIN AIF_BAL_RULE_LOADS brl
              ON brl.LOADID = p.PROCESS_ID
            INNER JOIN AIF_BALANCE_RULES br
              ON br.RULE_ID = brl.RULE_ID
            INNER JOIN AIF_SOURCE_SYSTEMS ss
              ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
            INNER JOIN AIF_TARGET_APPLICATIONS app
              ON app.APPLICATION_ID = brl.APPLICATION_ID
            INNER JOIN TPOVPARTITION part
              ON part.PARTITIONKEY = br.PARTITIONKEY
            INNER JOIN TBHVIMPGROUP imp
              ON imp.IMPGROUPKEY = part.PARTIMPGROUP
            LEFT OUTER JOIN AIF_COA_LEDGERS l
              ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
              AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
            LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
              ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
              AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
              AND scr.RECNAME = 'LED_GRP_TBL'
            LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
              ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
              AND lg.SETID = scr.SETID
              AND lg.LEDGER_GROUP = br.LEDGER_GROUP
            WHERE p.PROCESS_ID = 22
    2013-11-08 11:49:14,406 DEBUG [AIF]:
          SELECT adim.BALANCE_COLUMN_NAME DIMNAME
          ,adim.DIMENSION_ID
          ,dim.TARGET_DIMENSION_CLASS_NAME
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
          ) COA_SEGMENT_NAME1
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
          ) COA_SEGMENT_NAME2
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
          ) COA_SEGMENT_NAME3
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
          ) COA_SEGMENT_NAME4
          ,(SELECT COA_SEGMENT_NAME
            FROM AIF_COA_SEGMENTS cs
            WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
          ) COA_SEGMENT_NAME5
          ,(SELECT CASE mdd.ORPHAN_OPTION_CODE
              WHEN 'CHILD' THEN 'N'
              WHEN 'ROOT' THEN 'N'
              ELSE 'Y'
            END DIMENSION_FILTER_FLAG
            FROM AIF_MAP_DIM_DETAILS_V mdd
            ,AIF_MAPPING_RULES mr
            WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
            AND mdd.RULE_ID = mr.RULE_ID
            AND mdd.DIMENSION_ID = adim.DIMENSION_ID
          ) DIMENSION_FILTER_FLAG
          ,tiie.IMPCONCATCHAR
          FROM TPOVPARTITION tpp
          INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
            ON adim.APPLICATION_ID = 2
          INNER JOIN AIF_DIMENSIONS dim
            ON dim.DIMENSION_ID = adim.DIMENSION_ID
          LEFT OUTER JOIN TBHVIMPITEMERPI tiie
            ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
            AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
            AND tiie.IMPMAPTYPE = 'ERP'
          WHERE tpp.PARTITIONKEY = 1
          AND adim.BALANCE_COLUMN_NAME IS NOT NULL
          ORDER BY adim.BALANCE_COLUMN_NAME
    2013-11-08 11:49:14,407 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
    2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getRuleInfo - END
    2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getPovList - START
    2013-11-08 11:49:14,407 DEBUG [AIF]:
            SELECT PARTITIONKEY
            ,PARTNAME
            ,CATKEY
            ,CATNAME
            ,PERIODKEY
            ,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
            ,RULE_ID
            ,RULE_NAME
            FROM (
              SELECT DISTINCT brl.PARTITIONKEY
              ,part.PARTNAME
              ,brl.CATKEY
              ,cat.CATNAME
              ,pprd.PERIODKEY
              ,pp.PERIODDESC
              ,brl.RULE_ID
              ,br.RULE_NAME
              FROM AIF_BAL_RULE_LOADS brl
              INNER JOIN AIF_BALANCE_RULES br
                ON br.RULE_ID = brl.RULE_ID
              INNER JOIN TPOVPARTITION part
                ON part.PARTITIONKEY = brl.PARTITIONKEY
              INNER JOIN TPOVCATEGORY cat
                ON cat.CATKEY = brl.CATKEY
              INNER JOIN AIF_PROCESS_PERIODS pprd
                ON pprd.PROCESS_ID = brl.LOADID
              LEFT OUTER JOIN TPOVPERIOD pp
                ON pp.PERIODKEY = pprd.PERIODKEY
              WHERE brl.LOADID = 22
            ) q
            ORDER BY PARTITIONKEY
            ,CATKEY
            ,PERIODKEY
            ,RULE_ID
    2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.getPovList - END
    2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.updateWorkflow - START
    2013-11-08 11:49:14,409 DEBUG [AIF]:
        SELECT tlp.PROCESSSTATUS
        ,tlps.PROCESSSTATUSDESC
        ,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
        FROM TLOGPROCESS tlp
        ,TLOGPROCESSSTATES tlps
        WHERE tlp.PARTITIONKEY = 1
        AND tlp.CATKEY = 1
        AND tlp.PERIODKEY = '2012-01-01'
        AND tlp.RULE_ID = 1
        AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
    2013-11-08 11:49:14,410 DEBUG [AIF]:
            UPDATE TLOGPROCESS
            SET PROCESSENDTIME = CURRENT_TIMESTAMP
            ,PROCESSSTATUS = 30
              ,PROCESSENTLOAD = 0
              ,PROCESSENTVAL = 0
              ,PROCESSENTLOADNOTE = NULL
              ,PROCESSENTVALNOTE = NULL
            WHERE PARTITIONKEY = 1
            AND CATKEY = 1
            AND PERIODKEY = '2012-01-01'
            AND RULE_ID = 1
    2013-11-08 11:49:14,411 DEBUG [AIF]: CommData.updateWorkflow - END
    2013-11-08 11:49:14,412 DEBUG [AIF]:
        SELECT COALESCE(usr.PROFILE_OPTION_VALUE, app.PROFILE_OPTION_VALUE, site.PROFILE_OPTION_VALUE) PROFILE_OPTION_VALUE
        FROM AIF_PROFILE_OPTIONS po
        LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES site
          ON site.PROFILE_OPTION_NAME = po.PROFILE_OPTION_NAME
          AND site.LEVEL_ID = 1000
          AND site.LEVEL_VALUE = 0
          AND site.LEVEL_ID <= po.MAX_LEVEL_ID
        LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES app
          ON app.PROFILE_OPTION_NAME = site.PROFILE_OPTION_NAME
          AND app.LEVEL_ID = 1005
          AND app.LEVEL_VALUE = NULL
          AND app.LEVEL_ID <= po.MAX_LEVEL_ID
        LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES usr
          ON usr.PROFILE_OPTION_NAME = usr.PROFILE_OPTION_NAME
          AND usr.LEVEL_ID = 1010
          AND usr.LEVEL_VALUE = NULL
          AND usr.LEVEL_ID <= po.MAX_LEVEL_ID
        WHERE po.PROFILE_OPTION_NAME = 'JAVA_HOME'
    2013-11-08 11:49:14,413 DEBUG [AIF]: HFM Load command:
    %EPM_ORACLE_HOME%/products/FinancialDataQuality/bin/HFM_LOAD.vbs "22" "a9E3uvSJNhkFhEQTmuUFFUElfdQgKJKHrb1EsjRsL6yZJlXsOFcVPbGWHhpOQzl9zvHoo3s%2Bdq6R4yJhp0GMNWIKMTcizp3%2F8HASkA7rVufXDWEpAKAK%2BvcFmj4zLTI3rtrKHlVEYrOLMY453J2lXk6Cy771mNSD8X114CqaWSdUKGbKTRGNpgE3BfRGlEd1wZ3cra4ee0jUbT2aTaiqSN26oVe6dyxM3zolc%2BOPkjiDNk1MqwNr43tT3JsZz4qEQGF9d39DRN3CDjUuZRPt4SEKSSL35upncRJiw2uBOtV%2FvSuGLNpZ2J79v1%2Ba1Oo9c4Xhig7SFYbE6Jwk1yXRJLTSw0DKFu%2FEpcdjpOnx%2F6YawMBNIa5iu5L637S91jT1Xd3EGmxZFq%2Bi6bHdCJAC8g%3D%3D" "C%3A%5COracle%5CMiddleware%5Cuser_projects%5Cepmsystem1" "%25EPM_ORACLE_HOME%25%2F..%2Fjdk160_35"
    Is there anywhere we need to set the EPM home (EPM_ORACLE_HOME)? Also, could someone let me know what PSU1 is?
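    The load command above resolves %EPM_ORACLE_HOME% from the environment, and the query just before it looks up JAVA_HOME. A minimal Python sketch to verify those variables before digging further (the variable names are taken from the log above; everything else is only an illustration):

        import os

        # Variables referenced by the HFM load command and the JAVA_HOME
        # profile query in the log above.
        for var in ("EPM_ORACLE_HOME", "EPM_ORACLE_INSTANCE", "JAVA_HOME"):
            value = os.environ.get(var)
            if value is None:
                print(var + " is NOT set")
            elif not os.path.isdir(value):
                print(var + " = " + value + " (path does not exist)")
            else:
                print(var + " = " + value)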
    Thanks in advance.
    Praneeth

  • How to load data from a virtual cube with services

    Hello all,
    we have set up a virtual cube with services and created a BEx report to get the data from an external database. That works fine. The question now is:
    Is it somehow possible to "load" the data from this virtual cube with services (I know that no data is actually stored there...) into another InfoCube?
    If that is possible, can you please give me some guidance on how to set up this scenario?
    Thanks in advance
    Jürgen

    Hi,
    I don't have a system in front of me, so try this; I know it works for a RemoteCube.
    Right-click on the cube and select Generate Export DataSource.
    If that succeeds, go to the Source Systems tab and select the BW system itself. There, right-click and choose Replicate DataSources.
    Next, go to InfoSources and click Refresh. Take the name of the virtual cube, prefix it with 8, and search for that InfoSource (for example, a virtual cube named ZVC_SALES would appear as 8ZVC_SALES; the cube name is only an illustration).
    If you can see it, you can load data from this cube to any target you want, just as you would from an ODS.
    If that does not work, try to create an InfoSpoke on the virtual cube (transaction RSBO). An InfoSpoke can extract to a database table, and from that table you can create a DataSource and load onwards.
    Failing that, create a query, save the result as a CSV file, and load it wherever you want; this is the most cumbersome option (a sketch follows below).
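    As a rough illustration of that CSV route in Python (the file name "query_export.csv" and the column handling are assumptions, not part of the original steps):

        import csv

        # Load a BEx query result that was saved as a CSV file.
        # "query_export.csv" and its column layout are illustrative only.
        with open("query_export.csv", newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))

        # From here the rows can be pushed into any target, e.g. a staging table.
        for row in rows[:5]:
            print(row)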
    Good luck
    Ram Chamarthy

Maybe you are looking for

  • Power View error: Cannot create a connection to data source 'EntityDataSource'

    I have Power View on SharePoint 2013, with both a Multidimensional and a Tabular model in use, and several Report.drlx files in a report library in SharePoint. My colleague is getting the following error when building a report by dragging facts and dimensions

  • My iPad 3 froze and now it is not turning back on. What should I do?

    My iPad 3 froze and now it is not turning back on. What should I do?

  • To restrict the change of quantity of component

    Hi, what is the most effective way to restrict changes to the quantity of a reserved material after the MO is released, considering that the creation, release and processing of the MO are done by separate users? My requirement is that after the release of the maintena

  • Problem with Oracle Workflow Builder 2.6.3.5

    Hello, I have installed Oracle Workflow server 2.6.3 on an Oracle 10g (10.2.0.1) database on Red Hat Enterprise Linux 4, and I have Oracle Workflow Builder 2.6.3.5 on Microsoft Windows XP Service Pack 2. I get the following error when I try to save or open and it

  • Re: BI Server very slow

    Hi all, currently the BI server is very slow; even accessing transaction codes takes a long time. Can anybody tell me the transaction codes to monitor server utilisation, and how to improve the performance of the BI system? Regards, Anand