DTP full load failure?

Hi all,
we have enhanced the 0MATERIAL_ATTR DataSource by adding 4 new fields.
We ran the full InfoPackage and it updated successfully into the PSA.
But when I run the delta DTP, it fails and goes to a short dump, saying that no more storage space is available for extending an internal table.
Any help on this?
Thanks

This is related to a tablespace issue.
Ask your Basis team to check whether you have sufficient free space in the PSAPTEMP tablespace and the other respective tablespaces.
You can get more details about tablespaces in transaction DB02.
Or
decrease the packet size and split the selection before running the DTP, or ask Basis to increase the space.
Can you post the exact dump, so that we can give you exact details if the above info does not help?
Regards
KP

Similar Messages

  • What is the difference between a full load and a delta load in DTP

    Hi,
    I am trying to load data into a cube from another cube using a DTP.
    There are 2 DTPs:
    1: DTP with full load
    2: DTP with delta load
    What is the difference between these two in DTP?
    Can somebody please help me?

    1: DTP with full load - will update all the requests in the PSA/source to the target.
    2: DTP with delta load - will update only new requests to the data target.
    The system doesn't distinguish new records on the basis of changed records, but rather by request. That's the reason you have the data mart status, to indicate whether a request has been loaded to further data targets.

  • Is there any repair full load option in DTP?

    Hi Experts,
    I have run a delta DTP from Cube A to Cube B with a selection of less than or equal to 31.03.2004, and now I want to run it for greater than 31.03.2004. The problem is that this load is not bringing the required records, because the data mart symbol is already set in Cube A.
    How do I extract data greater than 01.04.2009?
    Is there any repair full load option in DTP?

    Hi,
    I am afraid we have only the full and delta options available in DTP, but several other options available on the extraction tab will enable us to do something similar.
    If you are looking to get data after 31.03.2004, change the from date in your selection condition to 01.04.2004 and try extracting; I am sure it will give you the desired records.
    Please follow the info at this link and decide which option suits your requirement; if you get stuck, post back and I'll try to assist you.
    http://help.sap.com/saphelp_nw70/helpdata/en/42/fa50e40f501a77e10000000a422035/frameset.htm
    Regards,
    Ray
    Edited by: Rayapa on Apr 4, 2009 9:04 AM

  • DTP Delta and Full Load...

    Dear All,
    I'm working with SAP BW 7.3 and I have a standard data flow, starting with a DataSource, a DSO and an InfoCube. My process chain faced an error, and for the last week or so I could not fetch deltas. To remove the error I deleted all records from the InfoCube, since I had all requests available in the DSO. Then I manually ran a full load from the DSO to the InfoCube. But the next day, when my process chain executed, it brought all previous and new records into the InfoCube again; the delta was properly fetched into the DSO, but it did not come into the InfoCube the way it did before. When I checked the active and change log tables of the DSO, the two had the same number of records.
    1. What could be the reason that the delta DTP between the DSO and the InfoCube is not fetching only delta records? It was fetching delta records before the full load.
    2. Have I made a mistake or missed a setting in the DTP while executing the full load DTP between the DSO and the InfoCube?
    3. What are these options for in the Extraction tab of the DTP, and when do we use the following options and why:
        i.   Active Table (with Archive)
        ii.  Active Table (Without Archive)
        iii. Archive (Full Extraction Only)
        iv.  Change Log
    I will appreciate your reply.
    Many thanks!
    Tariq Ashraf

    Hi Tariq,
    Please check the below:
    Delta Init. Extraction from:
    Active Table (with Archive)
    The data is read from the DSO active table and from the archived data.
    Active Table (Without Archive)
    The data is only read from the active table of the DSO. If there is data in the archive or in near-line storage at the time of extraction, this data is not extracted.
    Archive (Full Extraction Only)
    The data is only read from the archive data store. Data is not extracted from the active table.
    Change Log
    The data is read from the change log and not from the active table of the DSO.
    Hope this answers your question. Let me know if anything else is required.
    Regards
    Ramesh V

  • DTP Full/Delta loads

    I am loading data from one DSO to another. The change log of the source DSO has been deleted for any request older than 20 days.
    I want to do a full load from the active table and then start the delta loads from the change log. But the problem is that the load is picking up all the previous deltas from the change log again, instead of picking up deltas after the full load.
    Is there a way to do an "init without data transfer", like the option we had before, which resets the deltas so that only the deltas after the full load are picked up?
    Thanks. Points will be awarded.

    Hello Sachin,
    In BI 7, there is no longer the notion of delta initialization. Just run a delta from your DSO.
    Load the full data into your DSO and launch the delta; subsequent delta runs will then populate your DSO and bring the corresponding deltas from it.
    Hope it helps,

  • Full load from a DSO to a cube processes fewer records than are available in the DSO

    We have a scenario where every Sunday I have to run a full load from a DSO with on-hand stock information to a cube, in which I register a counter on material and store level if there is stock available.
    The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
    The key in the DSO is:
    0MATERIAL
    0PLANT
    0STOCKTYPE
    0STOR_LOC
    0BOM
    of which only 0MATERIAL, 0PLANT and 0STOR_LOC are later used in the transformation.
    As we had a growing number of records, we decided to delete in the START routine all records where the inventory is not greater than zero, thus eliminating zero and negative inventory records (a SQL sketch of this filter appears at the end of this thread).
    Now comes the funny part of the story:
    Prior to these changes I would [in a test system, just copied from PROD] read some 33 million records and write out the same amount. Of course, after the change we expected to write out less. To my total surprise, I was now reading 45 million records with the same unchanged DTP, and writing out the expected smaller number.
    When checking the number of records in the DSO I found the 45 million, but I cannot explain why the loads before only retrieved some 33 million from the same unchanged set of records.
    When checking in PROD - same result: we have some 45 million records in the DSO, but when we do the full load from the DSO to the cube, the DTP only processes some 33 million.
    What am I missing - is there a compression going on? Why would the number of records in a DSO differ from the number of records processed in the DataPackages when I am making a FULL load without any filter restrictions and only a semantic grouping in place for part of the DSO key?
    Any idea or thought is appreciated.

    Thanks Gaurav.
    I did check whether there were any loads done in between; there were none in the test system. As I mentioned, it was a new copy from PROD to TEST, and I compared the number of entries in the DSO: that seems to be a match between TEST and PROD (ok, a few more in PROD, but they can be accounted for). In TEST I loaded the day before the changes were imported to have a comparison, and between that load and the one after the changes were imported, nothing in the DSO was changed.
    Both DTPs in TEST and PW2 load from the activated DSO [without archive]. The DTPs were not changed in quite a while, so I ruled that one out. Same with activation of data in the DSO: this DSO gets loaded and activated in PROD daily via process chain, and we load daily deltas into the cube in question. Only on Sundays, for the beginning of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million, but the difference between the number of records in the DSO and the amount processed in the DataPackages is more than 10 million per full load, even in PROD.
    I really appreciated the knowledgeable answer, I just wish you had pointed out something that I missed.
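    A side note on the START routine change described in the question above: the routine itself is ABAP inside the transformation, but its effect, written as SQL purely for illustration (source_package and on_hand_qty are made-up names), is just:

        -- Drop zero and negative inventory rows before the cube update.
        DELETE FROM source_package
        WHERE on_hand_qty <= 0

    A filter like this runs after extraction, so it reduces only the records written, never the records read; it cannot account for the jump from 33 to 45 million records read, which supports the suspicion that something on the read side (request handling or compression) changed.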

  • Task fails while running Full load ETL

    Hi All,
    I am running a full load ETL for Oracle R12 (vanilla instance) HR, but 4 tasks are failing: SDE_ORA_JobDimention, SDE_ORA_HRPositionDimention, SDE_ORA_CodeDimension_Pay_level and SDE_ORA_CodeDimensionJob. I changed the parameters for all these tasks as mentioned in the installation guide and rebuilt them. Please help me out.
    The log looks like this for SDE_ORA_JobDimention:
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBCODE_FLXFLD_SEGMENT_COL].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBFAMILYCODE_FLXFLD_SEGMENT_COL].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[$$TENANT_ID].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_JobDimension_Full] at [Fri Sep 26 10:52:05 2008]
    DIRECTOR> TM_6683 Repository Name: [Oracle_BI_DW_Base]
    DIRECTOR> TM_6684 Server Name: [Oracle_BI_DW_Base_Integration_Service]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR12_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_JobDimension_Full]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_JobDimension [version 1]
    DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.1.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_JobDimension_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6703 Session [SDE_ORA_JobDimension_Full] is run by 32-bit Integration Service [node01_HSCHBSCGN20031], version [8.1.1], build [0831].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 Session Sort Order: [Binary]
    MAPPING> TM_6156 Using LOW precision decimal arithmetic
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM Error Log Disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_JobDimension_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Fri Sep 26 10:52:13 2008)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Fri Sep 26 10:52:14 2008)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 1280000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [dev], user [apps]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [orcl], user [obia], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_JOB_DS :SQL INSERT statement:
    INSERT INTO W_JOB_DS(JOB_CODE,JOB_NAME,JOB_DESC,JOB_FAMILY_CODE,JOB_FAMILY_NAME,JOB_FAMILY_DESC,JOB_LEVEL,W_FLSA_STAT_CODE,W_FLSA_STAT_DESC,W_EEO_JOB_CAT_CODE,W_EEO_JOB_CAT_DESC,AAP_JOB_CAT_CODE,AAP_JOB_CAT_NAME,ACTIVE_FLG,CREATED_BY_ID,CHANGED_BY_ID,CREATED_ON_DT,CHANGED_ON_DT,AUX1_CHANGED_ON_DT,AUX2_CHANGED_ON_DT,AUX3_CHANGED_ON_DT,AUX4_CHANGED_ON_DT,SRC_EFF_FROM_DT,SRC_EFF_TO_DT,DELETE_FLG,DATASOURCE_NUM_ID,INTEGRATION_ID,TENANT_ID,X_CUSTOM) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_JOB_DS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    WRITER_1_*_1> WRT_8005 Writer run started.
    READER_1_1_1> BLKR_16007 Reader run started.
    READER_1_1_1> RR_4029 SQ Instance [mplt_BC_ORA_JobDimension.Sq_Jobs] User specified SQL Query [SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT,      PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS.  AS JOB_CODE,
      '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID]
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Fri Sep 26 10:53:05 2008
    Target tables:
    W_JOB_DS
    READER_1_1_1> RR_4049 SQL Query issued to database : (Fri Sep 26 10:53:05 2008)
    READER_1_1_1> CMN_1761 Timestamp Event: [Fri Sep 26 10:53:06 2008]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-01747: invalid user.table.column, table.column, or column specification
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT, PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE,
    '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT, PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE,
    '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Fri Sep 26 10:53:06 2008]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_JOB_DS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Fri Sep 26 10:53:06 2008
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_JOB_DS (Instance Name: [W_JOB_DS])
    WRT_8044 No data loaded for this target
    WRITER_1_*_1> WRT_8043 *****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [mplt_BC_ORA_JobDimension.Sq_Jobs] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [mplt_BC_ORA_JobDimension.Sq_Jobs] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_JOB_DS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Fri Sep 26 10:53:06 2008)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Fri Sep 26 10:53:06 2008)
    MAPPING> TM_6018 Session [SDE_ORA_JobDimension_Full] run completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [Sq_Jobs] (Instance Name: [mplt_BC_ORA_JobDimension.Sq_Jobs])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_JOB_DS] (Instance Name: [W_JOB_DS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_JobDimension_Full] completed at [Fri Sep 26 10:53:07 2008]

    To make use of the warehouse you would probably want to connect to an EBS instance in order to populate it, since the execution plan you intend to run is designed for the EBS data model. I guess if you really didn't want to connect to the EBS instance to pull data, you could build one using the Universal Adapter. This allows you to load from flat files if you wish, but I wouldn't recommend making this a habit for an actual implementation, as it creates another potential point of failure (populating the flat files).
    Thanks,
    Austin
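    For what it's worth, the ORA-01747 above lines up with the two mapping parameters that resolved to empty values (see the VAR_27027 lines "Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBCODE_FLXFLD_SEGMENT_COL]" and the JOBFAMILYCODE equivalent). The empty substitution leaves the dangling "PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE" in the generated SELECT, which is not valid SQL. A minimal sketch of how that fragment should render once the parameters are set; ATTRIBUTE1 and SEGMENT1 are placeholders, substitute whichever flexfield segment columns your configuration actually maps:

        -- Sketch only: ATTRIBUTE1/SEGMENT1 stand in for the real columns
        -- behind $$JOBFAMILYCODE_FLXFLD_SEGMENT_COL and
        -- $$JOBCODE_FLXFLD_SEGMENT_COL.
        SELECT
          PER_JOBS.JOB_ID,
          PER_JOBS.ATTRIBUTE1          AS JOB_FAMILY_CODE,
          PER_JOB_DEFINITIONS.SEGMENT1 AS JOB_CODE,
          '0' AS X_CUSTOM
        FROM PER_JOBS, PER_JOB_DEFINITIONS
        WHERE PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID

    Once both parameters contain real column names, the prepare step should no longer raise ORA-01747.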

  • Regarding Financial Analytics Full Load in DAC

    Hi All,
    Today I started a full load for Financial Analytics for the subject areas "Revenue, Receivables, Payables, General Ledger, Cost of Goods Sold". The issue is that the two tasks "SDE_ORA_GLBlanceFact_Full" and "SDE_ORA_GLJournals_Full" failed due to a database driver error (unable to execute the query in the Source Qualifier). Because of these two tasks, the remaining tasks went into "Stopped" status in DAC. Out of 400 tasks, only 56 were successful, 342 are in stopped status, and 2 failed. Can anyone please tell me how to resolve the error and start the remaining tasks? Please guide me in case I missed any steps during configuration.
    Regards
    Sundar

    SDE_ORA_GLBalanceFact:
    I changed the NUMBER data type precision and scale from (22,7) to (28,10). Even so, I am getting the same error. Please find the Informatica log file below:
    Severity     Timestamp     Node     Thread     Message Code     Message
    ERROR     10/4/2011 6:04:04 PM     node01_WIN-O0IX1SFES7T     READER_1_1_1     RR_4035     SQL Error [
    ORA-00936: missing expression
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    BAL.LEDGER_ID,
    BAL.CODE_COMBINATION_ID,
    BAL.CURRENCY_CODE,
    LED.CURRENCY_CODE,
    PER.PERIOD_NAME,
    BAL.ACTUAL_FLAG,
    BAL.TRANSLATED_FLAG,
    BAL.TEMPLATE_ID,
    BAL.PERIOD_NET_DR,
    BAL.PERIOD_NET_CR,
    ( BAL.BEGIN_BALANCE_DR + BAL.PERIOD_NET_DR ),
    ( BAL.BEGIN_BALANCE_CR + BAL.PERIOD_NET_CR) ,
    BAL.PERIOD_NET_DR_BEQ,
    BAL.PERIOD_NET_CR_BEQ,
    ( BAL.BEGIN_BALANCE_DR_BEQ + BAL.PERIOD_NET_DR_BEQ ) PERIOD_END_BALANCE_DR_BEQ,
    ( BAL.BEGIN_BALANCE_CR_BEQ + BAL.PERIOD_NET_CR_BEQ ) PERIOD_END_BALANCE_CR_BEQ,
    PER.START_DATE,
    PER.END_DATE,
    BAL.LAST_UPDATE_DATE AS LAST_UPDATE_DATE_BAL,
    BAL.LAST_UPDATED_BY AS LAST_UPDATED_BY_BAL,
    PER.LAST_UPDATE_DATE AS LAST_UPDATE_DATE_PERIODS,
    PER.LAST_UPDATED_BY AS LAST_UPDATED_BY_PERIODS,
    LED.LAST_UPDATE_DATE AS LAST_UPDATE_DATE_SOB,
    LED.LAST_UPDATED_BY AS LAST_UPDATED_BY_SOB,
    BAL.BUDGET_VERSION_ID AS BUDGET_VERSION_ID,
    PER.ADJUSTMENT_PERIOD_FLAG AS ADJUSTMENT_PERIOD_FLAG,
    CASE WHEN
    BAL.TRANSLATED_FLAG = 'Y' THEN 'TRANSLATED'
    WHEN
    BAL.TRANSLATED_FLAG = 'R' THEN 'ENTERED_FOREIGN'
    WHEN
    BAL.CURRENCY_CODE = 'STAT' THEN 'STAT'
    WHEN
    ((BAL.PERIOD_NET_DR_BEQ = 0) OR (BAL.PERIOD_NET_DR_BEQ IS NULL)) AND
    ((BAL.PERIOD_NET_CR_BEQ = 0) OR (BAL.PERIOD_NET_CR_BEQ IS NULL)) AND
    ((BAL.BEGIN_BALANCE_DR_BEQ = 0) OR (BAL.BEGIN_BALANCE_DR_BEQ IS NULL)) AND
    ((BAL.BEGIN_BALANCE_CR_BEQ = 0) OR (BAL.BEGIN_BALANCE_CR_BEQ IS NULL))
    THEN 'BASE'
    ELSE 'ENTERED_LEDGER'
    END CURRENCY_BALANCE_TYPE
    FROM
    GL_BALANCES BAL
    , GL_LEDGERS LED
    , GL_PERIODS PER
    WHERE LED.LEDGER_ID = BAL.LEDGER_ID
    AND PER.PERIOD_SET_NAME = LED.PERIOD_SET_NAME
    AND BAL.PERIOD_NAME = PER.PERIOD_NAME
    AND BAL.PERIOD_TYPE = PER.PERIOD_TYPE
    AND NVL(BAL.TRANSLATED_FLAG, 'X') IN ('Y', 'X', 'R')
    AND BAL.ACTUAL_FLAG IN ( 'A','B')
    AND BAL.TEMPLATE_ID IS NULL
    AND
    BAL.LAST_UPDATE_DATE >=
    TO_DATE('', 'MM/DD/YYYY HH24:MI:SS')
    OR PER.LAST_UPDATE_DATE >=
    TO_DATE('', 'MM/DD/YYYY HH24:MI:SS')
    AND DECODE(, 'Y', LED.LEDGER_ID, 1) IN ()
    AND DECODE(, 'Y', LED.LEDGER_CATEGORY_CODE, 'NONE') IN ()
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    BAL.LEDGER_ID,
    BAL.CODE_COMBINATION_ID,
    BAL.CURRENCY_CODE,
    LED.CURRENCY_CODE,
    PER.PERIOD_NAME,
    BAL.ACTUAL_FLAG,
    BAL.TRANSLATED_FLAG,
    BAL.TEMPLATE_ID,
    BAL.PERIOD_NET_DR,
    BAL.PERIOD_NET_CR,
    ( BAL.BEGIN_BALANCE_DR + BAL.PERIOD_NET_DR ),
    ( BAL.BEGIN_BALANCE_CR + BAL.PERIOD_NET_CR) ,
    BAL.PERIOD_NET_DR_BEQ,
    BAL.PERIOD_NET_CR_BEQ,
    ( BAL.BEGIN_BALANCE_DR_BEQ + BAL.PERIOD_NET_DR_BEQ ) PERIOD_END_BALANCE_DR_BEQ,
    ( BAL.BEGIN_BALANCE_CR_BEQ + BAL.PERIOD_NET_CR_BEQ ) PERIOD_END_BALANCE_CR_BEQ,
    PER.START_DATE,
    PER.END_DATE,
    BAL.LAST_UPDATE_DATE AS LAST_UPDATE_DATE_BAL,
    BAL.LAST_UPDATED_BY AS LAST_UPDATED_BY_BAL,
    PER.LAST_UPDATE_DATE AS LAST_UPDATE_DATE_PERIODS,
    PER.LAST_UPDATED_BY AS LAST_UPDATED_BY_PERIODS,
    LED.LAST_UPDATE_DATE AS LAST_UPDATE_DATE_SOB,
    LED.LAST_UPDATED_BY AS LAST_UPDATED_BY_SOB,
    BAL.BUDGET_VERSION_ID AS BUDGET_VERSION_ID,
    PER.ADJUSTMENT_PERIOD_FLAG AS ADJUSTMENT_PERIOD_FLAG,
    CASE WHEN
    BAL.TRANSLATED_FLAG = 'Y' THEN 'TRANSLATED'
    WHEN
    BAL.TRANSLATED_FLAG = 'R' THEN 'ENTERED_FOREIGN'
    WHEN
    BAL.CURRENCY_CODE = 'STAT' THEN 'STAT'
    WHEN
    ((BAL.PERIOD_NET_DR_BEQ = 0) OR (BAL.PERIOD_NET_DR_BEQ IS NULL)) AND
    ((BAL.PERIOD_NET_CR_BEQ = 0) OR (BAL.PERIOD_NET_CR_BEQ IS NULL)) AND
    ((BAL.BEGIN_BALANCE_DR_BEQ = 0) OR (BAL.BEGIN_BALANCE_DR_BEQ IS NULL)) AND
    ((BAL.BEGIN_BALANCE_CR_BEQ = 0) OR (BAL.BEGIN_BALANCE_CR_BEQ IS NULL))
    THEN 'BASE'
    ELSE 'ENTERED_LEDGER'
    END CURRENCY_BALANCE_TYPE
    FROM
    GL_BALANCES BAL
    , GL_LEDGERS LED
    , GL_PERIODS PER
    WHERE LED.LEDGER_ID = BAL.LEDGER_ID
    AND PER.PERIOD_SET_NAME = LED.PERIOD_SET_NAME
    AND BAL.PERIOD_NAME = PER.PERIOD_NAME
    AND BAL.PERIOD_TYPE = PER.PERIOD_TYPE
    AND NVL(BAL.TRANSLATED_FLAG, 'X') IN ('Y', 'X', 'R')
    AND BAL.ACTUAL_FLAG IN ( 'A','B')
    AND BAL.TEMPLATE_ID IS NULL
    AND
    BAL.LAST_UPDATE_DATE >=
    TO_DATE('', 'MM/DD/YYYY HH24:MI:SS')
    OR PER.LAST_UPDATE_DATE >=
    TO_DATE('', 'MM/DD/YYYY HH24:MI:SS')
    AND DECODE(, 'Y', LED.LEDGER_ID, 1) IN ()
    AND DECODE(, 'Y', LED.LEDGER_CATEGORY_CODE, 'NONE') IN ()
    Oracle Fatal Error].
    ERROR     10/4/2011 6:04:04 PM     node01_WIN-O0IX1SFES7T     READER_1_1_1     BLKR_16004     ERROR: Prepare failed.
    INFO     10/4/2011 6:04:04 PM     node01_WIN-O0IX1SFES7T     WRITER_1_*_1     WRT_8333     Rolling back all the targets due to fatal session error.
    INFO     10/4/2011 6:04:04 PM     node01_WIN-O0IX1SFES7T     WRITER_1_*_1     WRT_8325     Final rollback executed for the target [W_ACCT_BUDGET_FS, W_GL_BALANCE_FS] at end of load
    SDE_ORA_GLJournals Task Log File
    Severity     Timestamp     Node     Thread     Message Code     Message
    ERROR     10/4/2011 6:04:05 PM     node01_WIN-O0IX1SFES7T     READER_1_1_1     RR_4035     SQL Error [
    ORA-00936: missing expression
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    JEL.JE_HEADER_ID,
    JEL.JE_LINE_NUM,
    JEL.LAST_UPDATE_DATE,
    JEL.LAST_UPDATED_BY,
    JEL.LEDGER_ID,
    JEL.CODE_COMBINATION_ID,
    JEL.PERIOD_NAME,
    JEL.STATUS,
    JEL.CREATION_DATE,
    JEL.CREATED_BY,
    JEL.ENTERED_DR,
    JEL.ENTERED_CR,
    JEL.ACCOUNTED_DR,
    JEL.ACCOUNTED_CR,
    JEL.REFERENCE_1,
    JEL.REFERENCE_2,
    JEL.REFERENCE_3,
    JEL.REFERENCE_4,
    JEL.REFERENCE_5,
    JEL.REFERENCE_6,
    JEL.REFERENCE_7,
    JEL.REFERENCE_8,
    JEL.REFERENCE_9,
    JEL.REFERENCE_10,
    JEL.GL_SL_LINK_ID,
    JEH.JE_CATEGORY,
    JEH.JE_SOURCE,
    JEH.NAME,
    JEH.CURRENCY_CODE,
    JEH.POSTED_DATE,
    JEB.NAME,
    PRDS.START_DATE,
    PRDS.END_DATE,
    GL.LEDGER_CATEGORY_CODE,
    PRDS.ADJUSTMENT_PERIOD_FLAG
    FROM
    GL_JE_LINES JEL,
    GL_JE_HEADERS JEH,
    GL_JE_BATCHES JEB,
    GL_PERIOD_STATUSES PRDS,
    GL_LEDGERS GL
    WHERE
    JEL.JE_HEADER_ID = JEH.JE_HEADER_ID
    AND JEH.ACTUAL_FLAG = 'A'
    AND JEB.STATUS = 'P'
    AND JEH.JE_BATCH_ID = JEB.JE_BATCH_ID (+)
    AND JEL.PERIOD_NAME = PRDS.PERIOD_NAME
    AND JEL.LEDGER_ID = PRDS.SET_OF_BOOKS_ID
    AND JEL.LEDGER_ID = GL.LEDGER_ID
    AND PRDS.APPLICATION_ID = 101
    AND JEH.CURRENCY_CODE<>'STAT'
    AND ( JEB.CREATION_DATE >=
    TO_DATE('01/01/1753 00:00:00', 'MM/DD/YYYY HH24:MI:SS')
    AND DECODE(, 'Y', GL.LEDGER_ID, 1) IN ()
    AND DECODE(, 'Y', GL.LEDGER_CATEGORY_CODE, 'NONE') IN ()
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    JEL.JE_HEADER_ID,
    JEL.JE_LINE_NUM,
    JEL.LAST_UPDATE_DATE,
    JEL.LAST_UPDATED_BY,
    JEL.LEDGER_ID,
    JEL.CODE_COMBINATION_ID,
    JEL.PERIOD_NAME,
    JEL.STATUS,
    JEL.CREATION_DATE,
    JEL.CREATED_BY,
    JEL.ENTERED_DR,
    JEL.ENTERED_CR,
    JEL.ACCOUNTED_DR,
    JEL.ACCOUNTED_CR,
    JEL.REFERENCE_1,
    JEL.REFERENCE_2,
    JEL.REFERENCE_3,
    JEL.REFERENCE_4,
    JEL.REFERENCE_5,
    JEL.REFERENCE_6,
    JEL.REFERENCE_7,
    JEL.REFERENCE_8,
    JEL.REFERENCE_9,
    JEL.REFERENCE_10,
    JEL.GL_SL_LINK_ID,
    JEH.JE_CATEGORY,
    JEH.JE_SOURCE,
    JEH.NAME,
    JEH.CURRENCY_CODE,
    JEH.POSTED_DATE,
    JEB.NAME,
    PRDS.START_DATE,
    PRDS.END_DATE,
    GL.LEDGER_CATEGORY_CODE,
    PRDS.ADJUSTMENT_PERIOD_FLAG
    FROM
    GL_JE_LINES JEL,
    GL_JE_HEADERS JEH,
    GL_JE_BATCHES JEB,
    GL_PERIOD_STATUSES PRDS,
    GL_LEDGERS GL
    WHERE
    JEL.JE_HEADER_ID = JEH.JE_HEADER_ID
    AND JEH.ACTUAL_FLAG = 'A'
    AND JEB.STATUS = 'P'
    AND JEH.JE_BATCH_ID = JEB.JE_BATCH_ID (+)
    AND JEL.PERIOD_NAME = PRDS.PERIOD_NAME
    AND JEL.LEDGER_ID = PRDS.SET_OF_BOOKS_ID
    AND JEL.LEDGER_ID = GL.LEDGER_ID
    AND PRDS.APPLICATION_ID = 101
    AND JEH.CURRENCY_CODE<>'STAT'
    AND ( JEB.CREATION_DATE >=
    TO_DATE('01/01/1753 00:00:00', 'MM/DD/YYYY HH24:MI:SS')
    AND DECODE(, 'Y', GL.LEDGER_ID, 1) IN ()
    AND DECODE(, 'Y', GL.LEDGER_CATEGORY_CODE, 'NONE') IN ()
    Oracle Fatal Error].
    ERROR     10/4/2011 6:04:05 PM     node01_WIN-O0IX1SFES7T     READER_1_1_1     BLKR_16004     ERROR: Prepare failed.
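    The ORA-00936 in both logs above comes from placeholders that resolved to empty strings: TO_DATE('', ...) and the DECODE(, ...) IN () pairs. These slots are filled from DAC/Informatica parameters, the same mechanism the JobDimension log shows with $$LAST_EXTRACT_DATE; the ledger-filter parameter names ($$FILTER_BY_LEDGER_ID, $$LEDGER_ID_LIST and their category counterparts) are an assumption you should verify in your DAC configuration. A minimal sketch of how the tail of the balance query's WHERE clause should render once every slot resolves, assuming an initial load with ledger filtering disabled:

        -- Sketch only: the literal values shown are what an initial full
        -- load with ledger filtering switched off would typically substitute.
        AND ( BAL.LAST_UPDATE_DATE >=
              TO_DATE('01/01/1753 00:00:00', 'MM/DD/YYYY HH24:MI:SS')
           OR PER.LAST_UPDATE_DATE >=
              TO_DATE('01/01/1753 00:00:00', 'MM/DD/YYYY HH24:MI:SS') )
        AND DECODE('N', 'Y', LED.LEDGER_ID, 1) IN (1)
        AND DECODE('N', 'Y', LED.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')

    The journals query fails on the same two DECODE predicates. As long as any slot stays empty, Oracle sees an expression with a missing operand and rejects the statement at prepare time, before a single row is read.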

  • Process chain for a full load failed

    Hi,
    I have a process chain that loads data from an InfoCube to a DSO with the following steps:
    Begin
    Load data: execute an InfoPackage with full load.
    Update from PSA
    Activate Data: activate the data of the DSO.
    When I execute the process chain, the InfoPackage step succeeds, but the Update from PSA step fails. Looking at the errors, I can see that when the InfoPackage finishes, the data of the DSO needs to be activated, but the status of the request is yellow, and Update from PSA needs the QM status to be green. What can I do? The data transfers correctly to the DSO but is not active.
    Thank you very much.
    PS: Some steps and options may not be exactly as I wrote them, because I'm working in Spanish, but the meaning is the same.

    Hi,
    if Update from PSA has failed, go to the details tab in the InfoPackage monitor, expand the failed data packet and look at the error message; there may be several reasons. Try to solve that issue, then delete the request from the target without making the QM status red, and reconstruct the request.
    Actually, I am not following you clearly. I think your process chain is fine. Are you saying your ODS activation also failed? But how would the ODS activation start if the load itself failed? Is the link on success or on failure?
    Regards,
    Debjani
    Edited by: Debjani Mukherjee on Oct 16, 2008 2:02 PM

  • Unable to delete the last request in a full load InfoPackage

    Hi All,
    I have a full load InfoPackage with many requests in green status whose generated request ID is 0.
    Because of the last failed request, I am unable to activate the DSO.
    I set the last failed request to green and triggered again, but it was not successful.
    I set it to red and tried deleting it, with no success; when I delete it, it gives a dump.
    Now I am also unable to delete the first request, the one in green status; it is not deleting.
    Can I delete the whole data in the DSO and do a full load again? How do I check whether there are any other InfoPackages on this DSO?
    All these issues came up while triggering a process chain. Please let me know your answers.
    Thanks,
    Venkat

    Hi Venkatesh,
    I tried deleting through RSODSACTREQ, but the delete option was disabled in the table entry tab.
    The last request is a failed one in red status that was not transferred with the full records.
    All the other requests in green status do not have the symbol generated for reporting purposes, and the request ID generated upon activation is zero.
    I tried deleting the first request; when I change its status to red, I get:
    QM action on PSA Z[DSONAME]: checked to see if automatic activation of the M version should be started; the M version is then activated if necessary.
    Req. 0001439786 in DataStore Z[DSONAME] must have QM status green before it is activated.
    Request 0001427251 is not completely activated. Please activate it again.
    But I am unable to find either of these requests under the PSA or in the administration data target tab; there are some other requests with red status in the PSA.
    When I try to delete the failed request, it gives a dump.
    Thanks,
    Venkat.

  • Cube full load failed

    Hi Experts,
    When I am loading data from the ODS to the cube (full load), the load fails due to the below error:
    Short dump in the Warehouse
    Diagnosis
    The data update was not completed. A short dump has probably been logged in BW providing information about the error.
    System response
    "Caller 70" is missing.
    Further analysis:
    Search in the BW short dump overview for the short dump belonging to the request. Pay attention to the correct time and date on the selection screen.
    You get a short dump list using the Wizard or via the menu path "Environment -> Short dump -> In the Data Warehouse".
    Error correction:
    Follow the instructions in the short dump.
    and I checked in ST22; the error analysis says:
    An exception occurred. This exception is dealt with in more detail below. The exception, which is assigned to the class 'CX_SY_OPEN_SQL_DB', was neither caught nor passed along using a RAISING clause, in the procedure "WRITE_ICFA" "(FORM)".
    Since the caller of the procedure could not have expected this exception to occur, the running program was terminated.
    The reason for the exception is:
    The database system recognized that your last operation on the database would have led to a deadlock. Therefore, your transaction was rolled back to avoid this.
    ORACLE always terminates any transaction that would result in a deadlock. The other transactions involved in this potential deadlock are not affected by the termination.
    Please, can anyone tell me why the error occurred and how it can be resolved?
    Thanks in advance
    David

    David,
    It appears that there was a table lock when you executed your DTP. This means that there were multiple simultaneous reads on one of the tables used by the DTP, which resulted in the error. Check your process chains once.
    For now, delete the request in the InfoCube and reload!
    -VA

  • InfoSpoke Delta - Full loads

    Hello All,
    I have an InfoSpoke in PROD with delta update, sourcing from an ODS.
    We planned a full load with the selection option 06-31-2005 to 06-31-2007.
    We changed the update mode from delta to full, started the full load on 06-01-2007, and completed it successfully on 06-09-2007.
    We restored the delta once the data had been consumed from the /BIC/OHXXXX tables, on 06-22-2007 (the full load was huge).
    During this full load, the ODS kept being fed with deltas (the load to the ODS was not stopped).
    Users are complaining that delta data is missing for the period (06/01 - 06/09) in which the full load was performed.
    Can someone throw some light on why the delta data is missing only for that period? Is not putting the ODS load on hold the reason for this failure, which messed up the delta pointer?
    Full points assured.

    Hi,
    for enabling delta load, check the below link for step-by-step instructions:
    http://help.sap.com/saphelp_nw04s/helpdata/en/44/97433e99ee70dbe10000000a1553f6/frameset.htm
    It describes: 1. Delta Load Management Framework Overview
    2. Enabling Delta Load
    3. Initializing Delta Load

  • DTP to load master data gives the message 'No more data available'

    Hi,
    when I execute the DTP to load master data from the DSO, it executes and gives the message 'No more data available'. The request is green, but no data is transferred from the DSO to the master data.
    I want the DSO data to get into the master data; how do I do it?

    Hi Hardik,
    Since the request is green, there is no error in the extraction. But, as others have rightly pointed out, the update mode of the DTP must be Delta, and there must simply be no new master data of the kind you are looking for.
    Compare the data and check whether the delta DTP should bring anything at all, i.e. new records. If there are no new records, the delta DTP will not bring any more data. Otherwise, change the mode to Full and check again.
    Reg,
    Dhaval

  • POSDM Full Load

    Hi Gurus,
    I'm on BI 7.0 and I load the POSDM cubes with delta extraction; this is the only way I know to load data from POSDM.
    If I want to load old data from POSDM, what is the way to do it? If I delete my delta init, I can only load the new data...
    Another way to do a full load of POSDM data using a delta init could be to modify the time mark of the delta init. Is it possible to do this?
    Can anyone help me?
    Thanks!
    Edited by: Iecisa Iecisa on Sep 2, 2010 1:48 PM

    Thanks for the reply, Murali,
    but I need to find a way to make a full upload of this data.
    I am aware that it may be a very difficult or even impossible task, but being able to do a full load of historical data, since delta can't provide this, would give us a safety margin against potential system failures.
    If someone finds a way, it will be very helpful for me.
    Thanks a lot

  • Full load and eliminated records

    Hi gurus,
    I'm using the 0TR_LP_1 extractor to extract data into a BI7 system.
    This extractor only supports full load.
    The InfoPackage does not have any restriction, and the same goes for the DTP.
    When I run the extraction, the number of records transferred (105) and the number of records added (30) are not the same, but the description shows that this is a full load.
    On what basis does the system actually eliminate records? How can I investigate this further? I could not verify the records, since these are dummy records created on ECC.
    In the source system I ran RSA3 and the number of records is 105.
    Please help
    TQ

    Look at the transformation and check whether filtering is happening because of routines.
    Aggregation of data in the source package may result in a reduction of the total number of records.
    Omission of log records, etc., may also explain the change in the number of records.
    In R/3, RSA3 shows the whole change log of an order.
    E.g. an order created at 8 am that has undergone several changes will look like this in RSA3:
    Creation       10304   Pen   19 qty  29.12.2009 8 am
    modification 10304   Pen   18 qty  29.12.2009 9 am
    modification 10304   Pen   17 qty  29.12.2009 10 am
    modification 10304   Pen   15 qty  29.12.2009 11 am
    modification 10304   Pen   25 qty  29.12.2009 12 pm
    The same replicates in the PSA.
    In BW, the data transferred will be 5 and the data added will be 1:
    only the final version of the truth, i.e. "modification 10304   Pen   25 qty  29.12.2009 12 pm", will be loaded.
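    To picture the "transferred 105, added 30" effect outside of BW: the DSO's key-based overwrite behaves like keeping only the latest row per key. A minimal SQL sketch of the idea, using a hypothetical staging table psa_staging with columns order_no, item, qty and change_ts (names invented for illustration; this is not how BW activates data internally):

        -- Keep only the most recent version of each order: the five
        -- transferred rows for order 10304 collapse into one added row.
        SELECT order_no, item, qty, change_ts
        FROM (
          SELECT s.*,
                 ROW_NUMBER() OVER (PARTITION BY order_no
                                    ORDER BY change_ts DESC) AS rn
          FROM psa_staging s
        )
        WHERE rn = 1

    In the same way, 105 extracted records that share keys can collapse to 30 records in the target.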
