Error in Transaction Data - Full Load

Hello All,
    This is the current scenario that I am working on:
There is a process chain with two transaction data load (FULL LOAD) processes to the same cube. In the process monitor everything seems okay (the data loads look fine), but the overall status for both loads is failed due to 'Error in source system/extractor', and it says 'error in data selection'.
Processing is set to data targets only.
On doing a Manage on the cube, I found 3 old requests that were red but NOT set to QM status red. So I set them to QM status red and deleted them, and the difference I saw was that the subsequent requests became available for reporting.
Now this data load, which is a full load, takes forever. I also don't know why I do not see an Initialize Delta Update option there - can anyone tell me why I don't see it?
And, coming to the main question: how do I get the process chain completed? Will I have to repeat the data loads, or what options do I have to get a successfully running process chain, or at least these 2 full loads of transaction data?
Thank you - points will be assigned for helpful answers
- DB
Edited by: Darshana on Jun 6, 2008 12:01 AM
Edited by: Darshana on Jun 6, 2008 12:05 AM

One interesting discovery I just made in R/3 was this job log related to the above process chain:
It says that the job was cancelled in R/3 because the material ledger currencies were changed.
The process chain is for inventory management, and the data load processes whose jobs get cancelled in the source system are:
1. Material Valuation: period ending inventories
2. Material Valuation: prices
The Performance Assistant says the following, but I am not sure how far I can work on the R/3 side to rectify this:
Material ledger currencies were changed
Diagnosis
The currencies currently set for the material ledger and the currency types set for valuation area 6205 differ from those set at conversion of the data (production startup).
System Response
The system does not allow you to post transactions after changing the currency settings to ensure consistency.
Procedure
Replace the current settings with those entered at production start-up.
If you wish to change the currency settings, you must use programs to convert data from the old to the new currencies.
Inform your system administrator.
Anyone knowledgeable in this area, please give your inputs.
- DB

Similar Messages

  • Error in Transaction data

    Hi Folks,
    In the system logs, I encounter the message "Error in transaction data" daily.
    The messages are listed in SM21 in sequence as -
    Error processing batch input session PMFMASSIGN
    > Queue ID: 07011706105177351507
    > Transaction no. 1, block no. 1
    > Error in transaction data
    On more detailed analysis, after drilling further into each message, I found the following message relevant:
    Documentation for system log message D2 0:
    Entry relating to an error during the processing of a batch input session. In the batch input session the data for the next transaction is requested. However, the data read is not transaction data. The session has probably been destroyed in the database.
    Please explain why this issue is occurring. I remember some time back I executed RSBTCDEL2 on Production to clear invalid batch sessions, but had to terminate it due to the long runtime.
    Kindly help me fix this.
    Regards,
    Nick
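
    A minimal diagnostic sketch in SQL, for anyone who wants to confirm whether the session header still exists in the database. The batch input queue is usually held in tables APQI (headers) and APQD (data); the column names below are assumptions to verify on your release:

    -- Look up the header of the failing batch input session
    -- (a header whose data blocks are missing from APQD would match the
    --  "session has probably been destroyed" diagnosis)
    SELECT groupid, qstate, credate
    FROM apqi
    WHERE groupid = 'PMFMASSIGN';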

  • What if i load transaction data without loading master data

    Hello experts,
    What are the consequences if I load transaction data without loading master data? Are there any factors other than load performance (because of SID generation, etc.) and inconsistencies?
    What kind of potential inconsistencies will occur?
    The problem here is:
    When the transaction load starts, new master data such as employee (x) may have been created in R/3 that does not yet exist in BW, and hence the transaction load fails.
    Thanks and Regards
    Uma Srinivasa rao

    Hi Rao,
    In case you load the master data after loading the transaction data, and there is any lookup on the master data in the update rules, you can delete and reconstruct the requests in the ODS/cube so that the latest master data is pulled into the data target.
    Make sure you run the apply hierarchy/attribute change before doing the delete and reconstruct.
    Bye
    Dinesh

  • Transactional data is loaded in a infocube

    Hi Experts,
    How can I identify how much transactional data is loaded in an InfoCube? Please recommend the best way to quantify this.
    Any suitable answer will be awarded suitable points.
    Thanks!!

    Cube -> right-click -> Manage -> Requests tab -> view the loaded requests.
    See the two columns: Transferred Records and Added Records.
    Depending on the update rules, the added record count might be <= the transferred record count.
    Good Luck

  • Does each master data full load remove the previous data and load new data?

    We load company code (0COMP_CODE) master data (full load).
    On the 1st day we check the company code master data and the record count is 150; on the 2nd day the record count is 90. It sounds like each master data full load cleans out the previous data and loads the new data. Am I right? If so, what setting controls this?
    Thanks

    I don't think it does a cleanup.
    Master data records in the new load simply overwrite the records already present in the master data.
    I mean, if the same record comes again it is overwritten; if it is a new record, it gets added.
    I wouldn't expect the number of records to drop drastically from 150 to 90.
    Maybe before master data activation there could be more records (as there will be both M (modified) and A (active) records).
    cheers,
    Vishvesh
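
    To check Vishvesh's point about M and A versions, you can count the record versions in the master data table itself. A rough sketch, assuming the standard table name /BI0/PCOMP_CODE and the usual OBJVERS convention (both worth verifying on your system):

    -- Count active ('A') vs. modified, not-yet-activated ('M') master data records
    SELECT objvers, COUNT(*) AS record_count
    FROM "/BI0/PCOMP_CODE"
    GROUP BY objvers;

    If the count of 150 included M records that disappeared on activation, the drop to 90 would be explained without any cleanup.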

  • Can transaction data be loaded into an InfoObject

    Hi Gurus
    Can transaction data be loaded into InfoObjects? I would appreciate it if someone could give a simple definition of transaction data.
    GSR

    Hi,
    You can probably do that, but why would you want to? Transaction data is generally required for querying purposes, and queries run best on a multidimensional cube structure, not a flat structure.

  • Master data Full load to Delta load

    HI
    I have an order number (ZONDRNO); for this we are loading the master data from BW to BW as a full load.
    This order number is a generic DataSource.
    Can anyone help me change the master data full load to a delta load?
    Thanks in Advance
    kumar
    Edited by: kumar reddy on Oct 9, 2008 3:14 AM

    Hi,
    Firstly, you state that you are loading from BW to BW. How are you doing it? Are you doing it through a data mart?
    Secondly, you have also said that you created a generic DataSource, so from this we can deduce that you are not doing a data mart load but are loading through a DataSource.
    Thirdly, to achieve delta for a generic DataSource, you should make sure that the DataSource in question is delta-enabled.
    Once all this is done, all you need to do is a delta init and then schedule further delta loads.
    Regards,
    Pramod

  • Can we load transaction data without loading master data? Explain w.r.t. SIDs

    Can we load transaction data without loading master data? If so, can you explain how and when surrogate IDs / dimension IDs get generated?

    Hi,
    We can load the transaction data without loading master data.
    But for loading performance reasons, it is recommended not to load the transaction data first.
    Every InfoObject in BW is identified by a SID. That means when we load transaction data, considering an InfoCube scenario, the system identifies each InfoObject by the existing SID for that InfoObject in BW and correspondingly creates DIM IDs.
    Now consider the scenario where master data is not loaded and transaction data is being loaded.
    The system needs a SID to identify an InfoObject and to create the DIM ID (which is built from the SIDs of the characteristics in the dimension).
    As the SIDs are not yet created, i.e. there is no reference master data, the system will first create the SID for that InfoObject. It effectively creates a skeleton master data entry for that InfoObject, then creates the DIM IDs based on it, and then loads the data.
    This additional task of creating SIDs during the transaction data load is what hinders loading performance.
    I hope this is clear now.
    Janardhan Kumar
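
    As a rough database-side illustration of what Janardhan describes, you can compare the SID table with the attribute table for a characteristic. The /BI0/S<iobj> and /BI0/P<iobj> names below follow the usual BW naming pattern, shown here for 0MATERIAL; treat them as assumptions to verify on your system:

    -- A surplus of SIDs over active master data records hints at values
    -- whose SIDs were generated by transaction data loads alone
    SELECT
      (SELECT COUNT(*) FROM "/BI0/SMATERIAL") AS sid_count,
      (SELECT COUNT(*) FROM "/BI0/PMATERIAL" WHERE objvers = 'A') AS master_count
    FROM dual;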

  • Project Analytics 7.9.6.1 - Error while running a full load

    Hi All,
    I am performing a full load for Projects Analytics and get the following error,
    =====================================
    ERROR OUTPUT
    =====================================
    1103 SEVERE Wed Nov 18 02:49:36 WST 2009 Could not attach to workflow because of errorCode 36331 For workflow SDE_ORA_CodeDimension_Gl_Account
    1104 SEVERE Wed Nov 18 02:49:36 WST 2009
    ANOMALY INFO::: Error while executing : INFORMATICA TASK:SDE_ORA11510_Adaptor:SDE_ORA_CodeDimension_Gl_Account:(Source : FULL Target : FULL)
    MESSAGE:::
    Irrecoverable Error
    Error while contacting Informatica server for getting workflow status for SDE_ORA_CodeDimension_Gl_Account
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    The session log initialises a NULL value for the mapping parameter MPLT_ADI_CODES.$$CATEGORY. This is then used in subsequent SQL and results in an ORA-00936: missing expression error. Following are the initialisation section and the load section containing the error in the log.
    Initialisation
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_11_5_10] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [ORA_11_5_10.DATAWAREHOUSE.SDE_ORA11510_Adaptor.SDE_ORA_CodeDimension_Gl_Account_Segments.log] for session parameter:[$PMSessionLogFile].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[MPLT_ADI_CODES.$$CATEGORY].
    DIRECTOR> VAR_27028 Use override value [4] for mapping parameter:[MPLT_SA_ORA_CODES.$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[MPLT_SA_ORA_CODES.$$TENANT_ID].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_CodeDimension_Gl_Account_Segments] at [Wed Nov 18 02:49:11 2009].
    DIRECTOR> TM_6683 Repository Name: [repo_service]
    DIRECTOR> TM_6684 Server Name: [int_service]
    DIRECTOR> TM_6686 Folder: [SDE_ORA11510_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_CodeDimension_Gl_Account_Segments] Run Instance Name: [] Run Id: [17]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_CodeDimension_GL_Account_Segments [version 1].
    DIRECTOR> TM_6963 Pre 85 Timestamp Compatibility is Enabled
    DIRECTOR> TM_6964 Date format for the Session is [MM/DD/YYYY HH24:MI:SS]
    DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.6.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_CodeDimension_Gl_Account_Segments].
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6703 Session [SDE_ORA_CodeDimension_Gl_Account_Segments] is run by 32-bit Integration Service [node01_ASG596138], version [8.6.1], build [1218].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [UNICODE]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 The session sort order is [Binary].
    MAPPING> TM_6185 Warning. Code page validation is disabled in this session.
    MAPPING> TM_6156 Using low precision processing.
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM error log disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> DBG_21075 Connecting to database [orcl], user [DAC_REP]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Map] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Code] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_W_CODE_D] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_CodeDimension_Gl_Account_Segments]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Wed Nov 18 02:49:14 2009)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Wed Nov 18 02:49:14 2009)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [asgdev], user [APPS]
    READER_1_1_1> BLKR_16051 Source database connection [ORA_11_5_10] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8147 Writer: Target is database [orcl], user [DAC_REP], bulk mode [OFF]
    WRITER_1_*_1> WRT_8221 Target database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL INSERT statement:
    INSERT INTO W_CODE_D(DATASOURCE_NUM_ID,SOURCE_CODE,SOURCE_CODE_1,SOURCE_CODE_2,SOURCE_CODE_3,SOURCE_NAME_1,SOURCE_NAME_2,CATEGORY,LANGUAGE_CODE,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,W_UPDATE_DT,TENANT_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL UPDATE statement:
    UPDATE W_CODE_D SET SOURCE_CODE_1 = ?, SOURCE_CODE_2 = ?, SOURCE_CODE_3 = ?, SOURCE_NAME_1 = ?, SOURCE_NAME_2 = ?, MASTER_DATASOURCE_NUM_ID = ?, MASTER_CODE = ?, MASTER_VALUE = ?, W_INSERT_DT = ?, W_UPDATE_DT = ?, TENANT_ID = ? WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL DELETE statement:
    DELETE FROM W_CODE_D WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_CODE_D]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    Load section
    *****START LOAD SESSION*****
    Load Start Time: Wed Nov 18 02:49:16 2009
    Target tables:
    W_CODE_D
    READER_1_1_1> RR_4029 SQ Instance [mplt_BC_ORA_Codes_GL_Account_Segments.Sq_Fnd_Flex_Values] User specified SQL Query [SELECT
    FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
    FND_FLEX_VALUES.FLEX_VALUE,
    MAX(FND_FLEX_VALUES_TL.DESCRIPTION),
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME
    FROM
    FND_FLEX_VALUES,
    FND_FLEX_VALUES_TL,
    FND_ID_FLEX_SEGMENTS,
    FND_SEGMENT_ATTRIBUTE_VALUES
    WHERE
    FND_FLEX_VALUES.FLEX_VALUE_ID = FND_FLEX_VALUES_TL.FLEX_VALUE_ID AND FND_FLEX_VALUES_TL.LANGUAGE ='US' AND
    FND_ID_FLEX_SEGMENTS.FLEX_VALUE_SET_ID =FND_FLEX_VALUES.FLEX_VALUE_SET_ID AND
    FND_ID_FLEX_SEGMENTS.APPLICATION_ID = 101 AND
    FND_ID_FLEX_SEGMENTS.ID_FLEX_CODE ='GL#' AND
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM =FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_NUM AND
    FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_ID =101 AND
    FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_CODE = 'GL#' AND
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME=FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_COLUMN_NAME AND
    FND_SEGMENT_ATTRIBUTE_VALUES.ATTRIBUTE_VALUE ='Y'
    GROUP BY
    FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
    FND_FLEX_VALUES.FLEX_VALUE,
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:17 2009)
    READER_1_1_1> RR_4050 First row returned from database to reader : (Wed Nov 18 02:49:17 2009)
    LKPDP_3> DBG_21312 Lookup Transformation [mplt_ADI_Codes.Lkp_W_CODE_D]: Lookup override sql to create cache: SELECT W_CODE_D.SOURCE_NAME_1 AS SOURCE_NAME_1, W_CODE_D.SOURCE_NAME_2 AS SOURCE_NAME_2, W_CODE_D.MASTER_DATASOURCE_NUM_ID AS MASTER_DATASOURCE_NUM_ID, W_CODE_D.MASTER_CODE AS MASTER_CODE, W_CODE_D.MASTER_VALUE AS MASTER_VALUE, W_CODE_D.W_INSERT_DT AS W_INSERT_DT, W_CODE_D.TENANT_ID AS TENANT_ID, W_CODE_D.DATASOURCE_NUM_ID AS DATASOURCE_NUM_ID, W_CODE_D.SOURCE_CODE AS SOURCE_CODE, W_CODE_D.CATEGORY AS CATEGORY, W_CODE_D.LANGUAGE_CODE AS LANGUAGE_CODE FROM W_CODE_D
    WHERE
    W_CODE_D.CATEGORY IN () ORDER BY DATASOURCE_NUM_ID,SOURCE_CODE,CATEGORY,LANGUAGE_CODE,SOURCE_NAME_1,SOURCE_NAME_2,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,TENANT_ID
    LKPDP_3> TE_7212 Increasing [Index Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [1000000] to [4734976].
    LKPDP_3> TE_7212 Increasing [Data Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [2000000] to [2007040].
    READER_1_1_1> BLKR_16019 Read [625] rows, read [0] error rows for source table [FND_ID_FLEX_SEGMENTS] instance name [mplt_BC_ORA_Codes_GL_Account_Segments.FND_ID_FLEX_SEGMENTS]
    READER_1_1_1> BLKR_16008 Reader run completed.
    LKPDP_3> TM_6660 Total Buffer Pool size is 609824 bytes and Block size is 65536 bytes.
    LKPDP_3:READER_1_1> DBG_21438 Reader: Source is [orcl], user [DAC_REP]
    LKPDP_3:READER_1_1> BLKR_16051 Source database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    LKPDP_3:READER_1_1> BLKR_16003 Initialization completed successfully.
    LKPDP_3:READER_1_1> BLKR_16007 Reader run started.
    LKPDP_3:READER_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:18 2009)
    LKPDP_3:READER_1_1> CMN_1761 Timestamp Event: [Wed Nov 18 02:49:18 2009]
    LKPDP_3:READER_1_1> RR_4035 SQL Error [
    ORA-00936: missing expression
    Could you please suggest what the issue might be and how it can be fixed?
    Many thanks,
    Kiran

    I have continued the related details in the following thread:
    Mapping Parameter $$CATEGORY not included in the parameter file (7.9.6.1)
    Apologies for the inconvenience.
    Thanks,
    Kiran
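
    For reference, the failing statement in the log is the lookup override that ends up with an empty IN list because $$CATEGORY resolved to NULL. A minimal illustration of the parse error, and of what the generated SQL should look like once the parameter is populated (the category value below is hypothetical):

    -- Fails with ORA-00936: an empty IN list is not valid Oracle SQL
    SELECT * FROM W_CODE_D WHERE CATEGORY IN ();

    -- Parses fine once $$CATEGORY expands to a real value
    SELECT * FROM W_CODE_D WHERE CATEGORY IN ('GL_ACCOUNT');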

  • Set up table and transaction data delta load init

    Dear experts,
    I am following a document left by my predecessor on "post processing activities" for BI prod post go-live.
    For transaction data, it says to:
    1. Execute the init-with-data-transfer InfoPackage manually
    2. Execute the delta update InfoPackage manually
    3. Go to ECC: delete the setup data (LBWG)
    4. Set up the tables for the relevant application (OLIxxBW)
    5. Finally, activate the process chains.
    Is this sequence correct?
    Shouldn't I first delete the setup tables, fill them, and then execute the InfoPackages?
    Regards,
    Alice

    Hi,
    The sequence is correct, and after this you need to run the job for the collective run in SAP, which depends on your delta queue method:
    i.e. if you choose direct delta it is not required, but if you use 'queued delta' then you need to run the collective run to move the delta from the extraction queue to the delta queue. This job should run before the delta load.
    It is better to fill the setup tables during downtime, i.e. with no documents posted during the setup table run. But if you don't have that option, you can fill the setup tables during operating hours; you then need to follow certain steps to get all the delta data. You may need to use 'init without delta transfer' or early delta initialization in the InfoPackage, and you need an ODS as a data target to achieve this. Please search SDN; you will find more information about this.
    Thanks.

  • Error in DAC ETL Full Load (Supply Chain Inventory Transactions)

    Hi Gurus,
    My Environment is like this,
    Systems: Windows 2003 Server, 32-bit
    DAC: 10.1.3.4.1 with patch 10052370
    Informatica: PowerCenter 8.6.1 Hotfix 11
    OBIA: 7.9.6.2
    OBIEE: 10.1.3.4.1
    I created a plan with one subject area, Supply Chain Inventory Transactions. While running it I am getting an error (out of 241 tasks, 22 failed). Nothing failed in Informatica. TASK_Group_Load_PositionHierarchy failed. When I go through the log, the error is like this:
    Error:
    Values :
    Null Map
    MESSAGE:::ORA-00604: error occurred at recursive SQL level 1
    ORA-01654: unable to extend index SYS.I_WRI$_OPTSTAT_H_OBJ#_ICOL#_ST by 128 in tablespace SYSAUX
    ORA-06512: at "SYS.DBMS_STATS", line 18408
    ORA-06512: at "SYS.DBMS_STATS", line 18429
    ORA-06512: at line 1
    EXCEPTION CLASS::: java.sql.SQLException
    Error:
    MESSAGE:::DataWarehouse:DBMS_STATS.GATHER_TABLE_STATS(ownname => 'DWH_REP', tabname => 'W_POSITION_DH', estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE, method_opt => 'FOR ALL INDEXED COLUMNS SIZE AUTO',cascade => false, degree => DBMS_STATS.DEFAULT_DEGREE)
    ORA-00604: error occurred at recursive SQL level 1
    ORA-01654: unable to extend index SYS.I_WRI$_OPTSTAT_H_OBJ#_ICOL#_ST by 128 in tablespace SYSAUX
    ORA-06512: at "SYS.DBMS_STATS", line 18408
    ORA-06512: at "SYS.DBMS_STATS", line 18429
    ORA-06512: at line 1
    Values :
    Null Map
    Please help me solve this.
    Edited by: 846311 on Apr 11, 2011 10:52 PM

    Hi Naveen,
    Thanks for your reply,
    I added one more temp tablespace (temp3) and it resolved my problem; I am getting success now. The temp tablespace is used while executing large queries such as aggregations.
    Thanks,
    Rajesh Thumuluri
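
    Note that the error quoted above actually points at the SYSAUX tablespace (which holds the optimizer statistics history that DBMS_STATS writes), so checking and extending SYSAUX is another avenue. A sketch using standard Oracle dictionary views; the datafile path is only an example:

    -- How much free space is left in SYSAUX?
    SELECT tablespace_name, ROUND(SUM(bytes)/1024/1024) AS free_mb
    FROM dba_free_space
    WHERE tablespace_name = 'SYSAUX'
    GROUP BY tablespace_name;

    -- Extend SYSAUX with an additional datafile (adjust path and sizes)
    ALTER TABLESPACE SYSAUX
      ADD DATAFILE 'C:\oracle\oradata\orcl\sysaux02.dbf'
      SIZE 512M AUTOEXTEND ON NEXT 128M MAXSIZE 4096M;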

  • Error during master data delta load

    Hi All,
    I am loading delta master data to the InfoObject 0DPM_DCAS through a flexible update.
    I received the error "101 data records in table /BI0/XDPM_DCAS marked for deletion...
    158 duplicate records found. 126 records used in table /BI0/XDPM_DCAS" at the update level.
    The status is red. There is no additional info behind the Error Message button.
    Also, subsequent deltas can't be triggered; it says "Repeat can't be requested".
    Please suggest an explanation and a way out.

    Hello VishNo,
    How are you?
    This error is because of duplicate records existing in the InfoObject. It also seems to be a problem in the master data time-independent navigational attributes table (the /BI0/X table). Check the data and also the Details tab entries in the monitor screen, and come back again.
    Best Regards....
    Sankar Kumar
    +91 98403 47141

  • Loading 0employee master data (full load)

    0EMPLOYEE master data has been loaded several times in BW... it looks like sometimes the load runs and creates a new record even though the data hasn't changed. I'm guessing this is because of the time parameter entered on the (InfoPackage) update screen? How do we remove the duplicate records?
    Thanks for your help!
    Chuck

    Because it's time-dependent master data... check table PA0000.
    When you are creating your queries you have to give them a key date... an employee might have worked in one department during a certain period and in another department during another period.
    Begin date and end date are very important here.
    Hope it helps.
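
    To see why "unchanged" data can still produce new records, you can inspect the validity intervals in the time-dependent attribute table. A sketch assuming the usual /BI0/Q<iobj> naming and DATEFROM/DATETO columns for 0EMPLOYEE; the employee number is a placeholder:

    -- One row per validity interval; a full load with a new key date can cut a
    -- new interval even when the attribute values themselves are identical
    SELECT employee, datefrom, dateto
    FROM "/BI0/QEMPLOYEE"
    WHERE employee = '00001234'
    ORDER BY datefrom;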

  • Error in Source System while running the Full Load

    Hi Experts,
    We are using the 0CRM_UT_SRV_CONT_I DataSource to extract data from CRM.
    It gives an error while running the full load: 'Error occurred in source system', with message number RSM340.
    I want to mention certain points here:
    1) In RSA3 it runs fine.
    2) The connection between the BW and CRM systems is OK.
    3) We have replicated the DataSource and sent the active version from the Dev system.
    4) When I run an init without data transfer, the init is set successfully, with an entry in RSA7 in CRM.
    5) Deltas run fine with green status.
    6) As the source system is a static client, there is no data for the deltas.
    The main issue happens when we run either the full load or the init with data transfer.
    Kindly suggest.
    Thanks
    Mayank

    Hi Mayank,
    check the error logs in the source system (CRM) with transactions SM21 and ST22, and then let us know.
    Charly

  • This year master data-previous year transaction data load

    hi guys,
    I support some InfoCubes where financial data is stored and reported on. Our source systems are financial databases.
    We load transaction data at every month-end throughout the year. We load master data (full load) from flat files whenever there are changes. Whenever we change master data, we include these changes in the transaction data InfoPackage selections, so the transaction data related to the new master data is also loaded.
    Now my situation is:
    It is now June 2010. My boss said I have to reload the transaction data for 2009, as there were some changes in the rates on which the 2009 data was calculated. Some new rates came in, so they recalculated the 2009 transaction data and sent it to us. Now we have to reload this 2009 data and delete the OLD 2009 data from the cubes.
    My question: I am going to reload the 2009 transaction data, but the master data present in the system is up to date as of June 2010 (it is not the same master data as in 2009; there were a few additions and renamings until today). So what do we need to do in this type of situation?
    Also, as I said above, whenever we have changes in master data we change the transaction data InfoPackage selections so they also load the transaction data related to the new master data. So the InfoPackage selections of today do not reflect the InfoPackage selections of 2009, yet we are still trying to reload the 2009 transaction data with today's InfoPackage.
    What should we do in these situations? Do we just go ahead and load the 2009 data with the 2010 master data and the 2010 InfoPackage selections?
    Thanks a lot for your patience.
    Rgds.

    We actually have time-dependent master data... one version of master data for each year. For 2005 we have version 2, for 2006 we have version 3... The master data version and the year it is linked to are maintained in a customised table.
    So, coming back to my previous post:
    For 2009 we have version 4, and the transaction data was loaded then.
    For 2010 we have version 5, and transaction data is loaded at the end of every month. We have also decided to reload the 2009 data, because the rates on which the 2009 data was calculated changed, so our business recalculated that data and sent it to us; we have to reload the 2009 data and delete the OLD 2009 data.
    Now my questions: if we reload the 2009 data, today's master data is for 2010 (version 5) and is meant for the 2010 transaction data loads. Doesn't that create inconsistencies, or am I overthinking it? Is it fine to just go ahead and load?
    Also, the InfoPackage selections today do not match the InfoPackage selections of 2009, as there were additions in the master data that we included in the selections. Do we need to change the InfoPackage selections back to the 2009 ones until the 2009 reload is over?
    Hope I explained it better.
    Thanks a lot.
