Dataloading error

Hi,
I have an InfoPackage group (IPG). One InfoPackage failed and I have to reload it, but the error message was "Collection in source system ended", and the error messages in the Details tab are as follows:
  6 duplicate records found. 3 recordings used in table /BI0/XASSE
  6 duplicate records found. 256 recordings used in table /BI0/YASSET
What should I do now?
I fear that if I reload, duplicate records will be added again.
Regards,
Pavan

Hi,
To remove the duplicate records, empty the contents of /BI0/XASSE and /BI0/YASSET using SE14 and reload.
However, if you are loading master data, the records will be overwritten if the primary key is the same.
Regards
Please assign points if helpful.
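To illustrate the point about master data: a key-based load behaves like an upsert, so re-sending the same keys overwrites attribute values rather than creating duplicates. A minimal sketch in plain Python (the table layout and field names are made up for illustration):

```python
# Sketch: why a key-based (master data) load does not create duplicates.
# The target table is modeled as a dict keyed on the primary key, so
# loading the same key twice simply overwrites the attribute values.

def load_master_data(table, records, key_field):
    """Upsert each record into `table` by its primary key."""
    for rec in records:
        table[rec[key_field]] = rec  # same key -> overwrite, not append
    return table

table = {}
batch = [
    {"ASSET": "1000", "TEXT": "Pump A"},
    {"ASSET": "1001", "TEXT": "Pump B"},
]
load_master_data(table, batch, "ASSET")
# Reloading the same batch (e.g. after a failed request) changes nothing:
load_master_data(table, batch, "ASSET")
assert len(table) == 2
```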

Similar Messages

  • Dataload Error : Source Currency is blank or null

    Hi,
    I have loaded data from the DSO into the InfoCube and the status is Not OK.
    I am facing this data load error:
    The unit currency Source Currency 0CURRENCY with the value 'space' is assigned to the key figure 'Source key figure ZMTR2WPC' with the value '   100.00'.
    Does anyone know the resolution to this problem?
    Regards,
    Amogh

    Hi,
    One of your records has 0CURRENCY blank for key figure ZMTR2WPC.
    Either correct the record in the source or in the PSA (by assigning some value), or write a field-level unit conversion routine for 0CURRENCY for the key figure ZMTR2WPC (i.e., assign a default unit to the key figure if it is blank).
    I hope it will help.
    Thanks,
    S
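The "assign a default unit if blank" idea from the reply can be sketched as a simple field-level routine. A plain Python sketch; the default currency "USD" and the record layout are assumptions, not part of the original thread:

```python
# Sketch: supply a default currency when the source leaves the currency
# field blank, so the key figure no longer carries an amount without a unit.

DEFAULT_CURRENCY = "USD"  # assumption; pick whatever your model requires

def fix_currency(record):
    """Return the record with a non-blank currency for its amount."""
    if not record.get("CURRENCY", "").strip():
        record["CURRENCY"] = DEFAULT_CURRENCY
    return record

rec = {"AMOUNT": 100.00, "CURRENCY": " "}
assert fix_currency(rec)["CURRENCY"] == "USD"
```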

  • Max. Errors in Dataload error file

    Hello
    I am loading data into an Essbase cube and I am getting about 1,000 rejections in the dataload error file. I suspect there are more rejections that are not showing up. Is there any way to set the maximum number of errors in the dataload error file? If so, I need to increase the limit.
    Thanks

    John is, of course, as per usual, correct, but this whole exchange has me wincing.
    Look, you should never have more than 1,000 bad records. Really, you shouldn't have 10 bad records.
    Ever.
    There has to be a better process than, "I will load the data, I will get 30,000+ bad loads, I will laboriously review the dataload.err file and fix each issue, one by one, then clear out and reload (or maybe just load the errors and not clear out all data), until the next 30,000+ bad records come through, blah, blah, blah."
    How about figuring out ahead of time what's in Essbase and what's in the data, identify the difference, and load to hierarchy (if indeed that is desired, otherwise exclude from source) the missing members?
    Yes, I know, people do use the dataload.err file to do that, but that is a recipe for pain and agony.
    Just a thought.
    Regards,
    Cameron Lackpour
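Cameron's "figure out the difference ahead of time" can be as simple as a set comparison between the outline members and the members referenced by the incoming data. A plain Python sketch; how you extract the two member lists is up to you and is assumed here:

```python
# Sketch: find data members that are missing from the outline *before*
# loading, instead of mining dataload.err afterwards.

def missing_members(outline_members, data_members):
    """Members referenced by the data but absent from the outline."""
    return sorted(set(data_members) - set(outline_members))

outline = ["140099", "150099", "160099"]
data = ["140099", "150099", "999999"]
print(missing_members(outline, data))  # ['999999'] -> build these first
```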

  • Essbase Studio Dataload error

    Hi There,
    I work with Essbase Studio (11.1.1.3) and am able to deploy dimensions successfully; the error I get is from the data load. I took the SQL I use in Essbase Studio, ran it directly in SQL, and saved the data file.
    I reloaded that data from EAS and it was successful. But I keep getting errors when deploying data from Essbase Studio, with the same SQL. Essbase Studio error log:
    Dataload errors sample:
    Error: 3303;#~@12#0#257#0@#~@3#0#488#0@#~140099;2010     140099      0000003039     13130      Nov     5203      14964.420     ;
    Error: 3303;#~@12#0#257#0@#~@3#0#488#0@#~150099;2010     150099      0000002573     13724      Sep     5203      37183.000     ;
    Error: 3303;#~@12#0#257#0@#~@3#0#488#0@#~150099;2010     150099      0000002573     14796      Sep     5203      6465.000     ;
    So I am wondering why the same SQL data loads through EAS without any issue but produces the above errors when loading through Studio.

    Looking at the error, it seems to be complaining about 140099 and 150099, which are DeptIDs. Here is sample data returned for these two DeptIDs from the data set:
    2010     140099      0000003039     13130      Nov     5203      14964.420
    2010     150099      0000002573     13724      Sep     5203      37183.000
    2010     150099      0000002573     14796      Sep     5203      6465.000
    2010     150099      0000002596     14796      Aug     5203      105932.000
    Year, DeptID, VendorID, ProjectID, Month and Account, TotalAmt is the layout.
    I don't see an issue here. Can you tell if anything is wrong with the above?

  • Dataload Errors - No Load Rule

    I am using Essbase v11.1.1.3 and attempting to load an exported file via EAS. If you do not select a load rule, no dataload.err file is created. I think I'm missing something; how do I get a dataload error file?

    This is expected behaviour. To get an error file you have to use a load rule.

  • 11.1.2.1 execute dataload error in erpi and FDM

    Hi
    I have run into a critical problem when executing a data load in ERPi using the classic method.
    There is data in E-Business Suite, but I don't know why I can't pull data from EBS.
    I also executed the data load using the FDM method, and the import step in FDM yields an error: "error creating database link".
    Could you help me find the problem?
    regards,
    Samantha
    The error message is as follows:
    java.lang.Exception: ERROR: Drill Region was not created/updated. No valid data rows exist in TDATASEG_T for Process ID: 14
    ERPI Process Start, Process ID: 14
    ERPI Logging Level: DEBUG (5)
    ERPI Log File: C:\DOCUME~1\ADMINI~1\LOCALS~1\Temp\1\/aif_14.log
    Jython Version: 2.1
    Java Platform: java1.4.2_08
    COMM Process Periods - Insert Periods - START
    COMM Process Periods - Insert Periods - END
    COMM Process Periods - Insert Process Details - START
    COMM Process Periods - Insert Process Details - END
    LKM EBS/FS Extract Type - Set Extract Type - START
    LKM EBS/FS Extract Type - Set Extract Type - END
    LKM EBS/FS Extract Delta Balances - Delete Obsolete Rows - START
    LKM EBS/FS Extract Delta Balances - Delete Obsolete Rows - END
    LKM EBS/FS Extract Delta Balances - Load Audit - START
    LKM EBS/FS Extract Delta Balances - Load Audit - END
    COMM End Process Detail - Update Process Detail - START
    COMM End Process Detail - Update Process Detail - END
    LKM EBS/FS Extract Type - Set Extract Type - START
    LKM EBS/FS Extract Type - Set Extract Type - END
    LKM EBS/FS Extract Delta Balances - Delete Obsolete Rows - START
    LKM EBS/FS Extract Delta Balances - Delete Obsolete Rows - END
    LKM EBS/FS Extract Delta Balances - Load Audit - START
    LKM EBS/FS Extract Delta Balances - Load Audit - END
    COMM End Process Detail - Update Process Detail - START
    COMM End Process Detail - Update Process Detail - END
    EBS/FS Load Data - Load TDATASEG_T - START
    Import Data from Source for Period '2011年八月'
    Monetary Data Rows Imported from Source: 182
    Total Data Rows Imported from Source: 182
    EBS/FS Load Data - Load TDATASEG_T - END
    COMM Update Data - Update TDATASEG_T/TDATASEGW - START
    Warning: Data rows with zero balances exist
    Zero Balance Data Rows Deleted: 8
    Processing Mappings for Column 'ACCOUNT'
    Data Rows Updated by 'DEFAULT' (LIKE): 174
    Processing Mappings for Column 'ENTITY'
    Data Rows Updated by 'DEFAULT' (LIKE): 174
    Processing Mappings for Column 'ICP'
    Data Rows Updated by 'DEFAULT' (LIKE): 174
    Processing Mappings for Column 'UD1'
    Data Rows Updated by 'Custom1' (LIKE): 174
    Processing Mappings for Column 'UD2'
    Data Rows Updated by 'Custom2' (LIKE): 174
    Processing Mappings for Column 'UD3'
    Data Rows Updated by 'Custom3' (LIKE): 174
    Processing Mappings for Column 'UD4'
    Data Rows Updated by 'Custom4' (LIKE): 174
    Processing Mappings for Column 'UD5'
    Data Rows Updated by 'Value' (LIKE): 174
    Warning: Data rows with unmapped dimensions exist
    Data Rows Marked as Invalid: 174
    Warning: No data rows available for Export to Target
    Total Data Rows available for Export to Target: 0
    COMM Update Data - Update TDATASEG_T/TDATASEGW - END
    COMM End Process Detail - Update Process Detail - START
    COMM End Process Detail - Update Process Detail - END
    EBS/FS Load Data - Load TDATASEG_T - START
    Import Data from Source for Period '2011年九月'
    Monetary Data Rows Imported from Source: 193
    Total Data Rows Imported from Source: 193
    EBS/FS Load Data - Load TDATASEG_T - END
    COMM Update Data - Update TDATASEG_T/TDATASEGW - START
    Warning: Data rows with zero balances exist
    Zero Balance Data Rows Deleted: 69
    Processing Mappings for Column 'ACCOUNT'
    Data Rows Updated by 'DEFAULT' (LIKE): 124
    Processing Mappings for Column 'ENTITY'
    Data Rows Updated by 'DEFAULT' (LIKE): 124
    Processing Mappings for Column 'ICP'
    Data Rows Updated by 'DEFAULT' (LIKE): 124
    Processing Mappings for Column 'UD1'
    Data Rows Updated by 'Custom1' (LIKE): 124
    Processing Mappings for Column 'UD2'
    Data Rows Updated by 'Custom2' (LIKE): 124
    Processing Mappings for Column 'UD3'
    Data Rows Updated by 'Custom3' (LIKE): 124
    Processing Mappings for Column 'UD4'
    Data Rows Updated by 'Custom4' (LIKE): 124
    Processing Mappings for Column 'UD5'
    Data Rows Updated by 'Value' (LIKE): 124
    Warning: Data rows with unmapped dimensions exist
    Data Rows Marked as Invalid: 124
    Warning: No data rows available for Export to Target
    Total Data Rows available for Export to Target: 0
    COMM Update Data - Update TDATASEG_T/TDATASEGW - END
    COMM End Process Detail - Update Process Detail - START
    COMM End Process Detail - Update Process Detail - END
    COMM Update YTD Amounts - Update YTD Amounts - START
    COMM Update YTD Amounts - Update YTD Amounts - END
    ODI Hyperion Financial Management Adapter Version 9.3.1
    Load task initialized.
    LKM COMM Load Data into HFM - Load Data to HFM - START
    Connecting to Financial Management application [EEHFM2] on [taly_cluster] using user-name [admin].
    Connected to Financial Management application.
    HFM Version: 11.1.2.1.0.
    Options for the Financial Management load task are:
    Load Options validated.
    Source data retrieved.
    Pre-load tasks completed.
    HFM Log file path: "C:\DOCUME~1\ADMINI~1\LOCALS~1\Temp\1\aif_14HFM59238.log".
    Load to Financial Management application completed.
    Load task statistics - success-count:0, error-count:0
    Cleanup completed.
    Disconnected from Financial Management application.
    LKM COMM Load Data into HFM - Load Data to HFM - END
    ERPI Process End, Process ID: 14

    The data load method for your target application should be set to "FDM" if you are going to be using FDM for your Data Load Rule imports. The Error in creating the DB link is due to the FDM Schema not being granted the DBLINK role in Oracle. In the FDM DBA Guide it explains that the FDM Schema must be granted the create DBLINK role when using the ERPi adapter.

  • Dataload error

    Hi all,
    Each time I run my script and load a file, the same file loads from the Navigator.
    You can see the same file imported into the Data Portal, marked in red.
    Please help me resolve this.
    Attachments: dataload.JPG (125 KB)

    Hello RSH,
    This is not an error; it is defined behavior in DIAdem. DIAdem has, by default, a unique group name and channel name condition: the name of a group must be unique in the Data Portal, and a channel name must be unique within a channel group. If you load data that does not meet that condition, DIAdem automatically counts up the name; if the end of the name is already a number, that number is counted up. To prevent confusion between the original name and the counted-up name you can choose a separator, which you can set under "General Settings", "Unique group and channel names".
    Greetings
    Walter
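The count-up behavior Walter describes can be sketched like this (plain Python; the optional separator argument stands in for the DIAdem setting and is an assumption):

```python
import re

# Sketch: DIAdem-style unique naming -- if a name already exists, append
# (or increment) a trailing number, optionally after a separator.

def unique_name(name, existing, sep=""):
    if name not in existing:
        return name
    m = re.match(r"^(.*?)(\d+)$", name)
    base, n = (m.group(1), int(m.group(2))) if m else (name + sep, 0)
    candidate = f"{base}{n + 1}"
    while candidate in existing:
        n += 1
        candidate = f"{base}{n + 1}"
    return candidate

print(unique_name("Speed", {"Speed"}))    # Speed1
print(unique_name("Speed1", {"Speed1"}))  # Speed2
```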

  • Maxl statement dataload error

    Hi all, when I load the data through MaxL it throws errors and I am confused.
    This is my MaxL:
    alter application 'CDG' disable connects;
    alter system logout session on application 'CDG'
    and partition script
    import database 'CDG'.'cdgc'data connect as 's-odi' identified by 'Nbvc001' using server rules_file 'cdsdata' on error write to '\\dcfr-41\logs\Repcdgcy_Err.log';
    It throws the error: syntax error near [$ ]
    but I am sure there is no '$' anywhere in this statement.
    Sometimes it also gives the error:
    logging out user [admin] active for 0 minutes
    When I execute the statements individually, one by one, there are no errors and everything executes successfully, but when I execute them all at once in the MaxL editor it gives that error.
    Please help with these two errors; it's urgent.
    Thanks

    alter system logout session on application 'CDG'
    and partiton script
    import database 'CDG'.'cdgc'data connect as 's-odi' identified by 'Nbvc001' using server rules_file 'cdsdata' on error write to '\\dcfr-41\logs\Repcdgcy_Err.log';
    This may be a long shot, but I see two possible problems. First, there is no semicolon at the end of the "alter system" statement above. Second, in the import statement there is no space between 'cdgc' and data. MaxL might be getting confused by that.
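For reference, here are the same statements with those two fixes applied (a terminating semicolon on the alter system statement, and a space between 'cdgc' and data); everything else is left exactly as posted:

```maxl
alter application 'CDG' disable connects;
alter system logout session on application 'CDG';
import database 'CDG'.'cdgc' data connect as 's-odi' identified by 'Nbvc001' using server rules_file 'cdsdata' on error write to '\\dcfr-41\logs\Repcdgcy_Err.log';
```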

  • Dataload error using MaxL on LINUX

    Hi All,
    I'm using Essbase 11.1.2.1 and working on an ASO cube. I'm trying to load data using MaxL on Linux.
    This is the error: essmsh error: End of File breaks the Statement.
    Any ideas?
    Thanks,

    rasch wrote:
    Hi,
    I am using DAQmx-base (C library) on Linux with a USB-6009 device. When I am acquiring data
    using DAQmxBaseReadAnalogF64() (the code is based on the example contAcquireNChan.c)
    everything works, but when I am using DAQmxBaseReadBinaryI16() the following error
    happens:
    DAQmxBase Error: Value passed to the Task/Channels In control is invalid.
    Everything is the same just the ...AnalogF64 is replaced by the ...BinaryI16 function call
    (of course, the buffer used for storing the data was also changed from float64 to int16).
    Does anyone know what settings (or missing settings) result in this error?
    Every explanation or pointer to some documentation is welcome. I already checked the
    C reference HTML documentation coming with DAQmx-base without success.
    Thanks
    Raphael
    From the ReadMe file for DAQmx Base on Macintosh OS X:
    NI USB-9211
    *Analog Input
    -Multi-channel, multi-sample read (scaled)
    -Multi-channel, single sample read (scaled)
    NI USB-9215
    *Analog Input
    -Multi-channel, multi-sample read (scaled)
    -Multi-channel, single sample read (scaled)
    NI USB-6008/9
    *Analog Input
    -Digital start triggering
    -Multi-channel, multi-sample read (scaled)
    -Multi-channel, single sample read (scaled)
    Raw input simply isn't supported for the USB devices. I would guess that the situation for Linux is the same, since DAQmx Base is based on Labview code in order to achieve cross-platform portability. See the thread
    http://forums.ni.com/ni/board/message?board.id=250&message.id=11873
    and, for a slightly different problem:
    http://forums.ni.com/ni/board/message?board.id=250&message.id=11963
    Be sure to read down to where there is a message indicating that support for USB devices is planned for NI-DAQmx. Of course, that won't help on Macintosh and Linux.
    John Weeks
    WaveMetrics, Inc.
    Phone (503) 620-3001
    Fax (503) 620-6754
    www.wavemetrics.com

  • Inconsistent dataload error

    hi Gurus
    I got a peculiar error. I am loading master data (product group) from R/3 to BW 3.5. For example, out of 10 product groups I am getting data for 9, and the 10th one shows blank. I checked the R/3 side and everything is fine; I checked the mapping on the BW side and it works fine; and I checked the update rules as well. I am not sure why the 10th PG's data is not coming.
    I will deeply appreciate it if you can help me out.
    Regards
    Raju

    Hi,
    What is the status of the IP monitor? Is it yellow? Is the extraction completed? Have you checked the extraction job in the source system? Is it showing "Missing Messages" for the 10th data packet, or is that packet also green but with 0 records?
    If extraction is completed but the load is still stuck in the 10th data packet (it is in yellow state) and this persists for a long time, check the tRFCs in SM58. If any tRFC is there, push it with F6. If the load still does not progress, check for short dumps in ST22; if there is a short dump, set the QM status to red, delete the request from the target, and repeat the load. If there are no tRFCs and no short dumps and the condition still persists, check the previous log: after how much time does it pick up the 10th packet, and how long does the IP usually take to complete? If required, repeat the load.
    If the 10th packet is green but the number of records is 0, go to transaction RSA3 in the source system, enter the DataSource name, write 10 in the "Display Extr. Calls" field, and execute. Check in this way whether the data packet contains any records; if it does, set the QM status to red, delete the request from the target, and repeat the load, then see what happens.
    Regards,
    Debjani

  • Dataload error after refresh partitionned data

    Hi,
    I need some help with a problem:
    I have a replication partition, and in the process there is a DLR to load some data into the target cube after the replicated data is processed. I get this error:
    ERROR - 1003069 - Data Load Error: No data is loaded as the user does not have permission.
    The user used for creating the replication and for the DLR has "Provisioning Manager" rights on the two cubes. Why is there a lock in the target cube?
    What's wrong with that?!
    Xsay

    Provisioning Manager only has rights to provision other users. Give the user admin rights.

  • MaxL error 1241114 - There were errors, look in...

    I am running the following MaxL in a batch script on one of our Essbase cubes, which passes the MaxL to ESSMSH.
    MaxL:
    login admin identified by ******* on *******;
    iferror "errorhandler";
    spool stdout on to "\\*******\\*******\\Logs\\*******BefLoad.log";
    iferror "errorhandler";
    spool stderr on to "\\*******\\*******\\Logs\\*******BefLoad.err";
    iferror "errorhandler";
    set timestamp on;
    execute calculation "'FIX(\"*******\",@ICHILDREN(\"*******\")) CLEARDATA \"*******\"; ENDFIX'" on *******.*******;
    iferror "errorhandler";
    import database *******.******* dimensions from data_file "\\\\*******\\FDMDATA\\*******\\Outbox\\*******.Dat" using server rules_file "L_Data" preserve input data on error append to "\\\\*******\\*******\\Logs\\*******BefLoad.err";
    iferror "errorhandler";
    logout;
    exit 0;
    define label "errorhandler";
    exit 10;
    However, every time the MaxL executes, I have the following single line generated in the "\\*******\*******\Logs\*******BefLoad.err" file:
    WARNING - 1241114 - There were errors, look in \\*******\*******\Logs\*******BefLoad.err.
    This error is actually pointing to itself (the .err file), however this is the only line with no further information, and is generated every time the batch script is run, even though the build appears to have worked fine. I have double-checked and there are no errors (or even warnings) in the Essbase application log to suggest anything has gone wrong!
    Does anyone please have a suggestion as to why it is being generated? Can we just ignore this error? I have read the following, which points out that data load errors will not be trapped by the MaxL "iferror" statement. However, it would be useful to know why this error is occurring and specifically what is going wrong.
    MaxL Error Codes

    Hi James -
    Your dataload errors are getting spooled to the same file as Essbase is trying to spool the message saying that there were dataload errors. I wouldn't count on that working the way you expect it to, and it's not an approach I have ever used - I'd think keeping the stderr spool separate (i.e. separate files) from the dataload errors makes sense.
    My bet is that if you point the two error streams to different files you'll see that there are in fact load errors, they're just getting messed up by the fact that stderr and dataload errors are pointing at the same file.
    BTW, I appreciate that you're anonymizing for client confidentiality (amazing how many folks post info that obviously identifies their company and give out server / user names...), but any chance you could use e.g. 'appname', 'dbname' etc rather than *******? Asterisks make my eyes hurt, but more importantly they make the different tokens impossible to distinguish. Cheers!
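In other words, point the stderr spool and the data load error output at two different files. A sketch with placeholder names (appname, dbname, and the paths are stand-ins for the anonymized values):

```maxl
spool stderr on to "\\server\Logs\BefLoad_stderr.err";
import database appname.dbname dimensions from data_file "\\server\FDMDATA\Outbox\data.Dat"
    using server rules_file 'L_Data' preserve input data
    on error append to "\\server\Logs\BefLoad_dataload.err";
```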

  • How to sort out the data load error?

    Hi,
    I have a data model which has the following flow:
    First from IS to ODS1.
    Second from ODS1 to ODS2
    Third from ODS2 to Infocube.
    Now, while loading from ODS2 to the InfoCube, there were some date fields which were incorrect, and hence there was a data load error. Since the first two data loads were successful, I am not sure where I should change the date values.
    The incorrect date values were in the format: "10.0..06.2"
    Any ideas?
    Raj

    Hi
    Check this links:
    http://sap.ittoolbox.com/groups/technical-functional/sap-bw/loading-fails-from-an-ods-into-a-cube-error-603460
    Re: error while loading data into cube
    Hope it helps
    Regards
    Ashwin
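Independent of where the fix goes (source, PSA, or a transfer/update routine), malformed values like "10.0..06.2" can be caught with a simple pattern check before the cube load. A plain Python sketch; the DD.MM.YYYY layout is an assumption about the expected format:

```python
import re

# Sketch: flag date fields that do not match the expected DD.MM.YYYY layout
# so bad records can be corrected (or routed to an error stack) before the
# cube load fails on them.

DATE_RE = re.compile(r"^\d{2}\.\d{2}\.\d{4}$")

def is_valid_date(value):
    return bool(DATE_RE.match(value))

print(is_valid_date("10.06.2006"))  # True
print(is_valid_date("10.0..06.2"))  # False -> fix in PSA / routine
```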

  • Process overdue error

    Hi all,
    How do I resolve the process overdue error and tRFC errors? Please tell me.
    Rgds
    Bharath

    Hi,
    Follow these threads:
    Process Overdue error
    error: process overdue
    Process is overdue (during dataload)
    error like process ovedue in extraction data
    Khaja

  • Installation Ides ECC 5.0

    Dear Experts,
    During IDES ECC 5.0 installation I am facing a problem in the dataload step:
    ERROR 2010-08-16 22:42:24
    MSC-01015  Process finished with error(s), check log file D:\Program Files\sapinst_instdir\ECC04SR1\WEBAS_ABAP_ORA_NUC\DB/SAPAPPL1.log
    ERROR 2010-08-16 22:42:24
    MSC-01015  Process finished with error(s), check log file D:\Program Files\sapinst_instdir\ECC04SR1\WEBAS_ABAP_ORA_NUC\DB/SAPSSEXC.log
    ERROR 2010-08-16 22:42:54
    MSC-01015  Process finished with error(s), check log file D:\Program Files\sapinst_instdir\ECC04SR1\WEBAS_ABAP_ORA_NUC\DB/SAPAPPL2.log
    ERROR 2010-08-16 22:42:54
    MSC-01015  Process finished with error(s), check log file D:\Program Files\sapinst_instdir\ECC04SR1\WEBAS_ABAP_ORA_NUC\DB/SAPSDIC.log
    Log : DbSl Trace: ORA-1403 when accessing table SAPUSER
    I:\usr\sap\SIV\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    Version : Oracle 9.2
            Os : Windows 2003 with SP2
                    Non-UniCode
    Thanks in advance,
    Surendra Jain

    Dear Vincent,
    Thanks for your quick reply.
    DbSl Trace: ORA-1403 when accessing table SAPUSER
    (DB) INFO: connected to DB
    (DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): WE8DEC
    (TSK) ERROR: file D:\Program Files\sapinst_instdir\ECC04SR1\WEBAS_ABAP_ORA_NUC\DB/SAPAPPL0.TSK.bck already seems to exist
                 a previous run may not have been finished cleanly
                 file D:\Program Files\sapinst_instdir\ECC04SR1\WEBAS_ABAP_ORA_NUC\DB/SAPAPPL0.TSK possibly corrupted
    I:\usr\sap\SIV\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    I:\usr\sap\SIV\SYS\exe\run/R3load.exe: END OF LOG: 20100816224155
    (DB) INFO: E071K~0 created #20100816225233
    (Tried 2nd time installation ,listener working, able to connect with DB)
    Surendra
