Data Loader - Only imports first record; remaining records fail

I'm trying to use Data Loader to import a group of opportunities. Every time I run the Data Loader it only imports the first record; all of the other records fail with the message "An unexpected error occurred during the import of the following row: 'External Unique Id: xxxxxxx'". After running the Data Loader, I can modify the data file and remove the first record, which was imported. Running the Data Loader again, the first row (previously the second row) imports successfully.
Any idea what could be causing this behavior?


Similar Messages

  • VBS-Scripting: DATAFILEIMPORT (ASCII) only imports first 20 channels

    Hello,
    I've got some problems with the automated data file import via VBS script. The DATAFILEIMPORT command works, but it only imports the first 20 channels present in the data file; the remaining 30 channels are ignored. If I import the same data file with the 'Open' command in the File menu of DIAdem Navigator, all data is imported correctly. What is the problem here?
    I'm using DIAdem 9.01 Service Pack 1.
    This is the command I'm using:
    Call DATAFILEIMPORT("D:\Messdaten\August_04\12.08.2004\Ergebnisse\2004-08-12_16-00-00_19-00-00_10sec.txt","asciiFilter",False)
    Thanks in advance
    Stefan

    Hello Stefan,
    Problem identified! You created the STP file from a data file that had 20 channels, and you are trying to apply that STP file to a data file with 46 channels. That is the problem: an STP file can only be reused for several data files if the structure of those files is identical, i.e. all files have the same number of channels. The reason it works interactively is a default assumption that is made for the channels - but that assumption is not necessarily correct (interactively it can still be corrected; via script it cannot). In DIAdem 9.0 there is no solution to this problem other than creating a separate STP file for each different data structure. In DIAdem 9.1 we have created a flexible alternative with the DataPlugins. You can find information about this at www.ni.com/dataplugins.
    Regards
    Walter
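
    A minimal sketch of the workaround described above, assuming one STP file has been created per file structure; the filter names and file paths below are illustrative, not real ones from this setup:
    ' One STP filter per file structure (names are assumptions):
    ' "asciiFilter20" was saved from a 20-channel data file,
    ' "asciiFilter46" from a 46-channel data file.
    Call DATAFILEIMPORT("D:\Messdaten\20_channel_data.txt", "asciiFilter20", False)
    Call DATAFILEIMPORT("D:\Messdaten\46_channel_data.txt", "asciiFilter46", False)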

  • IPTC Date Created only imports for October

    I'm creating JPEGs with only IPTC and EXIF data in them, and the field 2:55 Date Created only seems to appear in File Info / Origin / Date Created in CS4 if the date is set to October.
    Manually editing the JPEG file works for, say, 20081007, but 20080907 fails, even if I change the remaining EXIF and IPTC date fields to match.
    I don't suppose anyone has any ideas? This is driving me nuts!
    Thanks
    Derek

    No, that changes the date on the EXIF panel; in my case, that is the date the scans were made. I want that date to stand and only the IPTC Date Created date changed.
    It probably doesn't matter anyway, since I have just discovered that changes I make to IPTC metadata in Aperture cannot be written to the master copy. At least not for TIFF files; strangely, it does seem to work for JPEGs.
    Too bad. I liked some Aperture features.

  • Importing movie only imports first 15 seconds

    I am trying to import a large movie to try out my new MBP. For some reason only the first 15 seconds are imported and nothing else.
    Prior to my upgrade to iLife '11 the import took several minutes.
    I am new to my Mac; I love it, but I am still learning.
    Any help would be appreciated.

    What kind of large movie? Is it from a camcorder or somewhere else?
    Will it open in QuickTime Player? If so, click WINDOW/INSPECTOR and tell us everything listed by the word FORMAT:

  • Data merge only prints first record

    Help, please: I've done it over and over, checked my .csv file, replaced prefs...
    BUT data merge will only print the first record, either in Create Merged Document or Export to PDF.
    It WILL preview all records.

    We need a LOT more information, starting with the OS and the version of ID, including any applied patches.
    Next we need to know whether you are doing a single record per page or multiple records, whether the placeholders are on the master page, how many pages are in the document, and whether they all have fields on them (some screen captures might be useful -- embed them using the camera icon on the editing toolbar on the web page rather than attaching, if it works [there seem to be some issues at the moment, though only for some people]).
    What else is on the page? Are you really telling it to merge all the records, or just one?
    You get the idea... a full description of what you have, what you are doing, and what you get instead of what you expect.

  • Data load is not picking up all records

    Hello, I wonder if anyone has encountered the following problem. I've been loading a flat file to BPC for a year now with no issues. This time, however, it seems that not all records were loaded, even though the status indicated that the correct number of rows were successfully loaded. Any ideas what may be causing this? I've checked the conversion files and the account dimension, and nothing has changed. Any help with this is greatly appreciated; ideas on how and where to check for potential trouble spots are all welcome.
    Thanks in advance.
    David

    Hello,
        If you are using the standard SSIS package import, the rejected records should be reported in the log file at the end. However, to see whether there are rejected records or not, you can try to validate your transformation file using the same import file.
    You can also verify the temporary files generated in D:\BPC\Data\Webfolders\APSHELL_Sorin\Finance\PrivatePublications. In this way you can find the actual values imported into the cube.
        How do you check whether the data is loaded or not?
    Best regards,
    Mihaela

  • iMovie only imports first 10 seconds of a video clip that is 14 seconds long

    I have a Samsung HMX20 camcorder, and when I import a few of the clips, which are around 14 seconds long, into iMovie, I only get about 10 seconds imported into my project (so basically I don't get the end of the clip). What's going on?

    It looks like your _High Definition Samsung SC-HMX20_ is NOT listed as supported by iMovie '09.
    http://support.apple.com/kb/HT3290#2
    The only camera listed is the _Standard Definition Samsung SC-MX20_
    There is also an article about it:
    http://support.apple.com/kb/HT2986

  • Select first and last records in grouped results - Oracle 11g

    Say I have the following information in an Oracle 11g table:
    Qty    Production order     Date and time
    20     00000000000000001    12-JAN-14 00:02
    20     00000000000000001    12-JAN-14 00:05
    20     00000000000000001    12-JAN-14 00:07
    20     00000000000000001    13-JAN-14 00:09
    30     00000000000000002    12-JAN-14 00:11
    30     00000000000000002    12-JAN-14 00:15
    30     00000000000000002    12-JAN-14 00:20
    30     00000000000000002    14-JAN-14 00:29
    I would like to write a query that would return the following:
    Qty    Production order     First              Last
    80     00000000000000001    12-JAN-14 00:02    13-JAN-14 00:09
    120    00000000000000002    12-JAN-14 00:11    14-JAN-14 00:29
    That is, the sum of the Qty column grouped by Production order, and the date/time of the first and last records for each Production order.
    I came up with a query that yielded this result:
    Qty    Production order     First              Last
    80     00000000000000001    12-JAN-14 00:02    14-JAN-14 00:29
    120    00000000000000002    12-JAN-14 00:02    14-JAN-14 00:29
    This means the First and Last columns show the overall first and last date/time of the whole table. Please note that this is a dummy table. Sorry, I am not allowed to share the actual query I came up with, since work policies do not permit it. Also, I tried window functions such as rank() and row_number(), but my user does not have enough privileges to use them.
    Any help or hints will be greatly appreciated.

    Because Oracle does not store rows in any particular order, it would be wrong to assume that the "first date" is simply the first row processed by the query.
    Therefore you would have to supply some other column if you do not want to consider the table as ordered by date.
    Also, any analytic function will require you to supply the ORDER BY, and if it is the date, then a simple query will do:
    WITH Tab1 (Qty, Production_Order, Pdate)
         AS (SELECT 20, '00000000000000001', TO_DATE ('12-JAN-14 00:02', 'DD-MON-YY HH24:MI') FROM DUAL UNION ALL
             SELECT 20, '00000000000000001', TO_DATE ('12-JAN-14 00:05', 'DD-MON-YY HH24:MI') FROM DUAL UNION ALL
             SELECT 20, '00000000000000001', TO_DATE ('12-JAN-14 00:07', 'DD-MON-YY HH24:MI') FROM DUAL UNION ALL
             SELECT 20, '00000000000000001', TO_DATE ('13-JAN-14 00:09', 'DD-MON-YY HH24:MI') FROM DUAL UNION ALL
             SELECT 30, '00000000000000002', TO_DATE ('12-JAN-14 00:11', 'DD-MON-YY HH24:MI') FROM DUAL UNION ALL
             SELECT 30, '00000000000000002', TO_DATE ('12-JAN-14 00:15', 'DD-MON-YY HH24:MI') FROM DUAL UNION ALL
             SELECT 30, '00000000000000002', TO_DATE ('12-JAN-14 00:20', 'DD-MON-YY HH24:MI') FROM DUAL UNION ALL
             SELECT 30, '00000000000000002', TO_DATE ('14-JAN-14 00:29', 'DD-MON-YY HH24:MI') FROM DUAL)
    SELECT   SUM (Qty), Production_Order, MIN (Pdate), MAX (Pdate)
        FROM Tab1
    GROUP BY Production_Order
    ORDER BY Production_Order;

      SUM(QTY) PRODUCTION_ORDER     MIN(PDATE)                    MAX(PDATE)
            80 00000000000000001    12-Jan-2014 00:02:00          13-Jan-2014 00:09:00
           120 00000000000000002    12-Jan-2014 00:11:00          14-Jan-2014 00:29:00
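
    If you also need non-date columns from the first and last rows of each group (for example, the Qty on the earliest and latest records), one possible alternative is Oracle's FIRST/LAST aggregate. It is an aggregate rather than a window function, so it may work even where rank() and row_number() are restricted; a minimal sketch against the same Tab1 data (the column aliases are illustrative):
    SELECT   Production_Order,
             SUM (Qty)                                          AS Total_Qty,
             MIN (Pdate)                                        AS First_Dt,
             MAX (Pdate)                                        AS Last_Dt,
             MIN (Qty) KEEP (DENSE_RANK FIRST ORDER BY Pdate)   AS Qty_On_First,
             MAX (Qty) KEEP (DENSE_RANK LAST  ORDER BY Pdate)   AS Qty_On_Last
        FROM Tab1
    GROUP BY Production_Order
    ORDER BY Production_Order;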

  • Need to generate multiple error files with rule file names during parallel data load

    Hi,
    Is there a way that MAXL could generate multiple error files during parallel data load?
    import database AsoSamp.Sample data
      connect as TBC identified by 'password'
      using multiple rules_file 'rule1' , 'rule2'
      to load_buffer_block starting with buffer_id 100
      on error write to "error.txt";
    I want to get error files like this: rule1.err, rule2.err (error files with the rule file name included). Is this possible in MaxL?
    I also ran into a situation where, if I hard-code the error file name as above, it gives me error file names error1.err and error2.err. Is there any solution for this?
    Thanks,
    DS

    Are you saying that if you specify the error file as "error.txt", Essbase actually produces multiple error files and appends a number?
    Tim.
    Yes, it appends numbers the way I said.
    Out of interest, though - why do you want to do this?  The load rules must be set up to select different 'chunks' of input data; is it impossible to tell which rule an error record came from if they are all in the same file?
    I have 6-7 rule files with which the data is pulled from SQL and loaded into Essbase. I am not saying it is impossible to track the error record.
    Regardless, the only way I can think of to have total control over the error file name is to use the 'manual' parallel load approach, sketched below.  Set up a script to call multiple instances of MaxL, each performing a single load to a different buffer.  Then commit them all together.  This gives you most of the parallel load benefit, albeit with more complex scripting.
    Even I had the same thought of calling multiple instances of MaxL from a shell script.  Could you please elaborate on this process? What sort of complexity is involved in this approach? Has anyone tried it before?
    Thanks,
    DS
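
    A minimal sketch of that manual parallel load approach, assuming an ASO cube and the two rules files from the question; each import would be run by a separate MaxL instance against its own buffer, and the final statement commits both buffers together (the buffer IDs and error file names are illustrative):
    /* MaxL instance 1: load rule1 into buffer 100, errors go to rule1.err */
    alter database AsoSamp.Sample initialize load_buffer with buffer_id 100;
    import database AsoSamp.Sample data
      connect as TBC identified by 'password'
      using server rules_file 'rule1'
      to load_buffer with buffer_id 100
      on error write to 'rule1.err';

    /* MaxL instance 2: load rule2 into buffer 101, errors go to rule2.err */
    alter database AsoSamp.Sample initialize load_buffer with buffer_id 101;
    import database AsoSamp.Sample data
      connect as TBC identified by 'password'
      using server rules_file 'rule2'
      to load_buffer with buffer_id 101
      on error write to 'rule2.err';

    /* after both loads finish: commit the contents of both buffers together */
    import database AsoSamp.Sample data from load_buffer with buffer_id 100, 101;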

  • Rate data loader

    Hey, Gurus!
    Does anyone know how many records Oracle Data Loader On Demand sends per package?
    I didn't find anything in the documentation (Data Loader FAQ, Data Loader Overview for R17, Data Loader User Guide).
    thanks in advance
    Rafael Feldberg

    Rafael, there is no upper limit on the number of records the Data Loader can import. However, after doing a test import using the Import Wizard, I would recommend keeping the number of records at a reasonable level.

  • Data loading from ODS to ODS

    Hi All,
    I have one source ODS and one target ODS. Can you explain when to go for:
    1. Full Repair
    2. Full Load
    3. Delta Load
    Please also give the loading procedure for all three scenarios above.
    Points will be awarded suitably for the answers.
    Thanks in advance, gurus.
    Regards...
    BW World

    Hi,
    To load from one ODS to another you use an export DataSource; this is called a datamart load.
    A full load is generally performed once, to load all of the historical data.
    After that you do an init load, and deltas follow, which load only the changed or new records.
    A full repair can be described as a full load with selections, but the main advantage of a full repair load is that it does not affect delta loads in the system. If you run a full load into a target with deltas running, you will have to initialize again for the deltas to continue; a full repair, however, will not affect the deltas.
    This is normally done when we lose some data or there is a data mismatch between the source system and BW.
    Check OSS Note 739863, 'Repairing data in BW', for all the details.
    You need to create an export DataSource (8<Source ODS>) and the update rules from the source ODS to the target ODS for loading.
    You also need InfoPackages for the init, full/full repair, and delta loads.
    The loading procedure has been discussed previously in SDN; you will find detailed information there.
    Thanks,
    JituK

  • Transaction Data Load from BW info provider

    Hi gurus,
    I am doing a transaction data load from the BW data feed to the BPC cube. When I validated the transformation file, the task completed successfully with some skipped records, as expected based on the conversion files.
    ValidateRecords = YES
    [List of conversion file]
    Conversion file: DataManager\ConversionFiles\EXAMPLES\TIMECONV.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\EXAMPLES\VERSIONCONV.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\EXAMPLES\ACCOUNTCONV.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\EXAMPLES\ENTITY.XLS!CONVERSION
    Record count: 25
    Accept count: 13
    Reject count: 0
    Skip count: 12
    This task has successfully completed
    but when I run the package, the load fails.
    /CPMB/MODIFY completed in 0 seconds
    /CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
    /CPMB/CLEAR completed in 0 seconds
    [Selection]
    InforProvide=ZPCAOB01
    TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\EXAMPLES\ABF_TRANS_LOAD.xls
    CLEARDATA= Yes
    RUNLOGIC= No
    CHECKLCK= No
    [Messages]
    Task name CONVERT:
    No 1 Round:
    Error occurs when loading transaction data from other cube
    Application: TEST Package status: ERROR
    This is a fresh system and we are doing the data load for the first time. We are using BPC NW 7.5 with SP4.
    Is there something we are missing that is supposed to be performed before starting the load for the first time?
    My transformation file is as below:
    *MAPPING
    Account=0ACCOUNT
    Currency=0CURRENCY
    DataSrc=*NEWCOL(INPUT)
    Entity=ZPCCCPLN
    ICP=*NEWCOL(ICP_NONE)
    Scenario=0VERSION
    Time=0FISCPER
    SIGNEDDATA=0AMOUNT
    *CONVERSION
    TIME=EXAMPLES\TIMECONV.XLS
    SCENARIO=EXAMPLES\VERSIONCONV.XLS
    ACCOUNT=EXAMPLES\ACCOUNTCONV.XLS
    ENTITY=EXAMPLES\entity.xls
    Thanks a lot in advance.
    Regards
    Sharavan

    Hi Gersh,
    Thanks for the quick response.
    I checked in SLG1 and I have the below error in the log.
    Class: CL_UJD_TRANSFORM_PROXY:CONSTRUCTOR Log:DATATYPE 3
    Class: CL_UJD_TRANSFORM_PROXY:CONSTRUCTOR Log:CURRENT_ROUND 1
    Error occurs when loading transaction data from other cube
    Message no. UJD_EXCEPTION137
    We are on SP4, BPC NW 7.5. Please advise.
    Regards
    Sharavan.

  • DTP problem on master data loading

    Dear Experts
    When we load 0Employee master data daily, sometimes the PSA data is not reflected in the master table.
    For example, the data in the PSA is as follows:
    10001    01/01/2008     01/31/2008    C1
    10001    02/01/2008     08/31/2008    C1
    10001    09/01/2008     12/31/9999    C2
    After the DTP request runs successfully, the data is as follows:
    10001    01/01/2008     01/31/2008    C1
    10001    02/01/2008     08/31/2008    C1
    10001    09/01/2008     12/31/9999    C1
    The C2 record in the PSA is not updated to the master table. This problem happens from time to time, but it is hard to find until users complain.
    When I manually delete the DTP request from the master attribute list and reload from the PSA to the master table, the problem is fixed.
    By the way, I have already set the DTP semantic group to Employee ID.
    Can anyone advise? Thanks in advance.
    Regards
    Jie

    jie miao,
    The previous master data is already available in BW; in the daily load you need to load only the changes, so the last records are enough.
    Eg:
    Initial load on 01-01-2008
    10001 01/01/2008 12/31/9999 C1
    --> Data loaded on 2nd Jan, and the above data is available in BW.
    On 03-11-2008 you will get the following records:
    10001 01/01/2008 11/02/2008 C1
    10001 09/01/2008 12/31/9999 C2
    Load these records on 03 Nov, and the system updates accordingly.
    Hope it Helps
    Srini

  • PA40 data load for infotype 0,1,2

    Hi All
    I am doing a data upload for infotypes 0, 1, and 2 using a PA40 data load. This works fine for one record, but if there are two, the screen doesn't refresh and it doesn't work. Has anybody had this problem before and can guide me?
    thanks
    C

    No, it's not resolved. These are the fields in my source file. The problem is that for the second record the PERNR remains the same and doesn't refresh, so it throws an error. Is there any way to make the PERNR blank when it comes to the next record?
    begda     werks     persg     persk     plans     anred     vorna     midnm     nachn     perid     gesch     gbdat     gblnd     natio     sprsl     famst     anzkd
    cheers
    C

  • Error in data load from application server

    Well, this problem again!
    In the datasource for a flat-file data load, on the Extraction tab I selected Load Text-Type File From Application Server.
    On the Proposal tab:
    Converter: Separated with Separator (for Example, CSV).
    No. of Data Records: 9198
    The data load was successful.
    I added 1 record:
    Converter: Separated with Separator (for Example, CSV).
    No. of Data Records: 9199
    Now the error message "Cannot convert character sets for one or more characters" appears.
    I looked at line 9199 in the file and it does not have any special characters; I used AL11 and the debugger to inspect this line.
    When I load the file from the local workstation, this error does not occur.
    What's happening?

    Hi Rodrigo,
    What type of logical file path have you created in the application server for loading this file?
    Is it a UNIX file path or an ASCII file path?
    Did you check how the file looks in the application server (AL11)?
    Prathish
