J1INQEFILE - efile generation - Exported file shows Duplicate records.

Dear Team,
When I execute J1INQEFILE, I am facing a problem with the e-file generation, i.e. the exported Excel file. When I execute the report and export the file to Excel on the desktop, I can see duplicate records.
For example, on execution of J1INQEFILE I can see 4 records on the SAP screen, whereas the file exported to the desktop shows me 2 more identical records, i.e. 6 records. As a result, in the SAP system I see a base amount of 40000, i.e. 10000 for each record; the Excel sheet, on the contrary, shows me 60000, i.e. 6 records of 10000 each (because of the 2 extra duplicated records), and it also shows the TDS amount wrong. How are the records getting duplicated? Is there any SAP Note to fix this? We are debugging this but have no clue so far.
Please assist with this issue.
Thanks in advance!

Dear Sagar,
I am an ABAPer.
I came across the same kind of situation for one of our clients: when we executed J1INQEFILE and exported the output to an Excel file, we got duplicate records.
I debugged the program and checked the point of e-file generation; duplicate records were being appended to the internal table that is downloaded to Excel. So I pulled the document number into that internal table and used DELETE ADJACENT DUPLICATES comparing all fields, and was thereby able to resolve the issue.
I hope the same logic helps, or guides you to proceed with the help of an ABAPer.
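A minimal sketch of that kind of correction, assuming the records sit in one internal table right before the Excel download; lt_efile is a placeholder name, not the actual variable used in J_1I_QER_EFILE:
" sketch only: sort the download table and drop rows that are complete duplicates,
" after the document number has been pulled into the table as described above
SORT lt_efile.
DELETE ADJACENT DUPLICATES FROM lt_efile COMPARING ALL FIELDS.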
Regards,
Kalyan

Similar Messages

  • Showing Duplicate records in a report with the count of their occurrence

    Hi Members,
    I am new to BI and need your suggestions to achieve one piece of functionality. A report is required with the columns Incident ID, Task ID and Task Name. The original report is below:
    In the above report I have to display only the duplicated records, and each of them only once. Distinct records are not required to be shown. That is, if the same Task Name is associated with the same Incident ID more than once, it should be displayed once, with the count of its occurrences in a separate column, say Count. For example, the task 'TASK_MANUAL_KCI' is associated 4 times with the same Incident ID INC000000001434, so the report should display this task one time with a count of 4. Similarly for INC000000000943, where the task 'IPCG Diagnostic Template' is associated 2 times; the Count is 2 for this task.
    Other records should not be displayed in the report.
    I would highly appreciate a quick response. Please suggest.
    Thanks,
    Neha Pateria

    I tried this, Gill, but the result is a bit different:
    It seems some small modifications are needed. The report should be displayed like the one below, where it says how many times the same Task is associated with the same incident. E.g. the task 'TASK_CIRCUIT_RESOLVER' is associated with the incident 'INC000000001434' 4 times, and 'IPCG Diagnostic Template' is associated with 'INC000000000943' 2 times. Similarly for the other records.
    But I am really thankful to you for giving me the logic to proceed further. I tried '=RunningCount([Task Name]; Row; ([Task Name]; [Incident ID]))'. It gave me the desired results:
    The only thing that still needs to be done is to show a single row per Task associated with the Incident.
    Thanks,
    Neha Pateria

  • Time-out error in J1INQEFILE while generating the e-TDS file

    Hi,
    Whenever I execute transaction J1INQEFILE it gives a time-out error. In the dump the following message appears:
    Termination occurred in the ABAP program "J_1I_QER_EFILE" - in "MOVE_VENDOR".
    The main program was "J_1I_QER_EFILE ".
    In the source code you have the termination point in line 1410
    of the (Include) program "J_1I_QER_EFILE_FETCH_DATA".
    I checked in the program. Can you tell me how to optimize this query, after making a Y copy of the program, so that this error can be avoided?
    Thanks
    SP
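    The usual cause of a TIME_OUT in a data-fetch include is a database SELECT executed once per document inside a loop. Below is a hedged sketch of how such a spot is often reworked in a Y-copy; every name in it (ty_doc, lt_docs, the LFA1 access itself) is an illustrative assumption, not code taken from J_1I_QER_EFILE_FETCH_DATA.
    " lt_docs is assumed to be the already-selected document table with a vendor field
    TYPES: BEGIN OF ty_doc,
             belnr TYPE belnr_d,
             lifnr TYPE lifnr,
           END OF ty_doc.
    TYPES: BEGIN OF ty_vendor,
             lifnr TYPE lifnr,
             name1 TYPE name1_gp,
           END OF ty_vendor.
    DATA: lt_docs    TYPE STANDARD TABLE OF ty_doc,
          lt_vendors TYPE STANDARD TABLE OF ty_vendor,
          ls_vendor  TYPE ty_vendor.

    IF lt_docs[] IS NOT INITIAL.
      " one array fetch instead of one SELECT SINGLE per loop pass
      SELECT lifnr name1
        FROM lfa1
        INTO TABLE lt_vendors
        FOR ALL ENTRIES IN lt_docs
        WHERE lifnr = lt_docs-lifnr.
      SORT lt_vendors BY lifnr.
    ENDIF.
    " inside the document loop, a READ TABLE lt_vendors ... WITH KEY lifnr = ... BINARY SEARCH
    " then replaces the per-document SELECT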

    Hi Prateek,
    Thanks for your reply. I am able to open the URL from XI/PI.
    I tried testing the same URL using XI 3.0 and it works fine; I get the required data. But in PI 7.1 I get an error.
    Error is:
    In MONI:
    com.sap.engine.interfaces.messaging.api.exception.MessagingException: java.net.ConnectException: Connection refused: connect
    In RWB:
    Message processing failed. Cause: com.sap.engine.interfaces.messaging.api.exception.MessagingException: java.net.ConnectException: Connection timed out: connect
    What could be the reason?
    Regards,
    Venu

  • ABAP Query Shows duplicate records

    Hi
    I have an ABAP query which links the VIQMEL table (notification number) and the VBAK table (order & net value). The report output is:
    Notification A   Order 10   SO value 1000
    Notification B   Order 10   SO value 1000
    Notification C   Order 10   SO value 1000
    Notification D   Order 10   SO value 1000
    The problem is that the value is repeated 4 times when we only want it shown once. Is this possible to do, and if so how can it be done?
    thanks
    Joe

    Desp09 wrote:
    > Hi
    > I have connected
    > EKKO-EBELN to EKPO
    > EKPO-EBELN and EKPO-EBELP to the same of EBAN
    > also EKPO-BANFN and EKPO-BNFPO to the same of EBAN
    > EBAN-BANFN and EBAN-BNFPO to VBEP
    > VBEP-VBELN to VBAK-VBELN
    > VBAK-VBELN to VBPA-VBELN
    > VBPA-ADRNR to ADRC-ADDRNUMBER.
    > This is the first time I have created a query, so if you can kindly explain the logic of the error it would help in my future reports.
    > Thanks
    > Priya
    Hi Priya,
    You can simplify the process if you make use of a logical database (LDB); the standard LDBs available in the system can fetch the same report.
    You can define multiple InfoSets and pull in the fields for an easier approach.
    Regards
    Shiva
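    Coming back to Joe's original question: if the one-to-many join itself cannot be changed, a small post-processing step (in the InfoSet's additional code or in a wrapper report) can blank the repeated net value so it prints only once per order. A hedged sketch with purely illustrative names (ty_out, lt_output):
    TYPES: BEGIN OF ty_out,
             qmnum TYPE qmnum,       " notification
             vbeln TYPE vbeln_va,    " sales order
             netwr TYPE netwr_ak,    " net value
           END OF ty_out.
    DATA: lt_output TYPE STANDARD TABLE OF ty_out,   " assumed filled by the query join
          lv_prev   TYPE vbeln_va.
    FIELD-SYMBOLS: <ls_out> TYPE ty_out.

    SORT lt_output BY vbeln qmnum.
    LOOP AT lt_output ASSIGNING <ls_out>.
      IF <ls_out>-vbeln = lv_prev.
        CLEAR <ls_out>-netwr.        " print the order value only on its first row
      ENDIF.
      lv_prev = <ls_out>-vbeln.
    ENDLOOP.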

  • SIS info structures show duplicate records after running OLI7, OLI8, OLI9

    Hi,
    I ran OLI7, OLI8 and OLI9 on the relevant sales documents because some info structures were not getting updated. I selected the sales documents for specific dates and then entered those documents' start and end numbers in these transactions.
    Let's say the old data in the SIS info structure was for the dates 01-Jan-10 to 08-Jan-10,
    and data was missing for the days 08-Jan-10 to 16-Jan-10 (some of the documents from 8th Jan were missing).
    Now when I ran it on the documents for the dates 08-Jan-10 to 16-Jan-10, the data in that info structure was duplicated for 08-Jan-10 (meaning that all the previous data which was already in the info structure was duplicated).
    Can anyone please guide me on how to get rid of this error?
    Thanks.

    With OLIX we deleted the update version which we had created through OLI7, OLI8 and OLI9, and then re-executed the update transactions with the correct date options; thus the problem was resolved.

  • Where is the Show Duplicates option in the new iTunes 8.0?

    The thing I loved about the previous iTunes was that it had a function to show the duplicates, so I could keep only one song instead of 4 versions of the same song from different CDs.
    So my question is, where did the "Show Duplicates" function go?

    Hi,
    Try: File>Show Duplicates.
    arg9182

  • SQL*Loader loads duplicate records even when there is a PK defined

    Hi,
    I have created a table with a PK on one of the columns and loaded the table using SQL*Loader. The flat file has duplicate records. It loaded all the records without any error, but now the index has become unusable.
    The requirement is to fail the process if there are any duplicates. Please help me understand why this is happening.
    Below is the ctl file
    OPTIONS(DIRECT=TRUE, ERRORS=0)
    UNRECOVERABLE
    load data
    infile 'test.txt'
    into table abcdedfg
    replace
    fields terminated by ',' optionally enclosed by '"'
    (col1,
    col2)
    I defined the PK on col1.

    Check out..
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_modes.htm#sthref1457
    It states...
    During a direct path load, some integrity constraints are automatically disabled. Others are not. For a description of the constraints, see the information about maintaining data integrity in the Oracle Database Application Developer's Guide - Fundamentals.
    Enabled Constraints
    The constraints that remain in force are:
    NOT NULL
    UNIQUE
    PRIMARY KEY (unique constraints on not-null columns)
    Since the OP has the primary key in place before starting the DIRECT path load, this is contradictory to what the documentation says, or is it probably a bug?

  • Remove duplicate records in the Footer

    Hello all:
    I only want to print the final result (distinct records) in the footer section; therefore, the header and details sections are suppressed. However, I'm getting duplicate records in the footer section. How do I suppress the duplicate records?
    Please help.....Thank you so much in advance
    Here are my formulas : 
    Header
    WhilePrintingRecords;
    StringVar ConCat:=""
    Details
    WhilePrintingRecords;
    StringVar Concat;
    ConCat := Concat + Trim(ToText({MEDICATE.STARTDATE},"MM/dd/yyyy") + chr(7) + {MEDICATE.DESCRIPTION}) + chr(13)
    Footer
    WhilePrintingRecords;
    StringVar ConCat;
    Here's my desired output:
    HUMALOG PEN 100 UNIT/ML SOLN
    LANTUS 100 U/ML SOLN
    METFORMIN HCL 1000 MG TABS
    Below is an example my current output:
    11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
    11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
    11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
    11/24/2009-LANTUS 100 U/ML SOLN
    11/24/2009-LANTUS 100 U/ML SOLN
    11/24/2009-LANTUS 100 U/ML SOLN
    01/24/2007-METFORMIN HCL 1000 MG TABS
    01/24/2007-METFORMIN HCL 1000 MG TABS
    01/24/2007-METFORMIN HCL 1000 MG TABS
    11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
    11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
    11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
    11/24/2009-LANTUS 100 U/ML SOLN
    11/24/2009-LANTUS 100 U/ML SOLN
    11/24/2009-LANTUS 100 U/ML SOLN
    01/24/2007-METFORMIN HCL 1000 MG TABS
    01/24/2007-METFORMIN HCL 1000 MG TABS
    01/24/2007-METFORMIN HCL 1000 MG TABS
    11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
    11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
    11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
    11/24/2009-LANTUS 100 U/ML SOLN
    11/24/2009-LANTUS 100 U/ML SOLN
    11/24/2009-LANTUS 100 U/ML SOLN
    01/24/2007-METFORMIN HCL 1000 MG TABS
    01/24/2007-METFORMIN HCL 1000 MG TABS
    01/24/2007-METFORMIN HCL 1000 MG TABS

    Sorry, I forgot to mention I'm already grouping by Patient Name, and that's where I have created a formula called
    MEDSHEAD, which I've modified to add the concat line.
    WhilePrintingRecords;
    //StringVar ConCat:="";
    StringVar ConCat;
    ConCat := Concat + Trim(ToText({MEDICATE.STARTDATE},"MM/dd/yyyy") + chr(7) + {MEDICATE.DESCRIPTION}) + chr(13);
    In the detail section I'm displaying the patient's name, DOB and gender, plus the hemoglobin lab result and the date of the lab result.
    The patient's footer section is where I have the formula called medsfooter (WhilePrintingRecords; StringVar Concat; Concat;).
    The changes that I made eliminate some of the duplicate records; however, the variable is no longer being initialized - what I mean is I am getting the previous patient's medication information:
    See output below:
    Patient One              4/25/1958     F
    6/26/09 12.2
    9/2/2008-Glipizide XL 5 MG TB24
    4/2/2009-Novolog 100 unit/ml soln
    Patient Two          12/11/45       F
    9/2/2008-Glipizide XL 5 MG TB24
    4/2/2009-Novolog 100 unit/ml soln
    11/24/2009-Humalog Pen 100 Unit/ML soln

  • Display duplicate records

    Hi,
    I want to build a report which shows the duplicate records in a table. We can fetch duplicate records by using the ROWID in SQL, but how do I make this work in OBIEE? The table does not have a unique key.

    Try something similar to http://carpediemconsulting.wordpress.com/2008/08/13/displaying-duplicate-values-in-obiee-reports/
    OR
    use a Direct Database Request to write direct SQL as you would in the database.

  • Strange duplicate files showing up

    I primarily develop in PHP running MAMP on my Mac for the testing server. For a few months now I'm getting all kinds of duplicate files showing up as I'm editing and testing on my local server. This never happened before.
    Here's my usual workflow. The testing server folder is set to the same folder that I use for my site folder. I have two reasons for this. When the testing server folder and the site folder are the same I can just go to localhost/mySite and see everything working without loading up files to a different testing server. It's much easier and uses less disk space than having duplicate files.
    Second reason, when I'm working on debugging pages a simple save / option tab / and refresh gets me to the page much faster than when I have a separate testing server folder setup and have to use Option+F12 to upload the file and render it.
    The bug that's popped up is the generation of duplicate files. For example, while working on main.php, when I look at or edit a recordset I often end up with a file called main_QFK9H5ZWYH.php or something similar in the folder. This file is not used for an F12 preview, never shows up in any localhost previews, and, as far as I can tell, is never used by Dreamweaver for anything. The only thing that happens is my site folder fills up with a bunch of files that are not needed for anything, clutter up the directory, and, unless I'm very careful, end up getting uploaded to the server as just a bunch of junk.
    Anyone else seeing this? I thought for a while that it was a preferences setting that I fouled up. Deleting the preferences file and starting over didn't solve the problem.
    This was especially problematic today as I was going through the entire site prior to publishing and I ended up with more than 40 duplicate files. For the main.php file I ended up with 6 versions in less than 5 minutes of testing. I'd love to shut off whatever is causing this.
    One last thing. It makes no difference whether I'm editing on my laptop or on the desktop at home, or on a Windows machine at the clients office.
    Thanks for any ideas.

    Yup, preview in secondary browser is turned off. Further testing reveals that if I have a scripting error, or I use the Cmd/Option shortcuts to bring up a direct link to CSS, or if I try to edit an existing recordset from the Bindings panel, I'm very likely to get a new temp file. It's never opened in an edit window and the temp file never shows up in the site preview unless I browse directly to it.
    Weird and annoying for sure. When I get this intranet site completed I'll try and diagnose the problem. No time in the day to do that right now. Deadlines and family commitments don't leave me with the spare hour or so I'll need to figure out what's going on.
    OSX 10.7.4   Dreamweaver CS6 Version 12.0 Build 5808
    No errors in permissions. Ran a repair anyway. No change.
    Also, as per my other post, Dreamweaver is doing this on 3 different systems: a Dell laptop running Windows Ultimate, my iMac and my MacBook Pro.

  • How to find duplicate records contained in a flat file

    Hi Experts,
    For my project I have written a program for flat file upload.
    Requirement 1
    In the flat file there may be some duplicate records, like:
    Field1   Field2
    11        test1
    11        test2
    12        test3
    13        test4
    Field1 is primary key.
    Can you please let me know how I can find the duplicate records?
    Requirement 2
    The flat file contains the header row as shown above
    Field1   Field2
    How can our program skip this record and start reading / inserting records from row no. 2, i.e.
    11        test1
    onwards.
    Thanks
    S
    FORM upload1.
    DATA : wf_title TYPE string,
    lt_filetab TYPE filetable,
    l_separator TYPE char01,
    l_action TYPE i,
    l_count TYPE i,
    ls_filetab TYPE file_table,
    wf_delemt TYPE rollname,
    wa_fieldcat TYPE lvc_s_fcat,
    tb_fieldcat TYPE lvc_t_fcat,
    rows_read TYPE i,
    p_error TYPE char01,
    l_file TYPE string.
    DATA: wf_object(30) TYPE c,
    wf_tablnm TYPE rsdchkview.
    wf_object = 'myprogram'.
    DATA i TYPE i.
    DATA:
    lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
    lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
    lt_idocstate TYPE rsarr_t_idocstate,
    lv_subrc TYPE sysubrc.
    TYPES : BEGIN OF test_struc,
    /bic/myprogram TYPE /bic/oimyprogram,
    txtmd TYPE rstxtmd,
    END OF test_struc.
    DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
    DATA: wa_ztext TYPE /bic/tmyprogram,
    myprogram_temp TYPE ziott_assum,
    wa_myprogram TYPE /bic/pmyprogram.
    DATA : test_upload TYPE STANDARD TABLE OF test_struc,
    wa2 TYPE test_struc.
    DATA : wa_test_upload TYPE test_struc,
    ztable_data TYPE TABLE OF /bic/pmyprogram,
    ztable_text TYPE TABLE OF /bic/tmyprogram,
    wa_upld_text TYPE /bic/tmyprogram,
    wa_upld_data TYPE /bic/pmyprogram,
    t_assum TYPE ziott_assum.
    DATA : wa1 LIKE test_upload.
    wf_title = text-026.
    CALL METHOD cl_gui_frontend_services=>file_open_dialog
    EXPORTING
    window_title = wf_title
    default_extension = 'txt'
    file_filter = 'Tab delimited Text Files (*.txt)'
    CHANGING
    file_table = lt_filetab
    rc = l_count
    user_action = l_action
    EXCEPTIONS
    file_open_dialog_failed = 1
    cntl_error = 2
    OTHERS = 3. "#EC NOTEXT
    IF sy-subrc <> 0.
    EXIT.
    ENDIF.
    LOOP AT lt_filetab INTO ls_filetab.
    l_file = ls_filetab.
    ENDLOOP.
    CHECK l_action = 0.
    IF l_file IS INITIAL.
    EXIT.
    ENDIF.
    l_separator = 'X'.
    wa_fieldcat-fieldname = 'test'.
    wa_fieldcat-dd_roll = wf_delemt.
    APPEND wa_fieldcat TO tb_fieldcat.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    CLEAR wa_test_upload.
    " Upload file from front-end (PC)
    " File format is tab-delimited ASCII
    CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
    filename = l_file
    has_field_separator = l_separator
    TABLES
    " data_tab = i_mara            (i_mara is not declared; test_upload is used)
    data_tab = test_upload
    EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    no_batch = 3
    gui_refuse_filetransfer = 4
    invalid_type = 5
    no_authority = 6
    unknown_error = 7
    bad_data_format = 8
    header_not_allowed = 9
    separator_not_allowed = 10
    header_too_long = 11
    unknown_dp_error = 12
    access_denied = 13
    dp_out_of_memory = 14
    disk_full = 15
    dp_timeout = 16
    OTHERS = 17.
    IF sy-subrc <> 0.
    EXIT.
    ELSE.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    IF test_upload IS NOT INITIAL.
    DESCRIBE TABLE test_upload LINES rows_read.
    CLEAR : wa_test_upload,wa_upld_data.
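    " Validate each uploaded row: skip rows with an initial key, fill the
    " constant attributes, and build the data and text tables for posting.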
    LOOP AT test_upload INTO wa_test_upload.
    CLEAR : p_error.
    rows_read = sy-tabix.
    IF wa_test_upload-/bic/myprogram IS INITIAL.
    p_error = 'X'.
    MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
    CONTINUE.
    ELSE.
    TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
    wa_upld_text-txtmd = wa_test_upload-txtmd.
    wa_upld_text-txtsh = wa_test_upload-txtmd.
    wa_upld_text-langu = sy-langu.
    wa_upld_data-chrt_accts = 'xyz1'.
    wa_upld_data-co_area = '12'.
    wa_upld_data-/bic/zxyzbcsg = 'Iy'.
    wa_upld_data-objvers = 'A'.
    wa_upld_data-changed = 'I'.
    wa_upld_data-/bic/zass_mdl = 'rrr'.
    wa_upld_data-/bic/zass_typ = 'I'.
    wa_upld_data-/bic/zdriver = 'yyy'.
    wa_upld_text-langu = sy-langu.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
    APPEND wa_upld_data TO ztable_data.
    APPEND wa_upld_text TO ztable_text.
    ENDIF.
    ENDLOOP.
    " sort first so that DELETE ADJACENT DUPLICATES catches all full duplicates
    SORT ztable_data.
    SORT ztable_text.
    DELETE ADJACENT DUPLICATES FROM ztable_data.
    DELETE ADJACENT DUPLICATES FROM ztable_text.
    IF ztable_data IS NOT INITIAL.
    CALL METHOD cl_rsdmd_mdmt=>factory
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_r_mdmt = lr_mdmt
    EXCEPTIONS
    invalid_iobjnm = 1
    OTHERS = 2.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    **Lock the Infoobject to update
    CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
    EXPORTING
    i_objnm = wf_object
    i_scope = '1'
    i_msgty = rs_c_error
    EXCEPTIONS
    foreign_lock = 1
    sys_failure = 2.
    IF sy-subrc = 1.
    MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
    EXIT.
    ELSEIF sy-subrc = 2.
    MESSAGE i108(zddd_rr) WITH wf_object.
    EXIT.
    ENDIF.
    *****Update Master Table
    IF ztable_data IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'M'
    " i_t_attr = lt_attr           (lt_attr is not declared in this extract)
    TABLES
    i_t_table = ztable_data
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '054'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    MESSAGE e054(zddd_rr) WITH 'myprogram'.
    ELSE.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'S'
    txtnr = '053'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    *endif.
    *****update Text Table
    IF ztable_text IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'T'
    TABLES
    i_t_table = ztable_text
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '055'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    ENDIF.
    ELSE.
    MESSAGE s178(zddd_rr).
    ENDIF.
    ENDIF.
    COMMIT WORK.
    CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_chktab = wf_tablnm
    EXCEPTIONS
    name_error = 1.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ****Release locks on Infoobject
    CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
    EXPORTING
    i_objnm = 'myprogram'
    i_scope = '1'.
    ENDIF.
    ENDIF.
    PERFORM data_selection .
    PERFORM update_alv_grid_display.
    CALL FUNCTION 'MESSAGES_SHOW'.
    ENDFORM.

    Can you please let me know how I can find out the duplicate record.
    You need to split the records from the flat-file structure into your internal table and use DELETE ADJACENT DUPLICATES comparing the key fields, roughly as sketched below.
    SPLIT flat_str AT tab_space INTO wa_f1 wa_f2.
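    A minimal sketch covering both requirements; the names (ty_row, lt_upload, lt_dups) are illustrative rather than the poster's actual structures, and it assumes Field1 is the key and the first row of the file is the header:
    TYPES: BEGIN OF ty_row,
             field1 TYPE char10,
             field2 TYPE char20,
           END OF ty_row.
    DATA: lt_upload TYPE STANDARD TABLE OF ty_row,   " assumed filled by GUI_UPLOAD
          lt_dups   TYPE STANDARD TABLE OF ty_row,
          ls_row    TYPE ty_row,
          lv_prev   TYPE char10.

    " Requirement 2: drop the header row delivered with the file.
    DELETE lt_upload INDEX 1.

    " Requirement 1: collect every row whose key repeats, then remove the repeats.
    SORT lt_upload BY field1.
    LOOP AT lt_upload INTO ls_row.
      IF ls_row-field1 = lv_prev.
        APPEND ls_row TO lt_dups.            " duplicate key found
      ENDIF.
      lv_prev = ls_row-field1.
    ENDLOOP.
    DELETE ADJACENT DUPLICATES FROM lt_upload COMPARING field1.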

  • Duplicate value in export file

    Hi,
    I have a SQL report where I have implemented download to Excel using the print attributes. When I export the report to Excel, there are a few columns whose values get duplicated in the spreadsheet, whereas those column values are blank in the report.
    for example.
    Query:
    select col1, col2, col3, col4 from table1;
    The report displays the correct values for col1, col2, col3 and col4, but in the export file the values for col1, col2 and col3 are correct while col4 picks up the value of col3.
    Any help? Thanks.
    --Manish

    Hi Scott,
    Could you please have a look at it.
    http://apex.oracle.com/pls/otn/f?p=47427:12
    When you click download_excel at the bottom of the page, the export file will show you the duplicate values for the columns [G-L].
    Thanks again for your help.
    --Manish
    Message was edited by:
    Manish Jha

  • BUG: LR 2.0 shows a duplicate drive after an Edit in Photoshop export

    Lightroom 2.0 shows a new duplicate drive after an Edit in Photoshop export. The file opens as a DNG in Photoshop, saves as a PSD, and is saved to the right folder on the hard drive. But Lightroom now shows two "Drive G"s (that's my image vault drive letter). The folders are correct; the second drive only contains the edited PSD files. Show in Explorer displays the file in the right location alongside the original DNG file. This is a random event; not every edited file displays under the second drive. I have a total of 6 files affected (I had more, but I moved my image vault to a new drive and fixed the ten I had at that time).
    Vista Ultimate 32bit SP1
    AMD 64 X2 dual-core 6000+ 3 GHz
    6 GB 666 MHz RAM, PAE force enabled
    NVIDIA GeForce 7600 256 MB video card (1.75 GB total with system shared RAM)
    Primary hard drive: 500GB
    Hard drive 2: 400GB (system and photoshop paging files and lightroom catalog drive)
    Hdrive 3: 1TB (primary image storage)
    User Account Control turned off
    system graphics set to performance
    Dual display
    Lightroom 2.0
    photoshop CS3

    No, I had to do it manually. When I had Lightroom 2, here's what would happen.
    1. My external drive would show up, but all the photos would be marked as missing.
    2. I would then relocate the photos, which I had to do directory by directory. At this time I did not have all of them under a master directory.
    3. As I relocated them, the second drive would show up, which would contain all my directories as I located them.
    4. After I had relocated all the directories, I would delete the now empty duplicate - which had been the original - drive.
    5. After I shut down LR and restarted, the whole process would begin again.
    6. After I installed 2.1, I put all my directories under a parent directory and relocated the parent; it found all the files, automatically deleted the ghost duplicate, and I have not had the problem since.
    Hope all this makes sense,
    Good luck

  • HT1349 Why won't "show duplicates" show up in either my File or Edit menus?  And, why does my entire library keep getting duplicated?

    I have a new computer and I'm trying to reload my library and my playlists.  I also have a new iPhone 5 and I'm trying to load that too.  I'm getting duplicates of everything in my library.  When I try to find the "show duplicates" tab in my File menu and/or my Edit menu it isn't there.  Any ideas?  Thanks!

    Yes, it has always worked in the past.  The past projects audio clips do not change names.  I obviously did something different in this current project, but unfortunately I have no idea what.

  • Script to merge multiple CSV files together with no duplicate records.

    I would like a script to merge multiple CSV files together with no duplicate records.
    None of the files have headers, and column A contains a unique ID. What would be the best way to accomplish this?

    OK, here is my answer:
    Two files in a directory with no headers.
    The first column is the unique ID; in the second column you can put whatever you want.
    The headers are added when using the Import-Csv cmdlet.
    first file contains :
    1;a
    2;SAMEID-FIRSTFILE
    3;c
    4;d
    5;e
    second file contains :
    6;a
    2;SAMEID-SECONDFILE
    7;c
    8;d
    9;e
    The second file contains the line 2;SAMEID-SECONDFILE, which has the same ID (2) as a line in the first file.
    The code:
    $i = 0
    foreach ($file in (Get-ChildItem d:\yourpath)) {
        if ($i -eq 0) {
            # the first file becomes the reference set
            $ref = Import-Csv $file.FullName -Header id,value -Delimiter ";"
        }
        else {
            $temp = Import-Csv $file.FullName -Header id,value -Delimiter ";"
            foreach ($line in $temp) {
                # only keep lines whose id is not already in the reference set
                if (!($ref.id -contains $line.id)) {
                    $objet = New-Object PSObject
                    Add-Member -InputObject $objet -MemberType NoteProperty -Name id -Value $line.id
                    Add-Member -InputObject $objet -MemberType NoteProperty -Name value -Value $line.value
                    $ref += $objet
                }
            }
        }
        $i++
    }
    $ref
    $ref should return:
    id  value
    --  -----
    1   a
    2   SAMEID-FIRSTFILE
    3   c
    4   d
    5   e
    6   a
    7   c
    8   d
    9   e
    (Get-ChildItem d:\yourpath) -> yourpath is the directory containing the 2 CSV files.
