Enable Duplicate Record Functionality

Hello All,
I have just completed a custom 6i form based on TEMPLATE.fmb.
One of my requirements is to be able to duplicate an entire record using the standard Edit > Duplicate > Record menu functionality.
Is there any way to enable this functionality?
I enabled the menu item with app_special.enable; the menu item is now enabled, but it won't duplicate a record.
Thanks for any help,
Bradley

Review the Developer's Guide, "Controlling Records in a Window" topic. It explains why this is disabled by default and the cautions to observe when implementing it, and gives an example of its use.

Similar Messages

  • Implementing Duplicate record functionality

    Hi,
    In my application, there is one master table (T1) and 11 model-specific detail tables (MT1-MT11). They are related by a PK/FK relationship on config_id in both tables. Depending on the value of a particular field in the detail table, one of the 11 tables is populated with some data.
    Now, I have a search page which is based on the master table (T1). I need to provide a 'duplicate record' functionality on the search page itself, such that when the user selects a particular record and presses the 'Duplicate' button, the record in T1 and its corresponding detail table (say MT1) are duplicated. The user should also have the facility to change some data in the T1 table before committing the changes.
    The following is my flow of thoughts:
    * From the search page, find out the record that has been selected (config_id is the PK).
    * Redirect to another page (based on a new VO) where the current record is queried again and some fields are made updateable (the others remain the same). A new config_id is generated from a sequence.
    * A new record for the detail table (MT1) should be created with the new config_id, but all other data remaining the same as before.
    Is this the correct approach? How should I go about doing this?
    Thanks
    Ashish

    Hi,
    This is what I have done till now.
    My search is based on VO1. I select one of the records (using a singleSelect radio button) and press the 'Copy' button. In the controller of this search page, I check for the button press and then invoke a method in the AM where I get the config_id (PK) of the record that has been selected.
    Then I query another VO (VO2) for this config_id and get the current row. I create another row from VO2 and assign each attribute from the oldRow to the newRow. I then alter some values in the newRow (such as a new ConfigId) and nullify the WHO columns.
    The code I have written in the AM is:
        SketchDetailsCopyVOImpl vo = (SketchDetailsCopyVOImpl)getSketchDetailsCopyVO1();
        vo.initQuery(ConfigId);
        OARow oldRow = (OARow)vo.getRowAtRangeIndex(0);
        vo.insertRow(vo.createRow()); 
        OARow newRow = (OARow)vo.first();
        for (int i = 1; i < oldRow.getAttributeCount(); i++)
        {
          System.out.println(i + " -- " + oldRow.getAttribute(i));
          newRow.setAttribute(i, oldRow.getAttribute(i));
        }
        OADBTransaction txn = getOADBTransaction();
        Number newConfigId = txn.getSequenceValue("NRCONFIG_DETAILS_S");
        newRow.setAttribute("ConfigId",newConfigId);
        newRow.setAttribute("SiteName",null);
        newRow.setAttribute("PointNumber",null);
        newRow.setAttribute("ConfigStatusFlag","D");
        newRow.setAttribute("CreationDate",null);
        newRow.setAttribute("CreatedBy",null);
        newRow.setAttribute("LastUpdateDate",null);
        newRow.setAttribute("LastUpdatedBy",null);
        newRow.setAttribute("LastUpdateLogin",null);
        vo.setCurrentRow(newRow);
    I then redirect to another page where the user can fill in the other required fields (such as SiteName and PointNumber). On the 'Apply' button, I have simply called transaction.commit();
    Although this works, the WHO columns are not automatically populated.
    Is this the correct way of doing it?
    > Sounds fine. When you said new VO I am assuming that represents MT1 and you are just querying up the current record from T1 and pushing the values to the new record of MT1.
    Table MT1 contains some additional data that the user doesn't see in the search page, nor is his intervention required while querying. So, once the record in T1 has been duplicated, I will simply duplicate the record in MT1, replacing the config_id with the new config_id.
    Thanks
    Ashish
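The flow above (clone every attribute, fetch a fresh key from a sequence, clear the WHO/audit columns so the framework or user repopulates them) is a framework-independent copy-record pattern. A minimal Python sketch of the same idea, where the sequence and all column names are stand-ins rather than real OAF objects:

```python
from itertools import count

# Hypothetical stand-in for a database sequence such as NRCONFIG_DETAILS_S.
config_id_seq = count(1000)

# Audit ("WHO") columns that the framework should repopulate on commit.
WHO_COLUMNS = ("CreationDate", "CreatedBy", "LastUpdateDate",
               "LastUpdatedBy", "LastUpdateLogin")

def duplicate_record(row, pk="ConfigId", clear=()):
    """Return a copy of `row` with a new primary key, cleared audit
    columns, and any user-entered columns reset to None."""
    new_row = dict(row)                   # copy every attribute
    new_row[pk] = next(config_id_seq)     # fresh key from the "sequence"
    for col in WHO_COLUMNS + tuple(clear):
        new_row[col] = None               # let defaults / the user fill these
    return new_row

old = {"ConfigId": 7, "SiteName": "A", "PointNumber": 3,
       "CreatedBy": "ASHISH", "CreationDate": "2009-01-01",
       "LastUpdateDate": None, "LastUpdatedBy": None, "LastUpdateLogin": None}
new = duplicate_record(old, clear=("SiteName", "PointNumber"))
```

The original row is untouched; only the copy gets the new key and cleared fields, which mirrors the VO2 newRow approach described above.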

  • Enabling duplicate records in infopackage?

    Hi,
    Is there any provision to have duplicate records in the datasource?
    I am uploading from a flat file, and the flat file has too many duplicate records!
    This is for testing purposes only!
    Is it possible to have duplicate records in BW?
    Thanks,
    Ravi

    Hi Ravi,
    If you load to an ODS there is an option in settings, "Unique Data Records".
    If you check this option it will allow only unique records. From the ODS to the cube, make this ODS a data mart and then load to the cube.
    In the cube there is no such option.
    Cheers,
    Kumar.

  • Forms 6i duplicate record copy

    I am using Forms 6i. I want to insert a new record in the form based on an existing record (so I do not have to type all the values).
    Is there a shortcut key that I can use to copy the entire record, so that when I insert a new one, I just paste the record I copied?

    That being the case, when your form is running, bring up the Default Key Mapping screen (should be Ctrl+F1) to see a list of all keystrokes you can use.  The F4 key defaults (in Forms 6i) to the "Duplicate Record" function.  Give that a try.
    Craig...

  • Duplicate Check Function in CRM when creating new BPs?

    Hi
    Three questions:
    1. does anyone know if there is some standard way in CRM (we are on CRM5.0) to enable a duplicate check to happen when a new business partner is created - one that will warn that a similar BP already exists when the user goes to save (or even better, when part way through creation)? I have already looked into using the data cleansing cases in CRM, which is ok for existing duplicate records, but doesn't help with stopping them being created in the first place.
    2. Is there a standard function/report available that will go through the database and identify any potential duplicate records? Or, does anyone know of some standard function calls/bapis etc that can be used by our developers to design something to do this for us?
    3. Failing the above two, are there any recommendations out there for 3rd party add-ins for address/duplicate checks?
    Any insight would be most gratefully received!
    Regards
    Cara

    Hi Cara
    SAP Business Address Services (BAS) is used for maintaining BP address data.
    You can maintain any number of addresses for each business partner. One address per business partner is always flagged as being the standard address. You can define address usages by assigning the different addresses to the relevant business processes.
    Postal data and information on different communication types, such as phone numbers, fax numbers and e-mail, can be assigned to the address. If you have only the name and the (mobile) phone number of a business partner but you don't know the address, you can create a BP with this address-independent communication data.
    Now Answer to your question
    A postal validation for the postal code, the city and the street can be carried out by checking against the SAP Regional Structure. You can also use external software for postal validation, checks for duplicates, and error-tolerant searches. (For more information, please see SAP Note 176559.)
    The following are examples of possible checks:
    Postal codes, cities and streets, and combinations of all of them are checked for consistency. During the check, missing elements are added. For example, if you enter only the city, the postal code is added.
    When you create and change a business partner, several phonetically similar, existing BPs are proposed for comparison purposes.
    This prevents you from creating the same partner more than once.
    Please let me know if this answered your questions.
    Regards
    Dinesh
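The "phonetically similar BPs proposed for comparison" behaviour Dinesh describes can be illustrated outside SAP. The following is a toy Python sketch, not the BAS algorithm: it just normalizes names and flags close matches as duplicate candidates, with invented partner names:

```python
import difflib

def normalize(name: str) -> str:
    """Crude normalization: lower-case and strip punctuation/whitespace."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def duplicate_candidates(new_name, existing, threshold=0.85):
    """Return existing partner names similar enough to warrant a warning
    before the new business partner is saved."""
    target = normalize(new_name)
    return [e for e in existing
            if difflib.SequenceMatcher(None, target,
                                       normalize(e)).ratio() >= threshold]

existing_bps = ["Mueller GmbH", "Schmidt AG", "ACME Corp."]
hits = duplicate_candidates("A.C.M.E. Corp", existing_bps)
```

In a real system the comparison would be phonetic (and locale-aware) rather than a plain similarity ratio, but the shape of the check — warn at save time when candidates are found — is the same.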

  • Duplicate records in PSA and Error in Delta DTP

    Hi Experts,
    I have enabled delta on the CALDAY field for a generic DataSource in the source system.
    1) Initially I created one record in the source system and ran the delta InfoPackage, so the PSA contains one record.
    2) I ran the delta DTP; now that one record has moved to the DSO.
    3) I then created one more record in the source system and ran the delta InfoPackage again. Now the PSA contains 2 records: the one already loaded in step 1 and the new one.
    4) If I try to run the delta DTP, I get an error saying that duplicate records are there.
    So would you please tell me how to rectify this issue?
    My requirement is I need to run the load multiple times a day . (Note: Data Source is delta enabled)
    Thanks and Regards,
    K.Krishna Chaitanya.

    You can use this feature only when loading master data, as it will ignore lines. If you had, for example, a changed record with a before and after image, one of the two lines would be ignored.
    You can keep searching and asking, but if you want to load a delta more than once a day, you can't use a calendar day as the generic delta field. This will not work.
    The best thing is to have a timestamp.
    If the datasource has a date and a time, you can create a timestamp yourself.
    You need in this case to use a function module.
    Search the forum for the terms 'generic delta function module' and you'll surely find code that you can reuse.
    M.
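M.'s point — a calendar day is too coarse a delta pointer for multiple loads per day, so combine date and time into a timestamp — can be sketched in Python. The field names and formats below are invented for illustration:

```python
from datetime import datetime

def to_timestamp(record):
    """Combine separate date and time fields (here YYYYMMDD and HHMMSS
    strings) into one sortable timestamp value."""
    return datetime.strptime(record["date"] + record["time"], "%Y%m%d%H%M%S")

def delta_records(records, last_loaded):
    """Select only records newer than the stored delta pointer."""
    return [r for r in records if to_timestamp(r) > last_loaded]

rows = [
    {"id": 1, "date": "20240101", "time": "080000"},
    {"id": 2, "date": "20240101", "time": "120000"},
    {"id": 3, "date": "20240101", "time": "180000"},
]
pointer = datetime(2024, 1, 1, 12, 0, 0)   # timestamp of the last delta run
new = delta_records(rows, pointer)
```

With CALDAY alone the pointer only has day granularity, so a second load on the same day would reselect the whole day's records — which is exactly the duplicate problem described above.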

  • Duplicate records update

    Hi
    We have a primary key on the source table and key columns without a constraint on the target table. We do a transformation converting the schema name on the source side while capturing.
    We set allow_duplicate_rows=Y on the apply side. Whenever a duplicate row insert happens, it works fine. When it comes to an update, it does not work; it throws an error:
    ORA-01422: exact fetch returns more than requested number of rows
    ORA-01403: no data found
    Could you please shed some light on how to fix this issue?
    An alternate solution, removing the duplicate and executing the Streams error transaction, works fine. But then the allow_duplicate_rows functionality is not fulfilled.
    Thanks
    Bala

    Source 10.2.0.4.3
    Target 10.2.0.3
    We are creating a reporting instance where duplicates are allowed. Inserts of duplicate records succeed; when it comes to an update, it fails with the following error.
    ----Error in Message: 1
    ----Error Number: 1422
    ----Message Text: ORA-01422: exact fetch returns more than requested number of rows
    ORA-01403: no data found
    --message: 1
    type name: SYS.LCR$_ROW_RECORD
    source database: PROD2
    owner: REPORT_TWO
    object: DEVICE
    is tag null: Y
    command_type: UPDATE
    Supplemental logging is enabled for both source and target database.
    Thanks
    Bala
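The ORA-01422 here is characteristic of applying an UPDATE against rows that are no longer unique: the apply process fetches "exactly one" row by key and finds several. A small Python/sqlite3 sketch of that failure mode (a simulation, not Oracle Streams; table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# No PK constraint on the target, like the key columns described above.
conn.execute("CREATE TABLE device (device_id INTEGER, name TEXT)")
# allow_duplicate_rows-style situation: the same key row exists twice.
conn.executemany("INSERT INTO device VALUES (?, ?)",
                 [(1, "old"), (1, "old"), (2, "old")])

def apply_update(conn, device_id, new_name):
    """Mimic an apply process that expects exactly one row per key."""
    rows = conn.execute("SELECT rowid FROM device WHERE device_id = ?",
                        (device_id,)).fetchall()
    if len(rows) != 1:
        # This is the situation ORA-01422 reports: the exact fetch
        # returned more rows than requested.
        raise LookupError(f"exact fetch returned {len(rows)} rows "
                          f"for key {device_id}")
    conn.execute("UPDATE device SET name = ? WHERE rowid = ?",
                 (new_name, rows[0][0]))

apply_update(conn, 2, "new")        # unique key: succeeds
try:
    apply_update(conn, 1, "new")    # duplicated key: fails like the apply
    failed = False
except LookupError:
    failed = True
```

Inserts never hit this check, which matches the observed behaviour: duplicate inserts succeed, and only updates against the duplicated keys error out.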

  • How to find out duplicate record contained in a flat file

    Hi Experts,
    For my project I have written a program for flat file upload.
    Requirement 1
    In the flat file there may be some duplicate record like:
    Field1   Field2
    11        test1
    11        test2
    12        test3
    13        test4
    Field1 is primary key.
    Can you please let me know how I can find the duplicate records?
    Requirement 2
    The flat file contains the header row as shown above
    Field1   Field2
    How can our program skip this header row and start reading / inserting records from row 2, i.e.
    11        test1
    onwards?
    Thanks
    S
    FORM upload1.
    DATA : wf_title TYPE string,
    lt_filetab TYPE filetable,
    l_separator TYPE char01,
    l_action TYPE i,
    l_count TYPE i,
    ls_filetab TYPE file_table,
    wf_delemt TYPE rollname,
    wa_fieldcat TYPE lvc_s_fcat,
    tb_fieldcat TYPE lvc_t_fcat,
    rows_read TYPE i,
    p_error TYPE char01,
    l_file TYPE string.
    DATA: wf_object(30) TYPE c,
    wf_tablnm TYPE rsdchkview.
    wf_object = 'myprogram'.
    DATA i TYPE i.
    DATA:
    lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
    lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
    lt_idocstate TYPE rsarr_t_idocstate,
    lv_subrc TYPE sysubrc.
    TYPES : BEGIN OF test_struc,
    /bic/myprogram TYPE /bic/oimyprogram,
    txtmd TYPE rstxtmd,
    END OF test_struc.
    DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
    DATA: wa_ztext TYPE /bic/tmyprogram,
    myprogram_temp TYPE ziott_assum,
    wa_myprogram TYPE /bic/pmyprogram.
    DATA : test_upload TYPE STANDARD TABLE OF test_struc,
    wa2 TYPE test_struc.
    DATA : wa_test_upload TYPE test_struc,
    ztable_data TYPE TABLE OF /bic/pmyprogram,
    ztable_text TYPE TABLE OF /bic/tmyprogram,
    wa_upld_text TYPE /bic/tmyprogram,
    wa_upld_data TYPE /bic/pmyprogram,
    t_assum TYPE ziott_assum.
    DATA : wa1 LIKE test_upload.
    wf_title = text-026.
    CALL METHOD cl_gui_frontend_services=>file_open_dialog
    EXPORTING
    window_title = wf_title
    default_extension = 'txt'
    file_filter = 'Tab delimited Text Files (*.txt)'
    CHANGING
    file_table = lt_filetab
    rc = l_count
    user_action = l_action
    EXCEPTIONS
    file_open_dialog_failed = 1
    cntl_error = 2
    OTHERS = 3. "#EC NOTEXT
    IF sy-subrc <> 0.
    EXIT.
    ENDIF.
    LOOP AT lt_filetab INTO ls_filetab.
    l_file = ls_filetab.
    ENDLOOP.
    CHECK l_action = 0.
    IF l_file IS INITIAL.
    EXIT.
    ENDIF.
    l_separator = 'X'.
    wa_fieldcat-fieldname = 'test'.
    wa_fieldcat-dd_roll = wf_delemt.
    APPEND wa_fieldcat TO tb_fieldcat.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    CLEAR wa_test_upload.
    * Upload file from front-end (PC)
    * File format is tab-delimited ASCII
    CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
    filename = l_file
    has_field_separator = l_separator
    TABLES
    data_tab = test_upload
    EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    no_batch = 3
    gui_refuse_filetransfer = 4
    invalid_type = 5
    no_authority = 6
    unknown_error = 7
    bad_data_format = 8
    header_not_allowed = 9
    separator_not_allowed = 10
    header_too_long = 11
    unknown_dp_error = 12
    access_denied = 13
    dp_out_of_memory = 14
    disk_full = 15
    dp_timeout = 16
    OTHERS = 17.
    IF sy-subrc <> 0.
    EXIT.
    ELSE.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    IF test_upload IS NOT INITIAL.
    DESCRIBE TABLE test_upload LINES rows_read.
    CLEAR : wa_test_upload,wa_upld_data.
    LOOP AT test_upload INTO wa_test_upload.
    CLEAR : p_error.
    rows_read = sy-tabix.
    IF wa_test_upload-/bic/myprogram IS INITIAL.
    p_error = 'X'.
    MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
    CONTINUE.
    ELSE.
    TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
    wa_upld_text-txtmd = wa_test_upload-txtmd.
    wa_upld_text-txtsh = wa_test_upload-txtmd.
    wa_upld_text-langu = sy-langu.
    wa_upld_data-chrt_accts = 'xyz1'.
    wa_upld_data-co_area = '12'.
    wa_upld_data-/bic/zxyzbcsg = 'Iy'.
    wa_upld_data-objvers = 'A'.
    wa_upld_data-changed = 'I'.
    wa_upld_data-/bic/zass_mdl = 'rrr'.
    wa_upld_data-/bic/zass_typ = 'I'.
    wa_upld_data-/bic/zdriver = 'yyy'.
    wa_upld_text-langu = sy-langu.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
    APPEND wa_upld_data TO ztable_data.
    APPEND wa_upld_text TO ztable_text.
    ENDIF.
    ENDLOOP.
    DELETE ADJACENT DUPLICATES FROM ztable_data.
    DELETE ADJACENT DUPLICATES FROM ztable_text.
    IF ztable_data IS NOT INITIAL.
    CALL METHOD cl_rsdmd_mdmt=>factory
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_r_mdmt = lr_mdmt
    EXCEPTIONS
    invalid_iobjnm = 1
    OTHERS = 2.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    **Lock the Infoobject to update
    CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
    EXPORTING
    i_objnm = wf_object
    i_scope = '1'
    i_msgty = rs_c_error
    EXCEPTIONS
    foreign_lock = 1
    sys_failure = 2.
    IF sy-subrc = 1.
    MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
    EXIT.
    ELSEIF sy-subrc = 2.
    MESSAGE i108(zddd_rr) WITH wf_object.
    EXIT.
    ENDIF.
    *****Update Master Table
    IF ztable_data IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'M'
    I_T_ATTR = lt_attr
    TABLES
    i_t_table = ztable_data
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '054'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    MESSAGE e054(zddd_rr) WITH 'myprogram'.
    ELSE.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'S'
    txtnr = '053'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    *endif.
    *****update Text Table
    IF ztable_text IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'T'
    TABLES
    i_t_table = ztable_text
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '055'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    ENDIF.
    ELSE.
    MESSAGE s178(zddd_rr).
    ENDIF.
    ENDIF.
    COMMIT WORK.
    CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_chktab = wf_tablnm
    EXCEPTIONS
    name_error = 1.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ****Release locks on Infoobject
    CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
    EXPORTING
    i_objnm = 'myprogram'
    i_scope = '1'.
    ENDIF.
    ENDIF.
    PERFORM data_selection .
    PERFORM update_alv_grid_display.
    CALL FUNCTION 'MESSAGES_SHOW'.
    ENDFORM.

    > Can you please let me know how I can find out the duplicate record.
    You need to split the records from the flat file structure into your internal table and use DELETE ADJACENT DUPLICATES comparing fields:
    SPLIT flat_str AT tab_space INTO wa_f1 wa_f2.
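Both requirements — flag rows whose key (Field1) already appeared, and skip the header row — are easy to express outside ABAP as well. A hedged Python sketch for the tab-delimited sample in the question:

```python
import csv
import io

# The sample data from the question, tab-delimited with a header row.
SAMPLE = "Field1\tField2\n11\ttest1\n11\ttest2\n12\ttest3\n13\ttest4\n"

def load_rows(text):
    """Parse tab-delimited text, skipping the header row (requirement 2),
    and report which key values occur more than once (requirement 1)."""
    reader = csv.reader(io.StringIO(text), delimiter="\t")
    next(reader)                     # skip the header: Field1  Field2
    rows, seen, duplicates = [], set(), []
    for key, value in reader:
        if key in seen:
            duplicates.append((key, value))   # same Field1 as an earlier row
        else:
            seen.add(key)
            rows.append((key, value))
    return rows, duplicates

unique_rows, dupes = load_rows(SAMPLE)
```

This keeps the first occurrence of each key, matching what DELETE ADJACENT DUPLICATES does on a sorted table, while also telling you which rows were dropped.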

  • 36 duplicate record  found. -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    ( green light ) Update PSA (50000 records posted): no errors
    ( green light ) Transfer rules (50000 -> 50000 records): no errors
    ( green light ) Update rules (50000 -> 50000 records): no errors
    ( green light ) Update (0 new / 50000 changed): no errors
    ( red light ) Processing end: errors occurred
         Processing 2 finished
         36 duplicate records found. 15718 records used in table /BIC/PZMATNUM
         36 duplicate records found. 15718 records used in table /BIC/XZMATNUM
    This error repeats with all the data-packets.
    what could be the reason of the error. how to correct the error.
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the infoobject 0PM_ORDER, the datasource is 0PM_ORDER_ATTR.
    The workaround we have been using is a 'manual push' from the Details tab of the monitor. Weirdly, we don't have this issue in our test and dev systems, and even in Production it doesn't occur some days. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue, if so please let me know and likewise I will.
    -Venkat

  • Ignore duplicate records for master data attributes

    Dear experts,
    How and where can I enable "ignore duplicate records" when I am running my DTP to load data to master data attributes?

    Hi Raj
    Suppose you are loading master data to an InfoObject and in the PSA you have more than one record for a key.
    Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key, which is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
    This issue can be easily avoided by selecting "Handle Duplicate Records" in the DTP. You will find this option under the "Update" tab of the DTP.
    Regards
    Anindya

  • Sqlloader controlfileparam's to avoid duplicate records loading

    Hi All,
    I am looking for an option in the control file which would restrict SQL*Loader from loading duplicate records. I know if we apply a constraint on the table itself, it won't allow duplicate data and will reject it into the reject file, but I have to do it in the control file to make SQL*Loader avoid loading duplicate records. Can you please suggest which option in the control file enables this?
    Can you please also tell me the exact difference between the bad file and the reject file? In what scenarios will it write to the reject file or to the bad file?
    Regards

    Hey
    I don't think there is any option to avoid loading duplicate records by using a parameter in the control file.
    On the difference between the bad and reject files try this link
    http://www.exforsys.com/content/view/1587/240/
    Regards,
    Sushant
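Since the control file offers no dedup option, one workaround is to clean the data file before handing it to SQL*Loader. A Python sketch that keeps only the first occurrence of each key (assuming, for illustration, a comma-separated file keyed on the first column):

```python
def dedupe_lines(lines, key_col=0, sep=","):
    """Yield only the first line seen for each value of the key column;
    later lines with the same key are dropped."""
    seen = set()
    for line in lines:
        key = line.split(sep)[key_col]
        if key not in seen:
            seen.add(key)
            yield line

raw = ["1,alpha", "2,beta", "1,gamma", "3,delta"]
clean = list(dedupe_lines(raw))
```

Run over the input file and write the result to a new file, then load that with SQL*Loader; alternatively, load into a staging table and dedupe in SQL, which is usually the more robust route for large volumes.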

  • SQL Loader loads duplicate records even when there is PK defined

    Hi,
    I have created a table with a PK on one of the columns and loaded it using SQL*Loader. The flat file has duplicate records. It loaded all the records without any error, but now the index has become unusable.
    The requirement is to fail the process if there are any duplicates. Please help me understand why this is happening.
    Below is the ctl file
    OPTIONS(DIRECT=TRUE, ERRORS=0)
    UNRECOVERABLE
    LOAD DATA
    INFILE 'test.txt'
    INTO TABLE abcdedfg
    REPLACE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (col1,
    col2)
    I defined the PK on col1.

    Check out..
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_modes.htm#sthref1457
    It states...
    During a direct path load, some integrity constraints are automatically disabled. Others are not. For a description of the constraints, see the information about maintaining data integrity in the Oracle Database Application Developer's Guide - Fundamentals.
    Enabled Constraints
    The constraints that remain in force are:
    NOT NULL
    UNIQUE
    PRIMARY KEY (unique constraints on not-null columns)
    Since the OP has the primary key in place before starting the DIRECT path load, this is contradictory to what the documentation says, or probably a bug?

  • Duplicate Records in Output

    I have a script which works fine; however, what I can't seem to work out is why I get duplicate records in the output. If anyone knows why, it would be a great help.
    My code is below:
    Connect-QADService "domain.com"
    Function CheckUserExistance {
    if(Get-QADUser -Identity $Global:Sam) {
    Write-Host "User Found: $Global:Sam"
    $Example = $Global:Sam + "2"
    $prompt = "This User Already Exists, Please Choose Another Name. (e.g. $Example)"
    $Title = "Error"
    Add-Type -AssemblyName microsoft.visualbasic
    $popup = [Microsoft.VisualBasic.interaction]::MsgBox($prompt,"OkOnly,Critical", $title)
    if ($popup -eq "Ok"){
    $prompt = "Please Enter The Name To Create (All Lowercase e.g john.smith):"
    $Title = "User To Create"
    Add-Type -AssemblyName microsoft.visualbasic
    $Global:Sam = [Microsoft.VisualBasic.interaction]::inputbox($prompt,$title)
    if ($Global:Sam -eq ""){exit}
    }
    }
    }
    # The following Function is used to randomly generate complex passwords.
    function New-Password {
    param (
    [int]$length,
    [switch]$lowerCase,
    [switch]$upperCase,
    [switch]$numbers,
    [switch]$specialChars
    )
    BEGIN {
    # Usage Instructions
    function Usage() {
    Write-Host ''
    Write-Host 'FUNCTION NAME: New-Password' -ForegroundColor White
    Write-Host ''
    Write-Host 'USAGE'
    Write-Host ' New-Password -length 10 -upperCase -lowerCase -numbers'
    Write-Host ' New-Password -length 10 -specialChars'
    Write-Host ' New-Password -le 10 -lo -u -n -s'
    Write-Host ' New-Password'
    Write-Host ''
    Write-Host 'DESCRIPTION:'
    Write-Host ' Generates a random password of a given length (-length parameter)'
    Write-Host ' comprised of at least one character from each subset provided'
    Write-Host ' as a switch parameter.'
    Write-Host ''
    Write-Host 'AVAILABLE SWITCHES:'
    Write-Host ' -lowerCase : include all lower case letters'
    Write-Host ' -upperCase : include all upper case letters'
    Write-Host ' -numbers : include 0-9'
    Write-Host ' -specialChars : include the following- !@#$%^&*()_+-={}[]<>'
    Write-Host ''
    Write-Host 'REQUIREMENTS:'
    Write-Host ' You must provide the -length (four or greater) and at least one character switch'
    Write-Host ''
    }
    function generate_password {
    if ($lowerCase) {
    $charsToUse += $lCase
    $regexExp += "(?=.*[$lCase])"
    }
    if ($upperCase) {
    $charsToUse += $uCase
    $regexExp += "(?=.*[$uCase])"
    }
    if ($numbers) {
    $charsToUse += $nums
    $regexExp += "(?=.*[$nums])"
    }
    if ($specialChars) {
    $charsToUse += $specChars
    $regexExp += "(?=.*[\W])"
    }
    $test = [regex]$regexExp
    $rnd = New-Object System.Random
    do {
    $pw = $null
    for ($i = 0 ; $i -lt $length ; $i++) {
    $pw += $charsToUse[($rnd.Next(0,$charsToUse.Length))]
    Start-Sleep -milliseconds 20
    }
    } until ($pw -match $test)
    return $pw
    }
    # Displays help
    if (($Args[0] -eq "-?") -or ($Args[0] -eq "-help")) {
    Usage
    break
    }
    else {
    $lCase = 'abcdefghijklmnopqrstuvwxyz'
    $uCase = $lCase.ToUpper()
    $nums = '1234567890'
    $specChars = '!@#$%^&*()_+-={}[]<>'
    }
    }
    PROCESS {
    if (($length -ge 4) -and ($lowerCase -or $upperCase -or $numbers -or $specialChars)) {
    $newPassword = generate_password
    }
    else {
    Usage
    break
    }
    $newPassword
    }
    END {}
    }
    Import-Csv "C:\ExternalADAccounts.csv" |
    ForEach-Object {
    $First = $_.Forename
    $Last = $_.Surname
    $Password = New-Password -length 8 -upperCase -lowerCase -numbers -specialChars
    $stuff = $_.stuff
    $OU = "domain.com/ou"
    $UPN = "$First.$Last@$stuff.domain.com"
    $Global:Sam = "$First.$Last"
    $Display = "$First $Last"
    $Global:Sam = $Global:Sam -Replace " ", ""
    $Global:Sam = $Global:Sam -Replace "'",""
    $Count = $Global:Sam.Length
    If($Count -gt "19"){
    $Global:Sam = $First[0] + "." + $Last
    }
    Do{CheckUserExistance}
    While(Get-QADUser -Identity $Global:Sam)
    Write-Host "Creating User: $Global:Sam"
    New-QADUser -FirstName $First `
    -LastName $Last `
    -DisplayName $Display `
    -Name $Global:Sam `
    -SamAccountName $Global:Sam.ToLower() `
    -UserPassword $Password `
    -UserPrincipalName $UPN.ToLower() `
    -ParentContainer $OU
    do{
    $Test = Get-QADUser -Identity $Global:Sam
    } until(Get-QADUser -Identity $Global:Sam)
    Set-QADUser $Global:Sam -Title 'Non-Managed'
    } | Select-Object @{label="DisplayName";expression={$Display}},@{label="SamAccountName";expression={$Global:Sam}}, @{label="Password";expression={$Password}}, @{label="Stuff";expression={$Stuff}} |
    Sort-Object $Stuff |
    Export-Csv "C:\NonManagedCreated.csv" -NoTypeInformation
    Any help is greatly appreciated.
    James

    Hi James,
    Please try the script below, which will use Psobject to format the output:
    $output = @()
    Import-Csv "C:\ExternalADAccounts.csv" |
    ForEach-Object {
    do{
    $Test = Get-QADUser -Identity $Global:Sam
    } until(Get-QADUser -Identity $Global:Sam)
    Set-QADUser $Global:Sam -Title 'Non-Managed'
    # build the output object
    $Object = New-Object -TypeName PSObject
    $Object | Add-Member -Name 'DisplayName' -MemberType Noteproperty -Value $DisplayName
    $Object | Add-Member -Name 'SamAccountName' -MemberType Noteproperty -Value $Global:Sam
    $Object | Add-Member -Name 'Password' -MemberType Noteproperty -Value $Password
    $Object | Add-Member -Name 'Stuff' -MemberType Noteproperty -Value $Stuff
    $output += $Object
    }
    $output | select DisplayName,SamAccountName,Password,Stuff | Sort-Object Stuff | Export-Csv "C:\NonManagedCreated.csv" -NoTypeInformation
    If there is anything else regarding this issue, please feel free to post back.
    Best Regards,
    Anna Wang

  • Duplicate Records in Details for ECC data source. Help.

    Hello. First post on SDN. I have been searching prior posts but have come up empty. I am in the middle of creating a report linking directly into 4 tables in ECC 6.0. I am having trouble either getting the table links set up correctly or filtering out the duplicate record sets that are being reported in the details section of my report. It appears that I have 119 records being displayed when the parameter values should only yield 7. The details section is repeating the 7 records 17 times (there are 17 matching records for the parameter choices in one of the other tables, which I think is the cause).
    I think this is due to the other table links for my parameter values. But I need to keep the links the way they are for other aspects of the report (header information). The tables in question are using an Inner Join, Enforced Both, =. I tried the other link options with no luck.
    I am unable to use the "Select Distinct Records" option in the Database menu since this is not supported when connecting to ECC.
    Any ideas would be greatly appreciated.
    Thanks,
    Barret
    PS. I come from more of a Functional background, so development is sort of new to me. Take it easy on the newbie.

    If you can't establish links to bring back unique data, then use a group to display the data.
    Group the report by a field which is the lowest common denominator.
    Move all fields into the group footer and suppress the group header and details.
    You will not be able to use normal summaries, as they will count/sum all the duplicated data; use Running Totals instead and select "evaluate on change of" the introduced group.
    Ian

  • How to avoid duplicate record in a file to file

    Hi Guys,
    Could you please provide a solution to avoid duplicate entries in a flat file based on a key field?
    I'd like to use standard functions, either at message mapping level or by configuring the file adapter.
    warm regards
    mahesh.

    hi mahesh,
    Write a module processor for checking for the duplicate record in the file adapter,
    or
    with a Java/ABAP mapping you can eliminate the duplicate records.
    Also check these links:
    Re: How to Handle this "Duplicate Records"
    Duplicate records
    Ignoring Duplicate Records--urgent
    Re: Duplicate records frequently occurred
    Re: Reg ODS JUNK DATA
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    regards
    srinivas
