Duplicate file handling using an Adapter Module

Hi All,
The scenario is like this:
XI is picking up files from an FTP location.
Duplicate files are also getting picked up by XI.
To handle this I have written a module using NWDS which finds out whether the file is a duplicate or not. If the file is not a duplicate, it gets processed.
Now the problem I am facing is:
I don't want to process the file if it is found to be a duplicate, so what code should I write?
What are the ways I can stop the processing of a duplicate file?
Regards
Dheeraj Kumar

Hi,
I have implemented a module in which I can find out whether the file is a duplicate or not. If the file is not a duplicate, the file is processed.
Now the problem is: if the file is a duplicate, I don't want to process it.
How can I achieve this?
Regards
Dheeraj Kumar
Edited by: Dheeraj Kumar on Nov 30, 2009 3:21 PM
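One common pattern is to keep a registry of file names already seen and abort with an exception when a name repeats. Below is a hypothetical plain-Java sketch of that decision logic only (class and method names are mine, not the SAP module API); in a real adapter module the duplicate branch would throw a ModuleException from the module's process() method so the adapter stops processing the message:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch: remembers which file names were already processed.
// In a real XI adapter module the duplicate branch would throw
// ModuleException so the adapter stops processing the message.
public class DuplicateGuard {
    private final Set<String> seen = new HashSet<String>();

    // Returns normally for a new file; throws to stop a duplicate.
    public void checkOrAbort(String fileName) {
        if (!seen.add(fileName)) {
            throw new RuntimeException(
                "Duplicate file, processing stopped: " + fileName);
        }
    }
}
```

Failing loudly like this sends the message into error status instead of delivering it, which is usually easier to monitor than silently dropping the file.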

Similar Messages


  • Duplicate File Handling Issues - Sender File Adapter - SAP PO 7.31 - Single Stack

    Hi All,
    We have a requirement to avoid processing of duplicate files. Our system is PI 7.31 Enh. Pack 1 SP 23. I tried using the 'Duplicate File Handling' feature in the Sender File Adapter, but things are not working as expected. I processed the same file again and again, and PO creates successful messages every time rather than generating alerts/warnings or deactivating the channel.
    I went through the link Michal's PI tips: Duplicate handling in file adapter - 7.31. I have maintained similar settings but am unable to get the functionality working. Is there anything I am missing, or any setting that is required apart from the Duplicate File Handling checkbox and a threshold count?
    Any help will be highly appreciated.
    Thanks,
    Abhishek

    Hello Sarvjeet,
    I had to write a UDF in message mapping to identify duplicate files and throw an exception. In my case, I had to compare the file load directory (source directory) with the archive directory to identify whether the new file is a duplicate or not. I'm not sure if this is the same case for you. See if the below helps. (I used parameterized mapping to pass the file locations in from the Integration Directory rather than hard-coding them in the mapping.)
    AbstractTrace trace = container.getTrace();
    double archiveFileSize = 0;
    double newFileSizeDouble = Double.parseDouble(newFileSize);
    String archiveFile = "";
    String archiveFileTrimmed = "";
    int var2 = 0;
    File directory = new File(directoryName);
    File[] fList = directory.listFiles();
    Arrays.sort(fList, Collections.reverseOrder());
    // Traverse all entries in the archive directory
    for (File file : fList) {
        // Only compare against regular files
        if (file.isFile()) {
            trace.addInfo("Filename: " + file.getName() + " :: Archive File Time: " + Long.toString(file.lastModified()));
            archiveFile = file.getName();
            // Drop the first 20 characters (e.g. an archive timestamp prefix)
            archiveFileTrimmed = archiveFile.substring(20);
            archiveFileSize = file.length();
            if (archiveFileTrimmed.equals(newFile) && archiveFileSize == newFileSizeDouble) {
                var2 = var2 + 1;
                trace.addInfo("Duplicate File Found." + newFile);
                if (var2 == 2) {
                    break;
                }
            }
        }
    }
    if (var2 == 2) {
        var2 = 0;
        throw new StreamTransformationException("Duplicate File Found. Processing for the current file is stopped. File: " + newFile + ", File Size: " + newFileSize);
    }
    return Integer.toString(var2);
    Regards,
    Abhishek

  • Error 00007 : 0Stale NFS file handle in Module rslgcoll(041)

    What could be the reason for "Error 00007 : 0Stale NFS file handle in Module rslgcoll(041)" in SM21? After restarting the application server I get these errors, and very soon I cannot connect to the server anymore (I have to restart it).

    The error area is below. (However, we are not using SSO, in the sense of single sign-on functionality that avoids entering a password at logon.)
    N  =================================================
    N  === SSF INITIALIZATION:
    N  ===...SSF Security Toolkit name SAPSECULIB .
    N  ===...SSF trace level is 0 .
    N  ===...SSF library is /usr/sap/CR7/SYS/exe/run/libsapsecu.sl .
    N  ===...SSF hash algorithm is SHA1 .
    N  ===...SSF symmetric encryption algorithm is DES-CBC .
    N  ===...sucessfully completed.
    N  =================================================
    N  MskiInitLogonTicketCacheHandle: Logon Ticket cache pointer retrieved from shared memory.
    N  MskiInitLogonTicketCacheHandle: Workprocess runs with Logon Ticket cache.
    M
    M Mon Apr  6 21:28:36 2009
    M  ThReschedAfterCommit: th_force_sched_after_commit = 1
    A
    A Mon Apr  6 21:28:50 2009
    A  *** ERROR => RFC ======> Name or password is incorrect. Please re-enter
    [abrfcio.c    6880]
    N
    N Mon Apr  6 21:29:13 2009

  • Regarding Adapter Module for Duplicate file handling at Sender side

    Hi All
    My requirement is to develop an adapter module. The source is FTP, the target is R/3, and the source communication channel is a File sender.
    I want to handle duplicate files.
    Can anyone provide me the steps used for the same?
    Any step-by-step doc will be helpful for me. Also, will the adapter module be written in NetWeaver Developer Studio or some other software?
    Also, where do I put the EJB, or import the EJBs?
    Please help.
    Regards
    Priya

    Hi Priya,
    Yes, you need SAP NetWeaver Developer Studio to develop adapter modules, or you can use any other IDE like Eclipse. Many documents are available on SDN on how to develop an adapter module; refer to the link below, it explains things clearly.
    Let me know which version of XI you are working on; if it is PI 7.1, the jar files are different.
    http://wiki.sdn.sap.com/wiki/display/stage/AdapterModuleToReadExcelFilewithMultipleRowsandMultiple+Columns
    Regards,
    Raj

  • Duplicate File issue using FTP adapter - BizTalk 2010

    Hi, we encountered an issue of duplicate files being picked up in BizTalk 2010 from the FTP location. Need your assistance on this.
    The mainframe sends multiple files and a 0 KB file to FTP, and BizTalk picks them up during a particular service window using the FTP adapter.
    The receive location has a pipeline component which decodes the MF file (EBCDICRow format).
    The send port transmits the file which is decoded in the receive location. There is no mapping or orchestration involved.
    When 2 files are placed in the FTP location, BizTalk transmits the files successfully.
    There are no suspended messages in BizTalk, but we see the below error in the event log. We tried to reproduce the issue, but no luck.
    The host instance running the FTP location is clustered.
     There was a failure executing the receive pipeline: "XXXX.XX.Pipelines.Receive_XXX_TransactionsMC_passthru, XXX.XX.Pipelines, Version=1.0.0.0, Culture=neutral, PublicKeyToken=c2d1f476d5c2f97d"
    Source: "EbcdicRowDeCode" Receive Port: "XXX.XX.MCSTransactionsMC" URI: "ftp://XXXXXXXX:21/'XXXX'/MCSDT.DEFKOP.R001.D*.T*" Reason: Unable to cast object of type 'Microsoft.BizTalk.Streaming.BasicStreamWrapper' to type 'XXX.BizTalk.Pipeline.Components.Streams.V3.VirtualStream
    MF Puts below Files at FTP Location
    PSNOX.MCSDT.DEFKOP.R001.D150406.T009000
    PSNOX.MCSDT.DEFKOP.R001.D150406.T002100
    BizTalk Picks ( at specified Service window) and Transmits as 
    PSNOX.MCSDT.DEFKOP.R001.D150406.T009000
    PSNOX.MCSDT.DEFKOP.R001.D150406.T009000
    PSNOX.MCSDT.DEFKOP.R001.D150406.T002100
    Regards
    -Sri

    Hi Sri,
    There could be two reasons for such behavior:
    1) Using non-clustered hosts: It is always recommended to use a clustered host for the FTP adapter, because FTP doesn't allow any locking mechanism on the files, so with non-clustered hosts and multiple host instances you might receive the same file multiple times through different host instances.
    2) If the original document is still being written to the FTP server by the host application, the FTP adapter cannot delete the document and will retrieve another copy of the document at the next polling interval that is configured for the receive location. This behavior causes document duplication to occur.
    Workarounds could be:
    Configure the host application to write to a temporary folder on the same hard disk as the public FTP folder and to periodically move the contents of the temporary folder to the FTP folder. The temporary folder should be on the same hard disk as the public FTP folder to make sure that the move operation is atomic. An atomic operation is an operation that is functionally indivisible. If you write data to the public FTP folder by using the BizTalk Server FTP adapter, you can do this by specifying a Temporary Folder property in the FTP Transport Properties dialog box when you configure a send port. If you specify a Temporary Folder property, make sure that this folder is on the same physical disk as the public FTP folder.
    Configure the FTP receive location to operate within a service window when the host application is not writing data to the FTP server. You can specify the service window when you configure the receive location properties.
    Refer: Known Issues with the FTP Adapter
    Rachit
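    The first workaround (stage the file in a temporary folder, then move it into the pickup folder) can be sketched in Java. The directory names are illustrative; the essential point is that a move within one filesystem is atomic, so the poller never sees a half-written file:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class AtomicPublish {
    // Write the payload into tempDir first, then move it into pickupDir.
    // Both directories must be on the same filesystem for ATOMIC_MOVE.
    public static Path publish(Path tempDir, Path pickupDir,
                               String name, byte[] data) throws IOException {
        Path staged = tempDir.resolve(name);
        Files.write(staged, data);
        return Files.move(staged, pickupDir.resolve(name),
                StandardCopyOption.ATOMIC_MOVE);
    }
}
```

    The same staging idea applies to any host application writing into a polled folder, not just BizTalk.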

  • Maximum file handled using dom4j

    Hi,
    I want to know the maximum XML file size that can be handled using dom4j.
    To be precise, I have a 1 GB XML file that I want to parse. I have implemented it using SAX, but I want to know if dom4j can be used.
    Regards,
    R

    Just try, you'll probably only be limited by available memory. I assume that building any kind of DOM tree of the document (be it org.w3c.dom or dom4j) will use more memory than the original file. So you'll need > 1 GB of memory for your file. If you have to handle files that big and don't absolutely need it all at the same time then SAX is probably the better solution as it doesn't require the whole file to be in memory at a time.
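    To illustrate the difference: a SAX handler only ever sees one event at a time, so memory use stays flat no matter how large the document is. A minimal sketch (the element name `item` is just an example) that counts elements while streaming:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class StreamingCount {
    // Counts <item> elements without ever building a tree in memory.
    public static int countItems(String xml) throws Exception {
        final int[] count = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(new ByteArrayInputStream(xml.getBytes("UTF-8")),
                new DefaultHandler() {
                    @Override
                    public void startElement(String uri, String localName,
                                             String qName, Attributes attrs) {
                        if ("item".equals(qName)) {
                            count[0]++;
                        }
                    }
                });
        return count[0];
    }
}
```

    For a real 1 GB file you would pass a FileInputStream instead of the byte-array stream; the handler code is unchanged.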

  • Error in Duplicate file Handling

    Hi All,
    We want to avoid duplicate files in XI. For that we used the below code in a UDF in the mapping:
    try {
        String processedFileDatabase = processedFile[0];
        String sourceFileName;
        DynamicConfiguration attrib = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
        DynamicConfigurationKey fileKey = DynamicConfigurationKey.create("http:/" + "/sap.com/xi/XI/System/File", "FileName");
        sourceFileName = attrib.get(fileKey);
        // Create the "database" file on first use
        File fileDB = new File(processedFileDatabase);
        if (!(fileDB.exists() && fileDB.canWrite() && fileDB.canRead())) {
            fileDB.createNewFile();
        }
        // Read all previously processed file names
        Vector fileNameList = new Vector();
        BufferedReader br = new BufferedReader(new FileReader(processedFileDatabase));
        String name;
        // loop and read a line from the file as long as we don't get null
        while ((name = br.readLine()) != null) {
            fileNameList.add(name);
        }
        br.close();
        boolean fileAlreadyProcessed = fileNameList.contains(sourceFileName);
        if (!fileAlreadyProcessed) {
            // Append the new file name to the database
            Writer output = new BufferedWriter(new FileWriter(fileDB, true));
            output.write(sourceFileName + "\r\n");
            output.flush();
            output.close();
        }
        result.addValue("" + !fileAlreadyProcessed);
    } catch (java.io.IOException e) {
        e.printStackTrace();
    }
    But this is not working; it is unable to map.
    Please help us in this regard.

    The file name is not going to fileDB, and hence it is unable to create the target element.
    I followed solution 2 from the below link:
    http://wiki.sdn.sap.com/wiki/display/XI/DifferentwaystokeepyourInterfacefromprocessingduplicate+files
    I have seen that the fileDB file is empty.
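    For what it's worth, the core of the processed-file "database" idea can be exercised outside XI. This plain-Java sketch (class and method names are mine, not from the UDF) creates the registry file on first use, checks membership, and appends new names:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.List;

public class ProcessedFileRegistry {
    // Returns true if fileName is new (and records it),
    // false if the name was already in the registry file.
    public static boolean recordIfNew(Path registry, String fileName)
            throws IOException {
        if (!Files.exists(registry)) {
            Files.createFile(registry);
        }
        List<String> names = Files.readAllLines(registry);
        if (names.contains(fileName)) {
            return false;
        }
        Files.write(registry, (fileName + System.lineSeparator()).getBytes(),
                StandardOpenOption.APPEND);
        return true;
    }
}
```

    Testing the registry logic in isolation like this makes it easier to see whether the empty fileDB comes from the file handling itself or from the Dynamic Configuration lookup in the mapping.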

  • Exporting Versions: avoid duplicate files when using external editor?

    I've got an Aperture Library containing TIFF masters. Some of the files have been edited via built-in Aperture tools; others have been edited using an external editor (PhotoShop). I want to do a batch export of all the Versions as scaled-down JPEG. If I select all the files in the library and export, everything works as expected, but there is one nuisance: every file that has been edited externally is exported twice.
    For example, consider a file named "MyFile-1.tiff", which exists in the library in the original version and the edited version, with the same filename. The export writes two files:
    MyFile-1.jpg (the original version)
    MyFile-1 (001).jpg (the Photoshopped version)
    I only want and need the edited version. So "MyFile-1.jpg" is extraneous and needs to be deleted, and "MyFile-1 (001).jpg" needs to be renamed.
    Is there any way to prevent this from happening without manually deselecting all of the individual version files I don't want to export? I don't see any setting or smart album feature that would say "when you find a file that has been edited externally, export only the externally edited version".
    It's not a big deal for me to use batch file renaming to clean up the exported files, but it is one more step in the workflow that I'd rather avoid.

    Do the original versions of the externally edited images have Aperture adjustments applied? And if so, would you then want both of the images exported?
    If the answer to the first question is no or the answer to the second is yes then this should work:

  • Duplicate message handling using Co-relation set

    I have an orchestration published as a service (request-response port).
    The service consumer will send a duplicate request if no response is received within a timeout (configured at the application end, not in BizTalk; say, for example, 2 minutes).
    I designed my orchestration to have a receive shape first (with a correlation initialized), then some process logic, and a listen shape (receive with following correlation & delay).
    All works well when my process logic executes successfully. But when there is any failure or delay in processing and a duplicate message is received before the listen shape, the orchestration suspends with the error:
    Request-response operation Operation on port Port is already in progress.
    One more constraint is that I need to send the response to the latest request, and there should not be any zombies created.
    Please advise a solution.

    What you can try is as follows:
    Have a receive shape which will initialize a correlation set based on the message properties.
    Then have a parallel shape where in one branch you do all the processing, while in the other you have another receive shape (following the correlation).
    If the repeated request message is received, set an orchestration variable indicating this event (e.g. duplicate_request = true).
    After the parallel shape, have a decision based on the duplicate-request variable to handle your response. So if you have received the duplicate, respond accordingly; otherwise respond to the original request.
    What you'd have just done is set up a convoy which will handle duplicate requests without triggering a new orchestration instance.
    Regards.
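    The bookkeeping those steps describe (remember the latest request per correlation id and flag any repeat) can be sketched outside BizTalk; class and method names here are illustrative only:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class LatestRequestTracker {
    // correlation id -> latest request payload; a second put for the
    // same id signals a duplicate request (duplicate_request = true).
    private final ConcurrentMap<String, String> latest =
            new ConcurrentHashMap<String, String>();

    // Returns true if this request is a duplicate for the correlation id.
    public boolean accept(String correlationId, String payload) {
        return latest.put(correlationId, payload) != null;
    }

    // The response goes to whatever request was stored last.
    public String respondTo(String correlationId) {
        return latest.remove(correlationId);
    }
}
```

    Because the latest payload always overwrites the earlier one, responding from this store automatically answers the most recent request, matching the "respond to the latest, no zombies" constraint.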

  • Duplicate message handling in the sender file adapter

    Hi,
    I enabled the duplicate file handling check in the sender file adapter so that whenever there is a duplicate file it sends me an alert and also disables the channel, so that I do not get the duplicate file alert message again and again.
    My question is: will it activate the channel again as soon as a new file arrives, or do I need to do that manually?
    Michal's PI tips: Duplicate handling in file adapter - 7.31

    Hi Hema,
    You will have to activate the channel manually. The idea behind the 'disable' functionality is to avoid further file processing through that channel, which can only start once the channel is activated again manually.
    Regards,
    Abhishek

  • File Uploads using stored procedures

    Hello, I'm quite new here, but I have a question that I've been butting my head against for the past day. Here goes.
    We need to upload a file using a stored procedure (PL/SQL procedure.)
    The two things I have found that work are
    1) Having oracle do the file handling (using bfiles) in the procedure
    2) using an insert statement directly to upload the file contents into a blob.
    The platform is php (Oracle instant client) and I will show some code examples.
    1) is unworkable because Oracle will not have direct access to any files.
    2) is fine, but we would prefer to use a procedure so as to abstract what exactly goes on and possibly other operations away from the php and the framework.
    What worked:
    1)
    CREATE OR REPLACE PROCEDURE php_upload_file (file_name in varchar2, canonical_name in varchar2, owner in number, file_id IN OUT number)
    AS
    src_loc bfile:= bfilename('DOC_LOC',php_upload_file.file_name);
    dest_loc BLOB;
    begin
    insert into files values(owner,canonical_name,empty_blob(),files_seq.nextval) returning files.data, files.file_id
    into dest_loc, file_id;
    dbms_lob.open(src_loc,DBMS_LOB.LOB_READONLY);
    DBMS_LOB.OPEN(dest_loc, DBMS_LOB.LOB_READWRITE);
    DBMS_LOB.LOADFROMFILE(
    dest_lob => dest_loc
    ,src_lob => src_loc
    ,amount => DBMS_LOB.getLength(src_loc));
    DBMS_LOB.CLOSE(dest_loc);
    DBMS_LOB.CLOSE(src_loc);
    COMMIT;
    end;
    'DOC_LOC' is a directory I've set up that the user has access to.
    Interfacing with PHP this just looks like
    oci_parse($conn,"BEGIN php_upload_file('{$uploadfilename}','{$propername}',{$ownerid},:file_id); END;");
    Dead simple, right?
    I also do a bind command to pull out 'file_id' so I know the id that was just inserted.
    The other solution is
    $contents = file_get_contents($_FILES["uploadedfile"]["tmp_name"]);
    $lob = oci_new_descriptor($conn, OCI_D_LOB);
    $stmt = oci_parse($conn,
    "INSERT INTO files (employee_id, filename, data, file_id) VALUES(175,'"
    .$_FILES["uploadedfile"]["name"].
    "', empty_blob(), files_seq.nextval) RETURNING file_id, files.data INTO :file_id, :src_contents");
    where :src_contents is bound to a LOB and :file_id, as before, is bound to an INT:
    oci_bind_by_name($stmt, ':src_contents', $lob, -1, OCI_B_BLOB);
    oci_bind_by_name($stmt, ':file_id', $insert_id, -1, SQLT_INT);
    oci_execute($stmt,OCI_DEFAULT);
    In this case the last thing I do before the commit is
    $lob->save($contents);
    Both work fine, but what I need is this
    $contents = file_get_contents($_FILES["uploadedfile"]["tmp_name"]);
    $lob = oci_new_descriptor($conn, OCI_D_LOB);
    $stmt = oci_parse($conn,"BEGIN do_upload_file(:src_contents,'{$propername}',{$ownerid},:file_id); END;");
    oci_bind_by_name($stmt, ':src_contents', $lob, -1, OCI_B_BLOB);
    oci_bind_by_name($stmt, ':file_id', $insert_id, -1, SQLT_INT);
    oci_execute($stmt,OCI_DEFAULT);
    $lob->save($contents);
    oci_commit($conn);
    This omits error conditions (such as on $lob->save, etc.); it is simplified.
    The content of the procedure I changed as follows, but it seems untestable.
    CREATE OR REPLACE PROCEDURE do_upload_file (src_contents IN OUT blob, canonical_name in varchar2, owner in number, file_id IN OUT number)
    AS
    dest_loc BLOB;
    begin
    insert into files values(owner,canonical_name,empty_blob(),files_seq.nextval) returning files.data, files.file_id
    into dest_loc, file_id;
    dbms_lob.open(src_contents,DBMS_LOB.LOB_READONLY);
    DBMS_LOB.OPEN(dest_loc, DBMS_LOB.LOB_READWRITE);
    DBMS_LOB.COPY(dest_lob => dest_loc,
    src_lob => src_contents
    ,amount => DBMS_LOB.getLength(src_contents));
    DBMS_LOB.CLOSE(dest_loc);
    DBMS_LOB.CLOSE(src_contents);
    COMMIT;
    end;
    I don't get errors because I cannot figure out a way to run this procedure in my PL/SQL environment with valid data. (I can run it with a blank blob.)
    But when I work out the order of what's going on, it doesn't make sense: the commit in the procedure happens before the $lob->save(...), and thus it would never save the data. Nonetheless, it should at least create a record with an empty blob, but it does not. Whatever is wrong is beyond the error level that seems to be supported by PHP's oci_error function (unless I have not discovered how to turn all errors on?).
    In any case I think the logic is wrong here, but I'm not experienced enough to figure out how.
    To test it I would need to create a driver that loads an external file into a blob, and passes that blob into the procedure. Trouble is, even if I make a blob and initialize it with empty_blob() it treats it as an invalid blob for the purposes of the dbms_lob.copy procedure.
    If someone has solved this problem, please let me know. I would love to be able to do the file upload with just a single procedure.

    Thanks. In my estimation that is exactly the issue. But that doesn't help with a resolution.
    The actual file size: 945,991 bytes
    If Firefox is miscalculating the length (945991 in Safari/Chrome vs. 946241 in Firefox), then Firefox is reporting erroneously and this should be raised as a bug in Firefox. Would you agree?

  • Cannot duplicate files in finder

    I recently upgraded to Lion, and am just realizing now that I cannot duplicate files (Command-D) in the Finder. The progress window shows the copy being made, but the duplicate file, which used to show up just under the original file as "file name copy", does not appear. When I do a Spotlight search, the copied file is nowhere to be seen. I also tried doing an Option-drag to the Finder. That didn't work either. I was trying to make a copy of an image and then rename the new file to send to someone. I had to go through Preview, then unlock the file and then save it under another name. A ridiculously long process. Anyone have any ideas?

    Did you try selecting the item in the Finder,
    press Cmd+C,
    press Cmd+V?
    Presto, you have a dupe of the file that has its name appended with the number 1.

  • Deleting duplicate files in the I tunes folder

    How do you remove duplicate files from the iTunes folder if you need to restore the iTunes library?

    There are some programs for XP to find duplicate files. I wanted to delete duplicate music files, so I used the program Delete Duplicate Files for that purpose. It helped me. This program is not only for music, but for all files. And now there is more free space on my PC.

  • HELP!!! ... need to get Window Handle using PDA module

    I am trying to find a way to get the current window handle using the PDA module. I need this handle to call other functions via Windows DLLs. Has anyone done this before? Any help would be greatly appreciated (using Windows Mobile 6.1 and LabVIEW PDA 8.5). Thanks
    Greycat

    Thanks Mike ...
    I understand that this function does exist, but somehow I am not writing my wrapper properly or calling the function properly, because every time I build my LabVIEW app I get a "GetForegroundWindow is a missing VI or C file" error. I cannot for the life of me figure this out. What I was really asking is whether anyone has succeeded in calling this function and getting the window handle. What I really want to do is run my LabVIEW program on the PDA in full-screen mode with no title bar and no SIP button visible or accessible, but to my knowledge that will take a function call to aygshell.dll (namely the SHFullScreen function), and that function requires a window handle to work properly. Any more help would be appreciated. Thanks again
    Greycat
