XI Sender file adapter - How to process data and control files.

Hello all,
   I have the following requirement to fulfill: I am using an FTP client (XI Sender file adapter) to retrieve data from an FTP site. To make sure I am not picking up a data file that is currently being written to, 2 files are actually present on the FTP site (for each data file):
1. abc.ctrl (control file with no data in it. Indicates that the data file has been completely written).
2. abc.dat (actual data file).
  I want the file/ftp connector in XI to retrieve the data file (abc.dat) only if the control file (abc.ctrl) is present. After the processing of the data file is finished, both files (.dat and .ctrl) should be deleted.
  Is there an elegant and robust way to accomplish this?
Thanks for your help.

Hi Yves,
in my opinion there is no problem with files that are still being written when you use a polling file adapter, because the final file name should only become visible once the file has been transferred completely. I use many different file sender adapters and have never had any problems. After picking up the files I move them to the archive folders configured in the adapters, so that they cannot be processed a second time.
Regards
Ralph
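
If the control-file handshake described in the question has to be enforced explicitly, one option (outside of XI configuration) is a small custom poller that only fetches a .dat file when its .ctrl companion exists and deletes both afterwards. The sketch below is only an illustration of that logic, not XI configuration; the host name, credentials, and the use of Apache Commons Net's FTPClient are assumptions.

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import org.apache.commons.net.ftp.FTPClient;

public class CtrlFilePoller {
    public static void main(String[] args) throws IOException {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");      // hypothetical host
        ftp.login("user", "secret");         // hypothetical credentials
        ftp.enterLocalPassiveMode();

        // A data file is fetched only when its control file is present.
        String[] ctrlFiles = ftp.listNames("*.ctrl");
        if (ctrlFiles != null) {
            for (String ctrl : ctrlFiles) {
                String dat = ctrl.replaceFirst("\\.ctrl$", ".dat");
                try (OutputStream out = new FileOutputStream(dat)) {
                    if (ftp.retrieveFile(dat, out)) {
                        // Delete both files only after the data file was read completely.
                        ftp.deleteFile(dat);
                        ftp.deleteFile(ctrl);
                    }
                }
            }
        }
        ftp.logout();
        ftp.disconnect();
    }
}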

Similar Messages

  • How to create parameter and control file like filename + date

    Hello there
    I am trying to create parameter and control file with following command
    in SQLPLUS
    create pfile='/u03/oradata/WEBDB/backup/initWEBDB.ora' from spfile;
    In RMAN
    copy current controlfile to '/u03/oradata/WEBDB/backup/cf_longterm.cpy';
    how can I put date at the end of filename like
    initWEBDB8jan06.ora and cf_longterm8jan06.cpy
    Thanks in advance
    Lionel

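    One way to get the date into the file name is to build the command text in the script that drives SQL*Plus or RMAN and append a formatted date stamp to the name. The snippet below is only a rough illustration of that idea (sketched in Java purely for illustration; a shell script would be just as suitable, and the date pattern is an assumption based on the example names above).

    import java.time.LocalDate;
    import java.time.format.DateTimeFormatter;

    public class DatedBackupName {
        public static void main(String[] args) {
            // "dMMMyy" produces names like 8jan06, matching the examples in the question.
            String stamp = LocalDate.now()
                    .format(DateTimeFormatter.ofPattern("dMMMyy"))
                    .toLowerCase();

            String pfile = "/u03/oradata/WEBDB/backup/initWEBDB" + stamp + ".ora";
            String ctrl  = "/u03/oradata/WEBDB/backup/cf_longterm" + stamp + ".cpy";

            // The generated command strings would then be passed to SQL*Plus / RMAN.
            System.out.println("create pfile='" + pfile + "' from spfile;");
            System.out.println("copy current controlfile to '" + ctrl + "';");
        }
    }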

  • How to create redo log and control files in ASM in Linux RAC

    Hi Experts,
    I am going to maintain an Oracle 10g database on ASM as RAC on Linux Red Hat.
    I am new to this and have some questions.
    Normally speaking, the Oracle recommendation for an Oracle 10g database is to
    create 3 copies of the control file and
    create at least 2 redo log groups with mirrored members in the system.
    However, when I checked I found that
    the redo log files are in the FRA (+FLSdisk1) with no mirror,
    the control file is in the FRA (+FLSDISK1/),
    and the database files are in +DATA1/.
    There are no mirrors for the redo logs.
    In EM, I also could not find a place to enter the file names.
    We use ASM to hold the database to support RAC.
    Do I need to create the redo log files like this:
    ALTER DATABASE ADD LOGFILE GROUP 1 ('+FLSdisk1/sale/onlinelog/REDO01.LOG','+FLSdisk1/sale/onlinelog/REDO01_mirror.LOG') SIZE 1000M reuse;
    My boss told me that ASM is reliable.
    Do I need to create more directories to arrange the redo log and control files in ASM for the RAC system?
    Is the FRA a good place to store the control files and redo log files?
    Thanks
    JIM
    Edited by: user589812 on Jul 3, 2009 3:03 PM

    ASM is reliable, but a smart DBA is very careful. If ASM is doing the mirroring, this is like RAID doing mirroring: what happens if you accidentally delete one copy ... the other one disappears instantly. Not a good idea.
    With respect to redo logs you need a minimum of three groups, two members, and one thread per instance. So a 2-node cluster should, at a minimum, have 12 physical files (3 groups x 2 members x 2 threads).
    Not mirroring the redo logs, assuming multiple members, is not as critical.

  • How to load data using Control File in BW 7

    Hi All,
    I have a requirement to load data in BW using a control file. The development is done in BW 7.0. In BW 3.5, in the InfoPackage, there is an option FILE IS (Control File or Data File). Please suggest how to simulate the same in BW 7.0.
    Regards,
    Vikram

    Any suggestions?

  • How to copy, compare and control large projects with a deeply nested directory tree?

    MacBook Pro, Retina, Mid 2012.
    When copying large projects with sub- and sub-subdirectories, some files are copied with 0 bytes of data and others are not copied at all.
    Is there a command to copy and compare files across a deeply branched directory tree?
    Copying the directory level by level is very time-consuming.
    Ramob44

    Hi,
    It's good that you pasted the complete log file. In your environment you have to run this upgrade tool only once, from any of the middle tiers.
    The error you got in the precheck is quite simple to fix. All you have to do is run this script by connecting to the portal schema using SQL*Plus:
    Run dropupg.sql
    Location-------- /raid/product/OraHome_1/upgrade/temp/portal/prechktmp/dropupg.sql
    Then re-run the upgrade tool and let me know the status.
    Good luck
    Tanmai

  • How the Sender SOAP Adapter will retrieve the data to process further

    How will the Sender SOAP Adapter receive the data that it sends to the Integration Server for further processing?

    Hi,
    1. All details are always taken from the Sender Agreement.
    For the Sender File adapter and the Sender SOAP adapter, the details included are the Sender Service and the Sender Interface name and namespace; these are the details that make up the SOAP header when the message hits the Integration Engine.
    Once the corresponding adapter for the message is identified, i.e. the Sender SOAP adapter, the SOAP header of the message is then taken from the Sender Agreement of that adapter. This is the exact reason why one sender adapter can be involved in one and only one Sender Agreement.
    Regards
    Bhavesh

  • How to process large input CSV file with File adapter

    Hi,
    could someone recommend the right BPEL way to process a large input CSV file (4 MB or more, with at least 5000 rows) with the File Adapter?
    My idea is to receive data from the file (poll the UX directory for a new input file), transform it and then export it to one output CSV file (input for another system).
    I developed my process that consists of:
    - File adapter partnerlink for read data
    - Receive activity with checked box to create instance
    - Transform activity
    - Invoke activity for writing to output CSV.
    I tried this with a small input file and everything was OK, but when I try to use the complete input file, the process doesn't start and automatically goes to the OFF state in the BPEL console.
    Could I use the MaxTransactionSize parameter as in the DB adapter, should I batch the input file, or would another way help me?
    Any hint from you? I have to solve this problem by this Thursday.
    Thanks,
    Milan K.

    This is a known issue. Martin Kleinman has posted several issues on the forum here, with a similar scenario using ESB. This can only be solved by completely tuning the BPEL application itself, and throwing in big hardware.
    Also switching to the latest 10.1.3.3 version of the SOA Suite (assuming you didn't already) will show some improvements.
    HTH,
    Bas
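
    As a generic illustration of the batching idea (independent of the BPEL tuning mentioned above), the file can be read and handed off in fixed-size chunks instead of as one large message. This is plain Java, not the File Adapter itself; the chunk size and the handleBatch step are made-up placeholders.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.ArrayList;
    import java.util.List;

    public class CsvBatcher {
        static final int BATCH_SIZE = 500;   // rows per chunk, tune as needed

        public static void main(String[] args) throws IOException {
            try (BufferedReader in = Files.newBufferedReader(Paths.get("input.csv"))) {
                List<String> batch = new ArrayList<>();
                String line;
                while ((line = in.readLine()) != null) {
                    batch.add(line);
                    if (batch.size() == BATCH_SIZE) {
                        handleBatch(batch);   // placeholder for the transform + write step
                        batch.clear();
                    }
                }
                if (!batch.isEmpty()) {
                    handleBatch(batch);       // last, partial chunk
                }
            }
        }

        static void handleBatch(List<String> rows) {
            System.out.println("processing " + rows.size() + " rows");
        }
    }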

  • How to process data in the past based on data in the present

    hello guys,
    I have a problem in my LabVIEW programs: how do I process data in the past based on data in the present?
    I have a formula for self-organizing maps.
    The formula computes D, the distance for each neuron, and the neuron index with the smallest value is chosen. The results from the formula are D1=2, D2=5, D3=17, so the smallest value is 2, i.e. "2" from the weight [2 2] in the attached file.
    That value then goes into another formula:
    [2 2] + 0.5 ( [1 1] - [2 2] ) = [1.5 1.5]
    and the weight matrix will be
    [1.5 2 2]
    [1.5 3 5]
    I would appreciate any input/help on solving this
    thanks
    Attachments:
    dika.vi ‏16 KB
    weight.txt ‏1 KB
    data .txt ‏1 KB

    Hi Ronny Hanks,
    Moving your records from internal table into the database table depends upon various scenarios :-
    1. If you use INSERT statement.
    INSERT <database_table> FROM TABLE <internal_table>.
    But in this case, you need to make sure that you don't have any duplicate entries in your internal table that violate the key of the database table, else you will get a dump.
    INSERT <database_table> FROM TABLE <internal_table> ACCEPTING DUPLICATE KEYS.
    In this case, you are forcefully inserting duplicate records into your database table which may lead to data redundancy in your database table.
    2. If you use UPDATE statement.
    UPDATE <database_table> FROM TABLE <internal_table>.
    This will update the existing records in your database table from the internal table.
    3. If you use MODIFY statement.
    MODIFY <database_table> FROM TABLE <internal_table>.
    This statement works as a combination of the INSERT & UPDATE statements.
    Existing records (in database table) will be eventually updated/modified and new records (not in database table currently) will be successfully inserted into the database table.
    Hope this solves your problem.
    Thanks & Regards.
    Tarun Gambhir.

  • How to add data in a file

    hi,
    I have written the following code. I have a file and I want to add data to it.
    When I add data, the new data is written but I no longer have the previous data. What is the problem with my code? Can anyone help me? How can I add data while keeping the previous data? Please help me.
    String username=request.getParameter("UserName");
    String userage=request.getParameter("UserAge");
    String address=request.getParameter("UserAddress");
    String sex=request.getParameter("sex");
    FileWriter f = new FileWriter("d:\\download_dreamweaver\\Project_3\\WebContent\\SaveData.txt");
    f.append(username);
    f.close();
    With regards
    bina

    Looks like you're creating a new file (which overwrites the old one) every time. You need to open the file first and then append to it.
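
    For example, a minimal sketch using the append flag of FileWriter (the path is the one from the question):

    import java.io.FileWriter;
    import java.io.IOException;

    public class AppendExample {
        public static void main(String[] args) throws IOException {
            // Passing true as the second argument opens the file in append mode,
            // so the existing content is kept and new data is added at the end.
            try (FileWriter f = new FileWriter(
                    "d:\\download_dreamweaver\\Project_3\\WebContent\\SaveData.txt", true)) {
                f.append("new data" + System.lineSeparator());
            }
        }
    }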

  • I used Time Machine to restore on a formatted Mac. Now the HDD space has been reduced by 100 GB but I cannot see any of the files. How do I find and delete those 100 GB of data from the HDD?

    I used Time Machine to restore on a formatted Mac. Now the HDD space has been reduced by 100 GB but I cannot see any of the files. How do I find and delete those 100 GB of data from the HDD?

    dglenn9000 wrote:
    I created a new user account just to see if it was my user Library or if there was something wrong with my system. And the new user account is doing most of the same things so I will need to do a full restore anyway.
    Not necessarily. I'd suggest downloading and installing the "combo" update. That's a combination (thus the clever name) of all the updates to Leopard since it was first released, so installing it should fix anything that's gone wrong since then, such as with one of the normal "point" updates. Info and download available at: http://support.apple.com/downloads/MacOS_X_10_5_8_ComboUpdate Be sure to do a +Repair Permissions+ via Disk Utility (in your Applications/Utilities folder) afterwards.

  • How to read data from a file that was formatted by Excel?

    Hi everyone, I'm familiar with java.io and the ability to read from files. Can anyone tell me how to read data from a file that was formatted by Excel? Or at least give me some web references so that I can learn about it?

    http://jakarta.apache.org/poi/hssf/index.html
    HSSF stands for Horrible Spreadsheet Format, but it still works!
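
    For reference, a minimal sketch of reading the cells of a sheet with POI's HSSF API (the file name is an assumption; POI now lives at poi.apache.org rather than under Jakarta):

    import java.io.FileInputStream;
    import java.io.IOException;
    import org.apache.poi.hssf.usermodel.HSSFSheet;
    import org.apache.poi.hssf.usermodel.HSSFWorkbook;
    import org.apache.poi.ss.usermodel.Cell;
    import org.apache.poi.ss.usermodel.Row;

    public class ReadXls {
        public static void main(String[] args) throws IOException {
            try (FileInputStream in = new FileInputStream("data.xls")) {   // hypothetical file
                HSSFWorkbook wb = new HSSFWorkbook(in);   // .xls (binary) format
                HSSFSheet sheet = wb.getSheetAt(0);       // first sheet
                for (Row row : sheet) {                   // rows in the sheet
                    for (Cell cell : row) {               // cells in the row
                        System.out.print(cell.toString() + "\t");
                    }
                    System.out.println();
                }
            }
        }
    }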

  • File Adapter does not process the same file twice

    SOA: 11.1.1.4 (non-HA).
    I have a file adapter that triggers when a new file gets to a directory; the file is not deleted after the process triggers.
    The process is supposed to call other services via a mediator and if any remote fault happens it should rollback automatically and re-trigger with the same file at the next polling interval.
    I have literally 3 scenarios:
    1). The file gets picked up once, the process fails and the file is never picked up again.
    Msg in log: The file : /xx/xx/xx/abc.xml is being ignored as it has already been processed
    2). If the mediator only routes to one service after the file gets picked up, it works as expected (that is, it rolls back and retries at the next polling interval). If it has more than one sequential routing rule, I see the same error as above.
    3). File does not get picked up EVEN if I "touch" or rename the file.
    Msg in Log:
    File Adapter ProcessName Poller enqueuing file for processing :/xx/xx/xx/abc.xml
    File Adapter ProcessName Ignoring File : abc.xml as it is already enqued for processing.
    I have already checked, there is no permission issue.
    This is what my .jca file looks like:
    <adapter-config name="getFile" adapter="File Adapter" wsdlLocation="getFile.wsdl" xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
      <connection-factory location="eis/FileAdapter" UIincludeWildcard="*"/>
      <endpoint-activation portType="Read_ptt" operation="Read">
        <activation-spec className="oracle.tip.adapter.file.inbound.ScalableFileActivationSpec">
          <property name="DeleteFile" value="false"/>
          <property name="MinimumAge" value="5"/>
          <property name="SingleThreadModel" value="true"/>
          <property name="PhysicalDirectory" value="/xx/xx"/>
          <property name="Recursive" value="false"/>
          <property name="PollingFrequency" value="20"/>
          <property name="IncludeFiles" value=".*"/>
          <property name="UseHeaders" value="true"/>
          <property name="MaxRaiseSize" value="5"/>
          <property name="ListSorter" value="oracle.tip.adapter.file.inbound.listing.TimestampSorterAscending"/>
        </activation-spec>
      </endpoint-activation>
    </adapter-config>
    Thanks for looking into it in advance.
    Any help with the error messages will be appreciated.

    You have to use the MOVE operation: if any remote exception occurs, move the file to some other folder and then move it back to the folder where file pickup starts.
    That way the same file will be picked up the next time polling happens.
    It is considered good etiquette to reward answerers with points (as "helpful" - 5 pts - or "correct" - 10pts).
    Thanks,
    Vijay

  • How to output data to a file in SCC-SG04?

    I am using an SCC-2345 with an SCC-SG04 connected to an NI-6221 on Windows 2000 with VC6.
    How do I output data to a file like data1.dat in VC6 or in LabWindows/CVI?

    Hello mwibm,
    If you just want to do file input/output in LabWindows/CVI, I would take a look at the Formatting and I/O CVI library or the ANSI File I/O functions. For example, if you just want to write an array of data to a file, I would look at the ArrayToFile function or the fwrite/fputs/fprintf ANSI C functions included in stdio.h. The ANSI C functions will also work in Visual Studio. More information on the LabWindows/CVI File I/O functions can be found in CVI help, and more information on ANSI C functions can be found in CVI Help and online at various websites.
    Maybe you could further clarify what problems you are having and what kind of data you want to write to a file.
    Thanks.
    Wendy L
    LabWindows/CVI Developer Newsletter - ni.com/cvinews

  • How to share data in cfc files?

    I can save shared data (like a DSN) in Application.cfm so all
    the .cfm files can read it.
    But how do I share data in .cfc files, the way Application.cfm does? I
    hear about Application.cfc; may I use it to share data in .cfc files?
    Do you think it is OK for both Application.cfm and
    Application.cfc to exist in the site?
    Thanks
    Mark

    quote:
    Originally posted by:
    mark416
    I can save shared data (like a DSN) in Application.cfm so all
    the .cfm files can read it.
    But how do I share data in .cfc files, the way Application.cfm does? I
    hear about Application.cfc; may I use it to share data in .cfc files?
    Do you think it is OK for both Application.cfm and
    Application.cfc to exist in the site?
    Thanks
    Mark
    Don't use both. Things like a dsn are best defined in the
    onApplicationStart method of an application.cfc.

  • How to extract data from XML file with JavaScript

    Hi All,
    I am new to this group.
    Can anybody help me regarding XML?
    I want to know how to extract data from an XML file with JavaScript,
    and also how to use an API for XML.
    regards
    Nagaraju

    This is a Java forum.
    JavaScript is something entirely different from Java, even though the names are similar.
    Try another website with forums about JavaScript.
    For example here: http://www.webdeveloper.com/forum/forumdisplay.php?s=&forumid=3

Maybe you are looking for

  • Problem displaying jtable data

    Hello All, and thanks for any help. I have a jTable that is populated with an array of strings. During the course of the program these values get changed depending on the selection of a jcombobox. What I have noticed is that I am unable to set the in

  • Idoc Type  for Vehicle Master

    Hi Friends, Can any body tell me the Idoc type for 1. Vehicle Master 2. Transport unit and their respective transaction code to send data. Points would be awarded for useful answers. With Regards Vasu

  • Photoshop install error.. please help

    Why cant i install photoshop CS6? I downloaded it and when i launch the setup.exe thing, it loads and then at the end it shows a error.. Heres a pic of the error(in norwegian): http://gyazo.com/2ef750437165f74ff3967a026557dd3d What its saying is that

  • HT4796 Migration assistant had stopped with 17 minutes to go

    Migration Assistant from a Windows laptop to my new MacBook Pro (Mountain Lion) has stopped with 17 minutes still to go! Any help appreciated.

  • Fatal Internal Error MemoryManager.cpp

    I've created a sequence file in TestStand that uses the sub-VIs for reading out config files. If I run this on my development machine there is no problem. I tried to deploy everything onto a new target machine. I start to run the sequence and i'm abl