Empty file picked up although the file has a lot of data (SAP PI file adapter)

Hello All,
We are facing weird behaviour of an SAP PI file adapter channel,
in a without-mapping scenario on PI 7.31.
Sometimes the whole file is archived properly in the archive folder, but the SAP PI log shows a file size of 0 bytes,
and an empty file is sent to the receiver side,
although the file actually contains a lot of data.
When the same file is triggered again, it works fine: it is archived and sent as is.
Can anybody help me with this? Why does PI sometimes pick up a file with no data?
If the file is picked up by PI before it has been completely written by the third party, is it possible that the archived file has all the data even though an empty file was processed?
Thanks in advance,
Anant

Hi,
Thanks for reply.
I know this is a common issue of files being picked up before writing is complete, and I can use the "msecs to wait before modification check" setting to avoid it, but
my question is how the archive folder can contain the file with all the data the sender is sending.
Is it possible that the empty file (before writing is complete) is picked up by SAP PI, and that only after the pickup has finished does PI archive the file, which by then has been completely written by the third party in the source folder?
In other words, are file pickup and file archiving separate steps, so that the file that is picked up is not the one that is archived: once PI has processed the file, it checks the source folder again and archives and deletes the same file?
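For reference, the "msecs to wait before modification check" option is essentially a stability check on the file before pickup. A minimal Python sketch of that idea follows; this is only an illustration of the concept, not PI's actual implementation, and the function name and parameters are made up:

```python
import os
import time

def wait_until_stable(path, checks=3, interval=1.0):
    """Return True once the file size stops changing between
    consecutive checks, i.e. the writer has (probably) finished."""
    last = -1
    stable = 0
    while stable < checks:
        size = os.path.getsize(path)
        if size == last:
            stable += 1  # size unchanged since last look
        else:
            stable = 0   # still growing; restart the count
        last = size
        time.sleep(interval)
    return True
```

A poller like this only reduces the race window; it cannot fully rule out picking up a half-written file, which is why exclusive-write-then-rename handovers are generally safer.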
Thanks
Anant

Similar Messages

  • Best way to stream lots of data to file and post process it

    Hello,
    I am trying to do something that seems like it should be quite simple but am having some difficulty figuring out how to do it.  I am running a test that has over 100 channels of mixed sensor data.  The test will run for several days or longer at a time and I need to log/stream data at about 4Hz while the test is running.  The data I need to log is a mixture of different data types that include a time stamp, several integer values (both 32 and 64 bit), and a lot of floating point values.  I would like to write the data to file in a very compressed format because the test is scheduled to run for over a year (stopping every few days) and the data files can get quite large.  I currently have a solution that simply bundles all the date into a cluster then writes/streams the cluster to a binary file as the test runs.  This approach works fine but involves some post processing to convert the data into a format, typically a text file, that can be worked with in programs like Excel or DIAdem.   After the files are converted into a text file they are, no surprise, a lot larger than (about 3 times) the original binary file size.
    I am considering several options to improve my current process. The first option is writing the data directly to a TDMS file, which would allow me to quickly import the data into DIAdem (or Excel with a plugin) for processing/visualization. The challenge I am having (note, this is my first experience working with TDMS files and I have a lot to learn) is that I cannot find a simple way to write/stream all the different data types into one TDMS file and keep each scan of data (containing different data types) tied to one time stamp. Each time I write data to file, I would like the write to contain a time stamp in column 1, integer values in columns 2 through 5, and floating point values in the remaining columns (about 90 of them). Yes, I know there are no columns in binary files, but this is how I would like the data to appear when I import it into DIAdem or Excel.
    The other option I am considering is just writing a custom data plugin for DIAdem that would allow me to import the binary files I am currently creating directly into DIAdem. If someone could suggest which option would be best, I would appreciate it. Or, if there is a better option that I have not mentioned, feel free to recommend it. Thanks in advance for your help.
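    Not a LabVIEW answer, but the fixed-size binary record layout described above (one time stamp, a few integers, many floats per scan) can be sketched in Python with the standard struct module. The field counts here are assumptions for illustration; a real rig would have roughly 90 float channels:

```python
import struct

# Hypothetical record layout: one float64 timestamp, two int32s,
# two int64s, then three float32 sensor values.
RECORD = struct.Struct('<d 2i 2q 3f')

def write_scan(f, timestamp, i32s, i64s, floats):
    """Append one fixed-size binary record to an open file."""
    f.write(RECORD.pack(timestamp, *i32s, *i64s, *floats))

def read_scans(path):
    """Read the whole file back as a list of tuples for
    post-processing (one tuple per scan)."""
    with open(path, 'rb') as f:
        data = f.read()
    return [RECORD.unpack_from(data, off)
            for off in range(0, len(data), RECORD.size)]
```

    Because every record has the same size, the reader can seek directly to any scan, which keeps post-processing cheap even for very large files.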

    Hello,
    Here is a simple example; of course, here I only create one value per iteration of the while loop, for simplicity. You can also set properties of the file, which can be useful, and set up different channels.
    Besides, you can use multiple groups for more flexibility in data storage. You can think of channels as columns and groups as sheets in Excel; that is how you will see your data when you import the TDMS file into Excel.
    I hope it helps. Of course there are many more advanced features of TDMS files; read the help docs!

  • Do I have to back up my files first?

    I have a MacBook Air with Mac OS X version 10.7.5; I assume that is Lion? I am thinking about upgrading to the new Mountain Lion. I use a program, VMware Fusion, to run Windows also. I believe it runs in the background and loads Windows in a partitioned hard drive (but I'm not sure; it works fine). Is it OK to upgrade, and do I have to back up the files on the Mac side of the computer, and if so, how? I am not so worried about the files on the Windows side because I have Carbonite backup on that side. What about the software too? Will everything have to be reinstalled? I'm sorry, I don't know a lot about computers and don't want to mess up this piece of art..lol..

    cnochumson wrote:
    ... do I have to back up the files on the mac side of the computer and if so how? ...
    It is prudent that before attempting any major upgrade you back up your current system.
    Get yourself an external hard drive and create a bootable clone backup of your current hard drive.
    By far the easiest way to make such a backup is to use something like
    SuperDuper  http://www.shirt-pocket.com/
    or CCC  http://www.bombich.com/
    That way, should anything untoward happen during the upgrade,
    you will NOT LOSE ANYTHING.
    (Get an EHD that is at least equal in size to your current drive.)

  • I can't open my FileMaker Pro files in iCloud. I was told to buy Pages and Numbers, but I still can't open them, and I have a lot of data on file

    Is there any way I can download them? All I get when I create a folder to burn a disk is an alias, but will this work? I have another two Macs: one is an iMac, the other is a 660 AV from 1994. Maybe I can create a CD and open it on the older iMac?

    Thanks for all the tips. Yes, I did open the alias on my older iMac. Now I can relax.

  • HT4859 I have an online iCloud account. When I sign in to it on my PC I see no photos or music stored there. I can see notes added there. I am told I have lots of data stored and it would be wise to increase the storage capacity but can't see virtually an

    I have an online iCloud account. When I sign in to it on my PC I can see changes I make under Notes, but there is no sign of any of the photos or music there which are on my iPad. I have paid to enlarge the data capacity saved there, one step above the basic level. It's now a year on, and because I can't see anything there on iCloud other than a few simple notes, I decided to reduce the capacity to the basic level again. Apple have now sent me an email saying that if I do that I will lose data, as I have more stored there than the basic capacity. Why then can't I see this data?

    "I still dont understand why I dont see any of these camera roll photos (or any) when I sign in on line on my pc to my icloud account.
    If photos cant be seen there, how do you arrange for an accidently deleted photo to be restored from there into the iPad?"
    If you set up photostream correctly on your computer/ipad and instructions can be found here:
    http://www.apple.com/icloud/setup/
    it automatically created photostream folder in your Pictures/my pictures folder for Windows or iphoto for Mac.
    You will find all your pictures in there, no need to go online. If you didn't use Photostream -start doing so now.
    Otherwise you pictures are included in your back up, but that is an image file. What it means you will not be able to retreive one picture out of back up. You can however restore from that back up - instructions here
    iOS: How to back up - Apple - Support
    You probably do not want to do that for one picture, or even two but the option exists and steps are also in the article above.
    "You say music does not take up i Cloud space . Does that mean it is not saved there in any way?"
    What I meant that whatever is saved - does not go against 5 gigs of storage that apple gives you - unlimited space available, but only for the stuff you purchased in itunes.
    "If it is saved but cant be seen, how do I set about restoring a music track that was deleted accidentily in the i Pad?"
    It can be seen and instructions here:
    Downloading past purchases from the App Store ... - Apple - Support

  • When a pdf file is accessed through an online link nothing appears although I can open and read pdf files otherwise.

    I uninstalled and then reinstalled Firefox 7, but I still get a blank page when I click on a PDF file link. I have no trouble reading PDF files stored on my PC. I use Foxit Reader.

    See the following discussion:
    https://discussions.apple.com/message/17722116#17722116

  • Determining File Name in Info Package under External Data

    Determining File Name in InfoPackage under External Data
    I am on SAP BW 3.0. A system is sending a flat file every few days
    with a date-time stamp, e.g., d:\loaddar\file_20080212_122300.csv.
    I know that in an InfoPackage one can create a routine under External Data to determine the file name. I have seen
    examples where people determine the file name based on date. Since my file has a time stamp, what code do I write to pick the file? Is there a way to read one or more files and
    determine the file name?
    I am new to SAP BW and ABAP. However, I have a lot of experience with Oracle and Java.
    Can someone show me how this is done? I am looking for some sample code as well.
    Thanks in advance. I will really appreciate your help.
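    The routine itself has to be ABAP, but the selection logic is language-neutral. Here is a Python sketch of picking the file with the latest embedded timestamp; the file-name pattern is assumed from the example path above, and the function name is made up:

```python
import glob
import os
import re

# Assumed naming convention: file_YYYYMMDD_HHMMSS.csv
PATTERN = re.compile(r'file_(\d{8})_(\d{6})\.csv$')

def newest_file(directory):
    """Return the path whose embedded date+time stamp is latest,
    or None if no file matches the pattern."""
    candidates = []
    for path in glob.glob(os.path.join(directory, 'file_*.csv')):
        m = PATTERN.search(os.path.basename(path))
        if m:
            # 'YYYYMMDDHHMMSS' strings sort chronologically
            candidates.append((m.group(1) + m.group(2), path))
    return max(candidates)[1] if candidates else None
```

    Sorting the zero-padded timestamp strings lexicographically is the same as sorting chronologically, which keeps the routine simple.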

    Hello Prem,
    I also used to get files suffixed with date & time, and I found it very difficult to pick up the file from the application server using a routine in the InfoPackage. I then asked for the suffix to be changed to date only, and it was easy to pick the file using a routine. In your case, though, the files are coming more than once, so you should write a small unix script to concatenate these files into a single file with a date-only name, and then execute the InfoPackage to load it.
    Cheers!
    Sanjiv
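    Sanjiv's suggestion is a small unix script, but the merge step can be sketched in any language. A Python version under the same assumptions (the pattern and output name here are placeholders):

```python
import glob
import os

def merge_files(directory, pattern, out_name):
    """Concatenate all files matching pattern into one output file
    and remove the originals, so the InfoPackage sees a single
    date-only file."""
    paths = sorted(glob.glob(os.path.join(directory, pattern)))
    out_path = os.path.join(directory, out_name)
    with open(out_path, 'wb') as out:
        for path in paths:
            with open(path, 'rb') as src:
                out.write(src.read())
            os.remove(path)  # originals are consumed by the merge
    return out_path
```

    Sorting the paths first keeps the merged rows in timestamp order when the names embed timestamps.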

  • Problem with Date - Text File Source and Oracle Target

    Hi All,
    I have source data (a text file) with a date column in the format 'MM/DD/YYYY'. My target is Oracle. I am using the LKM FILE TO SQL and IKM SQL Control Append. When I execute this interface, I get an error:
    7000 : null : com.sunopsis.jdbc.driver.file.b.i
    com.sunopsis.jdbc.driver.file.b.i
    at com.sunopsis.jdbc.driver.file.b.f.getColumnClassName(f.java)
    at the Load Data step.
    How do I load date columns from a text file into Oracle tables? Please help me resolve this.
    Thanks in Advance,
    Ram Mohan T

    The worst solution is to define the date in your text file as a string and then rebuild the date type in Oracle,
    with something like
    TO_DATE(SUBSTR(myfield,1,2) || SUBSTR(myfield,4,2) || SUBSTR(myfield,7,4), 'MMDDYYYY')
    or, more directly, TO_DATE(myfield, 'MM/DD/YYYY').
    This is maybe the worst solution, but it should work.
    Regards
    Brice

  • How to insert data from file into matlab script node

    I have interfaced input data from a file to be processed using a MATLAB script node. The problem is that I could not capture the multiple data values in the MATLAB script node and convert them into a matrix. Further, I need to process the data by plotting graphs from it. Thank you in advance for the advice.

    Zarina,
    To clarify your problem: you have a script node containing your MATLAB code. Are you then using the standard LV functions to load in your data from a file and pass it into the script node?
    Regards
    Tristan

  • Handling large xml data source files

    Post Author: LeCoqDeMort
    CA Forum: Crystal Reports
    Hello. I have developed a Crystal Report that uses an XML data file as a data source (supported by an XML schema file). I have successfully run the report for small sample XML data files, but I now have a realistic data file that is around 4 MB in size. When I run the report within the Crystal Reports designer (ver. 11.0.0.1994), I get a "failure to retrieve data from database" error. Is there some sort of configurable limit on data file/cache size that I can adjust, if indeed that is the problem? Thanks, LeCoq


  • Encoded data to  files

    Hi Experts,
    My requirement is that I have to write encoded data to files using the file receiver adapter.
    I am also using FCC.
    Can anybody help me with how to approach this?
    Thanks & Regards,
    Mp Reddy

    Hi,
    >>>>My requirement is that I have to write encoded data to files using the file receiver adapter.
    In the file receiver adapter, specify the encoding as follows:
    - select File Type
    - then Text
    - next, under File Encoding, specify a code page.
    Examples of code pages can be found at:
    http://help.sap.com/saphelp_nw04/helpdata/en/bc/bb79d6061007419a081e58cbeaaf28/content.htm
    This way your file will be encoded.
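    Outside of PI, the effect of choosing a code page can be seen with a short Python sketch; 'iso-8859-1' is just an example value, not a recommendation:

```python
import tempfile

# Sample payload containing non-ASCII characters.
text = 'Grüße aus München'
path = tempfile.NamedTemporaryFile(delete=False).name

# Write with an explicit code page, as a receiver channel would.
with open(path, 'w', encoding='iso-8859-1') as f:
    f.write(text)

# Read the raw bytes back to inspect the encoding.
with open(path, 'rb') as f:
    raw = f.read()

# In Latin-1 every character here is a single byte, unlike UTF-8.
assert len(raw) == len(text)
assert raw.decode('iso-8859-1') == text
```

    A receiver that assumes the wrong code page will see mojibake, so sender and receiver must agree on the encoding.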
    Regards,
    Michal Krawczyk

  • Cannot get LR4 to read xmp files from a Nikon D4, Reads the D3S xmp files fine.

    I am doing some editing work for a magazine. The lead editor had issues with Adobe Camera Raw viewing some of my edits. The XMP files are there, but for some reason they are not loading. We narrowed it down to just the Nikon D4 files. I just had her download a trial version of LR4, and we are getting the same issue. The XMPs are there in the same folder as the .nef files. Import into Lightroom, and all D4 files do not show my edits.
    We are both running Lightroom 4.3, Camera Raw 7.3.

    LR may or may not have up-to-date XMP files if you haven't explicitly written them.
    I have not encountered the XMP-ignored-if-timestamp-before bug, but I believe it involves the actual Windows/Mac file-modification timestamp, not something inside the file.
    My suggestion is to write all the XMP files again and resend those to her. That way you'll know their contents are current and that their timestamps will have been updated to something after the raw files' timestamps. If the images are all in one or a few folders, then the easiest way to update all their XMPs is to right-click on a folder in Library and choose Save Metadata. If the photos are more scattered but you can select them all in the Library grid view, just right-click on one and choose Save Metadata to Files (make sure it says Files, not File).
    I assume both of you are on Windows, not one on Windows and one on a Mac?

  • Data log file refnum - what is it?

    Hello all,
    I want to know what a data log file refnum is and how to use it.
    Thanks.

    A datalog file is a file that stores data as a sequence of records of a single, arbitrary data type that you specify when you create the file. Although all the records in a datalog file must be a single type, that type can be complex. For instance, you can specify that each record is a cluster that contains a string, a number, and an array.
    You could use a datalog file refnum if, for instance, you were creating a subVI which will be accepting a datalog file as an input. You could use this refnum to write to the file and perform other actions on it.
    J.R. Allen
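    LabVIEW's datalog format itself is proprietary, but the concept J.R. describes (a file holding a sequence of records of one complex type) can be sketched in Python with pickle. The record fields below are made up to mirror the cluster example of a string, a number, and an array:

```python
import pickle

def append_record(f, record):
    """Append one pickled record to an open binary file."""
    pickle.dump(record, f)

def read_records(path):
    """Read records back one by one until end of file."""
    records = []
    with open(path, 'rb') as f:
        while True:
            try:
                records.append(pickle.load(f))
            except EOFError:
                break  # no more records in the file
    return records
```

    As with a datalog refnum, a reader/writer helper like this lets a sub-routine accept the file as an input and operate on whole records rather than raw bytes.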

  • Jazn-data.xml file, authentication in web.xml and embedded oc4j

    I've defined new security roles, security constraints, and an authentication method in my web.xml file, and I've created a new jazn-data.xml file that holds the different users and their groups.
    If I want to link the security roles of the web.xml file to the roles in the jazn-data.xml file, do I always need an orion-application.xml file? Is this correct?
    If I deploy an application like this to the embedded container, the security won't work; I'm always getting the '403 Forbidden' page.
    Can somebody point out how I need to define security for an embedded and standalone environment when using authentication and a jazn-data.xml file, how I deploy this to a standalone OC4J, and how to get the same application to work in the embedded OC4J?

    hi "romanna"
    Part of the answer to "Can somebody point out how I need to define security ..." can probably be found in the "Oracle ADF Developer's Guide", which has "18 Adding Security to an Application".
    success
    Jan

  • Photos App and images that have no place data

    I am considering moving to the Photos App from iPhoto, but I have a large number of old photos and documents that have no location data. Most were scanned several years ago, such as genealogy records, birth certificates, etc. How does Photos handle these images? Do they appear in Collections and Moments like other images even though they have no location data? Must I assign each some arbitrary location if I want the images to stay together? Should I also alter the "time the image was taken" as well? If not, I assume they will be sorted according to the time the image was added to iPhoto.

    Photos without location data will appear in Moments based on their dates. If a photo has no capture date, the file creation date will be used.
    Must I assign each some arbitrary location if I want the images to stay together?
    How are your photos stored now? If they are in iPhoto, you could assign Places to them and adjust the dates of your scans to group them by date in a meaningful way before migrating the library to Photos.
    But Photos will also allow you to create albums or smart albums based on keywords and other tags. iPhoto Events will be mapped to albums.
