Reading data flow files

Hello,
I would like to know if it's possible to read DFD (dataflow) files. I have included an example of the file format.
Thanks in advance
B Bakels
Labview CLD , Engineer/Manager
Promedes and DSM
using LV 7.1, 8.0, 8.2, 8.5 and 2009 SP1
http://www.promedes.nl
Attachments:
04062008TW3001mm20mms.zip ‏353 KB

Hi bartb,
well, we know that you attached an example.
But we don't know the file format!
We can only guess:
-The first 0x1270 bytes contain header data (identifier, version numbers, date/time strings, information on the hardware used, channels, etc.). You have to know the format to get it into human-readable form.
-The remaining bytes (starting at offset 0x1270) seem to contain 16-bit data (matching the PCM-DAS16 hardware used?). The data seem to be saved in big-endian order...
The attached VI will load the data part of the example file - you have to interpret the data yourself, as I neither know the format exactly nor know what those data represent...
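Since the attached VI is LabVIEW (graphical), here is the same guess sketched in Java for reference. The header size 0x1270 and the big-endian signed 16-bit interpretation are only the assumptions from this post, not a confirmed format:

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ReadDfd {
    // Assumed header size, taken from the guess above (0x1270 bytes).
    static final int HEADER_SIZE = 0x1270;

    // Skips the (unknown-format) header and reads the rest of the file
    // as big-endian signed 16-bit samples.
    public static short[] readSamples(String path) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            in.skipBytes(HEADER_SIZE);
            short[] samples = new short[in.available() / 2];
            for (int i = 0; i < samples.length; i++) {
                samples[i] = in.readShort(); // DataInputStream reads big-endian
            }
            return samples;
        }
    }
}
```

Whether the samples are signed or unsigned, and what they represent physically, still depends on the real file format.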
Message Edited by GerdW on 07-10-2008 02:09 PM
Best regards,
GerdW
CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
Kudos are welcome
Attachments:
ReadDFD_LV801.vi ‏47 KB

Similar Messages

  • Reading data from file in EJB

I would like to read data from files in EJBs (Stateless Session Beans, I think). I've heard that the EJB specifications don't allow using the java.io package to access the filesystem. How can I solve this problem? Possible solutions I've thought of are:
* use a webserver to store the data
* put the files in a jar and access them using getResource() on the classloader
Could you comment on this, please?
greetings

    The specification states:
    "An enterprise bean must not use the java.io package to attempt to access files and directories in the file system.
    The file system APIs are not well-suited for business components to access data. Business components should use a resource manager API, such as JDBC, to store data."
From this I understand that I cannot access files directly, but it does not specify that I can't use the java.io package at all. It specifically says not to use the java.io package to access files and directories; it doesn't say anything about using the classes for other purposes.
However, a little lower the specification states:
"The enterprise bean must not attempt to create a class loader; obtain the current class loader; set the context class loader; set security manager; create a new security manager; stop the JVM; or change the input, output, and error streams.
These functions are reserved for the EJB Container. Allowing the enterprise bean to use these functions could compromise security and decrease the Container's ability to properly manage the runtime environment."
I'm not sure how to interpret this, but I believe this rule makes my solution invalid.
    Please comment on this,
    thanks
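For reference, the classloader idea mentioned in the question can be sketched like this (the resource name is just a placeholder; whether `getClassLoader()` itself counts as "obtaining the current class loader" under the spec wording is exactly the ambiguity discussed above):

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class ResourceLoader {
    // Loads a properties file packaged inside the bean's jar via the
    // classloader, instead of touching the filesystem with java.io.File.
    public static Properties loadConfig(String resourceName) throws IOException {
        Properties props = new Properties();
        try (InputStream in = ResourceLoader.class.getClassLoader()
                .getResourceAsStream(resourceName)) {
            if (in == null) {
                throw new IOException("Resource not found: " + resourceName);
            }
            props.load(in);
        }
        return props;
    }
}
```

For read-only configuration data this is a common workaround; for read-write data, a resource manager API such as JDBC is what the spec recommends.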

  • How to read *.dat type file

Hello!
Can anyone help me with reading *.dat files using LabVIEW functions? Previously I read them via a MATLAB script, but after compiling to an executable I could not read the *.dat files. It might be because I'm working in LV 6i, and in order to read via MATLAB 6.5 I pasted matlabscript.dll from LV 7 into the main 6i location. Now I want to load these .dat files using LabVIEW functions, and I can't manage it.
I've already tried some examples from the Help, and they don't work. There are slightly more than 2 million samples in one column in this file.
Can anyone show me how to do that? I would appreciate it.
thanks
    michal m
    Attachments:
    260701-1405V1.zip ‏1 KB

LabVIEW is always big-endian (on any platform), but most other programs are little-endian on Intel machines (Windows). See also the link Donald posted.
Your particular code still has a few bugs. Since you are reading U16, the number of data points is half the number of bytes in the file. You are trying to read twice as many and always get Error 4 (EOF encountered). To avoid this error, you need to divide the byte count by two before reading. Also watch out for correct representations: your record(s) indicator is I32 instead of U16, causing unnecessary extra memory usage. ... and don't forget to close your file when done reading.
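LabVIEW is graphical, so here is the same arithmetic sketched in text form in Java as an illustration only (class and method names are invented for the example): the sample count is the byte count divided by two, the data are read big-endian and unsigned, and the file is closed when done.

```java
import java.io.DataInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

public class ReadU16Dat {
    // Reads a big-endian U16 data file. Computing count as bytes/2 is what
    // avoids the Error 4 (EOF) condition described above.
    public static int[] readU16(String path) throws IOException {
        File f = new File(path);
        int count = (int) (f.length() / 2); // bytes / 2 = number of U16 samples
        int[] data = new int[count];        // int holds the full 0..65535 range
        try (DataInputStream in = new DataInputStream(new FileInputStream(f))) {
            for (int i = 0; i < count; i++) {
                data[i] = in.readUnsignedShort(); // big-endian, unsigned 16-bit
            }
        } // try-with-resources closes the file, as recommended above
        return data;
    }
}
```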
    Message Edited by altenbach on 05-30-2005 09:19 AM
    Message Edited by Support on 05-31-2005 08:59 AM
    LabVIEW Champion . Do more with less code and in less time .
    Attachments:
    ReadU16Dat.png ‏6 KB

  • Read data in file line by line

    Hello,
I have a question to ask you all about reading files.
How do I read a file line by line and then print it out to the screen?
For example, my file file.txt contains:
my line 1
my line 2
The program should read the 2 lines and print them out to the screen.
    thanks very much

Thanks for your code, but it does not work. I can compile the following code, but the content of the file is not printed out yet.
Why is that?

Well, in addition to simply compiling the code, you have to execute it.
Also, javajunior left open certain parts of the code which you were supposed to finish, like the exception handling and setting the filename variable. These are things that really only you can do.
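A minimal, complete version of the kind of code javajunior presumably sketched might look like this (the filename and the simple printStackTrace handling are the parts you are meant to adapt):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class PrintLines {
    // Reads the file line by line, prints each line to the screen,
    // and returns the lines that were read.
    public static List<String> printFile(String filename) throws IOException {
        List<String> lines = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new FileReader(filename))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // print each line to the screen
                lines.add(line);
            }
        }
        return lines;
    }

    public static void main(String[] args) {
        try {
            printFile("file.txt"); // the example file from the question
        } catch (IOException e) {
            e.printStackTrace(); // minimal exception handling - adapt as needed
        }
    }
}
```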

  • Read Data XML Files or Xml Schema

    Hi
I need to load info from XML files or schemas using JavaScript.
How do I connect to an XML schema or XML file from JavaScript?
    Thanks

    This product does look to be ok, however it is missing a couple features that I would like the product to have.
    I would like this product to be able to insert/update data across multiple tables. Being able to do such things as generate foreign key relationships.
    It doesn't seem as though the XML being generated can be customized very easily. Some XML formats I deal with are very complex (such as X12/EDI).
    I would really like a graphical mapper.
    Here is a note from the doc:
    Storing XML Data Across Tables
    Currently the XML SQL Utility (XSU) can only store data in a single table. It maps a canonical representation of an XML document into any table or view. But there is a way to store XML with XSU across tables. You can do this using XSLT to transform any document into multiple documents and insert them separately. Another way is to define views over multiple tables (using object views if needed) and then do the insertions into the view. If the view is inherently non-updatable (because of complex joins), then you can use INSTEAD OF triggers over the views to do the inserts.
    This product looks really good, I was hoping that Oracle had something comparable:
    http://www.hitsw.com/products_services/xml_platform/allora.html

  • How to read data from a file in OSB

    hi guys,
Recently, I've had a problem with reading a file from a specific location. I followed this post, OSB 11g - Read or Poll File in OSB - Oracle Fusion Middleware Blog, and now
I know how to read a file. However, it does not work as expected, because I've found no way to read the data from the file, and therefore no chance to manipulate the data, such as assigning it to a variable or extracting from it.
Hence, is there any way to read data from a file using a proxy service in OSB, without Java code?
By the way, supposing there is no way to read data from a file in OSB, what purpose does the approach in the post above serve?
    Many thanks in advance

    http://jakarta.apache.org/poi/hssf/index.html
    HSSF stands for Horrible Spreadsheet Format, but it still works!

  • How to read and writre file into remote machine in the network

    HI Experts,
I want to write data to and read data from a file on a remote machine (not on the application server or the presentation server). Is this possible in ABAP?
    thanks in advance
    With Regads
    Naidu

    Hi naidu,
1. We can use a path of this type:
   \\computername\folder\file.ext
2. We can use this in GUI_UPLOAD;
   it will run on the presentation server,
   connect to \\computername,
   and read the file contents.
    regards,
    amit m.

  • Error when i try to read a R3 dat file in a data flow

    Dear all,
The scenario is that I read data from SAP R/3 using an R/3 data flow and put it into an R/3 flat file. Then I try to read that file, along with some other tables from R/3, in an R/3 data flow. But while executing, it gives an error saying it can't open the file:
    3208     2864     R3C-150607     8/23/2009 11:16:59 AM     |Dataflow DF_DeltaNewInfoRecord_SAP
    3208     2864     R3C-150607     8/23/2009 11:16:59 AM     Execute ABAP program <D:/ABAP_PUR/NewInforecord.aba> error <    Open File Error --  D:\ABAP_PUR/InfoRecord.dat>.
    1936     2556     R3C-150607     8/23/2009 11:16:59 AM     |Dataflow DF_DeltaNewInfoRecord_SAP
    1936     2556     R3C-150607     8/23/2009 11:16:59 AM     Execute ABAP program <D:/ABAP_PUR/NewInforecord.aba> error <    Open File Error --  D:\ABAP_PUR/InfoRecord.dat>.
Can you please help me? Thank you very much in advance.
    Regards
    Smijoe
    Equate Petrochemicals

Place ABAP_PUR/InfoRecord.dat in the SAP working directory. This should help.

  • Reading A xml file and sending that XML Data as input  to a Service

    Hi All,
I have a requirement to read an XML file (I am using a File adapter to read it) and map the data in that XML to a service (schema) input variable.
An example of the XML file content I have to read is below:
      <StudentList>
        <student>
           <Name> ravi</Name>
           <branch>EEE</branch>
          <fathername> raghu</fathername>
        </student>
      <student>
           <Name> raju</Name>
           <branch>ECE</branch>
          <fathername> ravi</fathername>
        </student>
</StudentList>
I have to pass the data (ravi, EEE, raghu, etc.) to the service input variable. That invoked service's input variable (schema) has a schema similar to the one above.
My flow is like this:
  ReadFile file adapter -------------------> BPEL process -----> Target Service. I am using a Transform activity in the BPEL process to map the data from the XML file to the service.
I am using the above XML file as a sample in the Native Data format (to create the XSD schema file).
After building the process, I verified that the file adapter polls the data and receives the file (I can see the View XML document in the EM console flow).
But the Transform activity produces nothing; it is not mapping the data. I am getting blank data in the Transform activity, with only element names, like below:
--------------------------------------------------------------------------- EM console audit trail (I am giving this so you can clearly see what is happening) -----------------------------------------------------
       -ReceiveFile
            -some datedetails      received file
              View XML document  (This xml contains data and structure like above  xml )
        - transformData:
            <payload>
              <InvokeService_inputvariable>
                  <part name="body">
                     <StudentList>
                         <student>
                           <name/>
                            <branch/>
                            <fathername/>
                         </student>
                   </StudentList>
              </part>
             </InvokeService_inputvariable>
Why am I getting this? Is there a problem with the native data format configuration?
Please help me out with this issue, as I am running out of time.

    Hi syam,
Thank you very much for your replies so far; I have made some progress in my task.
As you said, I could have put the default directory in composite.xml, but what happens is that every day a new final subdirectory gets created in the 'soafolder' folder. What I mean is that in the c:/soafolder/1234_xmlfiles folder, '1234_xmlfiles' is not created manually; it is created automatically by executing some jar.
Basically, we can't know the subfolder name until it is created by the jar with its own logic, whereas the main folder (soafolder) is always the same.
I will give you an example with our folder names so it is easier to understand.
1) Yesterday's folder structure: 'c:/soafolder/130731_LS'. The '130731_LS' folder is created automatically by executing some jar file (it has its own logic to control and create the subdirectories, which is not in our control).
2) Today's folder structure: 'c:/soafolder/130804_LS'. The folder is created automatically at a particular time, and the XML files are loaded into it. The number part (130731, 130804) changes each time; I think the number indicates a date like 2013 July 31st, but I have to enquire about this.
Our challenge: we can't just put the default or full path in composite.xml and poll with the file adapter, and we don't want to change the path in composite.xml every time. The process should somehow know the folder path (I don't know whether this is possible or not), so that the file adapter can poll the files in the newly created subfolder each day.
I hope you can understand my requirement. Please help me out in this regard.

  • Error in Reading data from a xml file in ESB

    Hi,
I created an inbound file adapter service which reads data from an XML file and passes it to the routing service, and from there updates the database...
(everything created in JDeveloper)
But I am getting an error... the database is not being updated... when I check the database (select * from table) it shows one row selected, but I can't find the data...
I also did the transformation mapping...
I think there may be some error in reading the data from the XML file, but I'm not sure...
Please reply to this as soon as possible; it's very urgent.

    Michael R wrote:
    The target table will be created when you execute the interface, if you set the option on the flow tab as instructed in step #6 of the "Setting up ODI Constraint on CLIENT Datastore" Section.
    Option     Value
CREATE_TARG_TABLE     true

Hi Michael,
That was not the answer I needed; I am sorry that I was unable to clarify my question.
The project executed successfully with some warnings. The target table is automatically created in the database and is populated with data. But when I right-click the Target Datastore (in the Mapping tab of the Interface) and then select Data to view the data inserted in the target table, the already-inserted data is not shown by the View Data operation; I get an error instead.
I am facing this error at step 10 of
Creating a New ODI Interface to Perform XML File to RDBMS Table Transformation,
where it says:
Open the Interface tab. Select Mapping tab, right-click Target Datastore - CLIENT, and then select Data. View Data inserted in the target table. Close Data Editor. Close the tabs...
In my case, when I use SQL Developer I can see the data successfully inserted in my target table and also in the error table (data that can't satisfy the constraint). But I was unable to verify this by following the 10th step above, and got this error.
    Thanks

  • BO Data Services - Reading from excel file and writing to SQL Server Table

    Hi,
I would like to read data from an Excel file and write it to a SQL Server database table, without any transformations, using Data Services. I have created an Excel file format as the source and created the target table in SQL Server. The data flow will have just a source and a target. I am not sure how to map the columns between source and target. I would appreciate your quick help with detailed mapping steps.
    Regards,
    Ramesh

    Ramesh,
Were you able to get this to work? If not, let me know and I can help you out with it.
    Lynne

  • Reading data files saved using FTP Append

    Hi All,
The code posted will write a data file using FTP Append, which represents the data coming from a cRIO chassis.
However, I do not seem to be able to write the code to pull the data back out of the saved files.
I have had success retrieving an isolated value (i.e., setting the count to 1 on Read Binary File.vi), but as soon as I try to retrieve anything bigger I have some serious issues. I have checked through the various tutorials and I don't seem to be doing anything particularly wrong.
Any help would be good, ta.
    Attachments:
    FTP streaming.vi ‏29 KB

    OK thanks Dom.
    I have a few things which you may consider trying in order to troubleshoot the issue.
In your FTP Stream File Read code, try setting the count input of the Read Binary File VI to -1 (read entire file) and see what the result is. It may be worth creating a new string indicator rather than an array for this.
Also, I noticed you are performing a Get File Size function in parallel with the Read Binary VI. It is good practice to chain the file reference and error cluster through these VIs sequentially, to preserve data flow and be sure of the execution order. In this case, issues may occur if both VIs try to perform their functions concurrently.
    Any chance you could send the file created by the FTP Stream vi so I could take a look?
    Thanks.
    Paul
    http://www.paulharris.engineering

  • Need help ASAP with Data Flow Task Flat File Connection

    Hey there,
    I have a Data Flow Task within a ForEach loop container.  The source of the flow is ADO.NET connection and the destination is a Flat File Connection.  I loop through a collection of strings in the ForEach loop.  Based on the string content,
    I write some data to the same destination file in each iteration overwriting the previous version. 
    I am running into following Errors:
    [Flat File Destination [38]] Warning: The process cannot access the file because it is being used by another process.
    [Flat File Destination [38]] Error: Cannot open the datafile "Example.csv".
    [SSIS.Pipeline] Error: Flat File Destination failed the pre-execute phase and returned error code 0xC020200E.
I know what's happening, but I don't know how to fix it. The first time through the ForEach loop, the destination file is updated. The second time is when this error pops up. I think it's because the first iteration is not closing the destination
file. How do I force a close of the file within the Data Flow task, or through a subsequent Script Task?
    This works within a SQL 2008 package on one server but not within SQL 2012 package on a different server.
    Any help is greatly appreciated.
    Thanks! 

    Thanks for the response Narsimha.  What do you mean by FELC? 
    First time poster - what is the best way to show the package here?

  • Creating abap data flow, open file error

    hello experts,
I am trying to pull all the fields of the MARA table into BODS,
so I am using an ABAP data flow. But after executing the job I got the error "can't open the .dat file".
I am new to ABAP data flows, so I think maybe I made a mistake in the configuration of the datastore.
Can anyone guide me on how to create a datastore for an ABAP data flow?

In your SAP Applications datastore, are you using "Shared Directory" or "FTP" as the "Data transfer method"? Given the error, probably the former. In that case, the account used by the Data Services job server must have access to wherever SAP is putting the .DAT files. When you run an ABAP dataflow, SAP runs the ABAP extraction code (of course) and then exports or saves the results to a .DAT file, which I believe is just a tab-delimited flat text file, in the folder "Working directory on SAP server." This is specified from the perspective of the SAP server, e.g., "E:\BODS\TX," where the E:\BODS\TX folder is local to the SAP application server. I believe this folder is specified as a directive to the ABAP code, telling SAP where to put the .DAT files. The DS job server then picks them up from there, and you tell it how to get there via "Application path to the shared directory," which, in the above case, might be "\\SAPDEV1\BODS\TX" if you shared out the E:\BODS folder as "BODS" and the SAP server was SAPDEV1. Anyway: the DS job server needs to be able to read files at \\SAPDEV1\BODS\TX, and it may not have any rights to do so, especially if it's just logging in as Local System. That's likely your problem. In a Windows networking environment, I always have the DS job server log in using an AD account, which then needs to be granted privileges to (in our example's case) the \\SAPDEV1\BODS\TX folder. That also comes in handy for getting to data sources, sometimes.
    Best wishes,
    Jeff Prenevost
    Data Services Practice Manager
    itelligence

  • View Object to read data from a java file

    Hi,
    I am using JDeveloper 11.1.1.4 and ADF-BC in my application.
For one of my view objects, I want the data to be read from a Java file which exposes a method that returns a collection.
I cannot use a static view object in this case.
Please suggest the best way to implement this requirement: basically, build a view object that reads its data from a Java file.
    Thanks,
    Praveen

Depending on your use case, you can either use a programmatic VO or directly expose the Java class as a data control.
    http://docs.oracle.com/cd/E18941_01/tutorials/jdtut_11r2_36/jdtut_11r2_36.html
