Data plug-in for binary data with byte streams of variable length

Hi there,
I would like to write a data plug-in to read binary data from a file; I'm using DIAdem 10.1.
Each data set in my file consists of binary data with a fixed structure (readable using direct access channels) and of a byte stream of variable length. The length of each byte stream is coded in the fixed data part.
Can anyone tell me what my data plug-in must look like to read this kind of data file?
Many thanks in advance!
Kind regards,
Stefan

Hi Brad,
thank you for the very quick response!
I forgot to mention that the data in the byte stream can actually be ignored; it is not data to be evaluated in DIAdem (it is picture data, and the picture size varies from data set to data set).
So basically, for each data set I would like to read the fixed-structure data (the first part of the data set) and discard the variable byte stream (the last part of the data set).
Here is a logical (example) layout of my binary data file:
| fixedSize-Value1 | fixedSize-Value2 | fixedSize-Value3 (=length of byte stream) | XXXXXXXXXXXXX (byte stream)
| fixedSize-Value1 | fixedSize-Value2 | fixedSize-Value3 (=length of byte stream) | XXXXXX (byte stream)
| fixedSize-Value1 | fixedSize-Value2 | fixedSize-Value3 (=length of byte stream) | XXXXXXXXXXXXXXXXXXXX (byte stream)
What I would like to show in DIAdem is only fixedSize-Value1 and fixedSize-Value2.
If I understood right, would it be possible to set the block length of each data set by assigning Block.BlockLength = fixedSize-Value3, and to use direct access channels for reading fixedSize-Value1 and fixedSize-Value2?
Thank you!
Kind regards,
Stefan
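
A minimal sketch of such a DataPlugin loop (untested; it assumes all three fixed values are 32-bit little-endian integers and that fixedSize-Value3 counts the bytes of the stream, and the group/channel names are made up). Because the record length varies, direct access channels, which assume a constant stride, probably cannot cover the whole file; reading the fixed values one record at a time and advancing File.Position past each byte stream is one way around that:

Sub ReadStore(File)
  File.Formatter.ByteOrder = eLittleEndian            ' assumption: little-endian data
  Dim Group : Set Group = Root.ChannelGroups.Add("MyRecords")
  Dim Chn1  : Set Chn1  = Group.Channels.Add("Value1", eI32)
  Dim Chn2  : Set Chn2  = Group.Channels.Add("Value2", eI32)
  Dim i, StreamLen
  i = 0
  Do While File.Position < File.Size
    i = i + 1
    Chn1.Values(i) = File.GetNextBinaryValue(eI32)    ' fixedSize-Value1
    Chn2.Values(i) = File.GetNextBinaryValue(eI32)    ' fixedSize-Value2
    StreamLen = File.GetNextBinaryValue(eI32)         ' fixedSize-Value3 (stream length)
    File.Position = File.Position + StreamLen         ' skip the picture bytes
  Loop
End Sub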

Similar Messages

  • Download file problem for binary data?

    Dear All,
    I have written a JSP file to do a download page, using a piece of code from the JDC. This code prompts the download dialog box each time the user clicks the download button, and it sets the content type for different applications. The code is like below:
    try {
        java.io.File fileobj = new java.io.File(strFolder + strFile);
        response.setContentType(application.getMimeType(fileobj.getName()));
        response.setHeader("Content-Disposition", "attachment; filename=\"" + strFile + "\"");
        java.io.FileInputStream in = new java.io.FileInputStream(fileobj);
        int ch;
        while ((ch = in.read()) != -1) {
            out.write(ch);
            out.flush();
        }
        in.close();
    } catch (Exception e) {
    }
    The code can download and handle a text file correctly when it is opened in a text editor or inside IE. But when a PDF file or an image is downloaded and opened in a PDF viewer or image viewer, it is corrupted and cannot be viewed. What is the problem? Any ideas?
    So I wonder whether this code can handle binary data or not. It seems there is no different code to handle text and binary data in Java/JSP.
    Thank you very much!
    Best Regards,
    Rockyu Lee
              

    Add the following lines to the .tld file (custom tag definition):
    <tag>
    <name>downloadbinary</name>
    <tagclass>org.rampally.DownloadBinaryTag</tagclass>
    <bodycontent>JSP</bodycontent>
    </tag>
    Add the following lines to your JSP file.
    In the JSP, keep the source to one line; make sure there are no spaces or additional line feeds anywhere in the JSP file except the JSP tags.
    <%@ taglib uri="/WEB-INF/taglibs/mb.tld" prefix="mytags" %>
    <mytags:downloadbinary />
    I am hoping that you have all required parameters, such as the fileName to download, in your session or request object.
    Tag class ....
    import java.io.*;
    import javax.servlet.ServletOutputStream;
    import javax.servlet.http.*;
    import javax.servlet.jsp.JspException;
    import javax.servlet.jsp.tagext.TagSupport;

    public class DownloadBinaryTag extends TagSupport {
         public int doEndTag() throws JspException {
              // TODO: get binary data from a file name or a binary data
              // buffer from a database. I am making it simple .. assume that
              // it is a request parameter so you can test easily.
              HttpServletRequest request = (HttpServletRequest) pageContext.getRequest();
              String fileName = request.getParameter("filename");
              String contentType = "application/pdf"; // or derive it from the file name
              DataInputStream dis;
              try {
                   dis = new DataInputStream(new FileInputStream(fileName));
              } catch (FileNotFoundException e) {
                   // do error handling ...
                   return EVAL_PAGE;
              }
              BinaryUtil.sendBinaryFile(dis, (HttpServletResponse) pageContext.getResponse(), contentType);
              return EVAL_PAGE;
         }
    }

    class BinaryUtil {
         static public void sendBinaryFile(DataInputStream dis,
                                  HttpServletResponse response,
                                  String contentType) {
              try {
                   response.setContentType(contentType);
                   String fileName = "test.pdf"; // name presented to the browser
                   response.setHeader("Content-disposition", "inline; filename=" + fileName);
                   ServletOutputStream sout = response.getOutputStream();
                   int len;
                   byte[] data = new byte[128 * 1024];
                   while ((len = dis.read(data, 0, 128 * 1024)) >= 0)
                        sout.write(data, 0, len);
                   sout.flush();
                   sout.close();
              } catch (Exception e) {
                   System.out.println(e.getMessage());
              }
         }

         static public void sendBinaryFile(byte[] data,
                                  HttpServletResponse response,
                                  String contentType) {
              try {
                   response.setContentType(contentType);
                   String fileName = "test.pdf"; // name presented to the browser
                   response.setHeader("Content-disposition", "inline; filename=" + fileName);
                   ServletOutputStream sout = response.getOutputStream();
                   sout.write(data);
                   sout.flush();
                   sout.close();
              } catch (Exception e) {
                   System.out.println(e.getMessage());
              }
         }
    }
    You may have to change 'inline' to 'attachment' if you do not want IE to inline the document.
    That's all!!.. Hope this helps...!

  • Is there an object like StringBuffer, but for binary data?

    I like the performance of StringBuffer, it's very fast when I use the indexOf() and lastIndexOf() methods.
    Is there an equivalent buffer object for binary data so that I can quickly search for byte sequences fast instead of looping through it?
    Thanks.

    "I like the performance of StringBuffer, it's very fast when I use the indexOf() and lastIndexOf() methods."
    You mean fast as in O(n)?
    "Is there an equivalent buffer object for binary data so that I can quickly search for byte sequences fast instead of looping through it?"
    A ByteBuffer might be useful, though you will have to loop (that's what StringBuffer does).

  • C++ data plug-in for DIAdem

    Hello,
    I need to develop a C++ data plug-in for DIAdem. I read it's possible, but I can't find any information about how to do this.
    I downloaded labview_dataplugin_sdk_2011.exe because I'm using DIAdem 2011.
    Does someone know how to do this in C++, and where can I find a C++ example?
    Furthermore, the measurement file is associated with 2 other files, one containing the channels and the frequencies, the other containing the scaling information for the analog inputs. The scaling is polynomial and of more than degree 1.
    Is it possible to create a data plug-in that reads information from 3 files?
    Is it possible to set polynomial information instead of factor and gain for the scaling settings?
    Thanks.
    CFOE

    Hello Andreas,
    I know that there is a LabVIEW SDK to develop such plug-ins, but I didn't find anything about the C++ one, although NI told me it could be done.
    I searched the website and the forum, but I didn't find the API and documentation.
    If you know where I can download it, it would be great if you could give me the information.
    Thanks.
    CFOE

  • Domain type for binary data

    Hello,
    I want to create a domain and a data element to save binary data like pictures.
    Do you use RAW or RAWSTRING for this?
    thx
    chris

    Refer:
    http://help.sap.com/saphelp_nw04/helpdata/en/cf/21f2e5446011d189700000e8322d00/frameset.htm

  • BPC10 - Data manager package for dimension data export and import

    Dear BPC Experts,
    Need your help.
    I am trying to set up a data manager package for the first time, to export dimension master data from one application and import it into another application (both have the same properties).
    I created a test data manager package from Organize > Add Package, with process chain /CPMB/EXPORT_MD_TO_FILE, and clicked Add.
    In the Advanced tab of each task there is some script logic already populated. Please find attached the details of the script logic written under each of the tasks: MD_Source, Convert and Target.
    I have not made any changes to the script inside the tasks.
    But when I run the package, after I select the dimension 'Entity', the second prompt asks for a transformation file, and the system automatically adds the file ...\ROOT\WEBFOLDERS\COLPAL\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\Import.xls
    I have not changed anything there.
    The next prompt asks for an output file, and it won't allow me to enter the file name.
    Not sure how to proceed further.
    I would be grateful if someone could share from their experience how to set up a simple data manager package for master data export from a dimension. Should I update the transformation file in the script for the import file, and the output file in the Advanced tab? How and what transformation file needs to be created and linked to the data manager package for export/import?
    What are the steps to be executed to run the package for exporting master data from a dimension and importing it into another application?
    Thanks in advance for your guidance.
    Thanks and Regards,
    Ramanuj
    =====================================================================================================
    Details of the tasks. The same script logic appears under the Advanced tab of all three tasks (APPL_MD-SOURCE, EXPORT_MD_CONVERT and FILE_TARGET):
    PROMPT(DIMENSIONMEMBER,%DIMENSIONMEMBERS%,"Please select dimension","Please select members",%DIMS%)
    PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    INFO(%TEMPNO1%,%INCREASENO%)
    INFO(%TEMPNO2%,%INCREASENO%)
    TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
    TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
    ================================================================================

    1. Perhaps you want to consider a system copy to a "virtual system" for UAT?
    2. Changes in QAS (as with PROD as well) will give you the delta. They should ideally be clean... You need to check the source system.
    Another option is to generate the profiles in the target system. But for that your config has to be squeaky clean and in sync, including very well maintained and synced SU24 data.
    Cheers,
    Julius

  • Error 100 File contains erroneous data. Normally for user data files.

    Hello.  We have a LabVIEW program that reads in test steps and data in order to execute a test sequence.  Recently we had to apply Retina and Gold Disk security patches in accordance with DOD security policies.  Now we are getting the following error:
    Error 100
    LabVIEW: File contains erroneous data.  Normally for user data files.
    We have not changed the code or files that the program is reading in.  My guess would be it is some sort of permission issue.  However, we have given the user modify permission to the entire C drive, and still get this error.  Does anyone have any ideas on what could be causing it?  Thanks!

    Do you have any backup copies of the files, by any chance? Is it possible the files were modified somehow (perhaps something extra was added when the new security measures were implemented)?
    How is the file being accessed? Is it occurring on the local machine, or are the files accessed from a remote location?
    Caleb Harris
    National Instruments | Mechanical Engineer | http://www.ni.com/support

  • Power View in SharePoint Server - The data extension type for a data source is not valid

    Hi All,
    All of a sudden I am getting the following error when trying to create a Power View report using a shared report data source (there is no error when testing the connection):
    "The current action cannot be completed. The data extension type for a data source 'http://dev/Shared Ducuments/Sales.rsds' is not valid for this operation"
    I already have a data source (I had created it after creating my site collection a week ago), and when I use this source to create a Power View report there is no error. But I get the above error when I create another, similar data source and use it to create a Power View report.
    Please help me to resolve the error.
    Thanks

    I am going nuts! I had selected 'Analysis Services' instead of 'Microsoft BI Semantic Model for Power View'.

  • Plugin for binary data: channels with coefficients: y = a3*x^3 + a2*x^2 + a1*x + a0

    Hello,
    I am building a plugin for extracting data from a binary file.
    My problem is that the scaling of the values is not defined by a linear equation (y = a1*x + a0, where conversion is easy with the Factor and Offset properties of Channel), but by a cubic equation (y = a3*x^3 + a2*x^2 + a1*x + a0) with 4 coefficients. For the moment, I perform the operation in a loop over all the values of the channel... and it takes a lot of time to convert all the values! Is there a more efficient way to multiply arrays in VBScript, or channels in DIAdem (for plugins)?

    Hi Ollac,
    There is no way to efficiently apply polynomial scaling in a DataPlugin.  You could create each of the polynomial component terms with an eMultiplyProcessor type ProcessedChannel in the DataPlugin, but we can't add them together with another ProcessedChannel because you can't add a ProcessedChannel to another ProcessedChannel.  If the manual cell-by-cell scaling that you've already tried is too woefully slow, then I'd suggest exposing the raw data values to channels in the Data Portal and adding the "a0", "a1", "a2", "a3" polynomial coefficients as channel properties.  Once the raw data is in DIAdem, you can use the ChnCalculate() or the older and faster FormulaCalc() command to apply the polynomial scaling in-place based on the coefficients in the channel properties.  You might want the Data Plugin to add the "_Raw" suffix to the channel names and have the scaling VBScript remove the "_Raw" suffix or replace it with a "_Scaled" suffix so you don't accidentally apply the polynomial scaling twice.
    Ask if you have questions about this,
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments
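
    To make that concrete, here is a hedged sketch of the in-place scaling step Brad describes (untested; the group name "MyGroup", the channel name "Signal_Raw" and the custom property names "a0".."a3" are assumptions about what the DataPlugin created):

    Dim oChn, a0, a1, a2, a3, sX
    Set oChn = Data.Root.ChannelGroups("MyGroup").Channels("Signal_Raw")
    a0 = oChn.Properties("a0").Value          ' coefficients stored by the DataPlugin
    a1 = oChn.Properties("a1").Value
    a2 = oChn.Properties("a2").Value
    a3 = oChn.Properties("a3").Value
    sX = "Ch('MyGroup/Signal_Raw')"
    Call FormulaCalc("Ch('MyGroup/Signal_Raw') := " & a0 & " + " & a1 & "*" & sX & _
                     " + " & a2 & "*" & sX & "^2 + " & a3 & "*" & sX & "^3")
    oChn.Name = "Signal_Scaled"               ' rename so the scaling cannot run twice

    The rename at the end follows Brad's "_Raw"/"_Scaled" suffix advice.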

  • Best Datatype for Binary Data

    Hi,
    We are storing binary data, say "€ù?", in Oracle using a string datatype.
    I have formed the query in my application to insert a record which has the above string as one of the string field values.
    The insertion was successful, but the above string is not stored as it is; it is stored as "¿ù?", which is not correct data.
    Is this problem because I'm using a string for storing binary data?
    Also, please let me know what would be the best data type to store binary data (like the above) in Oracle.

    Justin,
    With the help of your query I identified how the bytes are stored in memory.
    I have a VARCHAR2 datatype of size 14 in Oracle. Whenever I store my binary data, it is stored as
    127,0,0,0,0,0,0,0....for 14 bytes -> so the binary equivalent is 01111111,00000000,00000000,...... And I'm OK with that.
    But the problem is that whenever a byte holds the binary equivalent of 128, it is stored in Oracle as 191.
    e.g.:
    Bits: 10000000 => equivalent to 128 in decimal, and that is what I want to get when I query it using SELECT, but what I'm getting in the SQL query is "191".
    I'm really not able to understand the relationship between the 128 and 191 values.
    I got the above detail using the DUMP SQL function.

  • Use httpchannel/httpservice combination for binary data

    I have a BlazeDS proxy configuration set up with my Flex application to communicate with a RESTful webservice.
    This webservice is practically an interface to a content management system which has all types of content, text and binary (audio, video, images, etc.), protected by HTTP basic authentication. I had no reason to use BlazeDS except for its proxy capabilities, which enable HTTP authentication headers to be sent from the application to the webserver; these don't work when sent directly to the server, as mentioned. See the related post below:
    http://tech.groups.yahoo.com/group/flexcoders/message/136576
    I send the requests through HTTPService from Flex through a configured destination, and it works well for all data formats supported by HTTPService. Unfortunately, HTTPService does not support a binary result format for images, audio and video. I am aware that I should use a streaming server for audio and video data, but for images I would like to go through the BlazeDS proxy to the webserver. Loader is the only class which is recommended for use with images, but I am not sure how to use it with BlazeDS.
    Is it possible in anyway to achieve what I am trying to?
    Thanks,
    Peeyush

    Try this...
    var b:ByteArray = _dataLoaderHTTPService.lastResult as ByteArray;
    b.position = 0;
    trace(b.readObject());
    Bob I.

  • Ideal column type for binary data?

    Hi there,
    I must store binary data of fixed length (e.g. an IP address of 4/16 bytes) in a table and manipulate that table over C++/OCI.
    The database server is running on either SUN/SPARC or LINUX/i86. The clients are running on a broad range of systems (NT/i86, LINUX/x86, SUN/SPARC, ..).
    I am worried about the OCI layer modifying char/varchar values during transmission...
    What's the best approach?
    Use CHAR/VARCHAR, or use the variable-length RAW even for fixed-length binary data?
    Any help would be greatly appreciated,
    Tobias

    I guess RAW is the best approach

  • Data Plug In for FAMOS (*.raw)

    I am looking for a data plug-in to load FAMOS data (*.raw). Where can I get this data plug-in?

    Hi Mavis,
    Thanks for your reply.
    The date and time of the FAMOS data were created by the IMC device.
    The following attachment is an example to get the waveform's t0, but I did not get what I want.
    Attachments:
    Load_famos_raw_data.vi ‏63 KB
    plugin.zip ‏100 KB
    raw data.zip ‏14 KB

  • DETAIL_DATASTORE for binary data

    Can I put binary data like PDF or Word docs in DETAIL_DATASTORE?

    You should ask this question in the Oracle Text forum, where you will get a more expert, quicker answer.

  • Conversion from scaled to unscaled data using Graph Acquired Binary Data

    Hello!
    I want to acquire temperature with a pyrometer using a PCI-6220 (analog input, 0-5 V). I'd like to use the VI Cont Acq&Graph Voltage-To File(Binary) to write the data into a file, and the VI Graph Acquired Binary Data to read and analyze it. But I didn't understand well how "Convert unscaled to scaled data" works in this VI; I know it takes information from the header to scale the data, but how?
    My card will give me back a voltage, but how can I transform it into a temperature? Can I configure this somewhere so that "Convert unscaled to scaled data" will do it, or should I do this myself with a formula?
    Thanks.

    Nanie, I've used these examples extensively and I think I can help. Incidentally, there is actually a bug in the examples, but I will start a new thread to discuss it (I haven't written the post yet, but it will be under "Bug in Graph Acquired Binary Data.vi:create header.vi Example" when I do get around to posting it). Anyway, to address your questions about the scaling: I've included an image of the block diagram of Convert Unscaled to Scaled.vi for reference.
    To start, the PCI-6220 has 16-bit resolution. That means that the range (±10 V for example) is broken down into 2^16 (65536) steps, or steps of ~0.3 mV (20 V/65536) in this example. When the data is acquired, it is read as the number of steps (an integer) and that is how you are saving it. In general it takes less space to store integers than real numbers. In this case you are storing the results in I16's (2 bytes/value) instead of SGL's or DBL's (4 or 8 bytes/value respectively).
    To convert the integer to a scaled value (either volts or some other engineering unit) you need to scale it. In the situation where you have a linear transfer function (scaled = offset + multiplier * unscaled), which is a 1st-order polynomial, it's pretty straightforward. The Convert Unscaled to Scaled.vi handles the more general case of scaling by an nth-order polynomial (a0*x^0 + a1*x^1 + a2*x^2 + ... + an*x^n). A linear transfer function has two coefficients: a0 is the offset and a1 is the multiplier; the rest of the a's are zero.
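    The per-sample operation amounts to evaluating that polynomial. As an illustration only (sketched in VBScript to match the DIAdem threads above; the coefficients below are made up, not from any real device):

    ' Evaluate scaled = a0 + a1*x + ... + an*x^n in Horner form
    Function EvalPoly(Coeffs, x)            ' Coeffs = Array(a0, a1, ..., an)
      Dim i, Result
      Result = 0
      For i = UBound(Coeffs) To 0 Step -1   ' from the highest power down
        Result = Result * x + Coeffs(i)
      Next
      EvalPoly = Result
    End Function

    ' Example: linear transfer function, offset 0 V, gain 20/65536 V per step
    MsgBox EvalPoly(Array(0, 20 / 65536), 16384)   ' shows 5 (volts)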
    When you use the Cont Acq&Graph Voltage-To File(Binary).vi to save your data, a header is created which contains the scaling coefficients stored in an array. When you read the file with Graph Acquired Binary Data.vi those scaling coefficients are read in and converted to a two dimensional array called Header Information that looks like this:
    ch0 sample rate, ch0 a0, ch0 a1, ch0 a2,..., ch0 an
    ch1 sample rate, ch1 a0, ch1 a1, ch1 a2,..., ch1 an
    ch2 sample rate, ch2 a0, ch2 a1, ch2 a2,..., ch2 an
    The array then gets transposed before continuing.
    This transposed array and the unscaled data are passed into Convert Unscaled to Scaled.vi. I am probably just now getting to your question, but hopefully the background makes the rest of this simple. The Header Information array gets split up, with the sample rates (the first row in the transposed array), the offsets (the second row), and all the rest of the gains entering the for loops separately. The sample rate sets the dt for the channel, the offset is used to initialize the scaled data array, and the gains are used to multiply the unscaled data. With a linear transfer function, there will only be one gain for each channel. The clever part of this design is that nothing has to be changed to handle non-linear polynomial transfer functions.
    I normally just convert everything to volts and then manually scale from there if I want to convert to engineering units. I suspect that if you use the express vi's (or configure the task using Create DAQmx Task in the Data Neighborhood of MAX) to configure a channel for temperature measurement, the required scaling coefficients will be incorporated into the Header Information array automatically when the data is saved and you won't have to do anything manually other than selecting the appropriate task when configuring your acquisition.
    Hope this answers your questions.
    Chris
    Attachments:
    Convert Unscaled to Scaled.jpg ‏81 KB
