Access Channels in Data Portal through Data Plugin

Hi,
I am supposed to write two plugins: one plugin creates the channels and reads in the data values, and from the other plugin I need to read in the channel custom properties.
Is there a way to access the channels in the Data Portal through the plugin? I know that a plugin cannot call any DIAdem native commands.
I have the option of reading these properties into a text file and then pulling them into DIAdem, but that is not recommended. All that is required is to read the file and create the custom properties for the existing channels.
Can anyone give me a good suggestion for this?
Thanks,
Priya

Hello Priya,
A DataPlugin is not limited to reading a single file. There are quite a few cases where, let's say, there is a file "abc.bin" and another file "abc.hdr" that belong together: one contains the channel data, the other contains the descriptive information. We have developed plugins which allow the user to select one or the other, and the DataPlugin then reads the information from both. As long as you can find the second file, e.g. through its name or some piece of information inside the file, it's not a problem. And yes, a plugin can read from a binary file and an ASCII file at the same time. We have even developed DataPlugins for files which are both ASCII and binary in a single file.
If you like, you can send me some examples together with a description of the file format(s) and I can try something for you.
Andreas
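
As a rough illustration of the companion-file idea Andreas describes, here is a minimal sketch in plain VBScript (not an actual DataPlugin; the ".hdr" extension and the key=value layout are assumptions about your format, and in a real DataPlugin the path would come from the plugin's own file object):

' Derive the companion file name from the data file name and read its
' key=value lines, which would become custom properties of the channels.
Dim fso, dataFile, headerFile, ts, line, parts
Set fso = CreateObject("Scripting.FileSystemObject")
dataFile   = "C:\Data\abc.bin"   ' primary data file (assumed path)
headerFile = fso.BuildPath(fso.GetParentFolderName(dataFile), _
                           fso.GetBaseName(dataFile) & ".hdr")
If fso.FileExists(headerFile) Then
  Set ts = fso.OpenTextFile(headerFile, 1)   ' 1 = ForReading
  Do Until ts.AtEndOfStream
    line = ts.ReadLine
    If InStr(line, "=") > 0 Then
      parts = Split(line, "=", 2)
      ' parts(0) is the property name, parts(1) its value; a DataPlugin
      ' would attach these to the matching channel or group here.
    End If
  Loop
  ts.Close
End If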

Similar Messages

  • Unable to refresh SQL Server data source through Data Management Gateway

    I just installed the version 1.1.5226.8 of Data Management Gateway and tried to refresh a simple query on a table connected to SQL Server, with no transformations in Power Query.
    This is the error I obtain:
    Errors in the high-level relational engine. The following exception occurred while the managed IDataReader interface was being used: transfer service job status is invalid.
    I am wondering whether my Power BI is still not updated to handle such a connection type, or whether there is something else not working.
    I correctly created the data source in the admin panel following the instructions in the Release Notes, and the Power Query connection test is OK.
    Marco Russo http://www.sqlbi.com http://www.powerpivotworkshop.com http://sqlblog.com/blogs/marco_russo

    I ran some other tests and found important information (maybe there is a bug, but read the following).
    The functions DateTime.LocalNow and DateTime.FixedLocalNow
    work correctly, generating these statements to SQL Server:
        convert(datetime2, '2014-05-03 06:37:52.1135108') as [LocalNow],
        convert(datetime2, '2014-05-03 06:37:52.0525061') as [FixedLocalNow],
    The functions DateTimeZone.FixedLocalNow, DateTimeZone.FixedUtcNow,
    DateTimeZone.LocalNow, and DateTimeZone.UtcNow
    stop the scheduled refresh with the error I mentioned
    in my previous messages, generating these statements to SQL Server:
        '2014-05-03 06:37:52.0525061+02:00' as [TZFixedLocalNow],
        '2014-05-03 04:37:52.0525061+00:00' as [TZFixedUtcNow],
        '2014-05-03 06:37:52.1135108+02:00' as [TZLocalNow],
        '2014-05-03 04:37:52.1135108+00:00' as [TZUtcNow]
    I solved the issue by placing the DateTimeZone calls after a Table.Buffer call, so that query folding does not translate these functions into SQL. However, it seems like something to fix.
    Marco Russo http://www.sqlbi.com http://www.powerpivotworkshop.com http://sqlblog.com/blogs/marco_russo

  • Sort the filenames to be loaded in data portal by date

    Hi,
    Filenames listed in Diadem File browser are sorted alphabetically.
    I would like to script loading the files from the search results sorted by creation date rather than by name.

    Hi Andre Rako,
    You mentioned two different ways of loading data into DIAdem (File Browser, Search Results).  Which do you mean?  There's no way to affect the ordering of the files in the File Browser, because that simply shows the same order as the files are in on disk-- the same order you see when you look with Windows Explorer.
    The Search Results are a different story, however.  After running any query, you can order the resulting Search Results list by clicking on the "up/down" icon at the top right of any column.  In your case you'll want to display the datetime as a property column in the Search Results and then click on the "up/down" (enumeration) icon at the top of that column.
    If you have DIAdem 2010 or later, you can also include the ordering of the Search Results as part of the query.  There is now an "up/down" icon to the left of the condition in the Condition Table (just to the right of the C1, C2, ... text).  You can click on this icon to set the ordering to ascending, descending, or off.
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments
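    If you prefer to script the load order instead, here is a rough sketch in VBScript (the folder path, the .tdms mask, and loading each file with DataFileLoad are assumptions about your setup):

    Dim fso, folder, f, files(), count, i, j, tmp
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set folder = fso.GetFolder("C:\MyData")          ' folder to load from (assumed)
    ReDim files(folder.Files.Count)
    count = 0
    For Each f In folder.Files
      If LCase(fso.GetExtensionName(f.Name)) = "tdms" Then
        Set files(count) = f
        count = count + 1
      End If
    Next
    ' Simple insertion sort by creation date, oldest first
    For i = 1 To count - 1
      Set tmp = files(i)
      j = i - 1
      Do While j >= 0
        If files(j).DateCreated > tmp.DateCreated Then
          Set files(j + 1) = files(j)
          j = j - 1
        Else
          Exit Do
        End If
      Loop
      Set files(j + 1) = tmp
    Next
    For i = 0 To count - 1
      Call DataFileLoad(files(i).Path)               ' load in date order
    Next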

  • Add Channel data in blocks through Data Plugin.

    Hi,
    I have to create a plugin for a binary data file. I have no problem creating the plugin, as I know all the required formats, but now the issue is that I am supposed to load channel values only for a specific region of interest. For example, if the data file contains data for the time 0 to 13 s, then I am supposed to read in the channel data only between 5 and 9 s. Can anyone please let me know how I can go about selective data loading through a plugin?
    Thanks,
    Priya

    Hi Priya,
    This is not a built-in feature in DIAdem, although R&D is looking into the feasibility of adding this as a feature at some point.  In the meantime I developed my own back-door way of getting the job done.  You can pass the reducing information in a text file of the same name (but different extension) as the binary file, then the DataPlugin can read the data reduction information and declare the binary block to start at the correct byte position, skip the correct number of values, and end at the correct last value for the desired interval.  Below is a simple example of a DataPlugin outfitted with the "Red" routines.
    Let me know if you have further questions,
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments
    Attachments:
    GigaLV Red.zip ‏40 KB
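    The byte arithmetic behind this approach, as a sketch with made-up numbers (this is not the attached "Red" example; the header size, sample rate, and value size depend on your file format):

    ' Compute where the reduced block starts and how many values to read,
    ' given a time window taken from the companion text file.
    Dim headerBytes, bytesPerValue, sampleRate, tStart, tEnd
    Dim firstIndex, lastIndex, startByte, valueCount
    headerBytes   = 512        ' fixed header size in bytes (assumed)
    bytesPerValue = 4          ' e.g. single-precision float (assumed)
    sampleRate    = 1000       ' samples per second (assumed)
    tStart = 5 : tEnd = 9      ' region of interest in seconds, e.g. from the .txt file
    firstIndex = CLng(tStart * sampleRate)             ' first sample to keep
    lastIndex  = CLng(tEnd * sampleRate) - 1           ' last sample to keep
    startByte  = headerBytes + firstIndex * bytesPerValue
    valueCount = lastIndex - firstIndex + 1
    ' The DataPlugin then declares the binary block to begin at startByte and
    ' to contain valueCount values, so only the 5..9 s interval is loaded.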

  • How to handle unsaved data in portal through webdynpro ABAP ?

    Hi Experts ,
    I need to handle unsaved data in SAP Enterprise Portal through Web Dynpro for ABAP. I found an SDN link which explains some code for this:
    http://help.sap.com/saphelp_nw70/helpdata/EN/45/b76f4169e25858e10000000a1550b0/frameset.htm
    I tried to implement this in the EXIT method of the view, but I am not able to get it to work. Can anyone help me with this?
    Regards ,
    Kalpana .

    Hi ,
    Yes, I have checked the code, but the example is not working when I linked it with the portal. If EXIT is not the right method, can you help me place the code in the right method? The code I pasted in the EXIT method is:
    data L_COMPONENTCONTROLLER type ref to IG_COMPONENTCONTROLLER.
    data L_API_COMPONENTCONTROLLER type ref to IF_WD_COMPONENT.
    data L_PORTAL_MANAGER type ref to IF_WD_PORTAL_INTEGRATION.
    L_COMPONENTCONTROLLER = WD_THIS->GET_COMPONENTCONTROLLER_CTR( ).
    L_API_COMPONENTCONTROLLER = L_COMPONENTCONTROLLER->WD_GET_API( ).
    L_PORTAL_MANAGER = L_API_COMPONENTCONTROLLER->GET_PORTAL_MANAGER( ).
    " Mark the application as dirty so the portal warns about unsaved data
    call method L_PORTAL_MANAGER->SET_APPLICATION_DIRTY_FLAG
      exporting
        DIRTY_FLAG = ABAP_TRUE.

  • How do I check to see if a channel group exists in the data portal?

    I have a list of channel groups in the data portal, but some channel groups are not created every time.  How do I check with a script whether a channel group exists or was created?

    Hello jreynolds!
    With the command 'GroupIndexGet'. It returns 0 if the group does not exist.
    Matthias
    Matthias Alleweldt
    Project Engineer / Projektingenieur
    Twigeater?  
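    For example, a minimal sketch (the group name is made up):

    If GroupIndexGet("MyGroup") > 0 Then
      MsgBox "Group 'MyGroup' exists in the Data Portal"
    Else
      MsgBox "Group 'MyGroup' was not created"
    End If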

  • Creating a time channel in the data portal and filling it with data - Is there a more efficient way than this?

    I currently have a requirement to create a time channel in the data portal and subsequently fill it with data. I've shown below how I am currently doing it:
    Time_Ch = ChnAlloc("Time channel", 271214, 1, , "Time", 1, 1)   'Allocate time channel
    For intLoop = 1 To 271214
      ChD(intLoop, Time_Ch(0)) = CurrDateTimeReal                   'Create time value
    Next
    I understand that the function to create and allocate memory for the time channel is extremely quick. However the time to store data in the channel afterwards is going to be highly dependent on the length I have assigned to the Time_Ch. In my application the length of Time_Ch is variable but could easily be in the order of 271214 or higher. Under such circumstances the time taken to fill Time_Ch is quite considerable. I am wondering whether this is the most appropriate way of doing things or whether there is a more efficient way of creating a time channel and filling it.
    Thanks very much for any help.
    Regards
    Matthew

    Hi Matthew,
    You are correct that there is a more efficient way to do this.  I'm a little confused about your "CurrDateTimeReal" assignment-- is this a constant?  Most people want a Time channel that counts up linearly in seconds or fractions of a second over the duration of the measurement, but your loop looks like it would assign the same time value to all the rows of the new Time channel.
    If you want to create a "normal" Time channel that increases at a constant rate, you can use the ChnGenTime() function:
    ReturnValue = ChnGenTime(TimeChannel, GenTimeUnit, GenTimeXBeg, GenTimeXEnd, GenTimeStep, GenTimeMode, GenTimeNo)
    If you really do want a Time channel filled with all the same values, you can use the ChnLinGen() function and simply set the GenXBegin and GenXEnd parameters to be the same value:
    ReturnValue = ChnLinGen(TimeChannel, GenXBegin, GenXEnd, XNo, [GenXUnitPreset])
    In both cases you can use the Time channel you've already created (which, as you say, executes quickly) and point the output of these functions to it by passing the Group/Channel syntax of that Time channel as the first TimeChannel parameter of either of the above functions.
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments
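    For instance, a short sketch using ChnLinGen (the group/channel name, the unit string "s", and the constant-value variant are assumptions to check against your data and the DIAdem help):

    ' Fill the pre-allocated Time channel with a linear ramp:
    ' 271214 values running from 0 s to 271213 s in steps of 1 s.
    Call ChnLinGen("[1]/Time channel", 0, 271213, 271214, "s")
    ' Or fill every row with the same timestamp (a constant channel):
    Call ChnLinGen("[1]/Time channel", CurrDateTimeReal, CurrDateTimeReal, 271214, "s")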

  • Question related to accessing members through dates

    We can access the members in a report by date in the Sample.Basic application through the attributes concept. If there is a requirement to access the members for a range from a start date to an end date (16/09/09 to 17/09/09), how can we proceed?

    I'm afraid you're out of luck. Attribute dims don't accept security.
    Could you handle this through the front end and custom code?
    I wonder if FR would let you limit the scope -- you'd have to try.
    Also, if 11x, you might try a Smart Slice in SmartView. Again, you'd have to try.
    Regards,
    Cameron Lackpour

  • RVL200 IPSEC: Channel all or some data traffic through tunnel, possible?

    Is it at all possible to channel all/some data traffic through an established ipsec tunneled connection using the RVL200?
    I have successfully established an ipsec connection through RVL200 and RV042 routers and are able to connect to servers/computers behind it.
    Now I want to channel all or some traffic through the ipsec-tunnel for computers that reside on 192.168.1.0 subnet of RVL200 network.
    Main office - RV042 router - 10.200.62.1
    Remote office - RVL200 router - 192.168.1.1
    I am trying to use the Advanced Routing option to add static routes but I am not 100% sure if I am configuring the routes correctly.
    To give an example of routing DNS requests for HOTMAIL.COM [65.55.72.183]:
    Destination IP - 65.55.0.0
    SM - 255.255.0.0
    GW - 10.200.62.1
    Hop - 1
    Interface - LAN
    For some reason this does not appear to work. I have also tried using the interface setting of WAN and tested - this also does not work.
    Can this be done? If anyone has tried doing this I would be very interested in finding out how to configure this.
    Cheers.
    MP

    For some reason the DNS IP settings do not seem to work.
    I started looking at the option of using the Quick VPN client which appears to have a setting for enabling Remote DNS.
    I have setup a test user on both the RV042 and RVL200 to test if I can overcome the Split DNS limitation. But for some reason I can't connect to either of the two routers. I have installed the client on a 64bit Windows 7 client machine which has the Windows Firewall service enabled.
    I keep getting the below error, there is no conflict with the IP address scheme and the password is correct.
    Could it be that this new client does not support the older Linksys-badged RV0xx routers? Because Split DNS is only supported on v3 hardware. The firmware on my RVL200 is v1.1.12.1.
    What should I check to enable connectivity using this client? Or is it because it does not support 64-bit Windows 7? I have even exported the certificates for both Admin and User into the C:\Program Files (x86)\Cisco Small Business\QuickVPN Client folder.

  • List of unique channels in data portal

    Hello,
    I often have multiple groups in my data portal, and many of the channels (though not all) in each group share the same name as channels in the other groups.  What would be the quickest and most efficient way to retrieve a list of the unique channel names in the data portal?
    For example, my data set might look as follows:
    Group 1: contains channels A, B, C and D
    Group 2: contains channels A, B, C, D and E
    Group 3: contains channels B, C, D and E
    My desired result:  Unique channel names in portal = A, B, C, D and E
    **note, my groups often have 500+ channels in them, and this is why I am looking for something very quick.
    Thanks!
    Julia

    Hello Julia,
    I don't know whether this is the fastest method, but it works and is not too difficult to implement.
    I am using the VBScript Scripting.Dictionary object to create a list of the channels. The Add method raises an error if a channel name already exists; with the On Error Resume Next statement I suppress these errors. In this way I build a list with unique channel names. It is actually a table with the channel names in the first column (called key) and the channel number of the first channel with that name in the second column (called item).
    On Error Resume Next
      ' Dictionary keys must be unique, so duplicate channel names are rejected
      Set chn = CreateObject("Scripting.Dictionary")
      For x = 1 To GlobUsedChn
        chn.Add CN(x), x          ' key = channel name, item = channel number
      Next
      ' Build a display string from the unique names
      list = "Channels:" & vbCrLf
      For Each k In chn.Keys
        list = list & k & vbCrLf
      Next
      MsgBox list
    On Error Goto 0
    Let me know if this is running sufficiently quickly for all of your channels.
    Ingo Schumacher
    Systems Engineer Sound & Vibration, National Instruments Germany

  • How to display .lvm filename in data portal?

    When I drag a .lvm file to the data portal, it lists it as 'LabVIEW_Data' rather than the filename that I dragged there. When I drag another .lvm file it lists it as 'LabVIEW_Data2'. How can I get it to display the filenames that were dragged there instead of a generic 'LabVIEW_Data'?

    Hi Dewey,
    Unzip the below attached two VBScript files somewhere on your hard drive where you're willing to leave them.  Then run the "Custom LVM Load.VBS" in DIAdem SCRIPT.  Now drag a new LVM file into the Data Portal.  You should see the Group take on the name of the LVM file.  The custom load event only renames the Group if the channels in that last Group were loaded with the "LVM" DataPlugin.
    You can configure your DIAdem to always start with the "Custom LVM Load.VBS", declared as part of DIAdem's launch cycle, by editing the "Start Script" field in the "General Settings" dialog and then saving the settings prior to shutting down DIAdem.
    Ask if you have further questions,
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments
    Attachments:
    Custom LVM Load.zip ‏1 KB
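    The renaming step itself is short once the file path is known; here is a rough sketch of what such a load event does (this is not the attached script, and the path variable and the use of the classic GroupName/GroupCount variables are assumptions):

    Dim fso, loadedFile
    Set fso = CreateObject("Scripting.FileSystemObject")
    loadedFile = "C:\Data\MyMeasurement.lvm"      ' path of the LVM file just loaded (assumed)
    ' Rename the most recently created group after its source file
    GroupName(GroupCount) = fso.GetBaseName(loadedFile)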

  • How to delete only a portion of the data portal?

    This would be similar to DataDelAll() command, but allow only a limited set of data to be removed.
    For some background:
    - DIAdem 9.1 sp2
    - LabVIEW interface calling various DIAdem/VBS scripts
    Using DIAdem to format reports based on various scripts and TDM data
    results, I would like to be able to load and unload various data files
    (TDM) while maintaining the specific report layout (also TDM) data in
    the portal.  Two (ugly) options seem to be:
    a)  load entire data result set into the portal (including layout instructions) and process, or
    b)  keep track of layout state variables and clear portal/reload layout/load next data
    This would all be easy if there were a command that provides the right-click "delete..." behavior of the data portal UI.
    Thanks!
    James

    Hello James,
    there are different commands you can use to delete only a portion of the data in the portal. You can delete single channels, a selection of channels, or entire groups with the following commands:
    Call ChnDel(ChnArg) - deletes one channel
    Call ChnDelete(ClpSource) - deletes a selection of channels
    Call ChnSDel(ChnArg1, ChnNo) - deletes a number of consecutive channels
    Call GroupDel(TargetGroupIndex) - deletes a group including its channels
    To create a selection of channels as argument for the second command, have a look at the functions
    ChnSelAdd, ChnSelGet, ChnSelCount
    Regards
    Ingo Schumacher
    Systems Engineer Sound & Vibration, National Instruments Germany
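    For example, to remove a single channel and then an entire group by name (a short sketch; the channel and group names are made up):

    ' Delete one channel by its Group/Channel reference
    Call ChnDel("[1]/Noise")
    ' Delete a whole group, looked up by name first
    Dim idx
    idx = GroupIndexGet("Layout")
    If idx > 0 Then Call GroupDel(idx)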

  • From Mysql to Data Portal

    Dear All,
    I am new to DIAdem 12.0. I want to create a VBS script to run queries against a MySQL server.
    The data is structured in different tables, and within each table it is structured in the way you see below.
    I want to find the sensors with the same sensor_id and make channels with the respective date and value.
    For example I used the following commands:
    Call SQL_Select("sensor_id,measdate,measvalue","seal_pp_a_final","sensor_id=2 ORDER BY sensor_id","")
    Call SQL_BindChannel("date1","measdate","n")
    Call SQL_BindChannel("PP-A1","measvalue","n")
    It worked, but the date values are not in the format 2013-06-19 00:00:00. Do you know how I can solve this problem?
    Since there are many sensors, can I use a loop? For example, if I want to read sensor_id 1 to 100, could I
    write something like:
    For i = 1 To 100
      Call SQL_Select("sensor_id,measdate,measvalue","seal_pp_a_final","sensor_id=" & i & " ORDER BY sensor_id","")
      Call SQL_BindChannel("date" & i,"measdate","n")
      Call SQL_BindChannel("PP-A" & i,"measvalue","n")
    Next
    Looking forward to your Comments .
    Thank you in advance,
    Ioannis
    Sensor_id   Datetime              Value
    1           2013-06-19 00:00:00   1.2
    1           2013-06-19 01:00:00   1.3
    2           2013-06-19 00:00:00   5.6
    2           2013-06-19 01:00:00   5.7
    3           2013-06-19 00:00:00   310.2

    Hi Engineer,
    I would strongly advise you to switch to the ADO method of querying the database.  ADO returns the query results as variants, which allows you to handle all the different column data types with no effort.  There are additional advantages to ADO as well.  I adapted a standard example script I use to your column and table names and your query and sort conditions:
    ' needs to be set to a valid ADO connection string for data base
    ConnectionStr = "Dsn=ABCDEFG;Uid=;Pwd=;"
    ' construct the SQL query to execute
    Table = "seal_pp_a_final"
    ColStr = "sensor_id,measdate,measvalue"
    CondStr = " WHERE sensor_id=2"
    SortStr = " ORDER BY sensor_id"
    QueryStr = "SELECT " & ColStr & " FROM " & Table & CondStr & SortStr
    MsgBox QueryStr
    ' Connect to the data base
    Set ADO = CreateObject("ADODB.Connection")
    ADO.Open ConnectionStr
    ' Execute the query and import the resulting data records into a VBScript variable
    Set RecordSet = ADO.Execute(QueryStr)
    RowVals = RecordSet.GetRows()
    ChanNames = Split(ColStr, ",")
    ' Send the resulting data records from the query to new channels in the Data Portal
    Call DataDelAll
    Call GroupCreate(Table)
    Call GroupDefaultSet(GroupCount)
    Channels = ArrayToChannels(RowVals, ChanNames, 1)
    ' Disconnect from the database
    ADO.Close
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments
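    To cover the second part of the question (one query per sensor_id), the same ADO pattern can run in a loop before the ADO.Close call above; here is a sketch that reuses the connection and the ArrayToChannels call from the example (the sensor range and channel naming are assumptions):

    Dim i, QueryStr2, RecordSet2, RowVals2
    For i = 1 To 100
      QueryStr2 = "SELECT measdate,measvalue FROM seal_pp_a_final" & _
                  " WHERE sensor_id=" & i & " ORDER BY measdate"
      Set RecordSet2 = ADO.Execute(QueryStr2)
      If Not RecordSet2.EOF Then
        RowVals2 = RecordSet2.GetRows()
        ' One date channel and one value channel per sensor
        Call ArrayToChannels(RowVals2, Array("date" & i, "PP-A" & i), 1)
      End If
      RecordSet2.Close
    Next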

  • Getting file names of different groups in a data portal

    Hi,
    I am finding it difficult to retrieve the file names of the different groups loaded in the data portal. For example, say I have loaded two different files with the same extension .tdms into the data portal, which automatically creates two groups. My objective is to retrieve the file names of the two loaded groups using VBS in DIAdem 11.1.
    Can anyone help me in this regard.
    Regards,
    X. Ignatius

    Thanks Andreas,
    I have got a plugin which loads multiple LVM files with their source file names. Earlier it would be LabVIEW Data, LabVIEW Data1, LabVIEW Data2, ... when multiple files were loaded. Now, with this plugin attached in the startup script, all the files are loaded with their original file names.
    I attached the plugins: place "Custom LVM Load Event.VBS" in the startup script; the other attachment is the function called by the main script.
    Regards,
    X. Ignatius
    Attachments:
    Custom LVM Load.VBS ‏1 KB
    Custom LVM Load Event.VBS ‏2 KB

  • Connecting through Data Source using JNDI

    I would like to connect my application to a SQL Server database through a data source using JNDI. But when I try to bind the data source object to the logical name, I get the following exception. How can I get rid of this error? How can I provide the initial context? I thought Java would create the default initial context by itself, but that doesn't seem to be true. Any help would be appreciated.
    -Prashant
    Exception :
    Naming Exception :Need to specify class name in environment or system
    property, or as an applet parameter, or in an application resource file:
    java.naming.factory.initial
    javax.naming.NoInitialContextException: Need to specify class name in
    environment or system property, or as an applet parameter, or in an
    application resource file: java.naming.factory.initial
    at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:651)
    at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:246)
    at
    javax.naming.InitialContext.getURLOrDefaultInitCtx(InitialContext.java:283)
    at javax.naming.InitialContext.bind(InitialContext.java:358)
    at RegDataSource.regDataSource(RegDataSource.java:30)
    at RegDataSource.main(RegDataSource.java:52)
    Source code :
    import javax.naming.Context;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;

    public class RegDataSource {

        public RegDataSource() {
        }

        private void regDataSource() {
            try {
                com.microsoft.jdbcx.sqlserver.SQLServerDataSource sds =
                    new com.microsoft.jdbcx.sqlserver.SQLServerDataSource();
                sds.setServerName("servername13");
                sds.setDatabaseName("test");
                Context ctx = new InitialContext();
                ctx.bind("jdbc/EmployeeDB", sds);
            } catch (NamingException e) {
                System.out.println("Naming Exception :" + e.getMessage()
                    //+ "\n" + e.getExplanation()
                    //+ "\n" + e.getResolvedObj()
                    //+ "\n" + e.getResolvedName()
                );
                e.printStackTrace();
            } catch (Exception e) {
                System.out.println("Exception :" + e.getMessage());
            }
        }

        public static void main(String[] args) {
            RegDataSource regDataSource1 = new RegDataSource();
            regDataSource1.regDataSource();
        }
    }

    Thank you very much for your prompt reply and for helping me out. I have the following questions.
    1) Now I am able to bind the data source object to the logical name. But the problem is that whenever I try to look up the data source object by its logical name (i.e. DataSource ds = (DataSource)ctx.lookup("jdbc/EmployeeDB")), it always returns null. I don't know why it doesn't return the correct data source object.
    Following is the code used to bind the data source to the logical name:
    Hashtable env = new Hashtable();
    env.put(Context.INITIAL_CONTEXT_FACTORY,
    "com.sun.jndi.fscontext.RefFSContextFactory");
    env.put(Context.PROVIDER_URL, "file:/TEMP/jndi");
    Context ctx = new InitialContext(env);
    //Properties p = new Properties();
    //p.put(Context.INITIAL_CONTEXT_FACTORY,
    // "com.sun.jndi.fscontext.RefFSContextFactory");
    //Context ctx = new InitialContext(p);
    ctx.bind("jdbc/EmployeeDB", sds);
    Following is the code used to look up the bound object:
    Hashtable env = new Hashtable();
    env.put(Context.INITIAL_CONTEXT_FACTORY,
    "com.sun.jndi.fscontext.RefFSContextFactory");
    env.put(Context.PROVIDER_URL, "file:/TEMP/jndi");
    Context ctx = new InitialContext(env);
    DataSource ds = (DataSource)ctx.lookup("jdbc/EmployeeDB");
    2) I am writing a client/server application in which my client accesses SQL Server 2000 to read/write data. The reason for using JNDI is that I don't want my client application to know which driver (SQL) and database I am using. That should give me great flexibility, since I could make my application use another database like Oracle, Sybase, etc., most probably without changing any code. In this situation, which JNDI service provider should I use? I am not sure the File System service provider is the ideal choice for this type of situation, so please let me know which JNDI service provider is ideal here.
    Any help would be appreciated.
