Modes and Methods in Tag Query and SQL Query

Hi,
Can someone explain me about the modes available in <b>TAG Query and SQL Query.</b>
TAG Query has modes such as <b>Current, CurrentWrite, GroupList, History, HistoryEvent, ModeList, Statistics and TagList</b>
SQL Query i still have doubt on <b>FixedQuery with output , Modelist and TableList</b>
I also need to know why methods are used?
Thanks in advance
Regards
Muzammil

I'll try to  explain to the best of my knowledge :
<u><b>TagQuery</b></u>
<b>Current</b> : Gives you the current value of the Tag you are reading.
<b>CurrentWrite</b> : Let you write a Value as the Current Value of the Tag.
<b>GroupList</b> : Generally Tags are grouped under different groups. Returns you the name of the Groups.
<b>From the xMII Help Document :</b>
<b>History</b> : History Mode returns interpolated data.  Interpolation can be accomplished by specifying either the # of rows desired or the retrieval resolution.  If the mode is "History" and a value is provided for the Resolution parameter (which is in seconds), the connector will retrieve evenly-spaced values starting at the beginning of the time interval, up to the maximum # of rows specified in the RowCount parameter.  If no value is provided for the Resolution parameter, the connector will return an evenly-spaced number of values based on the value of the RowCount parameter.
For example, if the time interval is 1 hour, Resolution is 15, and RowCount is 240, the connector will return evenly spaced values each 15 seconds, up to 240 values (which would span the entire hour).
If the time interval is 1 hour, Resolution is not provided or is set to zero, and RowCount is 120, the connector would return 120 evenly spaced values, at an effective interval of 30 seconds.
<b>HistoryEvent Mode</b> : The connector can provide historical values "as they were stored" the database.  This mode provides no interpolation of values.
<b>Statistics Mode</b> : When retrieving data for statistical calculations, the connector utilizes the same techniques as in the "HistoryEvent"  mode.  It is important to note that the first two returning columns in the HistoryEvent query must be the timestamp and the value, in that order.  The SAP xMII Statistical processor expects that order, or errors will occur.  This ensures precision of statistical measurements, particularly time-weighted average, by using the exact storage time and values from the historical database.  The SAP xMII system provides the statistical calculations.
<b>Modelist</b> : Basically returns the modes of the Query Available. The Data returned is same as the data in the Modes list in the Quert Template Editor.
<b>Taglist</b> : Returns all the Tags in the Datasource.
<u><b>SQL Query</b></u>
<b>Modelist</b> : Same as above.
<b>TableList</b> : List of all the tables in the database to which the connector connects.
Again from SAP xMII Help Documentation :
<b>FixedQueryWithOutput</b> : This mode is used to execute an Oracle stored procedure or function that returns a REF CURSOR as output.  The position of the REF CURSOR is marked by a "?" in the query.  For example:
<b>Create a table.</b>
<i>create table usage (id int, name varchar(50));
insert into usage (id, name) values (1, 'test1');
insert into usage (id, name) values (2, 'test2');
insert into usage (id, name) values (3, 'test3');
insert into usage (id, name) values (4, 'test4');
insert into usage (id, name) values (5, 'test5');
insert into usage (id, name) values (6, 'test6');
insert into usage (id, name) values (7, 'test7');
insert into usage (id, name) values (8, 'test8');</i>
<b>Define the stored procedure.</b>
<i>DROP PACKAGE foopkg;
CREATE PACKAGE foopkg IS
  TYPE cursortype is ref cursor;
  PROCEDURE test (mycursor in out cursortype);
END foopkg;
CREATE PACKAGE BODY foopkg IS
  PROCEDURE test (mycursor in out cursortype) AS
  BEGIN
     open mycursor for select * from usage;
  END;
END foopkg;
</i>
Define a query template for calling the stored procedure.  Enter the following in the FixedQuery tab:
<b>call foopkg.test(?)</b>
This template returns all rows from the Usage table.

Similar Messages

  • MII 11.5 and PCo 2.1 and Tag Query Error

    Hi,
    I have KEPWare OPC server configured with a single tag, and with the OPC Client open.  I have PCo 2.1 configured with a single agent, and 'legacy' ticked with port 9001.  I have a UDS connector pointing to port 9001, and a tag query in 'currentwrite' mode, and can successfully write a value from the query editor (within MII 11.5) to KEPWare, when tested directly within the MII query editor.
    However, when I call the same tag query from within an MII transaction, and set 'TagName.1' and 'TagValue.1' to the values I want, I get the following error:
    "An item with the same key has already been added. at SAP.Manufacturing.Connectivity.Protocol.Custom.PCoQueryRequestHandler.ProcessRequest(Object handler)"
    I see this error in the PCo log.
    Interesting thing is, that within the transaction editor, when I select the underlying tag query, and the editor asks to generate the XML, and I say Yes, the underlying tag query runs fine (I can see the value getting updated within KEPWare).
    I've also tried with/without subscription items in PCo, plus different settings for 'Cache Mode' in PCo, plus using fully qualified tag name (ie: channel.device.tag as KEPWare sees it), etc, but in all cases I get the above error.
    Any ideas?

    Hi Diana,
    I upgraded to PCo V2.1.4.2, still same issue.  And yes, the data server (UDC) is checked as "Writable"!!
    As mentioned, the tag query runs fine in both "Current" mode and "CurrentWrite" mode.  Only when I call the same query from a transaction do I see this error.  Excerpt from Runner Log below;
    2011-02-23 15:20:32,390 [ServletExec: request: time=1298434832343, uri=/Lighthammer/Runner]
    ERROR  Runner - [9FA68944-67CE-824D-3FA6-D96023119ECE][ERROR]:
    IlluminatorQuery: An item with the same key has already been added. TMP0794C967-0F69-8F04-568C-CD037815D1D0
    2011-02-23 15:20:32,453 [ServletExec: request: time=1298434832343, uri=/Lighthammer/Runner]
    ERROR  Runner - [9FA68944-67CE-824D-3FA6-D96023119ECE][ERROR]:
    ACTION FAILED: End Action IllumTagQuery_0 : () TMP0794C967-0F69-8F04-568C-CD037815D1D0
    If I leave the transaction Tag Query action (that calls the underlying tag query) without any assignment to TagValue.1, then there is no error, and the error as set in the aunderlying tag query is used.  But as soon as I assign values to TagValue.1 inside the transaction, I get the error.  I've tried passing a value via a local paramater etc, same issue.
    Regards
    Kevin.

  • Business Logic - Success of Tag and SQL Queries

    Greetings All,
    I have a Logic flow from an old NQL process. 
    The process Gets some SQL and Tag Data then checks to see if data was returned and notifies via email if data wasn't returned.  So this would translate into the Tag and SQL Queries followed by a Logic Conditional block.
    HERE's the questions:
    In BLS, if the SQL and/or Tag Query fail, the entire process quits (at least when Executed locally) and never gets to the Logic Conditional block to verify the tagQuery.success or SQLQuery.success.
    Is this what it does when run as a scheduled task also???
    Is there any onError then do... type of parameter???
    Any Suggestions would be appreciated...
    Thanks
    Dennis West

    Dennis, it shouldn't "quit".  The Success property should be set to false, and the error caught (typically).  Sounds like a bug.

  • How long should I let Safe Mode startup run? I keep getting the "disk0s2 i/o error" message and I'm just not sure when to give up and try a different method. I already ran disk utility in Recovery Mode and it said there were no repairs needed.

    How long should I let Safe Mode startup run? I keep getting the "disk0s2 i/o error" message and I'm just not sure when to give up and try a different method. I already ran disk utility in Recovery Mode and it said there were no repairs needed but it still kept getting stuck on the apple loading screen.

    You have limited opportunity to attempt to create a backup of your created files.
    That is what the SafeBoot or safemode appears to allow you -- at the moment.
    Since the hard disk drive exhibits signs of failure or other major issues, plan
    on a replacement in the near future. You may be able to get the computer to
    start up in a regular full OS X (not safe mode) but consider its hours are limited.
    An externally enclosed hard disk drive (with own power supply, not relying on
    Mac ports to run it) is a good basic means of which to use a disk utility to make
    a copy or a Clone of the current OS X. This may help retrieve an archive that
    could be used along with a Time Machine backup, to restore your Mac once you
    get a new hard drive installed inside.
    Good luck & happy computing!

  • Region Monitoring iOS 7 : didEnterRegion method is not calling when app is killed by user or by OS in iOS 7 only. It is working fine when it is in background. and the same code is working fine with iOS 6 for both app in suspended mode and background mode.

    Region Monitoring iOS 7 : didEnterRegion method is not calling when app is killed by user or by OS in iOS 7 only. It is working fine when it is in background. and the same code is working fine with iOS 6 for both app in suspended mode and background mode. What changes I have to made to work great in iOS 7 also.

    I rewrote code for debugging purpose and tried to catch error using GetLastError();  method,
    but it only printed 0. Below is code snippet; I think Create() throw an exception
    and code goes to catch block. 
    LONG ConnectTS(CString strIP, UINT n_Port)
    try{
              ErrorLog(0,0,"ConnectTS is calling Create [is going to call]","");
              if(!Create())
    // Exception Line
    n_Err = GetLastError();
    return NET_INIT;
    catch(...)
                       DWORD errorCode = GetLastError();
                       CString errorMessage
                       errorMessage.Format("%lu",errorCode);
                       ErrorLog (0, 0, "Image
    System", (LPTSTR)(LPCTSTR)errorMessage);
                       return  IS_ERR_WINDOWS;
    Output: -
    ConnectTS is calling Create [is going to call]
    Image System
    0

  • How I break then insert mod and change to query mod through coding

    Hi master
    Sir how I break then insert mod and change to query mod through coding
    Please give me idea
    Aamir

    what do you mean with query-mode? QUERY-mode or ENTER-QUERY-mode ?
    QUERY-mode is after you fetch successfully data from the database
    ENTER-QUERY-mode is the mode, when you insert your selection-criteria into the record

  • TRIM not working despite AHCI mode and fsutil behavior query disabledeletenotify returning 0

    Hi,
    I recently changed my mobo, from a Gigabyte to an ASRock X79 Extreme9.
    After doing so, being sure the new one is set up to be in AHCI mode, and updating the drivers of an existing Win 7 x64 installation using the provided DVD by ASRock, I used trimcheck-0.7.exe to check if trim was working, and it's not. Trim was working with
    the Gigabyte mobo.
    I made sure also that HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Msahci and HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\IastorV start value was 0, to no avail.
    The Device Manager shows that my 2 SSDs use an SCSI driver, which might be the source of the problem. I tried to update the driver automatically there, but Windows kept the ones existing, saying it could not find better.
    Then, I tried to remove the Intel RST component, and reboot, but it did not help.
    Any help would be greatly appreciated.

    Hi,
    Regarding to the TRIM capability and AHCI mode, I would like to share the following link with you, please see the reply posted by koitsu,
    https://communities.intel.com/message/76399
    This is hardware configuration question, not a Windows question, you will be well answered by your manufacture of the motherboard and ssd.
    NOTE
    This
    response contains a reference to a third party World Wide Web site. Microsoft is providing this information as a convenience to you.
    Microsoft does not control these
    sites and has not tested any software or information found on these sites.
    Yolanda Zhu
    TechNet Community Support

  • PL/SQL 101 : Cursors and SQL Projection

    PL/SQL 101 : Cursors and SQL Projection
    This is not a question, it's a forum article, in reponse to the number of questions we get regarding a "dynamic number of columns" or "rows to columns"
    There are two integral parts to an SQL Select statement that relate to what data is selected. One is Projection and the other is Selection:-
    Selection is the one that we always recognise and use as it forms the WHERE clause of the select statement, and hence selects which rows of data are queried.
    The other, SQL Projection is the one that is less understood, and the one that this article will help to explain.
    In short, SQL Projection is the collective name for the columns that are Selected and returned from a query.
    So what? Big deal eh? Why do we need to know this?
    The reason for knowing this is that many people are not aware of when SQL projection comes into play when you issue a select statement. So let's take a basic query...
    First create some test data...
    create table proj_test as
      select 1 as id, 1 as rn, 'Fred' as nm from dual union all
      select 1,2,'Bloggs' from dual union all
      select 2,1,'Scott' from dual union all
      select 2,2,'Smith' from dual union all
      select 3,1,'Jim' from dual union all
      select 3,2,'Jones' from dual
    ... and now query that data...
    SQL> select * from proj_test;
             ID         RN NM
             1          1 Fred
             1          2 Bloggs
             2          1 Scott
             2          2 Smith
             3          1 Jim
             3          2 Jones
    6 rows selected.
    OK, so what is that query actually doing?
    To know that we need to consider that all queries are cursors and all cursors are processed in a set manner, roughly speaking...
    1. The cursor is opened
    2. The query is parsed
    3. The query is described to know the projection (what columns are going to be returned, names, datatypes etc.)
    4. Bind variables are bound in
    5. The query is executed to apply the selection and identify the data to be retrieved
    6. A row of data is fetched
    7. The data values from the columns within that row are extracted into the known projection
    8. Step 6 and 7 are repeated until there is no more data or another condition ceases the fetching
    9. The cursor is closed
    The purpose of the projection being determined is so that the internal processing of the cursor can allocate memory etc. ready to fetch the data into. We won't get to see that memory allocation happening easily, but we can see the same query being executed in these steps if we do it programatically using the dbms_sql package...
    CREATE OR REPLACE PROCEDURE process_cursor (p_query in varchar2) IS
      v_sql       varchar2(32767) := p_query;
      v_cursor    number;            -- A cursor is a handle (numeric identifier) to the query
      col_cnt     integer;
      v_n_val     number;            -- numeric type to fetch data into
      v_v_val     varchar2(20);      -- varchar type to fetch data into
      v_d_val     date;              -- date type to fetch data into
      rec_tab     dbms_sql.desc_tab; -- table structure to hold sql projection info
      dummy       number;
      v_ret       number;            -- number of rows returned
      v_finaltxt  varchar2(100);
      col_num     number;
    BEGIN
      -- 1. Open the cursor
      dbms_output.put_line('1 - Opening Cursor');
      v_cursor := dbms_sql.open_cursor;
      -- 2. Parse the cursor
      dbms_output.put_line('2 - Parsing the query');
      dbms_sql.parse(v_cursor, v_sql, dbms_sql.NATIVE);
      -- 3. Describe the query
      -- Note: The query has been described internally when it was parsed, but we can look at
      --       that description...
      -- Fetch the description into a structure we can read, returning the count of columns that has been projected
      dbms_output.put_line('3 - Describing the query');
      dbms_sql.describe_columns(v_cursor, col_cnt, rec_tab);
      -- Use that description to define local datatypes into which we want to fetch our values
      -- Note: This only defines the types, it doesn't fetch any data and whilst we can also
      --       determine the size of the columns we'll just use some fixed sizes for this example
      dbms_output.put_line(chr(10)||'3a - SQL Projection:-');
      for j in 1..col_cnt
      loop
        v_finaltxt := 'Column Name: '||rpad(upper(rec_tab(j).col_name),30,' ');
        case rec_tab(j).col_type
          -- if the type of column is varchar2, bind that to our varchar2 variable
          when 1 then
            dbms_sql.define_column(v_cursor,j,v_v_val,20);
            v_finaltxt := v_finaltxt||' Datatype: Varchar2';
          -- if the type of the column is number, bind that to our number variable
          when 2 then
            dbms_sql.define_column(v_cursor,j,v_n_val);
            v_finaltxt := v_finaltxt||' Datatype: Number';
          -- if the type of the column is date, bind that to our date variable
          when 12 then
            dbms_sql.define_column(v_cursor,j,v_d_val);
            v_finaltxt := v_finaltxt||' Datatype: Date';
          -- ...Other types can be added as necessary...
        else
          -- All other types we'll assume are varchar2 compatible (implicitly converted)
          dbms_sql.DEFINE_COLUMN(v_cursor,j,v_v_val,2000);
          v_finaltxt := v_finaltxt||' Datatype: Varchar2 (implicit)';
        end case;
        dbms_output.put_line(v_finaltxt);
      end loop;
      -- 4. Bind variables
      dbms_output.put_line(chr(10)||'4 - Binding in values');
      null; -- we have no values to bind in for our test
      -- 5. Execute the query to make it identify the data on the database (Selection)
      -- Note: This doesn't fetch any data, it just identifies what data is required.
      dbms_output.put_line('5 - Executing the query');
      dummy := dbms_sql.execute(v_cursor);
      -- 6.,7.,8. Fetch the rows of data...
      dbms_output.put_line(chr(10)||'6,7 and 8 Fetching Data:-');
      loop
        -- 6. Fetch next row of data
        v_ret := dbms_sql.fetch_rows(v_cursor);
        -- If the fetch returned no row then exit the loop
        exit when v_ret = 0;
        -- 7. Extract the values from the row
        v_finaltxt := null;
        -- loop through each of the Projected columns
        for j in 1..col_cnt
        loop
          case rec_tab(j).col_type
            -- if it's a varchar2 column
            when 1 then
              -- read the value into our varchar2 variable
              dbms_sql.column_value(v_cursor,j,v_v_val);
              v_finaltxt := ltrim(v_finaltxt||','||rpad(v_v_val,20,' '),',');
            -- if it's a number column
            when 2 then
              -- read the value into our number variable
              dbms_sql.column_value(v_cursor,j,v_n_val);
              v_finaltxt := ltrim(v_finaltxt||','||to_char(v_n_val,'fm999999'),',');
            -- if it's a date column
            when 12 then
              -- read the value into our date variable
              dbms_sql.column_value(v_cursor,j,v_d_val);
              v_finaltxt := ltrim(v_finaltxt||','||to_char(v_d_val,'DD/MM/YYYY HH24:MI:SS'),',');
          else
            -- read the value into our varchar2 variable (assumes it can be implicitly converted)
            dbms_sql.column_value(v_cursor,j,v_v_val);
            v_finaltxt := ltrim(v_finaltxt||',"'||rpad(v_v_val,20,' ')||'"',',');
          end case;
        end loop;
        dbms_output.put_line(v_finaltxt);
        -- 8. Loop to fetch next row
      end loop;
      -- 9. Close the cursor
      dbms_output.put_line(chr(10)||'9 - Closing the cursor');
      dbms_sql.close_cursor(v_cursor);
    END;
    SQL> exec process_cursor('select * from proj_test');
    1 - Opening Cursor
    2 - Parsing the query
    3 - Describing the query
    3a - SQL Projection:-
    Column Name: ID                             Datatype: Number
    Column Name: RN                             Datatype: Number
    Column Name: NM                             Datatype: Varchar2
    4 - Binding in values
    5 - Executing the query
    6,7 and 8 Fetching Data:-
    1     ,1     ,Fred
    1     ,2     ,Bloggs
    2     ,1     ,Scott
    2     ,2     ,Smith
    3     ,1     ,Jim
    3     ,2     ,Jones
    1     ,3     ,Freddy
    1     ,4     ,Fud
    9 - Closing the cursor
    PL/SQL procedure successfully completed.
    So, what's really the point in knowing when SQL Projection occurs in a query?
    Well, we get many questions asking "How do I convert rows to columns?" (otherwise known as a pivot) or questions like "How can I get the data back from a dynamic query with different columns?"
    Let's look at a regular pivot. We would normally do something like...
    SQL> select id
      2        ,max(decode(rn,1,nm)) as nm_1
      3        ,max(decode(rn,2,nm)) as nm_2
      4  from proj_test
      5  group by id
      6  /
            ID NM_1   NM_2
             1 Fred   Bloggs
             2 Scott  Smith
             3 Jim    Jones
    (or, in 11g, use the new PIVOT statement)
    but many of these questioners don't understand it when they say their issue is that, they have an unknown number of rows and don't know how many columns it will have, and they are told that you can't do that in a single SQL statement. e.g.
    SQL> insert into proj_test (id, rn, nm) values (1,3,'Freddy');
    1 row created.
    SQL> select id
      2        ,max(decode(rn,1,nm)) as nm_1
      3        ,max(decode(rn,2,nm)) as nm_2
      4  from proj_test
      5  group by id
      6  /
            ID NM_1   NM_2
             1 Fred   Bloggs
             2 Scott  Smith
             3 Jim    Jones
    ... it's not giving us this 3rd entry as a new column and we can only get that by writing the expected columns into the query, but then what if more columns are added after that etc.
    If we look back at the steps of a cursor we see again that the description and projection of what columns are returned by a query happens before any data is fetched back.
    Because of this, it's not possible to have the query return back a number of columns that are based on the data itself, as no data has been fetched at the point the projection is required.
    So, what is the answer to getting an unknown number of columns in the output?
    1) The most obvious answer is, don't use SQL to try and pivot your data. Pivoting of data is more of a reporting requirement and most reporting tools include the ability to pivot data either as part of the initial report generation or on-the-fly at the users request. The main point about using the reporting tools is that they query the data first and then the pivoting is simply a case of manipulating the display of those results, which can be dynamically determined by the reporting tool based on what data there is.
    2) The other answer is to write dynamic SQL. Because you're not going to know the number of columns, this isn't just a simple case of building up a SQL query as a string and passing it to the EXECUTE IMMEDIATE command within PL/SQL, because you won't have a suitable structure to read the results back into as those structures must have a known number of variables for each of the columns at design time, before the data is know. As such, inside PL/SQL code, you would have to use the DBMS_SQL package, just like in the code above that showed the workings of a cursor, as the columns there are referenced by position rather than name, and you have to deal with each column seperately. What you do with each column is up to you... store them in an array/collection, process them as you get them, or whatever. They key thing though with doing this is that, just like the reporting tools, you would need to process the data first to determine what your SQL projection is, before you execute the query to fetch the data in the format you want e.g.
    create or replace procedure dyn_pivot is
      v_sql varchar2(32767);
      -- cursor to find out the maximum number of projected columns required
      -- by looking at the data
      cursor cur_proj_test is
        select distinct rn
        from   proj_test
        order by rn;
    begin
      v_sql := 'select id';
      for i in cur_proj_test
      loop
        -- dynamically add to the projection for the query
        v_sql := v_sql||',max(decode(rn,'||i.rn||',nm)) as nm_'||i.rn;
      end loop;
      v_sql := v_sql||' from proj_test group by id order by id';
      dbms_output.put_line('Dynamic SQL Statement:-'||chr(10)||v_sql||chr(10)||chr(10));
      -- call our DBMS_SQL procedure to process the query with it's dynamic projection
      process_cursor(v_sql);
    end;
    SQL> exec dyn_pivot;
    Dynamic SQL Statement:-
    select id,max(decode(rn,1,nm)) as nm_1,max(decode(rn,2,nm)) as nm_2,max(decode(rn,3,nm)) as nm_3 from proj_test group by id order by id
    1 - Opening Cursor
    2 - Parsing the query
    3 - Describing the query
    3a - SQL Projection:-
    Column Name: ID                             Datatype: Number
    Column Name: NM_1                           Datatype: Varchar2
    Column Name: NM_2                           Datatype: Varchar2
    Column Name: NM_3                           Datatype: Varchar2
    4 - Binding in values
    5 - Executing the query
    6,7 and 8 Fetching Data:-
    1     ,Fred                ,Bloggs              ,Freddy
    2     ,Scott               ,Smith               ,
    3     ,Jim                 ,Jones               ,
    9 - Closing the cursor
    PL/SQL procedure successfully completed.
    ... and if more data is added ...
    SQL> insert into proj_test (id, rn, nm) values (1,4,'Fud');
    1 row created.
    SQL> exec dyn_pivot;
    Dynamic SQL Statement:-
    select id,max(decode(rn,1,nm)) as nm_1,max(decode(rn,2,nm)) as nm_2,max(decode(rn,3,nm)) as nm_3,max(decode(rn,4,nm)) as nm_4 from proj_test group by id order by id
    1 - Opening Cursor
    2 - Parsing the query
    3 - Describing the query
    3a - SQL Projection:-
    Column Name: ID                             Datatype: Number
    Column Name: NM_1                           Datatype: Varchar2
    Column Name: NM_2                           Datatype: Varchar2
    Column Name: NM_3                           Datatype: Varchar2
    Column Name: NM_4                           Datatype: Varchar2
    4 - Binding in values
    5 - Executing the query
    6,7 and 8 Fetching Data:-
    1     ,Fred                ,Bloggs              ,Freddy              ,Fud
    2     ,Scott               ,Smith               ,                    ,
    3     ,Jim                 ,Jones               ,                    ,
    9 - Closing the cursor
    PL/SQL procedure successfully completed.
    Of course there are other methods, using dynamically generated scripts etc. (see Re: 4. How do I convert rows to columns?), but the above simply demonstrates that:-
    a) having a dynamic projection requires two passes of the data; one to dynamically generate the query and another to actually query the data,
    b) it is not a good idea in most cases as it requires code to handle the results dynamically rather than being able to simply query directly into a known structure or variables, and
    c) a simple SQL statement cannot have a dynamic projection.
    Most importantly, dynamic queries prevent validation of your queries at the time your code is compiled, so the compiler can't check that the column names are correct or the tables names, or that the actual syntax of the generated query is correct. This only happens at run-time, and depending upon the complexity of your dynamic query, some problems may only be experienced under certain conditions. In effect you are writing queries that are harder to validate and could potentially have bugs in them that would are not apparent until they get to a run time environment. Dynamic queries can also introduce the possibility of SQL injection (a potential security risk), especially if a user is supplying a string value into the query from an interface.
    To summarise:-
    The projection of an SQL statement must be known by the SQL engine before any data is fetched, so don't expect SQL to magically create columns on-the-fly based on the data it's retrieving back; and, if you find yourself thinking of using dynamic SQL to get around it, just take a step back and see if what you are trying to achieve may be better done elsewhere, such as in a reporting tool or the user interface.
    Other articles in the PL/SQL 101 series:-
    PL/SQL 101 : Understanding Ref Cursors
    PL/SQL 101 : Exception Handling

    excellent article. However there is one thing which is slightly erroneous. You don't need a type to be declared in the database to fetch the data, but you do need to declare a type;
    here is one of my unit test scripts that does just that.
    DECLARE
    PN_CARDAPPL_ID NUMBER;
    v_Return Cci_Standard.ref_cursor;
    type getcardapplattrval_recordtype
    Is record
    (cardappl_id ci_cardapplattrvalue.cardappl_ID%TYPE,
    tag ci_cardapplattrvalue.tag%TYPE,
    value ci_cardapplattrvalue.value%TYPE
    getcardapplattrvalue_record getcardapplattrval_recordtype;
    BEGIN
    PN_CARDAPPL_ID := 1; --value must be supplied
    v_Return := CCI_GETCUSTCARD.GETCARDAPPLATTRVALUE(
    PN_CARDAPPL_ID => PN_CARDAPPL_ID
    loop
    fetch v_return
    into getcardapplattrvalue_record;
    dbms_output.put_line('Cardappl_id=>'||getcardapplattrvalue_record.cardappl_id);
    dbms_output.put_line('Tag =>'||getcardapplattrvalue_record.tag);
    dbms_output.put_line('Value =>'||getcardapplattrvalue_record.value);
    exit when v_Return%NOTFOUND;
    end loop;
    END;

  • Link Tag Query to SQL Query

    I have tag values that are steps or alarm codes in a process and I want to create a cross reference SQL table for these values.  Using BLS, how do you link these tables to have an Xacute query that will return the cross referenced values from the tag?
    I appreciate any help that can be provided.
    Thanks
    Larry

    Rick,
    One other note I missed in my last e-mail, the Tag Query does not return the TagName as a value in a column but as the column header for the value of the tag.  How is it possible to get this as the value of its own column to be used as a link in the Joiner action.
    Current query returns
    DateTime                              XV43003     -
    (header row)
    03/22/2007 03:30:00                     0
    03/22/2007 03:31:00                     2
    03/22/2007 03:32:00                     8
    Preferred method
    DateTime                              TagName         Value     -
    (header row)
    03/22/2007 03:30:00              XV43003              0
    03/22/2007 03:31:00              XV43003              2
    03/22/2007 03:32:00              XV43003              8
    This would give me both the tagname and value to join to the sql table.
    Thanks for your help!
    Larry

  • Using the Tag Query's Statistics Mode

    I've tried to come up with the average of tag values over a span of a month two different ways:
    1.  using the Statistics mode
    2.  using the HistoryEvent mode and averaging the resultant data with an average function.
    For most tags the two results are quite similar.  I have one tag however that contains a lot of zero values and for that tag, the Statistics mode average is about 10% higher than the HistoryEvent mode average. It's like the Statistics mode ignores the zero values when doing it's calculations.
    Is this correct?  My tag query is returning data from six tags - does this cause the Statistics mode any heartache?
    David Macindoe

    David,
    Martin K - if you are listening please feel free to correct or confirm the following:
    The PI UDS uses native Statistical method calls to the PI API/SDK, so the results are coming directly from PI.  In this case, xMII is not doing a HistoryEvent call and then doing the average calcs on the raw event records, like it would be if you did it in BLS.
    Regards,
    Jeremy

  • How to pass 100+ tags in a single sql/tag query

    <b>In my current application I have to pass 180+ tags in a single query to retrieve data from iHistorian.
    I want to know how to pass more thatn 100 tags in a single SQL or TAG Query using OLEDB or UDC connectors.
    If anybody has done it in the past ,please share with me and also let me know how to do it</b>

    Currently you can only query a maximum of 128 tags, this is a hard limit and may be changed in future releases of xMII, but < 11.5 (im fairly sure 12 also) all have the limit of 128.
    As for iHistorian with the OLEDB UDS, you can write a query that would return over 128 queries becasue this is an input limitation on the UDSs. (PLEASE DON"T ASK US HOW TO WRITE iHistorian QUERIES... we aren't experts in iHistorian).
    Please note that though you can query over 128 tags, the performance may not be what you expect... This may take a very long time to return.
    Martin

  • Tag Query error when assigning mode in Link Editor

    Hi,
    I am receiving the following error when I run a transaction that contains a Tag query.  In my transaction I have set the tag query mode to be defined by a local property via the link editor.  Then when I execute the transaction I get the following error:
    "[ERROR] [TAG_QUERY_WRITE_TAG]com.sap.xmii.Illuminator.logging.LHException: Mode parameter was not found or is malformed"
    I am on version 12.1.8 Build(43).
    I import the project into MII 12.1.4 Build(53) and the transaction works as expected.
    In the new verson of MII is there a bug utilizing the link editor to set mode?  Or is there a specific way it wants the mode linked. (e.g. specific case)
    Additional details:
    I am trying to use the CurrentWrite mode.  I have tried setting Current mode via the link editor and that isn't working either. 
    Any suggestions?
    Thanks,
    Justin

    Hi Mike,
    Please correct me if I understand the logic incorrectly.  But here is what I am thinking....
    I have configured my transaction to have this flow:
    String_List_To_XML_Sequence --> Repeater_TagValues  -->  TAG_QUERY_WRITE_TAG
    The String_List_To_XML_Sequence would contain String_List_To_XML_Parser_TagValues & String_List_To_XML_Parser_TagNames.
    With this flow the logic would say that I will be running a seperate tag query for each tag value.  So if I have 100 tag values that I want to send, I will run 100 tag queries.  This doesn't seem the most efficient.  Also if this is how you were thinking I would configure the transaction then I could use the following links:
    Target Xpath: TAG_QUERY_WRITE_TAG.TagValue.1
    Expression: String_List_To_XML_Parser_TagValues.Output{/Rowsets/Rowset/Row[#Repeater_TagInput.CurrentItem#]/Item}
    Target Xpath: TAG_QUERY_WRITE_TAG.TagName.1
    Expression: String_List_To_XML_Parser_TagNames.Output{/Rowsets/Rowset/Row[#Repeater_TagInput.CurrentItem#]/Item}
    With this configuration updating 10 tags takes 2 seconds:
    [INFO] Statistics [Load = 35 ms msec, Parse = 35 ms, Execution = 2015 ms, Total = 2067 ms]
    With my old configuration updating 10 tags took 700 ms. (original transaction on 12.1.4 Build (53)
    [INFO] Statistics [Load = 11.137 msec, Parse = 224.113 msec, Execution = 451.78 msec, Total = 736.62 msec]
    Please let me know if I interpreted your thoughts incorrectly.
    If there is a way to utilize only 1 Tag Query, then I am not sure how the transaction flow should be configured.
    Thanks for your time,
    Justin
    Edited by: Justin M Brown on Jul 14, 2011 7:29 PM
    Edited by: Justin M Brown on Jul 14, 2011 7:33 PM

  • JPA and SQL Server 2005's  Identity Don't Play Nice

    I am hoping someone with a little more senior knowledge of JPASQL Server 2005 can point me in the right direction involving SQL Server 2005.
    What I am discovering is JPA and SQL Server 2005 do not playing nice together when you use the IDENTITY capability for
    a primary key. Let me explain. "...when using the IDENTITY as the generator type, the value for the identity field MAY NOT be
    available BEFORE the entity data is saved in the database because typically it is generated when a record is committed."
    My problem is this.
    I have two separate entities using another entity, for example a car has a wheel and a truck has a wheel. I can go through the steps
    of persisting the car and the wheel with no problem. My problem begins when I try to
    find the wheel (by wheel description) for the truck. The query finds the wheel, but does not return the primary key for wheel.
    Therefore, using the PK inside a query generates this error:
    java.lang.IllegalArgumentException: An instance of a null PK has been incorrectly provided for this find operation. What has been my appraoch to get the entity to commit to the database is to use the following sequence for the Entity Manager.
    entityManager.persist(wheel);
    entityManager.flush(); I have assumed to this point this action would persist my data because I have been able to retrieve the PK immidately after these steps.
    However, multiple actions taking place within the same transaction, appears to render the persist and flush steps useless. My initial thought
    is to wrap the wheel creation within a new transaction so when the method/transaction completes, the data would be in the database. This
    has not worked.
    I hope I have explained this clear for others to follow. I apologize in advance if I have not. Any suggestions are greatly appreciated.

    I am hoping someone with a little more senior knowledge of JPASQL Server 2005 can point me in the right direction involving SQL Server 2005.
    What I am discovering is JPA and SQL Server 2005 do not playing nice together when you use the IDENTITY capability for
    a primary key. Let me explain. "...when using the IDENTITY as the generator type, the value for the identity field MAY NOT be
    available BEFORE the entity data is saved in the database because typically it is generated when a record is committed."
    My problem is this.
    I have two separate entities using another entity, for example a car has a wheel and a truck has a wheel. I can go through the steps
    of persisting the car and the wheel with no problem. My problem begins when I try to
    find the wheel (by wheel description) for the truck. The query finds the wheel, but does not return the primary key for wheel.
    Therefore, using the PK inside a query generates this error:
    java.lang.IllegalArgumentException: An instance of a null PK has been incorrectly provided for this find operation. What has been my appraoch to get the entity to commit to the database is to use the following sequence for the Entity Manager.
    entityManager.persist(wheel);
    entityManager.flush(); I have assumed to this point this action would persist my data because I have been able to retrieve the PK immidately after these steps.
    However, multiple actions taking place within the same transaction, appears to render the persist and flush steps useless. My initial thought
    is to wrap the wheel creation within a new transaction so when the method/transaction completes, the data would be in the database. This
    has not worked.
    I hope I have explained this clear for others to follow. I apologize in advance if I have not. Any suggestions are greatly appreciated.

  • Why is my transaction file growing when db is in simple mode and there are no waiting transactions

    I noticed that my database log file has reached 160GBs. My database is in simple recovery mode and I used the following query to see if there are
    any waiting transactions.
    SELECT d.name,d.log_reuse_wait,d.log_reuse_wait_desc
    FROM sys.databases d
    It returned log_reuse_wait 0 and description 'NOTHING'. Why is my transaction log growing then? Please advice.
    mayooran99

    DBCC opentran will give spid, you can pass that spid to dbcc inputbuffer(spid) to see what transaction is running. Also you can do sp_who2 spid will give other details.
    You can make use of these queries to get MORE details about OPEN and ACTIVE transactions.
    SELECT
    trans.session_id as [Session ID],
    trans.transaction_id as [Transaction ID],
    tas.name as [Transaction Name],
    tds.database_id as [Database ID]
    FROM sys.dm_tran_active_transactions tas
    INNER JOIN sys.dm_tran_database_transactions tds
    ON (tas.transaction_id = tds.transaction_id )
    INNER JOIN sys.dm_tran_session_transactions trans
    ON (trans.transaction_id=tas.transaction_id)
    WHERE trans.is_user_transaction = 1 -- user
    AND tas.transaction_state = 2 -- active
    AND tds.database_transaction_begin_time IS NOT NULL
    SELECT
        [s_tst].[session_id],
        [s_es].[login_name] AS [Login Name],
        DB_NAME (s_tdt.database_id) AS [Database],
        [s_tdt].[database_transaction_begin_time] AS [Begin Time],
        [s_tdt].[database_transaction_log_bytes_used] AS [Log Bytes],
        [s_tdt].[database_transaction_log_bytes_reserved] AS [Log Rsvd],
        [s_est].text AS [Last T-SQL Text],
        [s_eqp].[query_plan] AS [Last Plan]
    FROM
        sys.dm_tran_database_transactions [s_tdt]
    JOIN
        sys.dm_tran_session_transactions [s_tst]
    ON
        [s_tst].[transaction_id] = [s_tdt].[transaction_id]
    JOIN
        sys.[dm_exec_sessions] [s_es]
    ON
        [s_es].[session_id] = [s_tst].[session_id]
    JOIN
        sys.dm_exec_connections [s_ec]
    ON
        [s_ec].[session_id] = [s_tst].[session_id]
    LEFT OUTER JOIN
        sys.dm_exec_requests [s_er]
    ON
        [s_er].[session_id] = [s_tst].[session_id]
    CROSS APPLY
        sys.dm_exec_sql_text ([s_ec].[most_recent_sql_handle]) AS [s_est]
    OUTER APPLY
        sys.dm_exec_query_plan ([s_er].[plan_handle]) AS [s_eqp]
    ORDER BY
        [Begin Time] ASC;
    GO
    http://www.sqlskills.com/blogs/paul/script-open-transactions-with-text-and-plans/

  • Wanted to connect Voyager (BO XI R3 ) and sql 2000 (MSAS).

    We have never used Voyager before, wanted to connect Voyager (BO XI R3 ) and sql 2000 (MSAS).
    Details of my Voyager connection below:
    Name: test
    Provider: Microsoft OLE DB Provider for OLAP Services 8.0
    Server Information:
    Server Type: Server
    Server: <server ip address>
    Data Source:
    User: <domain>\<NT account>
    Password: <NT password>
    Error: "No cubes were found. Please check that you have permission to view cubes on the database server."
    I tried accessing the cubes using the same data connection via excel pivottable and was able to access the cubes successfully.
    Badly need help!

    Hi Everyone,
    As I have been working with MSAS 2000 and I reconmmend to install SP4 on the MSAS 2000 since BO XI R3 according to Support plataform it says that it is supported from SP4.
    As well verified that you have access using the security role using your Windows AD authentication since Microsoft Service Analysis does not have its own authentication mode to connect basicatly rely on Windows AD.
    If you want more information check this link.
    http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/anservog.mspx#EXFAE
    Also check the service account on the same link above.
    Receive Client Security Credentials Via Middle-Tier Application
    The MSSQLServerOLAPService service account is irrelevant in the typical client-server environment. In this environment, the user application connects directly to the Analysis Services computer to execute a query or create an object, and passes the user's credentials directly to Analysis Services for evaluation. Access is granted or denied by Analysis Services based on cell-level and dimension-level security.
    However, if the client application attempts to connect to Analysis Services through a middle-tier server, the authentication process is not quite so simple. Normally, security credentials cannot be passed over multiple computers. However, if the middle-tier application server and the Analysis Services computer support Kerberos authentication and delegation, the client's security credentials can be passed by the middle-tier application to Analysis Services.
    For Kerberos authentication, delegation, impersonation, and mutual authentication to work, the MSSQLServerOLAPService service must run under one of the following types of accounts:
    u2022 Local system account (which has no network access rights).
    u2022 Domain administrator account.
    u2022 Domain user without administrative privileges in the Microsoft Active Directory domain, provided that a domain administrator registers the Service Principal Name (SPN) for the account separately using the setspn utility in the Windows 2000 Resource Kit.
    Note   There are a number of steps you must follow to permit Kerberos authentication, delegation, impersonation, and mutual authentication to work. For information about these steps, see "Security Account Delegation" in SQL Server Books Online. Also go to Knowledge Base (support.microsoft.com) and see the article "Use Kerberos Authentication for Microsoft SQL Server 2000 Analysis Services."
    Miguel Jimenez

Maybe you are looking for

  • PostMessage not working in IE

    I am trying to post messages from my pdf document to the web browser.  I have the following code snippet below.  I am receiving the alert from the pdf file in all browsers.  I am not getting the postMessage back in IE but I am in Chrome. function onM

  • Production order confirmed quantity

    hello! can i change the settings, that production order atp-check confirms quantities from e.g. purchase requisitions, not only stock in warehouse? best regards, matthias

  • Regenerative planning run in APO

    Hi experts, I need your input on the following scenario. We have APO PPDS implemented for our Manufacturing industry client. We have designed to run only Net Change Planning every day and there is no design to run Regenerative Planning. What is the i

  • I cant maintain illustrator resolution when I import my .ai, .eps, or .png into fireworks

    When I try to import my image i made in illustrator into fireworks the resolution is really bad.  i've tried to copy and paste it as well as open the (.ai, .eps, and .png) file in fireworks. The resolution on the illustrator file is 72 ppi as well as

  • Newbie help needed on apparent lockout

    Hi all, I am pretty much Mac illiterate but can stumble around. I was helping a friend who is even worse off install Retrospect (6.1) on her iMac (OS X 10.4.11). Install went fine, a backup ran fine and then today her machine appears to have locked u