SQL: GROUP BY and UNION addition issue

I'm looking to get the TOTALS of the grouping. What I'm getting now is either a 1 or a 2, depending on whether each PRIMARY_ASSIGNMENT_GROUP appears in one or both queries, when I run the following:
SELECT a.PRIMARY_ASSIGNMENT_GROUP,GROUP(a.PRIMARY_ASSIGNMENT_GROUP) from (
SELECT
COUNT(PRIMARY_ASSIGNMENT_GROUP),PRIMARY_ASSIGNMENT_GROUP
from
SMINCREQ
LEFT JOIN SMOPERATOR on SMINCREQ.OPENED_BY=SMOPERATOR.NAME
WHERE
open_time BETWEEN to_date(:P246_SDATEB,'DD-MON-YYYY') AND to_date(:P246_EDATEB,'DD-MON-YYYY') and
((to_char(CAST((FROM_TZ(CAST(open_time AS TIMESTAMP),'+00:00') AT TIME ZONE 'US/Eastern') AS DATE),'HH24') between '00' and '08') or (to_char(CAST((FROM_TZ(CAST(open_time AS TIMESTAMP),'+00:00') AT TIME ZONE 'US/Eastern') AS DATE),'HH24') between '19' and '24')) GROUP BY PRIMARY_ASSIGNMENT_GROUP
UNION ALL
SELECT
COUNT(PRIMARY_ASSIGNMENT_GROUP),PRIMARY_ASSIGNMENT_GROUP
from SMINTERACTIONS
WHERE
open_time BETWEEN to_date(:P246_SDATEB,'DD-MON-YYYY') AND to_date(:P246_EDATEB,'DD-MON-YYYY') and
((to_char(CAST((FROM_TZ(CAST(open_time AS TIMESTAMP),'+00:00') AT TIME ZONE 'US/Eastern') AS DATE),'HH24') between '00' and '08') or (to_char(CAST((FROM_TZ(CAST(open_time AS TIMESTAMP),'+00:00') AT TIME ZONE 'US/Eastern') AS DATE),'HH24') between '19' and '24')) GROUP BY PRIMARY_ASSIGNMENT_GROUP
) a
GROUP BY (a.PRIMARY_ASSIGNMENT_GROUP)
ORDER by COUNT(a.PRIMARY_ASSIGNMENT_GROUP)

bostonmacosx wrote:
(question and query quoted above)
Instead of using UNION ALL you should be able to get by with GROUPING SETS or ROLLUP.
If you'd care to put together a small test case with data (representing what you have) and some sample output (what you need), along with your Oracle version:
select * from v$version
I'm sure someone will help you out.
Cheers,
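As a rough sketch of what the original poster seems to be after (summing the per-branch counts from the UNION ALL rather than counting the group keys again in the outer query); table and column names are taken from the post, and the WHERE clauses are omitted here for brevity:
SELECT   a.PRIMARY_ASSIGNMENT_GROUP
        ,SUM(a.cnt)   AS total_cnt
FROM  (
        SELECT PRIMARY_ASSIGNMENT_GROUP, COUNT(*) AS cnt
        FROM   SMINCREQ
        GROUP BY PRIMARY_ASSIGNMENT_GROUP
        UNION ALL
        SELECT PRIMARY_ASSIGNMENT_GROUP, COUNT(*) AS cnt
        FROM   SMINTERACTIONS
        GROUP BY PRIMARY_ASSIGNMENT_GROUP
      ) a
GROUP BY a.PRIMARY_ASSIGNMENT_GROUP
ORDER BY total_cnt;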

Similar Messages

  • SQL Server Import and Export Wizard Issue

    I am trying to export SQL data into Excel to send out weekly reports.  I have created a view and a SQL account has access to this view; however, I am unable to successfully export the data.  In preview I see all of the data, yet it fails on the
    Pre-execute phase with the errors below.  It creates the Excel file with just the header.  I am using SQL 2014 and loaded the 64-bit Access Database Engine.  I am selecting Excel 2007.  Any ideas welcome.
    Messages
    Error 0xc0202009: Data Flow Task 1: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E37.
     (SQL Server Import and Export Wizard)
    Error 0xc02020e8: Data Flow Task 1: Opening a rowset for "WeeklyList" failed. Check that the object exists in the database.
     (SQL Server Import and Export Wizard)
    Error 0xc004701a: Data Flow Task 1: Destination - WeeklyList failed the pre-execute phase and returned error code 0xC02020E8.
     (SQL Server Import and Export Wizard)

    Hi astro,
    Please ensure that you haven’t renamed or moved the destination Excel file during the export process. Also make sure that the “Create destination table” option is checked, as shown in the screenshot below, and that the SQL statement is correct.
    For more details about using SQL Server Import/Export Wizard to export data from a SQL Server database to an Excel spreadsheet, please review the below blog.
    http://www.mssqltips.com/sqlservertutorial/202/simple-way-to-export-data-from-sql-server/
    Thanks,
    Lydia Zhang
    TechNet Community Support

  • Need Help With SQL GROUP BY and DISTINCT

    I am working on a project and need to display the total of each order based on the order id. For instance I want to display the order id, customer id, order date, and then the extension price (ol_quantity * inv_price).
    I would then like a total displayed for order # 1 and then move on to order #2.
    Here is my SQL code :
    SELECT DISTINCT orders.o_id, customer.c_id, inv_price * ol_quantity
    FROM orders, customer, inventory, order_line
    GROUP BY orders.o_id, customer.c_id, inv_price, ol_quantity
    ORDER BY orders.o_id;
    When my code is run it displays the order id, customer id and inv_price * quantity (extension price), but there is no order total for each order number, and a new group is not started when a new order number starts... they are all clumped together.
    Any help is greatly appreciated!!

    Hi,
    user12036843 wrote:
    (question quoted above)
    Sorry, it's unclear what you want.
    Whenever you post a question, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only), and the results you want from that data.
    Explain, using specific examples, how you get those results from that data.
    Always say what version of Oracle you're using.
    Do you want the output to contain one row for each row in the table, plus an extra row for each distinct order, showing something about the order as a whole (e.g., total inv_price or average extension_price)? If so, you need GROUP BY ROLLUP or GROUP BY GROUPING SETS .
    If you want one row of output for each row of the table, but you want to include something that reflects the group as a whole (again, e.g., total inv_price or average extension_price), then you can use analytic functions. (Most of the aggregate functions, such as SUM and AVG, have analytic counterparts that can get the same results without collapsing the result set down to one row per group.)
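    As an aside, a minimal sketch of the analytic approach described above, using scott.emp: it keeps one row per employee and adds the department total to every row without collapsing the groups.
    SELECT    deptno
    ,         ename
    ,         sal
    ,         SUM (sal) OVER (PARTITION BY deptno)   AS dept_total_sal
    FROM      scott.emp
    ORDER BY  deptno
    ,         ename
    ;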
    Here's an example of how to use GROUP BY GROUPING SETS.
    Say we're interested in employees' salary and commission from the scott.emp table:
    SELECT       deptno
    ,       ename
    ,       sal
    ,       comm
    FROM       scott.emp
    ORDER BY  deptno
    ,            ename
    ;
    Output:
        DEPTNO ENAME             SAL       COMM
            10 CLARK            2450
            10 KING             5000
            10 MILLER           1300
            20 ADAMS            1100
            20 FORD             3000
            20 JONES            2975
            20 SCOTT            3000
            20 SMITH             800
            30 ALLEN            1600        300
            30 BLAKE            2850
            30 JAMES             950
            30 MARTIN           1250       1400
            30 TURNER           1500          0
            30 WARD             1250        500
    Now say we want to add the total income (sal + comm, or just sal if there is no comm) to each row, and also to add a row for each department showing the total sal, comm and income in that department, like this:
        DEPTNO ENAME             SAL       COMM     INCOME
            10 CLARK            2450                  2450
            10 KING             5000                  5000
            10 MILLER           1300                  1300
            10                  8750                  8750
            20 ADAMS            1100                  1100
            20 FORD             3000                  3000
            20 JONES            2975                  2975
            20 SCOTT            3000                  3000
            20 SMITH             800                   800
            20                 10875                 10875
            30 ALLEN            1600        300       1900
            30 BLAKE            2850                  2850
            30 JAMES             950                   950
            30 MARTIN           1250       1400       2650
            30 TURNER           1500          0       1500
            30 WARD             1250        500       1750
            30                  9400       2200      11600
    (This relies on the fact that ename is unique.) Getting those results is pretty easy, using GROUPING SETS:
    SELECT       deptno
    ,       ename
    ,       SUM (sal)          AS sal
    ,       SUM (comm)          AS comm
    ,       SUM ( sal
               + NVL (comm, 0)
               )               AS income
    FROM       scott.emp
    GROUP BY  GROUPING SETS ( (deptno)
                             , (deptno, ename)
                             )
    ORDER BY  deptno
    ,            ename
    ;
    Notice that we're displaying SUM (sal) on each row. Most of the rows in the output are "groups" consisting of only one row from the table, so the SUM (sal) for that group will be the sal for the one row in the group.
    Edited by: Frank Kulash on Nov 23, 2011 2:03 PM
    Added GROUPING SET example
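    Incidentally, the same two grouping sets can also be produced with ROLLUP; a minimal equivalent sketch:
    SELECT    deptno
    ,         ename
    ,         SUM (sal)                  AS sal
    ,         SUM (comm)                 AS comm
    ,         SUM (sal + NVL (comm, 0))  AS income
    FROM      scott.emp
    GROUP BY  deptno, ROLLUP (ename)
    ORDER BY  deptno
    ,         ename
    ;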

  • UNION ALL and UNION performance issue

    Hi All,
    I am trying to identify the data for which only a receive transaction has been done and further processing is pending. These transactions include all POs, RMAs, ISOs, etc.
    I have to use UNION ALL in this case because, for RMA and ISO, the details I want cannot be gathered in a single query.
    But the query is taking a lot of time: around 30 minutes with UNION ALL, while it is 6 to 7 minutes with UNION.
    To get all records I must use UNION ALL...
    So kindly suggest the solution for this problem
    Thanks
    Sachin
    Query is given below...
    SELECT /* + FIRST_ROWS */ DECODE(rsl.SOURCE_DOCUMENT_CODE,'REQ',(SELECT org1.ORGANIZATION_NAME
                                                           FROM     org_organization_definitions org1
                                                           WHERE org1.ORGANIZATION_ID =
                                                           rsl.FROM_ORGANIZATION_ID)) Vendor_Name
    ,rsh.RECEIPT_NUM Receipt_Number
         ,TO_CHAR(rt3.TRANSACTION_DATE,'Mon-DD-YYYY HH:MM:SS') Receipt_Date_and_Time
         ,msi.SEGMENT1 Part_Number
         ,msi.DESCRIPTION Part_Name
         ,rt3.QUANTITY Quantity
         ,rt3.UNIT_OF_MEASURE UOM
         ,NULL ASL_Status
         --for ISO no asl flag ASL Flag
         ,TO_CHAR(TRUNC((((86400*(SYSDATE-rt3.TRANSACTION_DATE))/60)/60)/24))|| ' Days ' || TO_CHAR(TRUNC(((86400*(SYSDATE-rt3.TRANSACTION_DATE))/60)/60)-24*(TRUNC((((86400*(SYSDATE-rt3.TRANSACTION_DATE))/60)/60)/24)))|| ' Hours' Days_and_hours_passed
         ,DECODE(
                        NVL(msi.max_minmax_quantity,0) ,
                        0 , 0 ,
                        (NVL(msi.max_minmax_quantity,0) -
                        NVL(inmohqd.onhand,0))
                             * 100
                             / NVL(msi.max_minmax_quantity,0)
                        ) gap_percent
    FROM rcv_transactions rt3
         ,rcv_shipment_headers rsh
         ,rcv_shipment_lines rsl
         ,mtl_system_items msi
         ,org_organization_definitions org
         --,MTL_ONHAND_QUANTITIES_DETAIL moqhd
         ,(SELECT NVL(SUM(primary_transaction_quantity),0) onhand,INVENTORY_ITEM_ID item_id,ORGANIZATION_ID organization_id
         FROM      mtl_onhand_quantities_detail
         WHERE SUBINVENTORY_CODE NOT IN ('Wip_SF','Wip_Int','Reject','Scrap','FG Trading','FG')
         GROUP BY INVENTORY_ITEM_ID, ORGANIZATION_ID) inmohqd
    WHERE inmohqd.item_id(+) = msi.INVENTORY_ITEM_ID
         AND inmohqd.organization_id(+) = msi.ORGANIZATION_ID
         --AND inmoqhd.SUBINVENTORY_CODE NOT IN  ('Wip_SF','Wip_Int','Reject','Scrap','FG Trading','FG')
         AND msi.INVENTORY_ITEM_ID = rsl.ITEM_ID
         AND rsh.SHIPMENT_HEADER_ID = rsl.SHIPMENT_HEADER_ID
         AND org.ORGANIZATION_ID = rt3.ORGANIZATION_ID
         AND msi.ORGANIZATION_ID = rt3.ORGANIZATION_ID
         AND rsh.SHIPMENT_HEADER_ID = rt3.SHIPMENT_HEADER_ID
         AND rsl.SHIPMENT_HEADER_ID = rt3.SHIPMENT_HEADER_ID
         AND rsl.SHIPMENT_LINE_ID = rt3.SHIPMENT_LINE_ID
         AND rt3.PO_HEADER_ID IS NULL
         AND TRUNC(rt3.TRANSACTION_DATE) <= TRUNC(p_tilldate)
         AND rsl.TO_ORGANIZATION_ID = p_organization_id
         AND rsh.ORGANIZATION_ID = p_organization_id
         AND CONCAT(TRIM(rt3.SHIPMENT_HEADER_ID),TRIM(rt3.SHIPMENT_LINE_ID)) IN (
         SELECT CONCAT(TRIM(rt1.SHIPMENT_HEADER_ID),TRIM(rt1.SHIPMENT_LINE_ID))
         FROM     rcv_transactions rt1
         WHERE NOT EXISTS(
         SELECT 1
              FROM     rcv_transactions rt2
              WHERE     rt2.TRANSACTION_TYPE <> 'RECEIVE'
                        AND rt1.SHIPMENT_HEADER_ID = rt2.SHIPMENT_HEADER_ID
                        AND rt1.SHIPMENT_LINE_ID = rt2.SHIPMENT_LINE_ID
                        AND rt2.ORGANIZATION_ID = p_organization_id))
    UNION
    SELECT /* + FIRST_ROWS */ pv.VENDOR_NAME Vendor_Name
         ,rsh.RECEIPT_NUM Receipt_Number
         ,TO_CHAR(rt.TRANSACTION_DATE,'Mon-DD-YYYY HH:MM:SS') Receipt_Date_and_Time
         ,msi.SEGMENT1 Part_Number
         ,msi.DESCRIPTION Part_Name
         ,rt.QUANTITY Quantity
         ,rt.UNIT_OF_MEASURE UOM
         --start 001
         ,NVL((SELECT DISTINCT DECODE (ASL_STATUS_ID,1,'New',2,'Approved','To be checked')
                   FROM po_approved_supplier_list pasl
                   WHERE pasl.item_id=rsl.ITEM_ID
                             AND pasl.VENDOR_ID(+) = pv.VENDOR_ID
                             AND pasl.VENDOR_SITE_ID(+) = pvs.VENDOR_SITE_ID),'No_data') ASL_Status
              --end 001
              ,TO_CHAR(TRUNC((((86400*(SYSDATE-rt.TRANSACTION_DATE))/60)/60)/24))|| ' Days ' || TO_CHAR(TRUNC(((86400*(SYSDATE-rt.TRANSACTION_DATE))/60)/60)-24*(TRUNC((((86400*(SYSDATE-rt.TRANSACTION_DATE))/60)/60)/24)))|| ' Hours' Days_and_hours_passed          ,DECODE(
                   NVL(msi.max_minmax_quantity,0) ,
              0 , 0 ,
              (NVL(msi.max_minmax_quantity,0) -
              NVL(inmohqd.onhand,0))
                   * 100
                   / NVL(msi.max_minmax_quantity,0)
              ) gap_percent
    FROM rcv_transactions rt
         ,po_vendors pv
         ,po_vendor_sites_all pvs
         ,rcv_shipment_headers rsh
         ,rcv_shipment_lines rsl
         ,mtl_system_items msi
         ,org_organization_definitions org
         --,mtl_onhand_quantities_detail moqhd
         ,(SELECT NVL(SUM(primary_transaction_quantity),0) onhand,INVENTORY_ITEM_ID item_id,ORGANIZATION_ID organization_id
         FROM      mtl_onhand_quantities_detail
         WHERE SUBINVENTORY_CODE NOT IN ('Wip_SF','Wip_Int','Reject','Scrap','FG Trading','FG')
         GROUP BY INVENTORY_ITEM_ID, ORGANIZATION_ID) inmohqd
    WHERE inmohqd.item_id(+) = msi.INVENTORY_ITEM_ID
         AND inmohqd.ORGANIZATION_ID(+) = msi.ORGANIZATION_ID
         --AND inmoqhd.SUBINVENTORY_CODE NOT IN  ('Wip_SF','Wip_Int','Reject','Scrap','FG Trading','FG')
         AND msi.INVENTORY_ITEM_ID = rsl.ITEM_ID
         AND rsh.SHIPMENT_HEADER_ID = rsl.SHIPMENT_HEADER_ID
         AND pv.VENDOR_ID = pvs.VENDOR_ID
         AND org.ORGANIZATION_ID = rt.ORGANIZATION_ID
         AND msi.ORGANIZATION_ID = rt.ORGANIZATION_ID
         AND pvs.VENDOR_SITE_ID = rt.VENDOR_SITE_ID
         AND pv.VENDOR_ID = rt.VENDOR_ID
         AND rsh.SHIPMENT_HEADER_ID = rt.SHIPMENT_HEADER_ID
         AND rsl.SHIPMENT_HEADER_ID = rt.SHIPMENT_HEADER_ID
         AND rsl.SHIPMENT_LINE_ID = rt.SHIPMENT_LINE_ID
         AND TRUNC(rt.TRANSACTION_DATE) <= TRUNC(p_tilldate)
         AND rsl.TO_ORGANIZATION_ID = p_organization_id
         AND CONCAT(TRIM(rt.SHIPMENT_HEADER_ID),TRIM(rt.SHIPMENT_LINE_ID)) IN (
              SELECT CONCAT(TRIM(rt1.SHIPMENT_HEADER_ID),TRIM(rt1.SHIPMENT_LINE_ID))
              FROM RCV_TRANSACTIONS rt1
              WHERE rt1.TRANSACTION_TYPE = 'RECEIVE'
                   AND rt1.DESTINATION_TYPE_CODE = 'RECEIVING'
                   AND rt1.PO_HEADER_ID IS NOT NULL
                   AND NOT EXISTS(
                   SELECT 1
                        FROM     RCV_TRANSACTIONS rt2
                        WHERE     rt2.SHIPMENT_HEADER_ID = rt1.SHIPMENT_HEADER_ID
                                  AND rt2.SHIPMENT_LINE_ID = rt1.SHIPMENT_LINE_ID
                                  AND rt2.TRANSACTION_TYPE <> 'RECEIVE'
         ))

    In this case, for the selected columns, all data is the same for one of the RMAs that has more than one line, so UNION skips one of the records. However, the shipment line IDs are different for the two records, so adding SHIPMENT_LINE_ID to the select list solves the problem and there is no need for UNION ALL. But UNION ALL should generally be faster than UNION, as it does not require a sort to remove duplicates, so why am I seeing the opposite?
    Kindly suggest
    Regards,
    Sachin
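    As a tiny illustration of the difference described above (UNION removes duplicate rows from the combined result, UNION ALL keeps them, which is also why adding a distinguishing column such as SHIPMENT_LINE_ID to the select list works):
    SELECT 'A' AS col1 FROM dual
    UNION
    SELECT 'A' FROM dual;       -- 1 row: the duplicate is removed (this needs a sort/dedup step)
    SELECT 'A' AS col1 FROM dual
    UNION ALL
    SELECT 'A' FROM dual;       -- 2 rows: duplicates are kept, no dedup needed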

  • Cisco Jabber windows call option and user addition issue

    Hi,
    After uploading the jabber-config.xml (EDI-BDI) on the CUCM, the call option for new user contacts started appearing, but the contacts already existing in the Jabber client still have no call option. Also, when we add new contacts to the Jabber client, they just disappear. Has anyone faced a similar issue before? Below are the details; the jabber-config.xml and an LDAP profile snapshot are attached.
    We are using employeeNumber as attribute in LDAP configuration.
    CUCM - 8.5.1.15900-4
    UCCX - 8.6.3.10000-20
    Jabber for windows - 9.7.0 (Tried with earlier version of jabber as well)

    Hey
    Just right-click on the contacts which do not have the call option and select View Profile, then check whether those contacts were added from Outlook or from AD. If they are from AD, they will have a contact number and as such will have the Call option enabled.
    If they are from Outlook, then delete the users and re-add them, using the View Profile option to verify they are being pulled from AD.
    Also, if I have assumed the issue differently from what it is, please explain the whole scenario.
    Note: In the case of J4W it will not connect to the directory using the LDAP profile info; it automatically verifies and connects to the domain using your login creds.

  • SQL Server 2000 and BPEL configuration issues

    I am attempting to get SQL Server 2000 to work with BPEL PM Server, and have followed a similar set of instructions as provided in a previously posted document regarding the switch from oracle lite to oracle production. I am following the OC4J route. I've seen a previous posting on this, however, I am elaborating a little more on the configuration details here and the difficulties that I am encountering.
    I am using the following software:
    1) SQL Server 2000 (w/ SP3)
    2) SQL Server 2000 JDBC Driver (SP3 latest version)
    3) BPEL PM (GA release)
    Here's what I've done:
    1) Set up the database in SQL Server 2000 (named ORABPEL), then ran the DDL scripts that came with the BPEL installation for SQL Server. There were two scripts, one for domain and the other for server. The command lines to run these scripts:
    sql -Uuser -Ppassword -ddatabase
    -i c:\orabpel\system\database\scripts\domain_sqlserver.ddl
    -o c:\orabpel\system\database\scripts\domain_sqlserver.out
    2) Installed the stored procedures for JTA. This is documented in the JDBC driver help file.
    3) Modified the library paths in application.xml as follows:
    <!-- SQL2K JDBC LIBS -->
    <library path="C:\Program files\Microsoft SQL Server 2000 Driver for JDBC\lib\msbase.jar"/>
    <library path="C:\Program files\Microsoft SQL Server 2000 Driver for JDBC\lib\msutil.jar"/>
    <library path="C:\Program files\Microsoft SQL Server 2000 Driver for JDBC\lib\mssqlserver.jar"/>
    4) modified the datasources in the data-sources.xml:
    - first comment out the oracle lite data-source
    - add datasources for mssql 2000:
    <data-source class="com.evermind.sql.DriverManagerDataSource"
         name="BPELServerDataSource"
         location="loc/BPELServerDataSource"
         xa-location="BPELServerDataSource"
         ejb-location="jdbc/BPELServerDataSource"
         connection-driver="com.microsoft.jdbc.sqlserver.SQLServerDriver"
         url="jdbc:microsoft:sqlserver://127.0.0.1:1433;SelectMethod=cursor;User=<username>;Password=<password>;DatabaseName=ORABPEL">
    </data-source>
    <data-source class="com.evermind.sql.DriverManagerDataSource"
         name="BPELSamplesDataSource"
         location="jdbc/BPELSamplesDataSource"
         xa-location="BPELSamplesDataSource"
         ejb-location="jdbc/BPELSamplesDataSource"
         connection-driver="com.microsoft.jdbc.sqlserver.SQLServerDriver"
         url="jdbc:microsoft:sqlserver://127.0.0.1:1433;SelectMethod=cursor;User=<username>;Password=<password>;DatabaseName=ORABPEL">
    </data-source>
    <data-source class="com.evermind.sql.DriverManagerDataSource"
    name="AdminConsoleDateSource"
    location="jdbc/AdminConsoleDateSource"
    xa-location="AdminConsoleDateSource"
    ejb-location="jdbc/AdminConsoleDateSource"
         connection-driver="com.microsoft.jdbc.sqlserver.SQLServerDriver"
         url="jdbc:microsoft:sqlserver://127.0.0.1:1433;SelectMethod=cursor;User=<username>;Password=<password>;DatabaseName=ORABPEL">
    </data-source>
    After starting the BPEL PM Server, I got the following set of error messages:
    Loading processes for BPEL domain "default" ...
    <2005-06-02 09:36:44,482> <ERROR> <default.collaxa.cube.sensor> <PCException::<init>> Sensors not supported.
    <2005-06-02 09:36:44,482> <ERROR> <default.collaxa.cube.sensor> <PCException::<init>> Sensors are not supported on this database platform.
    <2005-06-02 09:36:44,482> <ERROR> <default.collaxa.cube.sensor> <PCException::<init>> If sensor functionality is required, please switch to a supported platform
    After this I went and changed the class tags to: com.microsoft.jdbcx.sqlserver.SQLServerDataSource
    then restarted the server and got the following:
    <2005-06-02 09:22:52,531> <INFO> <collaxa> <ConnectionFactoryImpl::init> Initialized connection factory jdbc/BPELServerDataSource
    05/06/02 09:23:06 ORABPEL-04077
    Cannot fetch a datasource connection.
    The process domain was unable to establish a connection with the datasource with the connection URL "loc/BPELServerDataSource". The exception reported is: Cannot fetch a datasource connection.
    The process domain was unable to establish a connection with the datasource with the connection URL "loc/BPELServerDataSource". The exception reported is: [Microsoft][SQLServer 2000 Driver for JDBC]Unable to connect. DataSource property serverName must be specified.
    Please check that the machine hosting the datasource is physically connected to the network. Otherwise, check that the datasource connection parameters (user/password) is currently valid.
    Please check that the machine hosting the datasource is physically connected to the network. Otherwise, check that the datasource connection parameters (user/password) is currently valid.

    Hi,
    I just saw your post about configuring SQL Server 2000 with Oracle BPEL. Have you configured it successfully, or are you still encountering problems?
    I am new to Oracle BPEL and want to know if Oracle BPEL can use Microsoft SQL Server 2000 as the repository entirely, so that we don't need Oracle (or the Oracle Lite database).
    I would really appreciate it if you could share information and experience on configuring SQL Server 2000 with Oracle BPEL.
    Thank you so much in advance.
    Leey

  • SQL group by and having

    I have a table with columns business_name, sales, state, zip.
    I would like to know the count of businesses having sales 0-50,
    the count of businesses having sales 50-100, 100-150, and so on.
    Can I write a single query to get the count of businesses for all these ranges, instead of
    writing a separate query for each range? If possible, please tell me how.

    SELECT sum(case
                  when sales between 0 and 50 then 1
                  else 0
               end
              ) sales_0_50
          ,sum(case
                  when sales between 51 and 100 then 1
                  else 0
               end
              ) sales_51_100
          ,sum(case
                  when sales between 101 and 150 then 1
                  else 0
               end
              ) sales_101_150
          ,sum(case
                  when sales > 150 then 1
                  else 0
               end
              ) sales_over_150
    FROM   biz_table;
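    Since the thread title mentions GROUP BY, here is an alternative sketch that derives the sales bucket arithmetically and groups on it; it returns one row per non-empty 50-wide range instead of one wide row (assumes whole-number sales values):
    SELECT  bucket * 50        AS range_start
           ,bucket * 50 + 49   AS range_end
           ,COUNT(*)           AS business_cnt
    FROM   (SELECT FLOOR(sales / 50) AS bucket
            FROM   biz_table)
    GROUP BY bucket
    ORDER BY range_start;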

  • Mailbox not receiving, but Blackberry Can Send and Receive-Additional Issue

    Not receiving mail to my Mac Inbox or the linked Gmail account within the Mail app. Sent mail sends, but stays in the Outbox. The spinning dial next to "Sent" will not stop.
    Furthermore, after several reboots, connection checks, and software updates it still persists.
    My BB is sending and receiving messages fine and is also connected to Gmail.
    Please help

    aem28 said:
    Thanks for the idea.
    I cleared the message from the server (it was long gone) and I blocked the message in Yahoo; I also blocked it in TB. However, I still get the same message from three weeks ago and nothing more, so I am sure it has something to do with a Profile. If I can't smoke it out there, I will clear out the profiles and start from scratch.
    I'd appreciate any further thoughts, as I don't want to lose my Contact List or some of my Messages.

  • Issue with Alias and Union of CTEs

    I have a query that works in SQL Server but I'm having issues with it in PL/SQL.  I keep running into the error saying "invalid identifier" when I try to reference PlantNumber or Plant_No or any other piece of the CTE or subquery.  Is this not possible with Oracle?
    Here's a shortened version of my query:
    WITH RemoveData AS (
       SELECT a.PLANT_NO,a.ALLOC_WHDV_VOL,a.KW_CTR_REDELIVERED_HV, a.MTR_NO, a.MTR_SFX, a.TRNX_ID, a.REC_STATUS_CD,
    MAX(a.ACCT_DT) ACCT_DT
       FROM GasStmt a
       WHERE a.REC_STATUS_CD = 'RR'
       GROUP BY a.PLANT_NO,a.ALLOC_WHDV_VOL,a.KW_CTR_REDELIVERED_HV, a.MTR_NO, a.MTR_SFX, a.TRNX_ID, a.REC_STATUS_CD
       HAVING COUNT(a.REC_STATUS_CD) > 2
       ),
      RemoveData2 AS (
       SELECT plant_no "PlantNumber"
       ,SUM(-a.ALLOC_WHDV_VOL) "PlantStandardGrossWellheadMcf"
       ,SUM(KW_CTR_REDELIVERED_HV) "KeepWholeResidueMMBtu"
       FROM RemoveData a
       GROUP BY plant_no
       ),
      OriginalData AS (
       SELECT a.PLANT_NO "PlantNumber"
       ,SUM(a.ALLOC_WHDV_VOL) "PlantStandardGrossWellheadMcf"
       ,SUM(CASE WHEN a.REC_STATUS_CD = 'RR' THEN -a.KW_CTR_REDELIVERED_HV ELSE a.KW_CTR_REDELIVERED_HV END) "KeepWholeResidueMMBtu"
       FROM GasStmt a
       LEFT OUTER JOIN (SELECT MTR_NO, MTR_SFX, TRNX_ID, REC_STATUS_CD, MAX(ACCT_DT) ACCT_DT
       FROM GasStmt
       WHERE REC_STATUS_CD = 'RR'
       GROUP BY MTR_NO, MTR_SFX, TRNX_ID, REC_STATUS_CD
       HAVING COUNT(TRNX_ID) > 1) b
       ON a.MTR_NO = b.MTR_NO
       AND a.TRNX_ID = b.TRNX_ID
       AND a.Rec_Status_Cd = b.REC_STATUS_CD
       AND a.Acct_Dt = b.ACCT_DT
       WHERE a.ACCT_DT > '1/1/2010'
       AND b.MTR_NO IS NULL
       GROUP BY a.PLANT_NO
       ),
    UnionCTE AS (  
    SELECT *
    FROM RemoveData2
    UNION
    SELECT *
    FROM OriginalData
    )
    SELECT PlantNumber, SUM(PlantStandardGrossWellheadMcf) AS PlantStandardGrossWellheadMcf,SUM(KeepWholeResidueMMBtu) AS KeepWholeResidueMMBtu
    FROM UnionCTE
    GROUP BY PlantNumber
    It's the bottom select from UnionCTE that's causing the issue.  Any tips would be appreciated!

    I can't check it at the moment, but here's some code I forgot to post.  Thanks for your response, I'll let you know if it works for me.
    CREATE TABLE STG.GasStmt
    (PLANT_NO varchar(100),
    ALLOC_WHDV_VOL numeric(29, 5),
    KW_CTR_REDELIVERED_HV numeric(29, 5),
    MTR_NO varchar(100),
    MTR_SFX varchar(100),
    TRNX_ID bigint,
    REC_STATUS_CD varchar(100),
    ACCT_DT DateTime)
    insert into STG.GasStmt
    select '043','0','50','36563','','83062200','OR','12/1/2011' union all
    select '002','0','100','36563','','83062222','OR','12/1/2011' union all
    select '002','0','-.99','36563','','-83062299','RR','12/1/2011' union all
    select '002','0','-.99','36563','','-83062299','RR','2/1/2013' union all
    select '002','0','-.99','36563','','-83062299','RR','4/1/2013' union all
    select '002','0','-.99','36563','','83062299','OR','2/1/2011' union all
    select '002','0','-.99','36563','','-86768195','RR','12/1/2011' union all
    select '002','0','-.99','36563','','-86768195','RR','2/1/2013' union all
    select '002','0','-.99','36563','','-86768195','RR','4/1/2013' union all
    select '002','0','-.99','36563','','86768195','OR','3/1/2011' union all
    select '002','0','-.99','36563','','-90467786','RR','1/1/2012' union all
    select '002','0','-.99','36563','','-90467786','RR','2/1/2013' union all
    select '002','0','-.99','36563','','-90467786','RR','4/1/2013' union all
    select '002','0','-.99','36563','','90467786','OR','4/1/2011' union all
    select '002','0','-.99','36563','','-77671301','RR','2/1/2013' union all
    select '002','0','-.99','36563','','-77671301','RR','4/1/2013' union all
    select '002','0','-.99','36563','','77671301','OR','1/1/2011' union all
    select '002','0','-.99','36563','','-68420423','RR','2/1/2013' union all
    select '002','0','-.99','36563','','68420423','OR','4/1/2013' union all
    select '002','0','-.99','36563','','-188808446','RR','3/1/2013' union all
    select '002','0','-.99','36563','','188808446','OR','1/1/2013' union all
    select '002','1205.15','0','36563','A','138365544','OR','2/1/2012'
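    For what it's worth, the "invalid identifier" error in the original query most likely comes from the double-quoted aliases: in Oracle, an alias written as "PlantNumber" is case-sensitive and must be referenced with exactly the same quoting everywhere it is used, so the final SELECT PlantNumber ... FROM UnionCTE does not match it. A minimal sketch of the usual fix, using unquoted aliases (names taken from the post, CTEs trimmed down for brevity):
    WITH RemoveData2 AS (
       SELECT plant_no                     AS PlantNumber
             ,SUM(-ALLOC_WHDV_VOL)         AS PlantStandardGrossWellheadMcf
             ,SUM(KW_CTR_REDELIVERED_HV)   AS KeepWholeResidueMMBtu
       FROM   GasStmt
       WHERE  REC_STATUS_CD = 'RR'
       GROUP BY plant_no
    )
    SELECT PlantNumber
          ,SUM(PlantStandardGrossWellheadMcf) AS PlantStandardGrossWellheadMcf
          ,SUM(KeepWholeResidueMMBtu)         AS KeepWholeResidueMMBtu
    FROM   RemoveData2
    GROUP BY PlantNumber;
    Alternatively, keep the double quotes but reference the columns as "PlantNumber" etc. (same case, in quotes) in every query that uses them.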

  • SQL query using Group by and Aggregate function

    Hi All,
    I need your help in writing an SQL query to achieve the following.
    Scenario:
    I have a table with 3 columns. There are 3 possible values for col3 - Success, Failure & Error.
    Now I need a query that gives me the summary counts of the distinct values of col3 for each GROUP BY combination of col1 and col2. When there are no rows for a col3 value, it should return a ZERO count.
    Example Data:
    Col1 Col2 Col3
    abc 01 success
    abc 02 success
    abc 01 success
    abc 01 Failure
    abc 01 Error
    abc 02 Failure
    abc 03 Error
    xyz 07 Failure
    Required Output:
    c1 c2 s_cnt F_cnt E_cnt (Heading)
    abc 01 2 1 1
    abc 02 1 1 0
    abc 03 0 0 1
    xyz 07 0 1 0
    s_cnt = Success count; F_cnt = Failure count; E_cnt = Error count
    Please note that the output should have 5 columns with col1, col2, group by (col1,col2)count(success), group by (col1,col2)count(failure), group by (col1,col2)count(error)
    and where ever there are NO ROWS then it should return ZERO.
    Thanks in advance.
    Regards,
    Shiva

    Hi,
    user13015050 wrote:
    Thanks TTT. Unfortunately I cannot use this solution because I have huge data for this.
    T's solution is basically the same as mine. The first 23 lines just simulate your table. Since you actually have a table, you would start with T's line 24:
    SELECT col1 c1, col2 c2, SUM(decode(col3, 'success', 1, 0)) s_cnt, ...
    user13015050 wrote:
    Thanks a lot Frank. It helped me out. I just made some changes to it as below and have no issues.
    SELECT     col1
    ,     col2
    ,     COUNT ( CASE
              WHEN col3 = 'SUCCESS'
              THEN 1
              END
         )          AS s_cnt
    ,     COUNT ( CASE
              WHEN col3 = 'FAILED'
              THEN 1
              END
         )          AS f_cnt
    ,     COUNT ( CASE
              WHEN col3 = 'ERROR'
              THEN 1
              END
         )          AS e_cnt
    FROM     t1
    WHERE c2 in ('PURCHASE','REFUND')
    and c4 between to_date('20091031000000','YYYYMMDDHH24MISS') AND to_date('20100131235959','YYYYMMDDHH24MISS')
    GROUP BY c1, c2
    ORDER BY c1, c2;
    Please let me know if you see any issues in this query.
    It's very hard to read.
    This site normally compresses spaces. Whenever you post formatted text (such as queries or results) on this site, type these 6 characters:
    {code}
    (small letters only, inside curly brackets) before and after each section of formatted text, to preserve spacing.
    Also, post exactly what you're using.  The code above is SELECTing col1 and col2, but there's no mention of either in the GROUP BY clause, so I don't believe it's really what you're using.
    Other than that, I don't see anything wrong or suspicious in the query.

  • Memory leak issue with link server between SQL Server 2012 and Oracle

    Hi,
    We are trying to use the linked server feature with SQL Server 2012 to connect SQL server and Oracle database. We are concerned about the existing memory leak issue.  For more context please refer to the link.
    http://blogs.msdn.com/b/psssql/archive/2009/09/22/if-you-use-linked-server-queries-you-need-to-read-this.aspx
    The above link talks about the issues with SQL Server versions 2005 and 2008; I am not sure if this is still the case in 2012.  I could not find any article that says whether this issue was fixed by Microsoft in a later version.
    We know that the SQL Server process crashes because of the third-party linked server provider which is loaded inside the SQL Server process. If the third-party linked server provider is enabled together with the
    Allow inprocess option, the SQL Server process crashes when this third-party linked server experiences internal problems.
    We wanted to know if this is fixed in SQL Server 2012?

    So is your question more of an informational one, or are you really facing an OOM issue?
    There can be two causes for OOM:
    1. There is a bug in SQL Server which is causing the issue, which might be fixed in 2012.
    2. The linked server provider used to connect to Oracle is not up to date and some patch is missing, or a more recent version should be used.  Did you make sure that you are using the latest version?
    What is the Oracle version you are trying to connect to (9i, 10g, R2...)?

  • PL/SQL 101 : Cursors and SQL Projection

    PL/SQL 101 : Cursors and SQL Projection
    This is not a question; it's a forum article, in response to the number of questions we get regarding a "dynamic number of columns" or "rows to columns".
    There are two integral parts to an SQL Select statement that relate to what data is selected. One is Projection and the other is Selection:-
    Selection is the one that we always recognise and use as it forms the WHERE clause of the select statement, and hence selects which rows of data are queried.
    The other, SQL Projection is the one that is less understood, and the one that this article will help to explain.
    In short, SQL Projection is the collective name for the columns that are Selected and returned from a query.
    So what? Big deal eh? Why do we need to know this?
    The reason for knowing this is that many people are not aware of when SQL projection comes into play when you issue a select statement. So let's take a basic query...
    First create some test data...
    create table proj_test as
      select 1 as id, 1 as rn, 'Fred' as nm from dual union all
      select 1,2,'Bloggs' from dual union all
      select 2,1,'Scott' from dual union all
      select 2,2,'Smith' from dual union all
      select 3,1,'Jim' from dual union all
      select 3,2,'Jones' from dual
    ... and now query that data...
    SQL> select * from proj_test;
             ID         RN NM
             1          1 Fred
             1          2 Bloggs
             2          1 Scott
             2          2 Smith
             3          1 Jim
             3          2 Jones
    6 rows selected.
    OK, so what is that query actually doing?
    To know that we need to consider that all queries are cursors and all cursors are processed in a set manner, roughly speaking...
    1. The cursor is opened
    2. The query is parsed
    3. The query is described to know the projection (what columns are going to be returned, names, datatypes etc.)
    4. Bind variables are bound in
    5. The query is executed to apply the selection and identify the data to be retrieved
    6. A row of data is fetched
    7. The data values from the columns within that row are extracted into the known projection
    8. Step 6 and 7 are repeated until there is no more data or another condition ceases the fetching
    9. The cursor is closed
    The purpose of the projection being determined is so that the internal processing of the cursor can allocate memory etc. ready to fetch the data into. We won't get to see that memory allocation happening easily, but we can see the same query being executed in these steps if we do it programmatically using the dbms_sql package...
    CREATE OR REPLACE PROCEDURE process_cursor (p_query in varchar2) IS
      v_sql       varchar2(32767) := p_query;
      v_cursor    number;            -- A cursor is a handle (numeric identifier) to the query
      col_cnt     integer;
      v_n_val     number;            -- numeric type to fetch data into
      v_v_val     varchar2(20);      -- varchar type to fetch data into
      v_d_val     date;              -- date type to fetch data into
      rec_tab     dbms_sql.desc_tab; -- table structure to hold sql projection info
      dummy       number;
      v_ret       number;            -- number of rows returned
      v_finaltxt  varchar2(100);
      col_num     number;
    BEGIN
      -- 1. Open the cursor
      dbms_output.put_line('1 - Opening Cursor');
      v_cursor := dbms_sql.open_cursor;
      -- 2. Parse the cursor
      dbms_output.put_line('2 - Parsing the query');
      dbms_sql.parse(v_cursor, v_sql, dbms_sql.NATIVE);
      -- 3. Describe the query
      -- Note: The query has been described internally when it was parsed, but we can look at
      --       that description...
      -- Fetch the description into a structure we can read, returning the count of columns that has been projected
      dbms_output.put_line('3 - Describing the query');
      dbms_sql.describe_columns(v_cursor, col_cnt, rec_tab);
      -- Use that description to define local datatypes into which we want to fetch our values
      -- Note: This only defines the types, it doesn't fetch any data and whilst we can also
      --       determine the size of the columns we'll just use some fixed sizes for this example
      dbms_output.put_line(chr(10)||'3a - SQL Projection:-');
      for j in 1..col_cnt
      loop
        v_finaltxt := 'Column Name: '||rpad(upper(rec_tab(j).col_name),30,' ');
        case rec_tab(j).col_type
          -- if the type of column is varchar2, bind that to our varchar2 variable
          when 1 then
            dbms_sql.define_column(v_cursor,j,v_v_val,20);
            v_finaltxt := v_finaltxt||' Datatype: Varchar2';
          -- if the type of the column is number, bind that to our number variable
          when 2 then
            dbms_sql.define_column(v_cursor,j,v_n_val);
            v_finaltxt := v_finaltxt||' Datatype: Number';
          -- if the type of the column is date, bind that to our date variable
          when 12 then
            dbms_sql.define_column(v_cursor,j,v_d_val);
            v_finaltxt := v_finaltxt||' Datatype: Date';
          -- ...Other types can be added as necessary...
        else
          -- All other types we'll assume are varchar2 compatible (implicitly converted)
          dbms_sql.DEFINE_COLUMN(v_cursor,j,v_v_val,2000);
          v_finaltxt := v_finaltxt||' Datatype: Varchar2 (implicit)';
        end case;
        dbms_output.put_line(v_finaltxt);
      end loop;
      -- 4. Bind variables
      dbms_output.put_line(chr(10)||'4 - Binding in values');
      null; -- we have no values to bind in for our test
      -- 5. Execute the query to make it identify the data on the database (Selection)
      -- Note: This doesn't fetch any data, it just identifies what data is required.
      dbms_output.put_line('5 - Executing the query');
      dummy := dbms_sql.execute(v_cursor);
      -- 6.,7.,8. Fetch the rows of data...
      dbms_output.put_line(chr(10)||'6,7 and 8 Fetching Data:-');
      loop
        -- 6. Fetch next row of data
        v_ret := dbms_sql.fetch_rows(v_cursor);
        -- If the fetch returned no row then exit the loop
        exit when v_ret = 0;
        -- 7. Extract the values from the row
        v_finaltxt := null;
        -- loop through each of the Projected columns
        for j in 1..col_cnt
        loop
          case rec_tab(j).col_type
            -- if it's a varchar2 column
            when 1 then
              -- read the value into our varchar2 variable
              dbms_sql.column_value(v_cursor,j,v_v_val);
              v_finaltxt := ltrim(v_finaltxt||','||rpad(v_v_val,20,' '),',');
            -- if it's a number column
            when 2 then
              -- read the value into our number variable
              dbms_sql.column_value(v_cursor,j,v_n_val);
              v_finaltxt := ltrim(v_finaltxt||','||to_char(v_n_val,'fm999999'),',');
            -- if it's a date column
            when 12 then
              -- read the value into our date variable
              dbms_sql.column_value(v_cursor,j,v_d_val);
              v_finaltxt := ltrim(v_finaltxt||','||to_char(v_d_val,'DD/MM/YYYY HH24:MI:SS'),',');
          else
            -- read the value into our varchar2 variable (assumes it can be implicitly converted)
            dbms_sql.column_value(v_cursor,j,v_v_val);
            v_finaltxt := ltrim(v_finaltxt||',"'||rpad(v_v_val,20,' ')||'"',',');
          end case;
        end loop;
        dbms_output.put_line(v_finaltxt);
        -- 8. Loop to fetch next row
      end loop;
      -- 9. Close the cursor
      dbms_output.put_line(chr(10)||'9 - Closing the cursor');
      dbms_sql.close_cursor(v_cursor);
    END;
    SQL> exec process_cursor('select * from proj_test');
    1 - Opening Cursor
    2 - Parsing the query
    3 - Describing the query
    3a - SQL Projection:-
    Column Name: ID                             Datatype: Number
    Column Name: RN                             Datatype: Number
    Column Name: NM                             Datatype: Varchar2
    4 - Binding in values
    5 - Executing the query
    6,7 and 8 Fetching Data:-
    1     ,1     ,Fred
    1     ,2     ,Bloggs
    2     ,1     ,Scott
    2     ,2     ,Smith
    3     ,1     ,Jim
    3     ,2     ,Jones
    1     ,3     ,Freddy
    1     ,4     ,Fud
    9 - Closing the cursor
    PL/SQL procedure successfully completed.
    So, what's really the point in knowing when SQL Projection occurs in a query?
    Well, we get many questions asking "How do I convert rows to columns?" (otherwise known as a pivot) or questions like "How can I get the data back from a dynamic query with different columns?"
    Let's look at a regular pivot. We would normally do something like...
    SQL> select id
      2        ,max(decode(rn,1,nm)) as nm_1
      3        ,max(decode(rn,2,nm)) as nm_2
      4  from proj_test
      5  group by id
      6  /
            ID NM_1   NM_2
             1 Fred   Bloggs
             2 Scott  Smith
             3 Jim    Jones
    (or, in 11g, use the new PIVOT statement)
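    For reference, a minimal 11g PIVOT sketch of the same fixed-column pivot (it still lists the columns explicitly, so it has exactly the same limitation discussed next):
    select *
    from   proj_test
    pivot  (max(nm) for rn in (1 as nm_1, 2 as nm_2))
    order by id;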
    But many questioners then explain that their issue is that they have an unknown number of rows and don't know how many columns the output will have, and they are told that you can't do that in a single SQL statement. e.g.
    SQL> insert into proj_test (id, rn, nm) values (1,3,'Freddy');
    1 row created.
    SQL> select id
      2        ,max(decode(rn,1,nm)) as nm_1
      3        ,max(decode(rn,2,nm)) as nm_2
      4  from proj_test
      5  group by id
      6  /
            ID NM_1   NM_2
             1 Fred   Bloggs
             2 Scott  Smith
             3 Jim    Jones
    ... it's not giving us this 3rd entry as a new column; we can only get that by writing the expected columns into the query, but then what if more columns are added after that, etc.?
    If we look back at the steps of a cursor we see again that the description and projection of what columns are returned by a query happens before any data is fetched back.
    Because of this, it's not possible to have the query return back a number of columns that are based on the data itself, as no data has been fetched at the point the projection is required.
    So, what is the answer to getting an unknown number of columns in the output?
    1) The most obvious answer is, don't use SQL to try and pivot your data. Pivoting of data is more of a reporting requirement and most reporting tools include the ability to pivot data either as part of the initial report generation or on-the-fly at the users request. The main point about using the reporting tools is that they query the data first and then the pivoting is simply a case of manipulating the display of those results, which can be dynamically determined by the reporting tool based on what data there is.
    2) The other answer is to write dynamic SQL. Because you're not going to know the number of columns, this isn't just a simple case of building up a SQL query as a string and passing it to the EXECUTE IMMEDIATE command within PL/SQL, because you won't have a suitable structure to read the results back into, as those structures must have a known number of variables for each of the columns at design time, before the data is known. As such, inside PL/SQL code, you would have to use the DBMS_SQL package, just like in the code above that showed the workings of a cursor, as the columns there are referenced by position rather than name, and you have to deal with each column separately. What you do with each column is up to you... store them in an array/collection, process them as you get them, or whatever. The key thing with doing this is that, just like the reporting tools, you would need to process the data first to determine what your SQL projection is, before you execute the query to fetch the data in the format you want, e.g.
    create or replace procedure dyn_pivot is
      v_sql varchar2(32767);
      -- cursor to find out the maximum number of projected columns required
      -- by looking at the data
      cursor cur_proj_test is
        select distinct rn
        from   proj_test
        order by rn;
    begin
      v_sql := 'select id';
      for i in cur_proj_test
      loop
        -- dynamically add to the projection for the query
        v_sql := v_sql||',max(decode(rn,'||i.rn||',nm)) as nm_'||i.rn;
      end loop;
      v_sql := v_sql||' from proj_test group by id order by id';
      dbms_output.put_line('Dynamic SQL Statement:-'||chr(10)||v_sql||chr(10)||chr(10));
      -- call our DBMS_SQL procedure to process the query with its dynamic projection
      process_cursor(v_sql);
    end;
    SQL> exec dyn_pivot;
    Dynamic SQL Statement:-
    select id,max(decode(rn,1,nm)) as nm_1,max(decode(rn,2,nm)) as nm_2,max(decode(rn,3,nm)) as nm_3 from proj_test group by id order by id
    1 - Opening Cursor
    2 - Parsing the query
    3 - Describing the query
    3a - SQL Projection:-
    Column Name: ID                             Datatype: Number
    Column Name: NM_1                           Datatype: Varchar2
    Column Name: NM_2                           Datatype: Varchar2
    Column Name: NM_3                           Datatype: Varchar2
    4 - Binding in values
    5 - Executing the query
    6,7 and 8 Fetching Data:-
    1     ,Fred                ,Bloggs              ,Freddy
    2     ,Scott               ,Smith               ,
    3     ,Jim                 ,Jones               ,
    9 - Closing the cursor
    PL/SQL procedure successfully completed.
    ... and if more data is added ...
    SQL> insert into proj_test (id, rn, nm) values (1,4,'Fud');
    1 row created.
    SQL> exec dyn_pivot;
    Dynamic SQL Statement:-
    select id,max(decode(rn,1,nm)) as nm_1,max(decode(rn,2,nm)) as nm_2,max(decode(rn,3,nm)) as nm_3,max(decode(rn,4,nm)) as nm_4 from proj_test group by id order by id
    1 - Opening Cursor
    2 - Parsing the query
    3 - Describing the query
    3a - SQL Projection:-
    Column Name: ID                             Datatype: Number
    Column Name: NM_1                           Datatype: Varchar2
    Column Name: NM_2                           Datatype: Varchar2
    Column Name: NM_3                           Datatype: Varchar2
    Column Name: NM_4                           Datatype: Varchar2
    4 - Binding in values
    5 - Executing the query
    6,7 and 8 Fetching Data:-
    1     ,Fred                ,Bloggs              ,Freddy              ,Fud
    2     ,Scott               ,Smith               ,                    ,
    3     ,Jim                 ,Jones               ,                    ,
    9 - Closing the cursor
    PL/SQL procedure successfully completed.
    Of course there are other methods, using dynamically generated scripts etc. (see Re: 4. How do I convert rows to columns?), but the above simply demonstrates that:-
    a) having a dynamic projection requires two passes of the data; one to dynamically generate the query and another to actually query the data,
    b) it is not a good idea in most cases as it requires code to handle the results dynamically rather than being able to simply query directly into a known structure or variables, and
    c) a simple SQL statement cannot have a dynamic projection.
    Most importantly, dynamic queries prevent validation of your queries at the time your code is compiled, so the compiler can't check that the column names are correct, or the table names, or that the actual syntax of the generated query is correct. This only happens at run time, and depending upon the complexity of your dynamic query, some problems may only be experienced under certain conditions. In effect you are writing queries that are harder to validate and could potentially have bugs in them that are not apparent until they get to a run-time environment. Dynamic queries can also introduce the possibility of SQL injection (a potential security risk), especially if a user is supplying a string value into the query from an interface.
    To summarise:-
    The projection of an SQL statement must be known by the SQL engine before any data is fetched, so don't expect SQL to magically create columns on-the-fly based on the data it's retrieving back; and, if you find yourself thinking of using dynamic SQL to get around it, just take a step back and see if what you are trying to achieve may be better done elsewhere, such as in a reporting tool or the user interface.
    Other articles in the PL/SQL 101 series:-
    PL/SQL 101 : Understanding Ref Cursors
    PL/SQL 101 : Exception Handling

    Excellent article. However, there is one thing which is slightly erroneous. You don't need a type to be declared in the database to fetch the data, but you do need to declare a type;
    here is one of my unit test scripts that does just that.
    DECLARE
    PN_CARDAPPL_ID NUMBER;
    v_Return Cci_Standard.ref_cursor;
    type getcardapplattrval_recordtype
    Is record
    (cardappl_id ci_cardapplattrvalue.cardappl_ID%TYPE,
    tag ci_cardapplattrvalue.tag%TYPE,
    value ci_cardapplattrvalue.value%TYPE
    );
    getcardapplattrvalue_record getcardapplattrval_recordtype;
    BEGIN
    PN_CARDAPPL_ID := 1; --value must be supplied
    v_Return := CCI_GETCUSTCARD.GETCARDAPPLATTRVALUE(
    PN_CARDAPPL_ID => PN_CARDAPPL_ID
    );
    loop
    fetch v_return
    into getcardapplattrvalue_record;
    dbms_output.put_line('Cardappl_id=>'||getcardapplattrvalue_record.cardappl_id);
    dbms_output.put_line('Tag =>'||getcardapplattrvalue_record.tag);
    dbms_output.put_line('Value =>'||getcardapplattrvalue_record.value);
    exit when v_Return%NOTFOUND;
    end loop;
    END;

  • Changing SQL Server service and SQL Server Agent service startup account in SQL Server hosting SharePoint DB

    Hi,
    I have a SharePoint deployment with one SQL Server (running on a VM) hosting the config DB and another SQL Server (a physical host, because the VM was running out of space) hosting the large content DBs. I need to schedule automatic backups of the content DBs to a network share. For that I need to run the SQL Server service with an account that has permissions to the share, as suggested in https://support.microsoft.com/kb/207187?wa=wsignin1.0
    I tried changing the log-on-as-a-service account to a domain account which has permissions to the network share, is also in the local Administrators group of the SQL Server, and has the "public" and "sysadmin" roles in SQL Server, but that caused an issue: the SharePoint web application started showing a white screen, so I had to revert to the default accounts, i.e. NT Service\SQLSERVERAGENT and NT Service\MSSQLSERVER. I viewed the event logs. These are the types of errors I got after changing the service account to a domain account:
    1) Information Rights Management (IRM): Retried too many times to initialize IRM client. Cannot retry more. Retried times is:0x5.
    System
      Provider Name:   Microsoft-SharePoint Products-SharePoint Foundation
      Provider Guid:   {6FB7E0CD-52E7-47DD-997A-241563931FC2}
      EventID:         5148
      Version:         15
      Level:           2
      Task:            9
      Opcode:          0
      Keywords:        0x4000000000000000
      TimeCreated:     2015-02-02T04:46:04.750899500Z
      EventRecordID:   176477
      Correlation ActivityID: {8FACE59C-1E17-50D0-7135-25FDB824CDBE}
      Execution:       ProcessID 6912, ThreadID 8872
      Channel:         Application
      Computer:
      Security UserID: S-1-5-21-876248814-3204482948-604612597-111753
    EventData
      hex0: 0x5
    2) Unknown SQL Exception 0 occurred. Additional error information from SQL Server is included below.
    The target principal name is incorrect. Cannot generate SSPI context.
    System
      Provider Name:   Microsoft-SharePoint Products-SharePoint Foundation
      Provider Guid:   {6FB7E0CD-52E7-47DD-997A-241563931FC2}
      EventID:         5586
      Version:         15
      Level:           2
      Task:            3
      Opcode:          0
      Keywords:        0x4000000000000000
      TimeCreated:     2015-02-02T07:01:35.843757700Z
      EventRecordID:   176490
      Correlation ActivityID: {50B4E59C-5E3A-50D0-7135-22AD91909F02}
      Execution:       ProcessID 6912, ThreadID 5452
      Channel:         Application
      Computer:
      Security UserID: S-1-5-17
    EventData
      int0:    0
      string1: The target principal name is incorrect. Cannot generate SSPI context.

    Hi Aparna,
    According to your description, you get the above two errors when scheduling backups of the content DBs. Right?
    Based on those two error messages, they are related to the service principal name (SPN) for the SQL Server service. Please verify that the SPN is registered successfully. You can view it in ADSI Edit or use the command line. Please see:
    http://blogs.msdn.com/b/psssql/archive/2010/03/09/what-spn-do-i-use-and-how-does-it-get-there.aspx
    When SQL Server is installed, the two SPNs below should be registered:
            MSSQLSvc/servername:1433
            MSSQLSvc/servername
    Please check whether those SPNs exist and whether there are duplicates. You can use the setspn command to reset an SPN, or to remove a duplicate SPN and add a new one. See:
    Setspn.
    We have also met this issue when the SPN is registered under the Administrator account. Please try registering it under the computer account instead. You can add it in ADSI Edit.
    If you have any questions, please feel free to ask.
    Simon Hou
    TechNet Community Support

  • Group Policy Guru? Group Policy and Windows 7 erratic and inconsistent.

    (*If you don't feel like reading everything, skip to the bottom two paragraphs for my questions)
    I've had a Premier call open with MS since August. This week I had a Microsoft technician in-house. Though we eliminated some possibilities, we're not really closer to a cause or solution.
    Every time we work with an expert, I get a different explanation to describe the situation we are viewing.
    Quick summary of the issue: We've been using Group Policy to manage most Windows XP and 7 settings for years, but starting in the middle of last year, we began having clients with machines where some or all group policies would fail to apply.
    These could be long-assigned policies, new policies, or changes to policies. It would never affect everyone, or even a majority, at once, and the resolution is never the same. Sometimes a GPUPDATE /FORCE fixed it, sometimes it fixed itself automagically the next day, and sometimes (but very rarely) it took longer.
    Troubleshooting History:
    What we found in early troubleshooting was that these machines had errors in Event Viewer for Netlogon, Time-Sync, and Group Policy. The other issue we noticed was that our GPRESULT /H reports were missing security groups, and the denied section was nothing but SIDs. The first issue pointed me to:
    Event ID 5719 and event ID 1129 may be logged when a non-Microsoft DHCP Relay Agent is used
    I installed these hotfixes. No change to any of the errors in Event Viewer, or to our Group Policy problems.
    Initial work with Premier Support found that Netlogon, Time-Sync, and Group Policy were failing before the network stack had loaded. The suggestion was to apply the group policy setting "Always wait for the network at computer startup and logon". At the time, this seemed not to work. The policy was set on a test bed of laptops and desktops, and no changes in behavior were seen after 3 days.
    Windows 7 Clients intermittently fail to apply group policy at startup
    For some time after this, we were collecting GPSVC and NetTrace logs for Premier Support, trying to document and troubleshoot the problem. Eventually we got fed up and asked our TAM to call in a pro to get this resolved. We were sent an engineer for 3 days. For three days we banged away on this issue. We verified AD and replication health, and we tried numerous fixes and workarounds. I learned 3 different descriptions of how Group Policy works, and in the end we thought we had a workaround using the "Always wait for the network at computer startup and logon" setting because of a single success late in the day. On day 3 we tried replicating this fix, and quickly realized that the same issue that was preventing other GPOs from applying was also preventing our "fix" GPO from applying. So we went the route of using a registry entry. I also had a problem that, even though it was making the process more consistent, it was still taking 3 reboots for a Computer Policy, assigned to a computer object via Security Group, to fully take effect on a computer.
    I used the registry methods in the above article. It didn't work; there was no sign it was having the same effect the GPO had had.
    Our support engineer claimed this was the proper method, but that path wasn't even close in a Windows 7 SP1 registry, and after creating all the keys that were not present, it still didn't work.
    Always wait for the network at computer startup and logon - AzureWeb
    We ran out of time, our engineer returned home.
    I can understand how these errors indicate a problem applying Group Policy at boot. But to me it doesn't explain why it doesn't correct itself post-boot, even after a GPUPDATE /FORCE and a reboot.
    It also doesn't explain why we were working fine for years and then, all of a sudden, DHCP is being outrun by background services. (By the way, logging showed DHCP wasn't significantly delayed; our boot process was actually excellent, health-wise.)
    Why all of a sudden is this not behaving optimally? No changes to network design or function. No changes to the domain since 2008 R2 was installed in 2011.
    Today I'm reading through all these KBs and articles again, and I took some time to read:
    [Forum FAQ] Common steps to start troubleshooting Group Policy application, and its links below.
    We ran through all of that before and during the 3-day onsite. It's not getting us any closer to the cause or a solution.
    I found this link today and began some deep reading in it. It has some additional information I will try to use next week:
    Group Policy Basics - Part 3: How Clients Process GPOs
    The one unanswered question I have is this: how is Group Policy supposed to apply to a computer when that policy is applied to an AD Security Group of which the computer object is a member?
    Before we began having this problem, we would assign a computer GPO, then ask the user to reboot. If it were a user GPO, we'd ask the user to log off, or reboot. Either way, if we allowed a few minutes for AD and FRS replication, the user would log back in with that new policy in effect. A newly imaged machine would boot with all the GPOs linked to that domain and assigned to "Authenticated Users" already in effect. Admin groups would be present in Administrators, proxy settings would be set in Internet Explorer, etc.
    Now I'm asked to believe, by Premier Support and Microsoft engineers, that this was never the case. That those policies require the equivalent of a "GPUPDATE /FORCE" executed by the Local_System account. That 3 reboots may be necessary for a group policy to be applied: one for the AD Security Group to be applied, one for the Computer Policy to be applied, and a final one for the policy in the GPO to be applied to Windows.
    Can someone confirm or correct this information, please? It's imperative to my troubleshooting.
    There's no place like 127.0.0.1

    That key is empty on all of my machines I have checked today.  Working and problematic alike.
    GPRESULT logs, when run as me, historically would show the group policies applied, the policies denied, and the AD group membership, all by name. About 6 months ago I noticed this changed.
    Now they show the applied GPOs by name, a few of the denied GPOs by name and most by SID, and only 2 to 3 AD groups, though PowerShell shows all the AD groups assigned. This happens after several AD security and distribution groups are added to the machine (Radia software distribution uses distribution groups to assign software).
    A check showed no groups with long legacy Kerberos keys.
    We usually encounter this problem when we make a change to AD Security Group membership to assign or deny a Group Policy. It will usually fix itself within 24 hours of the machine being left up and running, but no amount of GPUPDATE /FORCE and rebooting will cause the changes to take effect.
    During this time, the Group Policies will show as assigned to the computer in the GPRESULT log.
    Yesterday I began looking into the Spanning Tree configuration on our network as a possible cause of the boot-up issues. I'm waiting on responses from our Network group to confirm our configuration.
    There's no place like 127.0.0.1

  • The Ultimate Guide to Resolving Profile and Device Manager Issues

    The following article also applies to issues after resetting the server's hostname. It also applies to situations where resetting the Code Signing Certificate as described by Apple has not resolved the issue.
    Hello,
    I have been plagued with Profile Manager and Device Manager issues since day one.
    I would like to share my experience and suggest a way to resolve issues such as a device that cannot be enrolled or a Code Signing Certificate that is not accepted.
    I shall try to be as brief as possible, just giving an overview of the steps that resolved my issues. The individual steps have been described elsewhere in this forum. For users who have purchased commercial SSL certs the following may not apply.
    In my view many of these issues are caused by missing or faulty certificates. So let us first touch on the very complex matter of certificates.
    Certificates come in many flavours such as CA (Certificate Authority), Code Signing Certificate, S/MIME and Server Identification.
    (Mountain?) Lion Server creates a so-called Intermediate CA certificate ("IntermediateCA_hostname_1") and a Server Identification Certificate ("hostname") when it first installs. This is critical for the operation of many server functionalities, including Open Directory. These certs, together with the private/public keys, can be found in your Keychain. Profile and Device Manager may need a Code Signing Certificate.
    The most straightforward way to resolve the Profile Manager issues is, in my view, to reset the server-created certificates.
    The bad news is that this procedure involves quite a few steps and at least 2 hours of your precious time, because it means creating a fresh Directory Master.
    I hope that I have not forgotten to mention an important step. Readers' comments and addenda are welcome.
    I shall outline a sensible strategy:
    1. Clone your dysfunctional server to an external hard drive (SuperDuper does a reliable job).
    2. Start the server from the clone and shut down ALL services.
    3. It may be sensible to set up root user access.
    4. Back up all user data such as address book, calendar and other data that you *may* need to set up your server.
    5. Open Workgroup Manager and export all user and workgroup accounts to the drive that you are using to rebuild your server (it may cause problems if you back up to an external drive).
    6. Just in case, you may also want to back up the Profile Manager database and erase user profiles:
    In Terminal (this applies to Lion Server - paths may be different in Mountain Lion!):
    Backup: sudo pg_dump -U _postgres -c device_management > $HOME/device_management.sql
    Erase database:
    sudo /usr/share/devicemgr/backend/wipeDB.sh
    7. Note your Directory (diradmin) password for later if you want to re-use it.
    8. Open Server Admin and demote the OD Master to a Standalone Directory.
    9. In Terminal delete the old Certificate Authority
    sudo rm -R /var/root/Library/Application\ Support/Certificate\ Authority/
    This step is crucial, because otherwise rebuilding your OD Master will fail.
    9. Go back to Server Admin and promote the Standalone Directory to OD Master. You may want to use the same hostname.
    10. When the OD Master is ready, click on Overview and check that the LDAP and Kerberos Realm reflect your server's hostname.
    11. Go back to Workgroup Manager and re-import users and groups.
    NOTE: passwords are not exported. I do not know how to salvage user passwords. (Maybe passwords can be recovered by re-importing an OD archive - comments welcome!)
    12. Go to Server App and reset passwords and (not to forget) user home folder locations, in particular if you want to log in from a network account!
    If the home directory has not been defined you cannot log in from a network account.
    13. You may now want to restore Profile Manager user profiles in Terminal. Issue the following commands:
    sudo serveradmin stop devicemgr
    sudo serveradmin start postgres
    sudo psql -U _postgres -d device_management -f $HOME/device_management.sql
    sudo serveradmin start devicemgr
    14. You can now switch back on your services, including Profile Manager.
    In Profile Manager you may have to configure Device Management. This creates a correct Code Signing Certificate.
    15. Check the certificate settings in Server App -> Hardware -> Settings -> SSL Certificates.
    16. Check that Apple Push Notifications are set (you can easily check whether they are working later).
    17. You may want to reboot OS X Server from the clone now.
    18. After the reboot, open Server App and check that your server is running well.
    19. Delete all profiles in System Preferences -> Profiles.
    19. Log in to Profile Manager. You should have all users and profiles back. In my experience, devices have to be re-enrolled before profiles can be pushed and/or devices be enrolled. You may just as well delete the displayed devices now.
    20. Grab one of your (portable) Macs that you want to enrol, go to (yourhostname)/mydevices and install the server's trust profile. The profile's name should read "Trust Profile for..." and underneath, in green font, "Verified".
    21. Re-enrol that device. At this stage keep your fingers crossed and take a deep breath.
    22. If the device has been successfully enrolled, you may at last want to test whether pushing profiles really works. Log in to Profile Manager as admin and select the newly enrolled device. Check that Automatic Push is enabled (-> Profile -> General). Create a harmless management profile, such as defining the dock's position on the target machine. (Do not forget to click SAVE at the end - this is easily missed here.) If all is well, Profile Manager will display an active task (sending) and the dock's position on the target will have changed within a few seconds if you are on a LAN. (Note: if sending seems to take forever, check on the server machine and/or on your router that the proper ports are open and that incoming data is not intercepted by Little Snitch or similar software.)
    Note: if you intend to enrol an Apple iPhone you may first need to install the proper Apple Configuration software.
    Now enjoy Profile and Device Manager !
    Regards,
    Twistan

    Hi,
    1. In action profiles: you log on to the system and recheck that the action is available in the action definition as well as in the condition configuration, and that the schedule condition is also maintained, but the action is still not displayed (i.e. in the worklist this action is not getting displayed).
    You can check the schedule condition for the action and match the status values, or try recreating the action with the schedule condition again. For a customer-specific action, copy the standard action under your own Z name, create a schedule condition for it, and check the same.
    2. In the support team of an incident, when I assign an individual processor it throws a warning that you are not the processor, but when I assign an org unit it works perfectly. Could anyone guide me on this?
    You need the Employee role for the BP: go to transaction BP, choose the role Employee from the role dropdown for your BP, and then enter your user ID.
    Also make sure that you have the message processing role.
    I hope this clarifies your doubt and resolves your problem.
    Regards
    Prakhar
