AT NEW/AT END performance versus using temp variables.

What would be more efficient? Imagine you are looping through a table that contains sales information for many sales documents/sales groups/sales offices. I need an AT statement for each. The amount of data in this report is HUGE, so any performance difference would be noticed.
So, which would perform better: some IF statements along with temp variables, or the AT statements?
Regards,
Davis

You can use control break statements.
Make sure you have sorted the table by the fields on which you are going to perform the control breaks.
Those fields should also be at the beginning of the structure, and the break fields' changes should not be controlled by any other fields.
You also need one extra work area, because inside AT...ENDAT the fields to the right of the control break field are masked with '*', and in some cases you will want data from those fields.
LOOP AT itab.
  " Copy the row first: inside AT...ENDAT the fields to the right
  " of the control break field appear as '*' in the loop work area.
  wa = itab.
  AT NEW field.
    " Read the masked fields from wa here instead of itab.
  ENDAT.
ENDLOOP.
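For comparison, here is a minimal sketch of the same break done with an IF and a temp variable (the field name vbeln is a placeholder assumption). Both variants make a single pass over the sorted table, so the runtime difference is usually small; AT NEW mainly buys readability, while the IF approach avoids the '*' masking entirely:

SORT itab BY vbeln.               " sort by the control break field first

DATA lv_prev_vbeln TYPE vbeln.    " temp variable holding the previous key

LOOP AT itab.
  IF itab-vbeln <> lv_prev_vbeln.
    " a new sales document starts here - same trigger point as AT NEW vbeln,
    " but no fields are masked with '*'
    lv_prev_vbeln = itab-vbeln.
  ENDIF.
  " normal row processing here
ENDLOOP.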

Similar Messages

  • Performance when using bind variables

    I'm trying to show myself that bind variables improve performance (I believe it, I just want to see it).
I've created a simple table of 100,000 records, each row a single column of type integer, and populated it with a number between 1 and 100,000.
Now, with a Java program, I delete 2,000 of the records by performing a loop and using the loop counter in my WHERE predicate.
    My first JAVA program runs without using bind variables as follows:
    loop
    stmt.executeUpdate("delete from nobind_test where id = " + i);
    end loop
    My second JAVA program uses bind variables as follows:
    pstmt = conn.prepareStatement("delete from bind_test where id = ?");
loop
pstmt.setString(1, String.valueOf(i)); // or pstmt.setInt(1, i) to match the column type
pstmt.executeUpdate(); // a DELETE is executed with executeUpdate, not executeQuery
end loop;
    Monitoring of v$SQL shows that program one doesn't use bind variables, and program two does use bind variables.
    The trouble is that the program that does not use bind variables runs faster than the bind variable program.
    Can anyone tell me why this would be? Is my test too simple?
    Thanks.

    The point is that you have to find out where your test is spending most of the time.
If you've just populated a table with 100,000 records and then start to randomly delete 2,000 of them, the database has to perform a full table scan for each of the records to be deleted.
So probably most of the time is spent scanning the table over and over again, although most of the blocks might already be in your database buffer cache.
The difference between the hard parse and the soft parse of such a simple statement might be negligible compared to the effort it takes to fulfill each delete execution.
You might want to change the setup of your test: add a primary key constraint to your test table and delete the rows using this primary key as the predicate. Then the time it takes to locate the row to delete should be negligible compared to the hard parse / soft parse difference.
You probably also need to increase your iteration count, because deleting 2,000 records this way takes too little time and introduces measurement issues. Try to delete more rows; then you should be able to spot a significant and consistent difference between the two approaches.
    In order to prevent any performance issues from a potentially degenerated index due to numerous DML activities, you could also just change your test case to query for a particular column of the row corresponding to your predicate rather than deleting it.
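A rough JDBC sketch of that revised test might look like this (connection details, iteration counts, and the assumption that both tables have a primary key on id are mine, not from the original test):

import java.sql.*;

public class BindTest {
    public static void main(String[] args) throws SQLException {
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//localhost:1521/ORCL", "scott", "tiger");
        conn.setAutoCommit(false);

        // Without bind variables: every statement text is unique,
        // so each execution is a hard parse.
        long t0 = System.currentTimeMillis();
        try (Statement stmt = conn.createStatement()) {
            for (int i = 1; i <= 20000; i++) {
                stmt.executeUpdate("delete from nobind_test where id = " + i);
            }
        }
        long noBind = System.currentTimeMillis() - t0;

        // With a bind variable: one parse, many executions.
        t0 = System.currentTimeMillis();
        try (PreparedStatement pstmt =
                conn.prepareStatement("delete from bind_test where id = ?")) {
            for (int i = 1; i <= 20000; i++) {
                pstmt.setInt(1, i);
                pstmt.executeUpdate();
            }
        }
        long bind = System.currentTimeMillis() - t0;

        conn.commit();
        System.out.println("no bind: " + noBind + " ms, bind: " + bind + " ms");
        conn.close();
    }
}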
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/

  • About the performance of using shared variables

    Hi
My system has almost 200 I/O points, and I want the host to communicate with the PLC using OPC (I have created an OPC I/O server).
If I bind shared variables to OPC data items, there must be almost 200 shared variables.
My first question is about the performance of using these shared variables: will they eat up memory and processor time?
Another question is whether a shared variable introduces a time lag (i.e., whether an OPC data item value change is fed back to the program block with zero delay or not); if there is a delay, how many milliseconds is it?

Here is a paper about the performance of shared variables.
    Steve Bird
    Culverson Software - Elegant software that is a pleasure to use.
    Culverson.com
    Blog for (mostly LabVIEW) programmers: Tips And Tricks

  • New Mac Pro performance versus a Dual 1.8 ghz G5

    I just wanted to give some feedback of my new Mac Pro experience and hopefully get some feedback about it.
I recently upgraded from a 1.8 GHz dual-core G5 to a 3.0 GHz dual quad-core Intel machine. "Core" processing (burning from disc and encoding) is substantially faster (from 30 mins to 10 mins when encoding). The biggest disappointment is just the day-to-day operation. It isn't faster. I wanted Photoshop, iTunes and other programs to just jump up when I launch them (they take about as long to load, maybe a little less) and other basic Mac processes to work really quickly, but the speed of the interface is the same as on my PowerMac. It is still paging my hard drive and isn't loading all my data when I do searches (I probably need to leave it running overnight to get the hard drive completely paged). I have had a weird fan issue that makes the fans a LOT louder than my PowerMac (I have read many posts here about that issue). I have also been having weird MIDI issues where my FireWire speakers don't work when I put the computer to sleep and wake it, and I have to plug my headphones in multiple times before they are recognized by the computer.
My initial feeling is that the operating system hasn't been built up around the Mac Pro as fully as it has for the PowerMac. I feel that the PowerMac was more "integrated" with the OS.
I have also heard that the new version of the OS is supposed to make better use of the core processors, making tasks like hard drive searches and program launching faster. My fear is that part of using the core processors depends on software vendors like Adobe writing code to use them.
I have been on the phone with Apple about the speaker issue. I think I'm just going to bring my speakers into an Apple Store and repeat my issue. I also want to see if my headphone port is messed up (because it takes the headphones but seems "loose").
    Thanks for listening to my concerns.

Migration Assistant, we began to see a year ago, was causing most systems to run sluggish, like molasses. So that is #1. There is no way to really see the potential at all with 1GB of RAM; 2GB gets it at least off the starvation diet.
    8GB (4 x 2? 8 x 1?) if you use large files (750MB and above) AND you then use the "enable VM Buffer plug-in" from Adobe.
If you were used to using a 750GB drive, that would also help explain why. If you hate launching... don't - just leave the programs up... or boot off a RAID.
    but be sure to set PS scratch to another drive than boot drive if possible.
    When you stop seeing swap files and pageouts, then you know that you are a step further. Tiger or Leopard can still use available memory for cache or for virtual scratch space to some extent that also can improve performance... once running.
You'll want to reformat the 750GB drive to the native GUID (GPT) partition map from the PPC-era Apple Partition Map (APM) it has.
I don't even keep /Users on the boot drive (it's there, just in case a program thinks it needs to be) but ~/ home is on a 2nd drive or RAID, just to ensure that the boot drive can do its thing and not get slowed down by also having to serve other tasks and programs.
    The new Seagate 7200.11 750/1000GB drive has better multi-user functionality, though the Hitachi seems to be faster overall, except... not all Hitachi models work well on Mac Pro (or didn't as I don't keep up). Which is why I have been using strictly WD Raptor and Caviar (RE/RE2 and SE16) for the last year, they seem to be okay (and some el cheapo MaxLine Pro based on price).
    Today, 2 x 1GB costs a mere $139, while a year ago it would have set someone back $700-800.

CONVERSION OF 1 TABLE TO ACCOMMODATE MORE FIELDS USING TEMP VARIABLE IN STORE PROC

    USE [FacetsXR]
    GO
    IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[frdmrpt].[pr_pha_BiometricsSummary]') AND type in (N'P', N'PC'))
    DROP PROCEDURE [frdmrpt].[pr_pha_BiometricsSummary]
    GO
    CREATE PROCEDURE [frdmrpt].[pr_pha_BiometricsSummary]
    AS
    BEGIN
    DECLARE @partcTot int
    select @partcTot = [frdmrpt].[fn_pha_total_participants](default);
DECLARE @TEMPTAB TABLE (
SortOrder varchar(10),
BIMEASURE VARCHAR(30) NULL,
RC VARCHAR(100) NULL,
N VARCHAR(5) NULL,
PSP VARCHAR(10) NULL,
AVL VARCHAR(10) NULL
)
    -- HEADER ROW
    --INSERT @TEMPTAB VALUES('00','Biometric Measure','Risk Citeria','N','Percent Of Screened Population','Average Value')
    --BMI
    declare @BMI int
    select @BMI = count(*)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.BMI is not null and hra.BMI != 0.0
    declare @BMIavg float
    select @BMIavg = cast(isnull(AVG(BMI),0.0) as decimal (5,1))
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.BMI is not null
    INSERT @TEMPTAB VALUES('100','BMI','.',@BMI,'.',@BMIavg)
    INSERT @TEMPTAB
SELECT SortOrder, RiskCriteria, BS.Risk_Criteria, COUNT(*) Total,
CAST(([frdmrpt].[fn_pha_percent](COUNT(*), @BMI)) AS VARCHAR(100))+'%', '.'
FROM (
    SELECT
    CASE
    WHEN (BMI > 0 AND BMI <= 17.4) THEN '101'
    WHEN BMI BETWEEN 17.5 AND 18.4 THEN '102'
    WHEN BMI BETWEEN 18.5 AND 24.9 THEN '103'
    WHEN BMI BETWEEN 25.0 AND 29.9 THEN '104'
    WHEN BMI >=30.0 THEN '105'
    END AS SortOrder,
    CASE
    WHEN (BMI > 0 AND BMI <= 17.4) THEN 'Very Underweight'
    WHEN BMI BETWEEN 17.5 AND 18.4 THEN 'Underweight'
    WHEN BMI BETWEEN 18.5 AND 24.9 THEN 'Normal'
    WHEN BMI BETWEEN 25.0 AND 29.9 THEN 'Overweight'
    WHEN BMI >=30.0 THEN 'Obese'
    END AS RiskCriteria
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    ) bioSumBI
    INNER JOIN [frdmrpt].[pw_Biometric_Summary] BS
    ON bioSumBI.RiskCriteria = BS.Bimetric_Measure
    AND BS.Bimetric_Category ='BMI'
    WHERE bioSumBI.RiskCriteria is not null
    GROUP BY bioSumBI.SortOrder ,bioSumBI.RiskCriteria,BS.Risk_Criteria,BS.Risk_Higher_Limit
    ORDER BY BS.Risk_Higher_Limit
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='101') = 0
    INSERT @TEMPTAB VALUES('101','Very Underweight','17.4 and below','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='102') = 0
    INSERT @TEMPTAB VALUES('102','Underweight','17.5 to 18.4','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='103') = 0
    INSERT @TEMPTAB VALUES('103','Normal','18.5 to 24.9','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='104') = 0
    INSERT @TEMPTAB VALUES('104','Overweight','25 to 29.9','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='105') = 0
    INSERT @TEMPTAB VALUES('105','Obese','30.0 and above','0','0%','.')
    --BP
    declare @BP INT
    select @BP = count(*)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.Systolic IS NOT NULL AND hra.Diastolic IS NOT NULL
    declare @BPAVG VARCHAR(15)
    select @BPAVG = isnull(CONVERT(VARCHAR(15),AVG(Systolic)) + '/' + CONVERT(VARCHAR(5),AVG(Diastolic)),0)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.Systolic IS NOT NULL AND hra.Diastolic IS NOT NULL
    INSERT @TEMPTAB VALUES('200','Blood Pressure (mmHg)','.',@BP,'.',@BPAVG)
    INSERT @TEMPTAB
    SELECT SortOrder ,RiskCriteria, BS.Risk_Criteria,COUNT(*) Total,
    CAST(([frdmrpt].[fn_pha_percent](COUNT(*), @BP)) AS VARCHAR(100))+'%','.'
FROM (
    SELECT
    CASE
WHEN (Systolic <=119 and Diastolic <=79) THEN '201'
WHEN not (Systolic >= 140 OR Diastolic >=90) and not (Systolic <=119 and Diastolic <=79) THEN '202' -- basically not high and not low
WHEN (Systolic >= 140 OR Diastolic >=90) THEN '203'
    --WHEN hra.Systolic <=119 OR hra.Diastolic <=79 THEN '201'
    --WHEN (hra.Systolic between 120 AND 139) OR (hra.Diastolic between 80 AND 89) THEN '202'
    --WHEN (hra.Systolic between 140 AND 159) OR (hra.Diastolic between 90 AND 99) THEN '203'
    --WHEN hra.Systolic >= 160 OR hra.Diastolic >= 100 THEN '204'
    END AS SortOrder,
    CASE
    WHEN (Systolic <=119 and Diastolic <=79) THEN 'Low Risk'
    WHEN not (Systolic >= 140 OR Diastolic >=90) and not (Systolic <=119 and Diastolic <=79) THEN 'Moderate Risk'
    WHEN (Systolic >= 140 OR Diastolic >=90) THEN 'High Risk'
    --WHEN hra.Systolic <=119 OR hra.Diastolic <=79 THEN 'Low Risk'
    --WHEN (hra.Systolic between 120 AND 139) OR (hra.Diastolic between 80 AND 89) THEN 'Prehypertension'
    --WHEN (hra.Systolic between 140 AND 159) OR (hra.Diastolic between 90 AND 99) THEN 'Stage I hypertension'
    --WHEN hra.Systolic >= 160 OR hra.Diastolic >= 100 THEN 'Stage II hypertension'
    END AS RiskCriteria
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    ) bioSumBP
    INNER JOIN [frdmrpt].[pw_Biometric_Summary] BS
    ON bioSumBP.RiskCriteria = BS.Bimetric_Measure
    AND BS.Bimetric_Category ='Blood Pressure (mmHg)'
    WHERE bioSumBP.RiskCriteria is not null
    GROUP BY bioSumBP.SortOrder ,bioSumBP.RiskCriteria,BS.Risk_Criteria,BS.Risk_Higher_Limit
    --ORDER BY BS.Risk_Higher_Limit desc
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='201') = 0
    INSERT @TEMPTAB VALUES('201','Low Risk','Systolic: 119 and below Diastolic: 79 and below','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='202') = 0
    INSERT @TEMPTAB VALUES('202','Moderate Risk','Systolic: 120 to 139 Diastolic: 80 to 89','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='203') = 0
    INSERT @TEMPTAB VALUES('203','High Risk','Systolic: 140 and above Diastolic: 90 and above','0','0%','.')
    --Cholestrol
    declare @TC int
    select @TC = count(*)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.TotalCholesterol IS NOT NULL
    declare @TCAVG FLOAT
    select @TCAVG = isnull(ROUND(AVG(TotalCholesterol),0),0)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.TotalCholesterol IS NOT NULL
    INSERT @TEMPTAB VALUES('300','Total Cholesterol (mg/dL)','.',@TC,'.',@TCAVG)
    INSERT @TEMPTAB
    SELECT SortOrder ,RiskCriteria, BS.Risk_Criteria,COUNT(*) Total,
    CAST(([frdmrpt].[fn_pha_percent](COUNT(*), @TC)) AS VARCHAR(100))+'%','.'
FROM (
    SELECT
    CASE
    WHEN hra.TotalCholesterol <= 199 THEN '301'
    WHEN hra.TotalCholesterol between 200 and 239 THEN '302'
    WHEN hra.TotalCholesterol >=240 THEN '303'
    END AS SortOrder,
    CASE
    WHEN hra.TotalCholesterol <= 199 THEN 'Low Risk'
    WHEN hra.TotalCholesterol between 200 and 239 THEN 'Moderate Risk'
    WHEN hra.TotalCholesterol >=240 THEN 'High Risk'
    END AS RiskCriteria
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    ) bioSumHDL
    INNER JOIN [frdmrpt].[pw_Biometric_Summary] BS
    ON bioSumHDL.RiskCriteria = BS.Bimetric_Measure
    AND BS.Bimetric_Category ='Total Cholesterol (mg/dL)'
    WHERE bioSumHDL.RiskCriteria is not null
    GROUP BY bioSumHDL.SortOrder ,bioSumHDL.RiskCriteria,BS.Risk_Criteria,BS.Risk_Higher_Limit
    ORDER BY BS.Risk_Higher_Limit
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='301') = 0
    INSERT @TEMPTAB VALUES('301','Low Risk','199 and below','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='302') = 0
    INSERT @TEMPTAB VALUES('302','Moderate Risk','200 to 239','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='303') = 0
    INSERT @TEMPTAB VALUES('303','High Risk','240 and above','0','0%','.')
    --HDL
    declare @HDL int
    select @HDL = count(*)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE HDL IS NOT NULL
    declare @HDLAVG FLOAT
    select @HDLAVG = isnull(ROUND(AVG(HDL),0),0)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE HDL IS NOT NULL
    INSERT @TEMPTAB VALUES('400','HDL (mg/dL)','.',@HDL,'.',@HDLAVG)
    INSERT @TEMPTAB
    SELECT SortOrder ,RiskCriteria, BS.Risk_Criteria,COUNT(*) Total,
    CAST(([frdmrpt].[fn_pha_percent](COUNT(*), @HDL)) AS VARCHAR(100))+'%','.'
FROM (
    SELECT
    CASE
    WHEN hra.HDL >= 60 THEN '401'--'Desirable'
    WHEN hra.HDL between 40 and 59 THEN '402' --'Borderline Risk'
    WHEN hra.HDL <=39 THEN '403' --'Undesirable Risk'
    END AS SortOrder,
    CASE
    WHEN hra.HDL >= 60 THEN 'Low Risk'--'Desirable'
    WHEN hra.HDL between 40 and 59 THEN 'Moderate Risk' --'Borderline Risk'
    WHEN hra.HDL <=39 THEN 'High Risk' --'Undesirable Risk'
    END AS RiskCriteria
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    ) bioSumHDL
    INNER JOIN [frdmrpt].[pw_Biometric_Summary] BS
    ON bioSumHDL.RiskCriteria = BS.Bimetric_Measure
    AND BS.Bimetric_Category ='HDL (mg/dL)'
    WHERE bioSumHDL.RiskCriteria is not null
    GROUP BY bioSumHDL.SortOrder ,bioSumHDL.RiskCriteria,BS.Risk_Criteria,BS.Risk_Higher_Limit
    ORDER BY BS.Risk_Higher_Limit desc
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='401') = 0
    INSERT @TEMPTAB VALUES('401','Low Risk','60 and above','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='402') = 0
    INSERT @TEMPTAB VALUES('402','Moderate Risk','40 to 59','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='403') = 0
    INSERT @TEMPTAB VALUES('403','High Risk','39 and below','0','0%','.')
    --LDL
declare @LDL int
    select @LDL = count(*)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.LDL IS NOT NULL
    declare @LDLAVG FLOAT
    select @LDLAVG = isnull(ROUND(AVG(LDL),0),0)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.LDL IS NOT NULL
    INSERT @TEMPTAB VALUES('500','LDL (mg/dL)','.',@LDL,'.',@LDLAVG)
    INSERT @TEMPTAB
    SELECT SortOrder ,RiskCriteria, BS.Risk_Criteria,COUNT(*) Total,
    CAST(([frdmrpt].[fn_pha_percent](COUNT(*), @LDL)) AS VARCHAR(100))+'%','.'
FROM (
    SELECT
    CASE
    WHEN hra.LDL <= 129 THEN '501'
    WHEN hra.LDL between 130 and 159 THEN '502' --'Borderline Risk'
    WHEN hra.LDL >=160 THEN '503' --'Undesirable Risk'
    END AS SortOrder,
    CASE
    WHEN hra.LDL <= 129 THEN 'Low Risk'--'Desirable'
    WHEN hra.LDL between 130 and 159 THEN 'Moderate Risk' --'Borderline Risk'
    WHEN hra.LDL >=160 THEN 'High Risk' --'Undesirable Risk'
    END AS RiskCriteria
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    ) bioSumHDL
    INNER JOIN [frdmrpt].[pw_Biometric_Summary] BS
    ON bioSumHDL.RiskCriteria = BS.Bimetric_Measure
    AND BS.Bimetric_Category ='LDL (mg/dL)'
    WHERE bioSumHDL.RiskCriteria is not null
    GROUP BY bioSumHDL.SortOrder ,bioSumHDL.RiskCriteria,BS.Risk_Criteria,BS.Risk_Higher_Limit
    ORDER BY BS.Risk_Higher_Limit
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='501') = 0
    INSERT @TEMPTAB VALUES('501','Low Risk','129 and below','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='502') = 0
    INSERT @TEMPTAB VALUES('502','Moderate Risk','130 to 159','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='503') = 0
    INSERT @TEMPTAB VALUES('503','High Risk','160 and above','0','0%','.')
    --Blood Glucose (mg/dL)
    declare @BG int
    select @BG = count(*)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.BloodGlucose IS NOT NULL
    declare @BGAVG FLOAT
    select @BGAVG = ROUND(AVG(BloodGlucose),0)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.BloodGlucose IS NOT NULL
    INSERT @TEMPTAB VALUES('600','Blood Glucose (mg/dL)','.',@BG,'.','NA' )--@BGAVG)
    INSERT @TEMPTAB
SELECT SortOrder ,RiskCriteria, BS.Risk_Criteria, Total,
    CAST(([frdmrpt].[fn_pha_percent](Total, @BG)) AS VARCHAR(100))+'%','.'
FROM (
    SELECT
    CASE
    WHEN (r.RiskLevel='LOW') THEN '601'
    WHEN (r.RiskLevel='MODERATE') THEN '602'
    WHEN (r.RiskLevel='HIGH') THEN '603'
    END AS SortOrder,
    CASE
    WHEN (r.RiskLevel='LOW') THEN 'Low Risk'
    WHEN (r.RiskLevel='MODERATE') THEN 'Moderate Risk'
    WHEN (r.RiskLevel='HIGH') THEN 'High Risk'
    END AS RiskCriteria,
    CASE
    WHEN (r.RiskLevel='LOW') THEN cast(COUNT(*)AS VARCHAR(100))
    WHEN (r.RiskLevel='MODERATE') THEN cast(COUNT(*)AS VARCHAR(100))
    WHEN (r.RiskLevel='HIGH') THEN cast(COUNT(*)AS VARCHAR(100))
    END AS Total
    From [frdmrpt].[wt_rpt_pha_pw_UserHealthRisks] r
    join [frdmrpt].[wt_rpt_pha_pw_Member] m on m.UserID = r.UserID
    join [frdmrpt].[wt_rpt_pha_pw_HRADetail2] d on d.UserID = m.UserID
    where r.RiskStringID = 'BLOODGLUCOSE'
    group by GROUPING sets ( r.RiskLevel, () )
    ) bioSumHDL
    INNER JOIN [frdmrpt].[pw_Biometric_Summary] BS
    ON bioSumHDL.RiskCriteria = BS.Bimetric_Measure
    AND BS.Bimetric_Category ='Blood Glucose (mg/dL)'
    WHERE bioSumHDL.RiskCriteria is not null
    GROUP BY bioSumHDL.SortOrder ,bioSumHDL.Total,bioSumHDL.RiskCriteria,BS.Risk_Criteria,BS.Risk_Higher_Limit
    ORDER BY BS.Risk_Higher_Limit
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='601') = 0
    INSERT @TEMPTAB VALUES('601','Low Risk','Fasting: 70 to 99 Random: 80 to 139','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='602') = 0
    INSERT @TEMPTAB VALUES('602','Moderate Risk','Fasting: 100 to 125 Random: 140 to 199','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='603') = 0
    INSERT @TEMPTAB VALUES('603','High Risk','Fasting: 126 and above Random: 200 and above','0','0%','.')
    --Triglycerides (mg/dL)
    declare @TRG int
    select @TRG = count(*)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.Triglycerides IS NOT NULL
    declare @TRGAVG FLOAT
    select @TRGAVG = isnull(ROUND(AVG(Triglycerides),0),0)
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    WHERE hra.Triglycerides IS NOT NULL
    INSERT @TEMPTAB VALUES('700','Triglycerides (mg/dL)','.',@TRG,'.',@TRGAVG)
    INSERT @TEMPTAB
    SELECT SortOrder ,RiskCriteria, BS.Risk_Criteria,COUNT(*) Total,
    CAST(([frdmrpt].[fn_pha_percent](COUNT(*), @TRG)) AS VARCHAR(100))+'%','.'
FROM (
    SELECT
    CASE
    WHEN hra.Triglycerides <=149 THEN '701'
    WHEN hra.Triglycerides between 150 and 199 THEN '702'
    WHEN hra.Triglycerides >= 200 THEN '703'
    END AS SortOrder,
    CASE
    WHEN hra.Triglycerides <=149 THEN 'Low Risk'
    WHEN hra.Triglycerides between 150 and 199 THEN 'Moderate Risk'
    WHEN hra.Triglycerides >= 200 THEN 'High Risk'
    END AS RiskCriteria
    FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ON hra.UserID = mems.UserID
    ) bioSumHDL
    INNER JOIN [frdmrpt].[pw_Biometric_Summary] BS
    ON bioSumHDL.RiskCriteria = BS.Bimetric_Measure
    AND BS.Bimetric_Category ='Triglycerides (mg/dL)'
    WHERE bioSumHDL.RiskCriteria is not null
    GROUP BY bioSumHDL.SortOrder ,bioSumHDL.RiskCriteria,BS.Risk_Criteria,BS.Risk_Higher_Limit
    ORDER BY BS.Risk_Higher_Limit
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='701') = 0
    INSERT @TEMPTAB VALUES('701','Low Risk','149 and below','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='702') = 0
    INSERT @TEMPTAB VALUES('702','Moderate Risk','150 to 199','0','0%','.')
    IF (Select COUNT(*) from @TEMPTAB where SortOrder ='703') = 0
    INSERT @TEMPTAB VALUES('703','High Risk','200 and above','0','0%','.')
    SELECT BIMEASURE,RC ,N ,PSP ,AVL FROM @TEMPTAB order by SortOrder
    END
    GO
    GRANT EXECUTE ON [frdmrpt].[pr_pha_BiometricsSummary] to FRDM_END_USER_ROLE
    GO

Work done so far (I will appreciate your input and suggestions to complete this). Thanks.
    CREATE PROCEDURE [frdmrpt].[pr_pha_T1T2_BiometricsSummaryTest]
    AS
    BEGIN
    DECLARE @partcTot int
    select @partcTot = [frdmrpt].[fn_pha_total_participants](default);
DECLARE @TEMPTAB TABLE (
SortOrder varchar(10),
BIMEASURE VARCHAR(30) NULL,
RC VARCHAR(100) NULL,
N_T1 VARCHAR(5) NULL,
PSP_T1 VARCHAR(10) NULL,
AVL_T1 VARCHAR(10) NULL,
N_T2 VARCHAR(5) NULL,
PSP_T2 VARCHAR(10) NULL,
AVL_T2 VARCHAR(10) NULL
)
    -- HEADER ROW
    --INSERT @TEMPTAB VALUES('00','Biometric Measure','Risk Citeria','N','Percent Of Screened Population','Average Value')
    --BMI
    --declare @BMI int
    --select @BMI = count(*)
    ---FROM [frdmrpt].[wt_rpt_pha_pw_HRADetail2] hra
    ---INNER JOIN [frdmrpt].[wt_rpt_pha_pw_Member] mems
    ---ON hra.UserID = mems.UserID
    --WHERE hra.BMI is not null and hra.BMI != 0.0
    declare @BMI_T1 int
    select @BMI_T1 = count(*)
    FROM [frdmrpt].wt_pha_T1T2_HRADetail2 hra
    INNER JOIN [frdmrpt].wt_pha_T1T2_Member mems
    ON hra.UserID = mems.UserID and hra.T1T2=mems.T1T2
    WHERE hra.BMI is not null and hra.BMI <> 0.0 and mems.T1T2 = 'T1'
    --SELECT @BMI_T1
    declare @BMI_T2 int
    select @BMI_T2 = count(*)
    FROM [frdmrpt].wt_pha_T1T2_HRADetail2 hra
    INNER JOIN [frdmrpt].wt_pha_T1T2_Member mems
    ON hra.UserID = mems.UserID and hra.T1T2=mems.T1T2
    WHERE hra.BMI is not null and hra.BMI <> 0.0 and mems.T1T2 = 'T2'
    ---SELECT @BMI_T2
    declare @BMIavg_T1 float
    select @BMIavg_T1 = cast(isnull(AVG(BMI),0.0) as decimal (5,1))
    FROM [frdmrpt].wt_pha_T1T2_HRADetail2 hra
    INNER JOIN [frdmrpt].wt_pha_T1T2_Member mems
    ON hra.UserID = mems.UserID
    WHERE hra.BMI is not null --and hra.BMI <> 0.0
    and mems.T1T2 = 'T1'
    --SELECT @BMIavg_T1
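Since the work so far stops after @BMIavg_T1, here is a hedged sketch of the next step, following the same pattern as the original proc (the @BMIavg_T2 declaration and the combined 9-column insert are my suggestion, not the poster's code):

-- compute the T2 average the same way as the T1 average
declare @BMIavg_T2 float
select @BMIavg_T2 = cast(isnull(AVG(BMI),0.0) as decimal (5,1))
FROM [frdmrpt].wt_pha_T1T2_HRADetail2 hra
INNER JOIN [frdmrpt].wt_pha_T1T2_Member mems
ON hra.UserID = mems.UserID and hra.T1T2 = mems.T1T2
WHERE hra.BMI is not null and mems.T1T2 = 'T2'

-- one summary row now carries both populations
INSERT @TEMPTAB VALUES('100','BMI','.',
    @BMI_T1,'.',@BMIavg_T1,
    @BMI_T2,'.',@BMIavg_T2)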

  • Poor performance when using bind variable in report

I have a report that takes 1 second to run if I 'hardcode' a particular value into the WHERE clause of the report. However, if I replace the hardcoded value with a bind variable and set the default value for the bind variable to the (previously) hardcoded value, the report now takes 50 seconds to run instead of 1 second!!
Has anyone else seen this behaviour - any suggestions to work around this will be gratefully received.

    More info
    SELECT patch_no, count(*) frequency
    FROM users_requests
    WHERE patchset IN (SELECT arps2.patchset_name
    FROM aru_bugfix_relationships abr, aru_bugfixes ab, aru_status_codes ac,
    aru_patchsets arps, aru_patchsets arps2
    WHERE arps.patchset_name = '11i.FIN_PF.E'
    AND abr.bugfix_id = ab.bugfix_id
    AND arps.bugfix_id = ab.bugfix_id
    AND abr.relation_type = ac.status_id
    AND arps2.bugfix_id = abr.related_bugfix_id
    AND abr.relation_type IN (601, 602))
    AND included ='Y'
    GROUP BY patch_no
    order by frequency desc, patch_no
Runs in < 1 sec from SQL Navigator and from Portal (if I hardcode the value for fampack).
Takes ~50 secs if I replace it with :fampack and set the default value to 11i.FIN_PF.D.

  • Temp Variable in BPC NW

    Hello
I was trying to find a post on SDN about the use of temp variables in BPC NW 7.0; however, I could not find anything. In all the posts, no one is using temp variables.
Can we declare a temp variable in script logic like we used to do in BPC MS with the # sign?
Or do we always have to define the member in a dimension before we can use it in logic?
    All help will be appreciated
    Thanks in Advance
    Regards
    RS

    Hello,
In BPC NW 7.0 (SP02), I just want to know if we can declare and use a local variable in script logic.
For example, the user enters a year in a data manager package (with a prompt), say year = 2011.
Then I want to execute calculations for this year and the previous year (so 2011 and 2010).
I have already tried to use a variable with # or % (like in the *SELECT instruction) but the result was not OK.
Thanks for your help.
    Best Regards,
    A. Portal.

  • I just bought a new ipad 2, but cannot use it as it only shows the itunes logo and a cable. This means it's telling me to connect to itunes. I've done this and performed sync, still no way out. I have been told by friends to "unlock" it. Please how?

    I just bought a new ipad 2, but cannot use it as it only shows the itunes logo and a cable. This means it's telling me to connect to itunes. I've done this and performed sync, still no way out. I have been told by friends to "unlock" it. Please how?

Have you already activated the iPad? That is what the screen is telling you to do - connect to the computer with the supplied cable and run iTunes while connected - and iTunes should guide you through the setup process. If you have already set up the iPad and you are still getting this screen message, restore your iPad. You must be running iTunes 10.5 on your computer as well.
    Restore iPad
    If you can't restore this way, you will need recovery mode.
    Unable to Restore

  • Does rebuild of indexes uses temp tablespace or system tablespace?

    Does rebuild of indexes uses temp tablespace or system tablespace?
    If so why?

    If you combine the answers from Aman and Burleson, they cover most of the picture.
    When rebuilding an index, you may end up sorting a large amount of information. The sort may spill into the temporary tablespace - if you haven't configured your database and users properly, it is possible that the SYSTEM tablespace may be used for the temporary tablespace.
As the new copy of the index is built, it has to be built in the right place (tablespace), and the space used to build it is marked as a temporary segment while the build takes place. When the build is complete, this temporary segment takes on the name of the original index, and the original index is re-badged as a temporary segment and dropped. (Again, you might see temporary segments in the SYSTEM tablespace if the index was originally in, or was rebuilt into, the SYSTEM tablespace.)
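To illustrate, a hedged sketch of how you could watch both effects (the index name is hypothetical, and the v$ views need appropriate privileges):

-- Session 1: rebuild an index (hypothetical name), optionally into
-- another tablespace:
ALTER INDEX scott.emp_pk REBUILD TABLESPACE users;

-- Session 2: temporary segments appear in the target tablespace
-- while the build runs:
SELECT tablespace_name, segment_type, bytes
FROM dba_segments
WHERE segment_type = 'TEMPORARY';

-- Any sort spill into the temporary tablespace shows up here:
SELECT tablespace, segtype, blocks
FROM v$sort_usage;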
    Regards
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    http://www.jlcomp.demon.co.uk

  • Connecting my gen 4-5 nano to new HP laptop windows 8 using USB 3.0 port computer restarts

Connecting my gen 4-5 nano to my new HP laptop (Windows 8) using a USB 3.0 port, iTunes opens, then an error message appears: "your PC ran into a problem and needs to restart ........." WDF Violation. If I don't disconnect the nano, the computer will keep restarting. After restarting, a window opens saying it wants to send the following files to Microsoft: "C:\Windows\Minidump\010413-20875-01.dmp, C:\Users\Curt\AppData\Local\Temp\WER-69015-0.sysdata.xml, C:\Windows\Memory.DMP". It does this using any of the 3 USB 3.0 ports. Using the USB 2.0 port it works just fine. Thanks for any help, and as info.

    I found the solution!!!  Even ended up educating the tech -- who didn't have a clue. 
I have a brand new 7th gen iPod Nano, a brand new Windows 8 laptop, and iTunes 11. Everything was working all fine and dandy until I unchecked "Enable Disc Usage" in iTunes on my iPod. Then all **** broke loose. I got the blue screen of death with the WDF_violation, and I couldn't even plug my Nano in without it crashing my computer, let alone access iTunes to re-enable Disc Usage. Because the crashes happened immediately after I did that, I just KNEW it had something to do with it. Trouble is, I couldn't fix it.... or so I thought.
I was on the phone with the tech, who was telling me it was a driver issue (impossible, because both the laptop and iPod are brand new), and I was looking around in iTunes while she did.
HERE IS YOUR SOLUTION: Edit --> Preferences --> Devices. Check "Prevent iPods, iPhones and iPads from syncing automatically." Then plug in your iPod, re-enable Disc Usage, and voilà!!!!!! Problem solved.

  • Load/Performance Testing using ECATT

Please provide the process to perform load/performance testing using eCATT ASAP.
Which T-codes are required for load/performance testing using eCATT?
    Thanks in ADVANCE.

    Hello Colleague,
    Here are the steps that you need to do, for performance testing using ST30.
    Use transaction ST30 to invoke Global Performance Analysis ( Widely used for performance tests of certain transactions ).
On the eCATT test tab, key in the following data:
Log ID ( needs to be created ONLY for the first run ),
Performance test ( logically, the entries for the Performance test field are of the format:
LogID_name/PERF_transaction_name/system_name/client/date ),
Name of the test configuration ( you need to create a test configuration for the eCATT to be used in ST30; use the same name for the created test configuration as that of the test script ),
Number of times the test configuration needs to run as the preprocessor to create the required backend data, and number of times it needs to run as the processor ( both fields are typically filled with 5 and 5, or 5 and 10, for performance measurements, but in your case you can enter 1 and 1 or 0 and 1 for your requirements ).
Leave all the check boxes under Programming guidelines and Distributed Statistics Data unchecked ( unless required ). For the data comparison, use the No option for With values.
    Click on the eCATT test only button to start the performance run using ST30.
Now the procedure stated above makes the eCATT test configuration execute as many times as the sum of the preprocessor and processor counts given by the user, AT ONE STRETCH ONLY. But if there is a requirement to have the eCATT execute at an interval, we follow a different approach.
We have a VB script that creates an ECA session, calls SE37, selects the required test package, executes all the required test cases ( eCATTs ) in the test package, and ensures the ECA session is killed at the end of the execution.
We then create a batch file to execute the VB script and call the batch file for our executions.
In your case, please schedule the execution of the batch file every 30 minutes ( or any such duration ) using the simple scheduler functionality provided by Windows.
The only problem with this is that whenever there are system messages / software updates / any new screens, the scheduling is bound to fail, as the called VB script does not handle the new situation. Please also ensure that the user whose password has been given in the scheduler is the user who is logged into the system during the execution period.
So, to summarize: ST30 will only allow you to run the eCATT as many times as required, but only at ONE STRETCH; you need to use the second mechanism to make the eCATT run effectively at a predetermined interval without any user interaction.
FYI: a new feature to handle the scheduling of executions is being developed; I will post the details and usage steps when it is available. We also have new PERF / ENDPERF commands in eCATT ( a new development ); kindly go through the documentation for the new eCATT developments.
    Thanks and best regards,
    Sachin

  • Original Mac Pro vs. New High End iMac

    I need some advice on whether to upgrade my system, based on the new iMacs just released. Here is what I currently own:
    Mac Pro with two Intel 2.66 dual-core processors
    (Early 2007 edition)
    16x SuperDrive
    6 GB of RAM
    1 TB HD
    ATI Radeon X1900 XT (512 MB)
    23" Apple Cinema Display
    I'm thinking of selling all of that and buying the new high-end 27" iMac...
    2.66 GHz Intel Core i5
    4 GB of memory (I'd upgrade it to 8 GB)
    2 TB Hard Drive
    ATI Radeon HD 4850 graphics card
    27" display (included)
    I've done the research and I would roughly break even. Cost is not a concern. The following factors are under consideration...
    What is the performance difference between the 4 cores I have in the 2.66 Xeon vs the 4 cores in the Core i5?
    My X1900 graphics card is not supported under Snow Leopard's OpenCL. The new HD 4850 would be.
    I'd like to have the larger display (27" vs. 23") and the built-in video camera for iChat video conferencing.
    I am not a professional user. I consider myself a high-end consumer user, primarily doing HD video editing with iMovie, some video encoding with Handbrake, and "prosumer" work with RAW files in Aperture. The rest is basic use- such as email, internet, etc.
    I would love to have some objective advice on this decision!
    Thanks....

    If performance of the current system is acceptable, I would keep it and use it as long as possible. You have the capability to do things like replace the display (with a new one of your choosing), upgrade the video card, and add additional internal SATA drives. Having additional internal storage at SATA speed, and using separate drives to store OS/apps and user data, will increase your system's productivity.
    When performance of your Mac Pro is no longer acceptable, get a new Mac Pro.

  • We purchased a new iPad2 and registered it using a 'new' iCloud email/ID. We are unable to send email from the iPad and iPhone. The error is: Cannot send mail. The user name or password for iCloud is incorrect.

We purchased a new iPad 2 and registered it using a 'new' iCloud email/ID. We are unable to send email from the iPad and iPhone. The error is: Cannot send mail. The user name or password for iCloud is incorrect.

About 20 hours later, this ended up solving itself. We can send email using the '.icloud' email from both the iPad and iPhone. My advice would be to 'wait' before you start seeking alternatives like Yahoo, Hotmail, etc. This definitely is a convenient way to keep all your 'cloud' information in a centralized place, including the common email...

Cannot add a new reminder. I am using an iPhone 5, version 6.0.2. When I press the Reminders app on the phone there is no "+" sign to input a reminder. When I use Siri it says "sorry, I wasn't able to create reminder".

Cannot add a new reminder. I am using an iPhone 5, version 6.0.2. When I press the Reminders app on the phone there is no "+" sign to input a reminder. When I use Siri it says "sorry, I wasn't able to create reminder".

    Many thanks.
    With those symptoms, I'd try the following document:
    Apple software on Windows: May see performance issues and blank iTunes Store
    (If there's a SpeedBit LSP showing up in Autoruns, it's usually best to just uninstall your SpeedBit Video Accelerator.)

  • New front end editor problem

I'm quite new to this forum, so don't be angry if I'm asking about something that someone else has already asked, but I didn't find an answer to my question.
So...
In my company, the administrator made some updates in 4.7; after those updates the new front-end editor is available (the one with code coloring, code hinting, code folding, etc.).
But I have a problem with the keyboard mapping scheme. When I try to save my own settings or change the keymapping to another one, SAP stops responding (it's not a crash with a short dump but a simple application freeze). I tried to find a solution to this behaviour but haven't found one yet.
My questions are:
1) Has anyone else seen the same behaviour?
2) Where can I find SAP Notes about the front-end editor?
    Thx in advance for any help

Keymapping settings are in the file:
    C:\Documents and Settings\<username>\ab4_data\keymap.xml
    You should also check the following dir:
    c:\Program Files\SAP\FrontEnd\SapGui\ab4_data\
During patch installation this dir is overwritten, and during the first use of the new editor the files from here are copied to the <username>\ab4_data dir, where you can change them.
It might be a permission problem; I can imagine that if you have no authorization to change these files, the GUI starts hanging or crashing.
    Peter
