Suppressing two sets of data

I am trying to suppress two different things.  I use the section expert and originally used
if {@90th Percentile returned time} = "0" then True
It worked perfectly, but now I want to add
if next({incident.incidentnumber}) = {incident.incidentnumber} then true
What I put was
if {@90th Percentile returned time} = "0" then True
and
if next({incident.incidentnumber}) = {incident.incidentnumber} then true
What I am trying to do is hide or remove some of my erroneous duplicates and I need to suppress anything with a 0.
I appreciate any help!

>
Tim Knudsen wrote:
> if {@90th Percentile returned time} = "0" then True
> and
> if next({incident.incidentnumber}) = {incident.incidentnumber} then true
>
> What I am trying to do is hide or remove some of my erroneous duplicates and I need to suppress anything with a 0.
>
Tim, from your description I'm not quite sure if you are trying to hide records if both are true or to hide them if either is true.
If you want to hide records where both are true (90th=0 AND incident # = incident # for each record) try:
if {@90th Percentile returned time} = "0" AND next({incident.incidentnumber}) = {incident.incidentnumber} then true
If you want to hide records where either is true (90th=0 OR incident # = incident #) try:
if {@90th Percentile returned time} = "0"  OR next({incident.incidentnumber}) = {incident.incidentnumber} then true
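For reference, the "if ... then true" wrapper is optional in a Crystal suppression formula; the boolean expression on its own behaves the same way, so the "either" case could also be written simply as:
{@90th Percentile returned time} = "0" or next({incident.incidentnumber}) = {incident.incidentnumber}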

Similar Messages

  • How can I plot two sets of data points in 3d graph?

    I am trying to plot two sets of data points in 3d graph. I also need to make a set of data with one color and the other set with a different color. For example, I need to plot 10 (x,y,z) data points in 3d. Maybe 6 points need to be blue, and the other 4 red. Could you tell me how?

    What is your LabVIEW version? (things are a bit different in 8.6).
    Basically, you create multiple instances of the VI that gets placed automatically when you drop the 3D graph and then wire a different number to the plot ID at the bottom.
    You define all the plots (color, style) via the properties dialog.
    Have a look at my example posted here. One plot is "blue dots" and one is a "shaded surface". See how far you get.
    Message Edited by altenbach on 04-23-2009 01:11 PM
    LabVIEW Champion. Do more with less code and in less time.

  • Numbers Charts:  Stacked bar chart with two sets of data

    Hi....newbie with first post
    I'm trying to create a stacked bar chart with two lots of comparative data.
    So, there is Basic + Overtime for April 2010 as one stack (i.e. the basic is one colour, with the overtime stacked on top in a different colour), and Basic + Overtime for April 2011 next to it.
    I can get the April figures next to one another as four separate columns (2 x basic, 2 x overtime), or I can get all four figures in a single stack, but I can't get two columns (one for each year) each made up of basic and overtime.
    Any ideas?

    UK,
    What you are trying to do will require that you interleave the two sets of two-series data into one longer two-set series, and then plot just the one pair of series. Clear as mud?
    What I'm saying is that you have to fake out the program into thinking that you have just two sets, not four.
    You will have one series that has, for instance, Apr 2010 Basic, Apr 2011 Basic, May 2010 Basic, May 2011 Basic, etc. The other series in that chart will have Apr 2010 Overtime, Apr 2011 Overtime, May 2010 Overtime, May 2011 Overtime, etc.
    The bad thing here (I'll anticipate your first complaint) is that 2010 and 2011 data will have the same color for Basic and another color will be assigned for both 2010 Overtime and 2011 Overtime.
    To get separate color sets for 2010 and 2011 data pairs, you will have to create two charts and overlay them. Both charts would be built in the alternating year format described above, but in one chart all the 2010 data would be blank and in the other chart all the 2011 data would be blank, leaving spaces in the chart for the data from the other chart to nest in.
    Jerry

  • Can I import two sets of data to Essbase at the same time?

    For example, if I use two MaxL scripts to import data into one Essbase application database at the same time, is there any impact? Will one of the loads fail, or will nothing happen except that they run more slowly?

    Hi,
    It is possible to run 2 data loads at the same time; there will probably be a performance impact compared with just loading one file at a time, but I am not sure how big it is, so you will have to test.
    If you are talking about a BSO cube you could also have a look at the Essbase configuration setting DLTHREADSWRITE to see if you can improve the data load time by increasing the number of writer threads.
    If it is an ASO cube then you should be able to do multiple data loads into different buffers, though you have to commit them to the database at the same time.
    Cheers
    John
    http://john-goodwin.blogspot.com/
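    For reference, a minimal sketch of both suggestions above (the application/database names, file path, buffer id and thread count are placeholders, and the exact syntax should be checked against your Essbase version):
    essbase.cfg (BSO writer threads):
    DLTHREADSWRITE Sample Basic 8
    MaxL (ASO load buffers): load each file into its own buffer, then commit:
    alter database ASOsamp.Sample initialize load_buffer with buffer_id 1;
    import database ASOsamp.Sample data from data_file '/data/file1.txt' to load_buffer with buffer_id 1 on error abort;
    import database ASOsamp.Sample data from load_buffer with buffer_id 1;
    The essbase.cfg change takes effect after Essbase is restarted.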

  • Two sets of Data within one Chart

    Hi All,
    I am creating a report that is strictly all charts.  It's coming along well, but I've hit a roadblock.
    I have a database of 1200 clients divided among 40 sales branches. Each report has graphs that show the sales performance of each sales branch, and in a few charts I'd like to compare certain branch performance data to the overall performance for all 40 branches. I'm currently using parameters and "F5" so each report covers only the selected sales branch.
    For example: Bar chart #1 has a breakdown of our client profile based on gender, for Branch #25. So I have a chart with two bars, Males=75%, Females=25%. Great. Now I want to insert values that apply to all 40 branches into the same chart, to see how Branch #25 compares to the overall client gender profile for all 40 branches. Ideally, I'd like to see two bars for males, one for Branch #25 and one for Overall, and likewise two bars for females.
    How do I do that?
    Thank you very much!!
    Frank

    Thanks Brian!  You are correct, that is a disappointment... 
    Does anyone know of a workaround?  For example, by creating a subreport of the overall, and importing certain numbers into the main report, and using that to chart the values?? 
    In the example I've set up, let's say I created another report that summarizes the client profile based on gender for all 40 branches so I have the percentages, then import those percentages into the main report. I know you can import summations from another subreport (stored variables) and use those stored variables for calculations in the main report. Can we do the same for charting?
    Anyone?
    Thank You!!
    Frank

  • Date bug in import process: 2 sets of dates from same-day shots

    I took some photos the other day using my Canon 400D. Some were taken around 1:00 pm, and another series of them were taken around 4:00 pm. I then went to import them into Lightroom from a CompactFlash card, using a cardreader, by using the organize by date option with the format 2005\2005-12-17. When Lightroom's import process gathered the file data for the images, I discovered that there were two sets of dates, viz. 2007\2007-02-24 and 2007\2007-02-25. Considering that I was doing the import on 2007-02-24, the second date was quite in error.
    I checked the EXIF data and the files seemed OK. The timestamps on the files on the card were also as expected.
    I then manually copied the image files to a temporary folder on my PC. When I pointed Lightroom's import function to that folder, lo and behold, the files were picked up and the correct single date for the target folder was set up for me.
    Has anyone else come across this problem? Any suggestions how to work around it from WITHIN Lightroom if it occurs again?
    Witold

    Are you by chance using Vista? That would be a "Known issue", the workaround being copying the images to the hard drive and importing from there instead of importing directly from the card/camera.
    Alexander.
    Canon EOS 400D (aka. XTi) • 20" iMac Intel • 12" PowerBook G4 • OS X 10.4 • LR 1 • PSE 4

  • HT1660 How can I maintain two separate itunes accounts on one single device without erasing either set of data?

    How can I maintain two separate itunes accounts on one single device (iPhone 4) without erasing either set of data when syncing with itunes?

    Hi,
    Depends on what you are trying to set up, but for purchases you can scroll down to the very end of the iTunes/App Store pages to log out of your current account and then log in to a different one. Your purchases will then be billed to this new account. Everything will sync as it used to, but in order to use the purchased content in iTunes or on other devices you will need to authorize them (you can have up to 5 devices authorized for an account).

  • Numbers: two sets of numeric data in one cell. Delete, clear all deletes one set only. Cell is not "locked"  Ideas?

    Numbers: two sets of numeric data in one cell. Delete, clear all deletes one set only. Cell is not "locked"  Ideas?

    You will really need to explain in a little more detail what you mean by "two sets of numeric data in one cell".
    Also, is this Numbers for OS X? You are in the iOS forum.

  • Problem while having a large set of data to work on!

    Hi,
    I am facing a big problem processing a large set of data. I have a requirement in which I'm supposed to generate a report.
    I have a table and an MView, which I have joined to reduce the number of records to process. The MView holds 20,000,000 records while the table holds 1,800,000. Based on the join conditions and WHERE clause I'm able to narrow the useful data down to approximately 450,000 rows, and I'm getting 8 of my report columns from this join. I'm dumping these records into the table from which I'll be generating the report by spooling.
    Below is the block, which takes 12 minutes to insert into the report table MY_ACCOUNT_PHOTON_DUMP:
    begin
    dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
    insert into MY_ACCOUNT_PHOTON_DUMP --- Report table
    (SUBSCR_NO, ACCOUNT_NO, AREA_CODE, DEL_NO, CIRCLE, REGISTRATION_DT, EMAIL_ID, ALT_CNTCT_NO)
    select crm.SUBSCR_NO, crm.ACCOUNT_NO, crm.AREA_CODE, crm.DEL_NO, crm.CIRCLE_ID,
    aa.CREATED_DATE, aa.EMAIL_ID, aa.ALTERNATE_CONTACT
    from MV_CRM_SUBS_DTLS crm, --- MView
    (select /*+ ALL_ROWS */ A.ALTERNATE_CONTACT, A.CREATED_DATE, A.EMAIL_ID, B.SUBSCR_NO
    from MCCI_PROFILE_DTLS a, MCCI_PROFILE_SUBSCR_DTLS b
    where A.PROFILE_ID = B.PROFILE_ID
    and B.ACE_STATUS = 'N'
    ) aa --- Join of two tables giving me 1,800,000 recs
    where crm.SUBSCR_NO = aa.SUBSCR_NO
    and crm.SRVC_TYPE_ID = '125'
    and crm.END_DT IS NULL;
    INTERNET_METER_TABLE_PROC_1('MCCIPRD','MY_ACCOUNT_PHOTON_DUMP'); --- calling procedure to analyze the report table
    COMMIT;
    dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
    end; --- 12 min 04 sec
    For the rest of the 13 columns required I am running a block which has a FOR UPDATE cursor on the report table:
    declare
    cursor cur is
    select SUBSCR_NO, ACCOUNT_NO, AREA_CODE, DEL_NO,
    CIRCLE, REGISTRATION_DT, EMAIL_ID, ALT_CNTCT_NO
    from MCCIPRD.MY_ACCOUNT_PHOTON_DUMP --where ACCOUNT_NO = 901237064
    for update of
    MRKT_SEGMNT, AON, ONLINE_PAY, PAID_AMNT, E_BILL, ECS, BILLED_AMNT,
    SRVC_TAX, BILL_PLAN, USAGE_IN_MB, USAGE_IN_MIN, NO_OF_LOGIN, PHOTON_TYPE;
    v_aon VARCHAR2(10) := NULL;
    v_online_pay VARCHAR2(10) := NULL;
    v_ebill VARCHAR2(10) := NULL;
    v_mkt_sgmnt VARCHAR2(50) := NULL;
    v_phtn_type VARCHAR2(50) := NULL;
    v_login NUMBER(10) := 0;
    v_paid_amnt VARCHAR2(50) := NULL;
    v_ecs VARCHAR2(10) := NULL;
    v_bill_plan VARCHAR2(100):= NULL;
    v_billed_amnt VARCHAR2(10) := NULL;
    v_srvc_tx_amnt VARCHAR2(10) := NULL;
    v_usg_mb NUMBER(10) := NULL;
    v_usg_min NUMBER(10) := NULL;
    begin
    dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
    for rec in cur loop
    begin
    select apps.TTL_GET_DEL_AON@MCCI_TO_PRD591(rec.ACCOUNT_NO, rec.DEL_NO, rec.CIRCLE)
    into v_aon from dual;
    exception
    when others then
    v_aon := 'NA';
    end;
    SELECT DECODE(COUNT(*),0,'NO','YES') into v_online_pay
    FROM TTL_DESCRIPTIONS@MCCI_TO_PRD591
    WHERE DESCRIPTION_CODE IN(SELECT DESCRIPTION_CODE FROM TTL_BMF_TRANS_DESCR@MCCI_TO_PRD591
    WHERE BMF_TRANS_TYPE
    IN (SELECT BMF_TRANS_TYPE FROM
    TTL_BMF@MCCI_TO_PRD591 WHERE ACCOUNT_NO = rec.ACCOUNT_NO
    AND POST_DATE BETWEEN
    TO_DATE('01-'||TO_CHAR(SYSDATE,'MM-YYYY'),'DD-MM-YYYY') AND SYSDATE
    AND DESCRIPTION_TEXT IN (select DESCRIPTION from fnd_lookup_values@MCCI_TO_PRD591 where
    LOOKUP_TYPE='TTL_ONLINE_PAYMENT')));
    SELECT decode(count( *),0,'NO','YES') into v_ebill
    FROM TTL_CUST_ADD_DTLS@MCCI_TO_PRD591
    WHERE CUST_ACCT_NBR = rec.ACCOUNT_NO
    AND UPPER(CUSTOMER_PREF_MODE) ='EMAIL';
    begin
    select ACC_SUB_CAT_DESC into v_mkt_sgmnt
    from ttl_cust_dtls@MCCI_TO_PRD591 a, TTL_ACCOUNT_CATEGORIES@MCCI_TO_PRD591 b
    where a.CUST_ACCT_NBR = rec.ACCOUNT_NO
    and a.market_code = b.ACC_SUB_CAT;
    exception
    when others then
    v_mkt_sgmnt := 'NA';
    end;
    begin
    select nvl(sum(TRANS_AMOUNT),0) into v_paid_amnt
    from ttl_bmf@MCCI_TO_PRD591
    where account_no = rec.ACCOUNT_NO
    AND POST_DATE
    BETWEEN TO_DATE('01-'||TO_CHAR(SYSDATE,'MM-YYYY'),'DD-MM-YYYY')
    AND SYSDATE;
    exception
    when others then
    v_paid_amnt := 'NA';
    end;
    SELECT decode(count(1),0,'NO','YES') into v_ecs
    from ts.Billdesk_Registration_MV@MCCI_TO_PRD591 where ACCOUNT_NO = rec.ACCOUNT_NO
    and UPPER(REGISTRATION_TYPE ) = 'ECS';
    SELECT decode(COUNT(*),0,'PHOTON WHIZ','PHOTON PLUS') into v_phtn_type
    FROM ts.ttl_cust_ord_prdt_dtls@MCCI_TO_PRD591 A, ttl_product_mstr@MCCI_TO_PRD591 b
    WHERE A.SUBSCRIBER_NBR = rec.SUBSCR_NO
    and (A.prdt_disconnection_date IS NULL OR A.prdt_disconnection_date > SYSDATE )
    AND A.prdt_disc_flag = 'N'
    AND A.prdt_nbr = b.product_number
    AND A.prdt_type_id = b.prouduct_type_id
    AND b.first_level LIKE 'Feature%'
    AND UPPER (b.product_desc) LIKE '%HSIA%';
    SELECT count(1) into v_login
    FROM MCCIPRD.MYACCOUNT_SESSION_INFO a
    WHERE (A.DEL_NO = rec.DEL_NO or A.DEL_NO = ltrim(rec.AREA_CODE,'0')||rec.DEL_NO)
    AND to_char(A.LOGIN_TIME,'Mon-YYYY') = to_char(sysdate-5,'Mon-YYYY');
    begin
    select PACKAGE_NAME, BILLED_AMOUNT, SERVICE_TAX_AMOUNT, USAGE_IN_MB, USAGE_IN_MIN
    into v_bill_plan, v_billed_amnt, v_srvc_tx_amnt, v_usg_mb, v_usg_min from
    (select rank() over(order by STATEMENT_DATE desc) rk,
    PACKAGE_NAME, USAGE_IN_MB, USAGE_IN_MIN,
    nvl(BILLED_AMOUNT,'0') BILLED_AMOUNT, NVL(SRVC_TAX_AMNT,'0') SERVICE_TAX_AMOUNT
    from MCCIPRD.MCCI_IM_BILLED_DATA
    where (DEL_NUM = rec.DEL_NO or DEL_NUM = ltrim(rec.AREA_CODE,'0')||rec.DEL_NO)
    and STATEMENT_DATE like '%'||to_char(SYSDATE,'Mon-YY')||'%')
    where rk = 1;
    exception
    when others then
    v_bill_plan := 'NA';
    v_billed_amnt := '0';
    v_srvc_tx_amnt := '0';
    v_usg_mb := 0;
    v_usg_min := 0;
    end;
    -- UPDATE THE DUMP TABLE --
    update MCCIPRD.MY_ACCOUNT_PHOTON_DUMP
    set MRKT_SEGMNT = v_mkt_sgmnt, AON = v_aon, ONLINE_PAY = v_online_pay, PAID_AMNT = v_paid_amnt,
    E_BILL = v_ebill, ECS = v_ecs, BILLED_AMNT = v_billed_amnt, SRVC_TAX = v_srvc_tx_amnt,
    BILL_PLAN = v_bill_plan, USAGE_IN_MB = v_usg_mb, USAGE_IN_MIN = v_usg_min, NO_OF_LOGIN = v_login,
    PHOTON_TYPE = v_phtn_type
    where current of cur;
    end loop;
    COMMIT;
    dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
    exception when others then
    dbms_output.put_line(SQLCODE||'::'||SQLERRM);
    end;
    The report takes more than 6 hours. I know that most of the SELECT queries have ACCOUNT_NO in the WHERE clause and can be joined, but when I joined a few of these blocks with the initial INSERT query it was no better.
    The individual queries within the cursor loop don't take more than 0.3 sec to execute.
    I'm using the FOR UPDATE as I know that the report table is being used solely for this purpose.
    Can somebody please help me with this? I'm in desperate need of good advice here.
    Thanks!!
    Edited by: user11089213 on Aug 30, 2011 12:01 AM

    Hi,
    Below is the explain plan for the original query:
    select /*+ ALL_ROWS */  crm.SUBSCR_NO, crm.ACCOUNT_NO, ltrim(crm.AREA_CODE,'0'), crm.DEL_NO, crm.CIRCLE_ID
    from MV_CRM_SUBS_DTLS crm,
            (select /*+ ALL_ROWS */  A.ALTERNATE_CONTACT, A.CREATED_DATE, A.EMAIL_ID, B.SUBSCR_NO
            from MCCIPRD.MCCI_PROFILE_DTLS a, MCCIPRD.MCCI_PROFILE_SUBSCR_DTLS b
            where A.PROFILE_ID = B.PROFILE_ID
            and   B.ACE_STATUS = 'N'
            ) aa
    where crm.SUBSCR_NO    = aa.SUBSCR_NO
    and   crm.SRVC_TYPE_ID = '125'
    and   crm.END_DT IS NULL
    | Id  | Operation              | Name                     | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT       |                          |  1481K|   100M|       |   245K  (5)| 00:49:09 |
    |*  1 |  HASH JOIN             |                          |  1481K|   100M|    46M|   245K  (5)| 00:49:09 |
    |*  2 |   HASH JOIN            |                          |  1480K|    29M|    38M| 13884   (9)| 00:02:47 |
    |*  3 |    TABLE ACCESS FULL   | MCCI_PROFILE_SUBSCR_DTLS |  1480K|    21M|       |  3383  (13)| 00:00:41 |
    |   4 |    INDEX FAST FULL SCAN| SYS_C002680              |  2513K|    14M|       |  6024   (5)| 00:01:13 |
    |*  5 |   MAT_VIEW ACCESS FULL | MV_CRM_SUBS_DTLS_08AUG   |  1740K|    82M|       |   224K  (5)| 00:44:49 |
    Predicate Information (identified by operation id):
       1 - access("CRM"."SUBSCR_NO"="B"."SUBSCR_NO")
       2 - access("A"."PROFILE_ID"="B"."PROFILE_ID")
       3 - filter("B"."ACE_STATUS"='N')
       5 - filter("CRM"."END_DT" IS NULL AND "CRM"."SRVC_TYPE_ID"='125')
    Whereas for the modified MView query, the plan remains the same:
    select /*+ ALL_ROWS */ crm.SUBSCR_NO, crm.ACCOUNT_NO, ltrim(crm.AREA_CODE,'0'), crm.DEL_NO, crm.CIRCLE_ID
    from    (select * from MV_CRM_SUBS_DTLS
             where SRVC_TYPE_ID = '125'
             and   END_DT IS NULL) crm,
            (select /*+ ALL_ROWS */  A.ALTERNATE_CONTACT, A.CREATED_DATE, A.EMAIL_ID, B.SUBSCR_NO
            from MCCIPRD.MCCI_PROFILE_DTLS a, MCCIPRD.MCCI_PROFILE_SUBSCR_DTLS b
            where A.PROFILE_ID = B.PROFILE_ID
            and   B.ACE_STATUS = 'N'
            ) aa
    where crm.SUBSCR_NO  = aa.SUBSCR_NO
    | Id  | Operation              | Name                     | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT       |                          |  1481K|   100M|       |   245K  (5)| 00:49:09 |
    |*  1 |  HASH JOIN             |                          |  1481K|   100M|    46M|   245K  (5)| 00:49:09 |
    |*  2 |   HASH JOIN            |                          |  1480K|    29M|    38M| 13884   (9)| 00:02:47 |
    |*  3 |    TABLE ACCESS FULL   | MCCI_PROFILE_SUBSCR_DTLS |  1480K|    21M|       |  3383  (13)| 00:00:41 |
    |   4 |    INDEX FAST FULL SCAN| SYS_C002680              |  2513K|    14M|       |  6024   (5)| 00:01:13 |
    |*  5 |   MAT_VIEW ACCESS FULL | MV_CRM_SUBS_DTLS_08AUG   |  1740K|    82M|       |   224K  (5)| 00:44:49 |
    Predicate Information (identified by operation id):
       1 - access("CRM"."SUBSCR_NO"="B"."SUBSCR_NO")
       2 - access("A"."PROFILE_ID"="B"."PROFILE_ID")
       3 - filter("B"."ACE_STATUS"='N')
       5 - filter("CRM"."END_DT" IS NULL AND "CRM"."SRVC_TYPE_ID"='125')
    Also took your advice and tried to merge all the queries into a single INSERT SQL; will be posting the results shortly.
    Edited by: BluShadow on 30-Aug-2011 10:21
    added {noformat}{noformat} tags. Please read {message:id=9360002} to learn to do this yourself
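    For anyone hitting the same problem: one way to cut down the row-by-row PL/SQL work is to move the lookups into a single set-based UPDATE (or into the initial INSERT), avoiding the per-row context switches of the cursor loop. Below is a minimal, untested sketch using the table names from the posted code and covering only the E_BILL and PAID_AMNT columns; the remaining columns would follow the same pattern:
    update MCCIPRD.MY_ACCOUNT_PHOTON_DUMP d
    set d.E_BILL = (select decode(count(*), 0, 'NO', 'YES')
                    from   TTL_CUST_ADD_DTLS@MCCI_TO_PRD591 c
                    where  c.CUST_ACCT_NBR = d.ACCOUNT_NO
                    and    upper(c.CUSTOMER_PREF_MODE) = 'EMAIL'),
        d.PAID_AMNT = (select nvl(sum(b.TRANS_AMOUNT), 0)
                       from   TTL_BMF@MCCI_TO_PRD591 b
                       where  b.ACCOUNT_NO = d.ACCOUNT_NO
                       and    b.POST_DATE between to_date('01-'||to_char(sysdate,'MM-YYYY'),'DD-MM-YYYY')
                                              and sysdate);
    commit;
    A further step would be to rewrite the scalar subqueries as outer joins in a MERGE or in the initial INSERT so that each remote table is scanned once per statement.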

  • How do I fix the format of plot lines for an arbitrary number of sets of data plotted on the same x-y graph?

    I have written a VI to control an LCR meter and perform sweep functions. I have two x-y graphs on my front panel (one for each of 2 measured parameters). The VI is designed with a button to take a measurement, plot the data, and then wait for another event (like write data, clear plot, or measure again). The user can take simultaneous measurements and these are plotted on top of the data already there in the same x-y graphs (using an array of clusters with shift registers). There is no limit on how many sets of data can be on the x-y graph. 
    Now, my issue is in formatting the plot lines (color, thickness, marker type). I can fix the first plot line by adjusting the parameters of the x-y graph. However, I also want to control the format of all consecutive plot lines. How can I do this for an arbitrary number of lines? The main thing I want to do is to have all of them show the square marker type (the default is no marker type, just a line). 
    Thanks!

    Hi maximum23,
    You can use the plot properties to format the plot lines. You just have to select the plot with the Active Plot property and then set the color, line width, and so on.
    You can let the user define the color, width, and other parameters, or you can just use defaults.
    Gaurav k
    CLD Certified !!!!!
    Do not forget to Mark solution and to give Kudo if problem is solved.

  • Pie chart with two measures and date dimension navigation not working

    Hi Experts,
    Pie chart with two measures and date dimension navigation not working. Any help is appreciated.
    Thanks
    V

    Hi Deepak,
    I have a time dimension in the RPD.
    I have a stacked bar chart with the same time dimension (year & month) in the report. When I go to the legend and set navigation it works fine, but not with the pie chart.
    I am not sure what the problem is. When I click on the pie chart it does not navigate to the target report. Could it be some other issue?

  • Problem with JFMERGE using the last set of data on all form sets

    Here is the scenario...
    I have a file of data in character delimited format. Each record needs to produce the same set of forms. I used the Visual Transformation Editor to define the field nominated data file, including the ^form statements. The transformation agent processes the file of multiple records just fine. I get a file from JFTRANS that contains the set of data & form statements repeated for each record in the file.
    Great so far.
    Here is the problem...
    Unfortunately, the print agent is repeating the same set of data on every copy of the set of forms that is printed. It uses the data from the last set of forms to be printed.
    Additional information...
    For ease of building the field nominated file by our custom client program we have standardized on using GLOBAL fields. Therefore, our DAT files contain all the ^global declarations with all the necessary ^form statements at the end. I used the transformation editor to build the same type of file. Therefore the file going to the print agent looks like:
    ^global name1
    data1
    ^global name2
    data2
    ^form form1
    ^form form2
    ^global name1
    newdata1
    ^global name2
    newdata2
    ^form form1
    ^form form2
    The printed result puts "newdata1" & "newdata2" on both sets of form2. Strangely, the correct data values are showing up on both sets of form1.
    The only difference between the two forms is that form1 is not defined with the fields as global where form2 does define them as global. I removed the global flag from form2 but that didn't change anything.
    I think I know what is occurring. JFMERGE is putting the data elements into the global dictionary, and when the same name appears again in the file the latest data value ends up in the dictionary (thus all the forms print using the last data put into the dictionary). I certainly didn't expect this. I thought it would load the dictionary and print the first set of forms, then reload the dictionary and print the next set of forms. At least that is the way I see it occurring based on the order of things in the DAT file. Based on what is occurring, it appears that JFMERGE reads the whole DAT file, builds the dictionaries, and then processes all the ^form statements.
    My request for help...
    Does anyone know what I can do (such as insert some command) to get the correct data on the forms?
    It looks like using a fixed record format might do the trick but that isn't an option at this time. As things are right now it looks like I'm going to have to take the data file apart, and make lots of files with one line of data per file.
    I have a time deadline to get this printed so if there isn't a way to make JFMERGE operate as I expected it looks like I'll be doing this by hand. Fortunately, there are only 250-300 data lines. Future jobs could have thousands of data lines so if there isn't a way to do it with a multi-line data file I'll have someone create a program for splitting the original file up (and adding the ^job line).

    We don't use subforms so some of what you suggest isn't effective for us. Our forms are a duplication of the paper forms and are not dynamic. We design each form separately and our custom filler software (that builds the DAT file) uses a database with the required forms for each print job defined within it.
    Since we don't have subforms, our simple method of printing is to declare as GLOBAL all the unique fields used on all the forms to be printed, followed by the ^form commands. It was a simple and straightforward method for us to do better than 500 print jobs over the past couple of years. In addition, our custom filler software can easily build the DAT files.
    Because I need to manually declare all the ^form statements, my experience with using ^field is that the form must be declared followed by the appropriate field commands (repeat for all the forms).
    (Many hours later I continue typing this post.)
    Taking the original Transformation Definition File and customizing it for how things need to be when using ^field commands was a time-consuming experience (and a learning one). It did the job as I needed but it is an experience I don't want to repeat. There has got to be a better way of building the TDF properly. I'm definitely leaning towards a program to split the file into multiple files.
    Here is what I went through...
    I did all the editing using Notepad. I changed all ^global to ^field. Then I had to remove the #comment lines. Next was to find any references to fields that had to be global, move them to the top, and change ^field back to ^global. Then it was moving the fields around so they were underneath the appropriate ^form statement. Then, since one of the forms was multi-page, I had to add the ^page command and sort the fields so they were under the correct page. That was followed by duplicating fields that are used on multiple forms or multiple pages. Then I discovered that the print agent won't do ^page unless "inline processing" is turned off (I did not want to specify the page name as it could change over time), so I added "^inline off" in front of every ^page. Finally it was removing unnecessary field references so the log file would be clean.
    For those that are interested, the following is an example of my final file:
    ^global nameA
    data
    ^global nameB
    data
    ^form nameX
    ^field name1
    data
    ^field name2
    data
    ^form nameY
    ^field name1
    data
    ^field name3
    data
    ^inline off
    ^page
    ^field name4
    data
    ^field name1
    ^inline off
    ^page
    ^field name5
    data
    ^field name6
    data

  • Cut off some data from a set of data

    I have a set of data from a power spectral density measurement, and I want to do curve fitting on it.
    But I want to cut off everything below a certain frequency. For instance, the PSD data contains {(1,10),(2,3),(3,4),(4,3),...} and I want to fit starting from 3 Hz, i.e. {(3,4),(4,3),...}.
    I tried a high-pass filter but it does not change anything. I tried several ways, such as getting the time stamp, finding the index of the element, and removing elements from the array using the 'Array' tools. That cuts off the data sets that I wanted to remove, but it does not remove the frequency values. So it works like {(1,4),(2,3),...} instead of {(3,4),(4,3),...}. Please let me know how I can make this work.
    Thanks.
    Attachments:
    new-PSD-factor-v3.vi ‏522 KB

    Hey Arrow,
    I think I understand your question better now. You'll want to use the "delete from array.vi". It can be found by going to all functions>>Array. You'll need to input the array to be edited, which index to begin deleting from and how many elements to delete. From your application it sounds like you'll want to start from element zero and delete the number of elements that corresponds to 0 through 4 Hz.
    In addition, you'll need to create an array that represents the frequencies you want to plot (4 1000 Hz or your upper limit). There'll need to be one frequency on your x axis for each element in your PSD array. Then bundle the two arrays together and plot them on an XY plot. You can use the "bundle" vi to accomplish this. You can find examples using clusters from help>>find examples. Then navigate to Fundamentals>>Arrays and Clusters.
    Hope this helps out.
    Chris Cilino
    National Instruments
    LabVIEW Product Marketing Manager
    Certified LabVIEW Architect

  • PreparedStatement set date sometimes sets the date one day behind

    I have a PreparedStatement that sometimes sets the date a day behind. I am saving to a MSSQL DB with a datetime field. I have two identical PreparedStatements, one for insert and one for update. When either is executed, it will sometimes set the date back one day. It's not every time; every other or every third one, but it's not consistent. Any help would be appreciated.
    ps.setDate(1, Util.parseSqlDate(getParam("CHARGED")));
    public class Util {
         public static java.sql.Date parseSqlDate(String datestr) {
              DateFormat sdf = null;
              if (datestr == null)
                   return null;
              if (datestr.length() > 8) {
                   sdf = new SimpleDateFormat("MM/dd/yyyy");
              } else {
                   sdf = new SimpleDateFormat("MM/dd/yy");
              }
              java.util.Date d = null;
              try {
                   d = sdf.parse(datestr);
              } catch (ParseException e) {
                   return null;
              }
              if (d != null) {
                   Calendar cal = Calendar.getInstance();
                   cal.setTime(d);
                   return new java.sql.Date(cal.getTimeInMillis());
              }
              return null;
         }
         protected String getParam(String name) {
              return (getParamArray(name) == null) ? null : getParamArray(name)[0];
         }
         protected String[] getParamArray(String name) {
              return (String[]) params.get(name);
         }
    }

    traigo wrote:
    > The database is a datetime field.
    Then you should be using the appropriate Java JDBC time/date methods to access it rather than strings.
    > We are only storing the date portion. Saving with today's date should produce '2009-12-28 00:00:00.000'.
    > I just want to set the date to an absolute date (no time value) provided without timezones.
    Impossible. Since the database datatype is datetime, a timezone is always involved.
    And Java always uses timezones. Ignoring the problem doesn't make it go away.
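    To illustrate the advice above, here is a minimal sketch (not the poster's code; the class name and SQL statement are made up) of parsing the string and binding it with an explicit Calendar, so the JDBC driver and the JVM agree on the timezone and the stored date does not drift by a day:
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.text.SimpleDateFormat;
    import java.util.Calendar;
    import java.util.TimeZone;

    public class ChargedDateBinder {
        // Parse "MM/dd/yyyy" in a fixed zone (UTC here) instead of the JVM default zone.
        static java.sql.Date parseSqlDate(String datestr) throws java.text.ParseException {
            SimpleDateFormat sdf = new SimpleDateFormat("MM/dd/yyyy");
            sdf.setTimeZone(TimeZone.getTimeZone("UTC"));
            return new java.sql.Date(sdf.parse(datestr).getTime());
        }

        static void bindCharged(Connection con, String charged) throws Exception {
            // Hypothetical statement; the point is the three-argument setDate call.
            PreparedStatement ps = con.prepareStatement("UPDATE charges SET charged = ? WHERE id = ?");
            // setDate(int, java.sql.Date, Calendar) tells the driver which timezone the value is in.
            Calendar utc = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
            ps.setDate(1, parseSqlDate(charged), utc);
            ps.setInt(2, 1);
            ps.executeUpdate();
            ps.close();
        }
    }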

  • Mdx error : Query (11, 9) Two sets specified in the function have different dimensionality.

    Hi all
    I am getting the following error for the MDX below; what may be the issue?
    Query (11, 9) Two sets specified in the function have different dimensionality.
    WITH SET LastSample AS TAIL(NONEMPTY([Date].[Hierarchy].members *[Time].[Hierarchy].members,[Measures].[Overall LU SR]))
    MEMBER [KPI Name] AS "Overall LU SR"
    MEMBER [KPI Value] As KPIValue("Overall LU SR")
    MEMBER [KPI Status] AS KPIStatus("Overall LU SR")
    MEMBER ObjectName as "Overall LU SR"
    MEMBER [Critical Threshold] as  [KPI Thresholds].[Threshold1].&[Overall LU SR].member_value
    MEMBER [Major Threshold] AS  [KPI Thresholds].[Threshold2].&[Overall LU SR].member_value
    MEMBER [Minor Threshold] AS 0
    MEMBER latestDate as [Date].[Hierarchy].membervalue, format_string = 'mm/dd/YYYY'
    MEMBER latestTime as [Time].[Hierarchy].member_caption
    SELECT  {latestDate,latestTime,ObjectName, [KPI Name], [KPI Value],[KPI Status],[Measures].[Critical Threshold],[Measures].[Major Threshold],[Measures].[Minor Threshold]} ON COLUMNS
    ,nonempty(LastSample) ON ROWS
    FROM [DRA]
    Surendra Thota

    Hi all,
    I got the solution: I had to add the Measures dimension to the name, i.e. [Measures].[KPI Name].
    WITH SET LastSample AS TAIL(NONEMPTY([Date].[Hierarchy].members *[Time].[Hierarchy].members,[Measures].[Overall LU SR]))
    MEMBER [KPI Name] AS "Overall LU"
    MEMBER [KPI Value] As KPIValue("Overall LU SR")
    MEMBER [KPI Status] AS KPIStatus("Overall LU SR")
    MEMBER ObjectName as "Overall LU SR"
    MEMBER [Critical Threshold] as  [KPI Thresholds].[Threshold1].&[Overall LU SR].member_value
    MEMBER [Major Threshold] AS  [KPI Thresholds].[Threshold2].&[Overall LU SR].member_value
    MEMBER [Minor Threshold] AS 0
    MEMBER latestDate as [Date].[Hierarchy].membervalue, format_string = 'mm/dd/YYYY'
    MEMBER latestTime as [Time].[Hierarchy].member_caption
    SELECT  {latestDate,latestTime,ObjectName,[Measures].[KPI Name], [KPI Value],[KPI Status],[Measures].[Critical Threshold],[Measures].[Major Threshold],[Measures].[Minor Threshold]} ON COLUMNS
    ,nonempty(LastSample) ON ROWS
    FROM [DRA]
    Surendra Thota
