Perl API: growing memory problem in loops over large sets of data

Hi,
When going through all XmlResults like this:
while ($results->next($val)) {
    print $val->asString, "\n";
}
The process size keeps growing. It does not grow when I comment out the $val->asString call, but then I have no way of getting at the results.
This becomes a significant problem when the number of results is huge. I am doing this on a database of over a million short XML documents (400-800 bytes each).
The more complete code is here:
eval {
    $env = new DbEnv();
    $env->open($dbDir, Db::DB_JOINENV | Db::DB_INIT_LOCK
                       | Db::DB_INIT_MPOOL | Db::DB_CREATE, 0);
    my $mgr = new XmlManager($env, DbXml::DBXML_ADOPT_DBENV);
    my $db = $mgr->openContainer(undef, $dbName, Db::DB_RDONLY);
    my $context = $mgr->createQueryContext(XmlQueryContext::LiveValues,
                                           XmlQueryContext::Lazy);
    my $lookup = $mgr->createIndexLookup($db, "", $nodeName,
                                         "node-$nodeType-equality-$syntax",
                                         new XmlValue($types{$nodeType}, $value),
                                         XmlIndexLookup::GTE);
    my $results = $lookup->execute(undef, $context);
    my $val = new XmlValue();
    while ($results->next($val)) {
        print $val->asString, "\n";
    }
};
if (my $e = catch std::exception) {
    die $e->what() . "\n";
}
The process size just grows until the system limit is reached, then the process quits saying 'Out of memory'.
I suspect the problem is with the std::string returned by the C++ XmlValue::asString() const.
The (left-hand-side) result string is probably allocated with new std::string and receives its value via the string copy operator. The Perl scalar result is then prepared from it, but when it is returned to my code the C++ string is never deleted.
Moving the Sleepycat::XmlValue Perl object inside the loop does not help either:
while ($results->hasNext()) {
    my $val = new XmlValue();
    $results->next($val);
    print $val->asString, "\n";
}
In fact, the process seems to grow even faster, possibly because the old $val instances are not destroyed by Perl at the end of each iteration. Where is Perl's garbage collection?
I am using DB XML version 2.2.13, BDB version 4.4.20.2, on FreeBSD 6-STABLE. However, the problem is probably common to any OS or BDB XML version, since it involves the Perl-to-C++ interface.
Has anyone experienced similar problems?
Thanks,
Konstantin.
Konstantin @ Chuguev.com

Good catch - you found a memory leak. Luckily the fix is very straightforward. Edit the file
dbxml/src/perl/common.h
and find this line:
#define newSVfromString(str) newSVpvn(str.c_str(), str.length())
Change it to this:
#define newSVfromString(str) sv_2mortal(newSVpvn(str.c_str(), str.length()))
and recompile the module. The sv_2mortal call marks the newly created scalar as temporary, so Perl releases it at the end of the statement instead of leaking one SV per asString() call.
Paul

Similar Messages

  • Problem while having a large set of data to work on!

    Hi,
    I am facing a serious problem processing a large set of data. I have a requirement to generate a report.
    I have a table and an MView, which I have joined to reduce the number of records to process. The MView holds 20,000,000 records and the table 1,800,000. Based on the join conditions and the where clause I am able to bring the useful data down to approx 450,000 rows, and I am getting 8 of my report columns from this join. I am dumping these records into the table from which I will generate the report by spooling.
    Below is the block which takes 12mins to insert into the report table MY_ACCOUNT_PHOTON_DUMP:
    begin
    dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
    insert into MY_ACCOUNT_PHOTON_DUMP --- Report table
    (SUBSCR_NO, ACCOUNT_NO, AREA_CODE, DEL_NO, CIRCLE, REGISTRATION_DT, EMAIL_ID, ALT_CNTCT_NO)
    select crm.SUBSCR_NO, crm.ACCOUNT_NO, crm.AREA_CODE, crm.DEL_NO, crm.CIRCLE_ID,
    aa.CREATED_DATE, aa.EMAIL_ID, aa.ALTERNATE_CONTACT
    from MV_CRM_SUBS_DTLS crm, --- MView
    (select /*+ ALL_ROWS */ A.ALTERNATE_CONTACT, A.CREATED_DATE, A.EMAIL_ID, B.SUBSCR_NO
    from MCCI_PROFILE_DTLS a, MCCI_PROFILE_SUBSCR_DTLS b
    where A.PROFILE_ID = B.PROFILE_ID
    and B.ACE_STATUS = 'N'
    ) aa --- Join of two tables giving me 1,800,000 recs
    where crm.SUBSCR_NO = aa.SUBSCR_NO
    and crm.SRVC_TYPE_ID = '125'
    and crm.END_DT IS NULL;
    INTERNET_METER_TABLE_PROC_1('MCCIPRD','MY_ACCOUNT_PHOTON_DUMP'); --- calling procedure to analyze the report table
    COMMIT;
    dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
    end; --- 12 min 04 sec
    For the rest of the 13 columns required I am running a block which has a FOR UPDATE cursor on the report table:
    declare
    cursor cur is
    select SUBSCR_NO, ACCOUNT_NO, AREA_CODE, DEL_NO,
    CIRCLE, REGISTRATION_DT, EMAIL_ID, ALT_CNTCT_NO
    from MCCIPRD.MY_ACCOUNT_PHOTON_DUMP --where ACCOUNT_NO = 901237064
    for update of
    MRKT_SEGMNT, AON, ONLINE_PAY, PAID_AMNT, E_BILL, ECS, BILLED_AMNT,
    SRVC_TAX, BILL_PLAN, USAGE_IN_MB, USAGE_IN_MIN, NO_OF_LOGIN, PHOTON_TYPE;
    v_aon VARCHAR2(10) := NULL;
    v_online_pay VARCHAR2(10) := NULL;
    v_ebill VARCHAR2(10) := NULL;
    v_mkt_sgmnt VARCHAR2(50) := NULL;
    v_phtn_type VARCHAR2(50) := NULL;
    v_login NUMBER(10) := 0;
    v_paid_amnt VARCHAR2(50) := NULL;
    v_ecs VARCHAR2(10) := NULL;
    v_bill_plan VARCHAR2(100):= NULL;
    v_billed_amnt VARCHAR2(10) := NULL;
    v_srvc_tx_amnt VARCHAR2(10) := NULL;
    v_usg_mb NUMBER(10) := NULL;
    v_usg_min NUMBER(10) := NULL;
    begin
    dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
    for rec in cur loop
    begin
    select apps.TTL_GET_DEL_AON@MCCI_TO_PRD591(rec.ACCOUNT_NO, rec.DEL_NO, rec.CIRCLE)
    into v_aon from dual;
    exception
    when others then
    v_aon := 'NA';
    end;
    SELECT DECODE(COUNT(*),0,'NO','YES') into v_online_pay
    FROM TTL_DESCRIPTIONS@MCCI_TO_PRD591
    WHERE DESCRIPTION_CODE IN(SELECT DESCRIPTION_CODE FROM TTL_BMF_TRANS_DESCR@MCCI_TO_PRD591
    WHERE BMF_TRANS_TYPE
    IN (SELECT BMF_TRANS_TYPE FROM
    TTL_BMF@MCCI_TO_PRD591 WHERE ACCOUNT_NO = rec.ACCOUNT_NO
    AND POST_DATE BETWEEN
    TO_DATE('01-'||TO_CHAR(SYSDATE,'MM-YYYY'),'DD-MM-YYYY') AND SYSDATE
    AND DESCRIPTION_TEXT IN (select DESCRIPTION from fnd_lookup_values@MCCI_TO_PRD591 where
    LOOKUP_TYPE='TTL_ONLINE_PAYMENT')));
    SELECT decode(count( *),0,'NO','YES') into v_ebill
    FROM TTL_CUST_ADD_DTLS@MCCI_TO_PRD591
    WHERE CUST_ACCT_NBR = rec.ACCOUNT_NO
    AND UPPER(CUSTOMER_PREF_MODE) ='EMAIL';
    begin
    select ACC_SUB_CAT_DESC into v_mkt_sgmnt
    from ttl_cust_dtls@MCCI_TO_PRD591 a, TTL_ACCOUNT_CATEGORIES@MCCI_TO_PRD591 b
    where a.CUST_ACCT_NBR = rec.ACCOUNT_NO
    and a.market_code = b.ACC_SUB_CAT;
    exception
    when others then
    v_mkt_sgmnt := 'NA';
    end;
    begin
    select nvl(sum(TRANS_AMOUNT),0) into v_paid_amnt
    from ttl_bmf@MCCI_TO_PRD591
    where account_no = rec.ACCOUNT_NO
    AND POST_DATE
    BETWEEN TO_DATE('01-'||TO_CHAR(SYSDATE,'MM-YYYY'),'DD-MM-YYYY')
    AND SYSDATE;
    exception
    when others then
    v_paid_amnt := 'NA';
    end;
    SELECT decode(count(1),0,'NO','YES') into v_ecs
    from ts.Billdesk_Registration_MV@MCCI_TO_PRD591 where ACCOUNT_NO = rec.ACCOUNT_NO
    and UPPER(REGISTRATION_TYPE ) = 'ECS';
    SELECT decode(COUNT(*),0,'PHOTON WHIZ','PHOTON PLUS') into v_phtn_type
    FROM ts.ttl_cust_ord_prdt_dtls@MCCI_TO_PRD591 A, ttl_product_mstr@MCCI_TO_PRD591 b
    WHERE A.SUBSCRIBER_NBR = rec.SUBSCR_NO
    and (A.prdt_disconnection_date IS NULL OR A.prdt_disconnection_date > SYSDATE )
    AND A.prdt_disc_flag = 'N'
    AND A.prdt_nbr = b.product_number
    AND A.prdt_type_id = b.prouduct_type_id
    AND b.first_level LIKE 'Feature%'
    AND UPPER (b.product_desc) LIKE '%HSIA%';
    SELECT count(1) into v_login
    FROM MCCIPRD.MYACCOUNT_SESSION_INFO a
    WHERE (A.DEL_NO = rec.DEL_NO or A.DEL_NO = ltrim(rec.AREA_CODE,'0')||rec.DEL_NO)
    AND to_char(A.LOGIN_TIME,'Mon-YYYY') = to_char(sysdate-5,'Mon-YYYY');
    begin
    select PACKAGE_NAME, BILLED_AMOUNT, SERVICE_TAX_AMOUNT, USAGE_IN_MB, USAGE_IN_MIN
    into v_bill_plan, v_billed_amnt, v_srvc_tx_amnt, v_usg_mb, v_usg_min from
    (select rank() over(order by STATEMENT_DATE desc) rk,
    PACKAGE_NAME, USAGE_IN_MB, USAGE_IN_MIN,
    nvl(BILLED_AMOUNT,'0') BILLED_AMOUNT, NVL(SRVC_TAX_AMNT,'0') SERVICE_TAX_AMOUNT
    from MCCIPRD.MCCI_IM_BILLED_DATA
    where (DEL_NUM = rec.DEL_NO or DEL_NUM = ltrim(rec.AREA_CODE,'0')||rec.DEL_NO)
    and STATEMENT_DATE like '%'||to_char(SYSDATE,'Mon-YY')||'%')
    where rk = 1;
    exception
    when others then
    v_bill_plan := 'NA';
    v_billed_amnt := '0';
    v_srvc_tx_amnt := '0';
    v_usg_mb := 0;
    v_usg_min := 0;
    end;
    -- UPDATE THE DUMP TABLE --
    update MCCIPRD.MY_ACCOUNT_PHOTON_DUMP
    set MRKT_SEGMNT = v_mkt_sgmnt, AON = v_aon, ONLINE_PAY = v_online_pay, PAID_AMNT = v_paid_amnt,
    E_BILL = v_ebill, ECS = v_ecs, BILLED_AMNT = v_billed_amnt, SRVC_TAX = v_srvc_tx_amnt,
    BILL_PLAN = v_bill_plan, USAGE_IN_MB = v_usg_mb, USAGE_IN_MIN = v_usg_min, NO_OF_LOGIN = v_login,
    PHOTON_TYPE = v_phtn_type
    where current of cur;
    end loop;
    COMMIT;
    dbms_output.put_line(to_char(sysdate,'hh24:mi:ss'));
    exception when others then
    dbms_output.put_line(SQLCODE||'::'||SQLERRM);
    end;
    The report takes >6 hrs. I know that most of the SELECT queries have ACCOUNT_NO in the WHERE clause and can be joined, but when I joined a few of these blocks with the initial INSERT query it was no better.
    The individual queries within the cursor loop don't take more than 0.3 sec to execute.
    I'm using FOR UPDATE as I know the report table is being used solely for this purpose.
    Can somebody please help me with this? I'm in desperate need of good advice here.
    Thanks!!
    Edited by: user11089213 on Aug 30, 2011 12:01 AM

    Hi,
    Below is the explain plan for the original query:
    select /*+ ALL_ROWS */  crm.SUBSCR_NO, crm.ACCOUNT_NO, ltrim(crm.AREA_CODE,'0'), crm.DEL_NO, crm.CIRCLE_ID
    from MV_CRM_SUBS_DTLS crm,
            (select /*+ ALL_ROWS */  A.ALTERNATE_CONTACT, A.CREATED_DATE, A.EMAIL_ID, B.SUBSCR_NO
            from MCCIPRD.MCCI_PROFILE_DTLS a, MCCIPRD.MCCI_PROFILE_SUBSCR_DTLS b
            where A.PROFILE_ID = B.PROFILE_ID
            and   B.ACE_STATUS = 'N'
            ) aa
    where crm.SUBSCR_NO    = aa.SUBSCR_NO
    and   crm.SRVC_TYPE_ID = '125'
    and   crm.END_DT IS NULL
    | Id  | Operation              | Name                     | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT       |                          |  1481K|   100M|       |   245K  (5)| 00:49:09 |
    |*  1 |  HASH JOIN             |                          |  1481K|   100M|    46M|   245K  (5)| 00:49:09 |
    |*  2 |   HASH JOIN            |                          |  1480K|    29M|    38M| 13884   (9)| 00:02:47 |
    |*  3 |    TABLE ACCESS FULL   | MCCI_PROFILE_SUBSCR_DTLS |  1480K|    21M|       |  3383  (13)| 00:00:41 |
    |   4 |    INDEX FAST FULL SCAN| SYS_C002680              |  2513K|    14M|       |  6024   (5)| 00:01:13 |
    |*  5 |   MAT_VIEW ACCESS FULL | MV_CRM_SUBS_DTLS_08AUG   |  1740K|    82M|       |   224K  (5)| 00:44:49 |
    Predicate Information (identified by operation id):
       1 - access("CRM"."SUBSCR_NO"="B"."SUBSCR_NO")
       2 - access("A"."PROFILE_ID"="B"."PROFILE_ID")
       3 - filter("B"."ACE_STATUS"='N')
       5 - filter("CRM"."END_DT" IS NULL AND "CRM"."SRVC_TYPE_ID"='125')
    Whereas for the modified MView query, the plan remains the same:
    select /*+ ALL_ROWS */ crm.SUBSCR_NO, crm.ACCOUNT_NO, ltrim(crm.AREA_CODE,'0'), crm.DEL_NO, crm.CIRCLE_ID
    from    (select * from MV_CRM_SUBS_DTLS
             where SRVC_TYPE_ID = '125'
             and   END_DT IS NULL) crm,
            (select /*+ ALL_ROWS */  A.ALTERNATE_CONTACT, A.CREATED_DATE, A.EMAIL_ID, B.SUBSCR_NO
            from MCCIPRD.MCCI_PROFILE_DTLS a, MCCIPRD.MCCI_PROFILE_SUBSCR_DTLS b
            where A.PROFILE_ID = B.PROFILE_ID
            and   B.ACE_STATUS = 'N'
            ) aa
    where crm.SUBSCR_NO  = aa.SUBSCR_NO
    | Id  | Operation              | Name                     | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT       |                          |  1481K|   100M|       |   245K  (5)| 00:49:09 |
    |*  1 |  HASH JOIN             |                          |  1481K|   100M|    46M|   245K  (5)| 00:49:09 |
    |*  2 |   HASH JOIN            |                          |  1480K|    29M|    38M| 13884   (9)| 00:02:47 |
    |*  3 |    TABLE ACCESS FULL   | MCCI_PROFILE_SUBSCR_DTLS |  1480K|    21M|       |  3383  (13)| 00:00:41 |
    |   4 |    INDEX FAST FULL SCAN| SYS_C002680              |  2513K|    14M|       |  6024   (5)| 00:01:13 |
    |*  5 |   MAT_VIEW ACCESS FULL | MV_CRM_SUBS_DTLS_08AUG   |  1740K|    82M|       |   224K  (5)| 00:44:49 |
    Predicate Information (identified by operation id):
       1 - access("CRM"."SUBSCR_NO"="B"."SUBSCR_NO")
       2 - access("A"."PROFILE_ID"="B"."PROFILE_ID")
       3 - filter("B"."ACE_STATUS"='N')
       5 - filter("CRM"."END_DT" IS NULL AND "CRM"."SRVC_TYPE_ID"='125')
    I also took your advice and tried to merge all the queries into a single INSERT SQL; I will be posting the results shortly.
    Edited by: BluShadow on 30-Aug-2011 10:21
    added {noformat}{noformat} tags. Please read {message:id=9360002} to learn to do this yourself

  • Looping over arraycollection with sap data

    Hello,
    I have asked a lot of people but I still have a problem.
    I am trying to make an organizational chart in Adobe Flex with data from the SAP system.
    When I try to access the data that I passed in through the Flash Islands container, it doesn't work.
    I do:
    [Bindable]
    public var datasource:ArrayCollection;
    Then I want to loop over this datasource with a for loop but it says that the object is null.
    for (i = 0; i < datasource.length; i++)
    When I use datasource as the dataProvider for a DataGrid in Flex, it works perfectly.
    Please answer fast because I need it for school.
    greetings


  • Problem to update very large volume of data for 2LIS_04* extr.

    Hi
    I have a problem with jobs for the 2LIS_04* extractors using queued delta.
    There is an interface between the R3 system and another production system, and 3 or 4 times a month a very large volume of data is sent to R3.
    The job then runs very long and does not pull data to RSA7.
    How can I resolve this problem?
    Our R3 system is PI_BASIS 2005_1_620.
    Thanks
    Adam

    You can check these SAP Notes; they will help you:
    How can downtime be reduced for setup table update
    SAP Note Number: 753654
    Performance improvement for filling the setup tables
    SAP Note Number: 436393
    LBWE: Performance for setup of extract structures
    SAP Note Number: 437672

  • Problem while looping through record set and tem table for matching data

    Hi, I am using one record set and one temp table and looping through both to find matches between dates.
    If a date matches then it should do some processing; otherwise it will return default values (null values).
    FOR i IN student_rec .FIRST..student_rec .LAST          /*student_rec.school_date has 01-MAR-2012,02-MAR-2012,03-MAR-2012,04-MAR-2012,05-MAR-2012*/
    LOOP
    l_return_out.school_date := student_rec(i).school_date;
    l_return_out.marks_obtained := student_rec(i).marks_obtained;
    FOR i IN selected_dates .FIRST..selected_dates .LAST          /*selected_dates has 02-MAR-2012,03-MAR-2012,05-MAR-2012*/
    LOOP
    DBMS_OUTPUT.PUT_LINE(selected_dates(i));
    IF selected_dates(i)=student_rec(i).sett_date
    THEN
    EXIT;
    end if;
         ---------call procedure P1
    -----------get output as ret_val1               /*getting ret_val1 as 10 for 02-MAR-2012,03-MAR-2012,05-MAR-2012 */
         ----------call procedure P2
    ---------get ouput as ret_val2               /*getting ret_val1 as 20 for 02-MAR-2012,03-MAR-2012,05-MAR-2012 */
    if ret_val1 > 0 or ret_val2 > 0
    then
    l_return_out.has_csts := yes;
    l_return_out.cst_present := 10;
    l_return_out.cst_absent := 20;
    else
    l_return_out.has_csts :=No;
    l_return_out.cst_present:= 0;
    l_return_out.cst_absent := 0;
    end if;
    end loop;
    l_return_out.has_cst := student_rec(i).has_csts;
    l_return_out.cst_missing := student_rec(i).cst_present;
    l_return_out.cst_existing := student_rec(i).cst_absent;
    PIPE ROW(l_return_out);
    END LOOP;
    RETURN ;
    I am expecting this as result
    school_date     marks_obtained     has_csts     cst_present cst_absent
    01-MAR-2012     20          
    02-MAR-2012     30          yes 10          20
    03-MAR-2012     40           yes 10          20
    04-MAR-2012     70          
    05-MAR-2012     60          yes 10          20
    but this as result
    school_date     marks_obtained     has_csts     cst_present cst_absent
    01-MAR-2012     20          
    02-MAR-2012     30          
    03-MAR-2012     40           
    04-MAR-2012     70          
    05-MAR-2012     60          
    Can anybody please highlight the mistake I am making in this logic?
    Edited by: 942390 on Jul 13, 2012 8:44 PM
    Edited by: 942390 on Jul 13, 2012 8:45 PM

    I am getting a set of values from a record set: student_rec.
    On pipelining this record set from the first row to the last,
    i am getting this
    school_date     marks_obtained     has_csts     cst_present cst_absent
    01-MAR-2012     20          
    02-MAR-2012     30          
    03-MAR-2012     40           
    04-MAR-2012     70          
    05-MAR-2012     60     
    so initially has_csts, cst_present and cst_absent are null for all dates.
    Now I have a temp table of selected_dates (which contains these dates: 02-MAR-2012, 03-MAR-2012, 05-MAR-2012).
    Now I want to populate has_csts, cst_present and cst_absent with data only for those dates which are present in the selected_dates temp table (02-MAR-2012, 03-MAR-2012, 05-MAR-2012), and has_csts, cst_present and cst_absent will be populated based on a condition (on the values returned from procedures P1 and P2).
    so want result set to look like
    school_date     marks_obtained     has_csts     cst_present cst_absent
    01-MAR-2012     20          
    02-MAR-2012     30          yes 10          20
    03-MAR-2012     40           yes 10          20
    04-MAR-2012     70          
    05-MAR-2012     60          yes 10          20
    So what could be a possible solution to obtain this?

  • Looping over multiple rows of data to output one row

    My query is giving me the appropriate data, but I need to
    only have one row per 'MANAGER' in my output - not one for each
    'JOB'. Now the output looks like this:
    manager 1 ... period 1 $ .... period 2 $ .... period 3 $
    manager 1... period 1 $ .... period 2 $ .... period 3 $
    manager 1... period 1 $ .... period 2 $ .... period 3 $
    manager 1... period 1 $ .... period 2 $ .... period 3 $
    manager 2... period 1 $ .... period 2 $ .... period 3 $
    manager 2... period 1 $ .... period 2 $ .... period 3 $
    manager 2... period 1 $ .... period 2 $ .... period 3 $
    manager 2... period 1 $ .... period 2 $ .... period 3 $
    It needs to be:
    Manager 1 ... total period 1 $ .... total period 2 $ ....
    Manager 2 ... total period 1 $ .... total period 2 $ ....
    Any help would be wonderful. I am really new at something
    this complex. Thanks!!!

    To fix your immediate issue, you need to order and group by
    manager, not MPR_ID (whatever that is).
    What is dm_id? Use descriptive names!
    Anyway, change the ORDER BY clause to:
    ORDER BY
    RR.LastName+', '+RR.FirstName, <!--- Or maybe dm_id??
    --->
    AA.MPR_ID
    Then change the first <cfoutput> to:
    <cfoutput query="production" group="Rep_name">

  • Looping through a set of data

    seq     |     id|     Name|     TotalPc
    1     |     22|     ABA |     20
    1     |     23|     ABB |     20
    2     |     24|     ABC |     20
    3     |     25|     ABD |     20
    3     |     26|     ABE |     20
    3     |     27|     ABF |     20
    4     |     28|     ABG |     20
    5     |     29|     ABH |     20
    6     |     30|     ABI |     20
    6     |     31|     ABJ |     20
    6     |     32|     ABK |     20
    6     |     33|     ABL |     20
    The above result is returned in a resultset. What I would like to achieve is to store these rows in an object structured as shown below:
    public class Name {
         String seq;
         String name;
         ArrayList namePcs = new ArrayList();
    }
    public class NamePcs {
         String id;
         String totalPcs;
    }
    Now as you can see in the data, some of the rows belong to the same element. For example, all those with seq=6 should be stored in the same object because they belong to the same group, i.e. 6.
    I tried several combinations of approaches but I always seem to either miss the last one, duplicate them, or exhaust the resultset.
    Here are some examples of what I tried to do.
    Example 1 - This example copies every name into every element.
    while (rs.next()) {
         Name name = new Name();
         name.seq = rs.getString("seq");
         name.name = rs.getString("name");
         NamePcs namePcs = new NamePcs();
         namePcs.id = rs.getString("id");
         namePcs.totalPcs = rs.getString("totalPcs");
         name.namePcs.add(namePcs);
         seq_no = rs.getString("seq");
    }
    Example 2 - In this example I try to save the child elements only if the seq number is the same, but I still get duplicates.
    String seq_no = "";
    while (rs.next()) {
         if (!seq_no.equals(rs.getString("seq_no"))) {
              Name name = new Name();
              name.seq = rs.getString("seq");
              name.name = rs.getString("name");
         }
         if (seq_no.equals(rs.getString("seq_no"))) {
              NamePcs namePcs = new NamePcs();
              namePcs.id = rs.getString("id");
              namePcs.totalPcs = rs.getString("totalPcs");
              name.namePcs.add(namePcs);
         }
         seq_no = rs.getString("seq");
    }
    I also tried an inner while loop with an inner rs.next(), but that exhausts the resultset. Is there a better way to do this? Thanks in advance.
    Edited by: ziggy on Sep 23, 2008 9:10 AM

    Hi,
    Sorry for not being clear enough. The above was just a quick example to try to show what I am trying to achieve.
    What I am trying to achieve is to store the data in objects. The data itself has a one-to-many relationship, meaning that many names will be related to the same sequence number. I would like to store the common names (i.e. those with the same seq no) in an ArrayList in the same object.
    Maybe this data structure makes more sense:
    public class seq {
         String seq;
         ArrayList namePcs = new ArrayList(); // will contain all names with the same sequence
    }
    public class NamePcs {
         String id;
         String name;
         String totalPcs;
    }
    What I'm hoping to achieve is the following:
    - The seq object will contain one seq value (either 1, 2, 3, 4, 5 or 6).
    - The array list in the seq object should contain all the rows for that individual seq.
    - As an example, if seq[3].seq = 6, then seq[3].namePcs should contain all the NamePcs objects that have ids 30, 31, 32, 33.
    Here is what i am hoping to achieve as the end result (for the first 6 rows)
    seq[0]
         seq = 1
         namePcs[0]
              id = 22
              name = ABA
              TotalPcs = 20
         namePcs[1]
              id = 23
              name = ABB
              TotalPcs = 20
    seq[1]
         seq = 2
         namePcs[0]
              id = 24
              name = ABC
              TotalPcs = 20
    seq[2]
         seq = 3
         namePcs[0]
              id = 25
              name = ABD
              TotalPcs = 20
         namePcs[1]
              id = 26
              name = ABE
              TotalPcs = 20
          namePcs[2]
              id = 27
              name = ABF
               TotalPcs = 20
    I hope that made it a bit clearer.
    Thanks
    Edited by: ziggy on Sep 23, 2008 9:41 AM
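    For what it's worth, the usual way to build that structure is to keep track of the previous seq value and only start a new group when it changes. Below is a minimal sketch, assuming the rows come back ordered by seq (as in the sample data) and using the column names from the posted output; the class and method names are only illustrative and are not part of the original code.
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    public class SeqGrouper {

        static class Seq {
            String seq;
            List<NamePcs> namePcs = new ArrayList<NamePcs>();
        }

        static class NamePcs {
            String id;
            String name;
            String totalPcs;
        }

        // Open a new Seq group only when the seq column changes; otherwise keep
        // adding NamePcs entries to the group that is already open.
        static List<Seq> group(ResultSet rs) throws SQLException {
            List<Seq> groups = new ArrayList<Seq>();
            Seq current = null;
            while (rs.next()) {
                String seqNo = rs.getString("seq");
                if (current == null || !current.seq.equals(seqNo)) {
                    current = new Seq();
                    current.seq = seqNo;
                    groups.add(current);
                }
                NamePcs item = new NamePcs();
                item.id = rs.getString("id");
                item.name = rs.getString("Name");
                item.totalPcs = rs.getString("TotalPc");
                current.namePcs.add(item);
            }
            return groups;
        }
    }
    If the ordering by seq is not guaranteed, the same idea works with a Map keyed on seq instead of the single "current" reference.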

  • Looping over query by month

    Greetings,
    I have a query I am pulling that has a date field entitled, "Completed". I am attempting to loop over the query by date to slice and dice the data for a table and chart. Here is what I have done thus far...
    Set up two variables where I am only interested in the month. My plan is to filter the dates by month so I can pull the data out month by month.
        <cfset startDate = #DatePart('m','01/01/09')#>
        <cfset endDate = #DatePart('m',Now())#>
    Here is my loop...
        <cfloop from = "#startDate#" to = "#endDate#" index = "i" step = "1">
    Here is one of my QoQs within the loop...
            <cfquery name="NPS0" dbtype="query">
            SELECT *
            FROM rsNPS
            WHERE #DatePart('m',rsNPS.completed)# = #i#
            </cfquery>
    I am having difficulty getting this to work. Has anyone ever done something like this? I feel like the answer is right in front of me, but I have been staring at this code for a while. If anyone has any thoughts, I would be glad to hear them.
    ~Clay

    fs22 wrote:
             <cfquery name="NPS0" dbtype="query">
            SELECT *
            FROM rsNPS
            WHERE #DatePart('m',rsNPS.completed)# = #i#
            </cfquery>
    QoQ is a separate beast. You cannot use standard CF functions inside it. AFAIK, QoQ only supports a few functions like CAST, UPPER, LOWER, etc. So as Dan suggested, you should perform the date functions in your database query.

  • Out of memory problem using the API

    Hi all,
    I need your assistance. We are working with CDB 10.2, making searches and retrieving the documents with all their attributes.
    In our actual scenario we have a single user (which represents an application) accessing CDB. This user uses several persistent sessions simultaneously; I mean, several thousand end users connect to an application that uses one CDB user to connect to CDB with several persistent sessions.
    To simulate this scenario we wrote Java code that opens five threads and makes several searches (requesting all the attributes) using the same user on CDB.
    Retrieving a considerable amount of data from the search (~5000 documents), we hit an "Out of memory" problem in these tests:
    -     5 threads obtaining 100 documents (and all their attributes) / search
    -     1 thread obtaining 500 documents (and all their attributes) / search
    -     We also have the same problem if we make several searches with fewer results
    We suppose it is a configuration or code issue, so we ask for your assistance and experience in solving it.
    Thanks for your help,
    Dani
    import java.sql.Connection;
    import oracle.ifs.examples.api.constants.AttributeRequests;
    import oracle.ifs.examples.api.util.CommonUtils;
    import oracle.ifs.fdk.Attributes;
    import oracle.ifs.fdk.ClientUtils;
    import oracle.ifs.fdk.FdkConstants;
    import oracle.ifs.fdk.FdkCredential;
    import oracle.ifs.fdk.ManagersFactory;
    import oracle.ifs.fdk.NamedValue;
    import oracle.ifs.fdk.Options;
    import oracle.ifs.fdk.SearchExpression;
    import oracle.ifs.fdk.SearchManager;
    import oracle.ifs.fdk.SimpleFdkCredential;
    import oracle.jdbc.pool.OracleDataSource;
    public class Prueba {
         public static void main(String args[]) {
              Thread thread = new BasicThread1();
              Thread thread1 = new BasicThread1();
              Thread thread2 = new BasicThread1();
              Thread thread3 = new BasicThread1();
              Thread thread4 = new BasicThread1();
              thread.start();
              thread1.start();
              thread2.start();
              thread3.start();
              thread4.start();
         }
    }

    class BasicThread1 extends Thread {
         public void run() {
              ManagersFactory session = null;
              try {
                   System.out.println(this.getName() + "-->init");
                   session = getSession();
                   SearchManager sManager = session.getSearchManager();
                   SearchExpression srchExpr = new SearchExpression(Attributes.SIZE,
                             new Integer(20000000), FdkConstants.OPERATOR_LESS_THAN);
                   NamedValue[] res = null;
                   for (int i = 0; i < 100000; i++) {
                        res = sManager.search(srchExpr, basicSearchOptions2,
                                  AttributeRequests.DOCUMENT_CATEGORY_ATTRIBUTES);
                        System.out.println(this.getName() + " --> fin sin error: " + res.length);
                   }
              } catch (Throwable t) {
                   t.printStackTrace();
                   System.out.println("<--" + this.getName());
              } finally {
                   CommonUtils.bestEffortLogout(session);
              }
         }

         static NamedValue[] basicSearchOptions2 = new NamedValue[] {
                   ClientUtils.newNamedValue(Options.MULTILEVEL_FOLDER_RESTRICTION, Boolean.TRUE),
                   ClientUtils.newNamedValue(Options.SEARCH_FOR_DOCUMENTS, Boolean.TRUE),
                   ClientUtils.newNamedValue(Options.SEARCH_FOR_FOLDERS, Boolean.FALSE),
                   ClientUtils.newNamedValue(Options.RETURN_COUNT, new Integer(500)) // maximum number of elements
         };

         private static ManagersFactory getSession() throws Exception {
              OracleDataSource ods = new OracleDataSource();
              ods.setURL("URL");
              Connection conn = ods.getConnection();
              FdkCredential credential = new SimpleFdkCredential("USER", "PSW");
              ManagersFactory session = ManagersFactory.login(credential, "SERVER");
              return session;
         }
    }

    re-Post

  • Problem with Message-Mapping: Loop over Elements possible?

    Hi all,
    I want to create a Message Mapping for an IDoc-to-File scenario. In the source structure there are some elements which can appear more than once (1..unbounded). I need a mechanism that loops over these elements and searches for specified values. From the element which contains this specified value, the mapping should write a value into the target structure.
    Here is a simple example (source structure) for better understanding:
    <root>
       <invoice>
          <number> 10 </number>
          <sum> 200.00 </sum>
       </invoice>
       <invoice>
          <number> 20 </number>
          <sum> 150.00 </sum>
       </invoice>
       <invoice>
          <number> 30 </number>
          <sum> 120.00 </sum>
       </invoice>
    </root>
    Now the job of the mapping should be to search the <invoice> elements for the number 30; the sum of the invoice with number 30 should then be written into a field of the target structure.
    I tried it with a constant together with an equalsS function and an ifWithoutElse function, but it only works if the invoice with the number 30 is in the first position in the root context.
    Can anybody help me? Thanks
    With kind regards
    Christopher

    Hi,
    Write a UDF to sum the required values and map it to the target node.
    While writing the UDF, select the execution type "queue".
    number --> removeContext --> UDF --> target node
    sum    --> removeContext --/
    number and sum are the two inputs.
    In the UDF:
    double nsum = 0;
    for (int i = 0; i < number.length; i++) {
        if (number[i].equals("30")) {
            nsum = nsum + Double.parseDouble(sum[i]);
        }
    }
    result.addValue(String.valueOf(nsum)); // convert nsum into a String
    Regards,
    Chilla

  • Memory problems with PreparedStatements

    Driver: 9.0.1 JDBC Thin
    I am having memory problems using "PreparedStatement" via jdbc.
    After profiling our application, we found that a large number of oracle.jdbc.ttc7.TTCItem objects were being created but not released, even though we were "closing" the ResultSets of the prepared statements.
    Tracing through the application, it appears that most of these TTCItem objects are created when the statement is executed (not when it is prepared), so I would have assumed they would be released when the ResultSet is closed, but this does not seem to be the case.
    We tend to have a large number of PreparedStatement objects in use (over 100, most with closed ResultSets) and find that our application uses huge amounts of memory compared to the same code that closes the PreparedStatement at the same time as the ResultSet.
    Has anyone else found similar problems? If so, does anyone have a work-around or know if this is something that Oracle is looking at fixing?
    Thanks
    Bruce Crosgrove
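    To illustrate the pattern the poster describes (closing the PreparedStatement together with its ResultSet rather than keeping 100+ statements open), here is a minimal sketch. The query and helper name are made up for the example; this says nothing about what the 9.0.1 thin driver does internally.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class StatementCleanup {

        // Close both the ResultSet and the PreparedStatement as soon as the query
        // is done, so the driver can free its per-statement buffers immediately
        // instead of holding them for the lifetime of many cached statements.
        static int countRows(Connection conn, String tableName) throws SQLException {
            PreparedStatement ps = null;
            ResultSet rs = null;
            try {
                ps = conn.prepareStatement("SELECT COUNT(*) FROM " + tableName);
                rs = ps.executeQuery();
                return rs.next() ? rs.getInt(1) : 0;
            } finally {
                if (rs != null) {
                    try { rs.close(); } catch (SQLException ignored) { }
                }
                if (ps != null) {
                    try { ps.close(); } catch (SQLException ignored) { }
                }
            }
        }
    }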

    From your mail, it is not very clear:
    a) whether your session is an HTTPSession or an application defined
    session.
    b) What is meant by saying: JSP/Servlet is growing.
    However, some pointers:
    a) Are there any timeouts associated with session.
    b) Try to profile your code to see what is causing the memory leak.
    c) Are there references to stale data in your application code.
    Marilla Bax wrote:
    hi,
    we have some memory - problems with the WebLogic Application Server
    4.5.1 on Sun Solaris
    In our Customer Projects we are working with EJB's. for each customer
    transaction we create a session to the weblogic application server.
    now there are some urgent problems with the java process on the server.
    for each session there were allocated 200 - 500 kb memory, within a day
    the JSP process on our server is growing for each session and don't
    reallocate the reserved memory for the old session. as a work around we
    now restart the server every night.
    How can we solve this problem ?? Is it a problem with the operating
    system or the application server or the EJB's ?? Do you have problems
    like this before ?
    greetings from germany,
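    Regarding pointer (a) above: if the sessions in question are HttpSessions, the usual knobs look roughly like this. This is only a hedged sketch of the standard servlet API, not a description of how the original application actually manages its sessions.
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpSession;

    public class SessionHousekeeping {

        // Cap how long an idle session may live, and drop it explicitly once the
        // transaction it was created for has finished.
        static void handleTransaction(HttpServletRequest request) {
            HttpSession session = request.getSession(true);
            session.setMaxInactiveInterval(30 * 60); // expire after 30 idle minutes
            try {
                // ... per-transaction work that uses the session ...
            } finally {
                // If the session is only needed for this one transaction, invalidate
                // it so the server can reclaim the 200-500 KB it holds right away.
                session.invalidate();
            }
        }
    }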

  • Memory Problems with Adobe PDF iFilter for 64-bit

    In preparation to rebuild my Windows Search Index, I installed the Adobe PDF iFilter for 64-bit on my system (Vista Business 64).  When I finally rebuilt the index, I wasn't too surprised by what I saw happen, namely, the SearchFilter.exe process would kick in whenever I wasn't using the system and just eat RAM.  One time I turned it on and it had allocated over 4,000 MB (and my system only has 4,030 MB available) so of course it was forcing all the other processes to hard fault (ie. everything was moving like molasses--for example, it took 20 minutes to put the thing to sleep).  But I just let it do it's work, figuring that perhaps this was to be expected relative to the small library of PDF's that I've accumulated on my computer, ranging from LaTeX generated text files, to containers for hi-res scans.  So, after a day and a half of basically not using my laptop, everything finally calmed down and I enjoyed the benefits of searching the content of my library from the Windows Start menu--for a short while.
    However, to my dismay I've encountered the problem that this freezing of my computer would now occur after everytime I download a new PDF (in this particular case they were Google Books scans) and then left the computer to idle.  Again, the SearchFilter.exe would allocate all of my RAM for itself and just push everything else onto the Virtual RAM, which means the SLOWEST possibly fetching you can get.  I had to uninstall as this was making my computer unusable for 15-30 minutes after each idle. Everything is back in working order without the iFilter, but I would like to know if anyone has reported such problems on x64 systems.  Obviously, I will also report the problem to Microsoft, since the search engine should certainly have the precaution to handle such memory problems.   However, it is a problem that is created by the Adobe PDF iFilter interacting with the Windows Search engine.

    Hello,
    We believe we have figured this out.  It looks like it has to do with the length of the default folder location for the Adobe iFilter.
    I was able to reproduce the issue and the following resolved it for me.  See if this resolves it for you all as well.
    Here is how to get Adobe Version 11 PDF filter to work.
    1. If you haven't already, run the following in SQL Server:
       sp_fulltext_service 'load_os_resources', 1
       GO
       -- you might also need to run:
       -- sp_fulltext_service 'verify_signature', 0  -- used to validate trusted iFilters; 0 disables it, so use with caution
       -- GO
    2. Stop SQL Server. (Make sure FDHost.exe stops.)
    3. Uninstall the Adobe iFilter (because it defaulted to a folder name with spaces, or the folder name is too long).
    4. Reinstall the Adobe iFilter and, when it prompts for where to install it, change the location to: C:\Program Files\Adobe\PDFiFilter
    5. Once the installation finishes, go to the computer's environment variables and add the following to PATH: C:\Program Files\Adobe\PDFiFilter\BIN
       NOTE: it must include the BIN folder.
       NOTE: If you had the OLD location that included spaces, remove it from the PATH environment variable.
    6. Start SQL Server.
    7. If you had an existing full-text index on PDFs, drop the full-text index and recreate it.
    8. You should now get results when you run sys.dm_fts_index_keywords('db','tblname')  -- change db to the actual database name and tblname to the actual table name.
     Give this a try and see if this fixes yours. 
    Sincerely,
    Rob Beene, MSFT

  • Nokia C5-03 Low memory problems ( Other Files )

    Please help. I have a Nokia C5-03 and the handset is showing that the phone memory is full. I have checked and uninstalled all unnecessary applications, but the problem is still there. When I check the phone memory it indicates that I have "other files" of 42 MB installed and using phone memory. However, I am unable to check what these other files are or how they are classified. Can you please assist me so I can find out what these other files consist of or where they are located?

    Hi Thetao,
    I typed a more extensive reaction before, but it got lost when I pressed "post". Therefore I just respond to the main points that you mentioned (and some I found out myself).
    Strange: I can't find the 40 MB Maximum User Storage on the Nokia website anymore (nor the 75 MB). But it sounds very familiar to me. It looks to me as if they removed this from the phone specs, also of other Smartphones by the way.
    Yesterday, I deleted some small apps that I don't use (anymore) such as InternetRadio and I also removed Nokia email. Although the apps were below 2 MB together, this freed up over 7 MB of Phone memory (24 MB free now)! I think there were still some old emails stored in C: which I couldn't delete any other way. This helped me a great deal already but I tried your suggestions as well.
    1. No map data or CITIES folder on C:.
    2. Switched the messages memory to the phone (and the phone to offline mode) and I did indeed find a forgotten email account with 30 email messages. Not much, but I had 24.7 MB free after that. Of course, I put the messages memory back to the memory card.
    3. Used the free edition of Y-Browser to manually delete the cache folder. Not much data in that, but 25.1 MB free after that. Nice tool, with which you can reach folders that normally stay hidden! I used Y-Browser to search all of C: for files over 300 kB. Only 2 files: boot_space.txt in C:\ (500 kB, contains only the letter X repeated as far as I can see, but is probably essential for the operating system) and C:\resource\python25\pyton25.zip (1 MB). It looks like an installation package, but I'm not sure if I can delete it. By the way: Y-Browser hasn't made a shortcut in any of the menus. The only way I found to start it was to look for it using the phone's search function. Is there a way to make this shortcut myself?
    4. Yes I did. No Images folder on C: anymore, nor other big files (see point 3)
    5. I use Bluetooth for file transfer sometimes, mainly for installation files (such as YBrowser.sis, but I did this one via USB-cable). However, no big files are left on C: so I don't think I have this problem.
    6. I tried to delete Nokia Chat yesterday as well (with the other apps), but it won't be uninstalled the normal way as it says "Uninstall cancelled" (not sure about the exact translation since my phone 'speaks' Dutch) Do you know if there's another way to get rid of this 3 MB app that I don't use at all?
    I think I may have found an explanation and a solution for the memory problem while navigating. You mentioned the "memory in use" option in the map settings. Above that option there is a slider for the percentage of memory that the navigation may use; the default is 70%. I always thought this referred to storage memory on (in my case) the memory card, but another topic mentioned that it is the working memory (the RAM) that the navigation may use. Setting it to 70% means there is only 30% left for other apps that run in the background. The other topic states that this is not enough, so the slider should be set to, for instance, 30% for navigation, leaving 70% free for "the phone". From behind my computer, navigation already seems much more stable. I'll try this setting in my car soon and let you know how it works.
    Thanks a lot for thinking along with me so far! There's already 25,1 MB of space, which is great since it was only 7 MB last Sunday. And navigation looks more stable. I'd appreciate if you have some more answers to my latest questions, but if not I think my phone will work a lot better already!
    Regards, Paul

  • Memory problem with jdk/jre 1.1.8

    My name is BERGMANN Yannick.
    I'm working for IRM in Liège and we developed an application (a user interface for an industrial measurement system) in Java (JDK/JRE version 1.1.8).
    We have a big memory problem with this application:
    - This user interface is running on a WINDOWS NT PC with 128MB.
    - This is the command we use to launch our application:
    C:\Program Files\JavaSoft\JRE\1.1\bin\jrew.exe" -ms32m -mx32m -cp "\Program Files\HMI\HMI.zip;\Velocis\Add_On\Jdbc\raima.jar;\Program Files\Swing-1.1.1\swingall.jar" be.irm.hmi.kernel.HMI -t15 -d"Velocis rdstcp" -newdb -mf1m -mr20
    - When our application is running, everything seems to be OK in memory for it. The garbage collector seems to work properly and our application always has at least 5 MB of free memory (we use the Java call Runtime.getRuntime().freeMemory() to check this).
    - But when we look in the Windows NT Task Manager at the "jrew" process, its memory usage ALWAYS increases.
    - After 5 days our application is completely frozen and blocked...?
    - Here is a memory map of our Windows NT PC:
    Day         jre.exe   commit total   commit limit   commit peak   physical total   physical available   physical file cache
    Monday       92264       109256         194944        109424         130484             19492                  6216
    Tuesday     106196       123072         194944        123348         130484              6072                  5840
    Wednesday   110836       132288         194944        132416         130484              4408                  5140
    Thursday    108200       144980         194944        145140         130484              4888                  5148
    Friday      109440       158319         194944        161334         130484              4911                  4992
    Monday      111600       209060         228548        209148         130484              5184                  3484
    Have you any idea of what is happening with "jrew" in memory ?
    We have had this problem for six month and we are totaly out of idea.
    If you can give us any idea, we'll appreciate a lot.
    Thanks in advance,
    BERGMANN Yannick
    IRM SA - Software Engineer
    Tel. (32)4/239.90.10
    Tel. (32)4/239.90.74 (direct)
    Fax (32)4/263.40.97
    E-mail [email protected]

    We had a memory problem with a Swing applet in our company. The major reason was that we added new components to a JTree and removed them again later, and the components we removed were never garbage collected. This was because we had added various listeners to those components and did not remove the listeners once we no longer needed the components. After we corrected this, the components were garbage collected.
    Perhaps you have a similar problem, or perhaps not. Check that you remove ActionListeners, MouseListeners, etc. from components you want to be garbage collected.
    You could also test your application with OptimizeIt to see what objects you create and how many you get of them over time: http://www.vmgear.com
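    A minimal illustration of that kind of listener leak and its fix, assuming Swing components; the class and field names are made up for the example.
    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import javax.swing.JButton;
    import javax.swing.JTree;
    import javax.swing.tree.DefaultMutableTreeNode;
    import javax.swing.tree.TreePath;

    public class ListenerCleanup {
        private final JButton refreshButton = new JButton("Refresh");
        private ActionListener listener;

        // The listener captures the node, so as long as it stays registered on the
        // long-lived button, the removed tree node can never be garbage collected.
        public void attach(final JTree tree, final DefaultMutableTreeNode node) {
            listener = new ActionListener() {
                public void actionPerformed(ActionEvent e) {
                    tree.scrollPathToVisible(new TreePath(node.getPath()));
                }
            };
            refreshButton.addActionListener(listener);
        }

        // Call this before discarding the node; without it the button still reaches
        // the node through the listener and its memory is never freed.
        public void detach() {
            if (listener != null) {
                refreshButton.removeActionListener(listener);
                listener = null;
            }
        }
    }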

  • How can i loop over treeview selected nodes and then to delete each node if it's a directory or a file ?

    I have this code:
    if (s == "file")
    file = false;
    for (int i = 0; i < treeViewMS1.SelectedNodes.Count; i++)
    DeleteFile(treeViewMS1.SelectedNode.FullPath, file);
    I know it's a file, and it works if it's a single file.
    On the treeView I click on a file, right-click, select delete from the menu, and it executes the line:
    DeleteFile(treeViewMS1.SelectedNode.FullPath, file);
    And it works, no problems.
    But I want this: if I selected several files together (multiple selection), then delete each file.
    So I added the FOR loop, but then how do I delete each node from the SelectedNodes?
    The treeView SelectedNodes doesn't have FullPath like SelectedNode does, and SelectedNodes[i] doesn't have the FullPath property either.
    The same question applies if I want to delete a single directory or multiple selected directories.
    This is the continuation of the code: if it's not a "file" (else), it's null, so I know it's a directory, and I do:
    else
    {
        file = true;
        RemoveDirectoriesRecursive(treeViewMS1.SelectedNode, treeViewMS1.SelectedNode.FullPath);
    }
    Also, here I'm using SelectedNode, but if I marked multiple directories, how do I loop over the SelectedNodes and send each SelectedNode to the RemoveDirectoriesRecursive method?
    My problem is how to loop over SelectedNodes (multiple selection of files/directories) and send each selected file/directory to its method, like I'm doing now.

    foreach (TreeNode n in treeViewMS1.SelectedNodes)
    {
        // Remove everything associated with TreeNode n here
    }
    I don't think it's any harder than that, is it?
    If you can multi-select both an item and one of its descendants in the tree, then you may have already deleted the parent folder and all of its children by the time you get around to deleting the descendant. But that's not such a big deal. A file can get deleted externally to your program too, so you'll just have to deal with it having been deleted already (or externally) when you get around to deleting it yourself.
