Unable to pull duplicate record in BW

I am working on the E-Recruitment module.
I have to get candidate education details into BW, but those are not available in the standard extractors (0CANDIDATE_ATTR & 0CAND_TD_ATTR). For that I have enhanced the DataSource with the fields INSTITUTE, START_DATE & END_DATE. My extractor is working fine and I am able to see all the records in RSA3.
I am getting these records up to the PSA, but when I try to update them further to the InfoObject Candidate, the load fails with a "duplicate records" error.
The problem occurs when extracting details such as institute (from table HRP5104), because each candidate has more than one institute and BW rejects the duplicate records.
Eg.   0OBJID     INSTITUTE   START_DATE   END_DATE
      50038860   ABC         10.06.1965   20.05.1973
      50038860   XYZ         20.05.1976   15.05.1978
      50038860   PQR         30.05.1978   12.05.1980
Alternatively, I have tried compounding the InfoObject, but it still does not give the correct result.
Can anybody give me an idea how to solve this?
Thanks in advance.

Try creating time-dependent hierarchies; I don't think compounding alone or just making the attributes time-dependent will help you.
Loading the data into a DSO is not a bad option either.
Nagesh Ganisetti.

Similar Messages

  • Unable to delete duplicate records from an internal table

    Hi all,
    The internal table is like this
    DATA: BEGIN OF ta_itab1 OCCURS 0,
          mark type c,
          cnt_hedg type c,
          kunnr like vbak-kunnr,
          vbeln like vbak-vbeln,
          posnr like vbap-posnr,
          matnr like vbap-matnr,
          kwmeng like vbap-kwmeng,
          h_kwmeng like vbap-kwmeng,
          spart like vbap-spart,
          werks like vbap-werks,
          component like bom_item_api01-component,
          comp_qty like bom_item_api01-comp_qty,
          comp_qty1 like bom_item_api01-comp_qty,
          base_quan like stko_api02-base_quan,
          comp_unit like bom_item_api01-comp_unit,
          base_unit like bom_item_api01-comp_unit,
          bukrs_vf like vbak-bukrs_vf,
          end of ta_itab1.
    and used the syntax:
    sort ta_itab6 by kunnr vbeln.
    DELETE ADJACENT DUPLICATES FROM ta_itab6 comparing COMP_QTY COMP_QTY1.
    but I am unable to delete the duplicate records.
    Thank You.
    anu

    Hi,
    You need to include in the SORT statement the fields on which you want to perform DELETE ADJACENT DUPLICATES (see the small example after the corrected code):
    sort ta_itab6 by kunnr vbeln COMP_QTY COMP_QTY1.
    DELETE ADJACENT DUPLICATES FROM ta_itab6 comparing COMP_QTY COMP_QTY1.
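    To see why the sort matters, here is a small self-contained sketch (hypothetical program name and values, not the original code): DELETE ADJACENT DUPLICATES only drops rows that sit next to each other, so equal values must first be brought together by the SORT.
    REPORT zdemo_adjacent_duplicates.
    TYPES: BEGIN OF ty_row,
             qty(3) TYPE n,
           END OF ty_row.
    DATA: lt_rows TYPE STANDARD TABLE OF ty_row,
          ls_row  TYPE ty_row.
    ls_row-qty = '005'.
    APPEND ls_row TO lt_rows.
    ls_row-qty = '007'.
    APPEND ls_row TO lt_rows.
    ls_row-qty = '005'.
    APPEND ls_row TO lt_rows.
    " nothing is deleted here: the two '005' rows are not adjacent
    DELETE ADJACENT DUPLICATES FROM lt_rows COMPARING qty.
    " after the SORT the two '005' rows are adjacent and one is removed
    SORT lt_rows BY qty.
    DELETE ADJACENT DUPLICATES FROM lt_rows COMPARING qty.
    The same applies to ta_itab6: the fields used in COMPARING (COMP_QTY and COMP_QTY1) have to be covered by the preceding SORT so that equal combinations end up adjacent.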

  • J1INQEFILE - efile generation - Exported file shows Duplicate records.

    Dear Team,
    When I execute J1INQEFILE, I am facing a problem with the e-file generation, i.e. the exported Excel file. When I execute and export the file in Excel to the desktop, I can see duplicate records.
    For example, on execution of J1INQEFILE I can see 4 records on the SAP screen, whereas the file exported to the desktop shows me 2 more identical records, i.e. 6 records. As a result, in the SAP system I can see the base amount as 40000, i.e. 10000 for each; on the contrary, the Excel sheet shows me 60000, i.e. 6 records of 10000 each (because of the 2 duplicated records), and also shows the TDS amount wrong. How are the records getting duplicated? Is there any SAP note to fix this? We are debugging this but have no clue so far.
    Please assist with this issue.
    Thanks in Advance !!!!

    Dear Sagar,
    I am an ABAPer.
    I came across the same kind of situation for one of our clients: when we executed J1INQEFILE and exported the output to an Excel file, we used to get duplicate records.
    For this I debugged the program and checked the point of e-file generation; duplicate records were being appended to the internal table that is downloaded to Excel. So I pulled the document number into the internal table and used DELETE ADJACENT DUPLICATES comparing all fields, and was thereby able to resolve the issue (sketched below).
    Hope the same logic helps, or guides you to proceed with the help of an ABAPer.
    <<Text removed>>
    Regards,
    Kalyan
    Edited by: Matt on Sep 8, 2011 9:14 PM
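    In code terms, the fix described above comes down to something like the following, applied to the internal table that feeds the e-file download. The table and field names are placeholders (the actual internals of J1INQEFILE are release-dependent), so treat this as a sketch of the idea rather than the exact correction:
    " lt_efile stands for the internal table that is downloaded to Excel;
    " belnr is the document number added so that true duplicates become
    " fully identical rows (both names are placeholders)
    SORT lt_efile BY belnr.
    DELETE ADJACENT DUPLICATES FROM lt_efile COMPARING ALL FIELDS.
    If further fields can legitimately differ within one document, include them in the SORT as well, so that only fully identical rows end up adjacent and are removed.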

  • Duplicate Records in Details for ECC data source. Help.

    Hello. First post on SDN. I have been searching prior posts, but have come up empty. I am in the middle of creating a report linking directly into 4 tables in ECC 6.0. I am having trouble either getting the table links set up correctly or filtering out the duplicate record sets that are being reported in the details section of my report. It appears that I have 119 records being displayed, when the parameter values should only yield 7. The details section is repeating the 7 records 17 times (there are 17 matching records for the parameter choices in one of the other tables, which I think is the cause).
    I think this is due to the other table links for my parameter values. But, I need to keep the links the way they are for other aspects of the report (header information). The tables in question are using an Inner Join, Enforced Both, =. I tried the other link options, with no luck.
    I am unable to use the "Select Distinct Records" option in the Database menu since this is not supported when connecting to ECC.
    Any ideas would be greatly appreciated.
    Thanks,
    Barret
    PS. I come from more of a Functional background, so development is sort of new to me. Take it easy on the newbie.

    If you can't establish links that bring back unique data, then use a group to display the data.
    Group the report by a field which is the lowest common denominator.
    Move all fields into the group footer and suppress the group header and details.
    You will not be able to use normal summaries, as they will count/sum all the duplicated data; use Running Totals instead and select "evaluate on change of" the introduced group.
    Ian

  • Selecting/pulling duplicates in a JOIN: why is this happening?

    Hi Experts,
    One simple doubt:
    I am pulling records from 2 tables, MBEW and MAKT, for my selection criteria by doing a JOIN on MATNR, but duplicates are coming back. I mean, if MATNR_1 exists in plant 1000, it is repeated 3-4 times!
    Any clue?
    Thank you.
    Message was edited by:
            Srikhar

    Hi Srikhar,
    You did not copy your SELECT statement.
    One possible cause is how you are restricting MAKT-SPRAS in the WHERE clause (MAKT has one row per language). MBEW also has more than one record per MATNR, so please copy your whole SELECT here so we can help you out (see the sketch below for the general idea).
    ec
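    For illustration: MBEW is keyed on MATNR / BWKEY / BWTAR (valuation area and valuation type), so a plain join on MATNR alone returns one row per valuation segment, and MAKT adds one row per language on top of that. A rough sketch of a selection that avoids the repetitions (the tables are real, the concrete selection is a hypothetical example):
    TYPES: BEGIN OF ty_mat,
             matnr TYPE mbew-matnr,
             bwkey TYPE mbew-bwkey,
             maktx TYPE makt-maktx,
           END OF ty_mat.
    DATA lt_mat TYPE STANDARD TABLE OF ty_mat.
    " one row per material and valuation area, description in the logon language
    SELECT DISTINCT m~matnr m~bwkey t~maktx
      INTO TABLE lt_mat
      FROM mbew AS m
      INNER JOIN makt AS t ON t~matnr = m~matnr
      WHERE t~spras = sy-langu.
    Without the SPRAS restriction and the DISTINCT, every combination of language and valuation segment comes back as its own row, which matches the 3-4 repetitions described above.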

  • Calendar and Address Book error: Duplicate records found for GUID

    Hi all,
    I have a Mountain Lion Server running on a Mac mini and everything was working well.
    This morning one of my users is unable to connect to his calendar and address book. I found this error in the log files:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate records found for GUID ****USER_GUID****:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate: ***USER_Shortname***
    Apparently there is a duplicate match in the database. How can I fix this issue?
    In Server App this user is only listed once.
    Mail and other services for this user are working correctly.
    Thanks for any advice!

    Hi Samuel,
    You may try:
    select code, count(code)
    from [dbo].[@XTSD_XA_CMD]
    group by code
    having count(code) > 1
    What is the result?
    Thanks,
    Gordon

  • How to handle duplicate records with the bcp command

    Hi All,
    I'm using BCP to import ASCII text data into a table that already has many records. BCP failed because of a "duplicate primary key" error.
    Now, is there any way using BCP to know precisely which record's primary key caused that "violation of inserting duplicate key"?
    I already used the option -O to output errors to an error.log, but it doesn't help much, because that error log contains the same error message mentioned above without telling me exactly which record caused it, so that I could pull that duplicate record
    out of my import data file.
    TIA and you have a great day.

    The only way I know of to figure out which PKs conflicted is to load the data into a different table and then run an INNER JOIN select between the two.
    BCP.exe is not part of SSIS technically, don't know why you are posting in this section of the forum.
    Arthur My Blog

  • Query creating duplicate records

    Hi
    I need to create a report that gives the PO number, PO line item number, material number and the delivery post code.
    I have created a query with the following tables:
    EKKO, EKPO, EBAN, VBEP, VBAK, VBPA, ADRC.
    I have deleted the wrong link between EKKO and EKPO and retained all other links as suggested.
    Could anybody let me know why there are duplicate records in the report pls?
    Thanks
    Desp09

    Desp09 wrote:
    Hi
    >
    > I have connected
    >
    > EKKO-EBELN to EKPO
    >
    >  EKPO -EBELN and EKPO-EBELP  to the same of EBAN
    >
    > also EKPO-BANFN and EKPO-BNFPO to the same of EBAN
    >
    > EBAN-BANFN and EBAN-BNFPO to VBEP
    >
    > VBEP-VBELN to VBAK-VBELn
    >
    > VBAK-VBELN to VBPA-VBELN
    >
    > VBPA-ADRNR to ADRC-ADDRNUMBER.
    >
    > This is the first time I have created a query so if you can kindly explain the logic of the error it would help in my future reports.
    >
    > Thanks
    > Priya
    Hi Priya,
    You can simplify the process if you make use of a logical database (LDB); the same report can be fetched with the standard LDBs available in the system.
    You can define multiple InfoSets and pull in the fields for an easier approach (see the ABAP sketch below for why the current link setup multiplies rows).
    Regards
    Shiva
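    For what it is worth, the multiplication can also be seen (and avoided) if the selection is written directly in ABAP instead of an InfoSet query: each PO item can reference several schedule lines (VBEP) and each sales order carries several partner rows (VBPA), so an unrestricted join repeats the item once per combination. A rough sketch, assuming the ship-to partner function 'WE' at header level carries the delivery address (adjust to your own data; EKKO and VBAK are omitted since no fields are needed from them):
    TYPES: BEGIN OF ty_out,
             ebeln      TYPE ekpo-ebeln,
             ebelp      TYPE ekpo-ebelp,
             matnr      TYPE ekpo-matnr,
             post_code1 TYPE adrc-post_code1,
           END OF ty_out.
    DATA lt_out TYPE STANDARD TABLE OF ty_out.
    SELECT DISTINCT p~ebeln p~ebelp p~matnr a~post_code1
      INTO TABLE lt_out
      FROM ekpo AS p
      INNER JOIN eban AS b  ON b~banfn = p~banfn
                           AND b~bnfpo = p~bnfpo
      INNER JOIN vbep AS s  ON s~banfn = b~banfn
                           AND s~bnfpo = b~bnfpo
      INNER JOIN vbpa AS pa ON pa~vbeln = s~vbeln
      INNER JOIN adrc AS a  ON a~addrnumber = pa~adrnr
      WHERE pa~posnr = '000000'      "header-level partner
        AND pa~parvw = 'WE'          "ship-to party
        AND a~nation = ' '.          "exclude international address versions
    In an InfoSet query, the equivalent is to restrict the partner table to a single partner function and to include fields from the schedule-line table only when they are really needed.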

  • How to remove the duplicate record in DART Extract

    Hi Guys,
    We are getting duplicate records when we validate the DART extract file through data views for FI General Ledger Account Balances. If anyone has experience with this, please help us.
    Following are the steps we did to validate the DART extract file for FI General Ledger Account Balances.
    1. We ran the DART extract program in transaction FTW1A to extract the data from the tables to a directory file, period-wise.
    2. When we validate the data from the DART extract file through the data view for FI General Ledger Account Balances in transaction FTWH, we get duplicate records.
    We are unable to find out where the duplicate records are coming from. It would be great if anyone could help us immediately.
    Thanks & Regards,
    Boobalan,v

    If the duplicate records are actually in the DART view rather than the DART extract, you could try OSS Note 1139619 (DART: Eliminate duplicate records from DART view).
    Additional note: 1332571 (FTWH/FTWY: Performance for "Eliminate duplicate records").
    Colleen
    Edited by: Colleen Geraghty on May 28, 2009 6:07 PM

  • How to avoid retrieving duplicate records from SalesLogix

    I wanted to know if you could assist me. I am now responsible for reporting our inside sales activities, which includes (each month) outbound calls made, opportunities created, opportunities won $, etc. We use SalesLogix as our tool. I have been working with Business Objects exporting this information from SalesLogix and have pretty much created the report I need. The only problem I have is that it will pull in duplicate records with the same opportunity ID number, because my query is based on "campaign codes" attached to SLX opportunities. When an opportunity is created in SLX, it automatically assigns an opportunity ID (ex: OQF8AA008YQB) which is distinctive. However, when we attach more than one "campaign code" to this opportunity, it pulls in the opportunity ID that many more times.
    Is there a way to filter or only retrieve one ID record number regardless of how many campaign codes are attached? All the information attached to the opportunity is the same, with the exception that the "campaign code" is different, which makes it two records since I pull by "campaign code".
    My greatest appreciation!

    Hi,
    If you are having CAMPAIGN CODE in your query and if you are displaying it in your report, then it would definitely display multiple rows for OPPORTUNITY ID for each CAMPAIGN CODE it has. 
    If you would like to have just one row for OPPORTUNITY ID, then you will need to remove CAMPAIGN CODE from your report.

  • Duplicate records at a particular time

    Hi, I am getting duplicate records at a particular point in time. Could you please help me with this?

    The drive on which I installed Informatica ran out of disc space, and I found this in the error log:
      SF_34125 Error in writing storage file [C:\Informatica\9.0.1\server\infa_shared\Storage\pmservice_Domain_ssintr01_INT_SSINTR01_1314615470_0.dat]. System returns error code [errno = 28], error message [No space left on device].
    Then I tried to shut down the integration service and free up some space on the disc. I got the following messages in the log file:
      LM_36047 Waiting for all running workflows to complete.
      SF_34014 Service [INT_SSINTR01] on node [node01_ssintr01] shut down.
    Then, when I tried to start the integration service again, I got the following error:
      Could not execute action... The Service INT_SSINTR01 could not be enabled due to the following error: [DOM_10079] Unable to start service [INT_SSINTR01] on any node specified for the service.
    After this I am not able to find any entry in the log file for the integration service, so I went to the domain log to find out more details. I found these entries in the domain log:
      DOM_10126 Request to disable [SERVICE] [INT_SSINTR01] in [COMPLETE] mode.
      DOM_10130 Stop service process for [SERVICE] [INT_SSINTR01] on node [node01_ssintr01].
      LIC_10040 Service [INT_SSINTR01] is stopping on node [node01_ssintr01].
      SPC_10015 Request to stop process for service [INT_SSINTR01] with mode [COMPLETE] on node [node01_ssintr01].
      DOM_10127 Request to disable service [INT_SSINTR01] completed.
      DOM_10126 Request to disable [SERVICE] [Repo_SSINTR01] in [ABORT] mode.
      DOM_10130 Stop service process for [SERVICE] [Repo_SSINTR01] on node [node01_ssintr01].
      LIC_10042 Repository instance [Repo_SSINTR01] is stopping on node [node01_ssintr01].
      SPC_10015 Request to stop process for service [Repo_SSINTR01] with mode [ABORT] on node [node01_ssintr01].
      DOM_10127 Request to disable service [Repo_SSINTR01] completed.
      DOM_10115 Request to enable [service] [Repo_SSINTR01].
      DOM_10117 Starting service process for service [Repo_SSINTR01] on node [node01_ssintr01].
      SPC_10014 Request to start process for service [Repo_SSINTR01] on node [node01_ssintr01].
      SPC_10018 Request to start process for service [Repo_SSINTR01] was successful.
      SPC_10051 Service [Repo_SSINTR01] started on port [6,019] successfully.
      DOM_10118 Service process started for service [Repo_SSINTR01] on node [node01_ssintr01].
      DOM_10121 Selecting a primary service process for service [Repo_SSINTR01].
      DOM_10120 Service process on node [node01_ssintr01] has been set as the primary node of service [Repo_SSINTR01].
      DOM_10122 Request to enable service [Repo_SSINTR01] completed.
      LIC_10041 Repository instance [Repo_SSINTR01] has started on node [node01_ssintr01].
      DOM_10115 Request to enable [service] [INT_SSINTR01].
      DOM_10117 Starting service process for service [INT_SSINTR01] on node [node01_ssintr01].
      SPC_10014 Request to start process for service [INT_SSINTR01] on node [node01_ssintr01].
      DOM_10055 Unable to start service process [INT_SSINTR01] on node [node01_ssintr01].
      DOM_10079 Unable to start service [INT_SSINTR01] on any node specified for the service.
      (this whole disable/enable sequence then repeats a second time with identical messages)
    Then I tried shutting down the domain and restarting the Informatica service again. I got the following error when the integration service was initialized:
      DOM_10115 Request to enable [service] [INT_SSINTR01].
      DOM_10117 Starting service process for service [INT_SSINTR01] on node [node01_ssintr01].
      SPC_10014 Request to start process for service [INT_SSINTR01] on node [node01_ssintr01].
      SPC_10009 Service process [INT_SSINTR01] output [Informatica(r) Integration Service, version [9.0.1], build [184.0604], Windows 32-bit].
      SPC_10009 Service process [INT_SSINTR01] output [Service [INT_SSINTR01] on node [node01_ssintr01] starting up.].
      SPC_10009 Service process [INT_SSINTR01] output [Logging to the Windows Application Event Log with source as [PmServer].].
      SPC_10009 Service process [INT_SSINTR01] output [Please check the log to make sure the service initialized successfully.].
      SPC_10008 Service Process [INT_SSINTR01] output error [ERROR: Unexpected condition at file:[..\utils\pmmetrics.cpp] line:[2118]. Application terminating. Contact Informatica Technical Support for assistance.].
      SPC_10012 Process for service [INT_SSINTR01] terminated unexpectedly.
      DOM_10055 Unable to start service process [INT_SSINTR01] on node [node01_ssintr01].
      DOM_10079 Unable to start service [INT_SSINTR01] on any node specified for the service.
    I tried creating a new integration service and associating it with the same repository. Even then I got the same error. So I tried creating a new repository and a new integration service. Even then I got the same error. What might be the workaround to start the integration service?

  • How to suppress duplicate records in rtf templates

    Hi All,
    I am facing an issue with payment reason comments in a check template.
    We are displaying payment reason comments. The issue is that while making a batch payment we get multiple payment reason comments from multiple invoices with the same name, and it doesn't look good. You can see the payment reason comments under the tail number text field in the template.
    Could you provide any XML syntax to suppress duplicate records so that only distinct payment reason comments are shown?
    Attached screen shot, template and xml file for your reference.
    Thanks,
    Sagar.

    I have CR XI, so the instructions are for this release.
    You can create a formula; I called it Cust_Matches:
    if = previous () then 'true' else 'false'
    In your GH2 section, right-click the field, select Format Field, and select the Common tab (far left at the top).
    Select the x/2 button to the right of Suppress, and in the formula field type in:
    {@Cust_Matches} = 'true'
    Now every time {@Cust_Matches} is true, the CustID should be suppressed;
    do the same with the other fields you wish to hide, i.e. Address, City, etc.

  • Duplicate records to DSO

    Hello Friends,
    we have an issue with duplicate records in a DSO; let me explain the scenario.
    The header and detail data are loaded to separate DSOs,
    and the data of these two DSOs should get merged in a third one.
    the Key fields in
    DSO 1 : DWRECID, 0AC_DOC_NO
    DSO 2 : DWRECID , DWPOSNR
    DSO 3 will fetch data from the above two DSOs.
    Its key fields are:
    DWTSLO,
    DWRECID,
    DWEDAT ,
    AC_DOC_NO
    DWPOSNR,
    0CUSTOMER
    Now the data should be merged into a single record in the third DSO.
    DSO 1 does not have the DWPOSNR object in its data fields either.
    We even have a start routine that uses data from DSO 1 to populate some values in the result fields coming from DSO 2.
    Please provide any inputs you have on merging the data record-wise,
    and also give me all the possibilities or options we have to overwrite the data, apart from mappings.

    Hi,
    You should go for creating an InfoSet instead of creating a third DSO.
    In the InfoSet, provide the keys of the DSOs, and the common records with those keys will be merged in the InfoSet. (If you do decide to keep the third DSO, see the lookup sketch below.)
    Hope It Helps.
    Regards
    Praeon
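    If you do keep the third DSO instead of an InfoSet, one common pattern is to load it from the item DSO (DSO 2) and look up the header-only fields from the active table of DSO 1 in the end routine of that transformation. The sketch below uses assumed technical names (active table /BIC/AZDSO_HDR00 for DSO 1; custom InfoObjects appearing as /BIC/DWRECID etc. in the generated structures), so adjust them to your system:
    " inside the end routine of the transformation into DSO 3;
    " RESULT_PACKAGE and <result_fields> come from the generated routine frame
    DATA: lt_header TYPE STANDARD TABLE OF /bic/azdso_hdr00,
          ls_header TYPE /bic/azdso_hdr00.
    IF RESULT_PACKAGE IS NOT INITIAL.
      SELECT * FROM /bic/azdso_hdr00 INTO TABLE lt_header
        FOR ALL ENTRIES IN RESULT_PACKAGE
        WHERE /bic/dwrecid = RESULT_PACKAGE-/bic/dwrecid.
      SORT lt_header BY /bic/dwrecid.
    ENDIF.
    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
      READ TABLE lt_header INTO ls_header
           WITH KEY /bic/dwrecid = <result_fields>-/bic/dwrecid
           BINARY SEARCH.
      IF sy-subrc = 0.
        " copy the header-only fields into the merged record
        <result_fields>-/bic/dwtslo = ls_header-/bic/dwtslo.
        <result_fields>-/bic/dwedat = ls_header-/bic/dwedat.
      ENDIF.
    ENDLOOP.
    Because the item-level keys (DWPOSNR, AC_DOC_NO, and so on) remain part of the key of DSO 3, the item records stay distinct and the lookup itself does not create duplicate keys.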

  • How to create duplicate records in end routines

    Hi
    Key fields in DSO are:
    Plant
    Storage Location
    MRP Area
    Material
    Changed Date
    Data Fields:
    Safety Stocky
    Service Level
    MRP Type
    Counter_1 (In flow Key figure)
    Counter_2 (Out flow Key Figure)
    n_ctr  (Non Cumulative Key Figure)
    For every record that comes in, we need to create a duplicate record. For the original record, we need to set Counter_1 to 1 and Counter_2 to 0. For the duplicate record, we need to update Changed_Date to today's date, leave the rest of the values as they are, and set Counter_1 to 0 and Counter_2 to -1. Where is the best place to write this code in the DSO? Is it the end routine?
    Please give me a basic idea of the code.

    Hi Uday,
    I have the same situation as Suneel and have written your logic in the DSO end routine as follows:
    DATA: l_t_duplicate_records TYPE TABLE OF TYS_TG_1,
          l_w_duplicate_record TYPE TYS_TG_1.
    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
        MOVE-CORRESPONDING <result_fields> TO l_w_duplicate_record.
        <result_fields>-/BIC/ZPP_ICNT = 1.
        <result_fields>-/BIC/ZPP_OCNT = 0.
        l_w_duplicate_record-CH_ON = sy-datum.
        l_w_duplicate_record-/BIC/ZPP_ICNT = 0.
        l_w_duplicate_record-/BIC/ZPP_OCNT = -1.
        APPEND l_w_duplicate_record TO  l_t_duplicate_records.
    ENDLOOP.
    APPEND LINES OF l_t_duplicate_records TO RESULT_PACKAGE.
    I am getting the error below:
    Duplicate data record detected (DS ZPP_O01 , data package: 000001 , data record: 4 )     RSODSO_UPDATE     19     
    I have a different requirement for the date. My requirement is to populate the CH_ON date as mentioned below:
    sort the records based on the key, get the latest CH_ON value for each unique plant / storage location / material combination, and populate
    that CH_ON value in the duplicate record.
    Please help me to resolve this issue.
    Thanks,
    Ganga
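    A possible way to meet the date requirement, sketched with the same names as in the routine above (PLANT, STOR_LOC and MATERIAL are assumed field names for the target structure; adjust them to the real ones): first determine the latest CH_ON per plant / storage location / material, then use that date for the generated record. The activation error appears whenever two rows of one request end up with an identical DSO key, so the generated row must differ from its original in at least one key field, here CH_ON.
    DATA: lt_latest TYPE TABLE OF tys_tg_1,
          ls_latest TYPE tys_tg_1,
          lt_dups   TYPE TABLE OF tys_tg_1,
          ls_dup    TYPE tys_tg_1.
    " latest CH_ON per plant / storage location / material
    lt_latest = RESULT_PACKAGE.
    SORT lt_latest BY plant stor_loc material ch_on DESCENDING.
    DELETE ADJACENT DUPLICATES FROM lt_latest COMPARING plant stor_loc material.
    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
      <result_fields>-/bic/zpp_icnt = 1.
      <result_fields>-/bic/zpp_ocnt = 0.
      ls_dup = <result_fields>.
      READ TABLE lt_latest INTO ls_latest
           WITH KEY plant    = <result_fields>-plant
                    stor_loc = <result_fields>-stor_loc
                    material = <result_fields>-material.
      IF sy-subrc = 0.
        ls_dup-ch_on = ls_latest-ch_on.
      ENDIF.
      ls_dup-/bic/zpp_icnt = 0.
      ls_dup-/bic/zpp_ocnt = -1.
      " only append when the key really differs, otherwise activation
      " reports "duplicate data record detected" again
      IF ls_dup-ch_on <> <result_fields>-ch_on.
        APPEND ls_dup TO lt_dups.
      ENDIF.
    ENDLOOP.
    APPEND LINES OF lt_dups TO RESULT_PACKAGE.
    If two generated rows can still collide on the full semantic key, they have to be dropped or aggregated before the append, because DSO activation will not accept two rows with the same key in one request.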

  • USE of PREVIOUS command to eliminate duplicate records in counter formula

    I'm trying to create a counter formula to count the number of documents paid over 30 days. To do this I have to subtract the InvDate from the PayDate and then create a counter based on this value: if {days to pay} is greater than 30 then 1 else 0.
    Then I sum the {days to pay} field for each group. The groups are company, month and supplier.
    Because invoices can have multiple payments and payments can have multiple invoices, there is no way around having duplicate records for the field.
    So my counter is distorted by the duplicate records, and my percentage-of-payments-over-30-days formula will not be accurate due to these duplicates.
    I've tried a Distinct Count based on this formula ("if {days to pay} is greater than 30 then ."), and it works except that it counts 0.00 as a distinct record, so my total is off by 1 for summaries that contain a record where {days to pay} is less than or equal to 30.
    If I subtract 1 from the formula, then it will be inaccurate for summaries with no records over 30 days.
    So I have come to this:
    if Previous() do not equal
    then
      if {day to days} greater than 30
      then 1
      else 0.00
    else 0.00
    But it doesn't work. I've sorted the detail section by
    Does anyone have any knowledge of or success using the PREVIOUS command in a report?
    Edited by: Fred Ebbett on Feb 11, 2010 5:41 PM

    So, you have to include all data and not just use the selection criteria 'PayDate-InvDate>30'?
    You will need to create a running total on the RPDOC ID, one for each section you need to show a count for, evaluating for your >30 day formula. 
    I don't understand why you're telling the formula to return 0.00 in your if statement.
    In order to get percentages you'll need to use the distinct count (possibly running totals again but this time no formula). Then in each section you'd need a formula that divides the two running totals.
    I may not have my head around the concept since you stated "invoices can have multiple payments and payments can have multiple invoices".  So, invoice A can have payments 1, 2 and 3.  And Payment 4 can be associated with invoice B and C?  Ugh.  Still though, you're evaluating every row of data.  If your focus is the invoices that took longer than 30 days to be paid...I'd group on the invoice number, put the "if 'PayDate-InvDate>30' then 1 else 0" formula in the detail, do a sum on it in the group footer and base my running total on the sum being >0 to do a distinct count of invoices.
    Hope this points you in the right direction.
    Eric
