Avoiding duplicates in JTree
I am trying to avoid duplicates in a JTree, and I don't think I should have to loop through all the objects and compare them for every new entry. I would like to find a way to compare the label of an existing node with the one I am about to add to the tree.
I am trying to use the getIndexOfChild method instead of looping. The method as it is now doesn't work. I have tried type casting and toString, but haven't found the correct combination.
Thanks....
public void refreshHostTree(JTree mytree, ArrayList myhostnodes) {
    // duplicates have already been accounted for
    // when the host list was built.
    DefaultTreeModel mymodel = (DefaultTreeModel) mytree.getModel();
    Object myroot = mymodel.getRoot();
    System.out.println("root is " + myroot);
    for (int i = 0; i < myhostnodes.size(); i++) {
        // update with new node
        DefaultMutableTreeNode curnode = new DefaultMutableTreeNode(myhostnodes.get(i));
        if (mymodel.getIndexOfChild(myroot, curnode) < 0) {
            System.out.println("object to add is " + curnode);
            mymodel.insertNodeInto(curnode, (DefaultMutableTreeNode) myroot, 0);
        }
    }
}
I looked through many posts before posting this question. Could you send me a reference, or tell me what you searched on?
As for your questions......
1. Can unique nodes be duplicated in other nodes of the tree?
- Yes and no. No for the tree I'm working on now; yes for the tree on the next tabbed pane I'm populating.
1.a If yes, can a node be a child of itself?
- no. I don't want it to be at least...
2. How many nodes total in the tree?
- for my project, maximum of 500. Probably will put them in a paged scroll pane.
3. How many average children does a node have?
- depends on how young the sampled group started having children????
3.a What is the maximum number of children a node can have?
- I'm sure my 500 is below the limit, but only testing will tell.
4. Once the tree is built is it static?
- no. It doesn't have to be. There are good examples of this.
5. How slow is too slow?
- Real time is a relative term! I'm trying to be more efficient by doing it this way instead of looping through all the nodes of the tree. I'm also hard-headed and think there should be an easy way to do this.
Thanks
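For what it's worth, the likely reason getIndexOfChild never finds a match is that it compares nodes with equals(), and DefaultMutableTreeNode does not override equals(), so a freshly constructed node is never equal to one already in the tree. A minimal sketch of a duplicate check that compares the children's user objects instead (class and variable names here are illustrative, not from the original code):

```java
import javax.swing.tree.DefaultMutableTreeNode;
import javax.swing.tree.DefaultTreeModel;

public class TreeDedup {
    // Returns true if some child of root already wraps an equal user object.
    static boolean hasChild(DefaultTreeModel model, DefaultMutableTreeNode root, Object userObject) {
        for (int i = 0; i < model.getChildCount(root); i++) {
            DefaultMutableTreeNode child = (DefaultMutableTreeNode) model.getChild(root, i);
            if (child.getUserObject().equals(userObject)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        DefaultMutableTreeNode root = new DefaultMutableTreeNode("hosts");
        DefaultTreeModel model = new DefaultTreeModel(root);
        String[] hosts = {"alpha", "beta", "alpha"}; // "alpha" appears twice
        for (String host : hosts) {
            if (!hasChild(model, root, host)) {
                model.insertNodeInto(new DefaultMutableTreeNode(host), root, root.getChildCount());
            }
        }
        System.out.println(root.getChildCount()); // prints 2: the duplicate was skipped
    }
}
```

This is still a loop over the children, but only over the direct children of one parent, which for 500 hosts is cheap; the alternative is to keep a separate HashSet of the labels already inserted.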
Similar Messages
-
Avoid duplicate batch (batch managment)
dear all,
We are facing a problem related to batch management. We are using manual batch entry, and we don't want a duplicate entry of a batch (one that has already been assigned to a material) to be made against any material. What is the solution to avoid duplicate batch entries?
Can you tell me the settings, or any user exit, with which we can avoid the duplicate batch?
regards

Hi hema,
In our scenario the user manually enters the batch of the raw material in MIGO when we do the goods receipt against a purchase order. We need that a batch which has already been assigned to a raw material cannot be assigned again; if the user enters a previous batch, the system should give an error that the batch already exists.
Maybe you know the problem. -
How to avoid duplicate posting of noted items for advance payment requests?
How to avoid duplicate posting of noted items for advance payment requests?
Puttasiddappa,
In the PS module we allow the deletion of a component purchase requisition although a purchase order exists. The system will send message CN707 "A purchase order already exists for purchase requisition &" as an information message by design, to allow flexible project management.
If, however, you want message CN707 to be of type E, you have to modify the standard coding. Using SE91, you can invoke the where-used list of message 707 in message class CN and change the
i707(cn)
to
e707(cn)
where desired.
Also, user exit CNEX0039 provides the possibility to reject the deletion of a component according to the customer's needs; e.g. you may check here whether a purchase order exists and reject the deletion.
Hope this helps!
Best regards
Martina Modolell -
How to avoid duplicate BOM Item Numbers?
Hello,
Is there a way to avoid duplicate BOM item numbers (STPO-POSNR) within one BOM?
For routings I could avoid duplicate operation/activity numbers with transaction OP46 by setting T412-FLG_CHK = 'X' for the task list check. Is there an equivalent for BOMs?
Regards,
Helmut Gante -
#MULTIVALUE even after checking avoid duplicate row aggregation
Hi experts
I am getting the #MULTIVALUE error in a few rows even after checking the option 'avoid duplicate row aggregation'.
any ideas
regards

Hi,
#MULTIVALUE: this error occurs in three ways:
1) #MULTIVALUE in aggregation: this occurs when the output context does not include the input context.
2) #MULTIVALUE in a break header or footer.
3) #MULTIVALUE at section level.
Please provide us with a description of the issue you are facing.
Regards,
Chitha. -
How can I avoid duplicates in Contacts, and how do I get contacts created on iPhone/iPad synchronized on my Mac? So far it doesn't work correctly, just sometimes. Same for iCal.
On your Mac, for duplicates, switching Contacts off and then back on in System Preferences > iCloud may prevent duplicates.
On the iPhone / iPad tap Settings > iCloud. Make sure Contacts and Calendars are switched on.
Try restarting your Mac and your iOS devices when items won't sync as they should.
To restart an iOS device: hold the On/Off Sleep/Wake button down until the red slider appears. Slide your finger across the slider to turn off the iPhone. To turn the iPhone back on, press and hold the On/Off Sleep/Wake button until the Apple logo appears. -
How to avoid duplicate record in a file to file
Hi Guys,
Could you please provide a solution
to avoid duplicate entries in a flat file based on a key field?
I am asking in terms of standard functions,
either at message mapping level or by configuring the file adapter.
warm regards
mahesh

Hi mahesh,
Write a module processor to check for the duplicate record in the file adapter,
or
with a JAVA/ABAP mapping you can eliminate the duplicate records.
Also check these links:
Re: How to Handle this "Duplicate Records"
Duplicate records
Ignoring Duplicate Records--urgent
Re: Duplicate records frequently occurred
Re: Reg ODS JUNK DATA
http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
regards
srinivas -
Lookup transformation to avoid duplicate rows? - SSIS 2005
Hi,
I'm maintaining an SSIS 2005 package. I need to read a flat file and write to a SQL Server table, avoiding duplicates.
There can be duplicate rows in the flat file to import, and I need to prevent the insert of any rows already existing in the SQL Server table.
So I am thinking of using a lookup transformation. I've created a flat file source and connected it to a lookup transformation, and inside it I've specified the SQL Server destination table as the reference table. Then I checked the available lookup columns, adding each as a new column; but the lookup task raised an error, so I specified replacement as the lookup operation. Each unmatched row I need to write to the SQL Server table (the reference table of the lookup). For the lookup error output I've indicated to ignore failure. Are there other steps?
However, when I run the package, inside the SQL Server destination table I see only NULL values, but I want to see the rows not already present in the table.
Any suggestions, please? Thanks

Hi,
I'm using SSIS 2005, as reported in the title of the post.
I could have duplicates inside the source file, and the existing table might not have any rows yet.
Thanks

If you don't have any rows in the existing table, then all rows will go through the error (no-match) output of the lookup task. For duplicates, the lookup task will find matches, and they will go through the lookup match output.
Please mark this as answer if it helps to solve the issue. Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
To avoid duplicate entries in multi row
Hi, I need to avoid duplicate entries in a multi-record form. In the master block I have AGENCY_CODE. The fields in the detail block are RATING_CODE (composite primary key along with AGENCY_CODE, but set as hidden) and DESCRIPTION.
If AGENCY_CODE is CRISIL, then for it I should not be able to enter duplicate RATING_CODEs. I have written:
DECLARE
  L_COUNT2 NUMBER;
  l_ret    VARCHAR2(20);
BEGIN
  SELECT COUNT(*) INTO L_COUNT2 FROM CSTM_AGENCY_CODE_DETAIL
   WHERE AGENCY_CODE = :BLK_CSTM_AGENCY_CODE_DETAIL.AGENCY_CODE
     AND RATING_CODE = :BLK_CSTM_AGENCY_CODE_DETAIL.RATING_CODE;
  IF L_COUNT2 > 0 THEN
    l_ret := ovpkcs.fn_dispmsg('AGYCOD-03;', ';', ';');
    RAISE Form_Trigger_Failure;
  END IF;
END;
in WHEN-VALIDATE-ITEM.
Now, when I press TAB to move on, it gives the message DUPLICATE RATING CODE. The problem is when I move back to the previous record by clicking the mouse and change it to an already existing value: when I save, the validation does not happen and the message is not shown. Kindly tell me where I should put the code.
Thank you

Hi,
you can check when committing (on the save button):
for i = 1 to n - 1
  check the condition (item(i) = item(i + 1))
  next_record
end
or use a stored insert with an exception returned to the form program when a duplicate key is found
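The loop sketched in that reply amounts to: sort the entered rating codes so duplicates end up adjacent, then fail the save if any two neighbours match. A rough illustration in Java rather than Forms PL/SQL (names are illustrative; in the form itself this logic would live in a commit-time trigger):

```java
import java.util.ArrayList;
import java.util.List;

public class CommitCheck {
    // Mirrors the reply's "for i = 1 to n-1" loop: after sorting, any
    // duplicate rating codes are adjacent, so one pass finds them.
    static boolean hasDuplicate(List<String> ratingCodes) {
        List<String> sorted = new ArrayList<>(ratingCodes);
        sorted.sort(null); // natural (alphabetical) order
        for (int i = 0; i + 1 < sorted.size(); i++) {
            if (sorted.get(i).equals(sorted.get(i + 1))) {
                return true; // the Forms code would raise FORM_TRIGGER_FAILURE here
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(hasDuplicate(List.of("AAA", "BBB", "AAA"))); // true
        System.out.println(hasDuplicate(List.of("AAA", "BBB")));        // false
    }
}
```

Running the check at commit time, over all records in the block, is what catches the case the asker describes, where an item is edited back to an existing value after item-level validation has already fired.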
... -
SQVI - Quick Viewer - Avoid Duplicates
Hi All,
I am running a Quick Viewer query and need to display the count of records for each month.
Now I have 2 records for each employee (say 1 for dental and 1 for medical).
Say I have 5 employees; the count is then displayed as 10 records.
But actually I need to display the count as 5, because there are 5 employees.
Since 2 records are fetched for each employee in my dataset, the record count is often double the employee count.
Is there any way to avoid duplicates in Quick Viewer?
Edited by: Kumar B on Nov 3, 2009 9:48 AM

I am using a custom infotype table in HR, similar to IT0167 (health plans).
Problem is few employees have medical and dental,Few have only medical and few Only dental.
Thus making it difficult to get the actual count of employees.
Thank you,
Edited by: Kumar B on Nov 3, 2009 10:19 AM -
Avoid Duplicate Tasks when Expanding Groups for Custom Task Process
Is there a way to:
Avoid Duplicate Tasks when Expanding Groups for Custom Task Process?
I've got a people metadata column that I am planning to put groups into. I want the groups to expand and a task to be sent to all users in the groups. I also want to avoid creating multiple tasks if a user happens to be in two groups at the same time.
I'm trying to work out a way to assign users a read task based on job training requirements. Right now, assigning groups and using a workflow task to confirm the read is what I'm trying to accomplish. I just end up getting two tasks for a user if they're in multiple groups.
David Jenkins

Hi David,
Please verify the following:
After Participants, select Parallel (all at once)
Expand Task Options and select 'Assign a task to each member within groups'
Open the action properties and make sure ExpandGroup is Yes
Also, in SharePoint Designer, you can edit the property for the Start Approval Process to enable ExpandGroup:
Reference:
https://social.msdn.microsoft.com/Forums/office/en-US/d14da1c4-bd5a-459b-8698-3a89bb01e6ad/expand-groupnot-creating-tasks-for-users-issue-in-sharepoint-2013-designer-workflow?forum=sharepointgeneral
https://social.technet.microsoft.com/Forums/office/en-US/ac245d45-ff66-4341-815c-79213efc4394/sharepoint-2010-designer-workflows-and-sharepoint-user-groups?forum=sharepointcustomizationprevious
Best Regards,
Eric
TechNet Community Support
Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact
[email protected] -
There are 2 DataGrids: G1 and G2. Rows can be dragged and dropped from G1 into G2. There can be duplicate rows in G1, but we need to avoid duplicate rows in G2.
Ideally there should be a STOP sign displayed when there is an attempt to drop a duplicate row into G2.
How can this be done?
Thanks.

In your drag-drop handler, get the dragged row(s) and use the getItemIndex() method of the underlying dataProvider array collection to see if the row is already there; if so, just call event.preventDefault() and set the feedback to stop.
http://www.adobe.com/cfusion/webforums/forum/messageview.cfm?forumid=60&catid=585&threadid=1318924&highlight_key=y&keyword1=preventdefault
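The check described above (look the dragged row up in the target's data, and cancel the drop on a hit) can be sketched outside of Flex; here in Java rather than the original ActionScript, with illustrative names:

```java
import java.util.ArrayList;
import java.util.List;

public class DropGuard {
    // Before accepting a drop into the target grid's data, look the row
    // up (analogous to getItemIndex()); reject it if already present
    // (analogous to calling event.preventDefault() and showing STOP).
    static boolean acceptDrop(List<String> target, String draggedRow) {
        if (target.indexOf(draggedRow) >= 0) {
            return false; // duplicate: cancel the drop
        }
        target.add(draggedRow);
        return true;
    }

    public static void main(String[] args) {
        List<String> g2 = new ArrayList<>();
        System.out.println(acceptDrop(g2, "row1")); // true: first occurrence accepted
        System.out.println(acceptDrop(g2, "row1")); // false: duplicate rejected
    }
}
```

The design point is the same as in the reply: do the membership test in the drop handler, before mutating the target's data, so the duplicate never appears even transiently.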
ATTA -
Hi All,
I need code to avoid a duplicate IDoc when my program converts one IDoc into another IDoc. The code is written below:
LOOP AT t_seldoc.
SELECT SINGLE * FROM edidc
WHERE docnum EQ t_seldoc-idoc.
REFRESH: t_idocst,
t_edidd.
IF edidc-mestyp = c_msg_type.
MOVE: c_new_type TO edidc-mestyp,
c_st69 TO edidc-status,
c_st69 TO t_seldoc-status,
t_seldoc-idoc TO t_idocst-docnum.
ELSE.
MOVE: 'Z_NGI_SBT_TICKET' TO edidc-mestyp,
c_st69 TO edidc-status,
c_377 TO edidc-stdmes, "Add the stdmes for acks
c_st69 TO t_seldoc-status,
t_seldoc-idoc TO t_idocst-docnum.
ENDIF.
APPEND t_idocst TO t_idocst.
PERFORM update_idoc.
READ TABLE t_output WITH KEY idoc = t_seldoc-idoc.
MOVE sy-tabix TO l_tabix.
MOVE c_upd_idoc TO t_output-status.
MODIFY t_output INDEX l_tabix.
MODIFY t_seldoc.
ENDLOOP.
This is the perform statement.
* CHANGE BY Swati Namdev 28042009
types : begin of ty_vbak,
vbeln type vbak-vbeln,
end of ty_vbak.
Data : LT_dup_check type standard table of Z1NG_SBTTICKETHD,
it_vbak type standard table of ty_vbak.
* End Here Swati Namdev 28042009
CALL FUNCTION 'EDI_DOCUMENT_OPEN_FOR_EDIT'
EXPORTING
document_number = t_seldoc-idoc
ALREADY_OPEN = 'N'
IMPORTING
IDOC_CONTROL =
TABLES
idoc_data = t_edidd
EXCEPTIONS
document_foreign_lock = 1
document_not_exist = 2
document_not_open = 3
status_is_unable_for_changing = 4
OTHERS = 5.
IF sy-subrc NE 0.
MESSAGE ID sy-msgid
TYPE sy-msgty
NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
EXIT.
ENDIF.
LOOP AT t_edidd WHERE segnam EQ c_tickey_hdr.
MOVE t_edidd-sdata TO z1tickethd.
IF z1tickethd-tkt_type EQ '0'.
MOVE '3' TO z1tickethd-tkt_type.
ELSEIF
z1tickethd-tkt_type EQ '1'.
MOVE '4' TO z1tickethd-tkt_type.
ENDIF.
MOVE z1tickethd TO t_edidd-sdata.
MODIFY t_edidd.
ENDLOOP.
DATA: z1ng_sbttickethd LIKE z1ng_sbttickethd,
z1ng_sbtticketdt LIKE z1ng_sbtticketdt,
z1ng_ticketdt LIKE z1ng_ticketdt.
LOOP AT t_edidd WHERE segnam EQ 'Z1NG_TICKETDT'.
MOVE t_edidd-sdata TO z1ng_ticketdt.
CLEAR: z1ng_sbtticketdt.
MOVE-CORRESPONDING z1ng_ticketdt TO z1ng_sbtticketdt.
MOVE z1ng_sbtticketdt TO t_edidd-sdata.
t_edidd-segnam = 'Z1NG_SBTTICKETDT'.
MODIFY t_edidd.
CALL FUNCTION 'EDI_CHANGE_DATA_SEGMENT'
EXPORTING
idoc_changed_data_record = t_edidd
EXCEPTIONS
idoc_not_open = 1
data_record_not_exist = 2
OTHERS = 3.
ENDLOOP.
LOOP AT t_edidd WHERE segnam EQ 'Z1NG_TICKETHD'.
MOVE t_edidd-sdata TO z1ng_tickethd.
CLEAR: z1ng_sbttickethd.
MOVE-CORRESPONDING z1ng_tickethd TO z1ng_sbttickethd.
MOVE z1ng_sbttickethd TO t_edidd-sdata.
t_edidd-segnam = 'Z1NG_SBTTICKETHD'.
MODIFY t_edidd.
* CHANGE BY Swati Namdev 28042009
MOVE-CORRESPONDING z1ng_sbttickethd TO LT_dup_check.
append z1ng_sbttickethd to LT_dup_check.
* End here Swati Namdev 28042009
ENDLOOP.
* CHANGE BY Swati Namdev 28042009
refresh it_vbak. clear it_vbak.
if lt_dup_check[] is not initial.
Select vbeln from vbak into table it_vbak for all entries in
lt_dup_check where KUNNR = lt_dup_check-CUST
and ZZTKT_NBR = lt_dup_check-TKT_NBR.
if it_vbak[] is not initial.
Message text-002 type 'E'.
endif.
endif.
* End here Swati Namdev 28042009
CALL FUNCTION 'EDI_CHANGE_CONTROL_RECORD'
EXPORTING
idoc_changed_control = edidc
EXCEPTIONS
idoc_not_open = 1
direction_change_not_allowed = 2
OTHERS = 3.
IF sy-subrc NE 0.
MESSAGE ID sy-msgid
TYPE sy-msgty
NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
EXIT.
ENDIF.
CALL FUNCTION 'EDI_CHANGE_DATA_SEGMENT'
EXPORTING
idoc_changed_data_record = t_edidd
EXCEPTIONS
idoc_not_open = 1
data_record_not_exist = 2
OTHERS = 3.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid
TYPE sy-msgty
NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
EXIT.
ENDIF.
CALL FUNCTION 'EDI_DOCUMENT_CLOSE_EDIT'
EXPORTING
document_number = t_seldoc-idoc
do_commit = c_yes
do_update = c_yes
WRITE_ALL_STATUS = 'X'
TABLES
STATUS_RECORDS = T_EDI_DS40
EXCEPTIONS
idoc_not_open = 1
db_error = 2
OTHERS = 3.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid
TYPE sy-msgty
NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
EXIT.
ENDIF.
CALL FUNCTION 'IDOC_STATUS_WRITE_TO_DATABASE'
EXPORTING
idoc_number = t_seldoc-idoc
TABLES
idoc_status = t_idocst.
COMMIT WORK.
CALL FUNCTION 'EDI_DOCUMENT_DEQUEUE_LATER'
EXPORTING
docnum = t_seldoc-idoc.
COMMIT WORK.
ENDFORM. " UPDATE_IDOC
At present I am checking whether the IDoc is a duplicate and giving an error message, but now I have to set status 51 for the duplicate IDoc and keep running for the remaining ones.
Please provide the solution.
regards
Swati
Edited by: Swati Namdev on May 5, 2009 11:26 AM

Hi all,
any inputs on this, please...?
Q: If the same IDoc is received a second time, how do we stop processing
the duplicate IDoc?
(That is how I understood the question.)
regards -
Hi All,
I have a scenario where SAP gets data from an ERP system through PI (JMS to ABAP proxy). Is there any mechanism to avoid the duplicates coming from the sender, without sending them on to SAP? If yes, can you please provide the procedure.
Regards,
Rama

Write an adapter module to stop processing duplicates in PI.
Please refer this blog
/people/sandeep.jaiswal/blog/2008/05/13/adapter-module-to-stop-processing-of-duplicate-file-ftp-location -
Avoiding duplicates in internal table
Hi Experts,
When I am populating data into an internal table, how can I avoid duplicates?
For example, I want to capture all the sales orders of all the items; if I have the same sales order for all the items, the internal table should contain only one sales order.

Hi,
The better approach will be to use the following
SORT T_MARC BY MATNR.
DELETE ADJACENT DUPLICATES FROM T_MARC COMPARING MATNR.
The DISTINCT addition to the SELECT statement allows you to remove duplicates from a set of results during a SELECT.
SELECT DISTINCT MATNR
FROM MARC
INTO TABLE T_MARC
WHERE DISMM = 'ND'.
Disadvantages:
Requires sorting on the database server, and adversely affects overall system performance if no index can be used.
When using DISTINCT, the database is always accessed directly, bypassing the SAP buffer.
Alternative approach:
SELECT MATNR
FROM MARC
INTO TABLE T_MARC
WHERE DISMM = 'ND'.
SORT T_MARC BY MATNR.
DELETE ADJACENT DUPLICATES FROM T_MARC COMPARING MATNR.
Recommendation: Only use the DISTINCT addition if there are a large number of duplicates and the result set will be significantly reduced by removing them.
Use DISTINCT if the selected fields are part of a DB index used by the WHERE clause of the SELECT.
Reward useful answers
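The two alternatives above (SORT plus DELETE ADJACENT DUPLICATES, versus SELECT DISTINCT) follow a general pattern: either sort and drop adjacent repeats, or let a set-like structure filter duplicates up front. A sketch in Java for illustration only; the ABAP statements above are the actual recommendation:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

public class Dedup {
    // Approach 1: sort, then drop adjacent duplicates, like
    // SORT ... followed by DELETE ADJACENT DUPLICATES ... COMPARING.
    static List<String> sortAndDedup(List<String> in) {
        List<String> sorted = new ArrayList<>(in);
        sorted.sort(null); // natural order
        List<String> out = new ArrayList<>();
        for (String s : sorted) {
            if (out.isEmpty() || !out.get(out.size() - 1).equals(s)) {
                out.add(s);
            }
        }
        return out;
    }

    // Approach 2: let a set do the filtering up front, like SELECT DISTINCT
    // (here preserving first-seen order rather than sorted order).
    static List<String> distinct(List<String> in) {
        return new ArrayList<>(new LinkedHashSet<>(in));
    }

    public static void main(String[] args) {
        List<String> orders = List.of("4711", "4712", "4711", "4713", "4712");
        System.out.println(sortAndDedup(orders)); // [4711, 4712, 4713]
        System.out.println(distinct(orders));     // [4711, 4712, 4713]
    }
}
```

The trade-off mirrors the one described above: the distinct approach pushes the work to the data source (or a hash set), while sort-and-dedup does it locally after fetching everything.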