How to avoid duplicates in export?
Hi
Is there any way to avoid duplicates when exporting from LR4, i.e. to avoid exporting the same picture (to the same folder) again?
Kindly
Jan
Rob Cole wrote:
I stand corrected - thanks Jim.
I think I had forgotten this because of how I use publish services / collections.
All of my publish services have exactly one (smart) collection, which defines the photos to be published.
I don't create a multitude of publish collections in order to define an associated publish tree - the tree is defined by the source folders. I started this convention before Lr was a glimmer in Adobe's eyes (even before digital photography was invented). If I was inventing now from scratch I might do it differently, but this is one reason I get aggravated when some of the experts in this forum continually "forget" that the need to maintain a prescribed convention is sometimes an absolute requirement (or at least *highly* desirable), and Lr should be able to adapt to the convention - and not the other way around.
If I quit Lr today, I could still maintain published trees without Lr's publishing collections. If your scheme depends on publishing collections which have no visibility outside Lightroom (e.g. a multitude of hard drive publishing collections), then you'd be screwed (so to speak) if you wanted to migrate to another software for maintenance. Not only that, but if you rebuild your catalog, all such collections are lost (unless you know how to use plugins to preserve them). Even if your scheme depends on regular (non-publishing I mean, whether smart or not) collections, and you use jf's Collection Publisher to publish in matching hierarchy, you'd still be screwed when migrating, since those collections do not exist outside Lightroom.
Impact may vary of course, but I like to minimize dependence on a specific software if possible.
Folders exist regardless of which software you use to edit your photos. Put another way: collection hierarchies are proprietary, folder structure isn't.
So, although lots of people prefer jf's Collection Publisher (understandably), it's worth considering jf's Folder Publisher too, or my very own TreeSync Publisher.
Cheers,
Rob
Similar Messages
-
How to avoid duplicate values from ALV grid (see code below)
In the query below, the document number is repeated again and again.
How can I avoid duplication in this query?
select * into corresponding fields of table itab
from J_1IEXCHDR
inner join J_1IEXCDTL
on J_1IEXCDTL~lifnr = J_1IEXCHDR~lifnr
where J_1IEXCHDR~status = 'P'.
Hi Laxman,
After that SELECT statement:
select * into corresponding fields of table itab
from J_1IEXCHDR
inner join J_1IEXCDTL
on J_1IEXCDTL~lifnr = J_1IEXCHDR~lifnr
where J_1IEXCHDR~status = 'P'.
if sy-subrc = 0.
delete adjacent duplicates from itab comparing <field name of itab internal table>.
endif.
This will delete your duplicate entries. Once you are done with this, call the ALV FM:
call function 'REUSE_ALV_GRID_DISPLAY'
exporting
I_INTERFACE_CHECK = ' '
I_BYPASSING_BUFFER = ' '
I_BUFFER_ACTIVE = ' '
i_callback_program = v_repid
I_CALLBACK_PF_STATUS_SET = ' '
I_CALLBACK_USER_COMMAND = 'IT_USER_COMMAND'
I_CALLBACK_TOP_OF_PAGE = ' '
I_CALLBACK_HTML_TOP_OF_PAGE = ' '
I_CALLBACK_HTML_END_OF_LIST = ' '
I_STRUCTURE_NAME = ' '
I_BACKGROUND_ID = ' '
i_grid_title = 'Purchase Order Details'
I_GRID_SETTINGS = I_GRID_SETTINGS
is_layout = wa_layout
it_fieldcat = it_fieldcat
IT_EXCLUDING = IT_EXCLUDING
IT_SPECIAL_GROUPS = IT_SPECIAL_GROUPS
it_sort = it_sort
IT_FILTER = IT_FILTER
IS_SEL_HIDE = IS_SEL_HIDE
I_DEFAULT = 'X'
I_SAVE = ' '
IS_VARIANT = IS_VARIANT
it_events = it_event
IT_EVENT_EXIT = IT_EVENT_EXIT
IS_PRINT = IS_PRINT
IS_REPREP_ID = IS_REPREP_ID
I_SCREEN_START_COLUMN = 0
I_SCREEN_START_LINE = 0
I_SCREEN_END_COLUMN = 0
I_SCREEN_END_LINE = 0
I_HTML_HEIGHT_TOP = 0
I_HTML_HEIGHT_END = 0
IT_ALV_GRAPHICS = IT_ALV_GRAPHICS
IT_HYPERLINK = IT_HYPERLINK
IT_ADD_FIELDCAT = IT_ADD_FIELDCAT
IT_EXCEPT_QINFO = IT_EXCEPT_QINFO
IR_SALV_FULLSCREEN_ADAPTER = IR_SALV_FULLSCREEN_ADAPTER
IMPORTING
E_EXIT_CAUSED_BY_CALLER = E_EXIT_CAUSED_BY_CALLER
ES_EXIT_CAUSED_BY_USER = ES_EXIT_CAUSED_BY_USER
tables
t_outtab = ITAB
exceptions
program_error = 1
others = 2.
if sy-subrc <> 0.
message id sy-msgid type sy-msgty number sy-msgno
with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
endif.
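The key detail of the ABAP pattern above is that DELETE ADJACENT DUPLICATES only removes neighbouring rows, so the table has to be sorted by the comparison fields first. A minimal Python sketch of the same sort-then-drop-adjacent idea (the rows and the docno field are made up for illustration, not real SAP data):

```python
# Sketch of ABAP's SORT + DELETE ADJACENT DUPLICATES pattern.
# The rows and the 'docno' field are illustrative, not real data.
rows = [
    {"docno": 1001, "item": 10},
    {"docno": 1001, "item": 20},
    {"docno": 1002, "item": 10},
    {"docno": 1001, "item": 30},  # duplicate docno, but not adjacent yet
]

# Adjacent-duplicate removal only works on a sorted table,
# so sort by the comparison field first, exactly as in ABAP.
rows.sort(key=lambda r: r["docno"])

deduped = []
for row in rows:
    if not deduped or deduped[-1]["docno"] != row["docno"]:
        deduped.append(row)

print([r["docno"] for r in deduped])  # [1001, 1002]
```

As in the ABAP version, skipping the sort would leave the trailing 1001 row in place, because it is not adjacent to the other 1001 rows.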
Thanks
Vikranth Khimavath -
How to avoid duplicate posting of noted items for advance payment requests?
Puttasiddappa,
In the PS module, we allow the deletion of a component pruchase requisition allthough a purchase order exists. The system will send message CN707 "<i>A purchase order already exists for purchase requisition &</i>" as an Iinformation message by design to allow flexible project management.
If you, however, desire the message CN707 to be of type E, you have to modify the standard coding. Using SE91, you can invoke the where-used list of message 707 in message class CN and change the
i707(cn)
to
e707(cn)
where desired.
Also, user exit CNEX0039 provides the possibility to reject the deletion of a component according to customer needs; e.g. you may check here whether a purchase order exists and reject the deletion.
Hope this helps!
Best regards
Martina Modolell -
How to avoid Duplicate Records while joining two tables
Hi,
I am trying to join three tables. Basically two of the tables are the same; one is a history table. So I wrote a query like:
select
e.id,
e.seqNo,
e.name,
d.resDate,
d.details
from employees e
join ((select * from dept) union (select * from dept_hist)) d
on d.id = e.id and d.seqNo = e.seqNo
but this is returning duplicate records.
Could anyone please tell me how to avoid duplicate records in this query? Actually, once a record is processed it is moved to the hist table, so both tables will not contain the same records; I need the records from both tables, which is why I have done the union of the two, so d will have the union of both.
But I am still getting duplicate records, even when I use DISTINCT. -
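For what it's worth, UNION (without ALL) already removes rows that are identical in both branches, so duplicates that survive a query like this usually come from the join itself or from rows that differ in some column. A small sqlite3 sketch of the UNION vs. UNION ALL difference, with tiny made-up tables standing in for dept and dept_hist:

```python
import sqlite3

# Made-up miniature versions of the dept / dept_hist tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dept      (id INTEGER, seqno INTEGER, details TEXT);
    CREATE TABLE dept_hist (id INTEGER, seqno INTEGER, details TEXT);
    INSERT INTO dept      VALUES (1, 1, 'open');
    INSERT INTO dept_hist VALUES (1, 1, 'open');    -- same row in both tables
    INSERT INTO dept_hist VALUES (2, 1, 'closed');
""")

# UNION de-duplicates rows that appear in both branches ...
union_rows = con.execute(
    "SELECT * FROM dept UNION SELECT * FROM dept_hist").fetchall()
print(len(union_rows))      # 2

# ... while UNION ALL keeps both copies, which later shows up
# as apparent duplicates after the join.
union_all_rows = con.execute(
    "SELECT * FROM dept UNION ALL SELECT * FROM dept_hist").fetchall()
print(len(union_all_rows))  # 3
```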
How to avoid duplicate data while inserting from sample.dat file to table
Hi Guys,
We have an issue with duplicate data in a flat file while loading data from sample.dat into a table. How can we avoid duplicate data via the control file?
Can any one help me on this.
Thanks in advance!
Regards,
LKR
No, a control file will not remove duplicate data.
You would do better to use an external table and then remove the duplicate data using SQL as you query the data to insert it into your destination table. -
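The staged approach suggested above can be sketched with sqlite3 standing in for the external table (all table and column names here are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- 'staging' plays the role of the external table over sample.dat.
    CREATE TABLE staging (id INTEGER, name TEXT);
    CREATE TABLE target  (id INTEGER, name TEXT);
    INSERT INTO staging VALUES (1, 'a'), (1, 'a'), (2, 'b');
""")

# Deduplicate on the way in: SELECT DISTINCT collapses the repeated
# flat-file rows before they reach the destination table.
con.execute("INSERT INTO target SELECT DISTINCT id, name FROM staging")

count = con.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 2
```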
How to avoid duplicate record in a file to file
Hi Guys,
Could you please provide a solution
to avoid duplicate entries in a flat file based on a key field?
I am asking in terms of standard functions,
either at message mapping level or by configuring the file adapter.
warm regards
mahesh.
Hi Mahesh,
Write a module processor for checking for duplicate records in the file adapter,
or
with a JAVA/ABAP mapping you can eliminate the duplicate records.
Also check these links:
Re: How to Handle this "Duplicate Records"
Duplicate records
Ignoring Duplicate Records--urgent
Re: Duplicate records frequently occurred
Re: Reg ODS JUNK DATA
http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
regards
srinivas -
How to avoid duplicates for an result set
How can I avoid the duplicate rows for the query below?
SELECT to_char(grecode(titleid)) gre_code,
       to_char(toeflcode(titleid)) toefl_code,
       titleid
FROM (SELECT DISTINCT TO_CHAR(UPPER(TRIM(get_clob_value(table_name, KEY)))) RESULT,
             titleid
      FROM mcp_specifications a
      JOIN mcp_title_specifications b
        ON a.specificationid = b.specificationid
      JOIN mcp_titles c
        ON b.titleid = c.titleid
      WHERE b.is_parent = 'F'
        AND UPPER(TRIM(c.university_state)) = UPPER(TRIM('USA'))
        AND TO_CHAR(get_clob_value(table_name, KEY)) IS NOT NULL
        AND UPPER(TRIM(SPECIFICATION)) IN (UPPER(TRIM('program'))))
WHERE UPPER(TRIM(RESULT)) = UPPER(TRIM('COMPUTER SCIENCE'))
ORDER BY RESULT ASC;
The output of the query would be:
gre_code toefl_code titleid
402 78 5518
402 78 5519
402 78 5520
402 78 5521
The output should be:
402 78 (any titleid)
Some simplified code:
SELECT grecode(titleid) gre_code,
toeflcode(titleid) toefl_code,
min(titleid) titleid
FROM (SELECT DISTINCT TO_CHAR(UPPER(TRIM(get_clob_value(table_name,KEY)))) RESULT,
titleid
FROM mcp_specifications a
JOIN mcp_title_specifications b
ON a.specificationid = b.specificationid
JOIN mcp_titles c
ON b.titleid = c.titleid
WHERE b.is_parent = 'F'
AND UPPER(TRIM(c.university_state)) = 'USA'
AND TO_CHAR (get_clob_value (table_name, KEY)) IS NOT NULL
AND UPPER(TRIM(SPECIFICATION)) = 'PROGRAM')
WHERE UPPER(TRIM(RESULT)) = 'COMPUTER SCIENCE'
GROUP BY grecode(titleid),
toeflcode(titleid)
Please note that applying functions like UPPER and TRIM to a string literal can and should be avoided.
For example:
UPPER(TRIM('USA')) = 'USA'
Why force the database to do both an UPPER and a TRIM on something that can simply be written in uppercase with no surrounding spaces? It's a waste of time. -
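The grouping trick in the simplified query - collapse the duplicate (gre_code, toefl_code) rows and let MIN pick a representative titleid - can be seen in miniature with sqlite3 (the values are copied from the sample output above; the table name is made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (gre_code TEXT, toefl_code TEXT, titleid INTEGER)")
con.executemany("INSERT INTO t VALUES (?, ?, ?)",
                [("402", "78", 5518), ("402", "78", 5519),
                 ("402", "78", 5520), ("402", "78", 5521)])

# One row per (gre_code, toefl_code) group; MIN(titleid) supplies
# a single representative titleid, as in the answer's query.
rows = con.execute("""
    SELECT gre_code, toefl_code, MIN(titleid)
    FROM t
    GROUP BY gre_code, toefl_code
""").fetchall()
print(rows)  # [('402', '78', 5518)]
```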
How to avoid duplicate BOM Item Numbers?
Hello,
is there a way to avoid duplicate BOM Item Numbers (STPO-POSNR) within one BOM?
For Routings I could avoid duplicate Operation/Activity Numbers with transaction OP46 by setting T412-FLG_CHK = 'X' for the Task List Check. Is there an equivalent for BOMs?
Regards,
Helmut Gante
How to avoid duplicates in CROSS JOIN Query
Hi,
I am using CROSS JOIN to get all pairs of a table's column values as shown below:
PRODUCT (Col Header)
Bag
Plate
Biscuit
While doing the cross join we get:
Bag Bag
Bag Plate
Bag Biscuit
Plate Bag
Plate Plate
Plate Biscuit ..... like this
By placing the where condition prod1 <> prod2 we avoid the Bag Bag and Plate Plate rows. So the output will be like below:
Bag Plate
Bag Biscuit
Plate Bag
Plate Biscuit
Now "Bag Plate" and "Plate Bag" are the same combination; how do we avoid these records? My expected result is
Bag Biscuit
Plate Biscuit
How to derive this ?
Sridhar
Hi,
This is the solution that I found to fit the OP's question, but Visakh16 already posted the same idea (assuming the names are unique) from the start, and I don't think anyone noticed it!
Sridhar.DPM, did you check Visakh16's response (the second response received)?
I will mark his response as an answer. If this is not what you need, please clarify and you can unmark it :-)
[Personal Site] [Blog] [Facebook] -
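The idea the thread credits to Visakh16 is to replace prod1 <> prod2 with an ordering predicate such as prod1 < prod2, so each unordered pair survives exactly once (this assumes the product names are unique, as noted above). A small sqlite3 sketch using the products from the post:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE product (name TEXT)")
con.executemany("INSERT INTO product VALUES (?)",
                [("Bag",), ("Plate",), ("Biscuit",)])

# p1.name < p2.name keeps one row per unordered pair:
# ('Bag', 'Plate') survives, ('Plate', 'Bag') does not.
pairs = con.execute("""
    SELECT p1.name, p2.name
    FROM product p1 CROSS JOIN product p2
    WHERE p1.name < p2.name
    ORDER BY p1.name, p2.name
""").fetchall()
print(pairs)  # [('Bag', 'Biscuit'), ('Bag', 'Plate'), ('Biscuit', 'Plate')]
```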
How to avoid duplicates in Iphoto 5?
I have iPhoto 5.0.4 with Leopard on a G4. When I attach my digital camera, all the photos from the camera upload. I keep some family pics on the camera all the time, so the same ones are duplicated every time I upload new photos. How can this be avoided? Thanks
There is a way to do this.
If your camera mounts on the Desktop as an external device, then use iPhoto's File > Import to Library command. Navigate to the camera, preview the photos, select just the ones you want, and import directly into iPhoto.
If your camera does not mount on the Desktop, like most Canon models, you can place the camera card into a USB card reader. That will mount on the Desktop, and you can control your imports as described above. It also gives you the advantage of not having to worry about your camera's battery power during the import.
Regards. -
How to avoid duplicate measures in reports due to case functions?
Hi,
If I create a report using a dimension called insert_source_type, where the next level in the dimension hierarchy is insert_source, and I do not apply any formula, I get a report where I can drill down on insert_source_type to the insert_source values.
If I use a function like (CASE "Ins Source"."Ins Source Type" WHEN 'OWS' THEN 'WEB' ELSE "Ins Source"."Ins Source Type" END) and change the label of insert_source_type to Channel Group instead, then when I drill down on Channel Group it goes to insert_source_type, and from there I can drill down to insert_source.
There is one insert_source_type level too many!
How can this be avoided?
Thanks and Regards
Giuliano
How to avoid duplicates while using VRM FM
Hi experts,
I am using the VRM FM to pass values into a listbox on screens. The first time it works properly, but when I use it a second time it also shows the previous entries. How do I delete the duplicate entries?
This is my code:
when 'ENTER'.
select POSNR ARKTX from LIPS into corresponding fields of table i_pstyp WHERE VBELN EQ VBELN.
LOOP AT I_PSTYP .
W_DDLIST-KEY = I_PSTYP-POSNR.
W_DDLIST-TEXT = I_PSTYP-POSNR.
APPEND W_DDLIST TO POSNR1.
CLEAR W_DDLIST.
CALL FUNCTION 'VRM_SET_VALUES'
EXPORTING
ID = 'POSNR'
VALUES = POSNR1.
In this, if I select the next VBELN without exiting the screen, the previous VBELN's line items are also displayed. What shall I do?
Mani
when 'ENTER'.
select POSNR ARKTX from LIPS
into corresponding fields of table i_pstyp
WHERE VBELN EQ VBELN.
LOOP AT I_PSTYP .
W_DDLIST-KEY = I_PSTYP-POSNR.
W_DDLIST-TEXT = I_PSTYP-POSNR.
APPEND W_DDLIST TO POSNR1.
CLEAR W_DDLIST.
ENDLOOP. " Endloop here. the FM should not be called inside the loop
CALL FUNCTION 'VRM_SET_VALUES'
EXPORTING
ID = 'POSNR' " hope you are using the fieldname correctly
VALUES = POSNR1.
CLEAR : POSNR1. " this will stop repeating the values in your drop down list
Regards
Gopi -
How to avoid duplicates in LOV
Hi, I'm using the search query component (11g) and for the search fields I'm adding an LOV. In the UI I can see the LOV with values from the table, but I need to avoid the duplicates in it. I looked at the docs and demo and couldn't figure out anything. Could someone point me to a resource? Thanks.
How do you create the LOV then?
To make a view object with a lov you need two viewObjects - let us say viewObject and viewObjectLOV.
viewObject is updatable, viewObjectLOV is not.
Your problem is that viewObjectLOV returns duplicate values.
1. find where is the viewObjectLOV, you can reach it via viewObject's accessor
2. make sure it has a primary key.
3. modify viewObjectLOV's query so it does not return duplicate values, for example using distinct keyword.
0. If you did not understand what I tried to explain, maybe you should read some more documentation first :) -
How to avoid 'duplicate data record' error message when loading master data
Dear Experts
We have a custom extractor on table CSKS called ZCOSTCENTER_ATTR. The settings of this datasource are the same as the settings of 0COSTCENTER_ATTR. The problem is that when loading to BW it seems that validity (DATEFROM and DATETO) is not taken into account. If there is a cost center with several entries having different validity, I get this duplicate data record error. There is no error when loading 0COSTCENTER_ATTR.
Enhancing 0COSTCENTER_ATTR to have one datasource instead of two is not an option.
I know that you can set ignore duplicates in the infopackage, but that is not a nice solution. 0COSTCENTER_ATTR can run without this!
Is there a trick you know to tell the system that the date fields are also part of the key??
Thank you for your help
Peter
Alessandro - ZCOSTCENTER_ATTR is loading 0COSTCENTER, just like 0COSTCENTER_ATTR.
Siggi - I don't have the error message described in the note.
"There are duplicates of the data record 2 & with the key 'NO010000122077 &' for characteristic 0COSTCENTER &."
In PSA the records are marked red with the same message (MSG no 191).
As you see the key does not contain the date when the record is valid. How do I add it? How is it working for 0COSTCENTER_ATTR with the same records? Is it done on the R/3 or on the BW side?
Thanks
Peter -
How to avoid duplicate data loading from SAP-r/3 to BI
Hi !
I have created one process chain that will load data into some ODS objects from R/3, where (in R/3) the datasources/tables are updated daily.
I want to schedule the system such that, if on any day the source data is not updated (i.e. the tables are unchanged), that data should not be loaded into the ODS.
Can anyone suggest such a mechanism, so that I always have unique data in my data targets?
Please reply soon.
Thank You !
Pankaj K.
Hello Pankaj,
By setting the unique records option, you pretty much are letting the system know to not check the uniqueness of the records using the change log and the ODS active table log.
Also, to avoid the problem of two requests being activated at the same time, please make sure you select the options "Set Quality Status to 'OK' Automatically" and "Activate Data Automatically"; that way you have the option to delete a single request as required without having to delete all the data.
This is all to avoid the issue where even the new request has to be deleted to delete the duplicate data.
Unless the timestamp field is available in the table on top of which you have created the datasource, it will be difficult to check the delta load.
Check the table used to make sure there is no timestamp field or any other numeric counter field which can be used for creating a delta queue for the datasource you are dealing with.
Let me know if the information is helpful or if you need additional information regarding the same.
Thanks
Dharma.