IPTC location fields - best practices?

Lightroom offers various location-related fields: Scene, Location, City, State/Province, Country, and ISO Country Code.
Country and State/Province are usually straightforward. In a major city, City is also obvious, and for Location I typically enter a street name or street/cross-street.
Elsewhere, things aren't so obvious. For example, I shot some pictures recently in a village with about 200 inhabitants: is that a city, or a location? Other pictures were taken between villages in a large river valley. Should I enter the name of the valley ("Wakhan Valley") as 'City'? What about shots taken on a major road? Would the name of the road ("Pamir Highway") count for City or Location?
And what should I enter for 'Scene'? Should ISO Country Code use 2-letter, 3-letter or numeric values?
Does anyone have any advice to offer on these points, or pointers to industry guidelines?
Thank you,
Angus

For industry guidelines, go directly to the IPTC organization. At the following link, download the IPTC Core PS Panel file (along the right side of the page), unzip the file, and look at the pdf for a description of how the various fields are generally used.
http://www.iptc.org/IPTC4XMP/
Of course, you can use them any way you wish. What you've described is most common. Since Countries, States, and Cities are all political subdivisions (arbitrary man-made boundaries), I generally limit their use to just that. The village you described fits that criterion, so in my collection I would use the City field. I broadly view the City field as the smallest political subdivision that we use to govern ourselves (just my definition).
The Location field, on the other hand, can be almost anything defined politically, geographically, or otherwise. The IPTC standard suggests using only the Location field, leaving City blank, should you be on a mountain top in the middle of nowhere. Personally, I usually fill in the closest City to the location I'm at, but that might not be what the New York Times or other media outlets would do. I recently took some shots on board a flight over the Gulf of Mexico - I had to really scratch my head over how I wanted to fill in the IPTC fields. Made me wonder what NASA does.
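For your Wakhan Valley example, I might end up with something like the sketch below. This is purely my own convention and the values are only illustrative (I'm assuming the Tajik side of the valley, and the village name is just a stand-in) - it is not an official IPTC recommendation:

    # Illustrative only - my own convention, not the IPTC standard.
    village_shot = {
        "Country": "Tajikistan",            # political subdivision (assumed country)
        "ISO Country Code": "TJ",           # ISO 3166 alpha-2; check the IPTC Core spec for 2- vs 3-letter
        "State/Province": "Gorno-Badakhshan",
        "City": "Langar",                   # a ~200-inhabitant village still counts as 'City' for me
        "Location": "Wakhan Valley",        # the geographic feature goes here, not in City
    }

    # Shot on the road, nowhere near a settlement: leave City blank and put the road in Location.
    road_shot = dict(village_shot, City="", Location="Pamir Highway, Wakhan Valley")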
Hope this helps, Gary

Similar Messages

  • IPTC Location Fields

    I've tagged the location of many of my photos with the Assign Location... command in Aperture. However, this metadata seems to only be accessible through Aperture. Are there any Applescripts around to get the Aperture Location Data assigned to the IPTC Core Fields (City,  State/Province, Country)?

    Gone. From the page that shows up when I click "download" from the Plug-ins page:
    All good things must come to an end.
    The time has come to retire the Über Plug-ins for Aperture and iPhoto. As of October 4th, 2011, they aren't available for download.

  • DAM: integrate/harmonize Collections, Keywords and IPTC location

    Lightroom currently presents the user with two independent methods to assign photos to a hierarchical structure: collections and keywords. Unless I missed something, keywords do everything that collections do, and more.
    In addition, there is one structure that is inherently hierarchical, but is not currently implemented as such: the IPTC (metadata) location. For instance, the city of Amsterdam lies in the province of North Holland in The Netherlands, so entering Amsterdam in the 'City' field implies a province and a country.
    Based on these observations, I have a couple of suggestions:
    * Enhance the usability of collections with a few of the keyword features: listing all assigned collections upon selection, and dragging collections onto photos to assign.
    * Allow users to convert collection trees to keyword trees, and possibly back (convenient for import/export, or for changing one's mind about which paradigm to use).
    * Allow users to either export a keyword tree to the IPTC location fields, or to make a dynamic link between the two (i.e. updating the keyword would immediately affect the location data). This feature would also require an 'exclusivity switch', see below.
    * Introduce an 'exclusivity switch' for keywords or collections, meaning that you may only assign *one* keyword from its sub-tree to any particular image (see the sketch after this list). This makes sense for location, because an image was made in one particular location, but it can also be used in other contexts: for example, to indicate where the latest version of a file has been backed up to, or what the status (imported, selected, processed, finished) of a file is.
    * Introduce dynamic collections *and* dynamic keywords. I have read things about query-based dynamic collections coming to Lightroom, but please also create 'dynamic keywords'. This would allow keywords to remain the catch-all organization structure, instead of dividing functionality across collections and keywords.
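    To make the 'exclusivity switch' concrete: the check I have in mind is roughly the Python-style sketch below. It is purely illustrative - the subtree names and the function are made up, and nothing like this exists in Lightroom today:

        # Purely illustrative sketch of the 'exclusivity switch' idea:
        # an exclusive subtree allows at most one of its keywords per image.
        EXCLUSIVE_SUBTREES = {
            "Location": ["Netherlands/North Holland/Amsterdam",
                         "Netherlands/North Holland/Haarlem",
                         "France/Ile-de-France/Paris"],
            "Status": ["imported", "selected", "processed", "finished"],
        }

        def violates_exclusivity(image_keywords):
            """Return the exclusive subtrees from which more than one keyword is assigned."""
            violations = []
            for subtree, members in EXCLUSIVE_SUBTREES.items():
                assigned = [kw for kw in image_keywords if kw in members]
                if len(assigned) > 1:
                    violations.append(subtree)
            return violations

        # An image cannot be both 'selected' and 'finished':
        print(violates_exclusivity(["Netherlands/North Holland/Amsterdam", "selected", "finished"]))
        # -> ['Status']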
    I think these features could make a significant contribution to Lightroom's DAM capabilities. If implemented alongside improved search capabilities, they would allow me (and possibly others) to stop using my stand-alone DAM application.
    Simon

    John, I had forgotten the fact that keywords do not have any output settings associated with them, unlike collections. I support your idea of indicating which collections have these settings attached to them.
    Still, except for this point, keywords can currently be made to do the same things that collections do. You can even keep them private by choosing not to export certain keywords. Note that even so, I am not asking for the abolition of collections per se, because it is convenient to maintain the *conceptual* difference (in addition to the output metadata in collections).
    On the locations, I don't think the 'problem' is solved simply by exporting the location data to a keyword tree. It's the other direction that would greatly speed up the workflow. For example, dragging Amsterdam (potentially one of many Amsterdams) onto an image would automatically assign the higher-level fields as well. A metadata preset would indeed do the same, but only after one has created such a preset for every location, which quickly adds up.
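    To make that direction concrete, the mapping I have in mind is roughly the sketch below (Python pseudo-form; the function name is made up and nothing like this exists in Lightroom today):

        # Map a hierarchical keyword path onto the IPTC location fields.
        # Purely illustrative - this is not a Lightroom API.
        LEVELS = ["Country", "State/Province", "City", "Location"]

        def location_fields_from_keyword_path(path):
            parts = [p.strip() for p in path.split(">")]
            return dict(zip(LEVELS, parts))

        print(location_fields_from_keyword_path("The Netherlands > North Holland > Amsterdam"))
        # -> {'Country': 'The Netherlands', 'State/Province': 'North Holland', 'City': 'Amsterdam'}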
    Also, I did not mean to imply that dynamic keywords would be beneficial for keeping track of the location metadata. In general, dynamic categories (collections or keywords) are very useful for saving searches or keeping track of an internal workflow (like: if an image is not in a backup collection, it is not backed up). Specifically, I can imagine someone wanting to couple a search result to an exported keyword. For example, for a personal website or a Flickr account, you may want images to be tagged with an automatically generated keyword such as '5stars'.
    The way I see it, there is currently no real difference in the logical structures that can be constructed using collections and keywords. The distinctions between the two are in the input or output phase. It would be a shame to see the logical capabilities of collections enhanced with 'smart' collections, whilst leaving keywords behind.
    By the way, I'm all for a scripting interface, but I think that it's best to get the basics implemented in the right way first.
    Simon
    PS - You pointed out the existence of custom fields as a workflow solution. I have no experience with them, and don't have access to Lightroom on this machine, so I'll get back on that later.

  • IPTC location metadata import.

    Since Aperture 3 doesn't write the GPS EXIF info into the master even when selecting "write metadata to master" (see post http://discussions.apple.com/thread.jspa?threadID=2335593&tstart=0), I am using HoudahGeo instead of the A3 Places feature to geotag referenced pictures. There's however a bug with Aperture 3: after I modify the GPS data and IPTC location fields in HoudahGeo and then select "update metadata from master" in Aperture 3, the GPS info gets updated but not the IPTC location fields.

    Same problem here:
    - geotagging within Aperture 3 won't write any data into IPTC location tags
    - tagging with Maperture Pro and running reverse geocoding there gives me the same result: latitude, longitude and altitude are filled in, but neither city nor country nor the rest of the location fields
    Seems to be a bug, I suppose - these things did work with A2 and Maperture Pro.
    Any suggestions or hints -- or do we have to wait for an upgrade?

  • Metadata: IPTC Location bug

    Have a clean LR install (no previous betas) and a freshly created database with 1159 images (mixed CR2 and jpg) on a PC (AMD 3000+, 1 GB RAM) - and I do know how to maintain WinXP.
    I've selected one image and tagged all four IPTC location fields:
    Country= Slovenia
    State= Doma
    City= Doma
    Location= Doma
    After a while (damn, why does this take so long?), the fields on the left pane (Metadata Browser) were updated. Then I deleted the Location tag and left Country, State and City untouched. After a while (again), the left pane's content updated and... (I hope you'll understand what I'm trying to say)...
    I still see the country (Slovenia) there, but it contains zero entries! So I have in total 1159 Location entries, Slovenia zero entries and 1158 Unknown Country - weird. I can click on Slovenia (which is greyed out), but no image gets shown in the thumbnail preview. If I look for this particular image manually, I can still see that its Country is Slovenia (and I can also find the image using Find: IPTC=Slovenia). Also, I don't see the State and City entries (which are "Doma") anymore in the left pane.
    Yes, I've closed and reopened LR, but things remain the same. There's no way I can force LR to refresh its database.
    Before continuing I need to say that I AM a fan of the database approach. Here and there I read that LR's database (workflow, speed, ...) isn't so bad after all, and that some people are too picky and complain all the time. Let me tell you, LR's database approach is one of the worst I've seen so far:
    Updating the left pane to reflect every correction made by the user takes way too long. Actually I wouldn't care about that (being done in the background), but if you type fast enough (on the right pane) and edit your mistakes here and there, you simply cannot know how far you have come - it takes too long before the left pane is up to date.
    Why can't I simply select a couple of images and drag them onto the desired Location entry (similar to Keywords)? It's a basic task. Why is there no simple way to see untagged (not keyworded, not localized, not edited) files? Yes, I can do some of that with Color/Star/Flag labelling, but why add extra steps to the workflow?
    I suppose Collections are something like "Albums" in other DAMs. If I click on some thumbnail, how can I know which collection (if any) the image is in?
    Why can't I select (in Collections, Keywords, ...) only the images that are directly in the selected parent node (subtree content excluded)?
    In short: the user shouldn't notice there's a database working behind the scenes - in that case, I'm quite sure, everybody will like the "database paradigm".
    Greetings to all,
    Bogdan

    I think I'm starting to understand the IPTC Location issue here (in case anybody is interested)...
    On the left panel (Metadata Browser), the Location tag (once filled with some string) has the lowest (deepest) position in the tree:
    Country->State->City->"Location"
    If Location is blank, the entry resides outside this tree (= Unknown Location inside the Location section). So the Location tag (not the Location section in the Metadata Browser) seems to act like a root location tag: if it is empty, the other location tags (Country, City, ...) aren't managed. Now I'm curious... why does "Unknown Location" reside outside "Unknown Country"? I suppose it should be inside "Unknown City". If there is a reason for it being outside, then this tag (Location) shouldn't influence the behaviour of the other location tags (Country, City, ...).

  • Best Practice(s) for Laptop in Field, Server at Home? (Lightroom 3.3)

    Hi all!
    I just downloaded the 30-day evaluation of Lightroom, now trying to get up to speed. My first task is to get a handle on where the files (photos, catalogs, etc.) should go, and how to manage archiving and backups.
    I found a three-year-old thread titled "Best Practice for Laptop in Field, Server at Home" and that describes my situation, but since that thread is three years old, I thought I should ask again for Lightroom 3.3.
    I tend to travel with my laptop, and I'd like to be able to import and adjust photos on the road. But when I get back home, I'd like to be able to move selected photos (or potentially all of them, including whatever adjustments I've made) over to the server on my home network.
    I gather I can't keep a catalog on the server, so I assume I'll need two Lightroom catalogs on the laptop: one for pictures that I import to the laptop, and another for pictures on the home server -- is that right so far?
    If so, what's the best procedure for moving some/all photos from the "on the laptop catalog" to the "on the server catalog" -- obviously, such that I maintain adjustments?
    Thanks kindly!  -Scott

    Hi TurnstyleNYC,
    Yes, I think we have the same set-up.
    I only need 1 LR-catalog, and that is on the laptop.
    It points to the images wherever they are stored: initially on the laptop; later on I move some of them (once I am fairly done with developing) within LR via drag & drop onto the network storage. Then the catalog on the laptop always knows they are there.
    I can still continue to work on the images on the network storage (slightly slower than on laptop's hard drive) if I still wish to.
    While travelling, I can also work on metadata / keywording, although without access to my home network the images themselves are offline for develop work.
    2 separate catalogs would be very inconvenient, as I would always have to remember whether I had already moved some images. And no collections would be possible that include some images on the laptop and some on the network.
    Remember: a LR catalog is just a database with entries about images and pointers to their storage locations.
    You can open only 1 DB of this sort at a time.
    There is no technical reason to limit the size of a LR catalog - I have read of people with several hundred thousand images in one.
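    If you are curious what "just a database" means in practice: the .lrcat file is a SQLite database, and you can peek at it (always at a copy, never at the catalog Lightroom has open) with a few lines of Python. The table and column names below are only what I found poking around my own catalog and may well differ between LR versions, so treat this purely as a sketch:

        import sqlite3

        # Work on a COPY of the catalog, never the one Lightroom has open.
        conn = sqlite3.connect("copy_of_catalog.lrcat")

        # Table/column names as observed in my own catalog; they may vary by LR version.
        query = """
            SELECT rf.absolutePath, fo.pathFromRoot, fi.idx_filename
            FROM AgLibraryFile AS fi
            JOIN AgLibraryFolder AS fo ON fi.folder = fo.id_local
            JOIN AgLibraryRootFolder AS rf ON fo.rootFolder = rf.id_local
            LIMIT 10
        """
        for root, sub, name in conn.execute(query):
            print(root + sub + name)   # the full path Lightroom has on record for this image
        conn.close()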
    The only really ever growing part on my laptop with this setup is the previews folder "<catalog name> Previews.lrdata". I render standard previews so that I can do most of the work for offline-images while travelling.
    The catalog itself, "<catalog name>.lrcat", grows much more slowly. It is now 630 MB for 60'000+ images, whereas the previews folder is 64 GB.
    So yes, I dedicate quite a chunk of my laptop hard disk to that. I could make the "standard" previews somewhat smaller, fitting the laptop's screen resolution, but then when working at home with a bigger external monitor LR would be re-rendering all the time to make up the size difference, which is why I have set the standard preview size for my external monitor. It may turn out to be the weakness of my setup long-term.
    That is all that is needed in terms of Lightroom setup.
    What you additionally need to cover potential drive failures is not a matter of LR, but of *usual common backup sense*, along the question "what can be recreated after a failure, and at what effort?" Therefore I do not back up the previews, but I do back up very thoroughly the images themselves, the catalog and its backups, and for convenience my LR presets.
    Message was edited by: Cornelia-I: sorry, initially I had written "1:1-previews", but "standard previews" is correct.

  • Add fields in transformations in BI 7 (best practice)?

    Hi Experts,
    I have a question regarding transformation of data in BI 7.0.
    Task:
    Add new fields in a second level DSO, based on some manipulation of first level DSO data. In 3.5 we would have used a start routine to manipulate and append the new fields to the structure.
    Possible solutions:
    1) Add the new fields to first level DSO as well (empty)
    - Pro: Simple, easy to understand
    - Con: Consumes disk space, degrades performance when writing to the first-level DSO
    2) Use routines in the field mapping
    - Pro: Simple
    - Con: Hard to performance optimize (we could of course fill an internal table in the start routine and then read from this to get some performance optimization, but the solution would be more complex).
    3) Update the fields in the End routine
    - Pro: Simple, easy to understand, can be performance optimized
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine).
    Does anybody know what the best practice is? Or do you have any experience regarding what you see as the best solution?
    Thank you in advance,
    Mikael

    Hi Mikael.
    I like the 3rd option and have used it many, many times.  In answer to your question:
    Update the fields in the End routine
    - Pro: Simple, easy to understand, can be performance optimized - Yes, I have read and tested that it works faster.  An OSS consulting note is out there indicating the speed of the end routine.
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine). - Yes, but by using the result package the manipulation can be done easily, as sketched below.
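    To illustrate the pattern in Python-style pseudo-code (the real thing would of course be an ABAP end routine over RESULT_PACKAGE, and all field and function names here are made up): read the lookup data once for the whole package, then enrich every record from that in-memory table.

        # Sketch of the end-routine pattern: one bulk read per package,
        # then per-record enrichment from an in-memory lookup table.
        def end_routine(result_package, read_first_level_dso):
            keys = {rec["doc_number"] for rec in result_package}   # hypothetical key field
            lookup = read_first_level_dso(keys)                    # e.g. {doc_number: source_value}
            for rec in result_package:
                rec["new_field"] = derive_new_field(lookup.get(rec["doc_number"]))
            return result_package

        def derive_new_field(source_value):
            # Placeholder for whatever manipulation of the first-level DSO data is required.
            return source_value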
    Hope it helps.
    Thanks,
    Pom

  • Best Practice Table Creation for Multiple Customers, Weekly/Monthly Sales Data in Multiple Fields

    We have a homegrown Access database originally designed in 2000 that now has an SQL back-end.  The database has not yet been converted to a higher format such as Access 2007, since at least 2 users are still on Access 2003.  It is fine if suggestions
    will only work with Access 2007 or higher.
    I'm trying to determine if our database is the best place to do this or if we should look at another solution.  We have thousands of products each with a single identifier.  There are customers who provide us regular sales reporting for what was
    sold in a given time period -- weekly, monthly, quarterly, yearly time periods being most important.  This reporting may or may not include all of our product identifiers.  The reporting is typically based on calendar-defined timing although we have
    some customers who have their own calendars which may not align to a calendar month or calendar year so recording the time period can be helpful.
    Each customer's sales report can contain anything from 1,000-20,000 rows of products for each report.  Each customer report is different and they typically have between 4-30 columns of data for each product; headers are consistently named.  The
    product identifiers included may vary by customer and even within each report for a customer; the data in the product identifier row changes each week.  Headers include a wide variety of data such as overall on hand, overall on order, unsellable on hand,
    returns, on hand information for each location or customer grouping, sell-through units information for each location or customer grouping for that given time period, sell-through dollars information for each location or customer grouping for that given time
    period,  sell-through units information for each location or customer grouping for a cumulative time period (same thing for dollars), warehouse on hands, warehouse on orders, the customer's unique categorization of our product in their system, the customer's
    current status code for that product, and so on.
    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division and time period).  Due to overall volume of information and number of Excel sheets, cross-referencing can take considerable time.  Is it possible to
    set up tables for our largest customers so I can create queries and pivot tables to more quickly look at sales-related information by category, by specific product(s), by partner, by specific products or
    categories across partners, by specific products or categories across specific weeks/months/years, etc.?  We do have a separate product table, so only the product identifier or a junction table may be needed to pull in additional information from the product table with queries.  We do need to maintain
    the sales reporting information indefinitely.
    I welcome any suggestions, best practice or resources (books, web, etc).
    Many thanks!

    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division and time period).  Due to overall volume of information and number of Excel sheets, cross-referencing can take considerable time.  Is it possible to
    set-up tables .....
    I assume you want to migrate to SQL Server.
    Your best course of action is to hire a professional database designer for a short period like a month.
    Once you have the database, you need to hire a professional DBA to move your current data from Access & Excel into the new SQL Server database.
    Finally you have to hire an SSRS professional to design reports for your company.
    It is also beneficial if the above professionals train your staff while building the new RDBMS.
    Certain senior SQL Server professionals may be able to do all 3 functions in one person: db design, database administration/ETL & business intelligence development (reports).
    Kalman Toth Database & OLAP Architect

  • Best practice to define length for varchar field of table in sql server

    What is the best practice for defining the length of a varchar field in a table,
    e.g. for a field such as "Remarks By Person": varchar(max) or varchar(4000)?
    Could it affect optimization in the future?
    Experts, please reply ...
    Dilip Patil..

    Hi Dilip,
    varchar(n | max) is variable-length, non-Unicode character data. n defines the string length and can be a value from 1 through 8,000. max indicates that the maximum storage size is 2^31-1 bytes (2 GB). The storage size is the actual length of the data entered
    plus 2 bytes (for example, a 7-character remark stored in a varchar(4000) column occupies 7 + 2 = 9 bytes). We generally use varchar when the sizes of the column data entries vary considerably, and if the field data size might exceed 8,000 bytes, we should use varchar(max).
    So the conclusion is, just as Uri said: whether to use varchar(max) or varchar(4000) depends on how many characters we are going to store.
    The following document about varchar in SQL Server is for your reference:
    http://technet.microsoft.com/en-us/library/ms176089.aspx
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Arranging fields in a table-like form: best-Practice-Solution wanted

    Hello Experts,
    I'm wondering if there exists a 'best practice' for how to arrange fields in a table-like form.
    I know about cross-tables, but that's not what we need. Most of the requirements I have come across are simply that certain fields should be put in a certain order in a table-like layout.
    We have tried to do this using the drawing functions (e.g. putting a box around the fields and applying certain border styles), but it often happens that the lines overlap or there are breaks between the lines, so you end up doing a lot of manual fiddling with the 'table'.
    Since this is a requirement I've come upon in many reports, I can't believe that this is supposed to be the best solution.
    I don't understand why there isn't a table-like element in Crystal Reports to use for this, e.g. put a table with x rows and y columns in the header or group header section and then just put the fields in it.
    Many thanks in advance for your help !

    Hi Frank,
    You can use the built-in templates available in the Template Expert.
    Click on the Report menu -> Template Expert.
    Select the desired template (the Table grid template would suit best here) and click OK.
    There is no facility for inserting a table directly, as you noted. You will have to do it manually by using lines and boxes.
    Hope this is helpful.
    Regards

  • Best practice for encrypting data in CRM 2013 (other than the fields it already encrypts)

    I know CRM 2013 can encrypt some values by default, but if I want to store data in custom fields and then encrypt that, what's the best practice?  I'm working on a project to do this through a JavaScript action that, when triggered from a form, would reference
    a web service to decrypt values, plus a plugin to encrypt on Update/Create, but I hoped there might be a simpler or more standard way to do this.
    Thanks.

    At what level are you encrypting?  CRM 2013 supports encrypted databases if you're worried about the data at rest.
    In transit, you should be using SSL to encrypt the entire exchange, not just individual fields.
    You can use field-level security to hide certain fields from end users of a certain type if you're worried about that.  It's even more secure than anything you could do with JS, as the data is never passed over the wire.
    Is there something those don't solve?
    The postings on this site are solely my own and do not represent or constitute Hitachi Solutions' positions, views, strategies or opinions.

  • What is the best practice for package source locations?

    I have several remote servers (about 16) that are being utilized as file servers that have many binaries on them to be used by users and remote site admins for content. Can I have SCCM just use these pre-existing locations as package sources, or is this
    not considered best practice? 
    Or
    Should I create just one package source within close proximity to the Site Server, or on the Site Server itself?
    Thanks

    The primary site server is responsible for grabbing the source data and turning it into packages for distribution points.  So while you can use ANY UNC path as a source location for content, you should be aware of where that content lives relative
    to your primary site server.  If your source content is in Montana but your primary server is in California ... there's going to be a WAN hit ... even if the DP it's destined for is also in Montana.
    Second, I strongly recommend locking down your source UNC path so that only the servers and SCCM admins can access it.  This will prevent side-loading of content  as well as any "accidental changing" of folder structure that could cause
    your applications/packages to go crazy.
    Put the two together and I typically recommend you create a DSL (distributed source library) share and slowly migrate all your content into it as you create your packages/applications.  You can then safely create batch installers, manage content versions,
    and other things without fear of someone running something out of context.

  • Best practice for "Quantity" field in Asset Master

    Hi
    I want to know what the best practice is for the "Quantity" field in the asset master: should it be made display-only or a required field in asset master creation?
    Initially I made this field a required entry, so the user entered a quantity of 1. At the time of posting F-90 he entered a quantity again, so the quantity in the asset master got increased. Hence I decided to make that field display-only in asset master creation.
    Now that I have made the field display-only in asset master creation, the quantity field does not appear at all at the time of posting F-90. I checked my field status group for the posting key as well as the GL account; it is an optional field there. In spite of that, the user is able to make the entry in F-90, and now the quantity field stays '0' in the asset master even though there is some value in the asset.
    Please advise what the best practice is with respect to the quantity field: should it be open in the asset master, or should it be display-only?

    Hi:
    SAP standard practice does not recommend updating the quantity field in asset master data. Just leave the Qty field blank and mention the Unit of Measure as EA. When you post an acquisition through F-90 or MIGO, this field will get updated in the asset master data automatically. Hope this will help you.
    Regards

  • Best Practice for report output of CRM Notes field data

    My company has a requirement to produce a report with variable output, based upon a keyword search of our CRM Request Notes data.  Example: the business wants a report returning all Service Requests where the Notes field contains the word "pay" or "payee" or "payment".  As part of the report output, the business wants to freely select the output fields meant to accompany the notes data.  Can anyone please advise on SAP's best practice for meeting a report requirement such as this?  Is a custom ABAP application built?  Does data get moved to BW for reporting (and how are notes handled)?  Is data moved to a separate system?

    Hi David,
    I would leave your query
    "Am I doing something wrong and did I miss something that would prevent this problem?"
    to the experts/ gurus out here on this forum.
    From my end, you can follow
    TOP 10 EXCEL TIPS FOR SUCCESS
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/204c3259-edb2-2b10-4a84-a754c9e1aea8
    Please follow the Xcelsius Best Practices at
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a084a11c-6564-2b10-79ac-cc1eb3f017ac
    In order to reduce the size of xlf and swf files follow
    http://myxcelsius.com/2009/03/18/reduce-the-size-of-your-xlf-and-swf-files/
    Hope this helps to certain extent.
    Regards
    Nikhil

  • Best practice for linking fields from multiple entity objects

    I am currently transitioning from PHP to ADF. I'm looking for the best practice for linking data from multiple entity objects.
    Example:
    EO 'REQUESTS' has fields: req_id, name, dt, his_stat_id, her_stat_id
    EO 'STATUSES' has fields: stat_id, short_txt_descr
    'REQUESTS' is linked to EO 'STATUSES' on: STATUSES.stat_id = REQUESTS.his_status_id
    'REQUESTS' is also linked to EO 'STATUSES' on: STATUSES.stat_id = REQUESTS.her_status_id
    REQUESTS.his_status_id is independent of REQUESTS.her_status_id
    When I create a VO for REQUESTS, I want to display: REQUESTS.name, REQUESTS.dt, STATUSES.short_txt_descr (for his_stat_id), STATUS.short_txt_descr (for her_stat_id)
    What is the best practice for accomplishing this? It appears I could do it a few different ways:
    1. Create the REQUESTS VO with a LOV for his_stat_id and her_stat_id
    2. Create the REQUESTS VO with the join to STATUSES performed within the query for the VO. This would require joining on the STATUSES EO twice (his_stat_id, her_stat_id)
    3. I just started reading about View Links - would that somehow do what I'm looking for?
    I also need to be able to update his_status_id and her_status_id by selecting a STATUSES.short_txt_descr from a dropdown.
    Any suggestions on how to approach such a stupidly simple task?
    Using jDeveloper 11.1.2.2.0 if that makes a difference in the solution.
    Thanks ahead of time,
    CJ

    CJ,
    I vote for solution 1, as it matches your use case exactly. As you said, you want to update his_status_id and her_status_id by selecting a STATUSES.short_txt_descr from a drop-down. This is exactly the LOV solution.
    View Links are used for master-detail navigation (which you don't do here), and joining the data makes it difficult to update (and you would still need an LOV for the drop-down box).
    Timo
