Best practice to develop & keyword JPEG files - Use of virtual copies

There is no sidecar file attached to a JPEG, and I understand that every modification saved to the JPEG slightly deteriorates the file quality.
Therefore, is it advisable to use virtual copies for all modifications?
Would you use them for a trial-and-error procedure, then apply whatever settings are retained to the JPEG file?
Or would it be better to leave the original JPEG untouched altogether and stick with the virtual copy once modified?
What about keywords applied at several stages, before and after using the Develop module?
Any advice would be much appreciated. Thanks in advance.

Would you advise storing the metadata in the file itself IN ADDITION to where it is stored in the catalog?
How to proceed?
Would it make a difference in the export process?
Storing metadata in the file itself is a good idea if you ever think you might share the photo with others, or with another application (other than Lightroom). And I think almost everyone will be doing one of these things at some point.
You would want to turn on the option that automatically stores the metadata in XMP: Edit > Catalog Settings > Metadata, then check the box that says Automatically Write Changes into XMP.
Would it make a difference in the export process? Probably unnoticeable fractions of a second.

Similar Messages

  • Best practice when developing APEX apps and using a SVN repository

    Hi experts,
    I wanted to get your opinion on best practice regarding how to use SVN and APEX combined.
    The idea is basically how to structure and how to save APEX apps the best way in a repository.
    I am currently working with a custom SVN structure, not using the default trunk/tags one: every app has a folder, under every app folder I have page-number folders, and for each page the reports, regions and global objects are kept separately.
    This helps me because it's more readable than saving the whole page export, it's good for small changes, and I have a clear overview of every bit and piece.
    What is everybody else using, or is there a best practice to follow here that I don't know?
    Kind regards,
    Alex

    @tomaugerdotcom
    Something like this might help: https://testflightapp.com/
    Conceivably, you could roll your own internal service if that particular one doesn't suit you. (I don't have any knowledge about how they are doing it, but it shouldn't be hard to figure out since Apple's constraining rules would only allow a few possibilities.)
    USB app install and debugging isn't supported on iOS. You have to use wireless.
    Another option specifically for multi-touch dev/testing, is to use an Android device.

  • What is the best practice in securing deployed source files

    hi guys,
    Just yesterday, I developed a simple image cropper using Ajax and Flash. After compiling the package,
    I noticed the package/installer delivers the exact same source files as developed to the install folder.
    This didn't concern me much at first, but come to think of it, one question keeps coming back to me:
    "What is the best practice in securing deployed source files?"
    How do we secure an application's installed source files from being tampered with, especially after
    the application has been installed? E.g. the spraydata.js file can easily be modified with an editor.

    Hi,
    You could compute a SHA or MD5 hash of your source files on
    first run and save these hashes to EncryptedLocalStore.
    On startup, recompute and verify. (This, of course, fails to
    address when the main app's swf / swc / html itself is
    decompiled)
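    To make the hash-and-verify idea above concrete, here is a minimal Java sketch. It is an illustration only: the original advice targets AIR's EncryptedLocalStore, so the plain properties file, the file names, and the hash algorithm choice here are all stand-ins for that secure store.

    ```java
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.security.MessageDigest;
    import java.util.List;
    import java.util.Properties;

    // Sketch of the hash-and-verify idea: record baseline hashes on first run,
    // recompute and compare on every startup. In AIR the baseline would live in
    // EncryptedLocalStore; a plain properties file stands in for it here.
    public class IntegrityCheck {

        // Compute a SHA-256 hash of one file, hex-encoded.
        static String sha256(Path file) throws Exception {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] digest = md.digest(Files.readAllBytes(file));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) hex.append(String.format("%02x", b));
            return hex.toString();
        }

        public static void main(String[] args) throws Exception {
            // Hypothetical deployed resources to protect (e.g. spraydata.js).
            List<Path> resources = List.of(Path.of("spraydata.js"));
            Path baseline = Path.of("hashes.properties");

            Properties stored = new Properties();
            if (Files.exists(baseline)) {
                try (InputStream in = Files.newInputStream(baseline)) { stored.load(in); }
                for (Path p : resources) {
                    String expected = stored.getProperty(p.toString());
                    if (!sha256(p).equals(expected)) {
                        throw new IllegalStateException("Tampering detected: " + p);
                    }
                }
            } else {
                // First run: record the baseline hashes.
                for (Path p : resources) stored.setProperty(p.toString(), sha256(p));
                try (OutputStream out = Files.newOutputStream(baseline)) { stored.store(out, "baseline"); }
            }
        }
    }
    ```

    As the reply notes, this only raises the bar: if the verifying code itself is decompiled or patched, the check can be removed.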

  • Best practice to develop the news web part to retrieve news data across the farms

    Hi,
    We have developed a News web part. Its functionality: it pulls the news from other SharePoint farms and displays it in the site. Currently we are using the Secure Store Service to connect to a different farm to retrieve the news from that farm.
    The issue with this approach is that every time a user hits the page, it hits the Secure Store Service. There are almost 80K users who will be using this web part. Moreover, this web part connects to multiple sites for news from different divisions.
    What would be the best practice to develop such a web part without consuming server resources and hitting the server repeatedly, given that the news does not change very often?
    Regards
    Rajaniesh

    Hi,
    According to your description, my understanding is that you want to know the best way to handle retrieving data across SharePoint farms.
    If you are developing a custom web part to retrieve data from another farm, I suggest you first create a custom timer job to get the data hourly in the backend and store it in a list. Then you can create a web part that links to the list to display
    the data.
    For accessing data across farms, I suggest creating a custom web service.
    Also, you can use AJAX to load the web part data asynchronously. It will improve performance and reduce the load on the server.
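    A rough, language-agnostic sketch of the recommended pattern follows. In real SharePoint development the background job would typically be a C# timer job (SPJobDefinition) writing into a list; the Java below only illustrates the architecture: fetch on a schedule into a cache, and let page requests read the cache instead of hitting the remote farm.

    ```java
    import java.util.List;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicReference;

    // Pattern sketch: a background job refreshes the news hourly, and page
    // requests only ever read the locally cached copy, so they never hit the
    // Secure Store / remote farm directly.
    public class CachedNewsFeed {

        private final AtomicReference<List<String>> cache =
                new AtomicReference<>(List.of());
        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        // Placeholder for the expensive cross-farm call.
        private List<String> fetchNewsFromRemoteFarm() {
            return List.of("Headline 1", "Headline 2");
        }

        public void start() {
            // Refresh hourly; the SharePoint equivalent is a timer job
            // writing into a list that the web part reads.
            scheduler.scheduleAtFixedRate(
                    () -> cache.set(fetchNewsFromRemoteFarm()),
                    0, 1, TimeUnit.HOURS);
        }

        // What the page (or an AJAX endpoint) reads on every request.
        public List<String> currentNews() {
            return cache.get();
        }
    }
    ```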
    Here are some detailed articles for your reference:
    Create and Deploy Custom Timer Job Definition in SharePoint Programatically
    Creating a Custom ASP.NET Web Service
    Create asynchronous web parts for Sharepoint
    Thanks
    Best Regards
    Forum Support
    Jerry Guo
    TechNet Community Support

  • Best practice to monitor 10gR3 OSB performance using JMX API?

    Hi guys,
    I need some advice on the best practice to monitor 10gR3 OSB performance using JMX API.
    Just to show I have done my homework, I managed to get the JMX sample code from
    http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/jmx_monitoring/example.html#wp1109828
    working.
    The following is the list of options I am thinking about:
    * Set up: I have a cluster of 1 admin server with 2 managed servers; each managed server runs an instance of OSB
    * What I try to achieve:
    - use the JMX API to collect OSB stats data periodically, as in the sample code above, then save the data as a record to a database table
    Options/ideas:
    1. Simplest approach: run a modified version of the JMX sample on the Admin Server to save stats data to the database
    regularly. I can't see problems with this one ...
    2. Use WLI to schedule the Task of collecting stats data regularly. May be overkill if option 1 above is good for production
    3. Deploy a simple web app on Admin Server, say a simple servlet that displays a simple page to start/stop and configure
    data collection interval for the timer
    What approach would you experts recommend?
    BTW, the caveats of using JMX in http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/jmx_monitoring/concepts.html#wp1095673
    say
         Oracle strongly discourages using this API in a concurrent manner with more than one thread or process. This is because a reset performed in
         one thread or process is not visible to another threads or processes. This caveat also applies to resets performed from the Monitoring Dashboard of
         the Oracle Service Bus Console, as such resets are not visible to this API.
    Under what scenario would I be breaking this rule? I am a little worried about its statement
         discourages using this API in a concurrent manner with more than one thread or process
    Thanks in advance,
    Sam
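    For reference, a rough Java sketch of option 1 (poll stats over JMX on a schedule and persist each sample with JDBC) might look like the following. The service URL, credentials, table name and the metric being read are placeholders; the real OSB sample works against the ServiceDomainMBean described in the linked documentation.

    ```java
    import javax.management.MBeanServerConnection;
    import javax.management.remote.JMXConnector;
    import javax.management.remote.JMXConnectorFactory;
    import javax.management.remote.JMXServiceURL;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Timestamp;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Sketch of periodic JMX collection persisted via JDBC.
    public class OsbStatsCollector {

        public static void main(String[] args) throws Exception {
            Map<String, Object> env = new HashMap<>();
            env.put(JMXConnector.CREDENTIALS, new String[] {"weblogic", "password"}); // placeholder

            JMXServiceURL url = new JMXServiceURL(
                    "service:jmx:rmi:///jndi/rmi://adminhost:9999/jmxrmi"); // placeholder URL
            JMXConnector connector = JMXConnectorFactory.connect(url, env);
            MBeanServerConnection conn = connector.getMBeanServerConnection();

            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            scheduler.scheduleAtFixedRate(() -> {
                try (Connection db = DriverManager.getConnection(
                        "jdbc:oracle:thin:@dbhost:1521:orcl", "stats", "stats")) {
                    // A real collector would walk the OSB service statistics here;
                    // the MBean count is only a stand-in metric for the sketch.
                    Integer metric = conn.getMBeanCount();
                    try (PreparedStatement ps = db.prepareStatement(
                            "insert into osb_stats (sample_time, metric) values (?, ?)")) {
                        ps.setTimestamp(1, new Timestamp(System.currentTimeMillis()));
                        ps.setInt(2, metric);
                        ps.executeUpdate();
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }, 0, 1, TimeUnit.HOURS);
        }
    }
    ```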

    Hi Manoj,
    Thanks for getting back. I am afraid configuring the aggregation interval from the Dashboard doesn't solve the problem, as I need to collect stats data per endpoint URI on an hourly or daily basis, then output it to CSV files so line graphs can be drawn for chosen applications.
    Just for those who may be interested. It's not possible to use SQL to query database tables to extract OSB stats for a specified time period, say 9am - 5pm. I raised a support case already and the response I got back is 'No'.
    That means using JMX API will be the way to go :)
    Has anyone actually done this kind of OSB stats report and care to give some pointers?
    I am thinking of using 7 days or 1 day as the aggregation interval set in the Dashboard of the OSB admin console, then collecting stats data hourly using JMX (as described in the previous link) with the WebLogic Server JMX Timer Service as described in
    http://download.oracle.com/docs/cd/E12840_01/wls/docs103/jmxinst/timer.html instead of Java's Timer class.
    Not sure if this is the best practice.
    Thanks,
    Regards,
    Sam
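    For illustration, a hedged sketch of the timer-MBean idea using the standard javax.management.timer.Timer (the WebLogic Timer Service in the linked document is a similar idea hosted inside the server; notification type and period here are made up):

    ```java
    import javax.management.Notification;
    import javax.management.NotificationListener;
    import javax.management.timer.Timer;
    import java.util.Date;

    // The timer MBean emits notifications on a schedule and a listener
    // performs the hourly stats collection / CSV output.
    public class HourlyCollection {

        public static void main(String[] args) {
            Timer timer = new Timer();

            NotificationListener listener = (Notification n, Object handback) -> {
                // This is where the JMX stats collection and CSV output would go.
                System.out.println("Collect OSB stats at " + new Date());
            };
            timer.addNotificationListener(listener, null, null);

            // Fire once an hour, starting now.
            timer.addNotification("osb.stats.collect", "hourly collection", null,
                    new Date(), Timer.ONE_HOUR);
            timer.start();
        }
    }
    ```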

  • Best practice for checking out a file

    Hello,
    What is the SAP best practice to check out a file? (DTR -> Edit or DTR -> Edit Exclusive?)
    What are pros and cons of checking out a file exclusively?
    Thanks
    MLS

    Thanks Pascal.
    Also, I think if a developer checks out exclusively, makes changes and leaves the company without checking in, the only way is to revert those files, in which case all of his changes will be gone.

  • What's the best practice to manage the page file?

    We have one Hyper-V server running Windows Server 2012 R2 with 128 GB RAM and 2 drives (C and D). It is set up with "Automatically manage paging file size for all drives". What's the best practice to manage the page file?
    Bob Lin

    For Hyper-V systems, my general recommendation is to set the page file to 1-4 GB. This allows for a mini-dump should something happen. 99.99% of the time, Microsoft will be able to figure out the cause of the problem from the mini-dump. It does not make
    sense on a Hyper-V system to set aside enough space to capture all the memory on the system because only a very small portion of that memory is used by the parent partition. Most of the memory is under control of the individual VMs.
    Yes, I had one of the Hyper-V product group tell me that I should let Windows manage it.  A couple of times I saw space on my system disk disappear because the algorithm decided it wanted all the space for the page file.  Made it so I couldn't
    patch my systems.  Went back in and set the page file to 1-4 GB and have not had any issues since.
    . : | : . : | : . tim

  • What is the best practice for developing web service?

    Hi All,
    I'm a newbie to web services...
    I was wondering what would be the best approach in developing a web service,
    using tools or programmatic approach?
    If I use WebLogic Workshop, am I tied to a vendor?
    Is it possible for me to develop web services using workshop and deploy in
    another app server..?
    I would appreciate if somebody could give me a pointer to start.
    I have already referred to BEA's docs.
    I'm still confused about a good starting point and the best approach to develop
    portable web services.
    Thanks in advance for any inputs.
    K K

    K K-
    You have a very valid point on whether tools simplify or complicate matters. If you are
    going for clean and not-so-time-centric code, then there are several different
    programs and packages out there you can choose from.
    Since you specialize in J2EE, the Sun package may be what you are looking
    for. BEA's classes simplify much of the work you will be doing, but you could
    emulate their classes or extend yours above the functions provided in theirs.
    It all boils down to how much work you are willing to do.
    If you are asking for more detailed coding 'Design Patterns' to utilize, I would
    wait for a few more posts from other folks as my work often requires me to utilize
    the tools provided.
    Sincerely,
    Eric Ballou
    "K K" <[email protected]> wrote:
    Eric,
    Thanks for the response.
    I was also looking at Sun's WSDP 1.1, which is a more programmatic approach.
    Somehow, I feel that being a J2EE developer, I should go in the direction
    of the programmatic approach.
    Using the tools could simplify or complicate things. Also, the Workshop
    samples import all weblogic specific packages.
    My code looks so dirty with many vendor specific packages being imported.
    Could you give me your suggestions for a clean and neat approach?
    I would personally prefer to avoid the quick and dirty approach.
    Thanks again.
    "Eric Ballou" <[email protected]> wrote in message
    news:[email protected]...
    K K-
    The best approach in developing portable web services is knowing what you are
    planning on using them for, as well as how much is willing to be spent, etc.
    BEA's Workshop is portable to other frameworks, but the ease of integrating a
    developed client or a developed server can vary greatly. Even more of an issue
    is migration from one framework to another. If you choose to develop in Workshop
    and your company later deploys .Net solutions, some of your work may have to be
    redone unless the company is willing to keep portions of the 'old' system around
    until new versions of the service are available. However, Workshop has several
    ant tools available that would assist you in deploying to other app servers or
    even a stand-alone application should you need cross-framework abilities.
    If you are just starting out in web services, http://www.webservices.org is a
    good place to start checking out vendors in the space.
    Sincerely,
    Eric Ballou
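    For a sense of what the programmatic, vendor-neutral route eventually looked like in standard Java, here is a small hedged sketch using JAX-WS annotations. It postdates the Workshop/WSDP-era tooling discussed in this thread, and the class and endpoint names are made up; the point is that an annotated POJO deploys to any compliant container without vendor-specific imports.

    ```java
    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    // A vendor-neutral web service using JAX-WS annotations, the standard that
    // eventually superseded JAX-RPC-era tooling such as Workshop and Sun's WSDP.
    @WebService
    public class GreetingService {

        @WebMethod
        public String greet(String name) {
            return "Hello, " + name;
        }

        public static void main(String[] args) {
            // Publish a standalone endpoint for quick local testing;
            // in production the same class would be deployed to an app server.
            Endpoint.publish("http://localhost:8080/greeting", new GreetingService());
        }
    }
    ```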

  • Best practice for development using REST API - OData

    Hi All, I am new to REST. I am a developer who works mostly in server-side code using Visual Studio. Now that Microsoft is advocating writing code using the REST API instead of server-side code or the client-side object model, I am trying to use the REST API.
    I googled, and most of the examples show how to write code and put it in a Content Editor/Script Editor web part. How do I organize code and deploy it to staging/production in this scenario? Is there any best practice or example around this?
    Regards,
    Khushi

    If you are writing code in aspx or cs, it does not mean that you need to deploy it on the SharePoint server; it could be any other application running on a remote server. What I mean is that you can use C# and the REST API to connect to the SharePoint server.
    REST API in SharePoint 2013 provides the developers with a simple standardized method of retrieving information from SharePoint and it can be used from any technology that is capable of sending standard HTTP requests.
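    To illustrate the point that any HTTP-capable technology can call it, here is a hedged Java sketch of a SharePoint 2013 REST call. The site URL is a placeholder, and a real call would also need authentication (e.g. NTLM/Kerberos on premises or OAuth online), which is omitted here.

    ```java
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Plain HTTP GET against the SharePoint 2013 REST endpoint for site lists.
    public class SharePointRestClient {

        public static void main(String[] args) throws Exception {
            URL url = new URL("https://server/sites/dev/_api/web/lists"); // placeholder site
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            // Ask SharePoint 2013 for JSON instead of the default ATOM/XML.
            conn.setRequestProperty("Accept", "application/json;odata=verbose");

            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }
    ```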
    Refer to the following links that provide more details comparing the major features of these programming choices:
    http://msdn.microsoft.com/en-us/library/jj164060.aspx#RESTODataA
    http://dlr2008.wordpress.com/2013/10/31/sharepoint-2013-rest-api-the-c-connection-part-1-using-system-net-http-httpclient/
    Hope this helps
    --Cheers

  • Best Practices:: How to generate XML file from a ResultSet

    Hi all,
    Could someone please suggest best practices for generating an XML file from a ResultSet? I am developing a web application in Java with an Oracle database, and one of my tasks is to generate an XML file when the user, for example, clicks a "download as XML" button on the JSP. The application is basically an Order with line items. I am using Struts, and my first thought has been to have an action class which extends Struts's DownloadAction and uses StAX's Iterator API to create an XML file. I intend to have a POJO with properties for all columns of my order and line items tables, so that for each order I get all line items and:
    1. Write order details then
    2. Through an iterator write line items of that order to an XML file.
    I will greatly appreciate for comments or suggestions on the best way to do this through any pointers on the Web.
    alex
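    A rough sketch of streaming a result set to XML with StAX follows. It uses the cursor API (XMLStreamWriter), which is a little simpler than the event/iterator API mentioned above, and the table and column names are made up for illustration; in the Struts DownloadAction scenario the writer would wrap the response stream instead of a StringWriter.

    ```java
    import javax.xml.stream.XMLOutputFactory;
    import javax.xml.stream.XMLStreamWriter;
    import java.io.StringWriter;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Stream order line items from a JDBC result set into an XML document.
    public class OrderXmlWriter {

        public static String ordersToXml(Connection db) throws Exception {
            StringWriter out = new StringWriter();
            XMLStreamWriter xml = XMLOutputFactory.newFactory().createXMLStreamWriter(out);

            xml.writeStartDocument();
            xml.writeStartElement("orders");

            try (PreparedStatement ps = db.prepareStatement(
                    "select o.order_id, i.item_name, i.quantity " +
                    "from orders o join line_items i on i.order_id = o.order_id " +
                    "order by o.order_id");
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    xml.writeStartElement("lineItem");
                    xml.writeAttribute("orderId", rs.getString("order_id"));
                    xml.writeStartElement("name");
                    xml.writeCharacters(rs.getString("item_name"));
                    xml.writeEndElement();
                    xml.writeStartElement("quantity");
                    xml.writeCharacters(rs.getString("quantity"));
                    xml.writeEndElement();
                    xml.writeEndElement(); // lineItem
                }
            }

            xml.writeEndElement(); // orders
            xml.writeEndDocument();
            xml.close();
            return out.toString();
        }
    }
    ```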

    Use an OracleWebRowSet, from which an XML representation of the result set may be obtained.
    http://www.oracle.com/technology/sample_code/tech/java/sqlj_jdbc/files/oracle10g/webrowset/Readme.html
    http://download.oracle.com/docs/cd/B28359_01/java.111/b31224/jcrowset.htm
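    For comparison, the standard javax.sql.rowset.WebRowSet offers the same writeXml idea (OracleWebRowSet is Oracle's implementation of it). A minimal sketch, with placeholder connection details and query:

    ```java
    import javax.sql.rowset.RowSetProvider;
    import javax.sql.rowset.WebRowSet;
    import java.io.FileWriter;

    // The rowset runs the query itself and writes its own XML representation.
    public class WebRowSetExport {

        public static void main(String[] args) throws Exception {
            try (WebRowSet wrs = RowSetProvider.newFactory().createWebRowSet();
                 FileWriter out = new FileWriter("orders.xml")) {
                wrs.setUrl("jdbc:oracle:thin:@dbhost:1521:orcl"); // placeholder
                wrs.setUsername("scott");
                wrs.setPassword("tiger");
                wrs.setCommand("select * from orders");
                wrs.execute();
                wrs.writeXml(out); // entire result set as WebRowSet XML
            }
        }
    }
    ```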

  • Best practices for development / production environments

    Our current scenario:
    We have one production database server containing the APEX development install, plus all production data.
    We have one development server that is cloned nightly (via RMAN duplicate) from production. It therefore also contains a full APEX development environment, and all our production data, albeit 1 day old.
    Our desired scenario:
    We want to convert the production database to a runtime only environment.
    We want to be able to develop in the test environment, but since this is an RMAN-duplicated database, every night the runtime-only APEX will overlay it, and the production versions of the apps will overlay the development versions. However, we still want to have up-to-date data against which to develop.
    Questions: What is best practice for this sort of thing? We've considered a couple options:
    1.) Find a way to clone the database (RMAN or something else) that will leave the existing APEX environment intact. If that is doable, we can modify our nightly refresh procedure to refresh the data, but not APEX.
    2.) Move apex (in both prod and dev environments) to a separate database containing only APEX, and use DBLINKS to point to the data in both cases. The nightly refresh would only refresh the data and the APEX database would be unaffected. This would require rewriting all apps to use DBLINKS though, as well as requiring a change to the code when moving to production (i.e. modify the DBLINK to the production value)
    3.) Require the developers to export their apps when done for the day, and reimport the following morning. This would leave the RMAN duplication process unchanged, and would add a manual step which the developers loathe.
    We basically have two mutually exclusive requirements - refresh the database nightly for the sake of fresh data, but don't refresh the database ever for the sake of the APEX environment.
    Again, any suggestions on best practices would be helpful.
    Thanks,
    Bill Johnson

    Bill,
    To clarify, you do have the ability to export/import, happily, at the application level. The issue is that if you have an application that consists of more than a couple of pages, you will find yourself in a situation where changes to page 3 are tested and ready, but changes to pages 2, 5 and 6 are still in various stages of development. You will need to get the change for page 3 in to resolve a critical production issue. How do you do this without sending pages 2, 5 and 6 in their current state if you have to move the application all at once? The issue is that you absolutely are going to need to version control at the page level, not at the application level.
    Moreover, the only supported way of exporting items is via the GUI. While practically everyone doing serious APEX development has gone on to either PL/SQL or Utility hacks, Oracle still will not release a supported method for doing this. I have no idea why this would be...maybe one of the developers would care to comment on the matter. Obviously, if you want to automate, you will have to accept this caveat.
    As to which version of the backend source control tool you use, the short answer is that it really doesn't matter. As far as the VC system is concerned, your APEX exports are simply files. Some versioning systems allow promotion of code through various SDLC stages. I am not sure about Git in particular but, if it doesn't support this directly, you could always mimic the behavior with multiple repositories. That is, create a development repository into which you automatically commit exports every night. Whenever particular changes are promoted to production, you can at that time export from the development repository and import into the production one. You could, of course, create as many of these "stops" as necessary to mirror your shop's SDLC stages, e.g. dev, qa, integration, staging, production etc.
    -Joe
    Edited by: Joe Upshaw on Feb 5, 2013 10:31 AM

  • Best practice for developing with CRM 2013 (On Premises)

    Hello all.  I'm just starting to work with CRM, and I have some questions that hopefully will be simple for the seasoned developers.  It's mostly just some best practice or general how-to questions for the group.
    - When creating a new Visual Studio CRM Project I can connect to my CRM Instance and create new WebResources which deploy to the CRM instance just fine, but how can I pull all the existing items that are in the CRM Solution into the Visual Studio CRM project?
     Or do I need to export the solution to a ZIP, expand it with SolutionPackager.exe, then copy these into my Visual Studio project to get it into sync?
    - When multiple developers are working on changes is it best to keep everything in a Visual Studio project as I mentioned above, or is it better for everyone to have their own instance of CRM to code with so they can Export/Import solutions as needed then
    these solutions be manually merged before moving into a common Test/QA environment?
    - When modifying the submenu on a CRM form, is it suggested to use Ribbon Workbench, or is it better/easier to just export the solution, expand it with SolutionPackager.exe, modify ribbondiff and anything else required for the change, package it
    back up, then reimport to CRM? I've heard from some that Ribbon Workbench has some limitations, but being green I wasn't sure what those limitations might be or if it'd be best to just make these changes manually. Or is there any way to keep a copy
    of ribbondiff in Visual Studio and deploy it without having to repackage the solution and import the ZIP?
    I think that's it for now :) Thanks for any advice or suggestions. I really want to start learning the ins and outs of CRM and how all the pieces fit together. Also, can someone direct me to some documentation or books that might give
    more insight on developing for CRM 2013 or 2015 (moving to this soon)?
    Thanks for your time.

    Hi Sam
    Also interested in best practice around this area - especially recommended development routes, unit testing, continuous integration etc - it would be great if you posted here if you find any good articles etc. At the moment we tend to just push changes
    onto a live system as and when appropriate and I'd prefer to move away from that...
    Thanks
    Stuart

  • Best Practice Report development

    Hi All,
    I am working on a blueprint for a BI 7.0 project and I am searching for some information regarding best practices for the development of queries in the three environments within SAP: development, quality and production. I was wondering if there is some documentation in which SAP explains the best way to use the different environments. So what are the important things to keep in consideration when building queries?
    So what are the do's, and what are the absolute don'ts?
    Please advise me if you know where I can find information regarding this topic.
    Thanks in advance!
    Kindest regards,
    Jens

    Hello Jens!
    Just some experiences from me. Maybe they will help.
    - Differentiate between standard reports and ad hoc reports. Standard reports are built in dev by the BW team and transported to prod. Ad hoc reports can be built in all systems by BW power users. Standard reports should only be changeable in dev.
    - Dev should only be for developers. Keep test data in your quality system so that users can test changes there.
    Best regards,
    Peter

  • Best Practices for image layout and positioning using anchored frames in Framemaker 10

    Hi,
    I'm looking for the best practices in how to layout my images in Framemaker 10 so that they translate correctly to Robohelp 9. I currently have images inside of Anchored frames that "Run into" the right side of my text. I've adjusted the size of the anchored frame so that my text flows correctly around the image. Everything looks good in Framemaker! Yeah! The problem is that when I link my Framemaker document to Robohelp, the text does not flow around my image in the same manner. On a couple of Robohelp screens the image is running into the footer. I'm wondering if I should be using tables in Framemaker in order to get the page layout that I'm looking for. Also, I went back and forth...is this a Framemaker question or is this a Robohelp question. Any assistance would be greatly appreciated.

    I wish I could offer some advice, rather than simply adding to your question. I think there is something odd that happens with anchored frames.
    I have been going through fits trying to figure out why the placement of my graphics shifts when I run them through Frame 10-to-RoboHelp 9 conversion. The placement in our books is flush left to the body text area. However, when they convert, some are flush left, some are centered, and a very few are flush right. My boss is very unhappy with me, as I have been unable to figure out why this is happening. I didn't create these files, so I don't know if it's something that goes back to how the graphics were initially imported. But I can't figure out why everything looks right in Framemaker, with the frame attached to an anchor tag, etc. but the placement goes hinky when they convert.
    Any insights are appreciated. I'm wondering if it's going to come down to deleting them and recreating the graphic frame.

  • Best practice to develop internet app integrating with backend R/3 modules

    While we wait to upgrade from R/3 4.6c, going forward we want to stop investing in ITS Flow Logic applications.
    What is the best practice around using backend RFCs/BAPIs to expose SAP functionality as a web application that is accessible on the internet? One thought is to use WAS 6.4, utilizing JRA to call RFCs from JSPs/servlets; another is to use Web Dynpro based development. Will appreciate some architecture advice alongside - especially if we also wanted internet surfers to set up user accounts. Thanks!

    Hi Vito,
    I have the same situation as you, and as some of the guys mentioned above as well. I have Portal-only users and also users who use the SAP GUI.
    Thus, what I would advise, taking audit considerations into account as well, is to have the scenarios below:
    1) Users who login to backend with SAP GUI on Citrix only
    We have changed the system parameter: login/password_change_for_SSO=2
    The password change dialog box appears and the password must be changed (input: old and new password). Also, we have set up SNC (CyberSafe) so that in our SAP GUI, users can click on the system with SNC set up and log in to the backend without having to enter a user ID and password.
    2) Users who login to backend with SAP GUI on client (local)
    Users will login with userID and password
    3) Portal users with SSO and no login to the backend with SAP GUI
    Portal users will have their password deactivated.
    Explaination to Audit for Portal users:
    We have a 90-day password reset on Windows (AD). So our Portal users are respecting the audit requirement of a 90-day password reset, but instead of having it in SAP, it's in Windows. Furthermore, SSO is set up such that the connection for these Portal users to the backend is secure.
    We are not able to set login/password_change_for_SSO=3 as we have sites which do not use Citrix. Thus, these sites will have a local SAP GUI install.
    Hope this shares some of my experience with those who are in the situation I was in.
    Ray
