Script request for exporting, importing, and inserting entries into Oracle 9i LDAP

Does anyone have scripts for exporting, importing, and inserting entries into an Oracle 9i LDAP server? Thanks for any help.

You can use the ldapsearch utility to generate LDIF files. Perform a search on the node you want to export and specify LDIF output with the -L ldapsearch option. Once you have the LDIF file, you can import it into any LDAPv3-compliant server such as OID. For OID, use the ldapadd/ldapmodify utilities to import the data.
These utilities are located under IAS_HOME/bin or ORACLE_HOME/bin.
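A minimal sketch of that round trip, with placeholder host, credentials, and DNs (all of them are assumptions; adjust for your OID instance). The live ldapsearch/ldapadd calls are left commented out because they need a running directory server; the LDIF fragment shows the file format they exchange:

```shell
# 1) Export a subtree to LDIF (-L selects LDIF output, -b sets the search base):
#      ldapsearch -h oidhost -p 389 -D cn=orcladmin -w <password> \
#        -b "cn=users,dc=example,dc=com" -s sub -L "objectclass=*" > export.ldif
# 2) A minimal LDIF entry of the kind such an export produces:
cat > entry.ldif <<'EOF'
dn: cn=jdoe,cn=users,dc=example,dc=com
objectclass: person
cn: jdoe
sn: Doe
EOF
# 3) Import the LDIF into OID (or any other LDAPv3-compliant server):
#      ldapadd -h oidhost -p 389 -D cn=orcladmin -w <password> -f entry.ldif
echo "entries in file: $(grep -c '^dn:' entry.ldif)"
```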

Similar Messages

  • Regarding Distribution Monitor for export/import

    Hi,
We are planning to migrate a 1.2 TB database from Oracle 10.2 to MaxDB 7.7 and are currently testing the migration on a test system. First we tried a simple export/import, i.e. without the Distribution Monitor: we were able to export the database in 16 hours, but the import had been running for more than 88 hours, so we aborted it. We then found that the Distribution Monitor can distribute the export/import load across multiple systems so that the import completes within a reasonable time. Using two application servers for the export/import, the export completed within 14 hours, but the import again ran for more than 80 hours and we aborted it. We also split the big tables, but with no luck. Eight parallel processes were running on each server, i.e. one CI and two app servers. We followed the DistributionMonitorUserGuide document from SAP. I observed that on the central system CPU and memory utilization was above 94%, but on the two application servers we added it was very low, around 10%. Please find the system configuration below:
    Central Instance - 8CPU (550Mhz) 32GB RAM
    App Server1 - 8CPU (550Mhz) 16GB RAM
    App Server2 - 8CPU (550Mhz) 16GB RAM
Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state while the other seven R3load processes were sleeping; on the central instance all eight R3load processes were in the run state. I think the fact that not all eight R3load processes were running at a time on the app servers could be the reason for the very slow import.
Please can someone let me know how to improve the import time? If anyone has migrated a database from Oracle 10.2 to MaxDB, it would be helpful to hear how they did it, and whether any specific documentation is available for migrating from Oracle to MaxDB.
    Thanks,
    Narendra

> Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state while the other seven R3load processes were sleeping; on the central instance all eight R3load processes were in the run state. I think the fact that not all eight R3load processes were running at a time on the app servers could be the reason for the very slow import.
> Please can someone let me know how to improve the import time?
R3load connects directly to the database and loads the data. The question here is: how is your database configured (in terms of caches and memory)?
> If anyone has migrated a database from Oracle 10.2 to MaxDB, it would be helpful to hear how they did it, and whether any specific documentation is available for migrating from Oracle to MaxDB.
There are no such documents available, since the process of migrating to another database is called a "heterogeneous system copy". This process requires a certified migration consultant to be on-site to perform or assist with the migration. Those consultants are specially trained for particular databases and know the tips and tricks to improve migration time.
    See
    http://service.sap.com/osdbmigration
    --> FAQ
    For MaxDB there's a special service available, see
    Note 715701 - Migration to SAP DB/MaxDB
    Markus

  • Low Hit Ratio for Export/Import Buffer

    Hi,
Does a low hit ratio (less than 70%) for the export/import buffer cause performance issues?
I couldn't find any Note related to this.
    Any help will be appreciated.
    Thanks,
    Saurabh

    Thanks everyone,
Basically, all the buffers (program/nametab/CUA/screen/generic key/single record) have hit ratios around 99%, but the export/import buffer is at 60%, which is indeed low.
This is what I intended to ask: does this low hit ratio for the export/import buffer have much impact?
Any Note, if anyone remembers one?
    Thanks,
    Saurabh

  • Best choice for exporting / importing EUL

    Hi all
I have been tasked with migrating an EUL from an R11 to an R12 environment. The Discoverer version in both environments is 10.1.2, and the OS is Solaris on Oracle databases.
I am unfortunately not experienced with Discoverer, and no one seems to be available to assist for various reasons, so I have been reading the manual, forum posts, and Metalink articles.
I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen and the log file simply ended.
I assumed a memory problem or a slow network was causing the delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
So I decided to try the full import on the server to work around the first problem. Due to the client's security policies I am not able to open the source EUL and send it to our dev environment. I was able to get it from their dev R11 system, but I dismissed this because the dev reports were not working and the only reliable EUL is the production one. I managed to get a production .eex file from a client resource, but the upload to my server was extremely slow.
I asked the DBA to assist with the third option: exporting a database dump of EUL_US and importing it into my R12 dev environment. I managed this, but had to export the file as SYS, which worked around a privilege problem when logging in. I now have reports that run and my user can see reports, but reports that were not shared with SYSADMIN in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her own reports, only the SYSADMIN ones.
I refreshed the BAs using a shell script I wrote that calls the Java command with parameters.
After some re-reading I tried selecting all the options in the Validate menu and refreshing in Discoverer Administrator.
If I validate and refresh the BA using the admin tool, I get the hanging screen and a lot of warnings that items are missing (so much for my Java command refresh!), and now the report will not open and I see the "substitute missing item" dialog boxes.
My question to the forum: which would be the best approach to migrate the entire EUL from an R11 instance to an R12 instance under these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The os and db details are as follows:
    Source:
    eBus 11.5.2
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
Yes, the DBA initially did an exp for me with EUL_US as the owner, but something strange happened with the privileges; also, some of the imported tables appeared in the target environment under the APPS schema (21 tables) even though the EUL_US export had contained 48 tables.
I also had a problem on the database with "eul_us has insufficient privileges on tablespace DISCOVERER" type errors.
I checked the EUL_US database privileges but was unable to resolve this initial privilege error, even though the privileges had been granted to EUL_US.
The DBA managed to exp as SYSTEM and then import with the full=y flag on the import command, which seems to bring in the privileges.
Then I ran eul5_id.sql, made a list of the business areas, and wrote a shell script to refresh them as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully and I can log in select business area and grant access to the users. The reports return data.
Then one of the users said she can't see all her reports. Opening Desktop, I noticed some reports sitting there prefixed with a hash and her version 11 user id.
So back to the manuals: in the Discoverer Administrator help, the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about missing items. I assume this is because the item identifiers brought across in the database dump are the version 11 ones and thus not found in the new system.
    Any suggestions?
    Many thanks
    Nick

  • DRM batch script error for export

    Hi,
I am getting an error while running the batch script for export:
    ERemotableException with message: "Server was unable to process request. ---> Error during Export. Export was unable to run. Error: [Oracle][ODBC][Ora]ORA-01017: invalid username/password; logon denied" while running Export
    4/26/2013 9:02:20 PM - => ERROR: Data Relationship Management Server returned error: "Server was unable to process request. ---> Error during Export. Export was unable to run. Error: [Oracle][ODBC][Ora]ORA-01017: invalid username/password; logon denied."
Can anyone help?

    Hi,
Did you check the username/password being used to run the export through the Batch Client? Are they correct? More details would help.
    Denzz

  • Suggestions needed for Export/Import Business Area Problem

I'm trying to export/import a Business Area from one instance to another, and I get the following error during the import process:
    Database Error - ORA-00001: unique constraint (EUL4_US.EUL4_EXP_PK) violated
The Business Area I'm trying to import doesn't exist in the new instance. In the import wizard I checked "refresh existing objects" and "match by identifiers".
I think my target instance has objects with the same identifiers, so I went ahead and changed the identifiers in my source, but the problem persists. How can I correct this error?
    Thanks

    Did you happen to copy the EUL from one database to another before you were importing/exporting the BA?
    Discoverer recognizes EULs using unique reference numbers. However, if you use the database export and import utilities to copy an EUL, the new EUL (including its reference number) will be identical to the original EUL. When EULs have the same reference number, EUL consistency issues can arise if you do both of the following:
    * If you modify objects in both the original EUL and in the new EUL
    * Having modified objects in both EULs, if you then attempt to copy objects between the two EULs using the Discoverer Export Wizard and Import Wizard (or the Discoverer /export and /import commands)
To avoid potential EUL consistency issues, run the eul5_id.sql script as the owner of the new EUL. The eul5_id.sql script gives a new reference number to the new EUL and thereby avoids any potential EUL consistency issues.
    Reference
    * How to import an EUL using the standard database import utility
    http://download-west.oracle.com/docs/html/B13916_04/maintain_eul.htm#i1008632
For 4i, you will need to run:
    sql> update EUL4_VERSIONS set VER_EUL_TIMESTAMP =TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS');
    sql> commit;
    Some similar EUL inconsistencies are documented in:
     Note 267476.1 - Discoverer Workbook Fails With 'Cannot join tables. Item dependency "" not found in EUL' After EUL Migration, Importing Or Cloning
You might also want to check whether the EUL version timestamps are the same.
    ~Steve.
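    A sketch of that fix, with placeholder owner and connect string (eul5_id.sql ships with Discoverer Administrator; run it as the owner of the NEW, copied EUL):

```shell
# Placeholders: eul_us and dev are examples, not your real EUL owner or TNS alias.
EUL_OWNER=eul_us
CONNECT_STRING=dev
# The real invocation would be:
#   sqlplus $EUL_OWNER/<password>@$CONNECT_STRING @eul5_id.sql
# Dry run: print the command instead of connecting to a database.
echo "sqlplus $EUL_OWNER/<password>@$CONNECT_STRING @eul5_id.sql"
```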

  • How to use loadercli for export/import

    Hello,
I want to migrate my Content Server data, which is on SAP DB version 7.3 (32-bit), to a new server with MaxDB 7.6 (64-bit).
I wanted to know:
1) How do I export and import using loadercli? I want to use the loader of my new server, i.e. the MaxDB loader, to do the export/import. Can anyone tell me the steps and the commands to do this?
2) I have referred to SAP Note 962019, but need some more help with the steps given in this note for a heterogeneous copy.
Note: I don't have any SAP system on the server, only Content Server, and hence I don't have the SAPR3 user on the database.
    Regards,
    Bhavesh

    > Please check the following (point 7) in note 962019:
    Ok...
    > 7. The procedure described in this note supports the following migrations:
    > and I am dealing with the same situation hence choose the export import rather than normal backup recovery.
Well, it says: the procedure supports these migrations, which is true.
It does not say: "please do use the export/import migration in these situations".
Please do read point 1 of the note, where it says:
    "Homogeneous system copies (backup/recovery) can be executed only in
    systems that fulfill  the following conditions:
    a) The database versions in the source system and the target system
        do not differ.
    b) The processor uses the same byte sorting sequence (Little Endian
        -> Little Endian; Big Endian -> Big Endian).
    *If the specified operating system prerequisites are met in the
    source and target system, you MUST use the homogeneous system copy
    and not the heterogeneous system copy.*"
    So in your case export import is not even supported!
> 2) I know CS 6.10 is out of support and needs to be upgraded to the new CS version, 6.4, but if I am already installing a new server for Content Server with CS 6.4 and MaxDB 7.6, why can't I use export/import from the old server to the new server?
> Otherwise I would need to upgrade MaxDB, then upgrade CS, and then take the backup and restore; isn't it more logical to do export/import using the loader, as we have all the prerequisites available?
1. You don't meet the prerequisites (see above) - you can use backup/restore, so you have to!
2. Export/import is a terribly slow and clumsy procedure.
3. Backup/restore is an easy, safe and fast method to perform the system copy.
    If you still really, really want to stick with export/import, feel free to try it out.
The note contains all the required steps, and the MaxDB documentation even includes tutorials for the loader.
    regards,
    Lars

  • Custom directories for export/import for Integration Builder

    Hi,
XI uses default directories for exporting and importing Integration Builder objects, and there is no option to change the directory or to pick up objects from another directory.
Does anybody know of a way to change the default directory that XI uses? Maybe some system setting hidden somewhere deep?
    Thanks and Regards
    Manish

Hi Manish,
when there's no way, there's always an <b>XI way</b> :)
You can help your Basis team using XI:
set up a file adapter to monitor the export directory
and try sending your XI export files to a different server/location.
You can have a look at how Felix did a similar scenario:
/people/sap.user72/blog/2005/07/15/copy-a-file-with-same-filename-using-xi
Isn't this a good way?
    Regards,
    michal

  • Choosing a database among various databases for export/import

    Hello,
I am using Forms 6i and I want to export several databases using Forms.
What technique/code could I use to select the database of my choice for export?
And how can I get the name/service of a particular database among several databases using Forms 6i?
Could someone give me an idea or some code for the above requirement?
Thanks
Amit
    Thanks
    Amit

Why would you want to use Forms (a client tool) to import or export a database? imp and exp are command-line tools meant to be run on the database server.
You will probably hit other problems too, such as differing database versions: for every database version you need the matching imp and exp.
If you really want to use Forms, then just create a table that holds the names of the databases.

  • Noob Request For Exporting Help

    Hi
    I took approximately 2.5 hours of video of my son's wedding last year using my 3CCD Panasonic NV-MX300 video camera and want to create an authored Bluray from the footage, which is spread over 3 DV Video tapes.
My basic plan was this: capture the footage as-is to my PC, then edit it a little and do some work on the parts of the sound that came out poorly because I forgot to "zoom" the mic, doh! Then I simply wanted to export the timeline so that the "finished video" could be authored in another program.
My work with Adobe Premiere CS5 is extremely limited, but after watching some tutorials I have managed to capture the 3 DV tapes and have 3 clips (the 3 full tapes) in my project. Two of these clips are roughly 10.5 GB each and the other is 13.7 GB - so 35 GB or so of captured footage. I have placed these on the timeline, juggled bits of them into the right order using the Razor edit tool, and managed to increase the sound relative to the overall video in the half-dozen sections which were "quiet". From my point of view this has been a huge learning curve: I have come from knowing nothing about this program to this point in a matter of a few days.
So when I open my saved project, it opens exactly as I want it to and plays perfectly within Premiere's timeline. The footage has hardly been cut, as I want the completed video as it was shot, save for the few bits I have cut and replaced elsewhere - for example, there was a bible reading which I wasn't able to record properly at the time, but I managed to get the 2 participants to do it after the service and placed the reading into the exact part of the video as if it had been shot during the service. As I have said, I used the razor tool to do this: setting in and out points, cutting the portion between the two points, and then inserting it at the correct position in an earlier section of the timeline. Using in and out points and the razor tool to create secondary clips within the 3 main clips was also the technique I used to increase the volume within some of those secondary clips. My timeline is now perfect for me, but so far it hasn't been rendered. I am not sure what rendering is about and whether it is necessary to render the whole timeline before exporting.
I then tried to export the timeline. I want quality footage, hardly any different from the original. The video was shot widescreen PAL at 720x576, 25 fps, 32000 Hz stereo - lower (these are the details from the Export/Media screen).
I did not want to degrade the quality, so I experimented the other day and exported to H.264 Blu-ray at 1440x1080 with 48000 Hz stereo sound, thinking this would give me a nicely sized video ready for Blu-ray authoring. I used 2-pass encoding, and after a short while the estimated time was around 30 hours.
I am using a Quad 660 processor with Windows 7 64-bit. My OS is on a RAID 1 setup and my scratch disk on a separate RAID 10 setup. I wasn't overly concerned about the time, as I wanted quality output. Last night after returning home from work, the encoding had completed, but the Sequence01.m4v file came out massively reduced and compressed at about 7.8 GB, which wasn't what I expected at all, and the quality was lousy: breaking up, blocky and stuttering. The sound wasn't there, as it had been captured as a separate .wav file, so I decided to remux the two files in tsMuxeR, but the resulting .m2ts file played the sound OK but not the video, which paused on the opening frame!
OK, so maybe my export settings were hopeless. Maybe I should have gone for an uncompressed Microsoft AVI file for export?
Before I experiment further, may I ask you guys with vastly more experience what I should do now, please?
As I have said, the timeline isn't rendered yet - does it have to be before exporting? My requirements are simple: a single video/audio file with quality matching the original. My concern isn't the time it takes to export, just so long as I end up with one video with synced sound ready to author. In this latter respect I will use menus and chapter points for navigation - I'm sorry, but Encore is beyond me, so I will be using a third-party program I have used before, and will include some slide shows too. Previous authoring has been limited to two single-layer DVDs and one dual-layer DVD. This time I want to improve quality somewhat, with an aim to play back on a Blu-ray BD25 - with additional slideshows etc. and music tracks to accompany the slide shows/menus, I was thinking of limiting the export to no more than say 15-20 GB, but obviously more than the 7.8 GB achieved on my first attempt.
    Thank you for reading this far - any help and advice would be greatly appreciated.
    Please bear in mind I do not now want to amend my timeline in any way as I have spent enough time on it already, so really need answers limited to improving the quality of my export.
    TIA
    Paul

    Thanks Ann - I think you have nailed the problem.
I am really struggling now, as I found out yesterday that VideoStudio Pro X4 doesn't do what I want in giving me automatic options to add slideshows/music tracks and menu templates as MovieFactory does. Having now upgraded to MovieFactory 7, so as to take advantage of its Blu-ray capabilities, I am back to the original problem.
VS seems to allow you to "distort" the video back to 16:9 format, but this is not an option in MovieFactory. When I experiment with its multi-trim options, it plays back perfectly, but after exiting that screen by pressing OK you end up back where you began, so I am totally confused now.
    I do have 2 things though: -
    1.  My saved edited Project in Premiere CS5
2.  A 35 GB .AVI file exported from 1, which plays back perfectly in Premiere, VLC player and Windows Media Player.
If I have to, I will export again from my saved project in Premiere, but I would need to know precisely what export settings to use so that MovieFactory will recognize the resulting video. I honestly cannot remember what settings I used, but I remember the export took about 90 minutes. The export which DH described did not work, and I have since deleted the file, as it was terribly blocky and stopped and started every few seconds.
Now, if I thought Encore would give me automatic options to add slideshows, music, and template menus which I could play about with, with videos embedded within them, then I would jump straight into it.
My deadline is to have a fully completed DVD/Blu-ray disc finished by 22nd May - I have 20 days to do this!
    Hope you can help some more
    Paul

  • Plug-In for exporting/importing links

    I need a plug-in that can export/import all links from a pdf.
So far I have found only one: http://www.evermap.com/autobookmark.asp
But it's kind of pricey... does anybody know of anything else?

I use Jeffrey Friedl's “jf Zenfolio” plug-in and highly recommend it. Unlike most plug-ins of this sort, it gets syncing right. Most will let you upload once, and then if you change things in Lr, the syncing goes to heck. Not jf Zenfolio. It's a pleasure to use.
    Jeffrey Friedl's Blog » Jeffrey’s “Export to Zenfolio” Lightroom Plugin

  • Export / import exp / imp commands, Oracle 10g XE on Ubuntu

I have Oracle 10g XE installed on Linux 2.6.32-28-generic #55-Ubuntu, and I need some help on how to export/import the database with the exp/imp commands. The commands seem to be installed in the */usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin* directory, but I cannot execute them.
The error message I got:
    No command 'exp' found, did you mean:
    Command 'xep' from package 'pvm-examples' (universe)
    Command 'ex' from package 'vim' (main)
    Command 'ex' from package 'nvi' (universe)
    Command 'ex' from package 'vim-nox' (universe)
    Command 'ex' from package 'vim-gnome' (main)
    Command 'ex' from package 'vim-tiny' (main)
    Command 'ex' from package 'vim-gtk' (universe)
    Command 'axp' from package 'axp' (universe)
    Command 'expr' from package 'coreutils' (main)
    Command 'expn' from package 'sendmail-base' (universe)
    Command 'epp' from package 'e16' (universe)
    exp: command not found
Is there something I have to do?

    Hi,
You have not set the environment variables correctly.
http://download.oracle.com/docs/cd/B25329_01/doc/install.102/b25144/toc.htm#BABDGCHH
And of course that script has a small hiccup, so see
    http://ubuntuforums.org/showpost.php?p=7838671&postcount=4
    Regards,
    Jari
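    A sketch of the environment setup, using the 10g XE install path from the question (the SID and the exp/imp parameter lines are assumptions; XE also ships an oracle_env.sh under $ORACLE_HOME/bin that sets the same variables):

```shell
# Point the shell at the XE installation so exp/imp resolve from $ORACLE_HOME/bin.
ORACLE_HOME=/usr/lib/oracle/xe/app/oracle/product/10.2.0/server
ORACLE_SID=XE
export ORACLE_HOME ORACLE_SID
export PATH="$ORACLE_HOME/bin:$PATH"
# With the environment set, the classic commands become available, e.g.:
#   exp system/<password> FULL=y FILE=full.dmp LOG=exp.log
#   imp system/<password> FULL=y FILE=full.dmp LOG=imp.log
echo "exp would resolve to: $ORACLE_HOME/bin/exp"
```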

  • FDM Scripting Query for last imported source file using Batch Processing

    Hi Experts,
I'm currently automating the FDM load process on our FDM 9.3.3 installation using batch processing and the FDM Task Manager. Most of the process works fine, including an email alert which notifies users when a data load has taken place.
    As part of that email alert I am trying to attach the source file that has been loaded in batch processing. I have managed to get an attachment using the following FDM Script Object of:
    "API.MaintenanceMgr.fPartLastFile(strLoc, True, False)".
But I have noticed that this only attaches the last manually imported file, rather than the last file imported via batch processing.
My question: can someone steer me in the right direction, either toward a more appropriate API or toward a step I have missed in my script?
    Any help as always would be much appreciated.
    Cheers
    Pip

Unfortunately the batch process does not work the same way as on-line. I am assuming you are using the normal batch load and not Multiload (although the batch is similar).
The batch file name gets recorded in the tBatchContents table, and the file is moved to the import/batches folder under the folder for the current batch run. However, if the load is successful the file gets deleted (and, from memory, does not get archived). To attach the import file to the e-mail after a successful load, I think you will need to store a copy of it prior to importing the file.

  • Script assistance to export\import LegacyExchangeDN

    Hi Everyone,
I'm currently migrating our hosted Exchange customers from one site to another.
I'm looking for a way to create a CSV file containing the fields Alias, EmailAddress and LegacyExchangeDN, and then a way to use that CSV file to create X500 addresses for each account based on those details.
For an organization with 5-20 users manual work is acceptable, but manually setting the X500 addresses for an organization with 50-100 users is a pain.
So any assistance here will be highly appreciated.

    Hi,
The source and target sites are similar in most respects. Both are running Exchange 2010, and the DC structure is mostly the same.
We're using an automation product that creates the mailboxes in the new site, so the only things that get broken are autocomplete and the calendar, which can be fixed if I copy each user's LegacyExchangeDN and create it as an X500 address.
    The LegacyExchangeDN structure is also very similar at the two sites, for example:
    New site: /o=HostedExchange/ou=Exchange Administrative Group (FYDIBOHF23SPDLT)/cn=Recipients/cn=poli6c2
    Old site: /o=HostedExchange/ou=Exchange Administrative Group (FYDIBOHF23SPDLT)/cn=Recipients/cn=polib9f
So you can see, the only part that changes during the migration is the user's CN.
Anyway, I was able to create the required CSV using this command:
Get-Mailbox -Filter {Emailaddresses -like "*domain.com*"} | select name,displayname,PrimarySmtpAddress,LegacyExchangeDN | Export-Csv "C:\details.csv"
But I need some help importing the details into the new site.
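    On the target side each value would typically be added with Set-Mailbox -EmailAddresses @{add="X500:..."}. The CSV-to-X500 transformation itself can be sketched in shell; the file name and the reduced column list here are assumptions for illustration, not the exact CSV produced by the command above:

```shell
# A cut-down sample of the exported CSV (real exports have more columns).
printf '%s\n' 'Name,PrimarySmtpAddress,LegacyExchangeDN' \
  'Poli,poli@domain.com,/o=HostedExchange/ou=Exchange Administrative Group (FYDIBOHF23SPDLT)/cn=Recipients/cn=polib9f' > details.csv
# Turn each LegacyExchangeDN into the X500 proxy address to add on the new site:
tail -n +2 details.csv | while IFS=, read -r name smtp dn; do
  echo "X500:$dn"
done
```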

  • Correct Lens Profile Not Available for Exported/Imported Image Files

    Any help with this would be appreciated.  I just noticed yesterday that when I have RAW (DNG) files in my LR5.2 library, I can select the correct lens profile to apply.  When I take three of those files and merge them together in Photomatix Pro they come back as a single TIFF file.  Now LR only has one lens profile available, and it is the wrong one.
    This is happening with my Sony NEX-7 and Tamron 18-200mm f/3.5-6.3 XR DI-III VC.  With the RAW files, I have a list of Tamron lenses from which to choose, and the correct one is available.  For the merged TIFF file, however, only a single Tamron lens is available for selection, and it is not the lens I used.
    If anyone has any ideas about this and can get me past it, I would greatly appreciate it.
    Thanks,
    ~Steve

    Here's my workflow for this particular situation:  After importing the original RAW files, I apply lens correction using the correct profile.  I export to another application that creates a new (TIFF) file that comes back into LR.  Even though I already applied the correction to the original files, I can see that there is a difference in distortion between the original files and the new TIFF.
    If your other application is opening the Raw file directly, it will likely be disregarding either most, or all, Lightroom adjustments even if they are somehow communicated to it. It will come down to whatever independent Raw support is built into the other application.
    The other application would still need to (a) "understand" about Adobe lens profiles etc, (b) "know" where to find the appropriate Adobe profile on disk, and (c) be equipped to do the specific and proprietary processing that they refer to. The same applies for all the other Adobe-proprietary processing instructions. The only software I am aware of which can do all this reliably, in partnership with LR, is Adobe Camera Raw plus Photoshop.
I've trained myself so that whenever a new file is created in an application outside of LR (including PS), I always re-apply lens correction upon first returning to LR. If I'm interpreting the above responses correctly, I shouldn't need to re-apply lens correction if I applied it to the original files. But again, my observation is that this is not the case.
    If your particular workflow is failing to reflect the first application of lens corrections (as well as perhaps, other specific adjustments) - then what returns to LR will not have undergone those kinds of corrections yet.
    I'm also hearing that I may need to download a lens profile for this lens for a TIFF file format.  I've never needed to do that before, so perhaps I'll give that a shot.
As I understand it, Photomatix can also accept converted TIFFs or JPGs from LR instead of Raw files. I am skeptical that you will realise any "clear water" advantage from sending Raws rather than TIFFs (or even high-quality JPGs, in many cases) from LR into Photomatix, assuming the LR conversion is already optimising the images for the required purpose. You would be doing that step in an interactive environment rather than blindly via a generic converter. Such intermediate bitmaps are, in my experience, perfectly good candidates for HDR processing or for exposure fusion (which I prefer, though I don't use Photomatix in particular).
    Any workflow which causes LR to in effect Export suitable bitmaps that incorporate all current adjustments, will embody lens corrections as part of that. By the way, the workflow into Photoshop HDR is similarly done by way of converted bitmaps - though in that case it is ACR rather than LR which achieves these converted bitmaps, and into memory rather than into separately saved files, but the outcome is effectively the same.
The only downside is the "cleanup" to get rid of intermediate TIFFs or JPGs afterwards. I usually put the completed "merged" image at the top of a "stack" containing the component exposures, and collapse the stack down. But the (disposable) intermediates don't get imported to LR, and by having them created in a subfolder or with a distinctive name suffix, they are easily found and deleted in due course.
    RP
