RAW Best Practices

I've used my D7000 and Apple Aperture for over a year now.  In the past, I've always shot JPEGs exclusively, but as I've read more about the virtues and flexibility of RAW, I've been considering the change.  I know you can shoot RAW+JPEG, but I'd like to avoid that practice.
Since the camera is not applying its Picture Control adjustments to RAW files, what are some best practices for batch settings on import?
Is there anything special I should be doing on RAW import?  Are presets suggested beyond the default "RAW Fine Tuning" settings?  My RAW photos are somewhat flat and lack saturation, which I understand is a result of them being unprocessed by the camera's Picture Control settings.
Or, if I do decide to standardize on RAW, will I have to then manually touch up every shot?
Thanks in advance, John

jdag wrote:
I do think that I will backtrack and use the RAW+JPEG option and keep the pair of files. 
Good idea IMO.
...the Nikon software in fact rendered the JPEG and RAW identically
...the "Nikon way" (ie - JPEG) vs. the RAW. Whether I like the Nikon processing or not on a particular image, at least then I can decide and have the RAW file to manipulate.
The above part is perhaps mis-stated.
DSLRs capture light data on a sensor. Each camera internally processes that data extensively, using proprietary algorithms, to save the vendor's version (for that specific camera) as a "RAW" file (e.g. what Nikon calls NEF). The resultant NEF file saved in-camera must be converted before it can be viewed. Conversion is an irreversible, one-time process.
Apple converts RAW files,
Adobe converts RAW files,
Nikon converts RAW files,
etc.
Each vendor's conversion will be different. The camera vendor has a head start on doing the conversion well, because only the manufacturer fully knows how the original data was processed in-camera.
After conversion each vendor saves the data in a standardized lossless format like TIFF or PSD.
Or the vendor can discard much of the image data, compress and save the file in a standardized lossy format like JPEG.
Each vendor's conversion will be different.
So the camera processes the image once: light data--->RAW file (NEF). Then someone (Apple, Adobe, Nikon, etc.) converts the RAW to a viewable image. You can choose whose conversion you prefer but you cannot modify the original Nikon/Canon/etc. processing.
Note that a Nikon JPEG is by definition not the same as a Nikon RAW file (NEF). They may have a similar "feel" because a Nikon raw conversion was done in both cases, but the JPEG file:
• has had a very large amount of image data discarded by the JPEG algorithm
• has been permanently modified by various in-camera settings.
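To make that concrete, here is a rough sketch of one third-party conversion path, using the open-source rawpy/LibRaw stack in Python (not Nikon's or Apple's converter; the file name is just an example):
import rawpy
import imageio
with rawpy.imread("DSC_0001.NEF") as raw:       # the camera-original NEF
    rgb = raw.postprocess(use_camera_wb=True)   # demosaic, white balance, tone mapping
imageio.imwrite("DSC_0001.tiff", rgb)           # save the rendered image losslessly
Run Apple's, Adobe's or Nikon's converter on the same NEF and you will get a different-looking result from the same data.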
HTH
-Allen

Similar Messages

  • Best practice for photo format: RAW+PSD+JPEG?

    What is the best practice for maintaining file formats while editing?
    I shoot in RAW and import into PS CS5. After editing, it allows me to save in various formats, including PSD and JPEG. PS says that if you want to re-edit the file, you should save as PSD, as all the layers are maintained as-is. Hence I'd prefer to save as .PSD. However, in most cases the end objective is to share the image with others, and JPEG is the most suitable format. Does this mean that for each image it's important to save it in 3 formats, viz. RAW, PSD and JPEG? Won't this increase the total space occupied tremendously? Is this how most professionals do it? Please advise.

    Thanks everyone for this continued discussion in my absence over two weeks. Going through it I realize it's helpful stuff. During this period, I downloaded the Aperture trial and have learnt it (there's actually not much learning; it's so incredibly intuitive and simple, yet incredibly powerful). Since I used iPhoto in the past, it just makes it easier.
    I have also started editing my pics to put them up on my photo site. And over the past 10 days, here is the workflow I have developed.
    -Download RAW files onto my laptop using Canon s/w into a folder where I categorize and maintain all my images
    -Import them into Aperture, but letting the photos reside in the folder structure I defined (rather than have Aperture use its own structure)
    -Complete editing of all required images in Aperture (and this takes care of 80-90% of my pics)
         -From within Aperture open in PS CS5 those images that require editing that cannot be done in Aperture
         -Edit in CS5 and do 'Save', this brings them back to Aperture
         -Now I have two versions of these images in Aperture - the original RAW and the new .PSD
    -Select the images that I need to put up on my site and export them to a new folder from where I upload them
    I would be keen to know if someone else follows a more efficient or robust workflow than this; I would be happy to incorporate it.
    There are still a couple questions I have:
    1 - Related to PS CS5: Why do files opened in CS5 jump up in file size? Any RAW or JPEG file originally between 2-10 MB shows up as a minimum of 27 MB in CS. The moment you do some edits and/or add layers, it reaches 50-150 MB. This is ridiculous. I am sure I am doing something wrong, or is this how CS5 works for everyone? (See the rough size arithmetic after question 2.)
    2 - After editing a file in CS by launching it from Aperture, I now end up with two versions in Aperture: the original file and the new .PSD file (which is usually 100 MB+). I tried exporting the .PSD file to a folder to upload it on my site, and wasn't sure what format and size it would end up with. I got it as a JPEG file within reasonable file-size limits. Is this how Aperture works? Does Aperture give you options for which format you want to export the file in?
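    (For what it's worth on question 1, the jump is mostly just uncompressed pixel data held in memory; a rough Python sketch of the arithmetic, using an example resolution rather than these actual files:)
    # rough in-memory size: width x height x channels x bytes per channel
    def uncompressed_mb(width, height, channels=3, bytes_per_channel=1):
        return width * height * channels * bytes_per_channel / (1024 * 1024)
    print(uncompressed_mb(3456, 2592))        # ~25.6 MB for an 8-bit ~9 MP image
    print(uncompressed_mb(3456, 2592, 3, 2))  # ~51 MB for the same image in 16-bit; layers add more on top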

  • Best practice for sqlldr -- direct to core or to stage first?

    We want to begin using SQL*Loader to load simple (but big) tables that have, up to this point, been loaded via Perl and its DBI connection to Oracle. The target tables typically receive 10-20 million rows per day (parsed log data from many thousands of machines) and at any one time can hold more than a billion total records PER TABLE. These tables are pretty simple (typically 5-10 columns, 2 or 3 part primary keys). They are partitioned BY MONTH (DAY is always one of the primary key columns) and set up on very large SAN disk arrays, striped, etc. I can use sqlldr to load the core tables directly, OR I could use sqlldr to load a staging table on a daily basis, then PL/SQL and SQL*Plus to move data from the staging table to the core. My instinct tells me that the second route is SAFER, that is, there is less chance that something catastrophic could corrupt the core table, but obviously this would (a) take more time to develop and (b) reduce our overall throughput.
    If I go the first route, loading the core directly with sqlldr, what is the worst thing that could possibly happen? That is, in anyone's experience, can a sqlldr problem corrupt a very large table? Does the likelihood of a catastrophic problem increase in proportion to the number of rows already in the target table? Are there strategies that will mitigate potential catastrophes besides going to staging and then to core via PL/SQL? For example, if my core is partitioned by month, might I limit potential damage to only the current month? Are there any known potential pitfalls to using sqlldr directly in this fashion?
    Thanks
    matthew rapaport
    [email protected]

    Wow, thanks everyone!
    1. External tables... I'd thought of this, but in our development group we have no direct access to the DBMS server, so we'd have to build some workflow to move the data files to the DBMS server and then write the merge. If SQL*Loader will do the job directly (to the core) without risk, then that seems to be the most straightforward way to go.
    2. The data in the raw files is very clean, this being ensured in the step that parses the raw logs (100-500 MB each) into the "insert files" (~20 MB each), and there would be no transformations in moving data from staging to core, so again that appears to argue for direct-to-core loading.
    3. The data is collected by DAY, but reported on mostly by MONTH (e.g., select day, sum(col), count(col) from TABLE where day between A and B group by day order by day, etc., where A and B are usually the first and last day of the month), and that is why the tables are partitioned by month, but perhaps this is not the best practice (???). I'm not the DBA, but I can make suggestions... What do you think?
    4. Time to review my sqlldr docs! I haven't used it in a couple of years, and I'm keeping my fingers crossed that it can handle the particular delimiter used in these files (pipe-tab-pipe, expressed in Perl as "|\t|"). If I recall it can, but I'm not sure how to express the tab... (A quick pre-check script for these files is sketched below.)
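    (A quick, hypothetical pre-check of the "insert files" in Python before handing them to SQL*Loader - the file name and expected column count below are made up, not the real ones:)
    DELIM = "|\t|"                      # the pipe-tab-pipe delimiter mentioned above
    EXPECTED_COLS = 8                   # assumption; set to the real column count
    with open("parsed_log.dat", encoding="ascii") as f:
        for lineno, line in enumerate(f, start=1):
            fields = line.rstrip("\n").split(DELIM)
            if len(fields) != EXPECTED_COLS:
                print("line %d: expected %d fields, got %d" % (lineno, EXPECTED_COLS, len(fields)))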
    Meanwhile, thank you very much, you have all been a BIG help... Strange no one asked me how it was that a Microsoft company was using Oracle :-) ... I work for DANGER INC (was www.danger.com if anyone interested) which is now owned (about 9 months now) by Microsoft, and this is the legacy reporting system... :-)
    matthew rapaport
    [email protected]

  • Best practice - material staging for production order

    Hi Experts,
    could any of you please support me with some hints on best practice for how to handle material staging via the WM-PP interface in a certain case?
    Up till now we had a system where production had no separate storage location in IM; one location existed covering both the raw material warehouse and production. At the same time, in WM we had separate storage types for production and raw materials – hence we did material staging by transferring goods only inside one IM location, between different WM storage types. The material staging should be done based on separate production orders.
    Now this needs to be changed, and a separate storage location needs to be handled in IM for production – which means the staging should be done between different IM locations, and the WM administration also needs to be handled.
    Up till now we used LP10 for staging, then LB13 for TO creation etc. We can keep going like that, but if we do so, there is another step required in IM – movement 311, where material numbers and quantities need to be added manually to finish the whole procedure. I would like to avoid this – it makes the administrative procedure quite long.
    I have been checking the following possibilities:
    1. Set released order parts staging at the control cycle and use MF60 for staging – but I cannot select requirements based on production orders here (I am only able to find the demand if the component is included in the selection)
    2. Two-step transfer 313/315 – but this is not a supported procedure – 313 TI / TO / 315
    3. Try to find a solution to create the 311 movement based on the TO, or based on WM stock at a certain storage type / dynamic bin.
    I have failed.
    So, could any of you please support me with some useful ideas on how to handle material staging where 311 is included and is definitely the last step of the procedure, but the administrator does not need to enter items manually one by one in MIGO.
    All answers will be appreciated

    Hi,
    Storage location control should be able to take care of your problem.
    If you want to stage the material to a different IM location than the WM location, then make the following settings.
    Say location XXXX is your WM location and location YYYY is your production location,
    and you have defined production storage type ZZZ for production storage location YYYY and have maintained the supply area for the same.
    In WM configuration - For interfaces - IM interface - Control of Assignment "Plant / Stor.Loc. - Whse Number":
    assign location XXXX as the standard location, and maintain the entry "do not copy storage location in TR" for location YYYY.
    In WM configuration - For interfaces - IM interface - Storage Location Control for WH.
    This entry ensures that there will be a WM transfer posting between your WM and production storage locations automatically when you confirm your TO. You can also have this done via a batch job if you want cumulative posting (schedule job RLLQ0100).

  • Upscale / Upsize / Resize - best practice in Lightroom

    Hi, I'm using LR 2 and CS4.
    Before I had Lightroom I would open a file in Bridge and in ACR I would choose the biggest size that it would interpolate to before doing an image re-size in CS2 using Bicubic interpolation to the size that I wanted.
    Today I've gone to do an image size increase but since I did the last one I have purchased OnOne Perfect Resize 7.0.
    As I have been doing re-sizing before I got the Perfect Resize I didn't think about it too much.
    Whilst the re-size ran it struck me that I may not be doing this the best way.
    Follow this logic if you will.
    Before:
    ACR > select biggest size > image re-size bicubic interpolation.
    Then with LR2
    Ctrl+E to open in PS (not using ACR to make it the biggest it can be) > image re-size bicubic interpolation.
    Now with LR2 and OnOne Perfect Resize
    Ctrl+E to open in PS > Perfect Resize.
    I feel like I might be "missing" the step of using the RAW engine to make the file as big as possible before I use OnOne.
    When I Ctrl+E I get the native image size (for the 5D MkII that is 4368x2912 px, or 14.56x9.707 inches at 300 ppi).
    I am making a canvas 24x20"
    If instead I open from LR as a Smart Object in PS and then double-click the smart object icon, I can click the link at the bottom and choose a size of 6144 by 4096, but when I go back to the main document it is the same size... but maybe if I saved that, then opened the saved TIFF and ran OnOne, I would end up with a "better" resized resulting document.
    I hope that makes sense!?!?!?!
    Anyway I was wondering with the combo of software I am using what "best practice" for large scale re-sizing is. I remember that stepwise re-sizing fell out of favour a while ago but I'm wondering what is now the considered best way to do it if you have access to the software that was derived from Genuine Fractals.
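    (For comparison, a minimal sketch of the plain single-pass bicubic route using Pillow in Python - not OnOne's algorithm; the file name and 300 ppi target are just examples:)
    from PIL import Image
    img = Image.open("IMG_0001.tif")                    # exported full-size TIFF
    long_edge = 24 * 300                                # 24" at 300 ppi = 7200 px
    scale = long_edge / img.width
    target = (long_edge, round(img.height * scale))     # keep the aspect ratio
    big = img.resize(target, resample=Image.BICUBIC)    # single-pass bicubic upsize
    big.save("IMG_0001_7200px.tif")                     # crop/extend to 24x20 afterwards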

    I am indeed. LR3 is a nice to have. What I use does the job I need but I can see the benefits of LR3 - just no cash for it right now.

  • [CS5.5/6] - XML / Data Merge questions & Best practice.

    Fellow Countrymen (and women),
    I work as a graphic designer for a large outlet chain retailer which is constantly growing our base of centers.  This growth has brought a workload that used to be manageable with but two people to a never-ending sprint with five.  Much of what we do is print, which is not my forte, but it is also generally a disorganized, ad-hoc affair into which I am wading to try to help reduce overall strain.
    Upon picking up InDesign I noted the power of the simple Data Merge function and have added it to our repertoire for mass-merging data sources.  There are some critical failures I see in this as a tool going forward for our purposes, however:
    1) Data Merge cannot handle information stored and categorized in a single column well.  As an example, we have centers in many cities, and each center has its own list of specific stores.  Data Merge cannot handle a single-column, or even multiple-column, list of these stores very easily and has forced us into some manual operations to concatenate the data into one cell and then, using delimiter characters, find and replace hard returns to separate them.  (A rough sketch of that concatenation step follows after this list.)
    2) Data Merge offers no method of alternate alignment of data, or selection by ranges.  That is to say: I cannot tell Data Merge to start at Cell 1 in one column, and in another column select, say, Cell 42 as the starting point.
    3) Data merge only accepts data organized in a very specific, and generally inflexible pattern.
    These are just a few limitations.
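    (Here is roughly the kind of one-off Python concatenation script I mean for limitation 1 - the column names and file names are illustrative, not our real export:)
    import csv
    from collections import defaultdict
    stores_by_center = defaultdict(list)
    with open("stores_export.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            stores_by_center[row["CenterName"]].append(row["StoreName"])
    with open("datamerge_source.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["CenterName", "StoreList"])
        for center, stores in stores_by_center.items():
            # join on a placeholder character, then Find/Change it to forced
            # line breaks in InDesign after the merge
            writer.writerow([center, "|".join(stores)])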
    ON TO MY ACTUAL DILEMMA aka Convert to XML or not?
    Recently my coworker has suggested we move toward using XML as a repository / delivery system that helps us quickly get data from our SQL database into a usable form in InDesign. 
    I've watched some tutorials on Lynda.com and haven't yet seen a clear answer to a very simple question:
    "Can XML help to 'merge' large, dynamic, data sets like a list of 200 stores per center over 40 centers based off of a single template file?"
    What I've seen is that I would need to manually duplicate pages, linking the correct XML entry as I go, rather than the program generating a set of merged pages like Data Merge does, with very little effort on my part.  Perhaps setting up a master page would allow for easy drag-and-drop fields for my XML data?
    I'm not an idiot, I'm simply green with this -- and it's kind of scary because I genuinely want us to proceed forward with the most flexible, reliable, trainable and sustainable solution.  A tall order, I know.  Correct me if I'm wrong, but XML is that beast, no?
    Formatting the XML
    Currently I'm afraid our XML feed for our centers isn't formatted correctly, with the current format looking as such:
    <BRANDS>
         <BRAND>
              • BrandID = xxxx
              [Brand Name]
              [Description]
              [WebMoniker]
              <CATEGORIES>
                   <CATEGORY>
                        • xmlns = URL
                        • WebMoniker = category_type
              <STORES>
                   <STORE>
                        • StoreID = ID#
                        • CenterID = ID#
    I don't think this is currently usable, because if I wanted to create a list of stores from a particular center, that information is stored as an attribute of the <STORE> tag, buried deep within the data, making it impossible to 'drag-n-drop'.
    Not to mention much of the important data is held in attributes rather than text fields which are children of the tag.
    I'm thinking of proposing the following organizational layout:
    <CENTERS>
         <CENTER>
         [Center_name]
         [Center_location]
              <CATEGORIES>
                   <CATEGORY>
                        [Category_Type]
                        <BRANDS>
                             <BRAND>
                                  [Brand_name]
    My thought is that if I have the <CENTER> tag then I can simply drag that into a frame and it will auto populate all of the brands by Category (as organized in the XML) for that center into the frame.
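    (For what it's worth, a rough Python sketch of the kind of one-off transform that could reshape the current feed into that layout - the tag and attribute names are guesses from the outlines above, not the real feed:)
    import xml.etree.ElementTree as ET
    src = ET.parse("brands_feed.xml").getroot()              # current <BRANDS> feed
    centers = ET.Element("CENTERS")
    for brand in src.iter("BRAND"):
        for store in brand.iter("STORE"):
            center_id = store.get("CenterID")
            # find or create the <CENTER> element for this CenterID
            center = next((c for c in centers if c.get("id") == center_id), None)
            if center is None:
                center = ET.SubElement(centers, "CENTER", id=center_id)
            entry = ET.SubElement(center, "BRAND")
            ET.SubElement(entry, "Brand_name").text = brand.findtext("BrandName", "")
    ET.ElementTree(centers).write("centers_feed.xml", encoding="utf-8", xml_declaration=True)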
    Why is this important?
    This is used on multiple documents in different layout styles, and since our store list is ever-changing as leases end or begin, over 40 centers this becomes a big hairy monster.  We want this to be as automated as possible, but I'd settle for a significant amount of dragging and dropping as long as it is simple and straightforward.  I have a high tolerance for trudging through code and creating workarounds, but my co-workers do not.  This needs to be a system that is repeatable and understandable, and it needs to be able to function whether I'm here or not -- mainly because I would like to step away from the responsibility of setting it up every time.
    I'd love to hear your raw, unadulterated thoughts on the subject of Data merge and XML usage to accomplish these sorts of tasks.  What are your best practices and how would you / do you accomplish these operations?
    Regards-
    Robert

    From what I've gleaned through watching Lynda tutorials on the subject, what I'm hoping to do is indeed possible.
    Peter, I don't disagree with you that there is a steep learning curve for me as the instigator / designer of this method for our team, but in terms of my teammates and end users that will be softened considerably.  Even so, I'm used to steep learning curves and the associated frustrations -- I cope well with new learning and am self-taught in many tools and programs.
    Flow based XML structures:
    It seems that as long as the initial page is set up correctly using imported XML, individual data records that cascade in a logical fashion can be flowed automatically onto new pages.  Basically what you do is create an XML-based layout with the dynamic portion you wish to flow in a single frame, apply paragraph styles to the different tags appropriately, and then, after deleting unused records, reimport the XML with some specific boxes checked (depending on how you wish to proceed).
    From there, simply dragging the data root into the frame will cause overset text as it imports all the XML information into the frame.  Assuming that everything is cascaded correctly, using auto-flow will cause new pages to be automatically generated with the tags correctly placed, in a similar fashion to Data Merge -- but far more powerful and flexible.
    The issue then again comes down to data organization in the XML file.  In order to use this method, the data must be organized in the same order in which it will be displayed.  For example, if I had a Lastname field and a Firstname field in that order, I could not call the Firstname first without faulting the document using the flow method.  I could, however, still drag and drop content from each tag into the frame and it would populate correctly regardless of the order of appearance in the XML.
    Honestly, either method would be fantastic for our current set of projects; however, the flow method may be particularly useful in jobs that would require more than 40 spreads, or simple layouts with huge amounts of data to be merged.

  • Best practice to process inbound soapenc:Array

    Hi, I'm working my way through a real-world example where I've sent a keyword search query to Amazon and it returns 10 results. This works great. Now I want to loop through the array and put the returned values into a table in the database. I can't seem to get down to the element level via XQuery to assign the returned values to my local variables. The XML returned is defined in the WSDL as:
         <xsd:complexType name="DetailsArray">
         <xsd:complexContent>
         <xsd:restriction base="soapenc:Array">
    <xsd:attribute ref="soapenc:arrayType" wsdl:arrayType="typens:Details[]"/>
    </xsd:restriction>
    </xsd:complexContent>
    </xsd:complexType>
         <xsd:complexType name="Details">
    What's the best practice for walking thru this incoming XML Array?
    Thanks...Matt
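    (Outside of BPEL/XQuery, here is a plain-Python sketch of walking such a soapenc:Array - the file name and element names are examples only; the Amazon "Details" struct has its own fields, and namespaced tags need their namespace URI in the search path:)
    import xml.etree.ElementTree as ET
    root = ET.parse("soap_response.xml").getroot()
    rows = [{child.tag: child.text for child in item}       # one dict per <item> in the array
            for item in root.findall(".//item")]
    for row in rows:
        print(row)                                           # e.g. bind these values to an INSERT per row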

    Hello again!
    Thanks for quick reply!
    The instance state says : closed.completed.
    Perhaps the problem is not the insert into db, but the transformation activity?
    This is what the audit trail (raw xml) looks like:
    <?xml version="1.0" encoding="UTF-8" ?>
    - <audit-trail>
    - <event sid="0" cat="2" type="2" n="0" date="2008-06-24T10:16:21.875+02:00">
    - <message>
    - <![CDATA[ New instance of BPEL process "BPELProcess3" initiated (# "150001").
      ]]>
    </message>
    </event>
    - <event sid="BpPrc0.1" cat="1" type="2" label="process" n="1" date="2008-06-24T10:16:21.875+02:00" psid="0">
    - <message>
    - <![CDATA[ _cr_
      ]]>
    </message>
    </event>
    - <event sid="BpTry0.2" cat="1" type="2" n="2" date="2008-06-24T10:16:21.890+02:00" psid="BpPrc0.1">
    - <message>
    - <![CDATA[ _cr_
      ]]>
    </message>
    </event>
    - <event sid="BpSeq0.3" cat="1" type="2" label="sequence" n="3" date="2008-06-24T10:16:21.890+02:00" psid="BpTry0.2">
    - <message>
    - <![CDATA[ _cr_
      ]]>
    </message>
    </event>
    - <event sid="BpSeq0.3" cat="2" type="2" wikey="150001-BpRcv0-BpSeq0.3-1" n="4" date="2008-06-24T10:16:21.921+02:00">
    - <message>
    - <![CDATA[ Received "inputVariable" call from partner "client"
      ]]>
    </message>
    - <details>
    - <![CDATA[
    <inputVariable><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="payload"><ns1:BPELProcess3ProcessRequest xmlns:ns1="http://xmlns.oracle.com/BPELProcess3">
                <ns1:stations>18700</ns1:stations>
                <ns1:username/>
            </ns1:BPELProcess3ProcessRequest>
    </part></inputVariable>
      ]]>
    </details>
    </event>
    - <event to="GetStationProperties_getStationsProperties_InputVariable" sid="BpSeq0.3" cat="2" type="1" wikey="150001-BpAss0-BpSeq0.3-2" n="5" date="2008-06-24T10:16:21.953+02:00">
    - <message>
    - <![CDATA[ Updated variable "GetStationProperties_getStationsProperties_InputVariable"
      ]]>
    </message>
    - <details>
    - <![CDATA[
    <GetStationProperties_getStationsProperties_InputVariable><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="stations"><stations>18700</stations>
    </part><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="username"><username xmlns="" xmlns:def="http://www.w3.org/2001/XMLSchema" xsi:type="def:string" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"/>
    </part></GetStationProperties_getStationsProperties_InputVariable>
      ]]>
    </details>
    </event>
    - <event to="GetStationProperties_getStationsProperties_InputVariable" sid="BpSeq0.3" cat="2" type="1" wikey="150001-BpAss0-BpSeq0.3-2" n="6" date="2008-06-24T10:16:21.953+02:00">
    - <message>
    - <![CDATA[ Updated variable "GetStationProperties_getStationsProperties_InputVariable"
      ]]>
    </message>
    - <details>
    - <![CDATA[
    <GetStationProperties_getStationsProperties_InputVariable><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="stations"><stations>18700</stations>
    </part><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="username"><username/>
    </part></GetStationProperties_getStationsProperties_InputVariable>
      ]]>
    </details>
    </event>
    - <event sid="BpSeq0.3" cat="2" type="2" wikey="150001-BpInv0-BpSeq0.3-3" partnerWSDL="MetDataService2Ref.wsdl" n="7" date="2008-06-24T10:16:22.968+02:00">
    - <message>
    - <![CDATA[ Invoked 2-way operation "getStationsProperties" on partner "MetDataService".
      ]]>
    </message>
    - <details>
    - <![CDATA[
    <messages><GetStationProperties_getStationsProperties_InputVariable><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="stations"><stations>18700</stations>
    </part><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="username"><username/>
    </part></GetStationProperties_getStationsProperties_InputVariable><GetStationProperties_getStationsProperties_OutputVariable><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="return"><return xmlns:ns2="http://schemas.xmlsoap.org/soap/encoding/" xsi:type="ns2:Array" xmlns:ns3="http://no.met.metdata/IMetDataService.xsd" ns2:arrayType="ns3:no_met_metdata_StationProperties[1]" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <item xsi:type="ns3:no_met_metdata_StationProperties">
    <toDay xsi:type="xsd:int">0</toDay>
    <toMonth xsi:type="xsd:int">0</toMonth>
    <fromYear xsi:type="xsd:int">1937</fromYear>
    <municipalityNo xsi:type="xsd:int">301</municipalityNo>
    <amsl xsi:type="xsd:int">94</amsl>
    <latDec xsi:type="xsd:double">59.9427</latDec>
    <lonDec xsi:type="xsd:double">10.7207</lonDec>
    <toYear xsi:type="xsd:int">0</toYear>
    <department xsi:type="xsd:string">OSLO</department>
    <fromMonth xsi:type="xsd:int">2</fromMonth>
    <stnr xsi:type="xsd:int">18700</stnr>
    <wmoNo xsi:type="xsd:int">492</wmoNo>
    <latLonFmt xsi:type="xsd:string">decimal_degrees</latLonFmt>
    <name xsi:type="xsd:string">OSLO - BLINDERN</name>
    <fromDay xsi:type="xsd:int">25</fromDay>
    </item>
    </return>
    </part></GetStationProperties_getStationsProperties_OutputVariable></messages>
    ]]>
    </details>
    </event>
    - <event to="StoreStationProperties_insert_InputVariable" sid="BpSeq0.3" cat="2" type="1" wikey="150001-BpAss1-BpSeq0.3-4" n="8" date="2008-06-24T10:16:23.000+02:00">
    - <message>
    - <![CDATA[ Updated variable "StoreStationProperties_insert_InputVariable"
      ]]>
    </message>
    - <details>
    - <![CDATA[
    <StoreStationProperties_insert_InputVariable><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="EklimaStationsTblCollection"><EklimaStationsTblCollection xmlns:ns0="http://xmlns.oracle.com/pcbpel/adapter/db/top/db" xmlns="http://xmlns.oracle.com/pcbpel/adapter/db/top/db"/>
    </part></StoreStationProperties_insert_InputVariable>
      ]]>
    </details>
    </event>
    - <event sid="BpSeq0.3" cat="2" type="2" wikey="150001-BpInv1-BpSeq0.3-5" partnerWSDL="db.wsdl" n="9" date="2008-06-24T10:16:27.312+02:00">
    - <message>
    - <![CDATA[ Invoked 1-way operation "insert" on partner "db".
      ]]>
    </message>
    - <details>
    - <![CDATA[
    <StoreStationProperties_insert_InputVariable><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="EklimaStationsTblCollection"><EklimaStationsTblCollection xmlns:ns0="http://xmlns.oracle.com/pcbpel/adapter/db/top/db" xmlns="http://xmlns.oracle.com/pcbpel/adapter/db/top/db"/>
    </part></StoreStationProperties_insert_InputVariable>
      ]]>
    </details>
    </event>
    - <event sid="BpSeq0.3" cat="2" type="2" wikey="150001-BpRpl0-BpSeq0.3-6" n="10" date="2008-06-24T10:16:27.312+02:00">
    - <message>
    - <![CDATA[ Reply to partner "client".
      ]]>
    </message>
    - <details>
    - <![CDATA[
    <outputVariable><part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="payload"><BPELProcess3ProcessResponse xmlns="http://xmlns.oracle.com/BPELProcess3"/>
    </part></outputVariable>
      ]]>
    </details>
    </event>
    - <event sid="BpSeq0.3" cat="1" type="2" n="11" date="2008-06-24T10:16:27.312+02:00">
    - <message>
    - <![CDATA[ _cl_
      ]]>
    </message>
    </event>
    - <event sid="BpPrc0.1" cat="1" type="2" n="12" date="2008-06-24T10:16:27.312+02:00">
    - <message>
    - <![CDATA[ _cl_
      ]]>
    </message>
    </event>
    - <event sid="BpPrc0.1" cat="2" type="2" n="13" date="2008-06-24T10:16:27.328+02:00">
    - <message>
    - <![CDATA[ BPEL process instance "150001" completed
      ]]>
    </message>
    </event>
    </audit-trail>
    The webservice-method I'm calling in this case is getStationsProperties.
    Kind regards
    Jørn Eirik

  • What is the Best practice for ceramic industry?

    Dear All;
    I would like to ask two questions:
    1- which manufacturing category (process or discrete) fits the ceramic industry?
    2- what is the best practice for the ceramic industry?
    please note from the below link
    [https://websmp103.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700000409682008E ]
    I recognized that the ceramic industry is under a category called building materials, which in turn falls under mill products and mining,
    but there are no best practices for building materials or even mill products; only fabricated metal and mining best practices are available.
    thanks in advance

    Hi,
    I understand that you refer to the production of ceramic tiles. The solution for PP was process manufacturing, with these steps: raw materials preparation (glazes and frits), dry pressing (I don't know the extrusion process), glazing, firing (single fire), sorting and packing. In Spain, these are usually all-in-one solutions (R/3 or ECC solutions). Perhaps the production of decors has fast firing and additional processes.
    In my opinion, the interesting part is batch determination in SD, which you must run in the sales order, because builders want the order to be homogeneous in tone and caliber, and they may split the order into different deliveries. You must think of the batch as tone (different colours in firing and so on) and caliber.
    I hope this helps you
    Regards,
    Eduardo

  • Best Practice for sugar refinery process

    hello, my company needs to deploy a new business concerning raw sugar refining.
    so we need to analyze the business requirements and propose a process for refinery management.
    step 1: arrival of goods in docks
    step 2: raw sugar needs to be charged into our stock (quantity and value) but is not our property
    step 3: goods need to be delivered to our plant (we pay the transport as a service for our business partner)
    step 4: goods need to be verified (for quality and quantity) and accepted by operators
    step 5: goods are processed in the refinery plant; we need to track timing, costs, quantity and human resources employed (for cost remittance and transfer)
    step 6: sugar is delivered to other industrial plants and warehouses and finally sold (but it is not our property); for us it's like a refining service.
    step 7: we need to trace the production lot from raw sugar arrival at the docks up to step 6.
    step 8: inventory and maintenance costs need to be traced, because our profit is a part of this refining service reduced by costs incurred
    any suggestions to find the right best practice?
    I'm not a skilled BPS; I was looking at oil refining, but it is not the same process, so... what can I look for?
    TNks

    Hi Kumar,
    In order to have consignment in SAP you need to have master data such as the material master, vendor master and a purchase info record of consignment type. You have to enter item category K when you enter the PO. The goods receipt posting into vendor consignment stock will be non-valuated.
    1. The initial step starts with raising a purchase order for the consignment item
    2. The vendor receives the purchase order.
    3. GR happens for the consignment material.
    4. Stocks are received and placed under consignment stock.
    5. Whenever we issue to production, or if we transfer post (using movement 411) from consignment to own stock, then the liability occurs.
    6. Finally comes the settlement using MRKO. You settle the amount for the goods which were consumed during a specific period.
    regards
    Anand.C

  • BEST PRACTICE TO PARTITION THE HARD DISK

    Can someone please guide me on THE BEST PRACTICE TO PARTITION THE HARD DISK FOR 10g R2 on operating system HP-UX 11?
    Thanks,
    Amol

    I/O speed is a basic function of number of disk controllers available to read and write, physical speed of the disks, size of the I/O pipe(s) between SAN and server, and the size of the SAN cache, and so on.
    Oracle recommends SAME - Stripe And Mirror Everything. This comes in RAID10 and RAID01 flavours. Ideally you want multiple fibre channels between the server and the SAN. Ideally you want these LUNs from the SAN to be seen as raw devices by the server and use these raw devices as ASM devices - running ASM as the volume manager. Etc.
    Performance is not achieved by just partitioning. Or just more memory. Or just a faster CPU. Performance planning and scalability encompass the complete system. All parts. Not just a single aspect like partitioning.
    Especially not partitioning, as an actual partition is simply a "logical reference" to a "piece" of the disk. I/O performance has very little to do with how many pieces you split a single disk into. That is the management part. It is far more important how you stripe, and whether you use RAID5 instead of a RAID1 flavour, etc.
    So I'm not sure why you are all uppercase about partitioning....

  • Best practice - applying Filters

    What's the best practice in terms of performance versus code reuse, when applying filters?
    Right now, I see 3 ways to specify a filter:
    1. as an <mx:filters> or <s:filters> tag
    2. in ActionScript:
    var glowFilter:GlowFilter = new GlowFilter();
    myComponent.filters = [glowFilter]; // e.g. assign the array to a component's filters property
    --OR--
    3. declaratively in MXML (FXG graphics):
    <s:Rect>
      <s:filters>
           <s:DropShadowFilter blurX="20" blurY="20" alpha="0.32" distance="11" angle="90" knockout="true" />
      </s:filters>
    </s:Rect>
    What's the best way to apply this, in terms of performance? Thanks.

    Fredrik,
    Though similar to your workflow, here is how I would do it.
    Import those "raw" Clips into a Project, and do my Trimming in that Project, relying on the Source Monitor to establish the In & Out Points for each, and also using different Instances of any longer "master Clip.". I would also do my CC (Color Correction), and all density (Levels, etc.) Effects here. Do not Trim too closely, as you will want to make sure that you have adequate Handles to work with later on.
    Use the WAB (Work Area Bar) to Export the material that I needed in "chunks," using either Lagarith Lossless CODEC, or UT Lossless CODEC *
    Import my music into a new Project and listen over and over, making notes on what visuals (those Exported/Shared Clips from above) I have at my disposal. At this point, I would also be making notes as to some of the Effects that I felt went with the music, based on my knowledge of the available visuals.
    Import my Exported/Shared, color graded Clips.
    Assemble those Clips, and Trim even more.
    Watch and listen carefully, going back to my notes.
    Apply any additional Effects now.
    Watch and listen carefully.
    Tighten any edits, adjust any applied Effects, and perhaps add (or remove existing) more Effects.
    Watch and listen carefully.
    Output an "approval" AV for the band/client.
    Tweak, as is necessary.
    Output "final approval" AV.
    Tweak, as is necessary.
    Export/Share, to desired delivery formats.
    Invoice the client.
    Cash check.
    Declare "wine-thirty."
    This is very similar to your proposed workflow.
    Good luck,
    Hunt
    * I have used the Lagarith Lossless CODEC with my PrE 4.0, but have not tried UT. Both work fine in PrPro, so I assume that UT Lossless will work in PrE too. These CODECs are fairly quick in processing/Exporting, and offer the benefit of smaller files than Uncompressed AVI. They are visually lossless. The resultant files will NOT be tiny, so one would still need a good amount of HDD space. Neither CODEC introduces any artifacts or color degradation.

  • BI Best Practice for Chemical Industry

    Hello,
    I would like to know if anyone is aware of SAP BI Best Practices for Chemicals, and if so, can anyone please post a link as well.
    Thanks

    Hi Naser,
    The information below will help give you a detailed explanation regarding the chemical industry...
    SAP Best Practices packages support best business practices that quickly turn your SAP ERP application into a valuable tool used by the entire business. You can evaluate and implement specific business processes quickly – without extensive customization of your SAP software. As a result, you realize the benefits with less effort and at a lower cost than ever before. This helps you improve operational efficiency while providing the flexibility you need to be successful in highly demanding markets. SAP Best Practices packages can benefit companies of all sizes, including global enterprises creating a corporate template for their subsidiaries.
    Extending beyond the boundaries of conventional corporate divisions and functions, the SAP Best Practices for Chemicals package is based on SAP ERP; the SAP Environment, Health & Safety (SAP EH&S) application; and the SAP Recipe Management application. The business processes supported by SAP Best Practices for Chemicals encompass a wide range of activities typically found in chemical industry practice:
    • Sales and marketing
    – Sales order processing
    – Presales and contracts
    – Sales and distribution (including returns, returnables, and rebates, with quality management)
    – Inter- and intracompany processes
    – Cross-company sales
    – Third-party processing
    – Samples processing
    – Foreign trade
    – Active-ingredient processing
    – Totes handling
    – Tank-trailer processing
    – Vendor-managed inventory
    – Consignment processing
    – Outbound logistics
    • Supply chain planning and execution
    – Supply and demand planning
    • Manufacturing planning and execution
    – Manufacturing execution (including quality management)
    – Subcontracting
    – Blending
    – Repackaging
    – Relabeling
    – Samples processing
    • Quality management and compliance
    – EH&S dangerous goods management
    – EH&S product safety
    – EH&S business compliance services
    – EH&S industrial hygiene and safety
    – EH&S waste management
    • Research and development
    – Transformation of general recipes
    • Supplier collaboration
    – Procurement of materials and services (including quality management)
    – Storage tank management
    – E-commerce (Chemical Industry Data Exchange)
    • Enterprise management and support
    – Plant maintenance
    – Investment management
    – Integration of the SAP NetWeaver Portal component
    • Profitability analysis
    More Details
    This section details the most common business scenarios – those that benefit most from the application of best practices.
    Sales and Marketing
    SAP Best Practices for Chemicals supports the following sales and marketing-related business processes:
    Sales order processing – In this scenario, SAP Best Practices for Chemicals supports order entry, delivery, and billing. Chemical industry functions include the following:
    • Triggering an available-to-promise (ATP) inventory check on bulk orders after sales order entry and automatically creating a filling order (Note: an ATP check is triggered for packaged material.)
    • Selecting batches according to customer requirements
    • Processing internal sales activities that involve different organizational units
    Third-party and additional internal processing – In this area, the SAP Best Practices for Chemicals package provides an additional batch production step that can be applied to products previously produced by either continuous or batch processing. The following example is based on further internal processing of plastic granules:
    • Purchase order creation, staging, execution, and completion
    • In-process and post-process control
    • Batch assignment from bulk to finished materials
    • Repackaging of bulk material
    SAP Best Practices for Chemicals features several tools that help you take advantage of chemical industry best practices. For example, it provides a fully documented and reusable prototype that you can turn into a productive solution quickly. It also provides a variety of tools, descriptions of business scenarios, and proven configuration of SAP software based on more than 35 years of working with the chemical industry.
    The package can also be used to support external toll processing such as that required for additional treatment or repackaging.
    Tank-trailer processing – In this scenario, SAP Best Practices for Chemicals helps handle the selling of bulk material, liquid or granular. It covers the process that automatically adjusts the differences between the original order quantities and the actual quantities filled into the truck. To determine the quantity actually filled, the tank trailer is weighed before and after loading. The delta weight – or quantity filled – is transmitted to the SAP software via an order confirmation. When the delivery for the sales order is created, the software automatically adjusts the order quantity to the confirmed filling quantity. The customer is invoiced for the precise quantity filled and delivered.
    Supply Chain Planning and Execution
    SAP Best Practices for Chemicals supports supply chain planning as well as supply chain execution processes:
    Supply and demand planning – Via the SAP Best Practices for Chemicals package, SAP enables complete support for commercial and supply-chain processes in the chemical industry, including support for integrated sales and operations planning, planning strategies for bulk material, and a variety of filling processes with corresponding packaging units. The package maps the entire supply chain – from sales planning to material requirements planning to transportation procurement.
    Supplier Collaboration
    In the procurement arena, best practices are most important in the following scenario:
    Procurement of materials and services:
    In this scenario, SAP Best Practices for Chemicals describes a range of purchasing processes, including the following:
    • Selection of delivery schedules by vendor
    • Interplant stock transfer orders
    • Quality inspections for raw materials, including sampling requests triggered by goods receipt
    Manufacturing Scenarios
    SAP Best Practices for Chemicals supports the following manufacturing-related business processes:
    Continuous production – In a continuous production scenario, SAP Best Practices for Chemicals typifies the practice used by basic or commodity chemical producers. For example, in the continuous production of plastic granules, production order processing is based on run-schedule headers. This best-practice package also describes batch and quality management in continuous production. Other processes it supports include handling of byproducts, co-products, and the blending process.
    Batch production – For batch production, SAP Best Practices for Chemicals typifies the best practice used by specialty chemical producers. The following example demonstrates batch production of paint, which includes the following business processes:
    • Process order creation, execution, and completion
    • In-process and post-process control
    • Paperless manufacturing using XML-based process integration sheets
    • Alerts and events
    • Batch derivation from bulk to finished materials
    Enterprise Management and Support
    SAP Best Practices for Chemicals also supports a range of scenarios in this area:
    Plant maintenance – SAP Best Practices for Chemicals allows for the management of your technical systems. Once the assets are set up in the system, it focuses on preventive and emergency maintenance. Tools and information support the setup of a production plant with assets and buildings.
    Revenue and cost controlling – The package supports the functions that help you meet product-costing requirements in the industry. It describes how cost centers can be defined, attached to activity types, and then linked to logistics. It also supports costing and settlement of production orders for batch and continuous production. And it includes information and tools that help you analyze sales and actual costs in a margin contribution report.
    The SAP Best Practices for Chemicals package supports numerous integrated business processes typical of the chemical industry, including the following:
    • Quality management – Supports integration of quality management concepts across the entire supply chain (procurement, production, and sales), including batch recall and complaint handling
    • Batch management – Helps generate batches based on deliveries from vendors or as a result of company production or filling, with information and tools for total management of batch production and associated processes, including batch derivation, the batch information cockpit, and a batch where-used list
    • Warehouse management – Enables you to identify locations where materials or batch lots are stored, recording details such as bin location and other storage information on dangerous goods to help capture all the information needed to show compliance with legal requirements
    Regards
    Sudheer

  • Best Practice In Display- Profit & Loss

    Schedule 14
    Cost of Sales
    Raw materials and components consumed                          5000
    Traded goods                                                   3000
    Stores and packing materials consumed                          2000
                                                          (A)    10000
    Decrease / (Increase) in finished goods
    Opening stock                                                 10000
    Less: Closing stock                                          (11000)   ==> brackets represent values to subtract
    (Increase) / decrease in excise duty on finished goods        (1000)
    Net decrease / (increase) in stock of finished goods  (B)     (2000)   ==> 10000 - 11000 - 1000 = (2000)
                                                        (A+B)     8000
    (A+B) is nothing but my total Cost of Sales figure.
    I have the Opening Finished Goods account as 10000 (no issues),
    but the Closing value is held in the account as 11000, and in the report I want to negate that value; likewise,
    the excise duty on finished goods (1000) is stored as a positive value in the account in the database, and that too has to be negated in the report.
    Can you please give the best practice for customizing this requirement?
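    (For what it's worth, a tiny Python sketch of the arithmetic and the sign handling being asked for - the figures are the sample ones above, and this is illustration only, not a specific report customization setting:)
    raw_materials, traded_goods, stores_packing = 5000, 3000, 2000
    opening_fg, closing_fg, excise_on_fg = 10000, 11000, 1000
    a = raw_materials + traded_goods + stores_packing    # (A) = 10000
    b = opening_fg - closing_fg - excise_on_fg           # (B) = -2000, shown as (2000)
    def in_brackets(n):                                  # accounting display: -2000 -> "(2000)"
        return "({0})".format(abs(n)) if n < 0 else str(n)
    print(in_brackets(b))    # (2000)
    print(a + b)             # 8000 = total Cost of Sales (A+B)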
    Rgds
    Srinath

    We have found - always save the presentation in PPT format (2003/97). This works almost all the time - it fixes all of the wording issues, etc.

  • Working with many sequences- best practice

    Hi.
    I've just started using Adobe Premiere CS6. My goal is to create a 2-hour-long movie, based on 30 hours of raw GoPro footage recorded on a recent vacation.
    Now my question is, what is the best practice for working with so many sequences/movie clips?
    Do you have one heavy project file, with all the clips?
    Or do you make small chapters that contains x number of sequences/is x minutes long, and in the end combine all these?
    Or how would you do it the best way, so its easiest to work with?
    Thanks a lot for your help.
    Kind regards,
    Lars

    I'll answer your second question first, as it's more relevant to the topic.
    You should export in the very highest quality you can based on what you started with.
    The exception to this is if you have some end medium in mind. For example, it would be best to export at 30 FPS if you are going to upload it to YouTube.
    On the other hand, if you just want it as a video file on your computer, you should export it at 50 FPS because that retains the smooth, higher framerate.
    Also, if you are making slow-motion scenes with that higher framerate, then export at the lower framerate (for example, if you slow down a scene to 50% speed, your export should be at 25 FPS).
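    (A one-line way to state that rule of thumb - plain Python, illustrative only:)
    def export_fps(source_fps, playback_speed):
        # the slowed clip only needs source_fps * speed to stay smooth
        return source_fps * playback_speed
    print(export_fps(50, 1.0))   # 50.0 - normal-speed delivery keeps the full rate
    print(export_fps(50, 0.5))   # 25.0 - 50% slow motion still plays back smoothly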
    About my computer:
    It was for both, but I built it more with gaming in mind as I wasn't as heavily into editing then as I am now.
    Now, I am upgrading components based on the editing performance gains I could get rather than gaming performance gains.

  • Best practices for applying sharpening in your workflow

    Recently I have been trying to get a better understanding of some of the best practices for sharpening in a workflow.  I guess I didn't realize it but there are multiple places to apply sharpening.  Which are best?  Are they additive?
    My typical workflow involves capturing an image with a professional digital SLR in either RAW or JPEG or both, importing into Lightroom and exporting to a JPEG file for screen or printing both lab and local. 
    There are three places in this workflow to add sharpening: in the SLR, manually in Lightroom, and during the export to a JPEG file or when printing directly from Lightroom.
    It is my understanding that sharpening is not added to RAW images even if you have added sharpening in your SLR. However, sharpening will be added to JPEGs by the camera.
    Back to my question: is it best to add sharpening in the SLR, manually in Lightroom, or to wait until you export or output to your final JPEG file or printer? And are the effects additive? If I add sharpening in all three places, am I probably over-sharpening?

    You should treat the two file types differently. RAW data never has any sharpening applied by the camera; only JPEGs do. Sharpening is often considered in a workflow with three steps (see here for a founding article about this idea).
    I. A capture sharpening step that corrects for the loss of sharp detail due to the Bayer array and the antialias filter and sometimes the lens or diffraction.
    II. A creative sharpening step where certain details in the image are "highlighted" by sharpening (think eyelashes on a model's face), and
    III. output sharpening, where you correct for loss of sharpness due to scaling/resampling or for the properties of the output medium (like blurring due to the way a printing process works, or blurring due to the way an LCD screen lays out its pixels).
    All three of these are implemented in Lightroom. I. and III. are essential and should basically always be performed; II. is up to your creative spirits. I. is the sharpening you see in the Develop panel. You should zoom in at 1:1 and optimize the parameters. The default parameters are OK but fairly conservative. Usually you can increase the mask value a little so that you're not sharpening noise, and play with the other three sliders. Jeff Schewe gives an overview of a simple strategy for finding optimal parameters here. This is for ACR, but the principle is the same. Most photos will benefit from a little optimization. Don't overdo it, but just correct for the softness at 1:1.
    Step II, as I said, is not essential, but it can be done using the local adjustment brush, or you can go to Photoshop for this. Step III is, however, very essential. This is done in the Export panel, the Print panel, or the Web panel. You cannot really preview these things (especially the print-directed sharpening) and it will take a little experimentation to see what you like.
    For JPEG, the sharpening is already done in the camera. You might add a little extra capture sharpening in some cases, or simply lower the sharpening in camera and then have more control in post, but usually it is best to leave it alone. Steps II and III, however, are still necessary.
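    (To illustrate why output sharpening is treated as its own step, here is a small Python/Pillow sketch - not Lightroom's actual pipeline; the file name, sizes and unsharp-mask settings are just examples:)
    from PIL import Image, ImageFilter
    img = Image.open("IMG_0001.jpg")
    # "capture"-style sharpening at the native resolution
    capture = img.filter(ImageFilter.UnsharpMask(radius=2, percent=80, threshold=2))
    # "output" sharpening: resample for the screen first, then sharpen for that size
    for_screen = capture.resize((1200, 800), resample=Image.BICUBIC)
    final = for_screen.filter(ImageFilter.UnsharpMask(radius=1, percent=60, threshold=0))
    final.save("IMG_0001_web.jpg", quality=90)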

Maybe you are looking for

  • Abap code to check number of partitions in the cubes.

    Hello All, I am working on performance tuning and need to delete (DROP_EMPTY_FPARTITIONS) the empty partitions, and I need the code (program) for finding Cubes/Aggregates with more than x empty partitions. Thank you for your time and consideration. Nish

  • User Authorisation - storage location

    Hi, we have three storage locations under one plant. Can we restrict a user to access only the UM1 storage location and not the other two storage locations? Plant: X70; Storage Location: UM1, UN1

  • Extra line after first item in quiz questions

    I just started using Captivate 7.  I made a template with some quiz questions, and there is an extra line after the 2nd radio button in the templates for multiple choice and true/false questions.  Is that normal? (Screenshot attached.)

  • IPv6 NAT command not working on Cisco 1941 ISR

    Dear All, I am using a Cisco 1941 router with an IP Base image. I am not able to configure NAT-PT on this router. When I type "ipv6" I am not able to see the NAT command after that. Do I need to do something different? Any suggestions here will be helpful.

  • Data typed into this form will not be saved.  Adobe Reader can only save a blank copy of this form.

    This message is new.  I have never had this issue before on my system.  3 days ago I created a PDF using LiveCycle Designer ES2 at my office - exactly like I have done for the past 2 years - and now I get this message when I try to open the PDF on my