Best way to fetch next/previous N records without re-querying the DB?

Because the DB query is very expensive, I only want to run it once and have the web app "remember" all the records. Any suggestions? Thanks!

BobXu wrote:
I have tried my best to improve things on the backend, but it has not been enough, so I had better find a way to solve this problem on the front end. I am thinking maybe Ajax. Any thoughts?

Ajax isn't a panacea that will fix all your woes.
You will want to optimize your DB queries. If a single access is expensive, break it down into several smaller queries and only pull out a sub-list of data for each page. If you don't need to display every piece of data at a single time, then why ask the database for it all? Hop on over to the JDBC forum and ask them if there isn't room for improvement in your query, table format, or access style.
That said, if the contents of the database are relatively static and paging won't help for some reason, and there is nothing you can do to clean up the connection to the server, you might try caching the results to a local file system or a smaller database that is on the same machine as the web server.
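For illustration, here is a minimal paging sketch in Oracle-style SQL using the classic ROWNUM wrapper (table, column, and bind names are hypothetical); each page re-runs a small query instead of pulling the whole result set:

    SELECT id, name
    FROM  ( SELECT t.*, ROWNUM rn
            FROM  ( SELECT id, name
                    FROM   some_big_table
                    ORDER  BY name, id ) t
            WHERE  ROWNUM <= :last_row_of_page )
    WHERE  rn > :first_row_of_page;

With an index that supports the ORDER BY, each page stays cheap, and the web app no longer has to hold every record in memory.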

Similar Messages

  • Is there an easy way to fetch next/previous record in Apex?

    I am new to APEX/Oracle, but have a lot of experience as a mainframe programmer and some experience with SQL.
    I have been tasked by my boss to create a set of pages in an application as follows.
    Page 1: Select an Employees Name and go to Page 2.
    Page 2: Display Employee's Biography Information.
    Add a "Next Employee" button and a "Previous Employee" button that will fetch the next/previous Employees Biography info respectively.
    In essence, he wants a query with every employee's biography information with the employee selected on page 1 used as the starting pointer.
    I have successfully built "select an Employee's name on page 1" and "display his/her info on page 2" with a query that returns a single record.
    What I cannot figure out is how to get a Next and a Previous button to fetch the next or previous record in a multi-record query, using the initially selected employee as the initial pointer.
    Is there an easy way to build this using built-in APEX functionality, or will it require programming to achieve this requirement?

    Bob,
    I installed the Aria application, but now I wish I'd run the preview first. It's a cool application, but I don't see anything like what greich requested. I'm looking for the same thing.
    1. ... and clicked the Edit or View Details button for an individual.
    2. That takes me to a custom Form page that shows one person.
    I'm trying to imagine how I'd code buttons for that Form page to let me click to see the next or previous person on that same form. My mind gets totally boggled when I consider that the user might have filtered the report to only show a few records. How do I have any idea what those IDs were and what order they were showing in, to know which is the next individual?
    My only thought is to create a process on the report that could save the primary key (e.g. employee ID) to a table or APEX collection? Then the form button runs a process that finds the current ID and then uses next or previous?
    I'm not sure how I could capture the PK in the report in order to save it.
    Does this make sense? Anyone got a better idea?
    Thanks,
    Stew
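    For what it's worth, one way to sketch the next/previous lookup in plain SQL is with the LAG/LEAD analytic functions (table, column, and item names below are hypothetical, and the ORDER BY must match the sort order the report uses):

        SELECT prev_id, next_id
        FROM  ( SELECT emp_id,
                       LAG(emp_id)  OVER (ORDER BY emp_name, emp_id) AS prev_id,
                       LEAD(emp_id) OVER (ORDER BY emp_name, emp_id) AS next_id
                FROM   employees )
        WHERE  emp_id = :P2_EMPLOYEE_ID;

    The Next and Previous buttons would then branch back to the same page, passing next_id or prev_id as the new value of the page item.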

  • Best way to Fetch the record

    Hi,
    Please suggest the best way to fetch the records from the table designed below. It is Oracle 10gR2 on Linux.
    Whenever a client visits the office, a record is created for him. The company policy is to maintain 10 years of data in the transaction table, and the table accumulates about 3 million records per year.
    The table has the following key columns for the SELECT (sample table):
    Client_Visit
      ID           NUMBER(12,0)   -- sequence-generated number
      EFF_DTE      DATE           -- effective date of the customer (sometimes the client becomes invalid and later becomes valid again)
      CREATE_TS    TIMESTAMP(6)
      CLIENT_ID    NUMBER(9,0)
      CASCADE_FLG  VARCHAR2(1)
    On most of the reports the records are fetched by MAX(eff_dte) and MAX(create_ts) with cascade flag = 'Y'.
    I have the following two queries, but both of them are not cost-effective and take 8 minutes to display the records.
    Code 1:
    SELECT au_subtyp1.au_id_k,
           au_subtyp1.pgm_struct_id_k
    FROM   au_subtyp au_subtyp1
    WHERE  au_subtyp1.create_ts =
              (SELECT MAX (au_subtyp2.create_ts)
               FROM   au_subtyp au_subtyp2
               WHERE  au_subtyp2.au_id_k = au_subtyp1.au_id_k
               AND    au_subtyp2.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
               AND    au_subtyp2.eff_dte =
                         (SELECT MAX (au_subtyp3.eff_dte)
                          FROM   au_subtyp au_subtyp3
                          WHERE  au_subtyp3.au_id_k = au_subtyp2.au_id_k
                          AND    au_subtyp3.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
                          AND    au_subtyp3.eff_dte <= TO_DATE ('2012-12-31', 'YYYY-MM-DD')))
    AND    au_subtyp1.exists_flg = 'Y'
    Explain Plan
    Plan hash value: 2534321861
    | Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT         |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
    |*  1 |  FILTER                  |           |       |       |       |            |          |
    |   2 |   HASH GROUP BY          |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
    |*  3 |    HASH JOIN             |           |  1404K|   121M|    19M| 33178   (1)| 00:06:39 |
    |*  4 |     HASH JOIN            |           |   307K|    16M|  8712K| 23708   (1)| 00:04:45 |
    |   5 |      VIEW                | VW_SQ_1   |   307K|  5104K|       | 13493   (1)| 00:02:42 |
    |   6 |       HASH GROUP BY      |           |   307K|    13M|   191M| 13493   (1)| 00:02:42 |
    |*  7 |        INDEX FULL SCAN   | AUSU_PK   |  2809K|   125M|       | 13493   (1)| 00:02:42 |
    |*  8 |      INDEX FAST FULL SCAN| AUSU_PK   |  2809K|   104M|       |  2977   (2)| 00:00:36 |
    |*  9 |     TABLE ACCESS FULL    | AU_SUBTYP |  1404K|    46M|       |  5336   (2)| 00:01:05 |
    Predicate Information (identified by operation id):
       1 - filter("AU_SUBTYP1"."CREATE_TS"=MAX("AU_SUBTYP2"."CREATE_TS"))
       3 - access("AU_SUBTYP2"."AU_ID_K"="AU_SUBTYP1"."AU_ID_K")
       4 - access("AU_SUBTYP2"."EFF_DTE"="VW_COL_1" AND "AU_ID_K"="AU_SUBTYP2"."AU_ID_K")
       7 - access("AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd
                  hh24:mi:ss') AND "AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
           filter("AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND
                  "AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
       8 - filter("AU_SUBTYP2"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
        9 - filter("AU_SUBTYP1"."EXISTS_FLG"='Y')
    Code 2:
    I raised a thread a week back and Dom suggested the following query. It is cost-effective, but the performance is the same and it uses the same amount of temp tablespace.
    SELECT au_id_k, pgm_struct_id_k
    FROM  ( SELECT au_id_k,
                   pgm_struct_id_k,
                   ROW_NUMBER() OVER (PARTITION BY au_id_k
                                      ORDER BY eff_dte DESC, create_ts DESC) rn,
                   create_ts, eff_dte, exists_flg
            FROM   au_subtyp
            WHERE  create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
            AND    eff_dte  <= TO_DATE('2012-12-31','YYYY-MM-DD') ) d
    WHERE  rn = 1
    AND    exists_flg = 'Y'
    --Explain Plan
    Plan hash value: 4039566059
    | Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT         |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
    |*  1 |  VIEW                    |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
    |*  2 |   WINDOW SORT PUSHED RANK|           |  2809K|   133M|   365M| 40034   (1)| 00:08:01 |
    |*  3 |    TABLE ACCESS FULL     | AU_SUBTYP |  2809K|   133M|       |  5345   (2)| 00:01:05 |
    Predicate Information (identified by operation id):
       1 - filter("RN"=1 AND "EXISTS_FLG"='Y')
       2 - filter(ROW_NUMBER() OVER ( PARTITION BY "AU_ID_K" ORDER BY
                  INTERNAL_FUNCTION("EFF_DTE") DESC ,INTERNAL_FUNCTION("CREATE_TS") DESC )<=1)
       3 - filter("CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND "EFF_DTE"<=TO_DATE('
                  2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
    Thanks,
    Vijay

    Hi Justin,
    Thanks for your reply. I am running this in our test environment, as I don't want to run it in the production environment right now. The test environment holds 2,809,605 records (roughly 2.8 million).
    The query output count is 281,699 records (roughly 282 thousand) and the selectivity is 0.099. The number of distinct (create_ts, eff_dte, exists_flg) combinations is 2,808,905. I am sure an index scan is not going to help much, as you said.
    The core problem is that both queries use a lot of temp tablespace. When we use this query to join to other tables (the other table has the same design as below), the temp tablespace usage grows even bigger.
    Both the production and test environment are 3 Node RAC.
    First Query...
    CPU used by this session     4740
    CPU used when call started     4740
    Cached Commit SCN referenced     21393
    DB time     4745
    OS Involuntary context switches     467
    OS Page reclaims     64253
    OS System time used     26
    OS User time used     4562
    OS Voluntary context switches     16
    SQL*Net roundtrips to/from client     9
    bytes received via SQL*Net from client     2487
    bytes sent via SQL*Net to client     15830
    calls to get snapshot scn: kcmgss     37
    consistent gets     52162
    consistent gets - examination     2
    consistent gets from cache     52162
    enqueue releases     19
    enqueue requests     19
    enqueue waits     1
    execute count     2
    ges messages sent     1
    global enqueue gets sync     19
    global enqueue releases     19
    index fast full scans (full)     1
    index scans kdiixs1     1
    no work - consistent read gets     52125
    opened cursors cumulative     2
    parse count (hard)     1
    parse count (total)     2
    parse time cpu     1
    parse time elapsed     1
    physical write IO requests     69
    physical write bytes     17522688
    physical write total IO requests     69
    physical write total bytes     17522688
    physical write total multi block requests     69
    physical writes     2139
    physical writes direct     2139
    physical writes direct temporary tablespace     2139
    physical writes non checkpoint     2139
    recursive calls     19
    recursive cpu usage     1
    session cursor cache hits     1
    session logical reads     52162
    sorts (memory)     2
    sorts (rows)     760
    table scan blocks gotten     23856
    table scan rows gotten     2809607
    table scans (short tables)     1
    user I/O wait time     1
    user calls     11
    workarea executions - onepass     1
    workarea executions - optimal     9
    Second Query
    CPU used by this session     1197
    CPU used when call started     1197
    Cached Commit SCN referenced     21393
    DB time     1201
    OS Involuntary context switches     8684
    OS Page reclaims     21769
    OS System time used     14
    OS User time used     1183
    OS Voluntary context switches     50
    SQL*Net roundtrips to/from client     9
    bytes received via SQL*Net from client     767
    bytes sent via SQL*Net to client     15745
    calls to get snapshot scn: kcmgss     17
    consistent gets     23871
    consistent gets from cache     23871
    db block gets     16
    db block gets from cache     16
    enqueue releases     25
    enqueue requests     25
    enqueue waits     1
    execute count     2
    free buffer requested     1
    ges messages sent     1
    global enqueue get time     1
    global enqueue gets sync     25
    global enqueue releases     25
    no work - consistent read gets     23856
    opened cursors cumulative     2
    parse count (hard)     1
    parse count (total)     2
    parse time elapsed     1
    physical read IO requests     27
    physical read bytes     6635520
    physical read total IO requests     27
    physical read total bytes     6635520
    physical read total multi block requests     27
    physical reads     810
    physical reads direct     810
    physical reads direct temporary tablespace     810
    physical write IO requests     117
    physical write bytes     24584192
    physical write total IO requests     117
    physical write total bytes     24584192
    physical write total multi block requests     117
    physical writes     3001
    physical writes direct     3001
    physical writes direct temporary tablespace     3001
    physical writes non checkpoint     3001
    recursive calls     25
    session cursor cache hits     1
    session logical reads     23887
    sorts (disk)     1
    sorts (memory)     2
    sorts (rows)     2810365
    table scan blocks gotten     23856
    table scan rows gotten     2809607
    table scans (short tables)     1
    user I/O wait time     2
    user calls     11
    workarea executions - onepass     1
    workarea executions - optimal     5
    Thanks,
    Vijay
    Edited by: Vijayaraghavan Krishnan on Nov 28, 2012 11:17 AM
    Edited by: Vijayaraghavan Krishnan on Nov 28, 2012 11:19 AM
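    For comparison, the same "latest row per au_id_k" requirement can also be written with the KEEP (DENSE_RANK LAST) aggregate, which is sometimes worth benchmarking against the ROW_NUMBER version for sort/temp usage. This is only a sketch against the same table, and the result can differ from Code 2 when there are exact ties on eff_dte and create_ts:

        SELECT au_id_k, pgm_struct_id_k
        FROM  ( SELECT au_id_k,
                       MAX(pgm_struct_id_k) KEEP (DENSE_RANK LAST ORDER BY eff_dte, create_ts) AS pgm_struct_id_k,
                       MAX(exists_flg)      KEEP (DENSE_RANK LAST ORDER BY eff_dte, create_ts) AS exists_flg
                FROM   au_subtyp
                WHERE  create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
                AND    eff_dte  <= TO_DATE('2012-12-31','YYYY-MM-DD')
                GROUP  BY au_id_k )
        WHERE  exists_flg = 'Y';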

  • What is the best way to archive old e-mail folders without buying software? I tried to drag them into a folder on the hard drive but they get saved in a strange format

    What is the best way to archive old e-mail folders without buying software? I tried to drag them into a folder on the hard drive, but they get saved in a strange format.

    This is on sale for US $12:
    http://www.mupromo.com/?ref=4506

  • Best way to delete large number of records but not interfere with tlog backups on a schedule

    I've inherited a system with multiple databases, and there are DB and tlog backups that run on schedules.  There is a list of tables that need a lot of records purged from them.  What would be a good approach to use for deleting the old records?
    I've been digging through old posts, reading best practices, etc., but I'm still not sure of the best way to attack it.
    Approach #1
    A one-time delete that does everything: delete all the old records, in batches of say 50,000 at a time.
    After each run through all the tables for that DB, execute a tlog backup.
    Approach #2
    Create a job that does a similar process as above, except don't loop; only run the batch once. Have the job scheduled to start, say, on the half hour, assuming the tlog backups run every hour.
    Note:
    Some of these (well, most) are going to have relations on them.

    Hi shiftbit,
    Based on your description, I have changed the type of this question to a discussion, so that more experts will see the issue and can assist you. When deleting a large number of records from tables, you can use batched deletes so that the transaction log does not keep growing and run out of disk space. If you can take the table offline for maintenance, a complete reorganization is always best, because it does the delete and places the table back into a pristine state.
    For more information about deleting a large number of records without affecting the transaction log, see:
    http://www.virtualobjectives.com.au/sqlserver/deleting_records_from_a_large_table.htm
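    As a rough illustration of that batched approach (table and column names here are hypothetical, and the loop would be coordinated with your tlog backup schedule):

        -- Delete old rows in chunks so each batch commits and the log space can be reused
        WHILE 1 = 1
        BEGIN
            DELETE TOP (50000)
            FROM   dbo.SomeHistoryTable
            WHERE  created_date < DATEADD(YEAR, -10, GETDATE());

            IF @@ROWCOUNT = 0 BREAK;   -- nothing left to delete
        END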
    Hope it can help.
    Regards,
    Sofiya Li
    TechNet Community Support

  • Best Way to Liven a Slightly Dull Recording

    I'm starting to notice that most of my mixes are actually a little too "warm"; they seem to be missing some high-frequency energy. Many of the tunes sound better in my car when I bump the treble up a notch or two.
    My guess is that I'm losing high frequencies due to imbalanced absorption in the studio. I'm constructing new panels this week that will be far superior to the current eggshell foam (yikes!), but would love some input on the best way to correct this after the fact.
    Should I attempt to add some highs back in on individual tracks first, or just treat this in mastering, since all of the instruments that were recorded live are likely to have similar frequencies missing? Linear-phase EQ on the master output, multiband compression, or download and try out Ozone? Thanks everybody.
    Once I get the band's permission I'm going to post some of the songs, I would love some constructive criticism as well.

    OK, so the (nonexistent) SRC won't hurt your track.
    POW-r algorithms were optimized for signals in (what I've heard):
    #1 Speech
    #2 Pop
    #3 Classical
    That's about signal dynamics and frequency content, not the intent; they are very generalized examples and nothing near a rule of thumb. #3 may work best for a Pop track and any variations. If you are going to pick one without comparing, you could try #2... but it is best to compare when you have time. An easy way to compare is to reduce the input amplitude, then boost the post-dither output (in effect, amplifying the noise/dither).
    By amplitude practices, I just meant your gain structures in the mix/mastering stages. An example would be how high the peak sample of the bounce is... since you say your mixes have dynamic range, you'll have less to worry about there too.
    That old thread... I can actually guess the one. The closest measurement I have made so far has been a -188 dB signal/error ratio induced by rounding, which of course accumulates.
    Usually, the value off the fader is anything but 0 (makes perfect sense: different fader settings normally just apply different degrees of DC offset, around -188 dB). If you really listen and demand excellence, you can hear it. I have since come up with a workaround, which should be lossless, so I can continue on my merry way with less compromise while recording.
    re: Fooling with tight mixes
    yeah, you could try moving all that automation around but usually it doesn't end up quite the same.
    Cheers,
    J

  • What is the best way to get the last record from an internal table?

    Hi,
    What is the best way to get the latest year and month?
    I.e. the last record (KD00011001H 1110 2007 11),
    not KE00012002H or KA00012003H.
    Is there any function for the MBEWH table?
    MATNR                 BWKEY      LFGJA LFMON
    ========================================
    KE00012002H        1210             2005  12
    KE00012002H        1210             2006  12
    KA00012003H        1000             2006  12
    KD00011001H        1110             2005  12
    KD00011001H        1110             2006  12
    KD00011001H        1110             2007  05
    KD00011001H        1110             2007  08
    KD00011001H        1110             2007  09
    KD00011001H        1110             2007  10
    KD00011001H        1110             2007  11
    thank you
    dennis
    Edited by: ogawa Dennis on Jan 2, 2008 1:28 AM
    Edited by: ogawa Dennis on Jan 2, 2008 1:33 AM

    Hi dennis,
    you can try this:
    SORT it_mbewh BY lfgja DESCENDING lfmon DESCENDING.   " it_mbewh = your internal table of MBEWH rows
    Thanks
    William Wilstroth
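    If the rows are still on the database and your release supports window functions in SQL, the same "latest period per material" can also be sketched directly in SQL (the column names are the standard MBEWH fields listed in the post above):

        SELECT matnr, bwkey, lfgja, lfmon
        FROM  ( SELECT matnr, bwkey, lfgja, lfmon,
                       ROW_NUMBER() OVER (PARTITION BY matnr, bwkey
                                          ORDER BY lfgja DESC, lfmon DESC) rn
                FROM   mbewh )
        WHERE  rn = 1;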

  • What is the best way to send auto mail with an Excel (generated by query) attachment?

    Hello,
    I need to generate the data from a stored procedure first, then save it to an Excel file, and then send a mail with that Excel file attached.
    So I am searching for the best way to do the whole process in a single task on a daily or monthly schedule.
    As per my understanding, we could do it via SSIS, or from SQL Server using
    sp_send_dbmail.
    But I have to generate the Excel file first from the stored procedure and then send it by mail.
    So please suggest the best way to accomplish everything in a single task.
    Thanks
    Ajay 
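    For the sp_send_dbmail route mentioned above, a minimal sketch that mails query output as an attachment could look like this (the profile name, recipient, and procedure name are placeholders; note it attaches a delimited text file rather than a true .xlsx, so SSIS is the better fit if a real Excel workbook is required):

        EXEC msdb.dbo.sp_send_dbmail
             @profile_name                = 'MyMailProfile',        -- placeholder
             @recipients                  = 'someone@example.com',  -- placeholder
             @subject                     = 'Daily report',
             @query                       = 'EXEC dbo.usp_MyReport;',
             @attach_query_result_as_file = 1,
             @query_attachment_filename   = 'report.csv',
             @query_result_separator      = ',';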

    Hi Ajay,
    As shown in the blog mentioned by Syed, to delete records from the Excel file you need to use an UPDATE ... OPENROWSET command to update the corresponding rows to blank.
    Alternatively, you can use a Derived Column Transformation or a Conditional Split Transformation after the OLE DB Source, so that you can replace the target rows with NULL before exporting to the Excel destination, or just redirect the expected records to the Excel destination.
    Then, in the Control Flow, you can add a Send Mail Task to send the Excel file by setting it as the attachment.
    Regards,
    Mike Yin
    TechNet Community Support

  • Best way to call methods on similar objects without an interface

    Hi,
    I have two objects that I need to iterate over; they both have the same method I need to call during the iteration, but the two classes come from different libraries and I cannot change them to make them implement a common interface...
       for (Iterator it = documents.iterator(); it.hasNext();) {
           Document1 document = (Document1) it.next();
           document.getName();
       }
    But I can also get a document collection where the objects are Document2 rather than Document1, which also has getName(). What's the best way to implement this? I know I can just add if conditions, i.e. if (instanceof) do this or that... but I don't think this is good, since every time there's a new type of doc I'd have to add one more...
    Any suggestions?
    Thanks,

    I have two objects that I need to iterate over; they both have the same method I need to call during the iteration, but the two classes come from different libraries and I cannot change them to make them implement a common interface...
    You already know what you need to do. You just don't want to do it.
    You can't treat two (or more) instances the same if they aren't the same. Here are three methods:
    1. Add code (like you propose) to determine which type you have
    2. Create your own classes that extend those classes and have your own class implement your own interface that has the 'getName' method. When you create instances of your own class the constructor can have code that determines which type you have and then use the appropriate 'getName' method. Your app code would use your own classes instead of the ones from the libraries
    3. Use reflection to call the 'getName' method.

  • Best way to add thousands of items to a ListBox without freezing the UI in WPF

    Hello guys.
    What is the best way to add thousands of items (or even more) to a ListBox without freezing the UI?
    I have searched many posts on the web, but I don't understand how the writers of those posts wrote their code.
    I realize that an ObservableCollection can hold 1,000 items or even more and show them in a ListBox within a few seconds without freezing the UI, but I don't know how I can use that!
    Can you guys give me an example?
    thanks.

    If you bind an ObservableCollection you can add items to it from a background thread.  Then bind that to the ItemsSource.  I usually new up an ObservableCollection, add the items, then set the bound property to that.
    But I avoid thousands of items.
    You should provide some sort of filter mechanism: the user chooses a category or whatever, and then fill with a maximum of 300 items.
    Users simply can't work with thousands of items.
    It is usually reading the data out of a database which takes the time, rather than creating the objects to bind.
    Hence this:
     protected async override void GetData()
     {
         ThrobberVisible = Visibility.Visible;
         ObservableCollection<CustomerVM> _customers = new ObservableCollection<CustomerVM>();
         // Asynchronous Entity Framework query, so the UI thread is not blocked
         var customers = await (from c in db.Customers
                                orderby c.CustomerName
                                select c).ToListAsync();
         // Wrap each entity in a viewmodel
         foreach (Customer cust in customers)
         {
             _customers.Add(new CustomerVM { IsNew = false, TheEntity = cust });
         }
         Customers = _customers;
         RaisePropertyChanged("Customers");
         ThrobberVisible = Visibility.Collapsed;
     }
    That's making an asynchronous entity framework call.
    A list of customers is obtained.
    These are then wrapped in a customer viewmodel each.
    Finally the observablecollection Customers is set to this new collection and propertychanged raised to notify the view.
    The itemssource of a datagrid is bound to Customers.
    Hope that helps.
    Recent Technet articles: Property List Editing;
    Dynamic XAML

  • What is the best way to load 14 million COPA records from BW into HANA?

    I have been managing a project in which we are attempting to load COPA data from BW into HANA using Open Hub and we continue to run into memory allocation errors in BW. We have been able to load 350,000 records.
    Any suggestions on what the best approach would be along with BW memory parameters.
    Your replies are appreciated.
    Rob

    Hello,
    this seems to be an issue in BW caused by the big volume of migrated data. I do not think this is a HANA-related problem. I would suggest posting this message in the BW area - you might get much better support there.
    But to help as much as I can - I found this (see point 7):
    http://help.sap.com/saphelp_nw04/helpdata/en/66/76473c3502e640e10000000a114084/frameset.htm
    7. Specify the number of rows per data package for the data records to be extracted. You can use this parameter to control the maximum size of a data package, and hence also how many main memories need to be made available to structure the data package.
    Hope it helps.
    Tomas

  • Best Way to Go Next to Improve Bright Sky Color?

    Hi guys,
    I've got most of my colors coming out just about the way I want them in Camera Raw, save for one thing:  My bright blue skies come out kind of drab - especially by comparison to the in-camera JPEGs.  Note the difference, for example, in the sky colors in this representative sample.  I'm using the Camera Standard profile from Adobe with some tweaks to the sliders by default, but none of the sliders available will allow me to saturate the sky color without corrupting the balance of the darker colors.
    I wouldn't mind a much more saturated sky but with the bluer color I'm getting from Photoshop, but the real key here is the level of color saturation in the sky.  Clearly the Canon is interpreting the data in the bright sky area pretty differently than Adobe's converter.  In my opinion they're doing a better job of it than Adobe.
    My question is this:  In pursuit of more saturated bright colors, should I tweak up my own profile?  Or perhaps should I craft some action steps in Photoshop proper to correct this?  The former would yield a more integrated, direct result, but the latter wouldn't be too bad as I already use an action to convert images from ProPhoto RGB to sRGB.
    Any suggestions you have would be greatly appreciated.
    Thanks.
    -Noel

    Noel Carboni wrote:
    Hi guys,
    I've got most of my colors coming out just about the way I want them in Camera Raw, save for one thing:  My bright blue skies come out kind of drab - especially by comparison to the in-camera JPEGs.  Note the difference, for example, in the sky colors in this representative sample.  I'm using the Camera Standard profile from Adobe with some tweaks to the sliders by default, but none of the sliders available will allow me to saturate the sky color without corrupting the balance of the darker colors.
    I wouldn't mind a much more saturated sky but with the bluer color I'm getting from Photoshop, but the real key here is the level of color saturation in the sky.  Clearly the Canon is interpreting the data in the bright sky area pretty differently than Adobe's converter.  In my opinion they're doing a better job of it than Adobe.
    My question is this:  In pursuit of more saturated bright colors, should I tweak up my own profile?  Or perhaps should I craft some action steps in Photoshop proper to correct this?  The former would yield a more integrated, direct result, but the latter wouldn't be too bad as I already use an action to convert images from ProPhoto RGB to sRGB.
    Any suggestions you have would be greatly appreciated.
    Thanks.
    -Noel
    Noel,
    In short, the color transformation from RAW to the output color space in ACR consists of a transformation by a combination of matrix and lookup-table operations, corrected via a tone curve. The tone curve essentially compresses highlights, so that the dynamic range of the sensor fits into the dynamic range of the output media (monitor/paper). As a result, the contrast and saturation of highlights are compressed and you get a drab sky (and all other highlight colors, it's just most visible in the sky). On the other side, the contrast of midtones and shadows is enhanced, so darkly exposed sky/sea/shadows are oversaturated.
    That's one of the reasons why we need rendering: to make the photo look more natural. In the Adobe standard profiles, the amount of rendering is moderate (older versions) or close to zero (new versions), where really the only rendering is achieved by the tone curve.
    Manufacturers like Canon, Nikon, etc. tweak these color profiles to overcome this and other problems (one of them also being the handling of blown colors). So there are different camera profiles, like Standard, Landscape, Portrait... However, every manufacturer does this differently, so colors from Canon cameras are different from those from Nikon cameras. As I understand Jeff in his previous posts, the Adobe team doesn't feel there is a need for additional rendering for some reason, so the Adobe standard profile has a minimal amount of rendering.
    If you want to use the Adobe standard profile, you have to live with that, as you can't correct highlights separately from shadows with either ACR or DNG PE; the correction is only 2D (there is a similar tool from Canon that supports 3D correction, but it works only with DPP and Canon cameras). If you like the colors from the camera, use the Camera profiles in ACR. But the built-in camera profiles (and consequently the camera profiles from Adobe as well) also have some faults. For instance, on my 400D, a sky that is close to being blown (in raw color space) but still not blown has too low a hue and also looks unnatural with a cyan tone (sRGB only, while the aRGB profile that is emulated in ACR is better), like in the photo you posted above. On the other side, on the A620 compact, cyan colors have too high a hue and too low a saturation (to prevent a blown sky from turning cyan, I presume), making some beautiful beach scenes quite dull, etc.

  • What is the best way to connect my previous PowerG4 to the mac mini?

    I want to utilize my old G4's HD, but I don't know how to do it so that everything runs smoothly. Right now it is hooked up to the mini in "target mode" via FireWire, but it seems to slow everything down. Is an enclosure the best option and, if so, which one?

    Hi KalikoDawg,
    If you don't use the Ethernet connectors, you might as well just set up a network connection between the two. All you need is a twisted-pair Ethernet cable (usually with green plugs) and I'm pretty sure you won't have slowdowns. I don't know about booting from the remote disk, though; I'd guess it doesn't work. But if you just want to grab data off the old G4's HD, that is a good solution.
    Just remember to enable the necessary sharing settings in Network preferences.
    Hope it was a help.
    best
    mathias

  • What is the best way to "clean up" a VHS recording after importing?

    I have been importing very old home videos. Some of these are really showing their age. Is there a program that will restore them, like Photoshop does with pictures?

    I was under the impression that a TBC only provided a stable signal input. I have a Sony Media Converter on my computer that isn't great, but it works. What I am looking for is something like "EnhanceMovie" so that I can improve the brightness, sharpness, etc. in my imports. Do you know of any Mac software that can do this?

  • What is the best way to clean up individual video frames without degrading the quality?

    I have a video from which I need to remove several birds that flew by during a scene.
    The video is an HD 422 1920x1080/23.97P video.
    Is the best approach to render files from Premiere Pro, open those files in Photoshop, work on each frame, and then render the file again from within Photoshop?  If yes, doesn't this degrade the video quality somewhat?
    The next part may require a separate post in the Photoshop forum.
    I've tried to import my rendered .mpg and .mpeg files into Photoshop, but I continue to get the error message "the video file could not be opened."  I've rendered some other formats, .avi and .mov, and those two file types open within Photoshop V13 x64 okay.
    CS 6
    Windows 7 x64
    Sincerely,
    JC

    While you can adjust every frame in Photoshop to remove items, a better way is to use the Clone Stamp tool in After Effects for removing items in a scene.
    Here's an excellent video tutorial: http://www.video2brain.com/en/lessons/removing-an-object-with-clone-stamp
    Here's the Help article: http://adobe.ly/Yte2cf

Maybe you are looking for

  • Unable to create new rows in master detail

    Hi, I am new to ADF. I am using JDeveloper 11.1.1.6. I have two tables, customer and contacts; customerId is a foreign key in contacts. When a customer is deleted, all the contacts need to be deleted. I have created an association (1 to * customer to conta

  • How to Get User entered value in a text variable

    Hi, I have made a text variable to show the user-entered value in the column header. The user enters a value in a formula variable ABC whose default value is, say, '30'. I am using a customer exit to capture this value in the text variable. The code is as

  • ORA-31185 DOM NODES DO NOT BELONG TO THE SAME DOCUMENT

    Hi all, I developed an Oracle function that returns a XMLTYPE. In this function, I first created a node <ROOT> and I would like to append children nodes to this root node. To create children nodes I loop on a cursor and each iteration of this loop "b

  • InDesign installation that supports all languages

    Hi all, I'm a Creative Cloud user and I would like to ask if there is any way of having InDesign installed in a version that is friendly to all world languages. I work primarily with European languages but I also sometimes need to work with translati

  • Error when cancelling cleared invoice

    Hi, I have a problem when cancelling a cleared invoice. The system pops up an error log: 'Cancellation document is not the same as the original billing doc. $$$', and 'Data inconsistency during processing of document $$$'. I have not found the clue yet bu