Component BI_CONT: SP level 0001 is incorrect, expected 6

Hello!
I ran into a problem after installing Solution Manager 7.01 (with SP26).
In the INSTALLATION CHECK step, the Diagnostics Prerequisites activity returns status "Activity Error" with the explanation:
Component BI_CONT: SP level 0001 is incorrect, expected 6
Does anybody know what I need to do, or which steps I can take to resolve this problem?
Best regards
Dmitry Zakharenkov

Hi
Please verify that your Solution Manager meets all the prerequisites specified in the note for your SP level:
[Note 1274287 - End-to-End Diagnostics 7.0 EHP1 SP18 to SP26|https://websmp230.sap-ag.de/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/sapnotes/index2.htm?numm=1274287]
If it does not, bring your component levels in line with the note.
If it does, check the fix in the following note to avoid the activity error:
[Note 1297598 - Fix in prerequisite|https://websmp230.sap-ag.de/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/sapnotes/index2.htm?numm=1297598]
Jansi

Similar Messages

  • Component ST-A/PI: release 01N_700SOL is incorrect, expected 01[M-Z]_CRM.*

    Hello!
    I ran into a problem after installing Support Package SAPKITL436 (SP26) on my Solution Manager 7.01.
    Support Package SAPKITL436 requires upgrading ST-A/PI from 01M_CRM570 to 01N_700SOL.
    After upgrading ST-A/PI and installing SP26, I tried to refresh the BASIC CONFIGURATION.
    In the INSTALLATION CHECK step, the Diagnostics Prerequisites activity returns status "Activity Error" with the explanation:
    Component ST-A/PI: release 01N_700SOL is incorrect, expected 01[M-Z]_CRM.*: see note 1274287
    I cannot view note 1274287.
    Does anybody know what I need to do? Can I ignore this message, or are there steps I can take to resolve it?
    Best regards
    Andrey Ivanov

    Hello,
    The Note is currently in status "For Checking" which means a change has been made and the review process is underway.
    It should be released shortly. However you can check the following:
    Requirements with component ST400 for SP26
        o  Solman 7.0 SP26 ABAP-Stack:
           -  Support Package Stack 07 for SAP WEB AS 7.01 AS ABAP
           -  Support Package 07 for PI_BASIS 701 (SAPK-70107INPIBASIS)
           -  Support Package 07 for SAP_ABA 701 (SAPKA70107)
           -  Support Package 07 for SAP_BASIS 701 (SAPKB70107)
           -  Support Package 07 for SAP_BW 701 (SAPKW70107)
           -  Support Package 16 for CPRXRPM 400 (SAPK-40016INCPRXRPM)
           -  Support Package 19 for SAP_AP (SAPKNA7019)
           -  Support Package 17 for BBPCRM 500 (SAPKU50017)
           -  Support Package 05 for CRMUIF 5.0 (SAPK-50005INCRMUIF)
           -  Support Package 09 for BI_CONT 7.04 (SAPK-70409INBICONT)
           -  Support Package 26 for ST 400 (SAPKITL436)
           -  Support Package 04 for ST-PI 2008_1_700 (SAPKITLRD4)
           -  Support Package 00 for ST-A/PI 01N_700SOL (SAPK-01NS2INSSA)
           -  Minimum Support Package 24 for ST-ICO 150_700 (SAPK-1507OINSTPL)
           -  Minimum Support Package 03 for ST-SER 701_2010_1 (SAPKITLOS3)
        o  SolMan 7.0 SP26 JAVA-Stack:
           -  Support Package Stack 07 for SAP WEB AS 7.01 AS JAVA
           -  Support Package 09 for SOLMANDIAG 7.01
    You should verify your component levels against the ones listed above.
    Also check AGS_UPDATE and ensure you have checked for, downloaded, and applied any and all updates.
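    The manual comparison against that list can be sketched in a few lines (plain Java here, just to illustrate the logic; the component names and levels are taken from the list above — in the real system you read the installed levels from System -> Status or transaction SPAM):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SpLevelCheck {

    // Returns messages for components whose installed SP level is below the required one.
    static List<String> missing(Map<String, Integer> required,
                                Map<String, Integer> installed) {
        List<String> result = new ArrayList<>();
        for (Map.Entry<String, Integer> e : required.entrySet()) {
            int have = installed.getOrDefault(e.getKey(), 0);
            if (have < e.getValue()) {
                result.add("Component " + e.getKey() + ": SP level " + have
                        + " is incorrect, expected " + e.getValue());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Required minimum levels for SolMan 7.0 SP26 (excerpt from the list above).
        Map<String, Integer> required = new LinkedHashMap<>();
        required.put("SAP_BASIS 701", 7);
        required.put("SAP_BW 701", 7);
        required.put("BI_CONT 704", 9);
        required.put("ST 400", 26);

        // Installed levels -- replace with the values from your own system.
        Map<String, Integer> installed = new LinkedHashMap<>();
        installed.put("SAP_BASIS 701", 7);
        installed.put("SAP_BW 701", 7);
        installed.put("BI_CONT 704", 1);   // too low: this is what the check flags
        installed.put("ST 400", 26);

        missing(required, installed).forEach(System.out::println);
        // Prints: Component BI_CONT 704: SP level 1 is incorrect, expected 9
    }
}
```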
    Hope this helps.
    Regards,
    Paul

  • Add component BI_CONT

    We have installed a new NetWeaver 7.0 ABAP system with EhP1. We installed component SAP_BW but did not install BI Content.
    Now I would like to add BI Content (component BI_CONT). How do I proceed?

    Brian,
    Just to add: first you need to install the BI_CONT add-on using transaction SAINT, and then apply all BI_CONT patches using transaction SPAM.
    Make sure you upgrade your SPAM/SAINT patch level to the latest one before you start the BI_CONT installation/patch upgrade.
    Check SAP Note 1000822 (Overview: Notes for BI_CONT add-on).
    Thanks,
    Sushil

  • SEM-BW component Upgrade to Level 4

    Hi,
    We are on SAP_BW release 701 level 3 and are upgrading the SEM-BW component from 604 level 2 to level 4.
    Does SAP_BW have to be upgraded to level 4 too?
    What are the pros and cons of upgrading to SEM-BW 604 level 4?
    Regards
    Rob

    Hi,
    Elements will be adjusted after the upgrade.
    Since you are already on 7.0, DTPs and transformations will stay the same; some 7.4-related functions will be enabled.
    For more information about the upgrade to 7.4, search for existing discussion threads or SAP notes on BW 7.4.
    Thanks

  • BI_CONT Latest Level

    Hi Guys,
    What is the latest level for BI_CONT (BI 7.0)?  SAP_BW is level 20.
    Thanks,
    Timi Jones

    Hi Timi:
        Please refer to SAP Note 1254946 - BI_CONT 704:Information on Add-On Support Packages
    II. Available Support Packages for BI_CONT 704:
    Following is the schedule for the BI_CONT Support Packages:
    Support Package       Type    Release Date
    SAPK-70401INBICONT    CSP     21.11.2008
    SAPK-70402INBICONT    CSP     03.03.2009
    SAPK-70403INBICONT    CSP     04.05.2009
    SAPK-70404INBICONT    CSP     29.09.2009
    SAPK-70405INBICONT    CSP     07.12.2009
    SAPK-70406INBICONT    CSP     01.03.2010
    Best Regards,
    Francisco Milán.

  • Usage task type, task component and task level and service number in time s

    Hi gurus,
    When I fill in the time sheet, I want to fill the fields task type, task component, task level, sender purchase order, PO item and service number, but the system gives me error LR105, "service does not exist". I understand that this error is caused by a missing record in the derivation for the task component, but I work with many services.
    Is there any way to set all services, or a range of services, in this derivation?
    Thanks
    Regards

    not answered

  • ALE Message type HRMD_A Infotype 0001 is incorrect using PROOF flag = X

    Hi
    We have an issue using HRMD_A for Infotype 0001 with the PROOF flag switched on. Using the PROOF flag triggers a call of HR_MAINTAIN_MASTERDATA instead of a direct write to the database tables. However, sometimes the update of Infotype 0001 is incomplete and records are lost. The data goes through a transformation in the inbound BAdI; there I have checked that the sequence is still correct. But when the data is written to Infotype 0001, it seems to overwrite the last entries.
    If anyone has any experience with this, I would be very interested in any input.
    Thanks in advance
    Sonny

    Hi Sandip
    Thanks for your response. If I understand you correctly, you suggest I use HR_INFOTYPE_OPERATION. This could be an alternative, but I would prefer to understand the reason behind the error.
    HR_INFOTYPE_OPERATION actually calls HR_MAINTAIN_MASTERDATA, so this FM might not solve the issue, but it does give the option of updating one record at a time.
    Sonny

  • BI_CONT Patch levels and Transports

    Hi,
    Our BI development and test systems are on a different (lower) BI Content Patch level than our productive system is.
    Can this cause problems (errors) when transporting to production?
    Thanks.

    Hi,
    I have already tested transports between different support package versions (even across BW versions, from 2.1 to 3.1) and it worked.
    My transports included only specific development objects and no customizing (just InfoCubes, objects, and so on), not Business Content developments.
    Regards
    Cyril

  • What would cause my battery level to read incorrectly?

    My level shows 90% and above all day. I switched to time remaining instead of percentage and it said 2:26; now the percentage says 87%.
    Any ideas? The Mac was made in 2008 (Aluminum MacBook 13").

    Welcome to Apple Support Communities.
    You likely need to calibrate the battery. It would not hurt to reset the SMC and also the PRAM.
    Battery Calibration: http://support.apple.com/kb/ht1490
    SMC reset: http://support.apple.com/kb/ht3964
    PRAM reset: http://support.apple.com/kb/ht1379

  • SOLMAN_SETUP - DIAGNOSTICS Prerequisites fails for BI_CONT

    Hi,
    We have installed a new SolMan 7.0 EHP1 (with the new SR1 DVD/media) and then updated it to SP27. The SP27 patching was done with the SPS stack, and this automatically installed the new component ST-BCO and updated BI_CONT to 706 SP1.
    During post-installation configuration, under SOLMAN_SETUP, we have observed that the "DIAGNOSTICS Prerequisites" step is failing with the messages
    Component BI_CONT: release 706 is incorrect, expected 704: see note 1274287
    Component BI_CONT: SP level 0001 is incorrect, expected 6
    As per SAP Note 1274287, our system meets the minimum system requirements for BI_CONT,
    but not for ST-BCO, as the minimum requirement is Support Package 01 for ST-BCO 400 (SAPK-40001INSTBCO ?).
    My questions are:
    Any idea why the Diagnostics Prerequisites are still failing?
    I can't find this patch anywhere in the SAP Service Marketplace, so any idea where to download it from?
    Appreciate your views on this asap, please.
    BR
    Davinder

    Appreciate your quick response. Following the recommended note solves my first problem - wow, thanks again for this.
    Will really appreciate it if you can clarify my second query also, please:
    "Note 1274287 - End-to-End Diagnostics 7.0 EHP1 SP18 to SP27" - our system meets all the prerequisites except for the component ST-BCO; our system has ST-BCO 400 SP0.
    But the requirement is to have Support Package 1 (SAPK-40001INSTBCO), which I am not able to find in the SAP Service Marketplace.
    Any idea how this can be downloaded, please?
    Requirements with component ST400 for SP27
        o  Solman 7.0 SP27 ABAP-Stack:
           -  Support Package Stack 08 for SAP WEB AS 7.01 AS ABAP
           -  Support Package 08 for PI_BASIS 701 (SAPK-70108INPIBASIS)
           -  Support Package 08 for SAP_ABA 701 (SAPKA70108)
           -  Support Package 08 for SAP_BASIS 701 (SAPKB70108)
           -  Support Package 08 for SAP_BW 701 (SAPKW70108)
           -  Support Package 19 for CPRXRPM 400 (SAPK-40019INCPRXRPM)
           -  Support Package 22 for SAP_AP 700 (SAPKNA702)
           -  Support Package 18 for BBPCRM 500 (SAPKU50018)
           -  Support Package 07 for CRMUIF 500 (SAPK-50007INCRMUIF)
           -  Support Package 01 for BI_CONT 706 (SAPK-70601INBICONT ?)
           -  Support Package 01 for ST-BCO 400 (SAPK-40001INSTBCO ?)
           -  Support Package 27 for ST 400 (SAPKITL437)
           -  Support Package 04 for ST-PI 2008_1_700 (SAPKITLRD4)
           -  Support Package 00 for ST-A/PI 01N_700SOL (SAPK-01NS2INSSA)
           -  Minimum Support Package 24 for ST-ICO 150_700 (SAPK-1507OINSTPL)
           -  Minimum Support Package 07 for ST-SER 701_2010_1 (SAPKITLOS7)
    BR
    Davinder

  • Error checkin prerequisites diagnostics in SOLMAN_SETUP

    Hi All!
    I am trying to execute the Basic Configuration in SOLMAN_SETUP. In the Installation Check section, the Prerequisites Diagnostics step returns the messages:
    Component CPRXRPM: SP level 0007 is incorrect, expected 14
    Component CPRXRPM: release 450_700 is incorrect, expected 400
    Could you please help me to resolve this?
    Thanks in advance,
    Alexander Ganishev

    Hi Shyam!
    If I am not wrong, CPRXRPM 450 was the version requested by the Diagnostics Prerequisites check? The version installed in the system is 450_700.
    You suggested: "If yes, you can ignore the message from the Diagnostics Prerequisites check and set your system to CPRXRPM 400. Could you try to implement the required version?"
    How can I set the system to CPRXRPM 400? Version CPRXRPM 450 cannot be uninstalled.
    You also suggested deploying LM-SERVICE 7.01 SP02 Patch 4, but LM-SERVICE 7.01 SP4 is already applied in the system.
    Thanks in advance,
    Alexander

  • How to find a component inside region at the bean level

    Hi,
    I am using JDeveloper 11.1.2.0.0. As per a requirement, I am developing a sample that uses keyboard shortcuts. I registered the key combinations in JavaScript and added the respective client and server listeners. My requirement is as follows:
    1. I have a .jspx page called HotKeys.jspx.
    2. The .jspx page has a region DepartmentTF -> Department.jsff, which in turn holds the navigation buttons:
    First -> Ctrl + F
    Next -> Ctrl + N
    3. When the page is running and I press Ctrl + N, it should identify the Next button and trigger the respective code.
    But the issue is: how do I find the Next button component at the bean level?
    Thanks and Regards,
    Vinitha G

    Hi,
    if the JSFF is always shown in the region, then you can access the button if you know the path it sits in:
    1. find the region (findComponent returns UIComponent, so cast it)
    RichRegion _region = (RichRegion) uiViewRoot.findComponent("regionId");
    2. search within it
    UIComponent nextButton = _region.findComponent("idOfNextButton");
    If you need this to be more dynamic, have a read here:
    http://www.oracle.com/technetwork/developer-tools/adf/downloads/58-optimizedadffacescomponentsearch-175858.pdf
    Frank
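    For what it's worth, the reason the two-step lookup is needed is that a region is a naming container: findComponent() does not descend into nested naming containers, so you first locate the region, then search inside it. A toy model of that scoping rule (plain Java, not the real JSF/ADF API):

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of a JSF-like component tree. A "naming container" (such as an ADF
// region) scopes the ids of its children, so a search from the view root will
// not descend into it.
class Comp {
    final String id;
    final boolean namingContainer;
    final List<Comp> children = new ArrayList<>();

    Comp(String id, boolean namingContainer) {
        this.id = id;
        this.namingContainer = namingContainer;
    }

    Comp add(Comp c) { children.add(c); return this; }

    // Searches this subtree but does not descend into nested naming containers,
    // mirroring the scoping behavior of UIComponent.findComponent().
    Comp findComponent(String searchId) {
        for (Comp c : children) {
            if (c.id.equals(searchId)) return c;
            if (!c.namingContainer) {
                Comp hit = c.findComponent(searchId);
                if (hit != null) return hit;
            }
        }
        return null;
    }
}

public class RegionSearchDemo {
    public static void main(String[] args) {
        Comp root = new Comp("root", true);
        Comp region = new Comp("regionId", true);       // the region is a naming container
        region.add(new Comp("idOfNextButton", false));  // the button inside the fragment
        root.add(region);

        // Direct search from the root fails -- the region scopes the button out:
        System.out.println(root.findComponent("idOfNextButton")); // null
        // Two-step lookup succeeds:
        Comp r = root.findComponent("regionId");
        System.out.println(r.findComponent("idOfNextButton").id); // idOfNextButton
    }
}
```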

  • Formula displaying incorrect result at hierarchy node level

    Hi Experts,
    I have created a query that uses a simple formula: Amount = Rate * Quantity.
    The query uses a hierarchy on Material.
    When I drill down to the lowest level (leaf node), the formula shows the correct result.
    But at all the node levels it displays an incorrect result.
    Example:
    Material 1  Rate = 4  Quantity = 10
    Material 2  Rate = 3  Quantity = 10
    In the hierarchy, both these materials are under the same parent node, say Node 1.
    The query output is:
    Material                Amount
    Node 1                  140   (7 * 20)
        Material 1           40   (4 * 10)
        Material 2           30   (3 * 10)
    It should be:
    Material                Amount
    Node 1                   70
        Material 1           40
        Material 2           30
    Please suggest how this can be achieved.
    Regards
    SSS

    Hi SSS,
    Have you been able to resolve this? I am also facing the same issue. If you have resolved it, please reply.
    Regards,
    Tapan
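    For what it's worth, the node value of 140 is exactly what you get when the formula is calculated after aggregation (sum of rates times sum of quantities), while 70 is the formula calculated before aggregation (sum of the row-level products). In BEx this is typically addressed via the formula's exception aggregation (e.g. Total, with Material as the reference characteristic) or a calculate-before-aggregation setting; check what your release offers. The arithmetic, sketched in Java:

```java
public class NodeAggregationDemo {
    public static void main(String[] args) {
        double[] rate = {4, 3};   // Material 1, Material 2
        double[] qty  = {10, 10};

        // Formula calculated AFTER aggregation (what the query does at Node 1):
        double afterAgg = (rate[0] + rate[1]) * (qty[0] + qty[1]);
        System.out.println(afterAgg);   // 140.0 -- the incorrect node value

        // Formula calculated BEFORE aggregation (sum of the leaf results):
        double beforeAgg = rate[0] * qty[0] + rate[1] * qty[1];
        System.out.println(beforeAgg);  // 70.0 -- the expected node value
    }
}
```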

  • Installing ST-A/PI Addon : "OCS package SAPKITAB9L does not match the current software component vector"

    Dear Gurus,
    When I try to install the ST-A/PI add-on from SPAM and select the queue, I get: OCS package SAPKITAB9L does not match the current software component vector.
    My downloads:
    1 - SAPKITAB9J ==> CSN0120061532_0073055.PAT  (ST-A/PI, 01Q_731, Package Level=0000), downloaded from
    ST-A/PI 01Q_731 (INSTALLATIONS AND UPGRADES)
    2 - SAPKITAB9K ==> CSR0120031469_0081131.PAT  (ST-A/PI, 01Q_731, Package Level=0001), downloaded from
    ST-A/PI 01Q_731 (SUPPORT PACKAGES AND PATCHES)
    3 - SAPKITAB9L ==> CSR0120031469_0076956.PAT  (ST-A/PI, 01Q_731, Package Level=0002), downloaded from
    ST-A/PI 01Q_731 (SUPPORT PACKAGES AND PATCHES)
    I placed them under the EPS inbox and uploaded them successfully via "Load Packages" -> "From Application Server".
    What could the problem be?
    Here are the system components:
    Component   Release     Level  Support Package        Description
    SAP_BASIS   740         0005   SAPKB74005             SAP Basis Component
    SAP_ABA     740         0005   SAPKA74005             Cross-Application Component
    SAP_GWFND   740         0005   SAPK-74005INSAPGWFND   SAP Gateway Foundation 7.40
    SAP_UI      740         0006   SAPK-74006INSAPUI      User Interface Technology 7.40
    PI_BASIS    740         0005   SAPK-74005INPIBASIS    Basis Plug-In
    ST-PI       2008_1_710  0007   SAPKITLRE7             SAP Solution Tools Plug-In
    BI_CONT     757         0001   SAPK-75701INBICONT     Business Intelligence Content
    BI_CONT_XT  757         0001   SAPK-75701INBICONTXT   Business Intelligence Content for Bobj I
    SAP_BW      740         0005   SAPKW74005             SAP Business Warehouse
    CPMBPC      801         0005   SAPK-80105INCPMBPC     CPM Business Planning and Consolidation
    POASBC      100_731     0005   SAPK-10205INPOASBC     POA Shared Business Components

    I ran it in client 000, but now I am getting a dump:
    Category               ABAP Programming Error
    Runtime Errors         CALLBACK_REJECTED_BY_WHITELIST
    ABAP Program           SAPLSTPA
    Application Component  BC-CTS
    What happened?
        An RFC callback has been prevented because no corresponding whitelist
        entry is configured. The original RFC called function module "RFC_TP" in
        the target system with destination "CALLTP_WindowsNT". The callback called
        function module "TRINT_PROGRESS_INDICATOR" in its own system.
    Error analysis
        Function module "RFC_TP" was called. This executed an RFC callback via the ABAP
        statement CALL FUNCTION 'TRINT_PROGRESS_INDICATOR' DESTINATION 'BACK'.
        This action resulted in this ABAP short dump.
        This action was triggered explicitly by the application in transaction
        "SAINT" and in ABAP main program "SAPLSAINT_UI".
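    If I am not mistaken, this dump means the RFC callback security check is active and the callback RFC_TP -> TRINT_PROGRESS_INDICATOR is not on the positive list for the tp destination. One way to handle it (please verify against the SAP documentation for your release; the exact screens and values differ between releases) is to allow exactly this callback for the destination in SM59 (Logon & Security tab, callback positive list), conceptually:

```text
Destination : CALLTP_WindowsNT
Called FM   : RFC_TP
Callback FM : TRINT_PROGRESS_INDICATOR
```

    Also check the profile parameter rfc/callback_security_method, which controls whether unlisted callbacks are only logged or actively rejected.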

  • CS4 NOT capable of sharp displays at all zoom levels

    I must have been asleep, until now, and missed the significance and importance of what follows.
    In post #11 here:
    http://forums.adobe.com/thread/375478?tstart=30
    on 19 March 2009 Chris Cox (Adobe Photoshop Engineer - his title on the old forums) said this, in a discussion regarding sharpness in CS4:
    "You can't have perfectly sharp images at all zoom levels.". Unfortunately, my experience with CS4 since its release late last year has repeatedly confirmed the correctness of this statement.
    What makes this statement so disturbing is that it contradicts an overwhelming amount of the pre- and post-release promotional advertising of CS4 by Adobe, to the effect that the OpenGL features of CS4 enable it to display sharp images at all zoom levels and magnifications. What is surprising is that this assertion has been picked up and regurgitated in commentary by other, sometimes highly experienced, Ps users (some unconnected with, but also some directly connected with, Adobe). I relied upon these representations when making my decision to purchase the upgrade from CS3 to CS4. In fact, they were my principal reason for upgrading. Without them, I would not have upgraded. Set out in numbered paragraphs 1 to 6 below is a small selection only of this material.  
    1. Watch the video "Photoshop CS4: Buy or Die" by Deke McClelland (inducted into the Photoshop Hall of Fame, according to his bio) on the new features of CS4 in a pre-release commentary to be found here:
    http://fyi.oreilly.com/2008/09/new-dekepod-deke-mcclelland-on.html
    Notice what he says about zooming with Open GL: "every zoom level is a bicubically rendered thing of beauty". That, when viewed with the zooming demonstrated, can only be meant to convey that your image will be "sharp" at all zoom levels. I'm sure he believes it too - Deke is someone who is noted for his outspoken criticism of Photoshop when he believes it to be deserved. It would seem that he must not have experimented and tested to the extent that others posting in this forum have done so.
    2. Here's another Adobe TV video from Deke McClelland:
    http://tv.adobe.com/#vi+f1584v1021
    In this video Deke discusses the "super smooth" and "very smooth" zooming of CS4 at all zoom levels achieved through the use of OpenGL. From the context of his comments about zooming to odd zoom levels like 33.33% and 52.37%, it is beyond doubt that Deke's use of the word "smooth" is intended to convey "sharp". At the conclusion of his discussion on this topic he says that, as a result of CS4's "smooth and accurate" as distinct from "choppy" (quoted words are his) rendering of images at odd zoom levels (example given in this instance was 46.67%), "I can actually soft proof sharpening as it will render for my output device".
    3. In an article by Philip Andrews at photoshopsupport.com entitled 'What's New In Adobe Photoshop CS4 - Photoshop 11 - An overview of all the new features in Adobe Photoshop CS4',
    see: http://www.photoshopsupport.com/photoshop-cs4/what-is-new-in-photoshop-cs4.html
    under the heading 'GPU powered display', this text appears :
    "Smooth Accurate Pan and Zoom functions – Unlike previous versions where certain magnification values produced less than optimal previews on screen, CS4 always presents your image crisply and accurately. Yes, this is irrespective of zoom and rotation settings and available right up to pixel level (3200%)." Now, it would be a brave soul indeed who might try to argue that "crisply and accurately" means anything other than "sharply", and certainly, not even by the wildest stretch of the imagination, could it be taken to mean "slightly blurry but smooth" - to use the further words of Chris Cox also contained in his post #11 mentioned in the initial link at the beginning of this post.
    4. PhotoshopCAFE has several videos on the new features of CS4. One by Chris Smith here:
    http://www.photoshopcafe.com/cs4/vid/CS4Video.htm
    is entitled 'GPU Viewing Options'. In it, Chris says, whilst demonstrating zooming an image of a guitar: "as I zoom out or as I zoom in, notice that it looks sharp at any resolution. It used to be in Photoshop we had to be at 25, 50, 75 (he's wrong about 75) % to get the nice sharp preview but now it shows in every magnification".
    5. Here's another statement about the sharpness of CS4 at odd zoom levels like 33.33%, but inferentially at all zoom levels. It occurs in an Adobe TV video (under the heading 'GPU Accelerated Features', starting at 2 min 30 secs into the video) and is made by no less than Bryan O'Neil Hughes, Product Manager on the Photoshop team, found here:
    http://tv.adobe.com/#vi+f1556v1686
    After demonstrating zooming in and out of a bunch of documents on a desk, commenting about the type in the documents which is readily visible, he says : "everything is nice and clean and sharp".
    6. Finally, consider the Ps CS4 pdf Help file itself (both the original released with 11.0 and the revised edition dated 30 March 2009 following upon the release of the 11.0.1 update). Under the heading 'Smoother panning and zooming' on page 5, it has this to say: "Gracefully navigate to any area of an image with smoother panning and zooming. Maintain clarity as you zoom to individual pixels, and easily edit at the highest magnification with the new Pixel Grid." The use of the word "clarity" can only mean "sharpness" in this context. Additionally, the link towards the top of page 28 of the Help file (topic of Rotate View Tool) takes you to yet another video by Deke McClelland. Remember, this is Adobe itself telling you to watch this video. 5 minutes and 40 seconds into the video he says: "Every single zoom level is fluid and smooth, meaning that Photoshop displays all pixels properly in all views which ensures more accurate still, video and 3D images as well as better painting, text and shapes." Not much doubt that he is here talking about sharpness.
    So, as you may have concluded, I'm pretty upset about this situation. I have participated in another forum (which raised the lack of sharp rendering by CS4 on several occasions) trying to work with Adobe to overcome what I initially thought may have been only a problem with my aging (but nevertheless, just-complying) system or outdated drivers. But that exercise did not result in any sharpness issue fix, nor was one incorporated in the 11.0.1 update to CS4. And in this forum, I now read that quite a few, perhaps even many, others, with systems whose specifications not only match but well and truly exceed the minimum system requirements for OpenGL compliance with CS4, also continue to experience sharpness problems. It's no surprise, of course, given the admission we now have from Chris Cox. It seems that CS4 is incapable of producing the sharp displays at all zoom levels it was alleged to achieve. Furthermore, it is now abundantly clear that, with respect to the issue of sharpness, it is irrelevant whether or not your system meets the advertised minimum OpenGL specifications required for CS4, because the OpenGL features of CS4 simply cannot produce the goods. What makes this state of affairs even more galling is that, unlike CS3 and earlier releases of Photoshop, CS4 with OpenGL activated does not even always produce sharp displays at 12.5, 25, and 50% magnifications (as one example only, see posts #4 and #13 in the initial link at the beginning of this post). It is no answer to say, and it is ridiculous to suggest (as some have done in this forum), that one should turn off OpenGL if one wishes to emulate the sharp display of images formerly available.

    Thanks, Andrew, for bringing this up.  I have seen comments and questions in different forums from several CS4 users who have had doubts about the new OpenGL display functionality and how it affects apparent sharpness at different zoom levels.  I think part of the interest/doubt has been created by the over-the-top hype that has been associated with the feature as you documented very well.
    I have been curious about it myself and honestly I didn't notice it at first but then as I read people's comments I looked a little closer and there is indeed a difference at different zoom levels.  After studying the situation a bit, here are some preliminary conclusions (and I look forward to comments and corrections):
    The "old", non-OpenGL way of display was using nearest-neighbor interpolation.
    I am using observation to come to this conclusion, using comparison of images down-sampled with nearest-neighbor and comparing them to what I see in PS with OpenGL turned off.  They look similar, if not the same.
    The "new", OpenGL way of display is using bilinear interpolation.
    I am using observation as well as some inference: The PS OpenGL preferences have an option to "force" bilinear interpolation because some graphics cards need to be told to force the use of shaders to perform the required interpolation.  This implies that the interpolation is bilinear.
    Nothing is truly "accurate" at less than 100%, regardless of the interpolation used.
    Thomas Knoll, Jeff Schewe, and others have been telling us that for a long time, particularly as a reason for not showing sharpening at less than 100% in ACR (we still want it, though).  It is just the nature of the beast of re-sampling an image from discrete pixels to discrete pixels.
    The "rule of thumb" commonly used for the "old", non-OpenGL display method to use 25%, 50%, etc. for "accurate" display was not really accurate.
    Those zoom percentages just turned out to be less bad than some of the other percentages and provided a way to achieve a sort of standard for comparing things.  Example: "If my output sharpening looks like 'this' at 50%, then it will look close to 'that' in the actual print."
    The "new", OpenGL interpolation is certainly different and arguably better than the old interpolation method.
    This is mainly because the more sophisticated interpolation prevents drop-outs that occurred from the old nearest-neighbor approach (see my grid samples below).  With nearest-neighbor, certain details that fall into "bad" areas of the interpolated image will be eliminated.  With bilinear, those details will still be visible but with less sharpness than other details.  Accuracy with both the nearest-neighbor and bilinear interpolations will vary with zoom percentage and where the detail falls within the image.
    Since the OpenGL interpolation is different, users may need to develop new "rules of thumb" for zoom percentages they prefer when making certain judgements about an image (sharpening, for example).
    Note that anything below 100% is still not "accurate", just as it was not "accurate" before.
    As Andrew pointed out, the hype around the new OpenGL bilinear interpolation went a little overboard in a few cases and has probably led to some incorrect expectations from users.
    The reason that some users seem to notice the sharpness differences with different zooms using OpenGL and some do not (or are not bothered by it) I believe is related to the different ways that users are accustomed to using Photoshop and the resolution/size of their monitors.
    Those people who regularly work with images with fine details (pine tree needles, for example) and/or fine/extreme levels of sharpening are going to see the differences more than people who don't.  To some extent, I see this similar to people who battle with moire: they are going to have this problem more frequently if they regularly shoot screen doors and people in fine-lined shirts.   Resolution of the monitor used may also be a factor.  The size of the monitor in itself is not a factor directly but it may influence how the user uses the zoom and that may in turn have an impact on whether they notice the difference in sharpness or not.  CRT vs LCD may also play a role in noticeability.
    The notion that the new OpenGL/bilinear interpolation is sharp except at integer zoom percentages is incorrect.
    I mention this because I have seen at least one thread implying this, and an Adobe employee participated who seemed to back it up.  I do not believe this is correct.  There are some integer zoom percentages that will appear less sharp than others.  It doesn't have anything to do with integers - it has to do with the interaction of the interpolation, the size of the detail, and how that detail falls into the new, interpolated pixel grid.
    Overall conclusion:
    The bilinear interpolation used in the new OpenGL display is better than the old, non-OpenGL nearest-neighbor method but it is not perfect.  I suspect actually, that there is no "perfect" way of "accurately" producing discrete pixels at less than 100%.  It is just a matter of using more sophisticated interpolation techniques as computer processing power allows and adapting higher-resolution displays as that technology allows.  When I think about it, that appears to be just what Adobe is doing.
    Some sample comparisons:
    I am attaching some sample comparisons of nearest-neighbor and bilinear interpolation.  One is of a simple grid made up of 1 pixel wide lines.  The other is of an image of a squirrel.  You might find them interesting.  In particular, check out the following:
    Make sure you are viewing the Jpegs at 100%, otherwise you are applying interpolation onto interpolation.
    Notice how in the grid, a 50% down-sample using nearest-neighbor produces no grid at all!
    Notice how the 66.67% drops out some lines altogether in the nearest-neighbor version and these same lines appear less sharp than others in the bilinear version.
    Notice how nearest-neighbor favors sharp edges.  It isn't accurate but it's sharp.
    On the squirrel image, note how the image is generally more consistent between zooms for the bilinear versions.  There are differences in sharpness at different zoom percentages for bilinear, though.  I just didn't include enough samples to show that clearly here.  You can see this yourself by comparing results of zooms a few percentages apart.
    Well, I hope that was somewhat helpful.  Comments and corrections are welcomed.
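    The nearest-neighbor vs. bilinear behavior described above is easy to reproduce outside Photoshop. A small sketch using Java's AffineTransformOp (my assumption being only that AWT's two interpolation modes behave like the ones discussed; the principle is the same): when a grid of 1-pixel lines is downscaled, nearest-neighbor output contains only pure black or white pixels (lines survive crisply or drop out entirely), while bilinear output contains intermediate grays (lines never vanish, but soften):

```java
import java.awt.geom.AffineTransform;
import java.awt.image.AffineTransformOp;
import java.awt.image.BufferedImage;

public class InterpolationDemo {

    // Scale an image with the given interpolation type.
    static BufferedImage scale(BufferedImage src, double s, int interpType) {
        AffineTransformOp op = new AffineTransformOp(
                AffineTransform.getScaleInstance(s, s), interpType);
        return op.filter(src, null);
    }

    // True if every pixel is pure black or pure white (no blended grays).
    static boolean onlyBlackAndWhite(BufferedImage img) {
        for (int y = 0; y < img.getHeight(); y++)
            for (int x = 0; x < img.getWidth(); x++) {
                int rgb = img.getRGB(x, y);
                if (rgb != 0xFF000000 && rgb != 0xFFFFFFFF) return false;
            }
        return true;
    }

    public static void main(String[] args) {
        // A "grid": 1-pixel black lines on white, black on every odd row.
        BufferedImage src = new BufferedImage(20, 20, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 20; y++)
            for (int x = 0; x < 20; x++)
                src.setRGB(x, y, (y % 2 == 1) ? 0x000000 : 0xFFFFFF);

        BufferedImage nearest  = scale(src, 0.4, AffineTransformOp.TYPE_NEAREST_NEIGHBOR);
        BufferedImage bilinear = scale(src, 0.4, AffineTransformOp.TYPE_BILINEAR);

        // Nearest-neighbor only ever copies source pixels: lines either survive
        // crisply or drop out entirely.
        System.out.println("nearest pure B/W:  " + onlyBlackAndWhite(nearest));
        // Bilinear blends neighbors: lines become gray instead of vanishing.
        System.out.println("bilinear pure B/W: " + onlyBlackAndWhite(bilinear));
    }
}
```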
