OLAP Cache issue

Hi Team,
We have an issue with broadcasting after a system upgrade.
We were on BW 7.0 and have now upgraded to BW 7.3.
The current BW support package is SAPKW73102.
The error message is "incorrect broadcast type: OLAP".
Could you please let me know how to proceed further?
Regards,
Jyotsna

Hi,
This is a program error in BW 7.30.
You need to import the BI Java patch for SAP NW 7.30 into your BI system.
Please check SAP Note 1661208, which deals with this problem.
Thanks

Similar Messages

  • Need clarification on OLAP Cache!!!

    Hi,
    This is the procedure to turn the OLAP cache on or off:
    SPRO -> SAP Reference-IMG
    Open the tree: SAP NetWeaver -> Business Intelligence -> Performance Settings
    Execute: Global Cache Settings
    Check the flag: Cache Inactive
    Now the question is: if you check the "Cache Inactive" checkbox, what does that signify? Does it mean the cache is active or inactive? It is quite confusing.
    Regards,
    Surya Tamada.

    Hi,
    Please have a look at this:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/9f4a452b-0301-0010-8ca6-ef25a095834a
    Also check this thread (Re: OLAP Cache issue) and the following help pages:
    http://help.sap.com/saphelp_nw70/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/00/241da12ef84f40bb6ecd3ae73d8b58/content.htm
    Hope it helps.
    Thanks & Regards,
    SD

  • Parallel Caching Issue

    We have found issues with the parallel loading of the OLAP cache using reporting agent jobs, where cache entries are not populated correctly. We rely on the OLAP cache to deliver very high levels of concurrency during peak times.
    Once the main batch data loading has been completed, we run some background reporting agent jobs to pre-populate the OLAP cache. Each job contains a web application template that holds one or more queries and processes 1,500+ stores per run.
    We have different reporting agent jobs for the different web application templates, but we have discovered that if we run these jobs in parallel we do not get the full benefit of the OLAP cache compared to running them sequentially.
    If we run the jobs in parallel, RSRCACHE appears to show that the cache has been populated correctly for these queries, but when we check RSDDSTATS for the query performance the following day, we can see that a large number of the stores still hit the database and did not benefit from the cache entries. Sometimes this can be as much as a 60% failure to hit the cache.
    If we run the same jobs sequentially and then check RSDDSTATS the following day, we see a 100% success rate, with every store hitting the OLAP cache.
    Is anyone able to advise how we can resolve this parallel caching issue?

    Hi Ganesh,
    I am currently having similar trouble with TBB1, where I receive the error message below:
    TRL initialization for MM, FX, Derivatives, co. code WTRD, valn area 003 is not yet complete
    Message no. TPM_TRL052
    Are you familiar with this issue? Any help would be greatly appreciated. I hope that you can help.

  • Regarding olap cache

    Hi,
       Can anyone kindly describe real-time issues with the OLAP cache?
            Thanks in advance.

    Hi,
    The OLAP cache is used to optimize report performance. See:
    http://help.sap.com/saphelp_nw2004s/helpdata/EN/41/b987eb1443534ba78a793f4beed9d5/content.htm
    Thanks
    Reddy

  • Directory Caching issue with Cisco Jabber client for Windows

    Hi ,
    I am facing a cache issue with the Cisco Jabber client for Windows. If I modify or delete contacts in Active Directory/CallManager, the change is not reflected in Jabber, because Jabber takes the contacts from the locally stored cache file on the Windows system.
    Every time, I have to remove the cache file to overcome this issue, and it is not practical to do that for all Windows users. For example, if an employee leaves the company, I can still see their contact in the Cisco Jabber client. I have not seen this issue with Android/Apple iOS.
    Is there any automated way to remove the cache file?
    Here are the details of CUCM, Presence and Jabber:
    CUCM version: 9.1.x
    Presence: 9.1.x
    Jabber: 10.5 and 10.6

    Hello
    In our environment we had to install a dedicated Microsoft Certificate Authority, "just for Cisco Jabber usage", to house the Network Device Enrollment Service.
    Our certificates for CUPS were generated on this Certificate Authority too.
    I discussed this certificate matter with my colleagues this afternoon and nobody seems to remember how these certificates were deployed into the Enterprise Trust store for the users.
    But I think they asked all 400 users to accept the 3 certificates by answering "yes" to the popup instead of using a script deployed by GPO...
    I wish you success with that deployment and really hope you have a technical partner that *Knows* this subject.
    Our partner left us alone with that unfortunately.
    Florent
    EDIT: If the "Certutil script method" works, please let me know. This could be useful in our own deployment.

  • Sql query cache issue

    I am trying to see the SQL for a query from Answers in the log file under Manage Sessions. If I run the same report multiple times, the SQL query shows up only the first time; if I run it a second time, it does not show up. If I build a brand-new report with different columns picked, it does give me the SQL. Where do I set the option to show the SQL query every time I run a report, even if the same report is run multiple times? Is this a caching issue?

    It shouldn't. Have you unchecked the "Cache" option on the physical layer for this table? If you go to the Advanced tab, is the option "Bypass the Oracle BI cache" checked?

  • WEB-I cache issue

    We have BO XI 3.1 PF 1.7 and SAP BW 7.0 SP 18
    We are facing a serious caching issue with Web Intelligence XI 3.1. What we observe is that it caches previously fetched data and does not recognize that data has been updated in the BW InfoProvider (in our case we have a BEx query on a direct-update DSO).
    I have tried the following:
    1. I made caching inactive for that BEx query on the BW side.
    2. I have checked the caching settings in BO; they have not helped me much in deactivating the cache in BO.
    When executing the BEx query in BW, I checked the RSRT cache monitor; the query result is not being cached, and I get up-to-date data. The query works perfectly fine.
    I also observed the events in the RSDDSTAT_OLAP table and could clearly see Event 9000. This shows me that the BEx query hits the database and brings data back.
    When I execute the BO report on top of the same BEx query, I do not get up-to-date data; it shows me previously fetched data and does not recognize that data has been updated in BW.
    Now the question is: where is this data being cached, and how do I make that cache inactive to make sure I get up-to-date data in Web Intelligence?
    Many thanks for your kind response.
    Regards
    Ashutosh D

    HI Ashutosh,
    This would be stored in the C:\Program Files\Business Objects\BusinessObjects Enterprise 12.0\Data\servername_6400\storage folder.  Try renaming this folder and then testing again.  If this allows for updated records, then that proves that the Webi Processing Server is getting the cached data from this directory.
    There are also a few settings you can try to disable in the Webi Processing Server settings within the CMC:
    Disable cache Sharing (Checked)
    Enable Real-Time Caching (Unchecked)
    Enable Document Cache (Unchecked)
    Cache Timeout (1 min) - sets the cache to expire after 1 min.
    Try playing with these settings and see if that resolves the issue. 
    Thanks
    Jb

  • SharedObject and Caching issues

    Background:
     I am working on a project where I am using SharedObject to store data for a favorites list. The project contains two SWFs, one in AS 3.0 (using Flex 4.5) and the other an AS 2.0 Flash 8 project. Basically, the AS 2.0 project reads and writes data to the shared object and the AS 3.0 project just reads the data.
    Setup:
         OS: Windows 7
         Flash Player: 10.3     Browsers: IE 9, IE 8, Chrome 14, and some misc Firefox testing -- Chrome seems to be working the best.  
    Problems:
         First off, it has been extremely hard to get consistent results across different computers, so I would welcome any suggestion as to why that might be (other than different browsers, different Flash Player versions, different Flash Player settings/global settings and different OS versions), because I have been making sure all of that is consistent across devices for testing purposes.
         1. It appears that some computers/browsers won't store the shared object at all, even though everything needed to allow it is turned on.
         2. Some computers/browsers will store it, but won't update it until I refresh the page.
    General Questions:
         1. Is the storage of SharedObject data in a .sol file configured the same way in both AS 2.0 and AS 3.0, and if so, would this be a means of communicating between two SWFs running off the same domain?
         2. Is it possible for a browser to cache SharedObject data (the .sol file)?
         3. I have noticed that as new Flash Player releases come out (especially recently, i.e. 10.3) I have had to make a lot of repairs to the ActionScript 2 stuff. Is ActionScript 2 support slowly being phased out? Is anyone else experiencing the same problems?
    Possible Solutions/Findings:
         I am starting to believe that it has something to do with browser caching, since Chrome seems to perform the best, and Chrome has performed best when it comes to browser caching issues.
    ** code is available if needed but since it was working before I don't think it is relevant.

    Hello,
    #1
    Are you making an explicit call to the flush method immediately after you write data to the SharedObject reference?
    http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/SharedObject.html#flush()
    If not, you could have different data read by different browser sessions running at the same time (e.g. IE and Firefox both hosting your test Flash movies), because data is flushed only when the given Flash runtime session is about to be terminated (e.g. the movie is about to be unloaded), and in the other cases outlined in the documentation.
    #2
    The SharedObject is a reference to a binary file stored on the local machine by the Flash runtime (the AIR runtime too). It is nothing like web-fetched content that could be cached:
    http://en.wikipedia.org/wiki/Local_Shared_Object
    regards,
    Peter

  • SecurityDomain and Caching Issues

    I am running into some caching issues when setting the securityDomain of an imported SWF to match the calling SWF file and I was curious if anyone had any ideas on how to get around this issue.
    Here is the scenario:
    A.swf is launched on Domain A, it sends a Loader request for B.swf on Domain B.
    B.swf will be updated frequently so caching is disabled, A.swf however can be cached.
    B.swf makes some ExternalInterface calls, so it needs to be in the same securityDomain as A.swf; otherwise it will receive Security Error #2006.
    The code I am using for this is fairly straightforward:
    ActionScript Code:
    var request:URLRequest = new URLRequest("http://domainB/B.swf");
    var loader:Loader = new Loader();
    var context:LoaderContext = new LoaderContext();
    context.securityDomain = SecurityDomain.currentDomain;
    loader.load(request, context);
    I believe B.swf is inheriting the caching setting of A.swf because it resides in the same securityDomain. If I make a small update to B.swf and refresh A.swf in a browser, it will not load the B.swf updates until I clear the cache.
    If I get rid of the securityDomain context on the load it will always update B.swf with the most current version, but I run into security issues with ExternalInterface.
    ActionScript Code:
    var request:URLRequest = new URLRequest("http://domainB/B.swf");
    var loader:Loader = new Loader();
    loader.load(request);
    I have tried appending random strings to the end of the URLRequest while using the securityDomain context, but it will always use a cached version of B.swf. I have also tried loading B.swf into a ByteArray and then using loadBytes on a Loader, but that didn't work either.
    Any ideas are appreciated!


  • Caching issue in jsp or servlet

    I am not sure where I should post the issue that I have. I have a J2EE web app (Struts) running on Apache Web Server + GlassFish + MySQL. My server has multiple cores.
    The problem is that whenever I insert new data into a table or delete data from it, I don't see the change right away. Sometimes I see the change; sometimes I don't. It's very inconsistent.
    If the old data were cached in the JSP, I shouldn't see the change in the log file, but I do see it even in the log file.
    For example, I have a page managing a user's folders. When I delete a certain folder name from the JSP page, this folder is deleted from a DB table, but when I refresh the page, I still see the folder name that is not supposed to show up. Or when I go to a different page and come back to this page, I don't see it. The behavior is very inconsistent.
    If it's a browser caching issue, I don't think I should see it in the log file, but I still see the folder name (which is supposed to be deleted) in the log file.
    I am including these lines in all included jsp pages.
    response.setHeader("Cache-Control","no-cache");
    response.setHeader("Pragma","no-cache");
    response.setDateHeader ("Expires", -1);
    Does anybody have any opinion about this?
    It's hard to debug and to describe the behavior,
    but it would be very helpful if someone who has had the same experience could explain it and tell me how to fix it.
    Thanks.

    caesarkim1 wrote:
    I am including these lines in all included jsp pages.
    response.setHeader("Cache-Control","no-cache");
    response.setHeader("Pragma","no-cache");
    response.setDateHeader("Expires", -1);
    Instead of including these lines in all JSPs, make a filter and add these lines to it.
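    For illustration, here is a minimal sketch of such a filter, assuming the standard javax.servlet API; the class name NoCacheFilter is just an example, not from the original post.
    Java Code:
    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletResponse;

    public class NoCacheFilter implements Filter {
        public void init(FilterConfig config) throws ServletException {
            // No initialization needed.
        }
        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                throws IOException, ServletException {
            if (response instanceof HttpServletResponse) {
                HttpServletResponse httpResponse = (HttpServletResponse) response;
                // The same headers the JSPs were setting, now applied in one place.
                httpResponse.setHeader("Cache-Control", "no-cache");
                httpResponse.setHeader("Pragma", "no-cache");
                httpResponse.setDateHeader("Expires", -1);
            }
            chain.doFilter(request, response);
        }
        public void destroy() {
            // Nothing to clean up.
        }
    }
    Map the filter to the JSP pages (e.g. a *.jsp url-pattern) in web.xml, or annotate it with @WebFilter on Servlet 3.0+, so the headers are added for every page without touching the individual JSPs.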

  • Acrobat Connect Pro LMS 7.5 server cache issue - displaying old content

    Our Adobe Acrobat Connect Pro server is showing old Captivate-created content from about 4-6 weeks ago.
    I loaded 35+ sets of Captivate 3 (SCORM 1.2, HTML, zipped) content onto our Acrobat Connect Pro LMS about 6 weeks ago.
    I converted all training content from Captivate 3.0.1 to 4.0.1, 4 weeks ago by opening each file in Captivate 4 and following prompts to "Save As" new files with different file names.
    I reloaded all content 4 weeks ago, and again 2 weeks ago.
    I reloaded about half of all content again last week.
    End-user acceptance testing performed this week showed that most of the courses are displaying old content, ranging from 2-6 weeks old.
    Attempted fixes and workarounds:
    Deleting content entirely and reloading it from scratch - this will not work long term, as we lose usage data each time we reload completely new files.
    Contacted Adobe and provided times to track incidents of the issue. We reached Tier 2, who told us it was our problem and that everything appeared to be working fine from their side.
    New workaround - load new content and reattach course to new content.  This presents the same long term issue as the first workaround, but enables us to retain older versions of content in the system, should we need to revert or report on it.
    Gaining server side access is a bit challenging due to the hosting situation we have, so I am looking (ideally) for a solution that can be performed from the Administrator/Author Frontend.  However, I want to learn the real cause of the problem, wherever it might reside, so that it can be properly corrected and avoided in the future.  I am calling this a server cache issue, as it seems the server has somewhere retained unwanted old versions of content, preventing current content from being displayed to end users.  Viewing content as an end user = see old content.  Viewing content from the Content area (Author view) shows the current files, so I know they are on the server and are loading correctly, up to a point.
    I am preparing all content for another round of loading/reloading due to other issues and updates, so republishing and reloading all 35+ files into the LMS is unavoidable at this point.
    This issue is keeping our LMS from launching to several thousand users across the country, so any suggestions or helpful tips are much appreciated.

    I think I have isolated the source of this problem. It's the Pitstop Professional 9 plug-in. I uninstalled it, and everything opens quicker than greased lightning. I re-installed it and it's back to slowsville.
    Unfortunately Pitstop is essential to my workflow.
    Until recently I did my pre-press on a Mac G5 with Acrobat Pro 7 and Pitstop 6.5. I never had this problem with slow file opening. But it seems that the delays would occur when I used the plug-in with large, complex files. So it would open files as fast as you'd expect from an elderly machine, but starting to use Pitstop would result in a prolonged period of staring at a spinning beachball.
    I wonder, is there any way to stop the Pitstop plug-in from initializing until it is used, so that the plug-in stays inert until you select the tool from the menus?

  • IE Caching Issue while closing and opening a UI thrice or more

    Hi Everybody..
    Greetings of the day..
    I am working on a web application (I cannot disclose the name) where, at a certain level, a generated Word document contains a customized toolbar.
    It contains a print option which, when clicked, opens a UI that shows the available default printer (this is different from our normal printing).
    When I repeat opening and then closing this UI three or more times, the entire application hangs and I have to terminate it by killing the associated process from Task Manager.
    When this issue was analysed by our onsite team, they concluded that it is an IE caching issue.
    But to my knowledge, IE by default does not cache content for any application that is SSL-secured.
    What could the solution be, whether the application is SSL-secured or not?
    Kindly help asap.
    Thanks.

    Hi all
    Did anyone ever manage to fix this? I'm having this issue across multiple iPads. Even if I completely reset the iPad (erasing all content and settings), it does it again when restored.
    I've just got an iPad Air and that does it too. I even created a new Apple ID to see if that would work (trying to avoid any kind of caching across iCloud, etc.) and it does it straight away in those books. It's massively frustrating.
    I have noticed it only seems to do it for iBooks-formatted books, not ePubs.
    Does anyone have a resolution? It's quite embarrassing when you're doing a presentation about book features and the audience assumes you've only got the wrong book each time. It's more annoying when closing the book and the wrong cover appears again, and it also takes you to that book's collection page, rather than the one you were in.

  • Possible session caching issue in SSRS2014

    Using custom FormsAuth, User A can sign into our own main ASP.NET MVC app (WIF cookie), then into SSRS (FormsAuth cookie), and all is well. Here is where things go bad: User A signs out of our main application (WIF cookie deleted), then signs back into
    our main application as User B, then back into SSRS. An SSRS report that displays User!UserID shows User A instead of the current User B. It's like there is either a session or cookie caching issue going on, but I'm not sure.
    1. What is the proper way to sign out of SSRS and prevent session caching?
    2. Do I need to worry about making my SSRS logon page non-cacheable?  If so, what is the recommended way of doing this?
    thanks
    scott

    Hi scott_m,
    According to your description, you used custom Forms Authentication in Reporting Services; after user A signs out of the application and signs in as user B, the SSRS built-in user shows user A instead of user B.
    Based on my research, once we have configured SSRS to use custom (Forms) authentication by deploying a custom security extension, we can log on to MS Report Manager (MSRM) using credentials of our custom security framework via a logon web page. But there is no way
    to log out or to expire the authentication cookie, so we need to close the browser manually. As a workaround, we can add a logout button to the Report Manager that is using Forms Authentication, then use code to clear the cookie and redirect to the home page.
    In addition, if you extend Reporting Services to use Forms Authentication, it’s better to use Secure Sockets Layer (SSL) for all communications with the report server to prevent malicious users from gaining access to another user's cookie. SSL enables clients
    and a report server to authenticate each other and to ensure that no other computers can read the contents of communications between the two computers. All data sent from a client through an SSL connection is encrypted so that malicious users cannot intercept
    passwords or data sent to a report server.
    Here is a relevant thread you can reference:
    https://social.msdn.microsoft.com/Forums/sqlserver/en-US/5e33949d-7757-45d1-9c43-6dc3911c3ced/how-do-you-logout-of-report-manager
    For more information about Forms Authentication, please refer to the following document:
    https://technet.microsoft.com/en-us/library/aa902691%28v=sql.80%29.aspx?f=255&MSPPError=-2147217396
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu
    TechNet Community Support

  • Pre-fill the OLAP cache for a query on Data change event  of infoprovider

    Hi Gurus,
    I have to pre-fill the OLAP cache for a query which has poor performance.
    I read the document 'Periodic Jobs and Tasks in SAP BW',
    which suggested some steps to do this.
    I have created the BEx Broadcasting setting for scheduling job execution on data change in the InfoProvider.
    Thereafter, the document says: "an event has to be raised in the process chain which loads the data to this InfoProvider. When the process chain executes the process “Trigger Event Data Change (for Broadcaster)”, an event is raised to inform the Broadcaster that the query can be filled in the OLAP cache."
    How can this be done? Please provide the proper steps.
    Answers are always appreciated.
    Thanks.

    Hi
    You need to create a process chain, or use the existing process chain that loads your current solution; just add the data change event process type to the process chain and, inside it, add the InfoProviders that are going to be affected.
    Once you are done with this, go to the Broadcaster and create a new setting for that query. You will see the option for a data change event in the InfoProvider; just choose that and create the settings.
    hope it helps

  • Impact of real time cube on query performance and OLAP cache

    Hi:
    We have actual and plan cubes both setup as real time cubes (only plan cube is being planned against, not actual cube) and both cubes are compressed once a day.
    We are planning on implementing BIA accelerator and have questions related to query performance optimization:
    1. Are there any query performance benefits in changing the actuals cube to a basic cube (using program SAP_CONVERT_NORMAL_TRANS) if the F table is fully compressed?
    2. Can the OLAP cache be leveraged for queries run against the real-time cubes, e.g. the actuals cube?
    3. What is the impact on BIA of having the actuals cube as a real-time cube (whether or not data is being loaded/planned in that cube during the day)?
    Thank you in advance,
    Catherine

    1) Are there any query performance benefits in changing the actuals cube to a basic cube (using program SAP_CONVERT_NORMAL_TRANS) if the F table is fully compressed?
    From a performance point of view, standard (basic) cubes are relatively better.
    2) Yes, the OLAP cache can be leveraged for bringing up the plan query, but all the calculations are done in the planning buffer.
    3) Not sure.
