Analysis Problem in RAR

Hi,
We are having trouble with Risk Analysis and Remediation.
In the Configuration tab, under Background Job:
- The User, Role and Profile syncs complete OK.
- But whenever we launch a user, role or profile analysis, the job always ends with status Error and the following message (in the job history):
Failed: Error while executing the Job: Index: 0, Size: 0
So at the moment we do not have any reports in the Informer tab.
Thank you in advance for your help,
Julien

Here is the end of the log (status Error):
Tue Aug 26 21:48:45 CEST 2008 :  Job ID:68 : Analysis done: SAP_EMPLOYEE_ERP05_FR elapsed time: 0 ms
Tue Aug 26 21:48:45 CEST 2008 : Job ID: 68 Status: Running
Tue Aug 26 21:48:45 CEST 2008 : Job ID: 68 Status: Running
Tue Aug 26 21:48:45 CEST 2008 :  Job ID:68 : Exec Risk Analysis
Tue Aug 26 21:48:45 CEST 2008 :  Job ID:68 : Analysis starts: SAP_WP_CEO_HOSP
Tue Aug 26 21:48:45 CEST 2008 :  Job ID:68 : Analysis done: SAP_WP_CEO_HOSP elapsed time: 0 ms
Tue Aug 26 21:48:45 CEST 2008 : Job ID: 68 Status: Running
Tue Aug 26 21:48:45 CEST 2008 : Job ID: 68 Status: Running
Tue Aug 26 21:48:45 CEST 2008 : --- BKG Role Permission Analysis (System: E30) completed ---  elapsed time: 73609 ms
Tue Aug 26 21:48:46 CEST 2008 : Index: 0, Size: 0
java.util.ArrayList.RangeCheck(ArrayList.java:507)
java.util.ArrayList.get(ArrayList.java:324)
com.virsa.cc.xsys.bg.bo.BgSchedulerBO.getLastRunDate(BgSchedulerBO.java:627)
com.virsa.cc.xsys.bg.bo.BgSchedulerBO.updateLastRun(BgSchedulerBO.java:688)
com.virsa.cc.xsys.bg.BatchRiskAnalysis.performBatchRiskAnalysis(BatchRiskAnalysis.java:1093)
com.virsa.cc.xsys.bg.BatchRiskAnalysis.performBatchSyncAndAnalysis(BatchRiskAnalysis.java:1272)
com.virsa.cc.xsys.bg.BgJob.runJob(BgJob.java:401)
com.virsa.cc.xsys.bg.BgJob.run(BgJob.java:263)
com.virsa.cc.xsys.riskanalysis.AnalysisDaemonBgJob.scheduleJob(AnalysisDaemonBgJob.java:201)
com.virsa.cc.xsys.riskanalysis.AnalysisDaemonBgJob.start(AnalysisDaemonBgJob.java:78)
com.virsa.cc.comp.BgJobInvokerView.wdDoModifyView(BgJobInvokerView.java:434)
com.virsa.cc.comp.wdp.InternalBgJobInvokerView.wdDoModifyView(InternalBgJobInvokerView.java:1223)
com.sap.tc.webdynpro.progmodel.generation.DelegatingView.doModifyView(DelegatingView.java:78)
com.sap.tc.webdynpro.progmodel.view.View.modifyView(View.java:337)
com.sap.tc.webdynpro.clientserver.cal.ClientComponent.doModifyView(ClientComponent.java:481)
com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.doModifyView(WindowPhaseModel.java:551)
com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.processRequest(WindowPhaseModel.java:148)
com.sap.tc.webdynpro.clientserver.window.WebDynproWindow.processRequest(WebDynproWindow.java:335)
com.sap.tc.webdynpro.clientserver.cal.AbstractClient.executeTasks(AbstractClient.java:143)
com.sap.tc.webdynpro.clientserver.session.ApplicationSession.doProcessing(ApplicationSession.java:321)
com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessingStandalone(ClientSession.java:713)
com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessing(ClientSession.java:666)
com.sap.tc.webdynpro.clientserver.session.ClientSession.doProcessing(ClientSession.java:250)
com.sap.tc.webdynpro.clientserver.session.RequestManager.doProcessing(RequestManager.java:149)
com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doContent(DispatcherServlet.java:62)
com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doGet(DispatcherServlet.java:46)
javax.servlet.http.HttpServlet.service(HttpServlet.java:740)
javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:401)
com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:266)
com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:386)
com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:364)
com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:1039)
com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:265)
com.sap.engine.services.httpserver.server.Client.handle(Client.java:95)
com.sap.engine.services.httpserver.server.Processor.request(Processor.java:175)
com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
java.security.AccessController.doPrivileged(Native Method)
com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:102)
com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:172)
Tue Aug 26 21:48:46 CEST 2008 : Job ID: 68 Status: Error
Tue Aug 26 21:48:46 CEST 2008 : ----------- Background Job History: job id=68, status=2, message=Error while executing the Job:Index: 0, Size: 0
Tue Aug 26 21:48:46 CEST 2008 : -----------------------Complted Job =>68---------------------------------------------------------------
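
For reference, "Index: 0, Size: 0" is the message of a java.util.IndexOutOfBoundsException: the stack trace shows BgSchedulerBO.getLastRunDate calling ArrayList.get on an empty list, which suggests the scheduler is looking for a previous-run record that does not exist yet. A minimal sketch of the failing pattern and a guarded alternative (the names below are illustrative, not the actual GRC code):

import java.util.ArrayList;
import java.util.Date;
import java.util.List;

public class LastRunExample {
    public static void main(String[] args) {
        List<Date> previousRuns = new ArrayList<>();   // no prior run recorded yet

        // Failing pattern: get(0) on an empty list throws
        // java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        // Date lastRun = previousRuns.get(0);

        // Guarded alternative: check for emptiness first
        Date lastRun = previousRuns.isEmpty() ? null : previousRuns.get(0);
        System.out.println(lastRun == null ? "no previous run yet" : lastRun.toString());
    }
}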

Similar Messages

  • Sales analysis problem

    Sales analysis problem.
    Scenario: there are many plants and customers, the relationship between them is n:n, and every
    customer has a classification.
    My question is how to calculate, in BW, the number of plants that sold to one classification of
    customer.
    For example:
    plant: 1000, 1001, 1002
    customer: 2000, 2001, 2002
    classification: A, B
    customer's classification: 2000:A, 2001:A, 2002:B
    transactions:
    plant -> customer
    1000 -> 2000:A
    1000 -> 2001:A
    1001 -> 2000:A
    1001 -> 2002:B
    1002 -> 2002:B
    So the result is: the number of plants that sold to customer classification A is 2, and the number
    that sold to classification B is 2.
    How can I model this in BW with a cube? Please help me solve the problem, thank you very much!

    Wilfred,
    You should create a message with SAP Support on this issue.
    Eddy
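
    Not a BW modelling answer, but purely to pin down the calculation being asked for (a distinct count
    of plants per customer classification), here is a minimal sketch using the example data from the
    post; all names are illustrative:

    import java.util.*;

    public class DistinctPlantCount {
        public static void main(String[] args) {
            // customer -> classification, from the example
            Map<String, String> classification = Map.of("2000", "A", "2001", "A", "2002", "B");
            // transactions as plant -> customer pairs
            String[][] transactions = {
                {"1000", "2000"}, {"1000", "2001"}, {"1001", "2000"},
                {"1001", "2002"}, {"1002", "2002"}};

            // classification -> set of distinct plants that sold to it
            Map<String, Set<String>> plantsByClass = new TreeMap<>();
            for (String[] t : transactions) {
                String cls = classification.get(t[1]);
                plantsByClass.computeIfAbsent(cls, k -> new HashSet<>()).add(t[0]);
            }
            // Prints A=2 and B=2, matching the expected result
            plantsByClass.forEach((cls, plants) -> System.out.println(cls + "=" + plants.size()));
        }
    }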

  • GRC 5.3: CUP risk analysis VS. RAR risk analysis

    I've installed and configured RAR and CUP.  When I do a risk analysis simulation in RAR on a user for adding a role, it comes back with no conflicts.  When I go into CUP and make a new request for adding the same role to the same user, it comes back with risk violations, but it looks like they are critical actions that are being flagged.  Why is there a discrepancy, and how do I go about getting the same risks in CUP as I do in RAR?

    >
    Frank Koehntopp wrote:
    > I guess the behaviour is on purpose.
    >
    > In RAR, you can do a selective analysis on only one kind of risk. You usually only need to do that in the remediation process, where this kind of selection is helpful to track down the root cause (although I'd like to have an ALL option in RAR as well...)
    >
    > In CUP, you do want to see any kind of risk that might arise from a role assignment to a user.
    >
    > I have to say, I cannot really understand why you'd want to switch off critical action or permission risks here. The user analysis in RAR and CUP serve two different purposes, hence I cannot see a bug here. If you have defined critical risks, why would you not want to see them?
    Hi Frank,
    I understand your point, but we are in the same situation as the others. We do not want to see Critical Action Risks in CUP because, for us, that is a separate process from the Permission Level Risk Analysis piece. With our current structure, our Security Admins use RAR to run Permission Level Risk Analysis and mitigate appropriately. A separate compliance group uses the Critical Action reports to see who has which critical tcodes, etc. We do not mitigate these "risks"; we more or less use it as a report.
    I do not understand what you mean when you say "The user analysis in RAR and CUP serve two different purposes" - I feel it should be the same purpose: to ultimately simulate whether adding security to a user will cause SOD violations. If I have CUP configured to do Permission Level Analysis, that's all I want to be seeing in CUP.
    Let me know if I need to clarify further.

  • Web Analysis Problem using the "Send to Excel" service

    I built reports using WebAnalysis 9.3.0.
    I attached the "Send to Excel" service button to my report. I am using a combination of Excel 2003 & 2007.
    I have the following settings in my webanalysis.properties file:
    MaxDataCellLimit=200000
    XLExportMaxRows=60000
    XLExportMaxColumns=200
    My problem is when I try to use the "Send to Excel" button. For "small" reports Excel opens quickly with my report information. Small being around 1,000 rows and 20 columns. If I add more rows, say another 400 or so - Excel never opens with my report information. Web Analysis sits there for about 5 minutes or so and does nothing. After that I can start using WA again - but no report in Excel.
    I get the same result by right clicking on the report in WA Studio and selecting "Export Data..."
    Any ideas?
    Thanks!

    Hyperion support suggested I try adding the Excel path to my webanalysis.properties file
    So I now have:
    ExcelPath=D:\\Progra~1\Micros~1\\OFFICE11\\Excel.exe;C:\\Progra~1\Micros~2\\OFFICE11\\Excel.exe
    MaxDataCellLimit=200000
    XLExportMaxRows=60000
    XLExportMaxColumns=200
    ResolveDimSetAliases=false
    Where D:\\Progra~1\Micros~1\\OFFICE11\\Excel.exe is the path to Excel as installed on the server where WA Studio is installed.
    And C:\\Progra~1\Micros~2\\OFFICE11\\Excel.exe is the path to Excel as installed on my laptop.
    This did not correct my problem.
    Am I using the ExcelPath=.... correctly?
    I was not sure this would work since WA Studio successfully starts Excel for "smaller" reports. But I tried it anyway.
    Any ideas?
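
    One thing worth checking, purely as an assumption: if webanalysis.properties is parsed as a standard Java properties file, a single backslash acts as an escape character and gets dropped, so every backslash in ExcelPath would need to be doubled consistently. A small sketch of that behaviour (the property value here mirrors the mixed escaping in the post):

    import java.io.StringReader;
    import java.util.Properties;

    public class PropsEscapeDemo {
        public static void main(String[] args) throws Exception {
            Properties p = new Properties();
            // As written in the file: D:\\Progra~1\Micros~1\\OFFICE11\\Excel.exe
            p.load(new StringReader("ExcelPath=D:\\\\Progra~1\\Micros~1\\\\OFFICE11\\\\Excel.exe"));
            // The single backslash before "Micros~1" is swallowed by the escape handling:
            System.out.println(p.getProperty("ExcelPath"));
            // Prints: D:\Progra~1Micros~1\OFFICE11\Excel.exe
        }
    }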

  • Problem in RAR Background Jobs

    Hi Experts,
    We are facing a problem during Batch Risk Analysis-
    Batch Risk Parameters-
    System- RP1
    Options-
    User
    Role
    Profile, Critical Actions and Role/Profile Analysis
    Management reports
    User and Role permission-level risk analysis run fine, but at User Critical Analysis, after 44% the system throws an error-
    Error log-
    Jul 23, 2009 4:02:45 PM com.virsa.cc.xsys.bg.CriticalTcdRoleAnalysis deleteExistingData
    INFO: Job ID:371 Critical Role/Profile data deleted
    Jul 23, 2009 4:02:45 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine riskAnalysis
    INFO:  Job ID:371 : Exec Risk Analysis
    Jul 23, 2009 4:02:45 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine riskAnalysis
    WARNING:  Job ID:371 : Failed to run Risk Analysis
    com.virsa.cc.dataextractor.dao.DataExtractorException: Cannot extract data from system (ECC Production System); for more details, refer to ccappcomp.n.log
         at com.virsa.cc.dataextractor.bo.DataExtractorSAP.searchUser(DataExtractorSAP.java:551)
         at com.virsa.cc.dataextractor.bo.DataExtractorSAP.userIsIgnored(DataExtractorSAP.java:529)
         at com.virsa.cc.xsys.meng.MatchingEngine.matchCritRoleProf(MatchingEngine.java:1203)
         at com.virsa.cc.xsys.riskanalysis.AnalysisEngine.performCriticalRoleAnalysis(AnalysisEngine.java:3634)
         at com.virsa.cc.xsys.riskanalysis.AnalysisEngine.riskAnalysis(AnalysisEngine.java:320)
         at com.virsa.cc.xsys.bg.CriticalTcdRoleAnalysis.performBkgTcdAnalysis(CriticalTcdRoleAnalysis.java:244)
         at com.virsa.cc.xsys.bg.CriticalTcdRoleAnalysis.loadAnalysisResult(CriticalTcdRoleAnalysis.java:78)
         at com.virsa.cc.xsys.bg.BatchRiskAnalysis.performBatchSyncAndAnalysis(BatchRiskAnalysis.java:1407)
         at com.virsa.cc.xsys.bg.BgJob.runJob(BgJob.java:427)
         at com.virsa.cc.xsys.bg.BgJob.run(BgJob.java:284)
         at com.virsa.cc.xsys.riskanalysis.AnalysisDaemonBgJob.scheduleJob(AnalysisDaemonBgJob.java:249)
         at com.virsa.cc.xsys.riskanalysis.AnalysisDaemonBgJob.start(AnalysisDaemonBgJob.java:81)
         at com.virsa.cc.comp.BgJobInvokerView.wdDoModifyView(BgJobInvokerView.java:444)
         at com.virsa.cc.comp.wdp.InternalBgJobInvokerView.wdDoModifyView(InternalBgJobInvokerView.java:1236)
         at com.sap.tc.webdynpro.progmodel.generation.DelegatingView.doModifyView(DelegatingView.java:78)
         at com.sap.tc.webdynpro.progmodel.view.View.modifyView(View.java:337)
         at com.sap.tc.webdynpro.clientserver.cal.ClientComponent.doModifyView(ClientComponent.java:481)
         at com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.doModifyView(WindowPhaseModel.java:551)
         at com.sap.tc.webdynpro.clientserver.window.WindowPhaseModel.processRequest(WindowPhaseModel.java:148)
         at com.sap.tc.webdynpro.clientserver.window.WebDynproWindow.processRequest(WebDynproWindow.java:335)
         at com.sap.tc.webdynpro.clientserver.cal.AbstractClient.executeTasks(AbstractClient.java:143)
         at com.sap.tc.webdynpro.clientserver.session.ApplicationSession.doProcessing(ApplicationSession.java:321)
         at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessingStandalone(ClientSession.java:713)
         at com.sap.tc.webdynpro.clientserver.session.ClientSession.doApplicationProcessing(ClientSession.java:666)
         at com.sap.tc.webdynpro.clientserver.session.ClientSession.doProcessing(ClientSession.java:250)
         at com.sap.tc.webdynpro.clientserver.session.RequestManager.doProcessing(RequestManager.java:149)
         at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doContent(DispatcherServlet.java:62)
         at com.sap.tc.webdynpro.serverimpl.defaultimpl.DispatcherServlet.doGet(DispatcherServlet.java:46)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:740)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:401)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:266)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:386)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:364)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:1039)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:265)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:95)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:175)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
         at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:102)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:172)
    Jul 23, 2009 4:02:45 PM com.virsa.cc.xsys.bg.CriticalTcdRoleAnalysis deleteExistingData
    INFO: Job ID:371 Critical Action data deleted
    Jul 23, 2009 4:02:45 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine riskAnalysis
    INFO:  Job ID:371 : Exec Risk Analysis
    Jul 23, 2009 4:02:45 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine performActPermAnalysis
    INFO:  Job ID:371 : Before Rules loading,  elapsed time: 0 ms
    Jul 23, 2009 4:02:45 PM com.virsa.cc.xsys.riskanalysis.AnalysisEngine performActPermAnalysis
    INFO:  Job ID:371 : Analysis starts: SUHAILS
    Jul 23, 2009 4:02:45 PM com.virsa.cc.xsys.meng.MatchingEngine matchActRisks
    WARNING:  Error :
    com.virsa.cc.dataextractor.dao.DataExtractorException: Cannot extract data from system (ECC Production System); for more details, refer to ccappcomp.n.log
    Has anyone experienced the same issue? Please suggest what the reason and solution could be.
    Thanks,
    Sabita

    Hi, I had included this reply in another thread with roughly the same issue:
    In the Configuration tab (Additional Options), do you have "Enable Offline Risk Analysis" set to "Yes"?
    If yes, the error could occur while updating the conflict permission details in the offline table VIRSA_CC_PRMVL. I believe this occurs for users who are assigned to big roles that contain lots of authorizations.
    As a test workaround, if you have set "Ignore Critical Roles & Profiles" to yes, you can enter those big roles into the list and run the batch risk analysis for only those users with errors.
    As another test workaround, if you have set "Consider Org Rules" to yes, set it to "No". After that, run the batch risk analysis again for the users with errors.
    I agree it is not a direct solution to the problem (which I am also facing right now and dealing with SAP about), but it helps to identify where the problem is.
    It would be interesting to know what solution SAP provides for your issue.

  • PsE 9 auto-analysis problems

    PsE 9 is having problems when attempting to auto-analyze photos.  The catalog I am working with is one created in PsE 8 and converted to PsE 9.  No conversion errors were reported.  The catalog is about 91 MB in size and contains 16,000 photos and a couple thousand video clips.
    These are my Media-Analysis preference options...
         (enabled) Analyze Photos for People Automatically
         (enabled) Analyze Media for Smart Tags Automatically
         (enabled) All filters
         (disabled) Run Analyzer on System Startup
         (disabled) Run Analyzer only when System is Idle
    Analysis will stop after it has processed what seems to be a random number of photos.  I can monitor the progress by hovering the mouse over the purple tag at the bottom of the Organizer screen.  I know there is an issue when the same photo is processed for longer than the normal amount of time.  Normally photos are processed at more than one per second; I have waited up to 10 minutes for the same photo to process.
    The ElementsAutoAnalyzer.exe process exists in the Windows Task Manager.  However, the CPU utilization is at 0% when analysis is stalled.
    Attempting to manually analyze a photo results in a progress dialog that either shows zero progress and hangs or stops at 71%.  I need to exit the Organizer, kill ElementsAutoAnalyzer.exe, and then restart the Organizer to get it to work again. But then it starts analyzing from the beginning again and stops after some time.
    It looks like there is a memory leak in ElementsAutoAnalyzer.exe.  The VM starts at ~54 MB and keeps on growing with every processed photo.  The size increases about 1 MB per second (about 15 photos).   The organizer stops showing progress after the  ElementsAutoAnalyzer.exe VM size reaches around 1.7 GB.

    I'm trying to use the advertised feature to help me narrow down issues with photos.  I've already got too many of them.  I will take any help an automatic tool can give by flagging potential issues with photos.  I can be the final judge of whether a photo is out of focus or too dark.  My plan is to filter by Blur, for example.  Then I can quickly assess all such photos and remove the ones that are clearly not desirable.
    Many of the issues pointed out by John Rellis can be resolved by making your media read-only.
    The large-CPU-time issue pointed out by John looks like it was resolved in PsE/PrE 9 by the new "Run Analyzer only when System is Idle" option.
    I also use this for videos.  The scene detection is great and helps me pick out clips as I am assembling my videos.
    I had the impression that Adobe cleaned up some of the auto-analysis features with PrE 9 so that more people could take advantage of them, as I would like to.

  • BI7 BEx Analyser problem

    Hi,
    I am working on BI7. The problem is that I cannot open the BI7 BEx Analyzer, although I can open the 3.x Analyzer. On clicking the BI7 Analyzer it says 'Critical program error occurred. The program needs to close. Please refer to the trace for further information'.
    My question is: if I can open the 3.x Analyzer, why am I getting an error for the BI7 Analyzer?
    Can someone please guide me in resolving this issue?
    For more details: MS Office 2003 Patch 2,
    SAP GUI patch level 7.

    Hi,
       Looks like a GUI problem; you'll need to install BI again.
    Uninstall BI and then do the following.
    Install the following software / patches before installing the SAP FrontEnd:
    •     .NET Framework 2
    [http://www.microsoft.com/downloads/details.aspx?FamilyID=0856EACB-4362-4B0D-8EDD-AAB15C5E04F5&displaylang=en]
    •     Microsoft Visual J# .NET Version 2 Redistributable Package
    [http://www.microsoft.com/downloads/details.aspx?familyid=F72C74B3-ED0E-4AF8-AE63-2F0E42501BE1&displaylang=en]
    •     Install the Office patch KB907417 if not present.
    [http://www.microsoft.com/downloads/details.aspx?FamilyId=1B0BFB35-C252-43CC-8A2A-6A64D6AC4670&displaylang=en]
    Then install the BI7 frontend and BEx.
    Install the BI patch.
    Regards.

  • Problem downloading rar file using HttpURLConnection

    I'm trying to read a rar (archive) file from the Internet using an HttpURLConnection.
    The program seems to work fine - however the download is not the same size as the file when downloaded through a browser.
    The archive is 52,332kb when downloaded through a browser
    and 4kb smaller when I try to download it.
    When I try to extract the downloaded file (with WinRAR) the header is corrupt and there is an unexpected end of archive.
    Here is the pertinent code.
    try {
        // configure and open connection
        HttpURLConnection conn = (HttpURLConnection) currURL.openConnection(); // currURL is a URL object
        conn.setRequestProperty("Connection", "Keep-Alive");
        conn.setRequestProperty("Accept-Encoding", "gzip,deflate");
        conn.setRequestProperty("Content-type", "application/x-rar-compressed");
        conn.setRequestProperty("Cookie", cookie);
        conn.connect();
        // get an input stream from the HttpURLConnection object
        InputStreamReader isr = new InputStreamReader(conn.getInputStream());
        BufferedReader in = new BufferedReader(isr);
        /* the string file_name has been parsed from the url
         * and is in the format file_name.rar */
        File dest = new File("C:\\my_folder\\" + file_name);
        // open output stream
        OutputStream os = new FileOutputStream(dest);
        OutputStream buffer = new BufferedOutputStream(os);
        int data;
        while ((data = in.read()) != -1)
            buffer.write(data);
        // close streams
        in.close();
        os.close();
    } catch (IOException iox) {
        iox.printStackTrace();
    }
    Thanks in advance.

    Thanks ejp. I closed 'buffer' instead - that fixes the size problem.
    The archive is now exactly the same size as it should be - but the header information is still corrupt.
    When I compared the files using a binary comparison program, it said that the files, although the same size, differed by approx. 6.7 MB out of 53.5 MB.
    I'm perplexed as to what this difference could stem from; is it the encoding, the charset, or something similar?
    The underlying file I'm trying to download is an avi, if that makes any difference.
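
    A likely culprit for the remaining corruption is the character decoding: reading binary data through InputStreamReader/BufferedReader converts bytes to chars via a charset, which is lossy for non-text content such as a RAR or AVI file (and requesting "gzip,deflate" without decompressing the response can also change the bytes). A minimal byte-for-byte sketch, under those assumptions and with illustrative names:

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class BinaryDownload {
        // Copies the response body byte for byte; no Reader, so no charset decoding.
        static void download(URL url, File dest) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            // Deliberately not requesting gzip/deflate, so the bytes arrive unmodified.
            conn.connect();
            try (InputStream in = new BufferedInputStream(conn.getInputStream());
                 OutputStream out = new BufferedOutputStream(new FileOutputStream(dest))) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            } // try-with-resources (Java 7+) flushes and closes both streams
        }
    }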

  • ADDM and performance analysis problem

    Respected Everyone,
    I am facing a problem; the details are as follows.
    A few days back, my users complained about slow DB response. When I opened the OEM control page, I found that the SYSMAN user had started a series of commands and had been consuming CPU resources for the last 2 hours. I analyzed a few commands running under the session and decided to kill it. After killing the session everything was OK, except for one thing I noticed a few hours later: performance analysis is not refreshing the advice on the OEM control general page.
    What have I done so far?
    1) I first started ADDM manually, but even on the ADDM result page it shows the same old statistics.
    2) I restarted the database, but to no avail.
    3) I restarted the machine, thinking it might be a dbconsole problem, but to no avail.
    Now please suggest what I should do.
    Thanks and Best Regards
    Nadeem Hameed

    Well, I checked the ADDM analysis; it is OK on the ADDM page.
    Tell me why it is not updating the statistics on the general page.
    Thanks
    Nadeem

  • Vibration octave analysis problem

    I'm measuring vibration in the range of 1-80 Hz, according to ISO 2631.
    The problem is that I need the velocity information for each 1/3-octave band, expressed in mm/s. If fractional-octave analysis is used, it always ends up in a PSD-style unit, (mm/s)^2. Using basic FFT analysis the unit is correct (mm/s) but the resolution is finer than 1/3 octave. Any ideas?
    PS: Are there any plans to develop the weighting filter needed by ISO 2631-2?

    Human vibration filters compliant with ISO 2631 and ISO 5439 were added to the Sound and Vibration Measurement Suite in version 6.0 at the end of 2007.
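
    On the unit question: a band level in (mm/s)^2 is a power-like quantity, so the RMS velocity of a 1/3-octave band is just the square root of the power summed over that band. A minimal sketch of the conversion, assuming a single-sided velocity amplitude spectrum in mm/s with bin spacing df (the names are illustrative, not the toolkit's API):

    // Converts a single-sided velocity amplitude spectrum (mm/s per FFT bin, bin spacing df in Hz)
    // into the RMS velocity (mm/s) of one 1/3-octave band centred at fc.
    static double thirdOctaveRms(double[] amplitude, double df, double fc) {
        double fLow = fc / Math.pow(2.0, 1.0 / 6.0);   // lower band edge
        double fHigh = fc * Math.pow(2.0, 1.0 / 6.0);  // upper band edge
        double power = 0.0;                            // accumulated (mm/s)^2
        for (int i = 0; i < amplitude.length; i++) {
            double f = i * df;
            if (f >= fLow && f < fHigh) {
                // a sinusoidal component with peak amplitude A contributes A^2 / 2 to the mean square
                power += amplitude[i] * amplitude[i] / 2.0;
            }
        }
        return Math.sqrt(power);                       // back to mm/s
    }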

  • Transcoding the Analysis Problem with Audio

    I submitted a previous problem regarding sound degradation on imported media.  I diagnosed my problem and uncovered what I feel is a serious fault in Final Cut Pro X.  The media was recorded in a very noisy environment, i.e. hotel ballrooms.  This provides a lot of background noise: fans, electric pops, traffic noise and so on.
    I had accepted the default option to analyze and fix the audio problems. IMHO, Final Cut tried to fix too much.  This produces thin-sounding, often broken-up audio.  Since the media was lengthy (50 minutes) it took about an hour to process.  Before the processing completed, working with the project was fine.  As soon as the processing completed, the problem appeared.
    So if you experience similar problems, just import the media without the audio automatically fixed and do it manually.

    No, just go into the controls and tweak it manually.  The analysis has to run anyway, may as well run it during import.

  • Mercalli V2 analysis problem

    Mercalli V2 users know that analysis is required before seeing the stabilized clip. My problem in Premiere Pro CS5 with Mercalli V2 is that once I prerender the clip by pressing Enter, Mercalli asks for analysis again, but then even if I go to the Mercalli UI and press OK to start a new analysis it doesn't start anymore, because it knows that it's done already...
    Does anybody know the solution to this problem?
    Thanks

    The current Mercalli version ((Windows) 2.0.69, (Mac) 2.0.70) works better with transitions.
    The improvement is a workaround to fix a Premiere CS4 and CS5 bug.
    If the workaround is activated, a warning symbol is visible in the UI. Click on it to get a message.
    The opposite case:
    Mercalli can't detect clip trimming correctly.
    Tip:
    If the start point was trimmed, the stabilization will break.
    For a steady output, the user must open the Mercalli UI and press the "OK" button to reanalyze.
    I have done some tests with transitions and all works fine!

  • Temperature sweep analysis problem

    Hello
    I am having problems getting even the simplest temperature sweep analysis to work in Multisim 10.
    As an example, please see the attached circuit and pictures.
    The circuit contains a simple voltage divider where R1 has a TC1 of 1 ohm/C. As shown in figure 1, the resulting voltage is 0.5 V when running the simulation at
    27 degrees C with TNOM 27 C.
    Raising the temperature to 28 degrees C results in a voltage of 0.33 V (figure 2), indicating a doubling of the R1 value because of the 1-degree temperature rise with a TC1 of 1 ohm/C.
    To my understanding, the R1 value should have changed by only 1 ohm.
    Any hints on what might be the problem with my temperature analysis?
    Does anybody know if this is a known error in Multisim?
    Any advice/hint is appreciated!
    Attachments:
    test2.ms10 ‏37 KB
    14.JPG ‏74 KB
    22.JPG ‏63 KB

    You are correct in your observation. The value is effectively being doubled. This is apparently a software error that will have to be further addressed by NI. I am responding only to verify your findings (2 or more people reporting the same results are always better than just one person reporting it).
    I have no answer as to how to work around this. NI will have to read this and maybe they can come up with a temporary solution until a permanent one can be found.
    Kittmaster's Component Database
    http://ni.kittmaster.com
    Have a Nice Day
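
    For context, SPICE-style simulators usually treat TC1 as a fractional temperature coefficient (per degree) rather than an absolute ohms-per-degree change, and that interpretation would also produce a doubling for TC1 = 1. A minimal sketch of that standard model, offered as an assumption about what the sweep computes rather than a confirmed explanation of Multisim's behaviour:

    // Standard SPICE resistor temperature model (assumed):
    // R(T) = R(TNOM) * (1 + TC1 * (T - TNOM) + TC2 * (T - TNOM)^2)
    static double resistanceAt(double rNom, double tc1, double tc2, double tempC, double tnomC) {
        double dT = tempC - tnomC;
        return rNom * (1.0 + tc1 * dT + tc2 * dT * dT);
    }
    // Example: with TC1 read as 1 per degree C, a 1 kohm resistor doubles for a 1-degree rise:
    // resistanceAt(1000.0, 1.0, 0.0, 28.0, 27.0) -> 2000.0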

  • Runtime Analysis problem

    Hi all,
    I am trying to run a program in SE30 for performance analysis. After giving the program name, when I try to execute it, it displays the error "Unable to write to the measurement data file". What does this error relate to, and how can I do the performance analysis? Please help.
    Balu.

    Hello Bala,
    I think this is an authorization problem.
    Create a sample program, use it in SE30, and check whether you get the same error for the sample report.
    If so, it is an authorization problem.
    In the report MS38TF02, put a breakpoint at this line:
    * AuthCheck for 'T'/'F' automatically by run
    * AuthCheck for 'R' (see FB SUBMIT_REPORT)
      if p_obj_type = 'R' and trdir-secu <> space. "Report
        authority-check object 'S_PROGRAM'
          id 'P_GROUP'  field trdir-secu
          id 'P_ACTION' field 'SUBMIT'.
        if sy-subrc <> 0.
          set cursor field 'RS38T-REPO_NAME'.
          message e005.
        endif.
      endif.
    If the same error persists for all programs, then it is a problem of authorization only.
    If useful, reward.
    Vasanth

  • Association analysis problem

    Hello...
    I created an association analysis data mining model, but when I run it the job gets cancelled and I get the following error:
    No Large itemsets could be generated for the specified model parameters
    No Large itemsets could be generated for the specified model parameters
    Message no. RSDME434
    I'm running the model against a 2-million-record BIA-indexed InfoCube; the transaction InfoObject is day + associate number.
    Has anyone got this message?
    What can be the problem?
    Regards
    Eli Hueramo

    Hello,
    I think your support and confidence are too high. You have to reduce these figures and you will get some results.
    Kind regards
    Tobias
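
    For reference, support and confidence are ratios over the transaction data: support(A -> B) is the share of all transactions containing both A and B, and confidence(A -> B) is the share of transactions containing A that also contain B. If either threshold is set higher than the data can actually produce, no large itemsets survive, which is what the error message describes. A minimal sketch with made-up transactions (not the RSDME implementation):

    import java.util.*;

    public class AssociationExample {
        public static void main(String[] args) {
            // Hypothetical transactions: each set is the items occurring together.
            List<Set<String>> txns = List.of(
                Set.of("A", "B"), Set.of("A", "B", "C"), Set.of("A"), Set.of("B", "C"));
            long both = txns.stream().filter(t -> t.contains("A") && t.contains("B")).count();
            long hasA = txns.stream().filter(t -> t.contains("A")).count();
            double support = (double) both / txns.size();   // 2/4 = 0.50
            double confidence = (double) both / hasA;       // 2/3 = 0.67 (approx.)
            System.out.printf("support=%.2f confidence=%.2f%n", support, confidence);
        }
    }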
