HANA Studio "Memory Overview" shows No Data

Hi,
In HANA Studio I am attempting to view the graphs in "Memory Overview" and "Resource Utilization". Each one only shows "No Data", as seen in the image below. My HANA DB is 1.0 SP07.
I have set the statisticsserver parameter to "true" in the nameserver.ini configuration. My user also has the required authorizations.
What else am I missing? Any help would be greatly appreciated. Thanks!

After applying patch 72 for SP07, the memory overview is now showing data.
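For anyone hitting the same issue: the statisticsserver switch mentioned above is normally set with a statement like the one below. This is only a sketch; the section/key names assume the embedded statistics service introduced around SPS 07, so check the relevant SAP Note for your revision.

    -- Enable the embedded statistics service (nameserver.ini, system layer)
    ALTER SYSTEM ALTER CONFIGURATION ('nameserver.ini', 'SYSTEM')
      SET ('statisticsserver', 'active') = 'true' WITH RECONFIGURE;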

Similar Messages

  • How do I enable the Memory Overview in HANA Studio?

    I read John's blog about the SPS7 features:
    http://scn.sap.com/docs/DOC-49991#comment-467746
    Now I wanted to see the Memory Overview myself. I granted the role sap.hana.admin.roles::Monitoring to user SYSTEM and opened the Memory Overview. This is what I get:
    I cannot see the Memory Overview with HANA revision 70, 71 or 72.
    I cannot see the Memory Overview with HANA Studio running on Linux or Windows.
    I cannot see any entries in the tracefiles of the HANA database.
    The same problem with Resource Utilization, I get a blank window.
    The HANA database is up and running. Database, Client and Studio are revision 72.
    What am I doing wrong? It seems that I am not the only one affected.
    Regards,
    Mark

    Hi Mark,
    Have a look at the loaded Delivery Units (DU) in your SAP HANA system. Is the HANA_ADMIN DU available?
    Check whether all the Delivery Units have been loaded:
    In SAP HANA Studio, open the Modeler perspective and check the Delivery Units.
    In the menu, choose Window => Open Perspective => Other and select the SAP HANA Modeler perspective.
    In the Quick Launch view navigate to Delivery Units. The following Delivery Units should be loaded by default.
    HANA_ADMIN
    HANA_IDE_CORE
    HANA_TA_CONFIG
    HANA_UI_INTEGRATION_CONTENT
    HANA_UI_INTEGRATION_SVC
    HANA_XS_BASE
    HANA_XS_EDITOR
    HANA_XS_FORMLOGIN
    HANA_XS_IDE
    HANA_XS_LM
    HANA_XS_SQLCC
    SAPUI5_1
    If Delivery Units are missing, please load them using the Import option in the Quick Launch view: click Import => SAP HANA Content => Delivery Unit.
    Select the files from the Server option.
    Open the File drop-down menu and select the missing Delivery Units one by one.
    To load the Delivery Unit press Finish and repeat this for all missing Delivery Units.
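    If you prefer SQL, you can also list what is installed from the repository. This is only a sketch; it assumes the standard "_SYS_REPO"."DELIVERY_UNITS" view is accessible to your user.

        -- List the Delivery Units known to the repository
        SELECT DELIVERY_UNIT, VENDOR, VERSION, VERSION_SP, VERSION_PATCH
          FROM "_SYS_REPO"."DELIVERY_UNITS"
         ORDER BY DELIVERY_UNIT;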
    Hay

  • HANA DB 1.00 Build 20 - Error import COPA Data with HANA STUDIO 1.00...

    Hi,
    Has anyone tried loading data records into HANA DB 1.00 Build 20 from the CTL files provided in https://wiki.wdf.sap.corp/wiki/display/HANADemoenvironment/HowtoloadCOPAdata ?
    It worked on HANA DB 1.00 Build 15 but not with the new build...
    The procedure was exactly the same.
    The table creation works, but I get this error when importing data from each CSV file:
    Command with HANA Studio (Build 20):
                    load from '/tmp/export/TCURT.ctl' threads 4 batch 1000;
    Error:
                   Could not execute 'load from '/tmp/export/TCURT.ctl' threads 4 batch 1000'
                   SAP DBTech JDBC: [257]: sql syntax error: line 1 (at pos 3643214585)
    I also tried:  load from '/tmp/export/TCURT.ctl' -> same error...
    We have all privileges on the tmp folder and files.
    I also tried copying fresh data from the source...
    Do you have an idea?
    Thanks in advance for your help.
    Best regards,
    Eric MG

    Hello,
    The LOAD TABLE command was never public - its full syntax was not documented in any SAP guide on help.sap.com or the Service Marketplace (except the SAP HANA Pocketbook draft). It is therefore SAP-internal syntax that can change at any time, and I guess this is what happened: the command is either deprecated or its syntax has changed.
    If you need to import a CSV file, I would suggest using an official way - either the IMPORT command (which was also not very public before, but since SP03 it is mentioned in the SQLScript Guide) or, better, the SAP HANA Studio import functionality (just make sure you select CSV as the file type).
    You can find explanation of both ways in this blog:
    SAP HANA - Modeling Content Migration - part 2: Import, Post-migration Activities
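    For example, an IMPORT of one of those CSV files would look roughly like this. Schema, table, path and delimiters are only illustrative; check the SQL reference for your revision.

        -- Import a CSV file into an existing column table
        IMPORT FROM CSV FILE '/tmp/export/TCURT.csv'
          INTO "MYSCHEMA"."TCURT"
          WITH RECORD DELIMITED BY '\n'
               FIELD DELIMITED BY ','
               THREADS 4 BATCH 1000;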
    Tomas

  • Is data compression all automatic? Or does manual steps occur in HANA Studio

    Hi all,
    I'm new to HANA and have been leveraging SCN in a BIG way to answer my questions (thanks everyone for your contributions). This is my first post, as I was unable to find an answer to my current question.
    I've been reading up on data compression in HANA and I learned that there are different techniques, such as Dictionary, Cluster, and Run-length encoding.
    My question is: Is the compression automatic? Or does it need to be modeled within HANA Studio? Let's use Dictionary Encoding as an example. Are there algorithms in place within HANA that will automatically aggregate values so only distinct values remain? Will the attribute vector and inverted index tables be created automatically?
    Just as some background, this is what I am basing my question on:
    http://www.agilityworks.co.uk/our-blog/demystifying-the-column-store-%E2%80%93-viewing-column-store-statistics-in-sap-ha…
    Thank you!
    Kyle

    Hi Justin,
    you are right, the compression is related to the delta merge - and therefore, as long as delta merges happen automatically, compression will also happen automatically.
    SAP HANA has two compression stages on the column store: the first - and typically dominating one - is the implicit compression obtained from using a data dictionary for all column-store columns.
    The second stage is often called "sparse compression" - it offers additional compression on top of the dictionary compression, with several available algorithms. During the "optimize compression" run, the most appropriate compression algorithm is chosen for each column (and some columns may not be sparse-compressed at all, because it would not bring a benefit).
    The optimize compression run does not happen with every delta merge. Instead, it is performed with the first delta merge on a given table, and then only if the data content of the table has changed significantly (typically, if I remember correctly, the system waits until the number of records has doubled). Note that the optimize compression run may _change_ the compression type chosen for a given column. The sparse compression itself is done with every delta merge, using the algorithms selected in the last optimize compression run.
    If you have switched off automatic delta merging globally, or if you have disabled automatic delta merging for a given table, there will also be no automatic compression (in the entire system or on the given table). BW on HANA uses the smart merge feature, so that in a standard BW on HANA, automatic compression will happen (with the timing of the delta merge being decided in a cooperation between BW and HANA).
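    If you want to see the outcome for a particular table, the column-store monitoring views expose it. This is only a sketch; 'MYSCHEMA' and 'MYTABLE' are placeholders.

        -- Compression type chosen per column after the optimize compression run
        SELECT COLUMN_NAME, COMPRESSION_TYPE, LOADED, MEMORY_SIZE_IN_TOTAL
          FROM M_CS_COLUMNS
         WHERE SCHEMA_NAME = 'MYSCHEMA'
           AND TABLE_NAME  = 'MYTABLE';

        -- Trigger a delta merge manually if automatic merging is switched off
        MERGE DELTA OF "MYSCHEMA"."MYTABLE";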
    Best,
    Richard

  • Getting error for Analytic Views data preview on HANA Studio

    Hello
    I have multiple Analytic Views, and when I try a data preview they all throw the error "Object not active or broken" (attached below). I have activated these objects 5-6 times and got a success message, but I am still unable to preview the data.
    Other people on the team, and my own user ID from a different machine, can run the data preview successfully, so the problem must be with my PC. I have already uninstalled HANA Studio and the HANA Client, restarted the PC, and reinstalled three times, but the problem is not solved. During re-installation I noticed that my previous settings, such as the system details, were already populated. Even when I installed into different folders the problem remained, and the system details reappeared on the next start.
    Please guide me to solve this problem
    Below error coming while trying for data preview.
    Thanks
    Suman

    Hi Suman,
    Can you change your system workspace and also the secure storage?
    This can be done via parameters in the Studio shortcut: in the Target field, after the .exe path, add
    -data "<WorkspaceLocation>" -eclipse.keyring "<AnyLocation>"
    While this is usually a server-side error, in your case it appears to be Studio-specific.
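    For example, the full Target line might look like this (the installation path and folder names are just placeholders, not your actual locations):

        "C:\Program Files\SAP\hdbstudio\hdbstudio.exe" -data "C:\hana\workspace_new" -eclipse.keyring "C:\hana\keyring\secure_storage"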
    Nevertheless, do the above settings and let me know your results.
    Regards,
    Anjali.

  • Importing COPA data into HANA Studio

    Hello-
    I am currently trying to import my COPA data into my HANA Studio.
    I have extracted the files and I am trying to follow the steps listed here:
    https://wiki.wdf.sap.corp/wiki/display/bobjdemo/HowtoimportCOPAdata
    Seems easy enough, but I can't decipher the first step: "1. Copy copa_export.tar.bz2 file from idesapps\publicshare\HANA to HANA server /tmp using tools like WinSCP"
    I downloaded, but could not gain access to WinSCP. What are other tools "like" WinSCP?
    Is there an alternate way to import this data without the first step?
    Thank you so much.
    Truly appreciative,
    John

    Hi John,
    That wiki link doesn't work as it is an internal SAP site.
    In any case, WinSCP is an FTP/SFTP file transfer tool. You just need to get that file over to the HANA server somehow. It doesn't really matter what tool you use, as long as it works!
    Cheers,
    Ethan

  • Unable to find custom table in HANA Studio - Data Provisioning - SLT

    Hi All,
    We are working on a scenario in which we have created a custom table in ECC. When I try to find the custom table in SAP HANA Studio in order to set up replication using SLT, I am not able to find it. The custom table's entries are maintained in DD08L, DD02L and DD02T. We are able to find standard tables in SAP HANA Studio that can be replicated using SLT.
    Could you please let us know the reason, or any prerequisites we are missing, for custom tables to be visible in SAP HANA Studio for SLT.

    Hi,
    Since you confirmed that the custom table details are available in DD02L, DD02T and DD08L,
    I would ask you to check the language in which the source table is maintained/created.
    If the table at the source is maintained in a language different from the one you are logged on to HANA Studio with, it will not be displayed in SLT provisioning.
    Please let me know your observation.
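    As a quick check on the ECC source, you could look at the languages in which the table's description is maintained. This is only a sketch; the table name is a placeholder.

        -- Languages in which the custom table's description exists (run on the source system)
        SELECT TABNAME, DDLANGUAGE, DDTEXT
          FROM DD02T
         WHERE TABNAME = 'ZMY_CUSTOM_TABLE';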
    BR
    Prabhith

  • Frontend instance: Hana studio shows up with MSVCR100.DLL missing error

    Hi all,
    while trying to start HANA Studio on my (newly created, unmodified) frontend instance for the BW740 trial, I get an error dialog telling me that MSVCR100.DLL is missing.
    What is going wrong here? Am I supposed to install this C++ runtime DLL? Or is the Windows image on which this installation is based corrupted?
    Any help is appreciated,
    Gerald Sauerwein

    Hi Gerald,
    quick question, which of these two solutions are you using:
    SAP Business Warehouse 7.4 SP5 incl. SAP Business Objects BI 4.1 SP2 on SAP HANA 1.0 SP7
    SAP Application Server ABAP 7.4 SP5 incl. Business Warehouse on SAP HANA 1.0 SP7 [Developer Edition]
    Both use different Windows images, so this will help us narrow down the issue.
    Thanks and Regards,
      Hannes

  • No SQL (hdbsql / HANA Studio) Connection to SAP HANA Rev. 69 (all services GREEN, running)

    Hi all,
    after starting the HANA instance (locally installed on a Linux laptop) via sapcontrol [1], both sapcontrol's GetProcessList and the HANA Studio Administration Console diagnosis mode show all services as GREEN, running.
    However, when I try to open "Default Administration" in HANA Studio, it stays at "Refreshing Overview" and remains busy indefinitely.
    Connecting via the \c command of HDBSQL eventually works, but commands like \ds (list schemas) run indefinitely without returning anything.
    The SAP Business One for HANA SLD shows the server but not the company DBs / schemas.
    I attached sapstart.log and statisticsserver_alert_sid.trc.
    Any advice is highly appreciated.
    Thanks & Kind Regards
    Stephan
    [1] I start HANA executing: /usr/sap/<SID>/SYS/exe/hdb/sapcontrol -prot NI_HTTP -nr 00 -function StartWait 270 2 OR /usr/sap/<SID>/SYS/exe/hdb/sapcontrol -nr 00 -function Start with user sidadm
    [2] The issue occurs since the upgrade from Rev. 53 to HANA Rev. 69 (in the context of upgrading from Business One for HANA PL02 to PL09)

    Yes Aleksandr,
    I think you are right.
    After many installations of the HANA server in different AWS instances, it didn't work for me.
    But when I installed HANA Studio on another machine with more RAM, it worked.
    I think we have to install HANA Studio on an AWS instance with 16 GB RAM to make it run smoothly.
    Thanks & Regards
    Manu

  • How to add system to login using sid adm user in HANA Studio

    Hi,
    I have added my TEST system in HANA Studio with the SYSTEM user. Now I would like to add the TEST system using the <sid>adm user.
    It shows "The system cannot be reached. The logon data could not be used. Do you want to add the system anyway?".
    How can I add the TEST system using the <sid>adm user?
    Thank you.
    regards,
    zl

    You need a database user, not the operating system user <sid>adm.
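    If you do not want to reuse SYSTEM, a minimal sketch for creating a dedicated database user would be the following. User name, password and role are placeholders; adapt them to your password policy and authorization concept.

        -- Create a database user and allow read-only monitoring access
        CREATE USER TESTADMIN PASSWORD "Initial1234";
        GRANT MONITORING TO TESTADMIN;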

  • Understanding of the table level information from SAP HANA Studio

    Hello Gurus,
    I need some clarification on the following information provided by SAP HANA Studio.
    I have a table REGIONS and the contents are as follows:
    REGION_ID REGION_NAME
    11      Europe
    12      Americas
    13      Asia
    44      Middle East and Africa
    15      Australia
    6      Africa
    The Runtime Information about the table is as follows:
    Image# 1
    Image# 2
    Image# 3
    The total sizes in KB shown in Image #2 and Image #3 do not match. Why?
    The total size in KB shown in Image #2 and the Total Memory Consumption do not match. Why?
    The Memory Consumption values for Main Storage and Delta Storage do not match between Image #1 and Image #2. Why?
    The Estimated Maximum Memory Consumption (Image #1) and the Estimated Maximum Size (Image #2) do match.
    Why is the Loaded column in Image #2 showing the value 'PARTIALLY'? The table has just 6 rows and is still only partially loaded. Why?
    What is the significance of the Loaded column in Images #2 and #3?
    Thanks,
    Shirish.

    Have a look at this:
    Playing with SAP HANA
    That presentation should help you answer the questions yourself.
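    For the concrete numbers behind those screens, you can also query the monitoring views directly. This is only a sketch; the schema name is a placeholder and the table name is the one from your example.

        -- Table-level figures, comparable to the Runtime Information tab
        SELECT TABLE_NAME, LOADED, RECORD_COUNT, MEMORY_SIZE_IN_TOTAL,
               MEMORY_SIZE_IN_MAIN, MEMORY_SIZE_IN_DELTA,
               ESTIMATED_MAX_MEMORY_SIZE_IN_TOTAL
          FROM M_CS_TABLES
         WHERE SCHEMA_NAME = 'MYSCHEMA' AND TABLE_NAME = 'REGIONS';

        -- Column-level load status; "PARTIALLY" means only some columns are loaded
        SELECT COLUMN_NAME, LOADED, MEMORY_SIZE_IN_TOTAL
          FROM M_CS_COLUMNS
         WHERE SCHEMA_NAME = 'MYSCHEMA' AND TABLE_NAME = 'REGIONS';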
    Regards,
    Krishna Tangudu

  • Start/Stop SAP Hana on SAP Hana Studio - Authentication Method

    Hi dear SAP Hana Colleagues !
    For security reasons, the Security Team is disabling password authentication for the SAP OS users (<sid>adm). The password will be set to "locked" in the /etc/passwd file to block interactive logins.
    As described in section 2.2 of the "SAP HANA Administrator's Guide", to stop/start SAP HANA systems from SAP HANA Studio we need to provide user/password credentials, and this will not be possible if the password remains "locked" in the OS.
    Is there another way to authenticate the <sid>adm user from SAP HANA Studio in order to stop/start the SAP HANA system?
    I look forward to your answers.
    Best regards,
    Rodrigo A. Botelho

    Rodrigo,
    this space is for HANA in Portuguese.
    Please re-post your question in Portuguese and/or ask it in the English-language HANA space:
    SAP HANA and In-Memory Computing
    I am going to lock this thread now.
    Regards,
    Henrique.

  • Some thing wrong with HANA Studio embedded browser setting?

    I have HANA Studio 1.50.1 and the sap.hana.admin.roles::Monitoring role, but when I try to open 'Resource Utilization', the 'Memory Overview' editor or the 'Memory Allocation Statistics' editor, they all show as blank without any error. What is wrong here?

    Version 1.50 would indicate that you are running an SAP internal only build of the HANA Studio. Please do not post SAP internal questions on the public forums. If you are having problems with an internal build, I would suggest you uninstall and re-install an officially supported revision of the HANA Studio. Latest supported versions of the HANA Studio are available in the corporate software corner.

  • HANA STUDIO SPS06: IMPORT ERROR FOR A SCHEMA

    Hi HANA Experts,
    I have a problem exporting and importing data for a schema.
    For export I used the HANA Studio SPS06 interface; for import I used the command
    IMPORT "SFLIGHT" ."*" AS BINARY from "/tmp/SFLIGHT" WITH RENAME SCHEMA "SFLIGHT" TO "SHANA"
    I am getting the error as
    SAP DBTech JDBC:[2048]:column store error: table import failed: [30134] out of memory
    Kindly suggest. Thank you
    Regards

    PrPro will come to you. It has fewer "automations," and "big-button" solutions, than PowerDirector (and Magix and PrE), but with that comes power!
    It is the same with Encore vs the limited authoring capabilities of PowerDirector, Magix, Roxio and PrE. Again, more hand-work, but oh, so much more power!
    Can you tell that I am a control freak, and do not mind getting my hands dirty to access the power?
    There are a few things that Encore cannot do, that programs like Sonic Scenarist can do, but there are usually workarounds, either in the design, or in the implementation, that will get you "there," or close.
    I strongly recommend that any new user to Encore purchase and read Jeff Bellune's excellent book, The Focal Easy Guide to Adobe EncoreDVD 2.0, Focal Press. That book covers everything that Encore can do, with two current exceptions - Adobe Dynamic Link (CS4 version) and BD authoring. Both of those were added after the book and EncoreDVD 2.0, were released. However, the Help files cover both of those newer additions well.
    Along with the fundamentals of Encore, Jeff covers all sorts of tricks and tips for getting more out of Encore, than the DVD-specs. actually allow. Obviously, he cannot break those rules, though he CAN help you create the illusion that the rules HAVE been broken. The user will swear that they just saw things, that are impossible, based on the DVD-specs., which are very limiting. Hey, it IS "smoke and mirrors," but if the user is fooled, you have succeeded, right?
    Good luck, and when you get around to adding some "tricks" to your DVD-Videos, post back. I'd suggest a new thread, so that other users can benefit from the new title, and so that users know that this is a new question. You will find that there is not too much, that someone here has not done, one way, or another. Oh, there ARE some limitations, where there is no good way to "bend" those DVD-specs., but you'll be surprised at what one can "bend."
    Happy authoring,
    Hunt

  • Sharepoint web analytics does not show any data

    Hello,
    We installed Sharepoint Web Analytics some days ago (separated application pool).
    Installation completed successfully.
    Search works fine and returns expected results.
    But the reports still don't show any data:
    Data Last Updated: 15.08.2014 02:00:16 - There is no data available for this report. Here are some possible reasons: (1) Web Analytics has not been enabled long enough to generate data; (2) There is insufficient data to generate this report; (3) Data logging required for this report might not be enabled; (4) Data aggregation might not be enabled at the level required for this report.
    What I tried:
    connected to the website using different users (admin user included)
    checked that the needed services on the server are started (especially the analytics services)
    restarted the Web Analytics Data Processing Service and the Web Analytics Web Service
    checked that all service applications are started (the WSS_UsageApplication status was stopped, so I started it using the SharePoint 2010 Management Shell)
    checked the service application associations (especially whether the Analytics Service Application Proxy is checked)
    ran the jobs manually (Web Analytics Trigger Workflows Timer Job, Microsoft SharePoint Foundation Usage Data Import, Microsoft SharePoint Foundation Usage Data Processing)
    started an incremental crawl manually
    restarted IIS
    checked the scope of data logging (especially whether Enable usage data collection and Enable health data collection are checked)
    checked that the .usage files are generated correctly on the disk
    checked in the logging database (WSS_UsageApplication) that the RequestUsage view contains data collected from the .usage files
    checked that data is successfully extracted from the logging database into the staging database (LastLoggingExtractionTime)
    checked that data was successfully copied from the staging database to the reporting database (LastDataCopyTime)
    checked on the website side that the Advanced Web Analytics feature is Active
    recreated the web analytics application
    checked for any message in the ULS logs that could help... I only noticed this error:
    The SharePoint Health Analyzer detected an error. Drives are running out of free space. Available drive space is less than twice the value of physical memory. Can this cause the problem?
    All appears to work fine but I can't see any data in the reports.
    The thing is that Inventory data is collected successfully and I can see the related reports. Traffic and Search data are still empty.
    Anything else I can try ?
    Thanks.

    Yes I have granted application pool identity full control on the Web Analytics Service Application.
    I also checked that Sharepoint 2010 timer service is started.
    I can't see anything in the logs that could help when accessing the reports. The only thing I noticed is this kind of log entry, which occurs regularly and seems to be an exception:
    Enumerating all sites in SPWebApplication Name=SharePoint - 80.
    Site Enumeration Stack:
    at Microsoft.SharePoint.Administration.SPSiteCollection.get_Count()
    at Microsoft.SharePoint.Taxonomy.UpdateHiddenListJobDefinition.Execute(Guid targetInstanceId)
    at Microsoft.SharePoint.Administration.SPTimerJobInvokeInternal.Invoke(SPJobDefinition jd, Guid targetInstanceId, Boolean isTimerService, Int32& result)
    at Microsoft.SharePoint.Administration.SPTimerJobInvoke.Invoke(TimerJobExecuteData& data, Int32& result)
    Could this be the cause?
    I have also cleaned up the drive where the .usage files are generated, so the error "Drives are running out of free space. Available drive space is less than twice the value of physical memory." no longer appears.
