Guidelines for HANA SQL Script

Dear Experts,
Could you please share best practices for using SQLScript in procedures and script-based calculation views?
I have already searched for relevant information but could not find anything useful yet. Based on your experience, please share any guidance beyond what is available on help.sap.com.
Thank you,
Raghavendra.

Hi Raghavendra,
As Lars said, it is very difficult to answer your question; the right approach always depends on the solution design for the specific requirements. The general rules we follow in scripting are:
1. Avoid inline queries
2. Break complex SQL into small, simple SQL statements and combine the results (see the sketch below)
3. Always analyze the cost using Visualize Plan
4. Avoid cursors/loops
5. Avoid mixing plain SQL and CE functions (the most commonly cited best practice)
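To illustrate point 2, here is a minimal sketch of splitting one nested statement into intermediate table variables inside a read-only procedure (the tables SALES and CUSTOMERS and all column names are made up for this example):

CREATE PROCEDURE demo_split_query (
    OUT et_result TABLE (customer_id NVARCHAR(10), region NVARCHAR(20), total_amount DECIMAL(15,2)))
    LANGUAGE SQLSCRIPT SQL SECURITY INVOKER READS SQL DATA AS
BEGIN
    -- step 1: aggregate sales per customer
    lt_sales = SELECT customer_id, SUM(amount) AS total_amount
               FROM sales
               GROUP BY customer_id;
    -- step 2: restrict the customer master data
    lt_customers = SELECT customer_id, region
                   FROM customers
                   WHERE region = 'EMEA';
    -- step 3: combine the two intermediate results
    et_result = SELECT c.customer_id, c.region, s.total_amount
                FROM :lt_customers AS c
                INNER JOIN :lt_sales AS s
                ON c.customer_id = s.customer_id;
END;

Each intermediate result can then be examined on its own in Visualize Plan (point 3).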
Regards,
Chandra.

Similar Messages

  • STRING_AGG function in SAP HANA SQL Script

    Hi,
    We have HANA Revision 7.3.
    The STRING_AGG function seems to work when we just execute the query without assigning it to a variable.
    However, as soon as we assign it to a variable in a stored procedure or scripted calculation view, it fails.
    Any ideas?
    Thanks,
    Hyun

    Lars,
    My apologies.
    Here are three screenshots. Two show the error message generated
    when using STRING_AGG in a calculation view; one shows it running successfully in
    the SQL console when it is not assigned to an output variable.
    Thanks,
    Hyun
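    For reference, a minimal sketch of the two usages described above (the table T and the column VAL are made-up names; the second form is the one reported to fail):

    -- standalone query: works when executed directly
    SELECT STRING_AGG(val, ',') FROM t;

    -- the variant reported to fail: assigning the aggregate to a scalar
    -- output variable inside a procedure (the same pattern applies in a
    -- scripted calculation view)
    CREATE PROCEDURE demo_string_agg (OUT ov_list NVARCHAR(5000))
        LANGUAGE SQLSCRIPT READS SQL DATA AS
    BEGIN
        SELECT STRING_AGG(val, ',') INTO ov_list FROM t;
    END;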

  • What is the SQL ROWNUM equivalent in HANA SQL script

    Hi,
    Could anyone let me know what the ROWNUM equivalent is in HANA? Thanks for the help.
    Thanks,
    Jyothirmayi

    Hi Jyothirmayi,
    ROW_NUMBER() OVER (ORDER BY <FIELD LIST>) is the function to generate row numbers.
    The ORDER BY clause is mandatory when generating row numbers in HANA.
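    For example, a minimal sketch (the table SALES and the column AMOUNT are made-up names):
    SELECT ROW_NUMBER() OVER (ORDER BY amount DESC) AS row_num,
           amount
    FROM sales;
    Unlike Oracle's ROWNUM, the numbering follows the ORDER BY inside the OVER clause rather than the order in which rows happen to be fetched.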
    Regards,
    Chandra.

  • Retrieve alert values for use as parameter in corrective action sql script

    I am trying to write a corrective action sql script to kill a session that is blocking other sessions. I have the "blocking session count" metric set and the alert is firing correctly.
    Is there any way to retrieve the sid and serial number from the alert generated and use it in a corrective action sql script?
    Here is the alert generated:
    Target Name=myproddb.world
    Target Type=Database Instance
    Host=myprodserver
    Metric=Blocking Session Count
    Blocking Session ID=SID: 522 Serial#: 5228
    Timestamp=Mar 4, 2008 5:57:12 PM EST
    Severity=Warning
    Message=Session 522 is blocking 1 other sessions
    Notification Rule Name=Testing Corrective actions
    Notification Rule Owner=sysman
    Clearly the SID and serial# are contained within the alert Message field.
    What I want the SQL script to run is:
    alter system kill session '%sid%,%serial_no%' immediate;
    and have GC pass in the sid and serial_no to the script.
    The "Target Properties" listed on the right of the Edit Corrective Action screen lists minimal details pertaining to the alert and certainly not the session sid, serial no.
    Generically, is there any way to retrieve the values from an alert and use them in a corrective action script or job?
    I've looked into getting the values from the mgmt$alert_history table, but I'm hoping that GC can pass the values to the sql script.
    thanks in advance for your help.

    Hi
    You can implement a procedure like this.
    1. When a blocking session count alert occurs, there is a column in v$lock that you can examine.
    #!/bin/ksh
    # kill_block_session.sh
    # first export your Oracle environment variables
    export ORACLE_HOME=/oracle/product/10.2.0.3
    export ORACLE_SID=SIDNAME
    $ORACLE_HOME/bin/sqlplus "/ as sysdba" << EOF
    set serveroutput on
    exec killed_blocks
    EOF
    # end
    killed_blocks is a procedure:
    create or replace procedure killed_blocks
    as
      v_sid    varchar2(15);
      v_serial varchar2(15);
    begin
      -- a SQL query that retrieves the sid and serial# of the
      -- blocking session from v$session and v$lock
      -- (note: this assumes a single blocking session)
      select vs.sid, vs.serial#
        into v_sid, v_serial
        from v$session vs, v$lock vl
       where vs.sid = vl.sid
         and vl.block > 0;
      -- after this, output (or act on) the values with dbms_output
      dbms_output.put_line('Blocking session: ' || v_sid || ',' || v_serial);
    end;
    /
    But understand that this corrective action is very dangerous, because you may kill sessions whose blocking is only transient.
    You must examine your environment and your application and define the metric as a UDM (user-defined metric), rather than relying only on the blocking session count.
    You should look at:
    - The type of lock
    - The CTIME value in v$lock, to understand how long the block has existed and decide whether it really needs to be killed (see the example query after this list)
    - In my opinion you need a special UDM and should deactivate the blocking session count metric
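    For example, a hedged sketch of such a check (the 300-second threshold is only an illustration):
    -- look at the type and age (CTIME, in seconds) of blocking locks
    -- before deciding whether a kill is justified
    SELECT vl.sid, vl.type, vl.lmode, vl.ctime
    FROM v$lock vl
    WHERE vl.block > 0
      AND vl.ctime > 300;  -- e.g. only locks held longer than 5 minutes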
    If you want help creating this UDM, send me a mail at [email protected]
    Regards
    Robert

  • How to move SQL scripts using HANA ALM ?

    I see clear documentation on moving content easily from Dev to Test and Production using a Delivery Unit (DU), which exports only content. Similarly, can we use transport management/ALM for moving SQL scripts, for example table creation scripts or table structures that need to be moved from Dev to QA?
    Thanks
    Suresh

    Sutirtha,
    Thank you very much. I really appreciate all of your responses.
    I have one more question (sorry for flooding you with so many questions :) ).
    How can I capture the errors that occur inside the SQL scripts?
    My scenario is like this: I have two SQL scripts, Script_1.sql and Script_2.sql, and I would like to put a dependency on their execution:
    If Script_1.sql executes successfully without any errors, then EXECUTE Script_2.sql;
    else, if the Script_1.sql execution contains errors, then DON'T EXECUTE Script_2.sql.
    As of now, I have observed that the process flow shows a Success state even if there is an error in the SQL script execution, because of which I was not able to put a dependency on the execution of the scripts.
    I think I could do it by tweaking (changing) the code inside the SQL scripts, for example logging the errors to some log table and reading it before the next script execution.
    But the restriction is that I should not touch/change the existing SQL scripts.
    Do we have any mechanism in the process flows to identify the SQL errors that occurred during execution of the SQL scripts?
    Please suggest any ideas on this. It would be great if you could help with this.
    Thanks in advance,
    SriGP.
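    For reference, if the scripts were HANA SQLScript procedures and wrapping them (rather than editing them) were acceptable, the log-table idea mentioned above could look like this minimal sketch (SCRIPT_LOG and SCRIPT_1_PROC are hypothetical names):
    -- hypothetical log table
    CREATE TABLE script_log (
        script_name NVARCHAR(100),
        error_code  INTEGER,
        error_text  NVARCHAR(5000),
        logged_at   TIMESTAMP);

    -- wrapper that runs the original logic and records any failure
    CREATE PROCEDURE wrap_script_1 LANGUAGE SQLSCRIPT AS
    BEGIN
        DECLARE EXIT HANDLER FOR SQLEXCEPTION
            INSERT INTO script_log
            VALUES ('SCRIPT_1', ::SQL_ERROR_CODE, ::SQL_ERROR_MESSAGE, CURRENT_TIMESTAMP);
        CALL script_1_proc();  -- hypothetical procedure holding the original Script_1 logic
    END;
    The next step in the flow could then check SCRIPT_LOG before deciding whether to run the second script.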

  • OTL SQL script for timecard data

    Hi,
    I'm looking for a SQL script to create a view to query timecard data, including status, employee name, hours, project, task, type, etc.
    Any help or pointers will be greatly appreciated.
    Joon

    The query will depend on a lot of factors related to the OTL setup and timecard layouts. You can't just have a generic query.
    --Shiv

  • How can I run a SQL script file...

    How can I run a SQL script file from a location on my computer without providing the whole path?
    Is there some way I can set a "Working folder" in SQL Plus??
    Thanks!
    Tom

    You can create an environment variable called "SQLPATH", which is a list of directories that SQL*Plus will search for your .SQL
    scripts.

    I would like to use a directory other than oracle/bin...
    How can I do this??

    Hello,
    You can do it this way:
    Save the odm_script.sql file to the default Oracle
    directory, i.e. ORACLE_HOME/bin, and run the following command
    through SQL*Plus:
    SQL>@Script_Name
    I hope this resolves your problem.
    Regards,
    Omer Saeed Khan.

  • SQL Script VS HANA views

    Hi Gurus,
    We are in the process of building SAP HANA views. As we know, there are multiple options:
    1) Attribute, analytic & calc views
    2) Scripted calculation views and CE functions.
    I have read many articles, and per the SAP documentation the recommendation is to build information views with attribute, analytic and graphical calculation views; only if that does not suffice should you go for a scripted calculation view.
    I just wanted to understand: are there any limitations or issues encountered in projects with scripted calc views? John Appleby's tips suggest avoiding SQLScript unless the requirement cannot be met with graphical views.
    http://scn.sap.com/community/hana-in-memory/blog/2013/12/29/6-golden-rules-for-new-sap-hana-developers
    If we build the views with the graphical method, it seems parallelism can be achieved very well, meaning the query is split into multiple sub-queries and executed in parallel, as seen in Visualize Plan.
    If we write SQLScript, can this parallel processing be achieved or not? And if the requirement can be met without writing script, which method should we choose?
    I have included 10 to 12 attribute views in an analytic view and then a calculation view; it seems there may be a performance issue, which I am going to check. As we know, we can use base tables directly in a calculation view. What is the best method: use the base tables in the calculation view directly, or build the attribute views first, then the analytic view, and after that the calculation view?
    As we know, we use attribute views from a reusability perspective. Is there any other reason to use attribute views instead of joining the base tables in the calc view?
    Regards
    Ram Ramanathaiah

    Thank you, Krishna, for the answers. I have gone through those links, but there is no consensus on how the views should be used; some of the answers conflict with each other. As we are starting the project, we need to decide on the best approach to take. If possible, I would like to understand more about this from you and other experts:
    1) Build the views with attribute, analytic and calc views, or
    2) Build the views only with base tables in calc views, or
    3) Build everything using SQLScript/CE functions.
    Regards
    Ram

  • Write to ERP Tables on HANA DB using SQL script

    Hello All,
    We are using HANA as the primary database for our ABAP system and are trying to feed data into ABAP tables using SQLScript, but we are experiencing authorization errors. Please see below for more details.
    Scenario:
    I get a "not authorized" error when I try to write data to Z* tables using SQLScript in HANA Studio, but I am able to create new tables in the same schema.
    As shown above in Query 1: SAPSR1 is the schema which contains the underlying ABAP tables. ZGSA is an existing table, and I am trying to insert new rows into it.
    Queries 2 & 3: Creating new tables in SAPSR1 works fine.
    Can you please suggest whether this is the right approach, or do I need an RFC to update these tables from some other tool/app?
    Thanks in advance,
    Naresh

    Hi Naresh,
    Obi Wan would now probably say: "this is not the functionality you're looking for".
    Even though you are working with Z-tables, you really don't want to start messing with those from outside the context of the NetWeaver system.
    Instead, you want to leave control over all tables in the NetWeaver schema completely with the SAP<sid> user and NetWeaver.
    For your data loading scenario, just write a simple ABAP report with native SQL, or an AMDP, to do the copying of the data for you.
    Don't spread your code across the landscape, and don't loosen the access restrictions on your schema.
    - Lars

  • HANA Expert SQL Script RSDHA383 error

    Hello Experts.
    My scenario is to load data from source to target, reading attributes in a HANA Expert SQL Script and storing them in the target object
    (e.g. I have 0PLANT in the source, and in the target I store 0PLANT with its attributes: SALESORG, RF_STORETY, DISTR_CHAIN, etc.).
    So when I try to start the load, this error occurs:
    Saving ended with errors
    S:RSDHA:383 /BIC/Z* --some target infoobject name
    Exception CX_RSDHA_MSG logged
    Experimentally, I've found that this error occurs when the source and target have different structures.
    I've modeled two DSOs and tried the following cases:
    The first case is when the DSOs have similar structures and the target fields are filled by the HANA Expert Script. In this case everything is fine and no error occurs.
    The second case is when I add a new field (an attribute of any source characteristic) to the target DSO (and of course adjust my script for this new field). In this case I get an error:
    Saving ended with errors
    S:RSDHA:383 /BIC/Z* --added infoobject name
    Exception CX_RSDHA_MSG logged
    The same error occurs when I try to load data from a DataSource based on OpenDSO.
    Does anybody know why this happens?
    Thanks for any help.
    Best regards,
    Alex.

    Hello.
    I've discovered some interesting things:
    1) If the target has no key figures, you can have different structures for inTab and outTab, so you can move the data as you wish.
    2) If the target key figures are a subset of the source key figures (the same InfoObjects, or the same field names such as "/BIC/ZKFIG"), you can also have different structures for inTab and outTab.
    So there is a workaround based on naming. The only open question is how to add new key figures and calculate them through the Expert HANA Script.
    Best regards,
    Andrew
    Edit:
    If the target attributes are a subset of the source attributes and follow the same order as the source, the transformation works. If they are in a different order, you'll get that error.

  • Please help me resolve the Lync server 2013 deployment error: "An error occurred while applying SQL script for the feature BackendStore."

    I am getting an error in "Step 2 - Setup or Remove Lync Server Components" of the "Install or Update Lync Server System" step:
    "An error occurred while applying SQL script for the feature BackendStore. For details, see the log file...."
    Additionally, all previous steps, such as Prepare Active Directory, Prepare first Standard Edition server, Install Administrative Tools, and Create and publish topology, completed without any errors. The user that I used to set up the Lync server is a member of:
    Administrators
    CSAdministrator
    Domain Admins
    Domain Users
    Enterprise Admins
    Group Policy Creator Owners
    RTCComponentUniversalServices
    RTCHSUniversalServices
    RTCUniversalConfigReplicator
    RTCUniversalServerAdmins
    Schema Admins
    I have tried reinstalling everything and setting up a new server many times, but the same error still occurs. Please see the log below and share any ideas/solutions to tackle this problem.
    ****Creating DbSetupInstance for 'Microsoft.Rtc.Common.Data.BlobStore'****
    Initializing DbSetupBase
    Parsing parameters...
    Found Parameter: SqlServer Value lync.lctbu.com\rtc.
    Found Parameter: SqlFilePath Value C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup.
    Found Parameter: Publisheracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group;RTC Local Administrators;LCTBU\RTCUniversalServerAdmins.
    Found Parameter: Replicatoracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group.
    Found Parameter: Consumeracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group;RTC Local Read-only Administrators;LCTBU\RTCUniversalReadOnlyAdmins.
    Found Parameter: DbPath Value D:\CsData\BackendStore\rtc\DbPath.
    Found Parameter: LogPath Value D:\CsData\BackendStore\rtc\LogPath.
    Found Parameter: Role Value master.
    Trying to connect to Sql Server lync.lctbu.com\rtc. using windows authentication...
    Sql version: Major: 11, Minor: 0, Build 2100.
    Sql version is acceptable.
    Validating parameters...
    DbName rtcxds validated.
    SqlFilePath C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup validated.
    DbFileBase rtcxds validated.
    DbPath D:\CsData\BackendStore\rtc\DbPath validated.
    Effective database Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath.
    LogPath D:\CsData\BackendStore\rtc\LogPath validated.
    Effective Log Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\LogPath.
    Checking state for database rtcxds.
    Checking state for database rtcxds.
    State of database rtcxds is detached.
    Attaching database rtcxds from Data Path \\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath, Log Path \\lync.lctbu.com\D$\CsData\BackendStore\rtc\LogPath.
    The operation failed because of missing file '\\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath\rtcxds.mdf'
    Attaching database failed because one of the files not found. The database will be created.
    State of database rtcxds is DbState_DoesNotExist.
    Creating database rtcxds from scratch. Data File Path = D:\CsData\BackendStore\rtc\DbPath, Log File Path= D:\CsData\BackendStore\rtc\LogPath.
    Clean installing database rtcxds.
    Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
    ****Creating DbSetupInstance for 'Microsoft.Rtc.Common.Data.RtcSharedDatabase'****
    Initializing DbSetupBase
    Parsing parameters...
    Found Parameter: SqlServer Value lync.lctbu.com\rtc.
    Found Parameter: SqlFilePath Value C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup.
    Found Parameter: Serveracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group.
    Found Parameter: DbPath Value D:\CsData\BackendStore\rtc\DbPath.
    Found Parameter: LogPath Value D:\CsData\BackendStore\rtc\LogPath.
    Trying to connect to Sql Server lync.lctbu.com\rtc. using windows authentication...
    Sql version: Major: 11, Minor: 0, Build 2100.
    Sql version is acceptable.
    Validating parameters...
    DbName rtcshared validated.
    SqlFilePath C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup validated.
    DbFileBase rtcshared validated.
    DbPath D:\CsData\BackendStore\rtc\DbPath validated.
    Effective database Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath.
    LogPath D:\CsData\BackendStore\rtc\LogPath validated.
    Effective Log Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\LogPath.
    Checking state for database rtcshared.
    Reading database version for database rtcshared.
    Database version for database rtcshared - Schema Version5, Sproc Version 0, Update Version 1.
    Thanks and Regards,
    Thanh Le

    Thanks, Lạc Phạm 2.
    I had a similar issue. I ended up uninstalling and reinstalling, but got the same error; then I changed the drive, but still the same error. It turned out to be an I/O issue. After tuning the disk I/O, the installation went through without any problem.
    If anyone is using KVM, here are the details: we simply set the option cache='writeback' for the guest disk, following http://www.ducea.com/2011/07/06/howto-improve-io-performance-for-kvm-guests/ and http://itscblog.tamu.edu/improve-disk-io-performance-in-kvm/. This fixed my issue. Thanks.

  • An error occurred while applying SQL script for the feature BackendStore.

    Hello,
    I am running my AD in Windows Azure VMs. I created a new A3 VM (4 cores, 7 GB memory) with Windows Server 2012 R2, added port 1433 for MSSQL, made it a member of the domain, and planned to install the first Lync Server 2013 on it.
    In "Setup or Remove Lync Server Components" of "Install or Update Lync Server System", I got the red-colored text "An error
    occurred while applying SQL script for the feature BackendStore."
    I have not enabled the Monitoring and Archiving servers in Topology Builder. I added "Network Service" and assigned "Full Control" in the security permissions of "C:\CsData" and "C:\LyncShare".
    I executed the SQL Setup Wizard and upgraded the instances to 2012.
    Please guide.
    Thanks, Divyaprakash Koli

    Please check that you have enough disk space on the disk where the folders are located.
    Check the view log for detailed log information.
    The following link is a similar thread you can refer to:
    http://social.technet.microsoft.com/Forums/lync/en-US/a3cb9ab0-7451-4df5-af96-3d2784d1b075/an-error-occurred-while-applying-sql-script-for-the-feature-backendstore-for-details-see-the-log?forum=lyncdeploy
    Lisa Zheng
    TechNet Community Support

  • Shell script for below pl/sql script dbms_file_transfer

    Please let me know how to write a shell script for the PL/SQL script below using dbms_file_transfer.
    I have to transfer files from ASM into the filesystem.
    It is working, but I have to put it in a loop.
    begin
      dbms_file_transfer.copy_file(
        source_directory_object      => 'src',
        source_file_name             => 'ncsn',
        destination_directory_object => 'dest',
        destination_file_name        => 'ncsn');
    end;
    /
    Edited by: user8680248 on 27/10/2009 20:55

    user8680248 wrote:
    Please let me know how to write a shell script for the PL/SQL script below using dbms_file_transfer.
    I have to transfer files from ASM into the filesystem.
    It is working, but I have to put it in a loop.
    begin
    dbms_file_transfer.copy_file(
    source_directory_object => 'src',
    source_file_name => 'ncsn',
    destination_directory_object => 'dest',
    destination_file_name => 'ncsn');
    end;

    What database version?
    What are you trying to do exactly?
    It's working but you have to put it in a loop. Fine, what's the problem you are having?
    begin
      loop
        exit when ... whatever the exit condition is ...
        dbms_file_transfer.copy_file(
          source_directory_object => 'src',
          source_file_name => 'ncsn',
          destination_directory_object => 'dest',
          destination_file_name => 'ncsn');
      end loop;
    end;

  • Generating SQL Script for Existing Tables and DBs

    Hello,
    Is it possible to automatically generate a SQL script from an existing table or Oracle database?
    I want to export an existing table from an Oracle DB (11g), if possible with the data.
    Perhaps somebody could explain to me how to do this.
    I am using "SQL Developer 2.1" and the "Enterprise Manager console".
    I'm a rookie with these tools.
    Thank you for any information.
    N. Wylutzki

    If you want to export data, you should use the export utility. This is documented:
    http://tinyurl.com/23b7on
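    For the table structure alone, DBMS_METADATA can also generate the DDL directly; a minimal sketch, assuming a table MY_TABLE in schema MY_SCHEMA (the names are placeholders, and the data itself would still come from the export utility):
    -- generate the CREATE TABLE statement for an existing table
    SET LONG 100000
    SELECT DBMS_METADATA.GET_DDL('TABLE', 'MY_TABLE', 'MY_SCHEMA') FROM dual;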

  • Looking for an SQL query to retreive callvariables + ECC from a RUN SCRIPT RESULT (Translation to VRU)

    Hi Team,
    I am looking for an SQL query to check the data (ECC + call variables) received following a RUN SCRIPT RESULT when requesting an external VRU via a Translation Route to VRU with a "Run External Script".
    I believe the data is split between the Termination Call Detail and Termination Call Variable tables.
    If you already have such an SQL query I would very much appreciate to have it.
    Thank you and Regards
    Nick

    Omar,
    with all due respect, shortening a one-day interval might not be an option for a historical report ;-)
    I would recommend taking a look at the following SQL query:
    DECLARE @dateFrom DATETIME, @dateTo DATETIME
    SET @dateFrom = '2014-01-24 00:00:00'
    SET @dateTo   = '2014-01-25 00:00:00'
    SELECT
    tcv.DateTime,
    tcd.RecoveryKey,
    tcd.RouterCallKeyDay,
    tcd.RouterCallKey,
    ecv.EnterpriseName AS [ECVEnterpriseName],
    tcv.ArrayIndex,
    tcv.ECCValue
    FROM Termination_Call_Variable tcv
    JOIN
    (SELECT RouterCallKeyDay,RouterCallKey,RecoveryKey FROM Termination_Call_Detail WHERE DateTime > @dateFrom AND DateTime < @dateTo) tcd
    ON tcv.TCDRecoveryKey = tcd.RecoveryKey
    LEFT OUTER JOIN Expanded_Call_Variable ecv ON tcv.ExpandedCallVariableID = ecv.ExpandedCallVariableID
    WHERE tcv.DateTime > @dateFrom AND tcv.DateTime < @dateTo
    With variables, you can parameterize your code (for instance, you could write SET @dateFrom = ? and let the calling application fill in the datetime value for you).
    Also, joining two large tables (TCD-TCV) across all rows, as you did, is never a good option.
    Another aspect to consider: all ECC variables are actually arrays (always), so it's not good to leave out the index value (tcv.ArrayIndex).
    G.
