SAP HANA Load Controller

Dear HANA Guru,
Help needed: can you provide some pointers on the Load Controller?
Regards,
Manoj.

Hello,
like what it is good for?
It is part of the Sybase replication scenario and it controls the replication. You can read more in the SAP Master Guide:
page 21 - chapter 2.4 Log-Based Replication
https://service.sap.com/~sapidb/011000358700000604552011
The SAP HANA Load Controller, a component that resides in SAP HANA, coordinates the entire replication process: it starts the initial load of source system data into the SAP HANA database, and communicates with the Sybase Replication Server to coordinate the start of the delta replication.
Tomas

Similar Messages

  • Benefits/limitations when we use SAP HANA Extended and Standard products

    Hi All,
    We are going to purchase SAP HANA media. Please let me know the benefits/limitations when we use the SAP HANA Extended and Standard products.
    Thanks in Advance,
    Vishall

    Hello,
    please check the Master Guide (page 9, chapter 1.6 Software Components):
    https://service.sap.com/~sapidb/011000358700000604552011
    There you can find an edition comparison and a detailed description.
    To extract the short summary:
    Platform edition = SAP HANA Database + Studio + Client + Host Agent + Information Composer
    Enterprise edition = (Platform edition) + LT Replication Add-on (SLT) + LT Replication Server (SLT) + SAP BO Data Services
    Enterprise extended edition = (Enterprise edition) + Sybase Adaptive Server Enterprise + Sybase Replication Server + Sybase Replication Server Agent + SAP HANA load controller
    It depends on what licenses you already have and how you intend to use HANA.
    Platform edition = ETL via BO Data Services (you already have license for BO Data Services)
    Enterprise edition = SLT and/or BO Data Services (including licenses)
    Enterprise extended edition = all replication technologies
    Tomas

  • Open Hub (SAP BW) to SAP HANA data loading through a DB Connection: "Delete data from table" option is not working. Please help, anyone from this forum.

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB Connection (named HANA).
    Whenever I create an Open Hub destination of type DB Table using this DB Connection, the table is created at the HANA schema level ( L_F50800_D ).
    I executed the Open Hub service without checking the "Delete data from table" option.
    16 records were loaded from BW to HANA, matching the source.
    On the second execution from BW to HANA, 32 records arrived (the data is appended).
    Then I executed the Open Hub service with the "Delete data from table" option checked.
    Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    Checking from one SAP BW system to another SAP BW system, it works fine.
    Is this option supported through a DB Connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub level (definition level: DESTINATION tab and FIELD DEFINITION).
    The check box there is already selected; that is exactly my issue: even though it is selected,
    the deletion is not performed at the target level.
    SAP BW to SAP HANA via DB Connection:
    1. First execution from BW: suppose 16 records; the DTP is executed and 16 records are loaded into HANA.
    2. Second execution from BW: the HANA side appends, so 16+16 = 32 records.
    3. So I selected the "Delete data from table" check box at the Open Hub level.
    4. Executing the DTP now throws the short dump DBIF_RSQL_TABLE_KNOWN.
    Now please tell me how to resolve this. Is the "Delete data from table" option applicable to HANA at all?
    Thanks
    Santhosh Kumar

  • CPU Work load optimization in SAP HANA

    Hi All,
    In virtual SAP HANA there is an option to assign/optimize vCPUs to correspond to physical CPUs, as mentioned in the document below:
    http://www.vmware.com/files/pdf/SAP_HANA_on_vmware_vSphere_best_practices_guide.pdf
    But are the features mentioned below available in physical SAP HANA systems?
    Can we assign dedicated CPU cores manually to a particular user or users?
    Or is there a way to reserve certain CPU cores for particular application/schema threads/sessions?
    Thanks for your help!!
    -Gayathri

    Nope, there is no such CPU-core-based allocation option available.
    You can limit how many worker threads will be created, in case you need to balance the CPU usage in a multi-instance setup.
    However, you don't have any control over the actual amount of CPU resource that any SAP HANA instance, let alone a DB user or query, gets.
    Comparing this with what you can do in vSphere is not suitable, as we are looking at a different level of abstraction here.
    To SAP HANA the machine that vSphere emulates will have x cores and SAP HANA will use these x cores - all of them.
    It's important not to forget that you have an additional layer of indirection here and things like CPU core binding can easily have negative side effects.
    For SAP HANA users it would be more interesting to have workload management within SAP HANA that would allow managing different requirements on responsiveness and resource usage (it's not just CPU...). And that is what SAP HANA development is working on.
    Maybe we're lucky and see some features in this direction later this year.

  • SAP HANA Lifecycle Manager Couldn't deploy using credentials

    Hello,
    After the HLM 1.0.7.7 upgrade from 1.0.6 on an IBM appliance, we could not apply support packages in normal mode. When we launch the upgrade process from HANA Studio or the command line, we receive the following message:
    Error Message: Execution of 'Deploys SAP Host Agent configurations using sidadm credentials' failed.
    Detailed Message: null or empty argument
    Help: http://help.sap.com/hana
    We tried to downgrade and reinstall HLM again, with no luck. Users, passwords and certificates were also renewed. We could read this log from HLM:
    INFO 2014-05-08 16:05:15 com.sap.lm.hlm.common.wizard.process.WizardProcessExecutor
    Submit for subtask with ID [6cd61e09-eb68-4448-86c0-b4319abbdc3f_GetUpdateTypeTask]
    INFO 2014-05-08 16:05:15 com.sap.lm.hlm.common.wizard.process.WizardProcessExecutor
    Starting execution of subtask with ID [6cd61e09-eb68-4448-86c0-b4319abbdc3f_DeployConfigWithSidAdmTask]
    INFO 2014-05-08 16:05:15 com.sap.lm.hlm.common.tasks.process.AbstractProcessExecutor
    Starting process [Deploys SAP Host Agent configurations using sidadm credentials, com.sap.lm.hana.hlm.update.slpp.task.deployconfig.DeployConfigWithSidAdmTask] ({})...
    INFO 2014-05-08 16:05:15 com.sap.lm.util.status.StatusTracker
    Status tracker initialized using storage file /usr/sap/hlm_bootstraps/HNB/HLM/param/configdeploystatus.properties.
    INFO 2014-05-08 16:05:15 com.sap.lm.hana.hlm.update.slpp.context.ProgressObserverAdapter
    Progress [ 0%]
    INFO 2014-05-08 16:05:15 com.sap.lm.hana.hlm.update.slpp.task.deployconfig.AbstractDeployConfigProcessExecutor
    Deploying SAP Host Agent configurations on host  with sidadm credentials.
    INFO 2014-05-08 16:05:15 com.sap.lm.hana.hlm.update.slpp.context.ProgressObserverAdapter
    Process message: Deploying SAP Host Agent configurations on host
    INFO 2014-05-08 16:05:16 com.sap.lm.services.hostcontrol.SAPHostControlService
    Deployed Host Agent configurations [/usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-hdb-plugin.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-studio-repository.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-client-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-sedm-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-repository-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-java-home-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-client-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-load-controller-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-sda-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-client-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-hdbstudio-deliveryunit.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-studio.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/newdb-studio-repository-postupdate-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-client.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-sda-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-server.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-java-executable.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/newdb-studio-repository-postupdate.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-hdb-plugin-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-studio-repository-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-load-controller.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-client-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-client-v3.conf, 
/usr/sap/hlm_bootstraps/HNB/HLM/operations.d/change-newdb-client-installation-path-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-studio-repository-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/updatehostagent-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-java-executable-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-repository.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/updatehostagent.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-server-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-client.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-load-controller-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/exists-path.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/gen-hdbstudio-deliveryunit.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/change-newdb-studio-repository-installation-path-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-server-nzdm.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-hdb-plugin.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-server.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-installation-number.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-load-controller.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-hdb-plugin-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-installation-number-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-hdb-plugin-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-server-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/updatehostagent-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio-repository.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio-repository-v2.conf, 
/usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-hostagent-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-hostagent.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-hdbeuspack.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-client.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-server-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-client-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-studio-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/chmod-file.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-client.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-java-home.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-sedm.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-repository-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/exists-path-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-client-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio-v2.conf] with signature /usr/sap/hlm_bootstraps/HNB/HLM/SIGNATURE.SMF on host .
    ERROR 2014-05-08 16:05:16 com.sap.lm.hana.hlm.update.slpp.task.AbstractUpdateProcessExecutor
    Process executor failed
    java.lang.IllegalArgumentException: null or empty argument
      at com.sap.lm.hlm.common.runtime.HlmRuntime.assertNotNullOrEmptyArgumens(HlmRuntime.java:342)
      at com.sap.lm.hlm.common.runtime.HlmRuntime.deployHlmExecutableScripts(HlmRuntime.java:74)
      at com.sap.lm.hlm.common.runtime.HlmRuntime.deployHlmExecutableScripts(HlmRuntime.java:69)
      at com.sap.lm.hana.hlm.update.slpp.task.deployconfig.AbstractDeployConfigProcessExecutor.deployconfigurations(AbstractDeployConfigProcessExecutor.java:114)
      at com.sap.lm.hana.hlm.update.slpp.task.deployconfig.DeployConfigWithSidAdmProcessExecutor.executeInternal(DeployConfigWithSidAdmProcessExecutor.java:42)
      at com.sap.lm.hlm.common.tasks.process.AbstractProcessExecutor.startInterruptibleExecution(AbstractProcessExecutor.java:128)
      at com.sap.lm.hlm.common.tasks.process.AbstractProcessExecutor$1.run(AbstractProcessExecutor.java:108)
    ERROR 2014-05-08 16:05:16 com.sap.lm.hlm.common.wizard.process.WizardProcessExecutor
    Execution of subtask with ID [6cd61e09-eb68-4448-86c0-b4319abbdc3f_DeployConfigWithSidAdmTask, Execution of 'Deploys SAP Host Agent configurations using sidadm credentials' failed.] has failed: {}
    Could you please help us with this issue?
    Kind Regards,

    Hi Alston,
    I'm sorry for the late reply. Are you still experiencing this error?
    I've seen this issue before with another customer, and the final resolution was to make sure the /etc/hosts file does not list the IPv6 entries BEFORE the IPv4 entries. Here is an example:
    >> cat /etc/hosts
    # special IPv6 addresses
    ::1             localhost ipv6-localhost ipv6-loopback
    ff02::3         ipv6-allhosts
    192.168.180.31  abceccd01.sap.com abceccd01
    192.168.180.34  abceccs01.sap.com abceccs01
    192.168.180.35  abccrms01.sap.com abccrms01
    192.168.180.30  abceccs01.sap.com abceccs01
    10.36.10.5      abceccs01-nfs.sap.com  abceccs01-nfs
    The resolution would be to change the ordering so that the IPv6 section appears below the IPv4 section.
    After that is done, perform the following to "reset" HLM:
    1. Stop all browsers that have HLM pages open.
    2. Stop the Studio.
    3. Execute /usr/sap/hlm_bootstraps/<SID>/HLM/stop-hlm.sh -f
    4. Check that there are no leftover HLM processes.
    5. Go to /usr/sap/hlm_bootstraps/<SID>/HLM/param/ and rename all files except logging.properties.
    6. Go to the directory /usr/sap/hlm_bootstraps/<SID>/HLM and rename the persistence directory.
    7. Restart the SAP Host Agent, as root: /usr/sap/hostctrl/exe/saphostexec -restart
    8. Open the HLM UI via browser:
    https://<hostname>:1129/lmsl/HLM/<SID>/ui/
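    The renaming steps above can be wrapped in a small script. This is only a sketch of the procedure described here, not an official tool: the base path is the default from this thread, the ".bak" rename suffix and the function name are my own choices, and you should double-check everything against your landscape before running it as root.

```shell
#!/bin/sh
# Sketch of the HLM "reset" steps above (steps 3, 5, 6 and 7).
# Assumptions: default /usr/sap/hlm_bootstraps layout, ".bak" suffix.
reset_hlm() {
    SID="$1"
    BASE="${HLM_BASE:-/usr/sap/hlm_bootstraps}/$SID/HLM"

    # Step 3: stop HLM (forced), if the stop script is present.
    [ -x "$BASE/stop-hlm.sh" ] && "$BASE/stop-hlm.sh" -f

    # Step 5: rename every file in param/ except logging.properties.
    for f in "$BASE"/param/*; do
        [ -e "$f" ] || continue
        [ "$(basename "$f")" = "logging.properties" ] && continue
        mv "$f" "$f.bak"
    done

    # Step 6: rename the persistence directory.
    [ -d "$BASE/persistence" ] && mv "$BASE/persistence" "$BASE/persistence.bak"

    # Step 7: restart the SAP Host Agent (must run as root).
    if [ -x /usr/sap/hostctrl/exe/saphostexec ]; then
        /usr/sap/hostctrl/exe/saphostexec -restart
    fi
}
# Example: reset_hlm HNB, then reopen the HLM UI in the browser.
```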
    I hope this helps.
    Best Regards,
    Jimmy

  • SAP HANA XS and SAPUI5 - how to live update a chart?

    Hi all,
    we are developing with the SAP HANA Developer Edition and SAPUI5.
    We read data out of the database with an XSJS file (I read the most recent 40 datasets of the table with an SQL statement and LIMIT).
    This is the code in my controller.js which calls the XSJS service that reads the data out of the HANA DB:
    loadChart: function() {
            var model = new sap.ui.model.json.JSONModel();
            var aUrl = '../../../dev/data_HANAXS/services/readdata.xsjs?cmd=loadChart';
            jQuery.ajax({
                url: aUrl,
                method: 'GET',
                dataType: 'json',
                success: function(data) {
                    model.setData({modelData : data});
                }
            });
            return model;
    },
    Then in my view.js I set the model for the line chart, and I can see my data:
    var model = oController.loadChart();
    linechart.setModel(model);   
    What I want to do now is live-update the chart at regular time intervals. I am using setInterval (you could also use jQuery.sap.delayedCall) for that:
    function updateChart() {
        var model = oController.loadChart();
        linechart.setModel(model);
    }
    setInterval(updateChart, 5000); // refresh the chart every 5 seconds
    And this works fine, BUT the COMPLETE CHART (including the x-axis and y-axis) is redrawn every 5 seconds. What I want to know is how I can arrange it so that ONLY the DATA LINE is redrawn every 5 seconds.
    It would be great if anyone could help me with this issue.
    It must be possible to do a live update of a chart, especially for cases where you steadily get new data that has to be visualized.
    Regards & Thanks,
    Andreas
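    One pattern worth sketching for this: create the model once, attach it to the chart once, and on every tick only update that model's data, so the bindings refresh the bound data rather than the whole chart being rebound to a brand-new model. This is a sketch, not tested against a specific chart library; `initLiveChart` is a hypothetical helper name, and the model/ajax/timer dependencies are passed in only to make the wiring explicit (in the app they would be sap.ui.model.json.JSONModel, jQuery.ajax and setInterval, with `linechart` being the chart control from the view).

```javascript
// Sketch: keep ONE model attached to the chart and only refresh its data.
function initLiveChart(linechart, JSONModel, ajax, scheduleFn) {
    var oChartModel = new JSONModel();
    linechart.setModel(oChartModel);    // bind the chart once, at startup

    function refreshChartData() {
        ajax({
            url: '../../../dev/data_HANAXS/services/readdata.xsjs?cmd=loadChart',
            method: 'GET',
            dataType: 'json',
            success: function (data) {
                // Updating the existing model notifies its bindings, so
                // only the bound data is re-rendered; the chart control
                // and its model stay in place.
                oChartModel.setData({ modelData: data });
            }
        });
    }

    refreshChartData();                  // initial load
    scheduleFn(refreshChartData, 5000);  // then refresh every 5 seconds
    return oChartModel;
}
```

    In the view this would be called as initLiveChart(linechart, sap.ui.model.json.JSONModel, jQuery.ajax, setInterval). Whether only the data line is repainted still depends on the chart control's own invalidation behaviour.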

    Hi Micha,
    yes, I tried that some days ago, but it did not work. After your message I tried it again, with the same result: what I get after some seconds is "No Data". Moreover, I am not quite sure how to set the data with model.setData().
    What I do is the following; maybe you can spot a mistake. This is the code for updating the chart in the controller.js:
    update: function(oController) {
            var model = new sap.ui.model.json.JSONModel();
            var aUrl = '../../../xxxx.xsjs?cmd=loadChart';
            jQuery.ajax({
                url: aUrl,
                method: 'GET',
                dataType: 'json',
                success: function(data) {
                    model.setData({modelData : data});
                    oController.onUpdateCall(model, oController);
                }
            });
    },
    onUpdateCall: function (updateJSONModel, oController) {
            oController.updateModel = updateJSONModel;
            var linechart = sap.ui.getCore().byId("linechart");
            var oModel = linechart.getModel();
            var oUpdateData = oController.updateModel.getData();
            oModel.setData(oUpdateData);
    },
    I use the same cmd parameter and call the same XS service method "loadChart" (which always returns the newest datasets; data is continuously pushed into our HANA DB) that initially loads the data for the chart.
    I get no error messages in my browser.
    When I call oController.update(oController) in my view.js, after 5 seconds I get "No Data", but analysing the data in the browser shows me that the XS service returns the data, so the data is definitely there.
    Regards & Thanks,
    Andreas

  • SAP HANA modelling Standalone

    Hello Experts,
    We are in the process of a HANA standalone implementation, with Design Studio as the reporting tool. While modeling, I could not figure out the answers to some of the questions below. Experts, please help.
    Best way of modeling: SAP HANA Live is built completely on calculation views; there are no attribute or analytic views. I have got different answers as to why there are only calculation views and no analytic or attribute views. We are on SP7, the latest version. This is a brand new HANA on top of a non-SAP source (DB2). What is the best way to model this scenario? Meaning, can we model everything in calculation views like SAP HANA Live, or do you suggest using the standard attribute, analytic and calculation views for the data model? Is SAP moving away from AV & AT to only calculation views to simplify the modeling approach?
    Reporting: We are using Design Studio as the front-end tool. Just for example, if we assume that we are using BW, we bring all the data into BW from different sources, build the cubes and use a BEx query. In the BEx query we would use restricted key figures, calculated key figures, calculations etc. On the reporting side we have the same requirements: calculations, RKF, CKF, sum, avg etc. If we are using Design Studio on top of standalone HANA, where do I need to implement all these calculations? Is it in different views? (From a reporting perspective, in a BW system I would have done all these calculations in BEx.)
    Universe: If we are doing all the calculations in SAP HANA, like RKF, CKF and other calculations, what is the point of having the additional layer of a universe, given that the reporting components can access the views directly? In one of our POCs we found that using a universe affects performance.
    Real-time reporting: Our overall objective is to meet real-time, or close to real-time, reporting requirements. How can Data Services help here? I can schedule the data loads every 3 or 5 minutes to pull the data from the source. If I am using Data Services, how soon can I get the data into HANA? I know it depends on the number of records, the transformations between the systems and the network speed. Assuming that I schedule the job every 2 minutes and it takes another 5 minutes to process the Data Services job, is it fair to say that my information will be available in the BOBJ tools within 10 minutes of the creation of the records?
    Are there any new ETL capabilities included in SP7? I see some additional features in SP7. Are the concepts discussed above still valid, given that SP7 introduces the star join concept?
    Thanks
    Magge

    magge kris wrote:
    Hello Experts,
    We are in the process of a HANA standalone implementation, with Design Studio as the reporting tool. While modeling, I could not figure out the answers to some of the questions below. Experts, please help.
    Best way of modeling: SAP HANA Live is built completely on calculation views; there are no attribute or analytic views. I have got different answers as to why there are only calculation views and no analytic or attribute views. We are on SP7, the latest version. This is a brand new HANA on top of a non-SAP source (DB2). What is the best way to model this scenario? Meaning, can we model everything in calculation views like SAP HANA Live, or do you suggest using the standard attribute, analytic and calculation views for the data model? Is SAP moving away from AV & AT to only calculation views to simplify the modeling approach?
    >> I haven't read any "official" guidance to move away from the typical modeling approach, so I'd say stick with the usual approach: AT, then AV, then CA views. I was told that the reason for the different approach with HANA Live was to simplify development for mass production of solutions.
    Reporting: We are using Design Studio as the front-end tool. Just for example, if we assume that we are using BW, we bring all the data into BW from different sources, build the cubes and use a BEx query. In the BEx query we would use restricted key figures, calculated key figures, calculations etc. On the reporting side we have the same requirements: calculations, RKF, CKF, sum, avg etc. If we are using Design Studio on top of standalone HANA, where do I need to implement all these calculations? Is it in different views? (From a reporting perspective, in a BW system I would have done all these calculations in BEx.)
    >> I'm not a BW guy, but from a HANA perspective: implement them where they make the most sense. In some cases this is obvious; restricted columns, for example, are only available in analytic views. It's hard to provide more specific advice here, as it depends on your scenario(s). Review your training materials and SCN posts, and you should start to develop a better idea of where to model particular requirements. Most of the time in typical BI scenarios, requirements map nicely to straightforward modeling approaches such as attribute/analytic/calculation views. However, some situations, such as slowly changing dimensions or certain kinds of calculations (e.g. calculation before aggregation with BODS as the source, where the calculation should be done in the ETL logic), can be more complex. If you have specific scenarios that you're unsure about, post them here on SCN.
    Universe: If we are doing all the calculations in SAP HANA - RKFs, CKFs, and other calculations - what is the point of having the additional layer of a universe, since the reporting components can access the views directly? In one of our POCs we found that using a universe affects performance.
    >>> Depends on what you're doing. Universe generates SQL just like front-end tools, so bad performance implies bad modeling. Generally speaking - universes *can* create more autonomous reporting architecture. But if your scenario doesn't require it - then by all means, avoid the additional layer if there's no added value.
    Real-time reporting: Our overall objective is to meet real-time, or close to real-time, reporting requirements. How can Data Services help? For example, I can schedule the data loads every 3 or 5 minutes to pull the data from the source. If I am using Data Services, how soon can I get the data into HANA? I know it depends on the number of records, the transformations between the systems, and network speed. Assuming I schedule the job every 2 minutes and it takes another 5 minutes to process the Data Services job, is it fair to say that my information will be available in the BOBJ tools within 10 minutes of the creation of the records?
    Are there any new ETL capabilities in SP7? I see some additional features included in SP7. Are the concepts discussed here still valid, given that SP7 introduces the star join concept?
    >>> Not exactly sure what your question here is. Your limits on BODS are the same as with any other target system - doesn't depend on HANA. The second the record(s) are committed to HANA, they are available. They may be in delta storage, but they're available. You just need to work out how often to schedule BODS - and if your jobs are taking 5 minutes to run, but you're scheduling executions every 2 minutes, you're going to run into problems...
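The scheduling caveat in the answer above can be checked with simple arithmetic. This is a sketch using the numbers from the question; the helpers below are plain Python, not a Data Services API:

```python
# Back-of-the-envelope check for BODS-style scheduled replication.

def schedule_is_safe(interval_min, runtime_min):
    """A job scheduled every `interval_min` minutes must finish within
    that interval, otherwise executions start piling up."""
    return runtime_min <= interval_min

def worst_case_latency(interval_min, runtime_min):
    """Worst case: a record is committed just after a run has read its
    source, so it waits one full interval plus one full run."""
    return interval_min + runtime_min

print(schedule_is_safe(2, 5))    # False: 5-minute runs scheduled every 2 minutes pile up
print(worst_case_latency(5, 5))  # 10: with a 5-minute schedule, data is at most ~10 minutes old
```

So with a 5-minute job runtime, scheduling every 2 minutes cannot work; an interval of at least the runtime gives a worst-case freshness of roughly interval + runtime.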
    Thanks
    Magge

  • SAP HANA D3 Library errors - "queue" is not a function

    Hello all,
    I have a question regarding D3 integration into SAP HANA and hope someone can help me.
    Over the last few days I developed a D3 choropleth (spatial data) with hover effects, tooltips, and a legend. I have an existing SAPUI5 and HANA XS application running, and now I want to integrate my D3 choropleth.
    1) First of all, I added the following tags to the head section of the index.html file of my SAPUI5 project, since I need these libraries.
    <script type="text/javascript" src="http://d3js.org/d3.v3.min.js"></script>
    <script type="text/javascript" src="http://d3js.org/queue.v1.min.js"></script>
    <script type="text/javascript" src="http://d3js.org/topojson.v1.min.js"></script>
    2) Then I added the program code of my D3 choropleth to the view.js of my SAPUI5 project:
    var html2 = new sap.ui.core.HTML("d3choropleth", {
        content: "<div class='D3Choropleth'>" + "</div>",
        preferDOM: false,
        afterRendering: function() {
            .... here is my code ...
        }
    });
    I have a SAPUI5 shell with different NavigationItems (tabs), and for the D3 choropleth tab I am writing:
    case "WI_choropleth":
    oShell.setContent(html2);
    break;
    When I start my SAPUI5 project and click on the tab where the D3 choropleth should be rendered, I get the following errors:
    d3.scale.threshold is not a function
    d3.geo.albers center is not a function
    queue is not a function
    Moreover, my D3 choropleth works fine standalone outside HANA, which is why I assume this is a library-integration issue.
    In my browser's (Firefox) developer console I can see that a D3 library is loaded by default (it is one of SAPUI5's components) from the path sap/ui5/1/resources/sap/ui/thirdparty/D3.js.
    BUT this is a really old version of D3 (2.9); the current release is D3 3.4. So maybe the problem is that the D3 library loaded by default overwrites my integration of D3 (the script tags above)?
    Does anyone have the same issue and know how to solve it? It also seems the d3 queue library is not integrated, since the error "queue is not a function" occurs; how can I solve that error as well?
    We have SAP HANA Developer Edition Revision 80 (by AWS), HANA Studio and Client are on revision 73 (64bit).
    It would be great if anyone could help me with my issue.
    Further question: Is the current release of D3 going to be integrated into the next HANA revision?
    Thanks a lot & regards,
    Andreas

    Hi Andreas,
    You are right: the way you integrate the libraries in your code is not the way that works for HANA XS projects. That is why the queue function is not accepted (the library is not loaded), and also why the D3 3.x-only functions are not accepted (only the internal D3 library is loaded, and those functions were not yet part of 2.9).
    To integrate third-party libraries, you need to add XSJSLIB files to your project. These need to pass the server-side JSLint checks before they are accepted by the XS engine. (Client-side checks that fail might not count, though.)
    Please see this post from David Brookler for more information.
    http://scn.sap.com/community/developer-center/hana/blog/2013/12/23/db001-using-libraries-in-xs
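Independent of the loading mechanism, a quick feature check helps diagnose which D3 copy actually ended up in scope. This is an illustrative sketch; the two objects below are stand-ins for the libraries, not the real thing:

```javascript
// d3.scale.threshold only exists from D3 3.x on, so its absence means the
// SAPUI5-bundled 2.9 copy has shadowed the CDN version (or vice versa).
function isModernD3(d3) {
  return Boolean(d3 && d3.scale && typeof d3.scale.threshold === "function");
}

// stand-in for the bundled D3 2.9 (no scale.threshold)
var bundled29 = { scale: { linear: function () {} } };
// stand-in for the CDN D3 3.4
var cdn34 = { scale: { linear: function () {}, threshold: function () {} } };

console.log(isModernD3(bundled29)); // false -> old bundled copy is active
console.log(isModernD3(cdn34));     // true  -> 3.x copy is active
```

In the browser you would call such a check against `window.d3` right before rendering, which tells you immediately whether the old bundled library has won the load order.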
    Best regards,
    Tobias

  • SAP HANA Query

    Hello, I have a query about SAP HANA. I want to bring the BSEG table from ECC into HANA, but I do not want live updates (SLT) and I do not want to create all the fields that exist in BSEG in ECC one by one. Which features of native HANA would I use to do that?
    Thanks.

    Hi Venkat,
    You can go with BODS or SLT.
    In BODS, you can select the required fields from the BSEG table in ECC and map them to a HANA table.
    You can also go with SLT and do a one-time load using the LOAD option. Here, too, you can select the required fields rather than all the fields of the BSEG table.
    Regards,
    Chandu.

  • How to - Extract data from Cloud For Customer into SAP HANA system

    Hello Community,
    I have a requirement to extract the existing data from Cloud for Customer into a separate SAP HANA box.
    Is it possible to achieve this? If yes, please guide me.
    Awaiting your quick response.
    Regards
    Kumar

    Hi Kumar,
    In addition to what Thierry mentioned, you could also use the C4C integration via the standard Operational Data Provisioning (ODP) interfaces. This integration was actually built for SAP BW and allows you to access any C4C data source. From my perspective, you can also build upon it for a native SAP HANA integration. Please also have a look at this guide: How To... load SAP Business Suite data into SAP... | SAP HANA.
    Besides that question, let me also add the following: SAP Cloud for Customer has been running on SAP HANA since Nov. 2013. You may also use the powerful built-in analytics within C4C for analyzing data and for any of your reporting demands. If your report should consider external data as well, you can combine the existing C4C data source with an external, so-called Cloud Data Source. More information is published in the C4C Analytics Guide: http://help.sap.com/saphelp_sapcloudforcustomer/en/PDF/EN-3.pdf.
    I hope this helps...
    Best regards,
    Sven

  • Creating Logic and modeling for SAP BW datasource in SAP HANA views and SLT

    Hi to all,
    I have a small question.
    We have a BW system with SAP ECC as the source system, and to get data from SAP ECC we use the standard SAP ECC data sources.
    Now we need to create the modeling in SAP HANA using HANA views, getting the data from the SAP ECC source tables via SLT, so that we can replicate the same modeling we have built in the SAP BW system.
    My question is this: the standard SAP ECC data sources hit multiple tables and perform runtime calculations before sending the data to SAP BI.
    How can we derive such calculations and logic in HANA Studio, given that data transformation in SLT is possible only in a limited way, and that we cannot implement such logic in views via SQLScript?
    Is there any way to do such modeling and logic in SAP HANA?
    Or is there a standard document for this type of case that we can use?
    Note: We don't want to use SAP DS or DXC for data loading to HANA, as we want it closer to real time.
    Regards
    Pavneet Rana

    Thanks for the reply.
    Since SLT is ABAP-based, we can write complex logic in it.
    But the standard SAP ECC data sources contain complex logic based on multiple tables, so we would have to write code from scratch in SLT to derive the same result. That would be hugely time-consuming, would require good ABAP skills, and could introduce bugs.
    The complex logic would also reduce SLT's real-time performance.
    The second option is a procedure via SQLScript, which is again a huge effort in terms of logic and can also lead to errors or bugs.
    Do we have any other way or architecture to do this in a simple way, with high performance and fewer errors?
    Regards
    Pavneet Rana

  • SAP HANA Limitations

    Dear All,
    Please let us know the maximum HANA DB size and the most recent scale-out/scalability recommendations from SAP.
    Please also point us to the supporting documentation or SAP Note.
    Thanks
    Arun R

    Hi Arun,
    To see how much memory has been used by different objects in the HANA database, the easiest way is to use TOTAL_SIZE_IN_MEMORY (along with main, delta, etc.) from M_CS_TABLES. This gives you an idea of how much memory is used by objects which are partially or fully loaded into memory. Note that it returns 0 for objects which exist in the system but are not yet loaded into memory.
    Please see this note:
    SAP HANA Database Size - In-Memory Business Data Management - SCN Wiki
    Please see this forum thread: Where can I see the HANA database size? | SCN
    For scalability:
    Scalability - SAP HANA | SAP HANA

  • Unable to run SAP HANA Cloud Connector on windows 8.1

    Hi Experts,
    I am trying to run the SAP HANA Cloud Connector on a Windows 8.1 machine. I can see the go.bat command response as below:
    But when I open the URL https://localhost:8443 in a browser, I get no response, just a blank screen.
    By looking at the logs:
    #ERROR#com.sap.scc.jni#Start Level Event Dispatcher#  #Cannot load the Cloud Connector native library. (exception java.lang.UnsatisfiedLinkError: no sapscc20jni in java.library.path)
    As per this reference, https://help.hana.ondemand.com/help/frameset.htm?204aaad4270245f3baa0c57c8ab1dd60.html ("Installation on Microsoft Windows OS"), nothing is said about support for HCC on Windows 8.1; I am wondering if the above error is because of that.
    For more details, PFA log files.
    Regards,
    JK

    As a workaround, I tried copying the sapscc20jni.dll file from the auditor folder in sapcc-2.4.0-windows-x64 into Windows\System32, and it worked like a charm.
    Thanks Virinchy P
    Regards,
    JK

  • Memory Management of SAP HANA

    Hi All,
    I went through one of the documents on SAP HANA memory management:
    http://www.saphana.com/servlet/JiveServlet/download/2299-2-12806/HANA_Memory_Usage_v2.pdf
    This gave me a really good understanding of HANA memory management: queries for used and resident memory, and a comparison with the numbers on the Overview tab.
    I had a few questions, most of which were answered in other discussions, but I still have a few about resident and used memory:
    Used memory: code + tables + DB management.
    Resident memory: what is the formula or content?
    What does this picture refer to?
    In fact, the statements below are a bit confusing:
    When memory is required for table growth or for temporary computations, the SAP HANA code obtains it from the existing memory pool. When the pool cannot satisfy the request, the HANA memory manager will request and reserve more memory from the operating system. At this point, the virtual memory size of the HANA processes grows.
    Once a temporary computation completes or a table is dropped, the freed memory is returned to the memory manager, who recycles it to its pool, usually without informing Linux6. Thus, from SAP HANA’s perspective, the amount of Used Memory shrinks, but the process’ virtual and resident sizes are not affected. This creates a situation where the Used Memory may even shrink to below the size of SAP HANA’s resident memory, which is perfectly normal.
    My doubt here is how, at any given point in time, used memory can go below resident memory: resident memory is always backed by what is in used memory, so when used memory itself is smaller, what extra does resident memory contain?
    Also, how do HANA used memory, database resident memory, and virtual memory relate to one another?
    In case of a memory issue, what should we check: the used memory of HANA or the resident memory of HANA?
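The pool behavior quoted above can be sketched as a toy model. This is illustrative Python only, not actual HANA code, but it shows mechanically how used memory can drop below resident memory:

```python
# Toy model of a memory pool: "used" shrinks when allocations are freed
# back to the pool, but the pool keeps the pages, so "resident" (the OS
# view) stays high.

class MemoryPool:
    def __init__(self):
        self.resident = 0   # memory reserved from the OS
        self.used = 0       # memory handed out to tables/computations

    def allocate(self, size):
        free_in_pool = self.resident - self.used
        if size > free_in_pool:
            # pool cannot satisfy the request: reserve more from the OS
            self.resident += size - free_in_pool
        self.used += size

    def release(self, size):
        # freed memory returns to the pool; the OS is usually not informed,
        # so the resident size does not shrink
        self.used -= size

pool = MemoryPool()
pool.allocate(10)   # temporary computation grows the pool to 10
pool.release(10)    # computation finishes; pages stay in the pool
pool.allocate(4)    # a smaller allocation reuses pooled pages
print(pool.used, pool.resident)   # used (4) is now below resident (10)
```

In other words, resident memory is not "backed by" used memory; it includes the pool's free pages, which is why used memory shrinking below resident memory is perfectly normal.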
    Thanks,
    Razal

    Hi  all,
    I am trying to understand the memory part in a bit more detail, as I am building a complete monitoring infrastructure for HANA, and memory is at the core of HANA.
    Can you also help me understand the difference between used memory in HANA and resident memory?
    When we say that resident memory is the memory from the OS point of view, it is the memory of the OS that is really being used.
    So if the used memory from the HANA perspective is full but the OS still has some free memory that can be used, how is that part managed?
    When I say we are out of memory, are both the used memory from HANA and the resident memory from the OS full?
    Or is used memory simply a calculation of code + tables + etc. from the HANA point of view?
    When I execute this query:
    SELECT SERVICE_NAME,
           ROUND(SUM(TOTAL_MEMORY_USED_SIZE/1024/1024/1024), 2) AS "Used Memory GB",
           ROUND(SUM(PHYSICAL_MEMORY_SIZE/1024/1024/1024), 2) AS "DB RESIDENT Memory GB"
    FROM SYS.M_SERVICE_MEMORY
    GROUP BY SERVICE_NAME
    I get:
      SERVICE_NAME      Used Memory GB  DB RESIDENT Memory GB
    1 nameserver        6.73            1.70
    2 preprocessor      5.38            0.24
    3 indexserver       9.19            4.35
    4 scriptserver      7.52            1.83
    5 statisticsserver  8.52            3.87
    6 xsengine          7.92            1.82
    7 compileserver     5.31            0.23
    On top of all this, in the Administration view I see 17.87 as used memory and 18.89 as peak.
    How is this used memory summed up in the Administration view?
    I am using revision 70.
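For reference, the "GB" figures in the query above are plain byte values scaled and rounded, i.e. the ROUND(SUM(x/1024/1024/1024), 2) expression. A minimal sketch in Python (the byte value below is made up for illustration):

```python
# Convert a raw byte count to GiB rounded to two decimals, mirroring the
# SQL expression ROUND(SUM(x/1024/1024/1024), 2) used in the query above.
def to_gib(nbytes):
    return round(nbytes / 1024 / 1024 / 1024, 2)

print(to_gib(7226007552))  # -> 6.73
```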
    Thanks,
    Razal

  • SAP HANA : differences and costs

    Dear All,
    HANA is designed to enable faster ways to process data from business applications in real time, without impacting the underlying source system. I believe HANA will bring new and interesting opportunities to SAP. What are the most important technical differences between HANA, BWA, and BOEA? And what are the cost differences?
    Regards,
    Bob

    Bob,
    Here are my thoughts:
    HANA: Here's my recommendation for a best-in-class next-generation BI solution. Deploy HANA as a standalone server with two logical system partitions: the 1st partition for standalone HANA modeling and the 2nd partition to host the HANA DB for BW 7.3. This way you exploit HANA performance both for the SAP BW system as a database and for standalone non-BW (ECC and flat-file) applications that need good performance.
    BWA: I would not recommend a BWA solution for a new BW implementation, given that SAP has recently announced that BWA 7.3 will be the last BWA release. However, if there is a critical BW application in production that needs better performance, then BWA can be a quick-fix solution. The cost difference between a SAP HANA solution and a BWA solution is narrowing quickly; I do not want to go into details of cost.
    BOEA (aka BOE Accelerated Version): There is some confusion here. BOEA does not support BWA acceleration for non-BW data. That means if you are using BOEA, you can load non-SAP and non-BW data but cannot accelerate it using BWA unless the data is put into an SAP BW InfoProvider (cubes, etc.). Only BW InfoProviders in a BOEA solution can exploit BWA acceleration; other data (non-SAP and non-BW) can only exploit BOE's internal indexing capability, not the BWA. The BWA portion of BOEA is specific to BW in the backend.
    Also, please read my executive summary on SAP HANA @ http://www.sendspace.com/filegroup/u1AO5NidaMTYSD49JoHsjg
    Regards,
    Rama
