HANA or Hadoop?

Hi All,
I am an ABAPer who wants to take up a career as a data scientist in analytics, but I don't know what to learn, or whether I need to change from SAP to Hadoop.
As far as I know, to become a data scientist a person must have knowledge of:
1) BI
2) DB
3) SAS
4) Big data (Hadoop)
My question is: can an ABAPer learn Hadoop and shift his career, or will learning BI and HANA lead to the same data scientist profession in analytics?
Kindly give me your suggestions.

Thanks, Seshu. Since Micro Focus COBOL uses Oracle and COBOL embedded SQL, I forgot to mention in detail that I have good knowledge of Oracle SQL handling along with the programming language.
I want to know about the future market for HANA and how complicated SAP HANA is, though I don't have any knowledge in the SAP area.
What kind of role and job responsibilities would I get if I obtain the certification? Please advise.
Rgds,
Nathan.

Similar Messages

  • BW on HANA, Archive data to Hadoop

    Dear All,
    We are planning to start a PoC for one of our clients. Below is the scenario:
    - Use BW on HANA for real-time analytics and Hadoop as cold storage.
    - Archive historical data to Hadoop.
    - Report on HANA and Hadoop.
    - Access Hadoop data using SDA.
    If somebody has worked on a similar scenario, I would appreciate the implementation steps.
    Thanks & Regards,
    Rajeev Bikkani

    Hi Rajeev Bikkani,
    Currently, NLS using Hadoop is not available by default, and SAP strongly recommends IQ for NLS. If you opt for IQ, in the long run it will be easier to maintain and scale and will give better query performance, so it will yield a better ROI. Hadoop will initially be more cost-effective, but the amount of time spent getting the solution working, and later maintaining and scaling it, will be challenging. So SAP strongly recommends IQ for NLS. SAP positions Hadoop, alongside HANA, mainly for handling big, unstructured data, not for NLS. So please reconsider your option.
    I went through the link and I don't agree with the point "Archiving was not an option, as they needed to report on all data for trending analysis. NLS required updating the info providers and bex queries to access NLS data. SAP offers Sybase IQ as a NLS option but it doesn't come cheap with a typical cost well over a quarter million dollars."
    You can query the archived data, and it does not need to be written back to the providers; at runtime the data is read from NLS and displayed.
    If you still want to use Hadoop as NLS, I can suggest a process, but I have not tried it personally:
    1) Extract data selectively from your InfoProvider via an open hub destination (OHD) and keep it in the OHD table.
    2) Write the data from the OHD table to an HBase table (check the link below for how to do it).
    3) Delete the data from the OHD table.
    4) Whatever data was moved to the OHD table should be deleted from the InfoProvider by selective deletion.
    5) Connect HANA to Hadoop via SDA and virtualise the table to which the data was written.
    6) Build a view on top of this virtual table and query it (a minimal SQL sketch of steps 5 and 6 follows the link below).
    7) The historical data in the HANA view can be combined with BW provider data via an Open ODS view, CompositeProvider, or Transient Provider.
    Reading and Writing to HADOOP HBASE with HANA XS
    http://scn.sap.com/community/developer-center/hana/blog/2014/06/03/reading-and-writing-to-hadoop-hbase-with-hana-xs
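    For illustration only, here is a minimal, untested sketch of steps 5 and 6, assuming an SDA remote source named HADOOP_SRC already exists and the Hadoop-side table is exposed as OHD_HIST in a remote schema ARCHIVE; all names are placeholders.
    -- Step 5: expose the Hadoop-side table in HANA as a virtual table.
    -- The four-part remote path (source.database.schema.table) depends on the adapter;
    -- with some adapters the database part is written as "<NULL>".
    CREATE VIRTUAL TABLE "BW_ARCHIVE"."VT_OHD_HIST"
        AT "HADOOP_SRC"."<NULL>"."ARCHIVE"."OHD_HIST";
    -- Step 6: build a simple view on top of the virtual table and query it.
    CREATE VIEW "BW_ARCHIVE"."V_OHD_HIST" AS
        SELECT * FROM "BW_ARCHIVE"."VT_OHD_HIST";
    SELECT COUNT(*) FROM "BW_ARCHIVE"."V_OHD_HIST";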
    Hope this helps.
    Thanks & Regards
    A.Dinesh

  • NLS Archiving

    Dear All,
    Could you please let me know how to perform NLS archiving from SAP BW 7.4 powered by HANA to a Hadoop system?
    Regards,
    Jo

    Hi Jyotsna,
    I agree with Srinivasan: an NLS license from SAP means Sybase IQ as the database for near-line storage.
    SAP BW Near-Line Storage Solution (NLS) Based on SAP Sybase IQ
    SAP BW 730: What's New in the SAP BW Near-Line Storage Solution
    You can access data from Hadoop by using the Smart Data Access (SDA) technique; a sketch of creating such a remote source is shown below.
    There are also other NLS databases (DB2, Oracle, etc.) that support BW, but the downside is that only authorized NLS implementation partners can implement NLS in the customer namespace. Very few companies can implement third-party NLS.
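    As an illustration only, an SDA remote source to Hadoop/Hive is typically created with SQL along these lines; the adapter name, DSN, and credentials below are placeholders and depend on your ODBC setup and HANA revision.
    -- Minimal SDA remote source pointing at a Hive ODBC DSN (all values are placeholders).
    CREATE REMOTE SOURCE "HADOOP_HIVE"
        ADAPTER "hiveodbc"
        CONFIGURATION 'DSN=HIVE_DSN'
        WITH CREDENTIAL TYPE 'PASSWORD'
        USING 'user=hive;password=secret';
    -- Verify that the source is registered.
    SELECT * FROM "SYS"."REMOTE_SOURCES" WHERE REMOTE_SOURCE_NAME = 'HADOOP_HIVE';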
    Thanks,
    Shakthi Raj Natarajan.

  • Error while creating virtual table in Hana

    Hi Folks,
    Good Day!
    I am trying to create a virtual table in the HANA system from a Hadoop database, but I am getting a privilege issue.
    Please help me to resolve the issue.
    Error Screen shot:
    Thanks,
    Hari

    Hi Hari,
    Check whether you have the necessary object privileges assigned to your user ID for the created remote source; an example grant is sketched below.
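    This is only an illustrative sketch; the remote source and user names are placeholders.
    -- Allow the user to create virtual tables on the remote source.
    GRANT CREATE VIRTUAL TABLE ON REMOTE SOURCE "HADOOP_SRC" TO "HARI_USER";
    -- Optionally allow dropping them again.
    GRANT DROP ON REMOTE SOURCE "HADOOP_SRC" TO "HARI_USER";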
    Regards,
    Nehal

  • ESP vs. Hana (...) - Differences

    Hi guys,
    I am working on my thesis at the moment, comparing streaming technologies with HANA, Hadoop and so on. ESP is part of my work.
    I want to ask you: are there any comparisons between HANA and ESP? What are the differences between them, for example in I/O behaviour? Do I need ESP if I have HANA, and why? Where are the limitations of HANA, and when do I need a CEP system like ESP? Are streaming technologies like ESP or Apache Storm really necessary, and when do we need them?
    I hope you can help me with some useful information, links or something, especially about SAP ESP and HANA.
    Thank you!

    ESP is a technology that complements SAP HANA. As they are fundamentally different technologies, it really makes more sense to talk about how they complement each other rather than how they differ. SAP HANA is a database and an application platform, while ESP is, well, an event stream processor. So let's start with the fact that ESP is not a database and is not built on a database. It's a real-time event processing engine that operates on an entirely event-driven model. Data streams into the ESP engine in real time, flows through the continuous queries that have been defined, and results are published, i.e. streamed out from the ESP engine, in real time in response to those incoming events. ESP does not provide permanent or even long-term data persistence and is not designed to support ad-hoc queries (yes, there is some very limited ability to query windows, but let's not let that complicate things).
    Generally, SAP ESP is used with SAP HANA in either/both of the following ways:
    1. To capture streaming data in SAP HANA: receive live data streams, apply any desired filters, transformations, aggregations, correlations, etc., and then capture the desired data in HANA in the desired form.
    2. Real-time response: monitor data as it streams in to generate alerts and notifications, or to initiate a response as fast as the information is received.
    For an overview of what SAP ESP is, how it works, and how it can be used, I suggest this paper.  You might also find this video showing how ESP can be used with SAP HANA to be helpful. And there are a number of example use cases for ESP described in the Event Processing community on SCN.

  • Can SLT replicate data directly to Hadoop

    Hi,
    Has anyone successfully connected SAP SLT directly to Hadoop or other big data services where the big data platform is the target? We have a scenario where our SAP OER will produce millions of records that we wish to report against, but we see that the scale, i.e. the number of records plus the data retention requirements, means a typical SAP BW and/or SAP HANA will not be a viable architecture.
    So is SLT a contender, excluding potential license discussions?
    Thanks
    Brian

    Hi Justin,
    With the general replication to non-ABAP DBs as a target, we have made good progress and gained a lot of experience in customer projects within the last year. Despite that, we do not plan to release it unrestrictedly with the next SP version, because we have not yet tested all available setups, so from a legal perspective we cannot brand it as working out of the box. Anyhow, when you have a scenario with replication to a non-ABAP target, just contact us; usually we have already done customer projects with the common databases and can enable it without much effort.
    As for the general strategy, it is planned to release it unrestricted with SP9 in mid-2015. Technically it is already working.
    When I talk about non-ABAP DBs, I mean only the DBs that are supported by NetWeaver (Oracle, MS SQL, IBM, ASE, MaxDB, Informix). All other DBs are only supported if the SAP Basis team delivers a library that allows SLT to connect to that DB. Hadoop is not included; unfortunately we are dependent on this team and their strategy.
    Best,
    Tobias

  • HANA and Cassandra data integration

    Hello Guys
    What interesting data integration possibilities are available between HANA and Cassandra?

    Hi Safdar,
    It looks like the Hadoop controller is not installed. Please refer to step 1 of my blog:
    Hadoop MapReduce as Virtual Functions in HANA
    Regards,
    Eric Du

  • Error while creating a BO Universe (using IDT) on top of SAP HANA Calculation View with Input Parameters

    Hi All..
    We are trying to create a universe (using IDT version 4.0) on top of a HANA calculation view. The calculation view has 4 input parameters, so on the universe side we have created the corresponding prompts. For the creation of the derived table we have used the following code:
    SELECT *
    FROM "_SYS_BIC"."<schema_name>.<CV_Calculation_View_test>"
    'PLACEHOLDER'=('$$IP_A$$','@Prompt(A)'),
    'PLACEHOLDER'=('$$IP_B$$','@Prompt(B)'),
    'PLACEHOLDER'=('$$IP_C$$','@Prompt(C)'),
    'PLACEHOLDER'=('$$IP_D$$','@Prompt(D)')
    While validating the above code we are getting an error.
    I have attached the snapshot of that error for your reference. Please find the attachment and help me in resolving it.
    Thanks in advance.
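    For reference, the placeholder syntax documented for parameterized HANA calculation views wraps the whole placeholder list in parentheses after the view name; whether the @Prompt expressions need additional arguments depends on the IDT prompt definitions. A hedged sketch of the same derived table:
    SELECT *
    FROM "_SYS_BIC"."<schema_name>.<CV_Calculation_View_test>"
        ( 'PLACEHOLDER' = ('$$IP_A$$', '@Prompt(A)'),
          'PLACEHOLDER' = ('$$IP_B$$', '@Prompt(B)'),
          'PLACEHOLDER' = ('$$IP_C$$', '@Prompt(C)'),
          'PLACEHOLDER' = ('$$IP_D$$', '@Prompt(D)') )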

    Hello George,
    I don't have any personalization set on the InfoSpace. Also, I am using a mapped account in BI to connect to HANA. The confusing part is that it is able to validate the InfoSpace with the input parameters and index it; however, the query does not return any results. I even tried running the same query that Explorer sends to HANA in the SQL editor, and there too the query returns nothing. The model does return data when I do a data preview and when accessed from other tools like AAO.
    Also, when I use an SSO connection to HANA, indexing of the InfoSpace fails. Where can I see the error log?
    Thanks,

  • Creation of Universe based on Hana OLAP connection

    I have to create a universe in IDT 4.1 SP2 based on a SAP HANA OLAP connection.
    How can I access a SAP HANA view that was created using an OLAP connection in BusinessObjects Explorer? IDT does not allow creating a universe on top of a SAP HANA OLAP connection and gives the error:
    SAP BusinessObjects query and reporting applications can directly connect to OLAP SAP HANA connections. No universe is required, only a published OLAP SAP HANA connection.
    Thanks

    Hi Jyothy,
    Refer to the link below on how to create a universe on an analytic view/calculation view using a relational connection in IDT:
    http://www.sapanalyticsguru.com/index.php/sap-bobi/31-universe-creation-on-hana-view-using-information-design-tool

  • Adding BW Modelling plugins in HANA Studio

    Hello Experts,
    I installed SAP HANA Studio on my local machine (outside C:\Program Files, i.e. on the D: drive).
    I followed two options to install the BW Modelling Tools plugin; both have been unsuccessful. Can someone please let me know what is missing?
    I checked the pre-requisites mentioned in the installation document (http://help.sap.com/download/netweaver/bwmt/SAP_BW_Modeling_Tools_Installation_Guide_en.pdf) and my system has all the necessary ones.
    Option 1
    - Followed the instructions mentioned in the https://tools.hana.ondemand.com/
    - HANA Studio --> Help --> Install New Software --> Provide the link https://tools.hana.ondemand.com/luna
    - I get the below error
    Unable to connect to repository https://tools.hana.ondemand.com/luna/content.xml
    Connection to https://tools.hana.ondemand.com refused
    Option 2
      - Download latest patch from Service market place
      - HANA Studio --> Help --> Install New Software --> Provide the Archive path
      - Error message I get is below,
    Missing requirement: BW Core UI 1.6.3 (com.sap.bw.core.ui 1.6.3) requires 'bundle com.sap.adt.destinations.ui 2.18.0' but it could not be found
      Cannot satisfy dependency:
        From: BW DataStore UI 1.6.3 (com.sap.bw.datastore.ui 1.6.3)
        To: bundle com.sap.bw.core.ui 1.1.0
      Cannot satisfy dependency:
        From: BW DataStore Object 1.6.3 (com.sap.bw.feature.adso.feature.group 1.6.3)
        To: com.sap.bw.datastore.ui [1.6.3]
    Thanks for providing some hints.
    Regards,
    Srinivas

    If SAP could provide the HANA tools over plain http:// instead of https://, then it would work under proxy conditions with the network connectivity option in Eclipse set to "native" mode.

  • Install Software HANA unable to read repository

    Hello,
    I'm new to Eclipse. I'm trying to set up the trial cloud environment for SAP HANA. I have Eclipse Juno installed. When I attempt to install the HANA software on top of Eclipse I receive the error "Unable to read Repository".
    We do not use a proxy, so network was set to "DIRECT". I can get to the SAP site from the browser tools.hana.ondemand.com/luna. I downloaded the certificate for that site and added it to the cacerts file for JDK1.6.0_45.
    The error message in full says:
    Unable to read repository at tools.hana.ondemand.com/luna/content.xml.
    Using the browser, tools.hana.ondemand.com/luna/content.xml returns a 404 error. If I change it to tools.hana.ondemand.com/luna/content.jar, it gives me a download.
    As far as I can tell, there's no way to make Eclipse point to content.jar. Can someone tell me where this is going wrong?
    Thanks.
    Danny Hearn

    Likely the best support for SAP/HANA would come from SAP/HANA.

  • Can not access data generator in HANA SHINE Content

    Hi Expert,
    After importing the delivery unit of the SHINE content, the guide book said I could use the data generator to generate demo data by accessing the URL
    http://<myServer>:<XS Port>/sap/hana/democontent/epm/admin/ui/WebContent/admin.html
    Step 1: I used SYSTEM on the HANA login page.
    Step 2: A 404 error is returned.
    Has anyone else encountered this problem?

    You might check the repository browser, but I don't believe that URL is correct. There is no ui or WebContent folder in the path on my system; it's just /sap/hana/democontent/epm/admin/
    But I suggest checking the folder structure in the Repositories view in your own system to make sure. My system is a newer version and could be different from what you are seeing. However, this would explain why you are getting the 404.
    Also make sure that SYSTEM has the necessary SHINE admin role added (an example grant is sketched below).
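    If the role is missing, it can be granted from the SQL console. This is a sketch only; the exact role name varies by SHINE release.
    -- Grant the SHINE admin role to SYSTEM (role name may differ in your delivery unit).
    CALL "_SYS_REPO"."GRANT_ACTIVATED_ROLE"('sap.hana.democontent.epm.roles::Admin', 'SYSTEM');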

  • Open Hub (SAP BW) to SAP HANA through DB connection data loading: "Delete data from table" option is not working

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB connection (named HANA).
    Whenever I create an open hub destination as a DB table using the DB connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the open hub service without checking the "Deleting data from table" option.
    16 records were loaded from BW to HANA.
    The second time I executed it from BW to HANA, there were 32 records (it appends).
    Then I executed the open hub service with the "Deleting data from table" option checked.
    Now I get the short dump DBIF_RSQL_TABLE_KNOWN.
    From SAP BW system to SAP BW system it works fine.
    Does this option work through a DB connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the OH level (definition level, DESTINATION tab and FIELD DEFINITION): there is a check box there that I have already selected, and that is exactly my issue; even though it is selected, the deletion is not performed at the target level.
    SAP BW to SAP HANA via DB connection:
    1. The first time, suppose 16 records from BW: the DTP executed and loaded 16 records into HANA.
    2. The second time it was executed from BW, the HANA side appended, so 16 + 16 = 32.
    3. So I selected the "Deleting data from table" check box at the OH level.
    4. Now executing the DTP throws a short dump: DBIF_RSQL_TABLE_KNOWN.
    Now please tell me how to resolve this. Does the "Deleting data from table" option apply to HANA?
    Thanks
    Santhosh Kumar

  • SAP HANA Lifecycle Manager Couldn't deploy using credentials

    Hello,
    After upgrading HLM from 1.0.6 to 1.0.7.7 on an IBM appliance, we could not apply support packages in normal mode. When we launch the upgrade process from HANA Studio or the command line, we receive the following message:
    Error Message: Execution of 'Deploys SAP Host Agent configurations using sidadm credentials' failed.
    Detailed Message: null or empty argument
    Help: http://help.sap.com/hana
    We tried to downgrade and reinstall HLM again with no luck. Users, passwords and certificates were also renewed. The following log is from HLM:
    INFO 2014-05-08 16:05:15 com.sap.lm.hlm.common.wizard.process.WizardProcessExecutor
    Submit for subtask with ID [6cd61e09-eb68-4448-86c0-b4319abbdc3f_GetUpdateTypeTask]
    INFO 2014-05-08 16:05:15 com.sap.lm.hlm.common.wizard.process.WizardProcessExecutor
    Starting execution of subtask with ID [6cd61e09-eb68-4448-86c0-b4319abbdc3f_DeployConfigWithSidAdmTask]
    INFO 2014-05-08 16:05:15 com.sap.lm.hlm.common.tasks.process.AbstractProcessExecutor
    Starting process [Deploys SAP Host Agent configurations using sidadm credentials, com.sap.lm.hana.hlm.update.slpp.task.deployconfig.DeployConfigWithSidAdmTask] ({})...
    INFO 2014-05-08 16:05:15 com.sap.lm.util.status.StatusTracker
    Status tracker initialized using storage file /usr/sap/hlm_bootstraps/HNB/HLM/param/configdeploystatus.properties.
    INFO 2014-05-08 16:05:15 com.sap.lm.hana.hlm.update.slpp.context.ProgressObserverAdapter
    Progress [ 0%]
    INFO 2014-05-08 16:05:15 com.sap.lm.hana.hlm.update.slpp.task.deployconfig.AbstractDeployConfigProcessExecutor
    Deploying SAP Host Agent configurations on host  with sidadm credentials.
    INFO 2014-05-08 16:05:15 com.sap.lm.hana.hlm.update.slpp.context.ProgressObserverAdapter
    Process message: Deploying SAP Host Agent configurations on host
    INFO 2014-05-08 16:05:16 com.sap.lm.services.hostcontrol.SAPHostControlService
    Deployed Host Agent configurations [/usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-hdb-plugin.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-studio-repository.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-client-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-sedm-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-repository-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-java-home-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-client-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-load-controller-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-sda-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-client-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-hdbstudio-deliveryunit.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-studio.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/newdb-studio-repository-postupdate-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-client.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-sda-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-server.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-java-executable.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/newdb-studio-repository-postupdate.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-hdb-plugin-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-studio-repository-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-load-controller.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-client-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-client-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/change-newdb-client-installation-path-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-studio-repository-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/updatehostagent-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-java-executable-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-repository.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/updatehostagent.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-server-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-client.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-load-controller-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/exists-path.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/gen-hdbstudio-deliveryunit.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/change-newdb-studio-repository-installation-path-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-server-nzdm.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-hdb-plugin.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-server.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-installation-number.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-load-controller.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-hdb-plugin-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-installation-number-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-hdb-plugin-v3.conf, 
/usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-server-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/updatehostagent-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio-repository.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio-repository-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-hostagent-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-hostagent.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-hdbeuspack.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-client.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-server-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-client-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-newdb-studio-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/chmod-file.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-client.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/find-java-home.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/detect-sedm.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/install-newdb-studio-repository-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/exists-path-v2.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/update-newdb-client-v3.conf, /usr/sap/hlm_bootstraps/HNB/HLM/operations.d/uninstall-newdb-studio-v2.conf] with signature /usr/sap/hlm_bootstraps/HNB/HLM/SIGNATURE.SMF on host .
    ERROR 2014-05-08 16:05:16 com.sap.lm.hana.hlm.update.slpp.task.AbstractUpdateProcessExecutor
    Process executor failed
    java.lang.IllegalArgumentException: null or empty argument
      at com.sap.lm.hlm.common.runtime.HlmRuntime.assertNotNullOrEmptyArgumens(HlmRuntime.java:342)
      at com.sap.lm.hlm.common.runtime.HlmRuntime.deployHlmExecutableScripts(HlmRuntime.java:74)
      at com.sap.lm.hlm.common.runtime.HlmRuntime.deployHlmExecutableScripts(HlmRuntime.java:69)
      at com.sap.lm.hana.hlm.update.slpp.task.deployconfig.AbstractDeployConfigProcessExecutor.deployconfigurations(AbstractDeployConfigProcessExecutor.java:114)
      at com.sap.lm.hana.hlm.update.slpp.task.deployconfig.DeployConfigWithSidAdmProcessExecutor.executeInternal(DeployConfigWithSidAdmProcessExecutor.java:42)
      at com.sap.lm.hlm.common.tasks.process.AbstractProcessExecutor.startInterruptibleExecution(AbstractProcessExecutor.java:128)
      at com.sap.lm.hlm.common.tasks.process.AbstractProcessExecutor$1.run(AbstractProcessExecutor.java:108)
    ERROR 2014-05-08 16:05:16 com.sap.lm.hlm.common.wizard.process.WizardProcessExecutor
    Execution of subtask with ID [6cd61e09-eb68-4448-86c0-b4319abbdc3f_DeployConfigWithSidAdmTask, Execution of 'Deploys SAP Host Agent configurations using sidadm credentials' failed.] has failed: {}
    Could you please help us with this issue?
    Kind Regards,

    Hi Alston,
    I'm sorry for the late reply. Are you still experiencing this error?
    I've seen this issue before with another customer, and the final resolution was to make sure the /etc/hosts file does not list the IPv6 section BEFORE the IPv4 section. Here is an example of the problematic ordering:
    >> cat /etc/hosts
    # special IPv6 addresses
    ::1             localhost ipv6-localhost ipv6-loopback
    ff02::3         ipv6-allhosts
    192.168.180.31  abceccd01.sap.com abceccd01
    192.168.180.34  abceccs01.sap.com abceccs01
    192.168.180.35  abccrms01.sap.com abccrms01
    192.168.180.30  abceccs01.sap.com abceccs01
    10.36.10.5      abceccs01-nfs.sap.com  abceccs01-nfs
    The resolution is to change the ordering so that the IPv6 section appears below the IPv4 section.
    After that is done, perform the following to "reset" HLM:
    1. Stop all browsers that have HLM pages open.
    2. Stop the Studio.
    3. Execute /usr/sap/hlm_bootstraps/<SID>/HLM/stop-hlm.sh -f
    4. Check that there are no leftover HLM processes.
    5. Go to /usr/sap/hlm_bootstraps/<SID>/HLM/param/ and rename all files except logging.properties.
    6. Go to the directory /usr/sap/hlm_bootstraps/<SID>/HLM and rename the persistence directory.
    7. Restart the SAP Host Agent as root: /usr/sap/hostctrl/exe/saphostexec -restart
    8. Open the HLM UI in a browser:
    https://<hostname>:1129/lmsl/HLM/<SID>/ui/
    I hope this helps.
    Best Regards,
    Jimmy

  • How to load SAP R/3 tables into a schema in SAP HANA Cloud Platform

    Hello, everyone.
    First of all, my apologies for any basic mistakes, I'm new here, so there's a lot of stuff I don't know yet.
    I need to upload some tables to a schema in my HANA trial account to test some algorithms and see if the kind of application I want to build is possible, but I don't know the best way to do this.
    To my understanding, using the Cloud Connector would merely make my tables visible to my application in the cloud, but it would not actually upload them (or rather, I'm sure there's a way to upload them using it, but I suspect there are far simpler methods out there).
    I think one of the simplest possible methods would be to create the schema, open a tunnel using the console client, and import the tables into the schema via HANA Studio. Maybe, if that doesn't work, a CMIS repository could be used (although I'm somewhat skeptical, as I don't think that is its intended use). Or maybe there's another possibility I'm not seeing. Either way, I would love to hear some answers that could shed some light on this; a rough sketch of the tunnel approach is included below.
    Also, as I mentioned earlier, I'm only using a trial account for this, and I don't know if there's any limitation that could prevent me from doing it. I'd also like to hear your opinion on this matter.
    Thank you.
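    As a rough, untested illustration of the tunnel-plus-Studio approach: once the database tunnel is open, the target objects can be created from the HANA Studio SQL console and the exported data loaded afterwards. All names below are hypothetical.
    -- Hypothetical staging schema and table created through the DB tunnel.
    CREATE SCHEMA "R3_STAGING";
    CREATE COLUMN TABLE "R3_STAGING"."MARA_COPY" (
        "MATNR" NVARCHAR(18) PRIMARY KEY,
        "MTART" NVARCHAR(4),
        "MATKL" NVARCHAR(9)
    );
    -- Data exported from R/3 (e.g. as CSV) can then be loaded with HANA Studio's
    -- "Import > Data from Local File" wizard, or row by row for a quick test:
    INSERT INTO "R3_STAGING"."MARA_COPY" VALUES ('MAT-0001', 'FERT', '001');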

    BUDAT is the technical name of the Posting Date field. This field is used in many DB tables and structures; using the where-used option you can find the required tables.
    I suggest you look at the BSEG table for finance document postings by posting date.
