NLS Archiving in BI 7.0

Hi Experts,
       I have some doubts on NLS Archiving.
1)    When working with classic SAP ADK archiving in BI 7.0, the data to be archived has to be compressed first. I ran into this problem, compressed the data, and the archiving then worked fine. Is it the same if I go for an NLS archiving solution (SAND, PBS, etc.) in BI 7.0: do I need to compress the data before I archive?
2)   We can't report on archived data with conventional SAP ADK. But with NLS archiving we have the option to include archived data in reports by setting the query property in RSRT: "Read Data from Near-Line Storage". My doubt here is: with this flag set, can we see the archived data only in RSRT, or also when we execute the report in the BEx Analyzer or through the portal? In our case users run reports from the portal only. Is there an option in the portal or BEx Analyzer as well to read archived data from near-line storage?
Please correct me if I am wrong, and please share your comments.
Thanks
vamsi

Hi Vamsi,
1) Yes, you also need to compress the data before archiving it through NLS. Otherwise it will not show up in the data options for the list of records to be archived.
2) Enabling this option lets the user query the archived data outside of RSRT as well. By the way, you should also test what happens if the NLS application is unavailable and check whether your online data is still accessible: in our experience the query reads from the archive even if you only asked for online data.
cheers!
CCC

Similar Messages

  • NLS Archival of Data

    My query is on NLS Implementation of Sybase IQ.
    Suppose, we have archived our data based on time-slice (We have two options : time-slice / request-based).
    During the first archival, the data was archived for years 2010,2011 and 2012. Are partitions created based on time in IQ ?
    Next time when I archive the data for 2010, will a new partition be created in NLS storage, or will the data be sent to the previously archived partition?
    If a new archived partition is created in NLS storage, which partition will the query search if criteria is 2010 ? New or the older partition ?
    I need to know how the data is stored in NLS archived storage.
    Can the archival process from BW on HANA to Sybase IQ be automated on a schedule instead of doing it manually?
    It would be really helpful, if anyone could provide a link which explains the whole process completely.
    Thanks in advance.

    Hi Pooja,
    My answers below:
    Does that mean, the query will search for all the partitions of the table in IQ?  Suppose for a table
    Archive Request 1 partition : data for years 2010, 2011 , 2012
    Archive Request 2 partition: data for year 2010
    - The partitions are based on the archive request, but your example is not correct, because the year 2010 is already archived and locked in the first request.
         - If you make requests based on quarter, you will have the data partitioned by quarter. If by year, the partition will be by year. And if by a range of years, that will be a single partition. So your first archive request would create a single partition with three years of data.
    Will the query search both the requests ? Doesn't it affect the performance ?
    - If the query is based on a date range, it will only search the partitions holding the date range. It will do this by the request ID. If your query does not specify a date and uses another criteria, it will search all partitions.
    And , while creating an archival process we specify time characteristics as the primary partition.
    So, in the Archive request 1 partition, will there be separate sub-partition for years 2010, 2011 , 2012 respectively ? So that, as and when the query search is for 2010, it goes and searches only the 2010 partition ?
    - This is done by the request. You should plan your requests or your archiving process chains in BW to meet your requirements.
    Or does IQ do any indexing?
    - IQ does create indices on all of the columns. In addition, the archiving process will create some indices. BW will create a view for the archive and will create indices on the columns used in the view's where clause. In addition to the IQ indexing, BW is also aware of which archive request holds which time slice, so the performance is optimized on both ends.
    Hope that helps,
    Eric
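    The request/time-slice pruning Eric describes can be sketched roughly like this (the partition metadata and function names are illustrative only, not the actual BW or IQ API):

    ```python
    # Sketch of request-based partition pruning as described above.
    # Each NLS archive request becomes one partition covering a time slice.

    partitions = [
        {"request_id": 1, "year_from": 2010, "year_to": 2012},  # first archive run
        {"request_id": 2, "year_from": 2013, "year_to": 2013},  # later archive run
    ]

    def partitions_to_scan(year_from=None, year_to=None):
        """Return the partitions a query must read.

        With a date restriction, only overlapping time slices are read;
        without one, every partition must be scanned.
        """
        if year_from is None or year_to is None:
            return partitions  # no time filter: scan all archive requests
        return [p for p in partitions
                if p["year_from"] <= year_to and year_from <= p["year_to"]]

    # A query restricted to 2013 touches only request 2; a query without
    # a time restriction touches both requests.
    print([p["request_id"] for p in partitions_to_scan(2013, 2013)])
    print([p["request_id"] for p in partitions_to_scan()])
    ```

    The point of the sketch is the second branch: the date range in the query is matched against the time slice of each archive request, which is why planning the request granularity (quarter, year, range of years) determines how much data a query has to touch.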

  • NLS Archiving

    Dear All,
    Could you please let me know how to perform NLS archiving from SAP BW 7.4 powered by HANA to a Hadoop system.
    Regards,
    Jo

    Hi Jyotsna,
    I agree with Srinivasan; an NLS license from SAP means Sybase IQ as the database for near-line storage.
    SAP BW Near-Line Storage Solution (NLS) Based on SAP Sybase IQ
    SAP BW 730: What's New in the SAP BW Near-Line Storage Solution
    You can access data from Hadoop by using the Smart Data Access technique.
    There are also other NLS databases (DB2, Oracle, etc.) that support BW, but the downside is that only authorized NLS implementation partners can implement NLS in the customer namespace. Very few companies can implement third-party NLS.
    Thanks,
    Shakthi Raj Natarajan.

  • How Transformations or Routines will work for the NLS Archived data in BW on HANA?

    Hi,
    I have archived data from BW on HANA system into NLS IQ.
    Now I'm creating a transformation with routines; in this case, how will the archived data be read without writing any ABAP code?
    Thanks & Regards,
    Ramana SBLSV.

    Hi Ramana,
    Maybe I can explain with two cases; hopefully one of them matches yours:
    DSO1 -> you have archived this DSO.
    Case 1:
    Now you want to load from DSO1 -> DSO2 (direct transformation).
    Your question here is: while loading from DSO1 to DSO2, will the archived data also be loaded into DSO2?
    If so, there is an InfoProvider property you need to change for this:
      In Extras -> InfoProvider Properties -> Change -> switch on near-line usage (by default it is off).
    The archived data will then also be used in this case.
    Case 2:
    You are loading from DSO3 to DSO2, with a lookup on DSO1.
    In that lookup on DSO1, you need to use the archived data as well?
    In this case you have to use the "Read from DataStore" rule type; this accesses both the active table and the NLS data.
    Let me know if neither case applies to you.
    Regards,
    Sakthi

  • NLS Archiving question.

    Hi,
    I know that with NLS we can report on the archived data, but can we also report on navigational attributes after the data is archived?
    Also, can I pass filters on navigational attributes down to the archived data from a report?
    Thanks & Regards
    -Ranjit

    Hi,
    Reporting with NLS supports filtering on navigational attributes on any supported release (as of BW 7.0). However, the filter itself is not pushed down into NLS, since the NLS storage does not hold the master data.
    Example: if you filter on an attribute of a material, NLS returns all relevant material data rows and the OLAP processor does the filtering. The query result is correct, but the runtime can be very long.
    On BW 7.30 this has improved a bit. For filters where the selection on navigational attributes resolves to fewer than 100 values of the basic characteristic, the filter is translated into the basic characteristic and passed down to NLS.
    Example: you want to filter on an attribute of a company code, and this filter matches 34 company codes. The selection sent to NLS contains these 34 company codes instead of the selection on the attribute. This works only for characteristics with low cardinality; for the material example it would most likely not apply.
    On BW 7.40 with SAP HANA there is a solution called Smart Data Access, which lets SAP HANA cooperate with Sybase IQ so that the relevant data from NLS is moved into SAP HANA at runtime and then joined with the attribute tables there. This improves the runtime also for high-cardinality characteristics.
    There are also solutions from third-party NLS vendors that can improve the runtime when filtering on navigational attributes.
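    The BW 7.30 optimization described above can be illustrated with a small sketch (this is not BW code; the master-data table, threshold constant, and function are hypothetical):

    ```python
    # Illustrative sketch of the 7.30 filter translation: a filter on a
    # navigational attribute is resolved against master data; if it matches
    # fewer than 100 basic-characteristic values, those values are pushed
    # down to NLS, otherwise NLS returns all rows and the OLAP layer
    # filters afterwards.

    PUSHDOWN_LIMIT = 100

    # Hypothetical master data: basic characteristic value -> attribute value
    master_data = {
        "1000": "DE", "2000": "DE", "3000": "US", "4000": "FR",
    }

    def build_nls_selection(attr_filter):
        """Translate an attribute filter into a basic-characteristic selection."""
        matching = [key for key, attr in master_data.items() if attr == attr_filter]
        if len(matching) < PUSHDOWN_LIMIT:
            # Low cardinality: push the resolved values into the NLS query
            return {"pushdown": True, "values": sorted(matching)}
        # High cardinality (e.g. material attributes): read everything from
        # NLS and let the OLAP processor apply the filter afterwards
        return {"pushdown": False, "values": None}

    print(build_nls_selection("DE"))
    ```

    This is why the company-code example works (34 resolved values fit under the limit) while a material attribute, resolving to many thousands of materials, falls back to the slow path.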

  • Archiving in BI 7.0

    Hello Experts,
    I need some help related to BI 7.0 archiving.
    I have gone through some documents on SDN but am still not clear about the points below.
    1> Preferable method of archiving in BI 7.0 (NLS or ADK)?
    2> Drawbacks of NLS and ADK?
    3> Expected time to archive around 500 GB of data?
    4> Detailed differences between ADK and NLS archiving; which approach is best and why?
    Early input will be highly appreciated.
    Thanks in advance
    AKG

    Hi,
    -> The preferable method depends on your requirement. If there is an already existing connection with the SAP system, such as a UNIX file system, you can go for ADK; it is easy and simple. If you don't have such a system, you can store the data in a third-party tool using near-line storage.
    -> The expected time will depend on your server and its availability; you can always schedule a background job using ADK.
    For us it has taken approximately 1 hour per 1 million data records.
    Also check these links:
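    The throughput quoted above only turns "500 GB" into a time estimate once you assume a record width; a back-of-envelope calculation (the 500-byte average record size is an assumption, not a figure from this thread):

    ```python
    # Back-of-envelope only: the throughput quoted above (~1 hour per
    # 1 million records) cannot be applied to "500 GB" without knowing the
    # average record width. 500 bytes/record is an assumed value.

    gb_to_archive = 500
    bytes_per_record = 500          # assumed average record width
    records_per_hour = 1_000_000    # throughput quoted in the reply

    total_records = gb_to_archive * 1024**3 / bytes_per_record
    hours = total_records / records_per_hour
    print(f"{total_records:.2e} records -> roughly {hours:.0f} hours of runtime")
    ```

    The wide range such estimates produce is exactly why the answer is "it depends on your server and availability": in practice the volume is split into many scheduled requests rather than one run.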
    Archiving In BI 7
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/409e7ce8-cbd8-2b10-c1ae-e27696c333c7
    Hope this helps...
    Rgs,
    Ravikanth.

  • Java related error while installing 10.2.0.1 on Red Hat Linux 5

    Hi ,
    I am getting the below error while running runInstaller.sh on Red Hat Linux 5.
    Help is much appreciated.
    [installer output trimmed: runInstaller unpacks the bundled JRE 1.4.2 and OUI components (fonts, locales, security files, browser plugin, jlib JARs, and the OUI guide) into /tmp/OraInstall2009-09-30_06-23-57PM, reporting "5 archives were successfully processed", "11 archives were successfully processed", and "3 archives were successfully processed", then launches the JVM with the -Doracle.installer.* options, -mx150m, and the OUI classpath]
    Initializing Java Virtual Machine from /tmp/OraInstall2009-09-30_06-23-57PM/jre/1.4.2/bin/java. Please wait...
    [oracle@localhost database]$ Oracle Universal Installer, Version 10.2.0.1.0 Production
    Copyright (C) 1999, 2005, Oracle. All rights reserved.
    Exception java.lang.UnsatisfiedLinkError: /tmp/OraInstall2009-09-30_06-23-57PM/jre/1.4.2/lib/i386/libawt.so: libXp.so.6: cannot open shared object file: No such file or directory occurred..
    java.lang.UnsatisfiedLinkError: /tmp/OraInstall2009-09-30_06-23-57PM/jre/1.4.2/lib/i386/libawt.so: libXp.so.6: cannot open shared object file: No such file or directory
    at java.lang.ClassLoader$NativeLibrary.load(Native Method)
    at java.lang.ClassLoader.loadLibrary0(Unknown Source)
    at java.lang.ClassLoader.loadLibrary(Unknown Source)
    at java.lang.Runtime.loadLibrary0(Unknown Source)
    at java.lang.System.loadLibrary(Unknown Source)
    at sun.security.action.LoadLibraryAction.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at sun.awt.NativeLibLoader.loadLibraries(Unknown Source)
    at sun.awt.DebugHelper.<clinit>(Unknown Source)
    at java.awt.Component.<clinit>(Unknown Source)
    at oracle.sysman.oii.oiif.oiifm.OiifmGraphicInterfaceManager.<init>(OiifmGraphicInterfaceManager.java:222)
    at oracle.sysman.oii.oiic.OiicSessionInterfaceManager.createInterfaceManager(OiicSessionInterfaceManager.java:193)
    at oracle.sysman.oii.oiic.OiicSessionInterfaceManager.getInterfaceManager(OiicSessionInterfaceManager.java:202)
    at oracle.sysman.oii.oiic.OiicInstaller.getInterfaceManager(OiicInstaller.java:436)
    at oracle.sysman.oii.oiic.OiicInstaller.runInstaller(OiicInstaller.java:926)
    at oracle.sysman.oii.oiic.OiicInstaller.main(OiicInstaller.java:866)
    Exception in thread "main" java.lang.NoClassDefFoundError
    at oracle.sysman.oii.oiif.oiifm.OiifmGraphicInterfaceManager.<init>(OiifmGraphicInterfaceManager.java:222)
    at oracle.sysman.oii.oiic.OiicSessionInterfaceManager.createInterfaceManager(OiicSessionInterfaceManager.java:193)
    at oracle.sysman.oii.oiic.OiicSessionInterfaceManager.getInterfaceManager(OiicSessionInterfaceManager.java:202)
    at oracle.sysman.oii.oiif.oiifm.OiifmAlert.<clinit>(OiifmAlert.java:151)
    at oracle.sysman.oii.oiic.OiicInstaller.runInstaller(OiicInstaller.java:984)
    at oracle.sysman.oii.oiic.OiicInstaller.main(OiicInstaller.java:866)
    Mk

    Hi
    Maybe you can check the Red Hat FTP site.
    CentOS 4.6, 32-bit
    http://isoredirect.centos.org/centos/4.6/isos/i386/
    CentOS 4.6, 64-bit
    http://isoredirect.centos.org/centos/4.6/isos/x86_64/
    CentOS 5.1, 32-bit
    http://isoredirect.centos.org/centos/5.1/isos/i386/
    CentOS 5.1, 64-bit
    http://isoredirect.centos.org/centos/5.1/isos/x86_64/
    Regards
    Mudhalvan M.M

  • Is Near Line Storage (NLS)  ONLY for BI archiving?

    Gurus:
    Is NLS also used for data archiving in other business modules?
    At least I cannot find any info about that.
    Thanks!

    Hi,
    in the SAP context NLS is only for BI. (There are hardware solutions that also use the name NearLine Storage)
    The reason is that SAP only offers the NLS interface for BI.
    In ERP there is no standard NLS interface offered, probably because the access to the DB is hard coded in the transactions. In BI there is a central analytical engine, so here a central interface can be offered.
    For ERP, CRM, etc. there are vendors that offer similar functionality, allowing access to archived data either in the standard transactions or in transactions similar to the standard ones.
    best regards,
    Joseph

  • *HOW TO DELETE THE ARCHIVE LOGS ON THE STANDBY*

    I have set the RMAN CONFIGURE ARCHIVELOG DELETION POLICY TO APPLIED ON STANDBY; on my physical standby server.
    My archivelog files are not deleted on standby.
    I have set the CONFIGURE ARCHIVELOG DELETION POLICY TO NONE; # default on the Primary server.
    I've checked the archive logs in the FRA and they are not being deleted on the STANDBY. Do I have to do something for the configuration to take effect, like run an RMAN backup?
    I've done a lot of research and I'm getting mixed answers. Please help. Thanks in advance.
    J

    Setting the policy will not delete the archive logs on the standby. (I found a thread where the Data Guard product manager says "The deletion policy on both sides will do what you want".) However, I still like to clean them off with RMAN.
    I would use RMAN to delete them so that it honors that policy and you are protected in case of a gap, a transport issue, etc.
    There are many ways to do this. You can simply run RMAN and have it clean out the archive logs.
    Example :
    #!/bin/bash
    # Name: db_rman_arch_standby.sh
    # Purpose: Database rman backup
    # Usage : db_rman_arch_standby <DBNAME>
    if [ "$1" ]
    then DBNAME=$1
    else
    echo "basename $0 : Syntax error : use . db_rman_full <DBNAME> "
    exit 1
    fi
    . /u01/app/oracle/dba_tool/env/${DBNAME}.env
    echo ${DBNAME}
    MAILHEADER="Archive_cleanup_on_STANDBY_${DBNAME}"
    echo "Starting RMAN..."
    $ORACLE_HOME/bin/rman target / catalog <user>/<password>@<catalog> << EOF
    delete noprompt ARCHIVELOG UNTIL TIME 'SYSDATE-8';
    exit
    EOF
    echo `date`
    echo
    echo 'End of archive cleanup on STANDBY'
    mailx -s ${MAILHEADER} $MAILTO < /tmp/rmandbarchstandby.out
    # End of Script
    This uses (i.e. calls) an ENV file so the crontab has an environment.
    Example ( STANDBY.env )
    ORACLE_BASE=/u01/app/oracle
    ULIMIT=unlimited
    ORACLE_SID=STANDBY
    ORACLE_HOME=$ORACLE_BASE/product/11.2.0.2
    ORA_NLS33=$ORACLE_HOME/ocommon/nls/admin/data
    LD_LIBRARY_PATH=$ORACLE_HOME/lib:/lib:/usr/lib
    LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
    LIBPATH=$LD_LIBRARY_PATH:/usr/lib
    TNS_ADMIN=$ORACLE_HOME/network/admin
    PATH=$ORACLE_HOME/bin:$ORACLE_BASE/dba_tool/bin:/bin:/usr/bin:/usr/ccs/bin:/etc:/usr/sbin:/usr/ucb:$HOME/bin:/usr/bin/X11:/sbin:/usr/lbin:/GNU/bin/make:/u01/app/oracle/dba_tool/bin:/home/oracle/utils/SCRIPTS:/usr/local/bin:.
    export TERM=vt100
    export ORACLE_BASE ORACLE_SID ORACLE_TERM ULIMIT
    export ORACLE_HOME
    export LIBPATH LD_LIBRARY_PATH ORA_NLS33
    export TNS_ADMIN
    export PATH
    export MAILTO=??    # your email here
    Note: use the env command in Unix to get your settings.
    There are probably ten other/better ways to do this, but this works.
    other options ( you decide )
    Configure RMAN to purge archivelogs after applied on standby [ID 728053.1]
    http://www.oracle.com/technetwork/database/features/availability/rman-dataguard-10g-wp-1-129486.pdf
    Maintenance Of Archivelogs On Standby Databases [ID 464668.1]
    Tip: I don't care myself, but in some of the other forums people seem to mind if you use all caps in the subject. They say it's shouting. My take is if somebody is shouting at me, I'm probably going to just move away.
    Best Regards
    mseberg
    Edited by: mseberg on May 8, 2012 11:53 AM
    Edited by: mseberg on May 8, 2012 11:56 AM

  • Retrieving Archived data

    Hi,
    As per the requirement, we have archived the data.
    1. Can you please let me know the best practices to get data back from archived files?
    2. Assuming we use PBS to get the data back, can a BO report read the archived data the same way BW can read it from the NLS server?
    Thanks in advance
    Annie

    Hi,
    go through the below link it may help you
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c0ded994-c520-2a10-9da7-bc92c9e9882d?quicklink=index&overridelayout=true
    Regards,
    Marasa.

  • Do we have some way to reduce Dimension table size using NLS?

    Initially we thought the cube archive in NLS would work like this:
    Replicate the cube in NLS with a DAP. DAP creation will create a new structure with the same star schema.
    When we load data to NLS, it is deleted from the DB,
    and hence the size of the DB is reduced.
    Associated dimensions will also be replicated in NLS.
    After the actual implementation we found:
    Fact table size is reducing, and DB size for the associated E/F fact tables is reducing.
    Dimension tables are still there in the DB, and are still taking space in the DB.
    Do we have some other way to reduce Dimension table size using NLS?
    When we analyze the top 100 tables in SAP BW, we see multiple dimension tables in that list.
    Thanks,
    Jaydip

    Hi Jaydip,
    Please check below link it might be helpful for you.
    http://www.informatik.uni-jena.de/dbis/lehre/ss2011/dbarch/SAP_BW_DB2_NLS_22062011.pdf

  • NLS in BI

    Hi,
    We are currently looking to put a bit of an ILCM (information life-cycle management) strategy in place for our BI system. We are on BW 7.0 and currently have no archiving in place. The main issue we would have when archiving is still being able to report on the data that is archived, and for this I believe NLS is the way to go. Now I am aware that with our current setup we would require a third-party add-on to help with this, but I was reading somewhere that this is no longer necessary with BW 7.3.
    So my question is, is this true? Or if not and we still require third party what are the recommendations for providers.
    All new to this, so any help is greatly appreciated.
    Scott


  • Script remove archives from standby after applied

    Hello,
    I am looking for a script to remove archivelog files from the standby flash recovery area after beeing applied.
    My database is 11.2.0.3.3 on Linux Redhat 5.
    Can someone kindly proovide me an example?
    Thank you

    RMAN does that for you :
    RMAN>CONFIGURE ARCHIVELOG DELETION POLICY TO APPLIED ON STANDBY;
    Then set the date to what you need and run as often as you need.
    http://docs.oracle.com/cd/E18283_01/server.112/e17022/rman.htm#BAJHHAEB
    #!/bin/bash
    # Name: db_rman_arch_standby.sh
    # Purpose: Delete archive from standby
    # Usage : db_rman_arch_standby <DBNAME>
    if [ "$1" ]
    then DBNAME=$1
    else
    echo "$(basename $0) : Syntax error : use db_rman_arch_standby <DBNAME> "
    exit 1
    fi
    . /u01/app/oracle/dba_tool/env/${DBNAME}.env
    echo ${DBNAME}
    MAILHEADER="Archive_cleanup_on_STANDBY_${DBNAME}"
    echo "Starting RMAN..."
    $ORACLE_HOME/bin/rman target / catalog rmancat/rmancat@rcatalog << EOF
    delete noprompt ARCHIVELOG UNTIL TIME 'SYSDATE-1';
    exit
    EOF
    echo `date`
    echo
    echo 'End of archive cleanup on STANDBY'
    mailx -s ${MAILHEADER} $MAILTO < /tmp/rmandbarchstandby.out
    . /u01/app/oracle/dba_tool/bin/rmemptyfolders.sh ${DBNAME}
    # End of Script
    Uses an ENV file (standby.env) for each database. Use the Linux env command to compare to your settings:
    ORACLE_BASE=/u01/app/oracle
    ULIMIT=unlimited
    ORACLE_SID=STANDBY
    ORACLE_HOME=$ORACLE_BASE/product/11.2.0.2
    ORA_NLS33=$ORACLE_HOME/ocommon/nls/admin/data
    LD_LIBRARY_PATH=$ORACLE_HOME/lib:/lib:/usr/lib
    LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
    LIBPATH=$LD_LIBRARY_PATH:/usr/lib
    TNS_ADMIN=$ORACLE_HOME/network/admin
    PATH=$ORACLE_HOME/bin:$ORACLE_BASE/dba_tool/bin:/bin:/usr/bin:/usr/ccs/bin:/etc:/usr/sbin:/usr/ucb:$HOME/bin:/usr/bin/X11:/sbin:/usr/lbin:/GNU/bin/make:/u01/app/oracle/dba_tool/bin:/home/oracle/utils/SCRIPTS:/usr/local/bin:.
    export TERM=vt100
    export ORACLE_BASE ORACLE_SID ORACLE_TERM ULIMIT
    export ORACLE_HOME
    export LIBPATH LD_LIBRARY_PATH ORA_NLS33
    export TNS_ADMIN
    export PATH
    Best Regards
    mseberg
    Edited by: mseberg on Oct 25, 2012 4:58 AM

  • BI NLS (Nearline Storage) interface with Documentum?

    Hi Experts,
    I attended a presentation by SAND Technologies at SAP TechEd 07 on BI Nearline Storage.  They mentioned BI NLS will interface with the archiving technology Open Text.  Does anyone know if NLS will also interface with the archiving provider EMC Documentum?
    Thanks for your help!

    I missed adding the following points.
    As per the following document (pg 20), it says "A typical third-party NLS solution is required":
    http://scn.sap.com/docs/DOC-39944
    Is there anything else that lies between the DB2 software and our BI system which needs to be implemented with the help of IBM or any other certified partner?

  • BW on HANA, Archive data to Hadoop

    Dear All,
    We are planning to start a poc for one of our clients. Below is the scenario.
    use BW on HANA for real time analytics and Hadoop as a cold storage.
    Archive historical data to HAdoop
    report on HANA and Hadoop.
    Access Hadoop data using SDA
    I request you to provide implementation steps if somebody have worked on similar scenario.
    Thanks & Regards,
    Rajeev Bikkani

    Hi Rajeev Bikkani,
    Currently NLS using Hadoop is not available by default, and SAP highly recommends IQ for NLS. If you opt for IQ, in the long run it will be easy to maintain and scale up, and query performance will be better, so it will yield a better ROI. Hadoop will initially be cost-effective, but the amount of time spent getting the solution working will be challenging, and maintaining and scaling it up later will also be very challenging. SAP positions Hadoop, alongside HANA, for handling big data (mainly unstructured data), not for NLS. So please reconsider your option.
    I went through the link and I don't agree with the point "Archiving was not an option, as they needed to report on all data for trending analysis. NLS required updating the info providers and bex queries to access NLS data. SAP offers Sybase IQ as a NLS option but it doesn't come cheap with a typical cost well over a quarter million dollars."
    Because you can query the archived data, and it does not need to be written back to the providers; at runtime the data is read from NLS and displayed.
    If you still want to use Hadoop as NLS, I can suggest a process, but I have not tried it personally.
    1) Extract data selectively from your InfoProvider via OHD and keep it in the OHD table.
    2) Write the data from the OHD table to an HBase table (check the link below for how to do it).
    3) Delete the data from the OHD table.
    4) Whatever data was moved to the OHD table should be deleted from the InfoProvider by selective deletion.
    5) Now connect HANA to Hadoop via SDA and virtualize the table to which we have written the data.
    6) Then build a view on top of this table and query on it.
    7) HANA-view historical data can be combined with BW provider data via Open ODS, CompositeProvider, or Transient Provider.
    Reading and Writing to HADOOP HBASE with HANA XS
    http://scn.sap.com/community/developer-center/hana/blog/2014/06/03/reading-and-writing-to-hadoop-hbase-with-hana-xs
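    Step 2 of the process above (writing the OHD table contents to HBase) could be sketched in Python, for example with the happybase client. This is only an illustrative sketch, not a tested solution: the table name, column family, and field names are assumptions, and writing to HBase this way requires a running HBase Thrift server.

    ```python
    def to_hbase_row(record, key_field="DOC_NUMBER", column_family="cf"):
        """Convert one OHD record (a dict of field -> value) into the row key
        and column-family-qualified cell dict that happybase's Table.put expects."""
        row_key = str(record[key_field]).encode("utf-8")
        cells = {
            f"{column_family}:{field}".encode("utf-8"): str(value).encode("utf-8")
            for field, value in record.items()
            if field != key_field
        }
        return row_key, cells


    def write_ohd_rows(records, host="hadoop-host", table_name="bw_archive"):
        """Push OHD rows into HBase in one batch. Only call this against a
        running HBase Thrift server; host and table_name are assumed names."""
        import happybase  # third-party HBase client, one possible choice

        connection = happybase.Connection(host)
        table = connection.table(table_name)
        with table.batch() as batch:  # batched puts, sent on context exit
            for record in records:
                row_key, cells = to_hbase_row(record)
                batch.put(row_key, cells)
        connection.close()


    # Offline demonstration of the row conversion only (no HBase needed):
    sample = {"DOC_NUMBER": "4711", "CALMONTH": "201012", "AMOUNT": "99.50"}
    key, cells = to_hbase_row(sample)
    print(key)            # b'4711'
    print(sorted(cells))  # [b'cf:AMOUNT', b'cf:CALMONTH']
    ```

    The design point is that the row key comes from the InfoProvider's document key, so a later selective re-extraction of the same records overwrites rather than duplicates the rows in HBase.
    
    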
    Hope this helps..
    Thanks & Regards
    A.Dinesh
