SAP LT RS capabilities

Hi Experts,
Can anyone help me with the points below:
1. Can we send/replicate tables through the LT Replication Server from SAP to non-SAP systems (e.g. an SQL database)?
2. Is it possible to transform the data before it is sent from SAP to HANA?
Thanks in advance
Brillion

Hi Tobias,
Many thanks for your response.
1. Do you have any contacts or documentation for this?
2. Is the data transformation done from SLT or through the LT Replication Server?
Cheers
Brillion

Similar Messages

  • Questions about SAP's SRM capabilities

    SRM Experts, I have two questions:
    In SAP, is it possible:
    1. To have "Amount Only Requisitions", that is, specify requisitions that enable requesters to order goods and services specifying only a dollar amount and not quantity.
    2. Reopen Closed Requisitions and Purchase Orders, that is,  re-open ANY previously closed requisition or purchase order, not only those from the last batch of the Close
    Reconciliation process.
    Thanks!

    Ok let me try...
    1) One of the key building blocks will be SAP NetWeaver Process Integration 7.1. It provides the ES Repository for service metadata and the ES Registry for the service endpoints. You use the Registry for finding and classifying your services. And yes, XI/PI could be used as an ESB.
    Another key building block is the SAP NetWeaver Composition Environment 7.1. This one is used at design time and runtime for composite processes, composite views, and composite applications which consume enterprise services.
    2) WS-Policy, WS-Addressing and BPEL are definitely supported with PI 7.1 (release planned for September 2007).
    3) This is part of the NetWeaver Composition Environment. For monitoring you need brokered service communication... in this case you can use PI as the integration broker, which is able to monitor your communication.
    4) Executable business processes (BPEL) are supported by PI 7.1.
    You are able to design these processes with the design tools of PI.
    For high-level process modeling, ARIS for SAP NetWeaver is integrated.
    So the drill-down from high-level process models to the service operations is all part of the Enterprise Services Repository and can be used to realize your business tasks.
    5) SAP provides the ES Workplace and the SAP Discovery System for Enterprise SOA.
    regards,
    Robin

  • SAP INBOX FEATURES/CAPABILITIES.

    Hi,
    Please send me some links covering the general SAP inbox features, and SAP inbox features from a workflow perspective.
    Regards,
    Sukumar.

    I was able to handle it, thanks to SAP-supplied standard docs...

  • SAP charting capabilities (ABAP)

    Hi everybody,
    I did some research into SAP's charting capabilities and I am a little confused about the different ways charts can be created.
    One key feature I need in my charting application is event handling, for example when a user clicks on a pie wedge.
    Right now I see three different ways of creating charts in ABAP
    - Using the GRAPH_... function modules for example GRAPH_2D
    - Using the Graphical Framework as in GFW_PROG_TUTORIAL
    - Using the CL_GUI_CHART_ENGINE class
    Is there any major difference between these three? I know that the GFW and CL_GUI_CHART_ENGINE support event handling. Do the GRAPH_... function modules support it too?
    BTW, is it possible to use the xml file created in Chart Designer in one of the aforementioned solutions?
    Thanks everybody,
    Mane

    This code snippet shows how to use function module GFW_PRES_SHOW. It assumes the data has already been gathered into internal table IT_GRA (fields ZCONO and CNT) and that CONTC is a custom container on the screen.
    *__Declarations (GPRVAL and GPRTXT are the standard GFW interface structures)
    DATA: values       TYPE gprval OCCURS 0 WITH HEADER LINE,
          column_texts TYPE gprtxt OCCURS 0 WITH HEADER LINE,
          lv_cnt(2),
          lv_text(30).
    FIELD-SYMBOLS: <fs> TYPE ANY.
    *__Make data format
    REFRESH: values, column_texts.
    SORT it_gra BY zcono.
    *__Row definition: one row, one VALn component per entry (maximum 32)
    LOOP AT it_gra.
      lv_cnt = sy-tabix.
      CONCATENATE 'VALUES-VAL' lv_cnt INTO lv_text.
      CONDENSE lv_text NO-GAPS.      "NO-GAPS so e.g. 'VALUES-VAL1' contains no space
      ASSIGN (lv_text) TO <fs>.
      <fs> = it_gra-cnt.
    ENDLOOP.
    *__Row text (legend text for the data row)
    values-rowtxt = 'Count'.         "any legend text
    APPEND values.
    *__Column definition
    LOOP AT it_gra.
      column_texts-coltxt = it_gra-zcono.
      APPEND column_texts.
    ENDLOOP.
    *__Call graph function
    CALL FUNCTION 'GFW_PRES_SHOW'
         EXPORTING
           presentation_type = 31        "graph type 31 = pie chart, 8 or 1 = bar chart
           parent = contc                "custom container instance
         TABLES
           values = values               "row data (maximum 32 columns)
           column_texts = column_texts   "column texts
         EXCEPTIONS
           error_occurred = 1
           OTHERS = 2.
    IF sy-subrc <> 0.
      MESSAGE i001(00) WITH 'Error displaying chart'.
    ENDIF.

  • SAP and SQL Reporting Services.

    Hi,
    I'm trying to implement a few reports using MS SQL Reporting Services.  I don't think that SAP (we are currently on 4.6C, and BW is not an option for now) has anything nearly as cool.  The problem is that SQL RS can only create queries against ODBC and OLE DB providers.  One solution would be to create a view in the SAP database and query against it, but the logic behind it is just too complicated for a view that you can create in SAP.  Is there another way that I can interface SAP in real time with SQL RS?  Has anyone had experience with that, or is there a solution from SAP with similar capabilities?
    Thanks,
    Leon

    Hi Leon,
    maybe you can use the SAP .NET Connector and save the results in SQL Server tables, or use the OLE DB Provider for SAP from NSX Software.
    Greetings,
    Andreas Rohr

  • Different Capabilities between BW 3.0 and 3.5

    Hi Everyone,
    I am currently gathering user requirements for BW FICO.
    There's a user requirement to edit/change data that has already been updated into the InfoProviders. Have any of you implemented such a requirement before? How can we edit the data once it has been updated into the InfoCube?
    My BW system is currently BW 3.0B.
    I have been tasked to check whether other versions, like BW 3.5, are able to do direct maintenance of InfoCube data.
    But I can't find any documentation on the new capabilities of 3.5. Does any of you have a link to such documentation?
    Please help.
    Thanks.
    Regards,
    Shunhui.

    Hi,
    Go through the documentation below:
    http://help.sap.com/saphelp_nw04/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm
    Assess where your SAP BW-based reporting activities are optimized and where they are underperforming. Take home strategies to bolster the ease, quality and performance of BW navigation, drill-down, formatting, printing, analytics, and dashboards.
    The Web Application Designer delivers powerful layout and formatting capabilities, including drill-down analytics, which are critical for solutions such as SAP SEM. Yet there are common misconceptions about this tool's capabilities and skill set requirements that often lead to improper or under-utilization.
    Whether you've upgraded to SAP BW 3.5 or are planning a future upgrade, this session provides invaluable instruction for how to leverage the new SAP Information Broadcaster to better schedule and execute the delivery of SAP BW reports. Get real-world advice on how companies are already deriving value from this important tool.
    Formatting and printing reports is a major challenge. Explore and fully understand the requirements, benefits, and trade-offs of SAP BW and third-party tools available to help you meet your formatting requirements.
    Now that you understand your options, proceed step-by-step through examples to create attractive, formatted reports using the SAP BW tools you've already invested in. This session draws on real-world examples of how the world's leading SAP BW users successfully create formatted reports.
    Often considered the exclusive domain of BW administrators and the IT community, the SAP BW Reporting Agent can be readily leveraged by knowledgeable power users to schedule and automate the running of pre-calculated, offline reports. This session tells you how and when it's appropriate for power users to pre-schedule reports.
    Regardless of your level of experience with SAP BW, this session offers a wealth of practical advice for how to fully leverage the power of SAP BW ad hoc query functionality. Gain insights into how to train and trust your power users and still maintain tight control over security and system performance.
    Get practical advice to simplify and customize the creation of financial reports in SAP BW, and ensure that these reports are timely, accurate, and compliant with regulatory guidelines and standards.
    See how advanced techniques enable you to go beyond standard SAP BW reporting capabilities and extract greater value from your SAP BW system. Find out how to ask the right questions and build requirements so that you get what you need from your technical team.
    SAP BW 3.5 and beyond introduce new features and tools to expand your reporting options and capabilities. Whether you've already upgraded to SAP BW 3.5 or are considering a future upgrade, you'll need to understand the implications these new developments have on your reporting environment.
    Hope this helps
    DST

  • Authobject for SAP collaboration assignment block in Solman 7.1

    Hi Guru,
    I am trying to find the authorisation object for edit option in SAP collaboration assignment block in Solman 7.1.
    The edit button is disabled.
    I could not find it through trace as it was greyed out.
    Could you please suggest which authorisation object is used for this?
    Regards,
    Pooja

    Hi,
    Please see if this can help you ...
    The Request for Change in Solution Manager 7.1 has a new screen area to define the scope. This new area is the scope assignment block.
    The Request for Change in Solution Manager 7.1 offers, inside the scope assignment block, the possibility to create several follow-up documents, also of different types. The Subject field is obsolete in Solman 7.1; it was replaced by the change category of the scope assignment block.
    To define which change categories are valid, you can perform the following customizing:
    SAP Solution Manager Implementation Guide - SAP Solution Manager - Capabilities (Optional) - Change Management - Standard Configuration - Change Request Management Framework - Make Settings for Change Transaction Types
    Choose SMCR Request for Change (or your corresponding customer-specific request for change) and then double-click "Copy Control Rules" in the left navigation area.
    Here you can define which follow-up documents (also your own Z and Y types) you want to allow in the scope as change category, as well as which information of the request for change shall be copied automatically, e.g. priority, text, date, attachments, context.
    Let me know if you need any further clarifications...
    Thanks,
    Ravi

  • Tracking Treasury Stock in SAP?

    Hello!
    I am wondering if SAP has any capabilities to track Treasury stock purchases and issuances for stock options and employee stock purchase plan. 
    Any thoughts on this would be greatly appreciated!
    Thank you,
    Diane

    dwrobertson,
    There are at least two issues you should know about regarding integration with SAP:
    - Captivate requires a change to a variable in the HTM file after you publish (though you could make the same change to the SCORM.htm template file, so you make the change once and every time you publish it uses the updated template).
    - SAP doesn't use the score from Captivate. Scoring in SAP is based on the completion of all of the SCOs in a course. Since Captivate creates a course with a single SCO, when a learner completes your SCO (your Captivate file) they probably see a score of 100.
    The change to the HTM file needs to be to the variable "g_intAPIOrder". Change it from a 0 to a 1.
    More specific information can be found here (though there isn't any reference to SAP):
    http://www.adobe.com/devnet/captivate/articles/output_scorm_04.html#scorm_api
    Let me know if this works/helps.
    Regards,
    Andrew

  • Customer attributes in SAP Solution Manager

    Hi,
    I have created Customer Attributes in SAP Solution Manager, SPRO--> SAP Solution Manager-->Capabilities-->Implementation/Upgrade-->Blueprint and Configuration-->Object Attributes--> Definition of Customer Attributes for Object Types. I have assigned the Customer Attributes for Project/Solution.
    Now I have the attribute in the project nodes but I can't enter Attribute Values. How can I solve this problem?
    Regards.

    Hi,
    you need to assign the value in SOLAR02 when you assign attributes to the object. Check out:
    Adding customer attributes to objects in Project and Solution
    Thanks
    Jansi

  • CRM 7.0 EHP1 and FINBASIS

    Gurus,
    I hope this question is posted in the correct forum...if not, let me know where it ought to go.
    We are currently running CRM7.0 with NW7.01 dual stack with these components:
    SAP_ABA     701     0007     SAPKA70107
    SAP_BASIS     701     0007     SAPKB70107
    PI_BASIS     701     0007     SAPK-70107INPIBASIS
    ST-PI     2008_1_700     0004     SAPKITLRD4
    SAP_BS_FND     701     0008     SAPK-70108INSAPBSFND
    SAP_BW     701     0007     SAPKW70107
    SAP_AP     700     0021     SAPKNA7021
    WEBCUIF     700     0008     SAPK-70008INWEBCUIF
    BBPCRM     700     0008     SAPKU70008
    ST-A/PI     01N_700CRM     0000          -
    Now, we'd like to move to CRM 7.0 EHP1 which includes NW7.02
    When I log in to the Maintenance Optimizer to pull down the entire list of patches, I'm also prompted to install a completely new component called FINBASIS.  I'm familiar with this component from the ECC 6.0 side, but never from CRM.
    I'm wondering whether I must install this, or whether it is optional, when I move to CRM 7.0 EHP1.
    Does anyone have any experience with what I am describing?
    --NICK

    I found the master guide and it states this:
    Finbasis for CRM
    Finbasis for CRM is an optional SAP add-on that can be installed on top of either SAP ERP or SAP CRM. This add-on includes the Financial Supply Chain Management (FSCM) applications SAP Dispute Management and SAP Collections Management, which extend the SAP ERP Financials capabilities.
    These applications add extra functionality to the Shared Service Center business scenario available with the SAP CRM Interaction Center. To use these applications, you can simply deploy this software unit on top of your SAP CRM system, without having to upgrade your SAP ERP system to EHP5.
    So since we already have an ECC system that is moving to EHP5, I don't think we'll add this component to CRM.

  • Test Package in Test Management tab of Change Document

    Hi Expert,
    When we test CHARM, we noticed that the test package in the Test Management tab can only be attached before the developer changes the status to "Pass normal change to test". Is this standard, or is there any configuration that allows the test package to be attached during the "Test for Preliminary Input" status?
    Thanks
    Best Regards
    Remy

    Hi,
    I managed to solve this issue as below:
    Go to SPRO
    >SAP Solution Manager Implementation Guide
    >SAP Solution Manager
    >Capabilities (Optional)
    >Change Control Management
    >Standard Configuration
    >Change Request Management Framework
    >Adjust UI objects by User Status
    Thanks
    Best Regards
    Remy

  • Line items in an Invoice

    Hi,
    I understand that standard SAP SD capabilities do not allow more than 9999 line items in an invoice/sales order. I was wondering if it is possible to have around 50,000 line items in a single invoice. What is the maximum number of line items allowed in a single invoice in the most recent version, ECC 6.0?
    In our project, each of the client's invoices would have line items running into five digits. I would appreciate any suggestions on how to tackle this problem. Is it possible to customize this requirement in SD, or would it be advisable to use some other billing engine as a solution?
    Thanks in Advance,
    Saikiran

    Hi,
        try using a BAPI for your purpose:
        BAPI_SALESORDER_CREATEFROMDATA
    Regards
    amole
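    As a rough sketch of amole's suggestion: a report fills the item table and calls the BAPI. The structures (BAPISDHEAD, BAPIITEMIN, BAPIPARTNR) are the classic interface of this function module, but the concrete values below (order type, organizational data, partner number, material) are placeholders; verify the interface in SE37 for your release.

```abap
* Sketch: create a sales order via BAPI_SALESORDER_CREATEFROMDATA.
* All literal values are illustrative placeholders.
DATA: ls_header   TYPE bapisdhead,
      lt_items    TYPE STANDARD TABLE OF bapiitemin WITH HEADER LINE,
      lt_partners TYPE STANDARD TABLE OF bapipartnr WITH HEADER LINE,
      ls_return   TYPE bapireturn1.

ls_header-doc_type   = 'TA'.          "standard order (placeholder)
ls_header-sales_org  = '1000'.
ls_header-distr_chan = '10'.
ls_header-division   = '00'.

lt_partners-partn_role = 'AG'.        "sold-to party
lt_partners-partn_numb = '0000001000'.
APPEND lt_partners.

lt_items-material = 'MAT-001'.        "placeholder material
lt_items-req_qty  = 10.
APPEND lt_items.

CALL FUNCTION 'BAPI_SALESORDER_CREATEFROMDATA'
  EXPORTING
    order_header_in = ls_header
  IMPORTING
    return          = ls_return
  TABLES
    order_items_in  = lt_items
    order_partners  = lt_partners.

IF ls_return-type CA 'EA'.            "error or abort
  WRITE: / 'Error:', ls_return-message.
ELSE.
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
ENDIF.
```

    Note that a BAPI does not by itself lift the item-number limit of a single document; it only automates creation, e.g. for splitting the data into several smaller orders.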

  • Urgent change with unlock tasklist

    Hello everyone,
    the problem occurred for our customer with customizing an urgent change. They want to import transport requests manually through the task list, but if the urgent change is in status "Authorized for Production", the task list is still locked.
    I can release the transport only through the action "Import Urgent Change into Production System".
    I think it has something to do with SAP Solution Manager > Capabilities (Optional) > Change Request Management > Make Settings for Change Transaction Types.  We tried many options with the action, but without success.
    Could somebody help me?
    Best regards
    Jan Strakoš

    Hi Luigi,
    You can always import via the task list of the project linked to the urgent change (UC), but that imports the whole project buffer in the queue.
    What is the customer's reason for wanting to use only the task list and not the actions? And what is the problem with using the actions?
    An unlocked task list for the UC will not give any benefit regarding the UC transport import or release.
    It is easier to switch UC statuses via the actions, and the import is then done automatically.
    Otherwise you need to go to the task list and press additional buttons that do the same thing.
    Rg Dan

  • Interview help

    Hello everybody,
    I am having an interview for a BI/BW support consultant role. The interview specs consist of data management techniques, improving and maintaining SAP BI monitoring capabilities, solutions to support issues, an understanding of the BCC SAP solution and how its BW/BI configuration supports the business, and knowledge of WAD. Please send me the expected questions and answers; I am also searching SDN using these specs.
    Regards
    Priya

    Hi Priya,
    Here are some Q&A.
    Normally the production support activities include
    Scheduling
    R/3 Job Monitoring
    B/W Job Monitoring
    Taking corrective action for failed data loads.
    Working on some tickets with small changes in reports or in AWB objects.
    The activities in a typical Production Support would be as follows:
    1. Data Loading - could be using process chains or manual loads.
    2. Resolving urgent user issues - helpline activities
    3. Modifying BW reports as per the need of the user.
    4. Creating aggregates in Prod system
    5. Regression testing when version/patch upgrade is done.
    6. Creating ad hoc hierarchies.
    We can perform these daily activities in production:
    1. Monitoring data load failures through RSMO
    2. Monitoring process chains daily/weekly/monthly
    3. Performing the change run for hierarchies
    4. Checking aggregate rollup
    To add to the above:
    1) Check that the data targets are ready for reporting.
    2) Check that there are no failed or cancelled jobs in SM37 and the BW monitor.
    3) Check that all requests are loaded for the day (also monthly and yearly loads).
    4) Note the time taken to load critical InfoCubes that are used for reporting.
    5) Check whether there is any break in the schedules of your process chains.
    Why are there frequent load failures during extractions, and how do you analyse them?
    If the failures are related to data, there might be data inconsistency in the source system, even though we handle it properly in the transfer rules. We can monitor these issues in RSMO, correct the failed records in the PSA, and update.
    If we are talking about the whole extraction process, there might be issues with work-process scheduling and IDoc transfer to the target system from the source system. These can be re-initiated by cancelling the specific data load (usually by changing the request colour from yellow to red in RSMO) and restarting the extraction.
    What are the daily tasks we do in production support? How often do we extract data, and at what times?
    It depends. Data load timings are in the range of 30 minutes to 8 hours. The time depends on the number of records and the kind of transfer rules you have provided. If the transfer rules involve roundabout logic and the update rules have calculations for customized key figures, long runtimes are to be expected.
    Usually you need to work in RSMO, see which records are failing, and update them from the PSA.
    What are some of the frequent failures and errors?
    As for frequent failures and errors, there is no single fixed reason for a load to fail. From an interview perspective, I would answer it this way:
    a) Loads can fail due to invalid characters
    b) Because of a deadlock in the system
    c) Because of a previous load failure, if the load is dependent on other loads
    d) Because of erroneous records
    e) Because of RFC connection problems
    These are some of the reasons for load failures.
    For RFC connections, we use SM59 to create RFC destinations.
    Some questions:
    1) RFC connection lost.
    A) Check it in SM59: expand "R/3 connections", open the destination for your R/3 client, and choose Test Connection from the menu.
    2) Invalid characters while loading.
    A) Change them in the PSA and load them again.
    3) ALEREMOTE user is locked.
    A) Ask your Basis team to release the user (it is mostly ALEREMOTE). Common causes are a changed password or too many incorrect attempts to log in as ALEREMOTE. Use SM12 to find out whether there are any locks.
    4) Lower case letters not allowed.
    A) Uncheck the lower case letters checkbox under the "General" tab in the InfoObject.
    5) While loading the data I am getting a message that 'Record
    A) The field mentioned in the error message is not mapped to any InfoObject in the transfer rules.
    6) Object locked.
    A) It might be locked by some other process or a user. Also check authorizations.
    7) "Non-updated IDocs found in source system".
    8) While loading master data, one of the data packages has a red-light error message:
    Master data/text of characteristic ZCUSTSAL already deleted.
    9) Extraction job aborted in R/3.
    A) It might have been cancelled for running longer than expected, or cancelled by R/3 users if it was hampering performance.
    10) Request couldn't be activated because there is another request in the PSA with a smaller SID.
    A)
    11) Repeat of last delta not possible.
    12) DataSource not replicated.
    A) Replicate the DataSource from R/3 through the source system in the AWB, assign it to the InfoSource, and activate it again.
    13) DataSource/transfer structure not active.
    A) Use function module RS_TRANSTRU_ACTIVATE_ALL to activate it.
    14) ODS activation error.
    A) ODS activation errors can occur mainly for the following reasons:
    1. Invalid characters (characters such as #)
    2. Invalid data values for units/currencies etc.
    3. Invalid values for the data types of characteristics and key figures.
    4. Errors generating SID values for some data.
    15) Conversion routine error.
    A) Check the data format in the source.
    16) Object cannot be activated, or error when activating an object.
    A) Check the consistency of the object.
    17) No data found (in a query).
    A) Check whether the InfoProvider contains data, and delete any unsuccessful requests.
    18) Error generating or activating update rules.
    1. What are the extractor types?
    • Application Specific
    o BW Content FI, HR, CO, SAP CRM, LO Cockpit
    o Customer-Generated Extractors
    LIS, FI-SL, CO-PA
    • Cross Application (Generic Extractors)
    o DB View, InfoSet, Function Module
    2. What are the steps involved in LO Extraction?
    • The steps are:
    o RSA5 Select the DataSources
    o LBWE Maintain DataSources and Activate Extract Structures
    o LBWG Delete Setup Tables
    o OLI*BW Fill setup tables
    o RSA3 Check extraction and the data in Setup tables
    o LBWQ Check the extraction queue
    o LBWF Log for LO Extract Structures
    o RSA7 BW Delta Queue Monitor
    3. How to create a connection with LIS InfoStructures?
    • LBW0 Connecting LIS InfoStructures to BW
    4. What is the difference between ODS and InfoCube and MultiProvider?
    • ODS: Provides granular data, allows overwrite and data is in transparent tables, ideal for drilldown and RRI.
    • CUBE: Follows the star schema, we can only append data, ideal for primary reporting.
    • MultiProvider: Does not have physical data. It allows to access data from different InfoProviders (Cube, ODS, InfoObject). It is also preferred for reporting.
    5. What are Start routines, Transfer routines and Update routines?
    • Start Routines: The start routine is run for each DataPackage after the data has been written to the PSA and before the transfer rules have been executed. It allows complex computations for a key figure or a characteristic. It has no return value. Its purpose is to execute preliminary calculations and to store them in global DataStructures. This structure or table can be accessed in the other routines. The entire DataPackage in the transfer structure format is used as a parameter for the routine.
    • Transfer / Update Routines: They are defined at the InfoObject level. It is like the Start Routine. It is independent of the DataSource. We can use this to define Global Data and Global Checks.
    6. What is the difference between start routine and update routine, when, how and why are they called?
    • Start routine can be used to access InfoPackage while update routines are used while updating the Data Targets.
    7. What is the table that is used in start routines?
    • Always the table structure will be the structure of an ODS or InfoCube. For example if it is an ODS then active table structure will be the table.
    8. Explain how you used Start routines in your project?
    • Start routines are used for mass processing of records. In the start routine, all the records of the DataPackage are available for processing, so we can process all these records together. In one scenario, we wanted to apply size percentages to forecast data. For example, if material M1 is forecast at 100 for May, then after applying the size percentages (Small 20%, Medium 40%, Large 20%, Extra Large 20%) we want 4 records against the one record coming in the InfoPackage. This is achieved in the start routine.
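    The size-percentage split described above can be sketched as a 3.x start-routine body. DATA_PACKAGE is the routine's standard inbound table; the SIZE and QUANTITY fields and the percentage table are assumptions for illustration, not fields of any real extract structure.

```abap
* Illustrative start-routine body: explode each forecast record into
* four size records. Field names SIZE and QUANTITY are assumed.
TYPES: BEGIN OF ty_size,
         size TYPE c LENGTH 2,
         pct  TYPE p DECIMALS 2,
       END OF ty_size.

DATA: lt_out  LIKE DATA_PACKAGE OCCURS 0 WITH HEADER LINE,
      lt_size TYPE STANDARD TABLE OF ty_size WITH HEADER LINE.

lt_size-size = 'S'.  lt_size-pct = '0.20'. APPEND lt_size.
lt_size-size = 'M'.  lt_size-pct = '0.40'. APPEND lt_size.
lt_size-size = 'L'.  lt_size-pct = '0.20'. APPEND lt_size.
lt_size-size = 'XL'. lt_size-pct = '0.20'. APPEND lt_size.

LOOP AT DATA_PACKAGE.
  LOOP AT lt_size.
    lt_out = DATA_PACKAGE.
    lt_out-size     = lt_size-size.
    lt_out-quantity = DATA_PACKAGE-quantity * lt_size-pct.
    APPEND lt_out.
  ENDLOOP.
ENDLOOP.

* Replace the inbound package with the exploded records.
DATA_PACKAGE[] = lt_out[].
```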
    9. What are Return Tables?
    • When we want to return multiple records, instead of single value, we use the return table in the Update Routine. Example: If we have total telephone expense for a Cost Center, using a return table we can get expense per employee.
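    The telephone-expense example can be sketched as an update routine with a return table. COMM_STRUCTURE, RESULT_TABLE and RETURNCODE are the standard parameters of a 3.x return-table update routine; the employee lookup table ZEMP_CCTR and all field names are assumptions for this sketch.

```abap
* Illustrative return-table routine: distribute a cost center's total
* telephone expense evenly across its employees. The lookup table
* ZEMP_CCTR and all field names are assumed.
TYPES: BEGIN OF ty_emp,
         employee TYPE n LENGTH 8,
       END OF ty_emp.

DATA: lt_emp    TYPE STANDARD TABLE OF ty_emp WITH HEADER LINE,
      ls_result LIKE LINE OF RESULT_TABLE,
      lv_lines  TYPE i.

SELECT employee FROM zemp_cctr INTO TABLE lt_emp
  WHERE costcenter = COMM_STRUCTURE-costcenter.

DESCRIBE TABLE lt_emp LINES lv_lines.
CHECK lv_lines > 0.

LOOP AT lt_emp.
  CLEAR ls_result.
  ls_result-employee = lt_emp-employee.
  ls_result-expense  = COMM_STRUCTURE-expense / lv_lines.
  APPEND ls_result TO RESULT_TABLE.
ENDLOOP.

RETURNCODE = 0.
```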
    10. How do start routine and return table synchronize with each other?
    • The return table is used to return the values after the start routine has executed.
    11. What is the difference between V1, V2 and V3 updates?
    • V1 Update: It is a Synchronous update. Here the Statistics update is carried out at the same time as the document update (in the application tables).
    • V2 Update: It is an Asynchronous update. Statistics update and the Document update take place as different tasks.
    o V1 & V2 don’t need scheduling.
    • Serialized V3 Update: The V3 collective update must be scheduled as a job (via LBWE). Here, document data is collected in the order it was created and transferred into the BW as a batch job. The transfer sequence may not be the same as the order in which the data was created in all scenarios. V3 update only processes the update data that is successfully processed with the V2 update.
    12. What is compression?
    • It is the process of moving data from the F fact table to the E fact table; the request IDs are deleted in the process, which saves space and speeds up queries.
    13. What is Rollup?
    • This is used to load new DataPackages (requests) into the InfoCube aggregates. If we have not performed a rollup then the new InfoCube data will not be available while reporting on the aggregate.
    14. What is table partitioning and what are the benefits of partitioning in an InfoCube?
    • It is the method of dividing a table to enable quick reference. SAP uses fact table partitioning to improve performance. We can partition only on 0CALMONTH or 0FISCPER. Table partitioning helps reports run faster, as data is read only from the relevant partitions, and table maintenance becomes easier. Oracle, Informix and IBM DB2/390 support table partitioning, while SAP DB, Microsoft SQL Server and IBM DB2/400 do not.
    15. How many extra partitions are created and why?
    • Two extra partitions are created: one for dates before the begin date and one for dates after the end date.
    16. What are the options available in transfer rule?
    • InfoObject
    • Constant
    • Routine
    • Formula
    17. How would you optimize the dimensions?
    • We should define as many dimensions as possible and we have to take care that no single dimension crosses more than 20% of the fact table size.
    18. What are Conversion Routines for units and currencies in the update rule?
    • Using this option we can write ABAP code for Units / Currencies conversion. If we enable this flag then unit of Key Figure appears in the ABAP code as an additional parameter. For example, we can convert units in Pounds to Kilos.
    19. Can an InfoObject be an InfoProvider, how and why?
    • Yes, when we want to report on Characteristics or Master Data. We have to right click on the InfoArea and select “Insert characteristic as data target”. For example, we can make 0CUSTOMER as an InfoProvider and report on it.
    20. What is Open Hub Service?
    • The Open Hub Service enables us to distribute data from an SAP BW system into external Data Marts, analytical applications, and other applications. We can ensure controlled distribution using several systems. The central object for exporting data is the InfoSpoke. We can define the source and the target object for the data. BW becomes a hub of an enterprise data warehouse. The distribution of data becomes clear through central monitoring from the distribution status in the BW system.
    21. How do you transform Open Hub Data?
    • Using BADI we can transform Open Hub Data according to the destination requirement.
    22. What is ODS?
    • Operational DataSource is used for detailed storage of data. We can overwrite data in the ODS. The data is stored in transparent tables.
    23. What are BW Statistics and what is its use?
    • They are group of Business Content InfoCubes which are used to measure performance for Query and Load Monitoring. It also shows the usage of aggregates, OLAP and Warehouse management.
    24. What are the steps to extract data from R/3?
    • Replicate DataSources
    • Assign InfoSources
    • Maintain Communication Structure and Transfer rules
    • Create an InfoPackage
    • Load Data
    25. What are the delta options available when you load from flat file?
    • The 3 options for Delta Management with Flat Files:
    o Full Upload
    o New Status for Changed records (ODS Object only)
    o Additive Delta (ODS Object & InfoCube)
    SAP BW Interview Questions 2
    1) What is a process chain? How many types are there? How many do we use in real-time scenarios? Can we define interdependent processes, with tasks like data loading, cube compression, index maintenance, and master data & ODS activation, with the best possible performance and data integrity?
    2) What is data integrity, and how can we achieve it?
    3) What is index maintenance, and what is the purpose of using it in real time?
    4) When and why do we use InfoCube compression in real time?
    5) What is meant by data modelling, and what does the consultant do in data modelling?
    6) How can we enhance Business Content, and for what purpose do we enhance it (given that we can simply activate Business Content)?
    7) What is fine-tuning, how many types are there, and for what purpose do we tune in real time? Can tuning only be done with InfoCube partitions and aggregates, or by other means too?
    8) What is meant by MultiProvider, and for what purpose do we use MultiProviders?
    9) What are scheduled and monitored data loads, and for what purpose?
    Ans # 1:
    Process chains exist in the Administrator Workbench. Using them we can automate ETL processes; they allow BW administrators to schedule all activities and monitor them (transaction code: RSPC).
    PROCESS CHAIN - Before defining a PROCESS CHAIN, let us define a PROCESS: a procedure, either within SAP or external to it, with a start and an end. This process runs in the background.
    A PROCESS CHAIN is a set of such processes linked together in a chain. In other words, each process is dependent on the previous process, and the dependencies are clearly defined in the process chain.
    This is normally done in order to automate a job or task that has to execute more than one process in order to complete the job or task.
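    The dependency idea can be sketched as follows (a conceptual illustration only, not SAP code; the step names are made up, and real chains are maintained in transaction RSPC): steps execute in order, and a failed step stops the chain.

```python
# Minimal sketch of process-chain dependency: steps run in order and a
# failed step stops the chain. Step names are illustrative only.

def run_chain(steps):
    executed = []
    for name, step in steps:
        if not step():            # process failed -> chain stops here
            return executed, name
        executed.append(name)
    return executed, None         # the whole chain succeeded

chain = [("load_psa", lambda: True),
         ("activate_ods", lambda: True),
         ("rollup", lambda: True)]
```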
    1. Check the source system for that particular PC.
    2. Select the request ID (it will be in the Header tab) of the PC.
    3. Go to SM37 of the source system.
    4. Double-click on the job.
    5. You will navigate to a screen.
    6. In that screen, click the "Job Details" button.
    7. A small pop-up window appears.
    8. In the pop-up screen, take a note of:
    a) Executing server
    b) WP Number/PID
    9. Open a new SM37 session (/OSM37).
    10. In it, click the "Application Servers" button.
    11. You can see the different application servers.
    12. Go to the executing server (point 8a) and double-click.
    13. Go to the PID (point 8b).
    14. On the left-most side you can see a check box.
    15. Check the check box.
    16. On the menu bar you can see "Process".
    17. Under "Process" you have the option "Cancel with Core".
    18. Click on that option. * -- Ramkumar K
    Ans # 2:
    Data integrity is about eliminating duplicate entries in the database and achieving normalization.
    Ans # 4:
    InfoCube compression collapses the request dimension, eliminating duplicates: rows that differ only in their request ID are aggregated into one. Compressed InfoCubes require less storage space and are faster for retrieval of information. The catch is that once you compress, you can no longer delete data by request, so you are safe as long as you don't have any error in your modeling.
    This compression can be done through a process chain and also manually.
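    The aggregation described above can be sketched like this (a conceptual illustration, not SAP code; the field names are made up): rows that differ only in their request ID are collapsed into one row per characteristic combination, with the key figures summed.

```python
# Conceptual sketch of what compression does to the fact table:
# the request ID is dropped and key figures are summed per
# characteristic combination. Field names are illustrative only.
from collections import defaultdict

def compress(fact_rows):
    compressed = defaultdict(int)
    for row in fact_rows:
        compressed[row["material"]] += row["quantity"]  # drop the request ID
    return dict(compressed)

rows = [{"request": 1, "material": "M1", "quantity": 10},
        {"request": 2, "material": "M1", "quantity": 5},
        {"request": 2, "material": "M2", "quantity": 7}]
```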
    Tips by: Anand
    Ans#3
    Indexing is a process where data is stored with an index over it. E.g., a phone book: when we write down somebody's number, Prasad's number goes under "P" and Rajesh's number goes under "R". The phone book exemplifies indexing; similarly, storing data while creating indexes over it is called indexing.
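    The phone-book analogy can be written as a few lines of Python (purely illustrative): an index maps a lookup key, here the first letter of the name, to the entries filed under it, so a lookup does not have to scan every entry.

```python
# The phone-book analogy as code: build an index from first letter
# to the names filed under it.
from collections import defaultdict

def build_index(names):
    index = defaultdict(list)
    for name in names:
        index[name[0].upper()].append(name)
    return index

phone_book_index = build_index(["Prasad", "Rajesh", "Priya"])
```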
    Ans#5
    Data modelling is a process where you collect the facts, the attributes associated with the facts, navigation attributes, etc., and after you collect all these you decide which ones you will be using. This collection is done by interviewing the end users, the power users, the stakeholders, etc. It is generally done by the team lead, the project manager, or sometimes a senior consultant (4-5 years of experience), so if you are new you don't have to worry about it. But do remember that it is an important aspect of any data warehousing solution, so make sure that you have read about data modelling before attending any interview or even starting to work.
    Ans#6
    We can enhance Business Content by adding fields to it. Since BC is delivered by SAP, it may not contain all the InfoObjects, InfoCubes, etc. that you want to use according to your company's data model. E.g., you have a customer InfoCube (in BC) but your company uses an additional attribute, say apartment number; then, instead of constructing a whole new InfoCube, you can add the above field to the existing BC InfoCube and get going.
    Ans#7
    Tuning is the most important process in BW. Tuning is done to increase efficiency: lowering the time for loading data into a cube, lowering the time for accessing a query, lowering the time for doing a drill-down, etc. Fine-tuning = lowering time (for everything possible). Tuning can be achieved by many things, not only partitions and aggregates; there are various other options, e.g. compression.
    Ans#8
    A MultiProvider can combine various InfoProviders for reporting purposes. You can combine 4-5 InfoCubes, or 2-3 InfoCubes and 2-3 ODS objects, or InfoCubes, ODS objects and master data, etc. You can refer to help.sap.com for more info.
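    The union a MultiProvider performs at query time can be sketched as follows (a conceptual illustration, not SAP code; provider and field names are made up): it stores no data itself, it just combines the rows delivered by its underlying InfoProviders.

```python
# Sketch of the query-time union over several InfoProviders.
# Provider and field names are illustrative only.

def multiprovider_rows(providers):
    result = []
    for name, rows in providers.items():
        for row in rows:
            result.append({**row, "infoprovider": name})  # tag the origin
    return result

providers = {"sales_cube": [{"material": "M1", "revenue": 100}],
             "orders_ods": [{"material": "M1", "orders": 3}]}
```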
    Ans#9
    Scheduled data load means you have scheduled the loading of data for some particular date and time; you can do this in the scheduler tab of the InfoPackage. Monitored means you are monitoring that particular data load, or some other loads, by using transaction RSMON.
    1. Procedure for repeat delta?
    You need to set the request status to red in the monitor screen and then delete it from the ODS/cube. Then, when you open the InfoPackage again, the system will prompt you for a repeat delta.
    also.....
    Go to RSA7 -> F2 -> Update Mode ---> Delta Repetition
    Delta repetition is done based on the type of upload you are carrying out.
    1. If you are loading master data, most of the time you will change the QM status to red and then repeat the delta; the repeat is only allowed if you make this change.
    Sometimes you need to investigate further if the repeat of the delta is not allowed even after the QM status has been set to red.
    If this is not the case, the source system and therefore also the extractor, have not yet received any information regarding the last delta and you must set the request to GREEN in the monitor using a QM action.
    The system then requests a delta again since the last delta request has not yet occurred for the extractor.
    Afterwards, you must reset the old request that you previously set to GREEN to RED since it was incorrect and it would otherwise be requested as a data target by an ODS.
    Caution: If the terminated request was a REPEAT request itself, always set it to RED so that the system tries to carry out a repeat again.
    To determine whether a delta or a repeat are to be requested, the system ONLY uses the status of the monitor.
    It is irrelevant whether the request is updated in a data target somewhere.
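    The rule above can be sketched in a few lines (purely illustrative, not SAP code): whether the extractor sends a plain delta or repeats the last one is decided solely from the monitor status of the previous request.

```python
# Sketch of the delta-vs-repeat decision described above:
# only the monitor status of the last request matters.

def next_request_mode(last_monitor_status):
    # "red"   -> the last delta is considered failed and is repeated;
    # "green" -> a normal delta is requested.
    return "repeat" if last_monitor_status == "red" else "delta"
```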
    When activating requests in an ODS, the system checks delta repeat requests for completeness and the correct sequence.
    Each green delta/repeat request in the monitor that came from the same DataSource/source system combination must be updated in the ODS before activation, which means that in this case, you must set them back to RED in the monitor using a QM action when using the solution described above.
    If the source of the data is a DataMart, it is not just the DELTARNR field that is relevant (in the roosprmsc table in the system in which the source DataMart is, which is usually your BW system since it is a Myself extraction in this case), rather the status of the request tabstrip control is relevant as well.
    Therefore, after the last delta request has terminated, go to the administration of your data source and check whether the DataMart indicator is set for the request that you wanted to update last.
    If this is NOT the case, you must NOT request a repeat since the system would also retransfer the data of the last delta but one.
    This means, you must NOT start a delta InfoPackage which then would request a repeat because the monitor is still RED. For information about how to correct this problem, refer to the following section.
    For more information about this, see also Note 873401.
    Proceed as follows:
    Delete the rest of this request from ALL updated data targets, set the terminated request to GREEN IN THE MONITOR and request a new DELTA.
    Only if the DataMart indicator is set does the system carry out a repeat correctly and transfers only this data again.
    This means, that only in this case can you leave the monitor status as it is and restart the delta InfoPackage. Then this creates a repeat request
    In addition, you can generally also reset the DATAMART indicator and then work using a delta request after you have set the incorrect request to GREEN in the monitor.
    Simply start the delta InfoPackage after you have reset the DATAMART indicator AND after you have set the last request that was terminated to GREEN in the monitor.
    After the delta request has been carried out successfully, remember to reset the old incorrect request to RED since otherwise the problems mentioned above will occur when you activate the data in a target ODS.
    What is process chain and how you used it?
    A) Process chains are a tool available in BW for automating the upload of master data and transaction data while taking care of the dependencies between processes.
    B) In one of our scenarios we wanted to upload a wholesale price InfoObject holding the wholesale price for all materials, and then load transaction data. While loading the transaction data, the update rule performed a lookup on this InfoObject's master data table to populate the wholesale price. This dependency of first uploading the master data and then the transaction data was handled through the process chain.
    What is process chain and how you used it?
    A) We have used process chains to automate the delta loading process. Once you are finished with your design and testing, you can automate the processes listed in RSPC. I have a real-time example in the attachment.
    1. What is process chain and how you used it?
    Process chains are a tool available in BW for automating the upload of master data and transaction data while taking care of the dependencies between processes.
    2. What is transaction for creating Process Chains ?
    RSPC .
    3. Explain the collector processes.
    Collector processes are used to manage multiple predecessor processes that feed into the same subsequent process. The collector processes available for BW are:
    AND: All of the direct predecessor processes must raise an event in order for the subsequent process to be executed.
    OR: At least one predecessor process must send an event. The first predecessor process that sends an event triggers the subsequent process; any additional predecessor process that sends an event triggers the subsequent process again (only if the chain is planned as "periodic").
    EXOR (exclusive "OR"): Similar to the regular "OR", but there is only ONE execution of the successor process, even if several predecessor processes raise an event.
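    The three collector semantics can be sketched as a small decision function (a conceptual illustration only, not SAP code): `fired` is the set of predecessors that have raised their event, and `already_triggered` records whether the successor has run before, which matters for EXOR.

```python
# Sketch of the AND / OR / EXOR collector semantics described above.

def should_trigger(kind, predecessors, fired, already_triggered=False):
    if kind == "AND":    # every predecessor must have raised its event
        return set(predecessors) <= set(fired)
    if kind == "OR":     # any incoming event triggers the successor
        return len(fired) > 0
    if kind == "EXOR":   # like OR, but the successor runs only once
        return len(fired) > 0 and not already_triggered
    raise ValueError(kind)
```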
    4. What are application processes?
    Application processes represent BW activities that are typically performed as part of BW operations. Examples include:
    Data load
    Attribute/hierarchy change run
    Aggregate rollup
    Reporting agent settings
    5. Tell some facts about process chains.
    Process chains are transportable: there is a button for writing to a change request when maintaining a process chain in RSPC, and process chains are available in the transport connection wizard (Administrator Workbench).
    If a process "dumps", it is treated in the same manner as a failed process.
    Graphical display of process chain maintenance requires the 620 SAPGUI and the SAP BW 3.0B frontend GUI.
    A special control background job runs to facilitate the execution of the other batch jobs of the process chain.
    Note your BTC process distribution, and make sure that an extra BTC process is available so the supporting control job can run immediately.
    6. What happens when a chain is activated?
    When a chain is activated, it is copied into the active version. The processes are planned in batch as program RSPROCESS, with type and variant given as parameters and job name BI_PROCESS_<TYPE>, each waiting for its event - except the trigger. The trigger is planned as specified in its variant; if it is "start via meta-chain", it is not planned in batch.
    7. Steps in process chains?
    Go to transaction code RSPC.
    Follow the basic flow of a process chain:
    1. Start chain
    2. Delete BasicCube indexes
    3. Load data from the source system into the PSA
    4. Load data from the PSA into the ODS object
    5. Activate data in the ODS object
    6. Load data from the ODS object in the BasicCube
    7. Create indexes after loading for the BasicCube
    Also check out these links:
    Help on "Remedy Tickets resolution"
    production support issues
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Production Support
    Production support issues
    Business Intelligence Old Forum (Read Only Archive)
    http://help.sap.com/saphelp_nw2004s/helpdata/en/8f/c08b3baaa59649e10000000a11402f/frameset.htm
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/8da0cd90-0201-0010-2d9a-abab69f10045
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/19683495-0501-0010-4381-b31db6ece1e9
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/36693695-0501-0010-698a-a015c6aac9e1
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/9936e790-0201-0010-f185-89d0377639db
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3507aa90-0201-0010-6891-d7df8c4722f7
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/263de690-0201-0010-bc9f-b65b3e7ba11c
    /people/siegfried.szameitat/blog/2006/02/26/restarting-processchains
    For common data load errors check this link:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Re: In production Support , how i can acquire the knowledge
    Re: How to resolve tickets  its urgent
    Re: production support issues
    production support
    check it out
    production support issues
    Production Support Issues
    Issue log on SAP- BW Production support
    issues in production support
    Production support issues
    Production support issues
    Production support issues
    production errors
    Re: HI,wht r de errors in Support in BW
    Production Support
    Assign points if useful
    Regards,
    Hari Reddy

  • Using MDM in HR

    The public sector has a requirement to track information about people when they move from one public institution to another, and they want to keep a central ID for everybody in order to have a complete record. Is MDM made to work with this kind of situation?

    Hi Sam,
    Generally, the consolidation of employee data in a cross-group environment may have legal implications. For data consolidation you need at least the number and name of the employee, and the transfer of any such information on a national or international scale is sometimes strictly forbidden.
    Apart from that, there are often restrictions in the maintenance and reorganization of data.
    However, SAP MDM offers capabilities to define customer-specific objects. So it might be possible (taken all legal requirements into account) to define a small employee record on MDM, e.g. to have at least the information which employee is on which pay-roll etc.
    Regards,
    Markus
