Info mart

SAP, in BW, has provided the concept of info marts as a solution to the data delivery problem.
What actually is an info mart?
Please help.

Hi..
An info mart is:
1. It is dynamic and disposable.
2. It is volatile.
3. It is a subset of information for a specific audience.
4. It can be persistent or virtual, multidimensional or flat.
5. It focuses on reporting and analysis.
with regards,
hari kv

Similar Messages

  • How to find Info Source for Export Data source in Data Marts node

    Hi
              I need to load data from an ODS to an InfoCube. I created the export DataSource for the ODS. I can see the export DataSource, but in the Data Marts node of InfoSources I cannot find the InfoSource for the export DataSource I created. I replicated the DataSources in the BW source system. I also tried to use Insert Lost Nodes from the context menu of the InfoSource node, but nothing worked. Please let me know what I need to do to see the InfoSource in the Data Marts node.
    Thanks
    Padma

    In the InfoSource tab in RSA1, use Settings --> Display Generated Objects.
    You will then be able to see the data mart InfoSources.

  • Error Caller 09 contains error message - Data Marts loading (cube to ODS)

    Dear all,
              Please help me with this problem; it is very urgent.
              I have one process chain that loads data from BW to BW only, through data marts. In that process chain, one process loads data from one cube (created by us) into one ODS (also created by us). Data is loaded through full update, for the selected period specified in the 'Calendar Day' field in the data selection.
             Previously I was able to load data for 2 months, but some days ago the extraction process suddenly got stuck in the background for a long time and showed the following error:
              Error message from the source system
              Diagnosis
             An error occurred in the source system.
              System Response
             Caller 09 contains an error message.
             Further analysis:
             The error occurred in Extractor . 
             Refer to the error message.
             Procedure
             How you remove the error depends on the error message.
             Note
             If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
                  Then we killed that process on the server and, after another attempt, it showed some calmonth/timestamp error. After reducing the data selection period, the data loaded successfully, and I was then able to load data for 20 days. After some days the process got stuck again; I followed the same procedure, reduced the period to 15 days, and continued. Now I can't even load data for 5 days successfully in one attempt; I have to kill the process in the background and repeat it, and then sometimes it loads.
             Please suggest some solutions as soon as possible. I am waiting for your reply. Points will be assigned.
             Thanks,
              Pankaj N. Kude
    Edited by: Pankaj Kude on Jul 23, 2008 8:33 AM

    Hi Friends!
                      I didn't find any short dump for that in ST22.
                      Actually, what happens is that the request continues to run in the background indefinitely. At that time
    the Status tab in the process monitor shows these messages:
                        Request still running
                        Diagnosis
                        No errors found. The current process has probably not finished yet.
                         System Response
                         The ALE inbox of BI is identical to the ALE outbox of the source system
                           or
                         the maximum wait time for this request has not yet been exceeded
                           or
                        the background job has not yet finished in the source system.
                       Current status
                       in the source system
                        And the Details tab shows the following messages:
                        Overall Status : Missing Messages or warnings
                        Requests (Messages) : Everything OK
                                Data Request arranged
                                Confirmed with : OK
                         Extraction (Messages): missing messages
                                Data request received
                                Data selection scheduled
                                Missing message : Number of Sent Records
                                Missing message : selection completed
                        Transfer (IDOCS and TRFC) : Everything OK
                                Info Idoc1 : Application Document Posted
                                Info Idoc2 : Application Document Posted
                         Processing (data packet) : No data
                        This process runs indefinitely; then I have to kill it from the server, and it then shows the Caller 09 error in the Status tab.
                        Should I change the value of that memory parameter on the server or not? We are planning to try it today. Does it really relate to this problem, and will it be helpful? What are the risks?
                        Please give your suggestions as early as possible; I am waiting for your reply.
      Thanks,
    Pankaj N. Kude

  • Data Mart and Data Extraction from an Infocube

    Can a data mart which is built on one InfoCube in BW support delta extraction? We have two separate BW systems and are trying to extract data from one InfoCube in one BW (source) and load it into one InfoCube in another BW (target).
    We have built a data mart on the source BW infocube and have successfully been able to extract and load the initial data load into the target BW infocube. I noticed that the field 0RECORDMODE was not on the datasource that was created to support this data mart so my gut feeling is that we will not be able to do delta data extractions from this data mart.  Any feedback or confirmation of this?

    Hi,
    The InfoCube that is used as an export DataSource is first initialized, meaning that the current status is transferred into the target BW. When the next upload comes around, only those requests that have come in since the time of initialization are transferred. Different target systems can also be supplied like this.
    Since it is request-based, a data mart on an InfoCube supports delta.
    Hope its helpful,
    Anu.

  • Transport Error on Data Mart Application Component

    Hi All,
    I have to transport the export DataSources for my DSOs. I used a transport collection to make sure all the relevant objects were collected. When I transport the export DataSources everything goes fine, but the DM application component for the export DataSources fails and the object is put in repair status.
    The 8* DataSources are active and available in the myself source system, and the update rules from the DSO to the cube are also active, but the InfoSource is not there because the Data Mart application component failed. The transport log says:
    'Object R3TRAPCODM is in repair status. Therefore, it cannot be imported'.
    If the object is in repair status, can I ask the Basis guys to re-import with the overwrite option? Will that fix the issue? I cannot see the repair status on the object, as it is an SAP component.
    Or
    can anyone tell me how to transport export DataSources? I am on BI 7.0 but using the RSA1OLD setup.
    Anyclues....?
    Thanks,
    Alex(Arthur Samson)

    I am closing this question. Identified the cause of the error.

  • Oracle BPM Process Data mart

    I am required to create audit reports on BPM workflows.
    I am new to this and need some guidance on configuring the BPM Process Data Mart: what are the prerequisites for configuring it, and what are the steps to do it?
    Also, I need some input on the BAM database. What is the frequency of data upload? Is it a data update or an insert in BAM?

    Hi,
    You might want to check out the Administration and Configuration Guides on http://download.oracle.com/docs/cd/E13154_01/bpm/docs65/index.html.
    I suspect you might find the BAM and Data Mart portions of this documentation a bit terse, so I've added steps below that provide more detail. I wrote this for ALBPM 6.0, but believe it will still work for Oracle BPM 10g. It was created from an earlier ALBPM 5.7 document Support wrote called "ALBPM 5_7 Configuring and Troubleshooting the BAM and DataMart Updater.pdf".
    You can define how often you want the contents in both databases updated (actually inserted) and how long you want to persist the contents of the BAM database during the configuration.
    Here's the contents of the document:
    1. Introduction
    The use of BAM (Business Activity Monitoring) and Data Mart (or Warehouse) information is becoming more and more widespread in today’s BPM project implementations for the obvious benefits they bring to the management and tuning of processes.
    BAM is basically composed of a collection of measurements of current process load and execution times. This gives us an idea of how the business is doing at this moment (in a pseudo real-time fashion).
    Data Mart, on the other hand, is a historical view of the process load and execution times, and this gives us an idea of how the business has developed since the projects were put in place.
    In this document we are not going to describe exhaustively all configuration aspects of the BAM and Data Mart Updater, but rather we will quickly move from one configuration step to another paying more attention to subjects that have presented some difficulties in real-life projects.
    2. Creating the Service Endpoints
    The databases for BAM and for Data Mart first have to be defined in the External Resources section of the BPM Process Administrator.
    In the following example the service endpoint ‘BAMJ2EEWL’ is being defined. This definition is going to be used later as BAM storage. At this point nothing is created.
    Add an External Resource with the name ‘BAMJ2EEWL’ and, as we use Oracle, select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMBAM’. This user, and its database, will be created later
    ·     the password for the user
    Scroll down to the bottom of the page and click <Save>.
    In addition to the standard JDBC connection that is going to be used by the Updater Service, a remote JDBC configuration needs to be added, as the Engine runs in a WebLogic J2EE container. This Data Source is needed to grant the Engine access to the BAM tables through the J2EE connection pool instead of through a dedicated JDBC connection. The following is an example of how to set this up.
    Add an External Resource with the name ‘BAMremote’ and select the Oracle driver, then click <Next>
    On the following screen, specify:
    ·     the Lookup Name that will be used subsequently in WebLogic - here I have given it the name ‘XAbamDS’
    Then click <Save>.
    In the next example the definition ‘DWHJ2EEWL’ is created to be used later as Data Mart storage. If you are not going to use a Data Mart storage you can skip this step.
    Add an External Resource with the name ‘DWHJ2EEWL’ and select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMDWH’. This user, and its database, will be created later
    ·     the password for the user
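    Before continuing, it can be worth sanity-checking the connection details you have just entered (host, port, SID, user). The following is not part of the original walkthrough; it is a minimal, hypothetical JDBC test in Java, assuming the Oracle JDBC driver (ojdbc*.jar) is on the classpath, JDK 7 or later, and that the BPMBAM (or BPMDWH) user has already been created via the Manage Database step described in the next section. The class name and password are placeholders.
        import java.sql.Connection;
        import java.sql.DriverManager;
        public class EndpointConnectionTest {
            public static void main(String[] args) throws Exception {
                // Illustrative values matching this walkthrough: localhost,
                // default Oracle listener port 1521, XE SID, BPMBAM user.
                String url = "jdbc:oracle:thin:@localhost:1521:XE";
                try (Connection conn = DriverManager.getConnection(url, "BPMBAM", "password")) {
                    System.out.println("Connected to: " + conn.getMetaData().getDatabaseProductVersion());
                }
            }
        }
    If this fails, fix the host/port/SID or the user credentials before troubleshooting the Updater Service itself.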
    3. Configuring BAM Updater Service
    Once the service endpoint has been created, the next step is to enable the BAM update, select the service endpoint to be used as BAM storage, and configure the update frequency and other settings. Here the “Updater Database Configuration” is the standard JDBC we configured earlier and the “Runtime Database Configuration” is the remote JDBC, as we are using the J2EE Engine.
    So, here’s the example of how to set up the BAM Updater service….
    Go into ‘Process Monitoring’ and select the ‘BAM’ tab and enter the relevant information (using the names created earlier – use the drop down list to select):
    Note that here, to allow me to quickly test BAM reporting, I have set the update frequency to 1 minute. This would not be the production setting.
    Once the data is input, click <Save>.
    We now have to create the schema and related tables. For this we will open the “Manage Database” page that has appeared at the bottom of the BAM screen (you may have to re-select that Tab) and select to create the database and the data structure. The user required to perform this operation is the DB system administrator:
    Text showing the successful creation of the database and data structures should appear.
    Once we are done with the schema creation, we can move to the Process Data Mart configuration screen to set up the Common Updater Service parameters. Notice that the service has not been started yet… We will get to that point later.
    4. Configuring Process Data Mart Updater Service
    In the case that Data Mart information is not going to be used, the “Enable Automatic Update” checkbox must be left off and the “Runtime Database Configuration” empty for this service. Additionally, the rest of this section can be skipped.
    In the case it is going to be used, the detail level, snapshot time and the time of update should be configured; in addition to enabling the updater and choosing the storage configuration. An example is shown below:
    Still in ‘Process Monitoring’, select the ‘Process Data Mart’ tab and enter the name created earlier (use the drop down list to select).
    Also, un-tick the Generate O3 Cubes (see later notes):
    Then click <Save>.
    Once those properties have been configured, the database and the data structure have to be created. This is performed on the “Manage Database” page, for which the link has appeared at the bottom of the page (as with BAM). Even though this page is identical to the one shown above (for the BAM configuration), it has been opened from the link on the “Process Data Mart” page, and this makes it different.
    Text showing the successful creation of the database and data structures should appear.
    5. Configuring Common Updater Service Parameters
    In the “Process Data Mart” tab of the Process Monitoring section - along with the parameters that are specific to the Data Mart - we will find some parameters that are common to all services. These parameters are:
    • Log directory: location of the log file
    • Messages logged from Data Store Updater: severity level of the Updater logs
    • Language
    • Generate Performance Metrics: enables performance metrics generation
    • Generate Workload Metrics: enables workload metrics generation
    • Generate O3 Cubes: enables O3 Cubes generation
    In this document we are not going to describe each parameter in detail, but we will mention a few caveats:
    a. the Log directory must be specified in order for the logs to be generated
    b. the Messages logged from Data Store Updater, which indicates the level
    of the logs, should be DEBUG for troubleshooting and WARNING otherwise
    c. Performance and Workload Metrics need to be on for the typical BAM usage and, even when either metric might not be used on the initial project releases, it is recommended to leave them on in case they turn out to be useful in the future
    d. the Generation of O3 Cubes must be off if this service is not used, otherwise the Data Mart Updater service might not work properly.
    The only change required on this screen was to de-select ‘Generate O3 Cubes’, as shown in the last section.
    6. Set up the WebLogic configuration
    We need to set up the JDBC data source specified above, so go to Services / JDBC / Data Sources.
    Click on <Lock and Edit> and then <New> to add a New data source.
    Specify:
    ·     the Name – use the name you set up in the Process Administrator
    ·     the JNDI Name – again use the name you set up in the Process Administrator
    ·     the Database Type – Oracle
    ·     use the default Oracle Database Driver
    Then click <Next>
    On the next screen, click <Next>
    On the next screen specify:
    ·     the Database Name – this is the SID – for me that is XE
    ·     the Host Name – as I am running on my laptop, I’ve just specified ‘localhost’
    ·     the Database User Name and Password – this is the BAM database user specified in the Process Administrator
    Then click <Next>
    On the next screen, you can test the configuration to make sure you have got it right, then click <Next>
    On the next screen, select your server as the target server and click <Finish>:
    Finally, click <Activate Changes>.
    7. The Last Step: Starting Up and Shutting Down the Updater Service
    The ALBPM distribution differs depending on the operating system. In the case of the Updater Service:
    -     For Unix-like operating systems the service is started or stopped with the albpmwarehouse.sh shell script. The command in this case is going to look like this:
    $ALBPM_HOME/bin$ ./albpmwarehouse.sh start
    -     For Windows Operating Systems the service is installed or uninstalled as a Windows Service with the albpmwarehouse.bat batch file. The command will look like:
    %ALBPM_HOME%\bin> albpmwarehouse.bat install
    After installing the service, it has to be started or stopped from the Microsoft Management Console. Note also that Windows will automatically start the installed service when the computer starts. In either case the location of the script is ALBPM_HOME/bin, where ALBPM_HOME is the ALBPM installation directory. An example would be:
    C:\bea\albpm6.0\j2eewl\bin\albpmwarehouse.bat
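    For completeness, the matching commands to stop the service on Unix and to remove it on Windows would look like the following. These mirror the start/install examples above and assume the scripts accept the analogous stop/uninstall arguments:
    $ALBPM_HOME/bin$ ./albpmwarehouse.sh stop
    %ALBPM_HOME%\bin> albpmwarehouse.bat uninstall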
    8. Finally: Running BAM dashboards to show it is Working
    Now that we finally have the BAM service running, we can run dashboards from within Workspace and see the results:
    9. General BAM and Data Mart Caveats
    a. The basic difference between these two collections of measurements is that BAM keeps track of current process load and execution times while Data Mart contains a historical view of those same measurements. This is why BAM information is collected frequently (every minute) and cleared out every several hours (or every day) and why Data Mart is updated infrequently (once a day) and grows indefinitely. Moreover, BAM measurements can be thought of as a minute-by-minute sequence of Engine Events snapshots, while Data Mart measurements will be a daily sequence of Engine Events snapshots.
    b. BAM and Data Mart table schemas are very similar but they are not the same. Thus, it is important not to use a schema created with the Manage Database for BAM as Data Mart storage or vice-versa. If these schemas are exchanged by mistake, the updater service will run anyway but no data will be added to the tables and there will be errors in the log indicating that the schema is incorrect or that some tables could not be found.
    c. BAM and Data Mart Information and Services are independent from one another. Any of them can be configured and running without the other one. The information is extracted directly from the Engine Database (PPROCINSTEVENT table is the main source of info) for both of them.
    d. So far there has not been a mention of engines, projects or processes in any of the BAM or Data Mart configurations. This is because the metrics of all projects published under the current Process Administrator (or, more precisely, FDI Directory) are going to be collected.
    e. It is also important to note that only activities for which events are generated are going to be measured (and therefore, shown in the metrics). The project default is to generate events only for Interactive activities. This can be changed for any particular activity and for the whole process (where the activity setting, when specified, overrides the process setting). Unfortunately, there is no project setting for events generation so far; thus, remember to edit the level of event generation for every new process that is added to the project.
    f. BAM and Data Mart metrics are usually enriched with Business Variables. These variables are a special type of External Variables. An External Variable is a process variable with the scope of an Instance and whose value is stored on a separate column in the Engine Instances table. This allows the creation of views and filters based on this variable. A Business Variable, then, shares all the properties of an External Variable plus the fact that its value is collected in all BAM and Data Mart measurements (in some cases the value is shown as it is for a particular instance and in others the value is aggregated).
    The caveat here is that there is a maximum of 256 Business Variables per FDI. Therefore, when publishing several projects into a single FDI directory it is recommended to reuse business variables. This is achieved by mapping similar Business Variables of different projects to a single real variable (in the variable mapping performed at publish time).
    g. Configuring the Updater Service Log
    In section 5. Configuring Common Updater Service Parameters we have seen that there are two common Updater properties related to logging. These properties are “Log directory” and “Messages logged from Data Store Updater”, and they specify the location and level of these two files:
    - dwupdater.log: which is the log for the Data Mart updater service
    - bam-dwupdater.log: which is the log for the BAM updater service
    In addition to these two properties, there is a configuration file called ‘WarehouseService.conf’ that allows us to modify these other properties:
    - wrapper.console.loglevel: level for the updater service log
    - wrapper.logfile.loglevel: level for the updater service log
    - wrapper.java.additional.n: additional argument to the service JVM
    - wrapper.logfile.maxsize: maximum size of the updater service log files
    - wrapper.logfile.maxfiles: maximum number of updater service log files
    - wrapper.logfile: updater service log file name (the default value is dwupdater-service.log)
    9.1. Updater Service Log Configuration Caveats
    a. The first three parameters listed above have to be modified when increasing the log level to DEBUG (since the default is WARNING). The loglevel parameters have to be set to DEBUG and a wrapper.java.additional.n (where n is an integer consecutive to the ones already used) has to be set to -ea to enable asserts, since without this option no DEBUG message is going to be generated.
    b. Of the other arguments, maxfiles might need to be increased to hold a few more days of data when the log level is set to DEBUG (with the default value up to two days are stored).
    c. The updater service has to be stopped, uninstalled, installed and then started for any of these changes to take effect.
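    To make caveats a. and b. concrete, here is a hypothetical WarehouseService.conf fragment for a troubleshooting session. The property names are the ones listed in section 9 above; the index 3 in wrapper.java.additional.3 and the size/count values are illustrative only:
        # raise both updater service logs to DEBUG
        wrapper.console.loglevel=DEBUG
        wrapper.logfile.loglevel=DEBUG
        # -ea enables Java asserts; without it no DEBUG messages are generated
        wrapper.java.additional.3=-ea
        # keep more and larger log files while debugging
        wrapper.logfile.maxsize=10m
        wrapper.logfile.maxfiles=10
    As caveat c. says, the Updater Service then has to be stopped, uninstalled, installed and started again for these changes to take effect.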
    Hope this helps,
    Dan

  • Info set, info provider and info spoke

    Hi gurus, please give me the difference between an InfoSet, an InfoProvider, and an InfoSpoke, and tell me what prerequisites are needed to process an InfoSet, InfoProvider, and InfoSpoke.
    Regards
    Vidhu

    InfoProvider
    An InfoProvider is an object for which queries can be created or executed in BEx.
    Check this for more,
    http://help.sap.com/saphelp_nw04/helpdata/en/4d/c3cd3a9ac2cc6ce10000000a114084/content.htm
    Infoset:
    An InfoSet is a special view of a dataset, such as a logical database, table join, table, or sequential file, and is used by SAP Query as a data source. InfoSets determine the tables, or fields in these tables, that can be referenced by a report. In most cases, InfoSets are based on logical databases.
    SAP Query includes a component for maintaining InfoSets. When you create an InfoSet, a DataSource in an application system is selected.
    You can navigate in a BW to an InfoSet Query using one or more ODS objects or InfoObjects. You can also drill through to BEx queries and InfoSet Queries from a second BW system that is connected as a data mart.
    The InfoSet Query functions allow you to report using flat data tables (master data reporting). Choose InfoObjects or ODS objects as data sources; these can be connected using joins. You define the data sources in an InfoSet. An InfoSet can contain data from one or more tables that are connected to one another by key fields. The data sources specified in the InfoSet form the basis of the InfoSet Query.
    Check the link for more,
    http://help.sap.com/saphelp_nw2004s/helpdata/en/ad/2225391d4f000be10000000a114084/frameset.htm
    InfoSpoke:
    An InfoSpoke is an open hub object that defines the target to which the data is to be relayed.
    Check this link for more,
    http://help.sap.com/saphelp_nw2004s/helpdata/en/ad/2225391d4f000be10000000a114084/frameset.htm
    Check this doc also,
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/eb462104-0701-0010-07ae-d866630e0989
    Hope this gives you a good idea...

  • Difference b/w open hub destination and info spoke?

    Hi,
    What is the difference between an open hub destination and an InfoSpoke?
    Thanks In advance.
    Regards
      Nitin

    Hi,
    The open hub destination is the object that allows you to distribute data from a BI system to non-SAP data marts, analytical applications, and other applications. It ensures controlled distribution across multiple systems.
    The open hub destination defines the target to which the data is transferred.
    An InfoSpoke can extract data in two ways:
    1. Full mode (F): the data corresponding to the selection criteria of the InfoSpoke is transferred from the data source into the respective destination.
    2. Delta mode (D): only records that were added since the last extraction are transferred in this mode. This is only possible for the source objects InfoCube and DataStore object.
    See the below Articles:
    InfoSpoke: Part -1
    www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/g-i/infospoke%20part%201.pdf
    Open Hub Destination: Part 1
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%201.pdf
    Open Hub Destination: Part 2
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%202.pdf
    Thanks
    Reddy

  • How to import data when having an info source in SAP BI/BW

    I have an InfoSource and InfoObjects (SAP demo) in SAP BI and now want to import data from an SAP R/3 system.
    Unfortunately, I really don't know how.
    I would like to go to the source system and replicate the DataSource, but SAP DEMO is not included in the DataSources. How do I get it there?
    For example, 0D_VENDOR is in the InfoObjects but not in the DataSources...
    Please help a newbie. Thanks!

    Hi Friend,
    0D_VENDOR is in the InfoObjects but not in the DataSources.
    It looks like you want a DataSource for 0D_VENDOR. If so, follow the steps below.
    1. To use any characteristic InfoObject as an export DataSource, you have to check the "Export datasource" checkbox under master data.
    You can find this checkbox in the Master Data/Texts tab for that InfoObject.
    2. Now you can find it among the DataSources in the source system. Go to Source Systems and open the DataSource overview for your BW system. You should find the DataSource 8****** under BW DataSources -> Business Info Warehouse -> Data Marts -> Data Marts Master Data.
    Hope this helps.
    Regards
    Hemant Khemani

  • Issue in loading data from data mart

    The actual load from APO to BW is happening, but the load fails when data is loaded from the base cube to the target cube (data mart). This flow is in 3.5.
    The attached document contains the system message.

    Your load is still in process.
    Where is the load failure message?
    Try to activate the source 8DSO/cube and later activate the update rules through the programs.
    Can you check your job details in SM37?
    From the InfoPackage monitor, copy the request ID.
    Go to SM37 and enter the copied request ID as the job name.
    Use job status Active/Cancelled/Finished.
    Use proper date ranges.
    User - *

  • Data target is not available when creating an info package in BW

    Hi,
    I am not sure how, but I have accidentally deleted the InfoPackages for one of the InfoSources. I have created the init package successfully, but when I try to create a delta InfoPackage for the data mart, the data target (which is an InfoCube) does not appear in the data target tab.
    Any help on this issue is highly appreciated.
    Thanks
    Prashant

    PRASHANT ASOPA, did you find a solution to this? I am facing a similar situation! I have tried almost everything, including deleting and recreating the update rules, recreating the export DataSource, activating everything a couple of times, and of course creating and deleting InfoPackages, but no matter what I do I can't get the system to recognize the data target.
    The "funny" thing is that everywhere else, even in the "Show data flow" everything looks right!

  • I am trying to delete the data from the info cube.

    Hi All,
               I am trying to delete the data from the InfoCube. In the cube, the reporting symbol is available. When I click on the delete button the delete symbol appears, but when I click on the refresh button the request is still not deleted. Please give me a solution.
    Thanks,
    chandu

    Hi,
    Also try deleting the Data Mart status.
    Are you deleting a particular request or the entire data in the cube? If the entire data, then right-click on the cube and select Delete Data; that deletes the data entirely.
    Hope it helps you... Let us know if you still have any issues.
    Reg
    Pra

  • Info package is inactive in BW production

    I have imported the transfer structure, InfoSource, data mart DataSource, and InfoPackage. All of them are active except the InfoPackage, which is inactive. It says that no transfer structure is active, yet when I open the transfer rules everything is active.
    Are there any special methods to follow when we work with data marts?
    Thanks in advance

    Please replicate the DataSource and activate the InfoSource. Then go to the ABAP editor (SE38), execute RS_TRANSTRUCT_ACTIVATE_ALL, and give the InfoSource name and the source system name.
    Data marts are not transported. Please create data marts with the export data mart facility.
    Message was edited by: PSG

  • What is the difference between Open hub destination and Info spokes

    What is the difference between an open hub destination and an InfoSpoke?
    Please search the forum before posting a thread.
    Edited by: Pravender on Aug 16, 2010 11:17 AM

    Hi,
    When a user initiates open hub extraction by creating an InfoSpoke, the behind-the-scenes activity involves OO ABAP, which calls classes to determine each of the different components involved in making the open hub extraction possible. In particular, this enhancement focuses on two standard classes: one used to determine the file destination name and path, and the other to control the user interface of the InfoSpoke, which ultimately allows the user to enter his/her own filename and path.
    The open hub service enables us to distribute data from an SAP BW system into external data marts, analytical applications, and other applications. With this, we can ensure controlled distribution using several systems.
    The central object for the export of data is the InfoSpoke. Using this, we can define the object from which the data comes and into which target it is transferred.
    Regards,

  • Calculation schema - RM0002 - Info Record

    Hi everyone,
    Good evening!
    I would like to ask for some help. I am on an SAP implementation project, and while testing the request-for-quotation process I ran into a problem when saving the market price. The system raises the following error: "Condition type for the price could not be determined, message no. 06657".
    Examining the situation, I verified that the process updates the info record with the quotation information, but at that step the system complains that condition WOBT is not maintained in calculation schema RM0002.
    I checked the calculation schema and confirmed that this condition was not maintained. I added the condition to the schema, but I still get the same error.
    During my analysis, I found that this condition is used in the purchase order determination schema, among other things, and consequently it was maintained in schema RM0000.
    Does anyone have any idea how to solve this problem? I searched for notes and forum posts, but without success.
    Thanks in advance for the help!
    Regards,
    Raphael.

    Marcos,
    Problem solved! There was a problem in a configuration that has since been redone, and now everything is OK!
    It was "Determine calculation schema for market price determination" under "Determine schema determination".
    Thank you very much!
    Regards,
    Raphael
