BI Cubes for 0RT_PA_TRAN_GDS_MOV datasource?

Hi all,
I am trying to load the DataSource '0RT_PA_TRAN_GDS_MOV'. I have activated the DataSource, but I am not getting any data from it, even though the task execution was successful. I suspect I am missing some cube activation here. Could you please help me identify the relevant BI cubes for 0RT_PA_TRAN_GDS_MOV and explain how to activate them?
Thanks,
Venkatesan.

Hi Venkatesan,
In the POS Data Management IMG settings,
under 'Integration with Other SAP Components' you will find the setting 'Access of Remote Cube to Transactional Database'.
Add the DataSource there (the POS Analytics DataSource)
and assign the relevant BW task codes 0001 and 0004:
0001 - Supply BW without Distribution.
0004 - Supply BW with Distribution.
Try this and let me know.
Thanks and Regards,
Ramesh D

Similar Messages

  • Is there a standard cube for 0SEM_BCS_10?

    Hi guys. I'm willing to use 0SEM_BCS_10 to get data for the preparation for consolidation. But I didn't find a standard InfoCube for the data that comes from 0SEM_BCS_10. It would be a "staging" cube from which the BCS cube (0BCS_C10) gets its data via a data stream. Does anyone know whether such a standard cube exists, or will I have to create a Z cube?
    Regards,
    Thiago

    Hi,
    There is no standard cube for this DataSource; you have to create a Z cube. And please search the forum first; I have answered this kind of question a couple of times before.

  • Loading the cube from 3 datasources and getting 3 records for each keyfield

    Hi All,
    I am loading an InfoCube from 3 separate DataSources. These 3 DataSources are UD DataSources and their common source system is UD Connect.
    Each of the DataSources contains the unique key field 'Incident Number' (just as we would have in DataSources for a DSO).
    The problem is that when I load data from these 3 DataSources into the cube, 3 records are created for each 'Incident Number'.
    We have reports on this InfoCube, and the reports also display 3 records for each incident number.
    If I remove the Incident Number key field from 2 of the DataSources, the data from those DataSources no longer reaches the cube.
    For many of you this may be a minor problem (or no problem at all!), but as a newcomer to the SAP field this has become a showstopper issue for me.
    Please suggest.
    Thanks in Advance.

    Hi Pravender,
    Thanks for your interest.
    The scenario is: I have 3 DataSources from the same source system. All 3 DataSources have different fields except 'Incident Number', so each field has only one value in the report. But because there are 3 separate DataSources, 3 records are created, with the values of each DataSource displayed in a separate record.
    There is no field in the query output that has different values for the different DataSources. Because of the 3 records in the cube, one record contains the value for a particular field and the other two records show a blank for that field.
    Regards.

  • How to set  Non Cumulative Init flag for Generic datasource

    Hi all,
    I need to create a generic DataSource that can be used to initialize a non-cumulative cube, e.g. to set the opening balance.
    The STOCKUPD field in ROOSOURCE should be set to 'X' so that the InfoPackage offers the 'Generate Initial Status' option, but there is no interface available in the R/3 plug-in to do this.
    How can this be done? Or is it not allowed for generic DataSources because it is only valid for the 2LIS_03_BX DataSource used for stock initialization?
    Will there be any other issues if I change the STOCKUPD entry in the ROOSOURCE table for my DataSource?
    regs
    Anoop

    Hello,
      any idea on this?
    regs
    Anoop
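
    For reference, a quick way to check whether the flag is already set for a given DataSource is a small read-only report in the source system. This is only a sketch based on the ROOSOURCE fields mentioned above (OLTPSOURCE, OBJVERS, STOCKUPD); the DataSource name ZSTOCK_DS is hypothetical, and nothing is changed in the system:

        " Read the non-cumulative init flag of a DataSource from ROOSOURCE
        " ('A' = active version). ZSTOCK_DS is a hypothetical DataSource name.
        DATA lv_stockupd TYPE roosource-stockupd.

        SELECT SINGLE stockupd FROM roosource
          INTO lv_stockupd
          WHERE oltpsource = 'ZSTOCK_DS'
            AND objvers    = 'A'.

        IF lv_stockupd = 'X'.
          WRITE / 'Generate Initial Status will be offered in the InfoPackage.'.
        ELSE.
          WRITE / 'STOCKUPD is not set for this DataSource.'.
        ENDIF.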

  • Regarding BW cubes for CS (Customer Service) module

    Hi Experts,
    I need information regarding BW cubes, reports, and DataSources for the Customer Service (CS) module.
    Thanks in Advance
    HR

    Cubes :
    http://help.sap.com/saphelp_nw70/helpdata/en/b8/d50c3c99d56448e10000000a114084/content.htm
    Reports :
    http://help.sap.com/saphelp_nw70/helpdata/en/b5/81073c61563259e10000000a114084/content.htm
    Datasource :
    http://help.sap.com/saphelp_nw70/helpdata/en/08/d60c3c99d56448e10000000a114084/content.htm

  • Real-time data acquisition for HR datasources

    Dear Experts,
    I have a couple of questions on real-time data acquisition...
    a) Can you tell me if any standard HR datasources support real-time data acquisition?
    b) Can we apply real-time data acquisition for generic datasources? If yes, is there any difference in the process compared to working with business content datasources when using real-time data acquisition?
    Hope you can provide some answers...as always, points will be awarded for answers that effectively address the questions.
    Thanks a bunch.
    K

    Hi Karthik,
    a) The decision to go for an SAP RemoteCube depends on the volume of data and the frequency of queries. It is advisable not to use an SAP RemoteCube if the data volume is high and the query frequency is also high. Which module in HR are you implementing? In HR the data volume is generally high, so if you load data from R/3 every 3 hours you are safe as long as the load volumes stay small. For example, for a Time Management implementation I would not advise frequent loads, as it is a time-consuming process. So make your decision along these lines: if the data volume is not high, I would prefer the SAP RemoteCube, as it reduces the effort of managing the loads.
    b) I mentioned a function module extractor only for the sake of control over the data selection. You can also go for a view/table extractor.
    Hope this helps.
    Thanks and Regards
    Subray Hegde

  • DSO Keyfields for 0CO_OM_CCA_1  datasource

    Hello All,
    I am using DataSource 0CO_OM_CCA_1 to load my plan data. My main concern is having to delete the previous full load each time I run the DTP from this DataSource, since it is a full load. I saw some suggestions here to use a Z DSO to avoid these complications and just load deltas onwards. In that case, what should the key fields of the custom DSO be?
    Please advise.
    Much thanks

    Hi Krrish,
    I would like to bring to your notice the benefits and limitations of the two strategies, which may help you choose the right one for your scenario.
    Deletion of overlapping requests -
    1. If you choose to delete when the selections are the same or more comprehensive, any repair load needs the same selection granularity as your original full DTP, otherwise you run the risk of losing data. E.g. if you are loading period-wise or weekly full loads to the cube for the corresponding period or week (selection on period and week), then any repair load has to use a selection at the granularity of period or week, or a completely disparate selection.
    If you use a selection of period and CO area, say, then you will delete a request in the cube for the same period that contains data for all CO areas.
    An example of a disparate selection would be to use CO area alone as the selection, selectively delete the data of that CO area for all periods, and then load that CO area for all periods. But here again there is a risk: the next time you do a repair for a past period, the DTP with the disparate selection does not get deleted, you end up with duplicates, and a selective deletion is again required.
    Summary - repairing is complicated and needs data sets to be loaded in terms of full weeks or periods.
    2. You cannot compress the requests, otherwise deletion of overlapping requests will not work, unless you are certain you will never reload the data of the request being compressed.
    Advantages - can be automated using a process chain, as you are already aware.
    DSO before the cube
    1. You reduce the data flow to the cube by allowing only deltas to flow through. The amount of data being loaded and deleted is smaller, which improves load performance in the cube, but you increase activation time in the DSO.
    2. Repair from R/3 is simple in this case: load the repair load to the DSO and the generated delta will correct the data in the cube.
    3. The cube can be compressed, improving query performance.
    Hope it helps,
    Best regards,
    Sunmit.

  • Records are repeatedly loading for RDA DataSource

    Hi
    I am trying to load data from a custom DataSource into a DSO. This DataSource is real-time enabled.
    Even though no new records are created in the ECC system, the real-time DataSource fetches the same number of old records every day. I mean, the records are loaded into the DSO object repeatedly.
    Why are the records repeatedly loaded into the DSO object for this real-time DataSource?
    Please let me know
    Regards
    mohammed

    Hi,
    In RSO2, click on 'Generic Delta', select the timestamp field, and maintain the lower and upper safety limits.
    If you are loading the data into a DSO, select the radio button 'New Status for Changed Records';
    if you are loading the data into a cube, select 'Additive Delta'.
    Regards,
    Yerrabelli.
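
    To make the lower and upper limits concrete, here is a rough illustration of the selection window that a timestamp-based generic delta effectively applies. It is only a sketch of the idea, not the actual Service API code, and the table ZTICKETS with its change timestamp field CHANGED_AT is hypothetical:

        " Illustration only: the window a timestamp-based generic delta selects.
        " ZTICKETS and its change timestamp CHANGED_AT are hypothetical names.
        DATA: lv_last_pointer TYPE timestamp,   " delta pointer from the previous run (stored by the system; left initial here)
              lv_now          TYPE timestamp,
              lv_from         TYPE timestamp,
              lv_to           TYPE timestamp,
              lt_delta        TYPE STANDARD TABLE OF ztickets.

        GET TIME STAMP FIELD lv_now.

        " The lower limit (e.g. 300 s) widens the window backwards to catch late updates;
        " the upper limit (e.g. 60 s) leaves very recent records for the next run.
        lv_from = cl_abap_tstmp=>subtractsecs( tstmp = lv_last_pointer secs = 300 ).
        lv_to   = cl_abap_tstmp=>subtractsecs( tstmp = lv_now          secs = 60 ).

        SELECT * FROM ztickets
          INTO TABLE lt_delta
          WHERE changed_at >= lv_from
            AND changed_at <= lv_to.

    Without such a delta definition, a generic DataSource performs a full extraction on every run, which would explain why the same old records keep arriving in the DSO.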

  • How to run setup for new datasources without breaking old datasources?

    Hi,
    I am wondering how to install some more data sources (2LIS_05_Q0NOTIF, 2LIS_17_10NOTIF and 2LIS_18_10NOTIF) without breaking the delta mechanism for other data sources when I start to use the new datasources.
    I have installed some QM cubes (0QM_C04, -05, -08 and -11) and delta loads run every night.  Now I want to add two more cubes (0QM_C02 and -03). The new cubes use datasources that are not yet available.
    If I now activate the needed datasources, delete the content of the setup tables again and run 'Application-Specific Setup of Statistical Data' for 'Quality Management', 'Plant Maintenance' and 'Service Management', I am afraid that I will re-initialize the delta queue for the datasources that are already in use... In other words: I am afraid of breaking all datasources in application 05, 17 and 18...
    I cannot find documentation addressing this problem. Any input, anyone?
    Best regards,
    Christian Frier

    Hi all,
    if I understand you all correctly, I can use the following plan:
    0) Wait until no documents are posted on the R/3 side.
    1) Run the delta loads and check that the queues are empty (RSA7 and LBWQ).
    2) Delete the setup tables.
    3) Run the 'Application-Specific Setup of Statistical Data'
    4) Create and execute infopackages for initial dataload (for the new datasources).
    5) Create and execute infopackages for delta loads (for the new datasources).
    6) Run the infopackages for delta load from the 'old' datasources without running a new initial dataload.
    So basically 6) is the step I worry about.
    That is, whether I can start to use 2LIS_05_ITEM, 2LIS_17_ITEM and 2LIS_18_ITEM without breaking the delta sequence for 2LIS_05_TASK, 2LIS_17_TASK and 2LIS_18_TASK.
    regards,
    Christian
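
    If you want a rough programmatic cross-check for step 1 in addition to looking at RSA7 and LBWQ, you could count the remaining extraction-queue entries in the source system. This sketch assumes the queued delta extraction queues follow the usual MCEX<application> naming and sit in the qRFC outbound queue table TRFCQOUT; treat it only as a sanity check, not a replacement for the monitors:

        " Rough sanity check: open extraction-queue entries for applications 05, 17 and 18.
        " Assumes queue names MCEX05 / MCEX17 / MCEX18 in the qRFC outbound queue.
        DATA lv_count TYPE i.

        SELECT COUNT(*) FROM trfcqout
          INTO lv_count
          WHERE qname LIKE 'MCEX05%'
             OR qname LIKE 'MCEX17%'
             OR qname LIKE 'MCEX18%'.

        WRITE: / 'Open extraction queue entries for 05/17/18:', lv_count.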

  • How to find rows in F table of a cube for a given request ID

    Can someone tell me how to find the rows in the F table of a cube for a given request id?

    Hi,
    Copy the request ID of that load, go to the Manage screen of the cube, select the Contents tab and choose 'InfoCube content'.
    In the output settings, select the list of objects to be displayed with the results, enter the request ID in the 'Request ID' row, leave 'Max. no. of hits' at the bottom blank and execute. This displays all data rows loaded with that request, i.e. the records as seen from the cube.
    To read the fact table directly, go to SE11 and display table /BIC/FXXXX, where 'XXXX' is the cube name (for a custom cube).
    Get the request SID from the cube contents as described above: in the field selection for the output, select 'Request SID' in addition to 'Request ID'.
    Then, in SE11, display table /BIC/DXXXXP ('XXXX' is the cube name, 'P' stands for the package dimension). Enter the request SID in SID_0REQUID and read the corresponding DIMID.
    Finally, pass this DIMID into the fact table field KEY_XXXXP; this gives you the fact table rows for that request.
    hope this helps
    Regards
    Daya Sagar
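
    The manual SE11 steps above can also be collapsed into a single join between the fact table and the package dimension table. A minimal sketch, assuming a hypothetical custom cube ZSALES (fact table /BIC/FZSALES, package dimension table /BIC/DZSALESP) and a request SID obtained from the cube content display as described above:

        " All F fact table rows of cube ZSALES that belong to one request.
        " /BIC/FZSALES and /BIC/DZSALESP are the tables of the hypothetical cube ZSALES.
        DATA: lv_req_sid TYPE i,   " request SID (SID_0REQUID), taken from the cube content display
              lt_fact    TYPE STANDARD TABLE OF /bic/fzsales.

        SELECT * FROM /bic/fzsales AS f
          INNER JOIN /bic/dzsalesp AS p
                  ON f~key_zsalesp = p~dimid
          INTO CORRESPONDING FIELDS OF TABLE lt_fact
          WHERE p~sid_0requid = lv_req_sid.

    Note that this only covers the uncompressed F fact table; once a request has been compressed into the E fact table, its rows can no longer be identified by request.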

  • What is the equivalent of getUserName() for a JDBC DataSource in JDeveloper 11g?

    Hi All,
    I am facing an issue with the getUserName() method.
    The method below
      getDBTransaction().getConnectionMetadata().getUserName()
    works fine with a JDBC URL connection, while the same method does not work with a JDBC DataSource.
    So what is an alternative way to get the user name for a JDBC DataSource? I need a JDBC DataSource because we use a data source on our standalone WebLogic server.
    Maroof

    Hi Vohar,
    The DataSource is looked up via JNDI, so how can we get the user name in that case? Do you have any idea?
    Our Connection.xml contains the following values (the XML tags were lost in the post): user, oraJDBC, 1500, 192.168.0.0, user1, thin

  • Workbook Displays Data Not Present in the Cube for Non-Cumulative KF

    Hi All,
    I have a user who is reporting on the non-cumulative key figure 0TOTALSTCK. If I look in the cube, the data starts in October and goes on from there. When viewing the data in the report, figures are displayed for the months prior to October even though there is no data in the cube for them.
    The interesting thing is that, when viewed in Excel using the Analyzer, these figures are displayed in square brackets. This throws out the totals of the prior months, as the values are being added in.
    Does anyone have an idea why these values are displayed even though there is no data in the cube for them?
    Thanks in advance for any help supplied.

    Just thought I would share the solution to this.
    It looks like the validity date for the cube was not set up properly. I fixed it using transaction RSDV.
    Details on how to proceed can be found here:
    http://help.sap.com/saphelp_nw70/helpdata/EN/02/9a6a1f244411d5b2e30050da4c74dc/frameset.htm

  • Advantages of using Virtual cube & Multi Cube for Reporting

    Dear all,
    Can you please explain to me the advantages of designing a virtual cube with services and MultiCubes for reporting?
    Thanks in advance
    Thanks & regards
      Sailekha

    Hi Sailekha,
    Please go through the link below:
    http://help.sap.com/saphelp_nw04/helpdata/en/62/d2e26b696b11d5b2f50050da4c74dc/content.htm
    For the MultiCube:
    you can create reports across different InfoProviders. Let's say you have one InfoCube for actuals and another for plan data; you can then create a single report that shows both actual and plan values.
    Hope this helps.
    BK

  • Delta is not working properly for 2LIS_18_I0ITEM datasource

    Hello Experts,
    I am having an issue with the extractor 2LIS_18_I0ITEM.
    Not all delta records are arriving in BI; some are coming through and some are missing.
    It is a production environment, so I cannot delete and refill the setup tables, because for that I would need to lock the users.
    Is there any way to resolve this issue?
    Thanks
    Chetan

    hi,
    the solution for this would be to check the delta update job (the V3 job or the queued delta collective run) and schedule it properly.
    To correct the missed delta records, first identify at a broad level which deltas are missing in the BW system, e.g. the delta records for a particular day are not correct.
    Then fill the setup table selectively, using a selection on the identified characteristic, e.g. the date.
    Delete the data selectively in the DSO and the cube for this selection, and then run a full repair load from the source up to the cube for the same selection.
    For this you do not need to lock the users, because you are extracting historical data into the setup table while new postings can continue.
    If you are still skeptical, do this activity on a weekend.
    regards,
    Arvind.

  • Please guide me to develop cube for HR Analysis

    Hi All,
    My client wants to build a cube for their HR analysis. They have data in PeopleSoft and Kronos systems.
    I have never worked on any cubes for HR, so I need some guidance.
    If somebody has done such an implementation, please forward any document or presentation that could be used as a starting point to [email protected]
    Please also give me an idea of what kind of analysis can be performed using cubes and what such a model generally looks like. Any suggestion will be highly appreciated.
    Thanks,
    Vikash
    Please direct me to the correct forum if I am in the wrong place.

    My guess is that you want the Essbase or Planning forums. But it sounds more like you need to bring in a consultant who understands Essbase and HR reporting; the cube design will depend on the data and on the reporting and analysis needs.
