Datasource on APO Planning Area - Transportation Error

Hi All,
             I have created a DataSource on an APO planning area. The DataSource works fine when checked in RSA3 and also on the BW side. However, when transporting the DataSource from APO Dev to APO QA, I get the following error and the transport fails. Please suggest.
Thanks
Christopher
   Execution of programs after import (XPRA)
   Transport request   : AD1K909333
   System              : AQ3
   tp path             : tp
   Version and release: 372.04.10 700
   Post-import methods for change/transport request: AD1K909333
      on the application server: hllsap112
   Post-import method RSA2_DSOURCE_AFTER_IMPORT started for OSOA L, date and time: 20080725125524
   Execution of "applications after import" method for DataSource '9ADS_PP_APO'
   Import paramter of AI method: 'I_DSOURCE:' = '9ADS_PP_APO'
   Import paramter of AI method: 'I_OBJVERS:' = 'A'
   Import paramter of AI method: 'I_CLNT:' = ' '
   Import paramter of AI method: 'LV_CLNT:' = '100'
   DataSource '9ADS_PP_APO': No valid entry in table /SAPAPO/TSAREAEX
   Planning area for DataSource '9ADS_PP_APO' does not exist in target system
   Extract structure /1APO/EXT_STRU100002737 is not active
   The extract structure /1APO/EXT_STRU100002737 of the DataSource 9ADS_PP_APO is invalid
   Errors occurred during post-handling RSA2_DSOURCE_AFTER_IMPORT for OSOA L
   RSA2_DSOURCE_AFTER_IMPORT belongs to package RSUM
   The errors affect the following components:
      BC-BW (BW Service API)
   Post-import method RSA2_DSOURCE_AFTER_IMPORT completed for OSOA L, date and time: 20080725125532
   Post-import methods of change/transport request AD1K909333 completed
        Start of subsequent processing ... 20080725125524
        End of subsequent processing... 20080725125532
   Execute reports for change/transport request: AD1K909333
   Reports for change/transport request AD1K909333 have been executed
        Start of................ 20080725125532
        End of.................. 20080725125532
   Execution of programs after import (XPRA)
   End date and time : 20080725125532   Ended with return code:  ===> 8 <===

Christopher,
There seems to be no extract structure available for this DataSource in quality, and that is what is causing the problem. The extract structure created in your scenario is a generated (temporary) object and is not available for transport. You therefore need to have the DataSource generated in quality first, and then transport the active version so that it carries the same changes as in development.
Regards
Vijay
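Vijay's point can be checked directly in the target system: a generated extract structure such as /1APO/EXT_STRU100002737 can only extract if it is active in the ABAP Dictionary. A minimal sketch (the structure name is taken from the transport log above; AS4LOCAL = 'A' in table DD02L marks the active version):

```abap
* Hedged sketch: verify in the target system whether the generated
* extract structure of the DataSource is active in the dictionary.
* AS4LOCAL = 'A' in DD02L means an active version exists.
REPORT zcheck_extstru.

DATA lv_state TYPE dd02l-as4local.

SELECT SINGLE as4local FROM dd02l INTO lv_state
  WHERE tabname  = '/1APO/EXT_STRU100002737'
    AND as4local = 'A'.

IF sy-subrc = 0.
  WRITE: / 'Extract structure is active.'.
ELSE.
  WRITE: / 'Extract structure missing or inactive -',
           'regenerate the DataSource from the planning area.'.
ENDIF.
```

If the structure is missing or inactive, regenerating the DataSource from the planning area in the target system (as described above) recreates it there.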

Similar Messages

  • Regarding datasources on 1 planning area

    Hi all,
    Can I create more than one DataSource for a single planning area?
    Please suggest.
    Shashi.

    Thanks a lot for your answer, but we already have one DataSource on this planning area. If I create one more DataSource on that planning area, will it affect the existing one? We have appended two new fields to the existing DataSource that are not present in that planning area, so I would like to know whether creating a new DataSource for that planning area affects the existing DataSource.
    If you can answer this, it would be very helpful to us.
                       Thanx
    Regards,
    Shashi.

  • Virtual Cube to load APO Planning area (LC)?

    Hello,
    Does anyone know if it is technically possible to use a virtual cube in APO/BW to load the APO planning area (liveCache)?  It would be great to have some SAP documentation to support this.
    Thanks,
    Chad

    Thanks for the reply.  I'm actually looking to source data from a non-sap system and would like to explore the option of using a virtual cube connected to the external system using SAP UDC (universal data connector).  The data could be loaded to a basic cube, but this would mean data redundancy.  If this can be avoided by the use of a virtual cube, I would prefer to use that method.  I just wasn't sure if SAP-APO would allow for the data to be loaded into liveCache from a virtual cube.  I do like the BAPI option also.  If a virtual cube with services is used, is an ABAP function module required to get the data?

  • Help Required for Mapping Key figures from Cube to APO Planning area.

    Hello Experts,
    We have created a cube in APO BW and now we want to map it to a planning area. How can we do that?
    Can anybody explain how we can map the key figures?
    Also, what is liveCache used for, and how is it updated?
    Regards
    Ram

    Hi,
    I am not very sure about the 9ARE aggregate (haven't used it in backups), but RTSCUBE is used to copy time Series (TS) KF data from cube to planning area (SNP or DP).
    Are you trying to restore some time series data from your backup cube to the planning area? If yes, then do a mapping of characteristic from cube to planning area in RTSCUBE, and also map the TS KF between cube and planning area.
    If your KF is not a time series KF, then you can't copy it from cube to planning area. You could get data to cube for some reporting, otherwise I am not sure what use the backup is for you. For SNP, most of the data would be received from R/3, so there's not much point in having a backup.
    Hope this helps.
    Thanks - Pawan

  • Help: An error when I initial the planning areas:internal error

    Hi,
      I get an error when I initialize the planning area. When I create time series objects for planning area DP01, the system shows me the following error message:
      internal error [/SAPAPO/OM_TS_TGRID_CREATE]
      Message no. /SAPAPO/TSM099
    Thanks,
      Thomson

    Hi Thomson,
    I think that in the planning area you not only added the time series key figures but also defined key figure semantics for those key figures. This must not be done. To see those semantics, check the key figure details in transaction /SAPAPO/MSDP_ADMIN on the key figure tab by selecting key figure details. For the time series key figures, the semantics column is filled with values like 300, 301, etc. If you want to use time series key figures in addition to the standard SNP key figures, the semantics field has to be empty. You can compare this with the settings in the other planning area where everything works fine.
    If this is not the case, then please refer to the note below and execute the consistency reports described in it:
    577038     Data inconsistencies in SNP
    Regards,
    Sunitha.

  • Planning area transport issue - Urgent

    Hello Everyone,
    We are in SCM 5.1. In my planning area, there is already data and CVCs. When I make changes to the keyfigure disaggregation calculation type in the planning area and transport the changes from Dev to QA, all the CVCs and data in the QA system gets deleted. One thing to note here is that there is a message in the transport log - "No valid storage bucket profile exists". But that storage bucket profile exists in QA and I made sure to run the time series consistency check in Dev and QA before moving the transport.
    We opened an OSS message with SAP and they released 2 OSS notes and still after applying these 2 notes, the issue is not fixed. Supposedly in SCM 5.1, planning area changes could be transported to the target system without a data loss. Wanted to find out if anyone has come across this issue or if you had transported planning area changes after Go-Live without a data loss in SCM 5 or 5.1. I appreciate your quick reply to this message. Thanks.
    Regards,
    Mohammed.

    Hi,
    Do I need to run the report RSDG_IOBJ_ACTIVATE for all the characteristics and key figures, or just for the key figures I changed in the planning area? Please note that I have changed only the disaggregation type in the planning area, and nothing related to the key figure in RSA1.
    Do I also need to run the report RSDG_CUBE_ACTIVATE for the internal cube of the planning object structure?
    Please elaborate. Thanks.
    Regards,
    Mohammed.

  • Selected columns in APO planning area

    I have a requirement to copy the data from the columns selected by the user and paste it into another column. My question is: using ABAP code, how can I find whether a column is selected, or which columns are selected?

    SVK,
    I don't know about ABAP, but Macro function COLUMN_MARKED() should do the trick.
    http://help.sap.com/saphelp_scm70/helpdata/EN/4b/592a4900e2584ae10000000a42189c/frameset.htm
    Best Regards,
    DB49

  • APO- BI Datasource from Planning Area

    Hi All,
    I need help with APO-BI datasource generated from Planning Area.
    In the Dev environment we had two clients:
    DCLNT020 (Holds APO part) DCLNT010 (Holds BI workbench).
    So a datasource was generated from the Planning area in DCLNT020 --> it was replicated in DCLNT010 --> data from Planning Area was extracted to BI cube using this.
    Now we transported this datasource to the Test environment which has only one client (TCLNT010). I maintained the Source to target mapping there such that DCLNT020 -- TCLNT010 and DCLNT010 -- TCLNT010.
    However the Transport fails and the error message is:
    Cannot replicate DataSource
    Errors occurred during post-handling RS_AFTER_IMPORT for ISFS L
    If I go to the Test system and try to generate the transported DataSource directly from the planning area again, it says the DataSource already exists. However, I cannot see this DataSource in the system even after replicating and refreshing multiple times.
    Please provide your inputs as to what might be wrong and what I need to do to solve this.
    TIA
    Amrita

    Hi Amrita,
    Based on the above post, it seems you maintain two clients in Dev (one holding APO and another holding the BI workbench), while the test environment has only one client; maintaining the DataSource in a single client should not cause any impact by itself.
    Based on the error:
    > Cannot replicate DataSource
    > Errors occurred during post-handling RS_AFTER_IMPORT for ISFS L
    there could be two reasons:
    1) You need to replicate the DataSource once you have imported it into the test environment, and then run the program RSDS_DATASOURCE_ACTIVATE_ALL, supplying the source system and DataSource name, if it is BI 7.0. If it is 3.x, execute the program RS_TRANSTRU_ACTIVATE_ALL, specifying the transfer structure name.
    2) RS_AFTER_IMPORT errors are in some cases caused by an improper transport of the update rules. The solution would be to recollect the transport, release the DataSource transport first, execute the activities in (1), and then transport the rest.
    Hope this is a little clearer.
    Thanks
    K M R

  • RE: Need help on cornering the APO BI issue relevant to Planning area

    Hi guys,
    I am loading historical data from my InfoCube into an APO planning area. Planning in APO will be done on a weekly basis, and for that my client has configured a fiscal year variant with 48 periods rather than 52. The client plans in weeks starting 07.01.2010, 14.01.2010, 21.01.2010, 31.01.2010, and so on.
    For testing purposes we are loading the data from a flat file into the InfoCube; I created a generic export DataSource into the SCM APO system, loaded the data into a cube, and from that cube I am feeding the planning area.
    When I execute transaction /n/sapapo/tscube, the data is copied successfully. But when I look at the key figure amount in the planning area (transaction /n/sapapo/sdp94), it has been distributed across the weeks. Say my key figure value in the InfoCube is 100 for January, week 1: what I see in the planning area is 25 in each of weeks 1 through 4, totaling 100 for the month. It should not be like that; the 100 should go into the particular week and be displayed as 100 for that week.
    I have calmonth, calday, and fiscper (posting period), which we have maintained in OB29 as 48 periods. When I derive calweek in the transformation I get 48 weeks, but when I try to load to the planning area I get an error that the combination is inconsistent with calmonth.
    The code with which I derived calweek from calday:
    * Derive a 6-character period YYYYPP from the calendar day by
    * looking up the posting period in T009B. Note: the lookup does
    * not restrict the fiscal year variant (PERIV).
    DATA: lv_year(4)   TYPE c,
          lv_month(2)  TYPE c,
          lv_day(2)    TYPE c,
          lv_result(6) TYPE c,
          v_poper      TYPE poper.

    lv_year  = SOURCE_FIELDS-calday+0(4).
    lv_month = SOURCE_FIELDS-calday+4(2).
    lv_day   = SOURCE_FIELDS-calday+6(2).

    SELECT SINGLE poper FROM t009b INTO v_poper
      WHERE bumon = lv_month
        AND butag = lv_day.

    IF sy-subrc = 0.
      " Combine the year with the two-digit period: YYYYPP
      CONCATENATE lv_year v_poper+1(2) INTO lv_result.
      RESULT = lv_result.
      CLEAR lv_result.
    ENDIF.
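    As an aside, if the goal is ISO calendar weeks (0CALWEEK) rather than the 48-period fiscal week, the standard function module DATE_GET_WEEK derives the week directly from the calendar day. A hedged sketch (note this yields 52/53 ISO calendar weeks, which will not match a 48-period fiscal variant):

```abap
* Hedged alternative: derive the calendar week (format YYYYWW) from
* the calendar day with the standard function module DATE_GET_WEEK.
* This gives 52/53 ISO weeks, not the 48 fiscal posting periods.
DATA lv_week TYPE scal-week.

CALL FUNCTION 'DATE_GET_WEEK'
  EXPORTING
    date = source_fields-calday
  IMPORTING
    week = lv_week.

RESULT = lv_week.
```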
    Gurus, can anybody throw some light on this? I will be highly obliged.
    When I load the data from the InfoCube to the planning area using /SAPAPO/TSCUBE, the copy is successful, but the key figure value is being disaggregated. For example, if my sales history for week 1 (01.2010, calmonth 01.2010) is 100, it is disaggregated over the month's four weeks as 25, 25, 25, 25, but it should be written as 100 for week 1; it should remain aggregated at 100 for week 1.
    Do I need to check any characteristic combinations? All are consistent. Even the periodicities in the planning area and InfoCube are consistent, since I am able to copy into the planning area. I don't have calweek in my InfoCube; I derived calweek with the logic provided earlier in this thread, but for now I am going with calyear, calmonth, and fiscper3 (posting period), since 48 posting periods are maintained in OB29 (table T009B).
    Do I need to implement any OSS notes for this? If I include calweek and calmonth and try to copy into the planning area, I get the error that the periodicities do not match: SAP Note 1408753 - /sapapo/tscube wrong error message
    /SAPAPO/TSM 232
    Regards
    Balaram

    Thanks for replying to the thread.
    Where do I maintain the planning object structure (storage bucket profile) and planning book data view (time bucket profile) with time periodicities specific to the fiscal year variant and posting period? Can you please elaborate? I am new to the APO BW implementation part, and this is a burning issue.
    What settings do I actually need to make there? I think the InfoCube structures are good, since the data copies into the planning area. I have calmonth and fiscper3 for 48 periods in my InfoCube.
    Also, do I need to maintain calweek in the planning object structure (storage bucket profile) and the planning book data view (time bucket profile)? When I checked the key figure overview in the planning book, it is maintained there for 48 periods. How can we achieve this for the planning book data view (time bucket profile)?
    If you could throw some more light on this, I have an APO counterpart and will ask him to change the master planning structure accordingly.
    Regards
    Ram

  • Planning area data not updated in infocube in APO system

    Hi all,
    The user uploads a flat file via an add-on program to InfoCube 1; from this InfoCube, data for the sales forecast key figure is transferred to a planning area key figure using the standard program RTS_INPUT_CUBE. I can see the updated data in the data views of the planning book (which refers to the same planning area). In the planning book, the sales forecast data is also copied to a second key figure, 'Arrangement FC', and it is correct.
    Now there is a second InfoCube (InfoCube 2) which gets data directly from this planning area (it also contains both key figures). When I checked InfoCube 2, the updated data is available in the sales forecast key figure, but the arrangement forecast key figure still has only the old data; the user runs a query on the second key figure, so he gets a wrong report.
    Since there is no data flow for InfoCube 2 and it gets data from the planning area, I believe it is a remote InfoCube.
    I have also found the DataSource for this planning area but don't know how to proceed to find why the data is not updating properly. Please help with this.
    I know that two weeks ago the data was updated properly, but this time it is not.
    The system version is SAP SCM 4.0.

    Hi Vivek,
    It is advisable to run the background jobs when the planning books are not being accessed by users, to prevent such inconsistencies. Hence I would advise you to run the jobs during non-working hours; if you have a global system, you may restrict the jobs to run at a regional level.
    In addition, it is good practice to run consistency jobs before and after the background jobs have completed. I hope you are using process chains to execute the sequence of jobs; if so, you can add consistency check jobs to the process chains.
    Some of the consistency check reports are:
    /SAPAPO/OM17 - liveCache Consistency Check
    /SAPAPO/TSCONS - Consistency Check for Time Series Network
    /SAPAPO/CONSCHK - Model Consistency Check
    and so on.
    You can find these consistency jobs under APO Administration --> Consistency Checks in APO.
    Let me know if this helps.
    Rgds, Sandeep

  • SEM Planning Areas Error

    Hi All,
    We have recently upgraded to BW 3.3 Content. After the upgrade I am unable to access or change the planning areas in transaction BPS0. I get a pop-up with the error message "Syntax error in program /1SEM/CL_CHALEV_100ZDIN=======CP" (ZDIN is the name of my planning area).
    The error occurs only for existing planning areas; I am able to create new planning areas without any problems.
    I have already referred to notes 528726, 412101, and 435425, but no luck.
    Has anybody faced this issue?
    Bye
    Dinesh

    Hi Dinesh,
    Have you referred to this note?
    http://help.sap.com/saphelp_nw04/helpdata/en/5d/7c4b52691011d4b2f00050dadfb23f/frameset.htm
    Hope it helps.
    Regards
    Siddhu

  • Reassigning Datasource to a new planning area

    Hi,
    Currently we have a DataSource on a planning area. However, this planning area did not follow the naming conventions, so a new planning area had to be created. Now we need to reassign the DataSource to the new planning area. Is there any way to achieve this without deleting the DataSource and recreating a new one with the same name?
    Thanks & Regards
    Dharmendra

    There may be possible approaches through backend database table updates; however, I would not recommend that. You would need to regenerate the DataSources, otherwise there will surely be issues while migrating or even while doing backups.

  • Generating data source for a planning area

    Hi all,
    I have a problem when generating a DataSource for a planning area, as it gives the following log messages:
    Activate table /1APO/EXT_STRUXXXXX
    Table /1APO/EXT_STRUXXXXX was activated with warnings
    and the DataSource is not being activated.
    I have also created an entry in the table /SAPAPO/TSAREAEX and tried to activate the DataSource for the planning area; it says the DataSource already exists.
    Can anyone give me a straight solution for this?
    thnks in advance.

    Hi Satish,
    After you generate the datasource, go to transaction /n/sapapo/sdp_extr and try doing a check on this datasource. If you still get those warning/error messages, try the Repair datasource option and see if it fixes the issue. In RSA1, replicate this datasource and then try activating it there and see if it works. Don't create an entry in the table /SAPAPO/TSAREAEX, as this is not a standard recommended way of doing it. Good luck
    Regards,
    Mohammed.

  • Back up cube for report from planning area

    Hi Gurus,
            We are in implementation, and the client wants to see daily data, then delete that data and load the next day's data. In addition, the client also wants monthly data in the same cube, and both requirements should be met in a single InfoCube.
    How is this possible?
    For further info, we have calday and calmonth time characteristics.
    Please suggest some good ideas.
    I will be very thankful for your suggestions.
    Thanks a Lot
    Regards,
    Raj

    Hi Raj,
    You can use SAP BW to create reports for the user. You have the option of using an external BW system or the BW system that is coupled with the APO server. If you have an external BW system, then it is recommended that you report from there. As you may know APO is an OLTP system and thus is configured for such. The external BW system will be configured for OLAP access and thus would be much more suitable for reporting purposes.
    You may want to create an SAP RemoteCube so that the data in your report is "as fresh as possible".
    Here are the steps if you will be using the BW component in the APO system:
    1) Create an export DataSource for your planning area (transaction /n/SAPAPO/SDP_EXTR).
    2) Create an InfoSource and attach the DataSource you have created in step 1 (transaction RSA1, Modelling tab).
    3) Create an SAP RemoteCube and attach your InfoSource and source system to it.
    4) Create a BEx query and a BEx report (either in Web or Excel).
    If you will be using an external BW system here are the steps:
    1) Create an export datasource for your planning area in the APO system (Transaction /n/SAPAPO/SDP_EXTR)
    In the external BW system:
    2) Replicate the DataSource you have created (transaction RSA1, Modelling tab, Source Systems -> choose the APO system -> right-click -> Replicate DataSources).
    3) Create an InfoSource and attach the DataSource replicated in step 2 (transaction RSA1, Modelling tab).
    4) Create an SAP RemoteCube and attach your InfoSource and source system to it.
    5) Create a BEx query and a BEx report (either in Web or Excel).
    Note that a RemoteCube is only suitable for few users only. If you will have many users, you need to create a Basic InfoCube instead.
    In your BeX query, you can choose the granularity of your report. If you want your report to be aggregated to monthly level then be sure to include the 0CALMONTH InfoObject.
    Please post again if things are not clear and I will be happy to help out.

  • SNP planning Area Backup

    Hi All,
         Currently I am working on SCM 5.0, and I have a query.
    For reporting purposes we need to take a backup of the SNP planning area, so that I can save the SNP planning area data in an InfoCube. I would like to see the data in the InfoCube by product, location, and resource only, so that I can use it for reporting.
         Can we create a whole data flow process for the SNP planning area?
         We have a lot of products and resources in APO, so is it feasible to take a backup of the SNP planning area?
         I want to take the backup only for reporting purposes, so that I can publish reports by product, location, and resource.
        Kindly suggest a proper solution to achieve the above requirement.
       Thanks in Advance!
    Regards
    Sujay

    Attaching note 428147
    Summary
    Symptom
    When extracting data from an SNP planning area, you must take account of some special features. These are explained in this note.
    Solution
          1. Data can be extracted only for existing SNP aggregates.
          Existing SNP aggregates are:
              o 9AAC (9AACTNAME)
              o 9AACPR (9AACTNAME, 9APPMNAME)
              o 9ALA (9ATRNAME)
              o 9ALO (9ALOCNO)
              o 9ALORE (9ALOCNO, 9ARNAME)
              o 9AMA (9AMATNR)
              o 9AMALA (9AMATNR, 9ATRNAME)
              o 9AMALARE (9AMATNR, 9ARNAME, 9ATRNAME)
              o 9AMALO (9ALOCNO, 9AMATNR)
              o 9AMALORE (9ALOCNO, 9AMATNR, 9ARNAME)
              o 9AMAPR (9AMATNR, 9APPMNAME)
              o 9AMARE (9AMATNR, 9ARNAME)
              o 9APR (9APPMNAME)
              o 9ARE (9ARNAME)
              o 9AREPR (9APPMNAME, 9ARNAME)
               Only the characteristic/characteristic combinations contained in these aggregates may be used in the InfoSource, transfer structure, and transfer rules. The simplest way of ensuring this is to always generate export DataSources for SNP planning areas against one of the aggregates and not against the basis planning object structure '9ASNPBAS'. A typical result of using the basis planning object structure '9ASNPBAS' without correcting the transfer structure is that, while data is transferred to a RemoteCube, it cannot be extracted into a BasisCube.
    As of APO 3.0 Support Package 17 (or note 443613), only export DataSources for aggregates can be generated in SNP planning areas; they are no longer generated for the basis planning object structure.
    Furthermore, note that up to APO 3.0 Support Package 17 (or note 441653), the navigation attributes 9ATTYPE, 9ALOCTO, and 9ALOCFROM must be deleted from the transfer structure for the SNP aggregates '9AMALA' (9AMATNR/9ATRNAME) and '9AMALARE' (9AMATNR/9ATRNAME/9ARNAME). These navigation attributes can be used from the specified correction onward.
    Due to technical restrictions, the characteristics '9ATRNAME' and '9ATTYPE' can only be restricted with single values in the selection for the data extraction. The use of intervals and wildcard characters (+/*) is not allowed for these characteristics.
          2. For an SNP aggregate, only the key figures assigned to it can be extracted. You can see which key figures are assigned to an aggregate in transaction /SAPAPO/MSDP_ADMIN on the Key figure assignment tab of the relevant planning area. As of APO 3.0 Support Package 17 (or note 439366), only the key figures assigned to this aggregate are taken into account when an export DataSource is being generated for aggregates.
          3. During the definition of queries, the following restrictions apply:
               In the case of queries for the aggregate '9AMALA', certain key figures can only ever be extracted with another key figure.The key figure combinations that can only be extracted together are:
              o (9AFSHIP, 9ADMDDF, 9ADMDDFVMI)
              o (9AFSHIPLP, 9ADMDDFLP)
              o (9AFSHIPVMI, 9ADMDDFVMI)
              o (9APSHIP, 9ADMDDI, 9ADMDDIVMI)
              o (9APSHIPLP, 9ADMDDILP)
              o (9APSHIPVMI, 9ADMDDIVMI)
              o (9ATSHIP, 9ADMDDT, 9ADMDDTVMI)
              o (9ATSHIPVMI, 9ADMDDTVMI)
