Copy Data Package (Referencing Target Issue)

Hello Experts,
We are trying to run an allocation, but at runtime the application doesn't recognize the target variable of the Data Package.
This is the Script Logic we are using:
*RUNALLOCATION
*FACTOR = 1
*DIM VERSAO WHAT=$VF$; WHERE = $VT$
*ENDALLOCATION
This is the Data Package Script used to set the parameters:
PROMPT(COPYMOVEINPUT,%SELECTION%,%TOSELECTION%,"Test","VERSAO")
INFO(%EQU%,=)
INFO(%TAB%,;)
TASK(ZBPC_DESP_RATEIOS_RUN_1,SUSER,%USER%)
TASK(ZBPC_DESP_RATEIOS_RUN_1,SAPPSET,ORC_GERENC)
TASK(ZBPC_DESP_RATEIOS_RUN_1,SAPP,Desp_Resp_Oper)
TASK(ZBPC_DESP_RATEIOS_RUN_1,SELECTION,%SELECTION%)
TASK(ZBPC_DESP_RATEIOS_RUN_1,REPLACEPARAM,%VF%EQU%%VERSAO_SET%%TAB%VT%EQU%%VERSAO_TO%)
TASK(ZBPC_DESP_RATEIOS_RUN_1,LOGICFILENAME,TESTE.LGF)
TASK(ZBPC_DESP_RATEIOS_RUN_1,TAB,%TAB%)
TASK(ZBPC_DESP_RATEIOS_RUN_1,EQU,%EQU%)
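For reference, with %EQU% defined as "=" and %TAB% as ";" above, the REPLACEPARAM value expands to:
VF=PTPL;VT=%VERSAO_TO%
That is, %VERSAO_SET% is resolved to the selected source member, while %VERSAO_TO% is passed through unresolved, which matches the WHERE clause in the log below.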
When we run the Data Package, we receive a successful status message, but in the log we can see that the SAP BPC routine doesn't recognize our target. It understands that %VERSAO_SET% is the source of the allocation, but it just doesn't read the target value of the prompt, for which we are using the variable %VERSAO_TO%.
LOG BEGIN TIME:2009-08-10 10:39:14
FILE:\ROOT\WEBFOLDERS\ORC_GERENC\ADMINAPP\Desp_Resp_Oper\TESTE.LGF
USER:CBD\24613820
APPSET:ORC_GERENC
APPLICATION:Desp_Resp_Oper
FACTOR:1
ALLOCATION DATA REGION:
VERSAO:PTPL
VERSAO:WHAT:PTPL,WHERE:%VERSAO_TO%,USING:,TOTAL:
In this sample, "PTPL" is the version we've selected as source, and the target we've selected is "COM", but the variable doesn't pick it up.
Assuming that the "_SET" suffix is used to reference the first variable of the prompt, could you please clarify which tag we should use to reference the second variable of the prompt?
Thanks in advance!
Edited by: Adalberto  Vides Barbosa on Aug 10, 2009 7:26 PM

Hi,
As discussed by the experts earlier, the problem is getting the variable %VERSAO_TO% resolved.
The variable %VERSAO_SET% is a dynamic variable that BPC fills with the value from the DM package; the other variable, however, is not recognized by the system.
Hope this helps.
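One way to isolate the failure (a sketch using only the syntax already shown above; COM is the target member from the example) is to hardcode the WHERE member in the LGF and pass only the source through REPLACEPARAM:
*RUNALLOCATION
*FACTOR = 1
*DIM VERSAO WHAT=$VF$; WHERE = COM
*ENDALLOCATION
If this version runs correctly, the allocation logic itself is fine and the remaining problem is purely that Data Manager does not substitute %VERSAO_TO%.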

Similar Messages

  • Issue in Update routine due to Data Package

    We have a peculiar situation.
    The scenario is as follows:
    We have to load data from ODS1 to ODS2.
    The data package size is 9980 while transferring data from ODS1 to ODS2.
    In the update rule we have some calculations, and we rank the records based on them.
    The ODS key for both ODS1 and ODS2 is the same: Delivery Number, Delivery Item and Source System.
    For example, a Delivery Number has 12 Delivery Items.
    These Delivery Items end up in different data packages, namely Data Package 1 and Data Package 4.
    So instead of having the ranks as 1 to 10, the routine calculates them as 1 to 5 in one package and 1 to 5 in the other, because the items are split across data packages.
    What we require is a rank of 1 to 10.
    The ABAP routine itself is working fine; the data package split is the problem.
    Can anybody suggest an alternative solution to this issue?
    Thanks in advance for your assistance.

    CODE FOR INTER DATA PACKAGE TREATMENT
    PROGRAM UPDATE_ROUTINE.
    *$*$ begin of global - insert your declaration only below this line  *-*
    * TABLES: ...
    * DATA:   ...
    DATA: v_packet_nbr TYPE i VALUE 1.  "package counter, kept across packages
    DATA:
      g_requnr  TYPE rsrequnr.
    DATA:
      l_is        TYPE string VALUE 'G_S_IS-RECNO',
      l_requnr    TYPE string VALUE 'G_S_MINFO-REQUNR'.
    FIELD-SYMBOLS: <g_f1> TYPE ANY,
                   <g_requnr> TYPE ANY.
    TYPES:
      BEGIN OF global_data_package.
            INCLUDE STRUCTURE /bic/cs8ydbim001.
    TYPES: recno   LIKE sy-tabix,
      END OF global_data_package.
    DATA lt_data_package_collect TYPE STANDARD TABLE OF global_data_package.
    DATA ls_datapack TYPE global_data_package.
    * data package enhancement declaration
    TYPES: BEGIN OF datapak.
            INCLUDE STRUCTURE /bic/cs8ydbim001.
    TYPES: END OF datapak.
    DATA: datapak1 TYPE STANDARD TABLE OF datapak,
          wa_datapak1 LIKE LINE OF datapak1.
    * declarations for the business rules implementation
    TYPES: BEGIN OF ty_ydbsdppx.
            INCLUDE STRUCTURE /bic/aydbsdppx00.
    TYPES: END OF ty_ydbsdppx.
    DATA: it_ydbsdppx TYPE STANDARD TABLE OF ty_ydbsdppx WITH HEADER LINE,
          wa_ydbsdppx TYPE ty_ydbsdppx,
          temp TYPE /bic/aydbim00100-price,
          lv_tabix TYPE sy-tabix.
    *$*$ end of global - insert your declaration only before this line   *-*
    * the following definition is new in BW 3.x
    TYPES:
      BEGIN OF DATA_PACKAGE_STRUCTURE.
         INCLUDE STRUCTURE /BIC/CS8YDBIM001.
    TYPES:
         RECNO   LIKE sy-tabix,
      END OF DATA_PACKAGE_STRUCTURE.
    DATA:
      DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
           WITH HEADER LINE
           WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
               MONITOR_RECNO STRUCTURE RSMONITORS "monitoring with record number
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      USING    RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        *-*
    * fill the internal tables "MONITOR" and/or "MONITOR_RECNO"
    * to make monitor entries
      TYPES:
        BEGIN OF ls_rsmonfact,
          dp_nr TYPE rsmonfact-dp_nr,
        END OF ls_rsmonfact.
      DATA: k TYPE i,
            v_lines_1 TYPE i,
            v_lines_2 TYPE i,
            v_packet_max TYPE i.
    * declaration of internal tables
      DATA: it_rsmonfact TYPE STANDARD TABLE OF ls_rsmonfact.
    *************** INTER-PACKAGE COLLECTION TREATMENT ***************
    * read the number of data packages of the current request from the monitor
      ASSIGN (l_requnr) TO <g_requnr>.
      SELECT dp_nr FROM rsmonfact
        INTO TABLE it_rsmonfact
        WHERE rnr = <g_requnr>.
      DESCRIBE TABLE it_rsmonfact LINES v_packet_max.
      IF v_packet_nbr < v_packet_max.
    *   not the last package yet: collect its records and empty the package
        APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
        CLEAR: DATA_PACKAGE.
        REFRESH DATA_PACKAGE.
        v_packet_nbr = v_packet_nbr + 1.
        CLEAR: MONITOR[], MONITOR.
        MONITOR-msgid = '00'.
        MONITOR-msgty = 'I'.
        MONITOR-msgno = '398'.
        MONITOR-msgv1 = 'All data packages have been gathered into one.'.
        MONITOR-msgv2 = 'The last DATA_PACKAGE contains all records.'.
        APPEND MONITOR.
      ELSE.
    *   last data package => perform the business rules
        IF v_packet_max > 1.
          APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
          CLEAR: DATA_PACKAGE[], DATA_PACKAGE.
          k = 1.
    *     put all collected records back into DATA_PACKAGE, renumbering recno
          LOOP AT lt_data_package_collect INTO ls_datapack.
            ls_datapack-recno = k.
            APPEND ls_datapack TO DATA_PACKAGE.
            k = k + 1.
          ENDLOOP.
          CLEAR : lt_data_package_collect.
          REFRESH : lt_data_package_collect.
        ENDIF.
    *   sort the global data package and keep only the first occurrence of
    *   each record; the sort must cover the COMPARING fields, otherwise
    *   DELETE ADJACENT DUPLICATES misses duplicates
        SORT DATA_PACKAGE BY material plant calyear calmonth.
        DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE
              COMPARING material plant calyear.
        SELECT * FROM /bic/aydbsdppx00
            INTO TABLE it_ydbsdppx
            FOR ALL ENTRIES IN DATA_PACKAGE
              WHERE material = DATA_PACKAGE-material
                AND plant    = DATA_PACKAGE-plant
                AND calyear  = DATA_PACKAGE-calyear.
    *   enhance DATA_PACKAGE with the additional target fields
        LOOP AT DATA_PACKAGE.
          CLEAR : wa_datapak1, wa_ydbsdppx.
          MOVE-CORRESPONDING DATA_PACKAGE TO wa_datapak1.
          READ TABLE it_ydbsdppx INTO wa_ydbsdppx
            WITH KEY material = DATA_PACKAGE-material
                        plant = DATA_PACKAGE-plant
                      calyear = DATA_PACKAGE-calyear.
          IF sy-subrc NE 0.       "new product price
            APPEND wa_datapak1 TO datapak1.
          ELSE.                   "a product price already exists
            IF wa_ydbsdppx-calmonth GE DATA_PACKAGE-calmonth.
    *         keep the oldest one (for each year), or overwrite the price if same month
              APPEND wa_datapak1 TO datapak1.
            ENDIF.
          ENDIF.
        ENDLOOP.
      ENDIF.
    * if ABORT is not equal to zero, the update process will be canceled
      ABORT = 0.
    *$*$ end of routine - insert your code only before this line         *-*
    ENDFORM.
    Edited by: mansi dandavate on Jun 17, 2010 12:32 PM

  • Is it possible to have duplicate columns from source List to target List while copying data items by Site Content and Structure Tool in SharePoint 2010

    Hi everyone,
    Recently, I set up a publishing site template that has a lot of sub sites containing a large amount of content.
    On the root publishing site, I created a custom list with many custom fields and saved it as a template, so that any sub site would be able to reuse it later on. My scenario is as follows.
    I need to use the Site Content and Structure tool to copy a lot of items from one list to another. Both lists were created from the same template.
    I used the Site Content and Structure tool to copy data from the source list to the target list, as in the figure below.
    Once copied, all items are complete, but many columns in the target list have been duplicated from the source list, such as PublishDate, NumOrder and Detail, as in the figure below.
    What is the impact of this duplication? Users can input data into the list successfully, but several values of some columns, like the "Link" column, won't display on the "AllItems.aspx" page, despite showing on the edit item form and view item form pages.
    In addition, any newly added item won't appear on the "AllItems.aspx" page at all, despite the fact that these items do exist in the database (I verified this by querying with PowerShell).
    Please recommend how to resolve this column duplication problem.

    Hi,
    According to your description, my understanding is that many repeated columns are displayed after you copy items from one list to another in the Site Content and Structure tool.
    I have tested this in my environment and it worked fine: I created a list A with several columns, saved it as a template, created a list B from that template, and then copied items from list A to list B in the Site Content and Structure tool without any duplication.
    Please create a new list and save it as a template, then create a list from this template and test whether the issue still occurs.
    Please also repeat the test in other site collections.
    As a workaround, you could copy items from one list to another in code using the SharePoint object model.
    More information about SharePoint Object Model:
    http://msdn.microsoft.com/en-us/library/ms473633.ASPX
    A demo about copying list items using SharePoint Object Model:
    http://www.c-sharpcorner.com/UploadFile/40e97e/sharepoint-copy-list-items-in-a-generic-way/
    Best Regards,
    Dean Wang
    TechNet Community Support

  • Can not copy data within a realtime cube due to combination issues

    Hello all,
    I want to use a planning function on a real-time cube to copy data from controlling area CO1 into controlling area CO2. The controlling area and the profit center are part of the real-time cube, but the navigation attribute of the profit center is not.
    When I test/run the planning function via a sequence in the planning modeller, I get an error message: The combination 'CO2/PCxy,#' is not valid, valid is 'CO2/PCxy,abcd': characteristic '0PROFIT_CTR'.
    Does this error have something to do with the relationship between the profit center master data and its navigation attributes?
    Any help would be great.
    Best regards,
    Stefan from Munich/Germany

    I guess this problem is caused by invalid combinations of records in the cube for controlling area CO1. Make the following changes in the planning function: don't include the profit center in the level, just the controlling area. Then create a characteristic relationship of type 'derivation by master data attributes', with controlling area as the source characteristic and profit center as the target. This will maintain the consistency of the characteristic combinations between controlling area and profit center in the cube for the newly generated records.

  • Data Package Issue in DTP

    Hi gurus,
    My data flow is DataSource -> InfoSource -> write-optimized DSO with a semantic key.
    In the source I have 10 records, of which 7 are duplicates (same key).
    I reduced the DTP data package size from 50,000 to 5.
    When I executed the DTP, I got 2 data packages: the first held all 7 records for the same set of keys, and the second held the remaining records.
    My doubt is: if I defined the data package size as 5, how can the first data package hold 7 records instead of 5?
    Thanks in advance !

    Hi,
    It is because of the semantic key setting that you have maintained: data records that have the same key are combined into a single data package. This setting is only relevant for DataStore objects with data fields that are overwritten.
    Semantic groups define how the data packages read from the source (DataSource or InfoProvider) are built.
    This setting also defines the key fields of the error stack. By defining the key of the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
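    As an illustration (assuming the semantic key is a single key field): with 10 source records, 7 sharing key A and 3 sharing key B, and a package size of 5, the DTP builds
    Package 1: all 7 "A" records (a semantic group is never split across packages)
    Package 2: the 3 "B" records
    which is exactly the 7 + 3 split described above.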
    Hope it helps .
    Thanks
    Kamal Mehta

  • Similar issue: data package

    Hi all,
    I have to load data from an ODS to a cube. The requirement is that, while loading, records with the same key must be selected into the same data package. I am using a 3.5 update rule and an InfoPackage to update the target.
    For example
    Article Area
    123    01
    123    02
    123    03
    I need all records for article 123 to be selected into the same data package; a scattered selection shouldn't be allowed. The purpose of selecting the same article into one data package is that only then will the logic written in the update rule work correctly.
    If any body have idea about this please help me.
    Thanks in advance,

    You can't force all records with the same key into the same data package; there will always be a chance that one or more records end up in your next data package.
    What you can do is increase the size of the data package in the InfoPackage scheduler (something like "DataS. Default Size"), so that all data arrives in one data package.
    Lots of people have this problem, but it is a limitation of BW 3.x.
    Hope this helps.

  • Does client copy erase all the data in the target client or?

    hi
    experts
    does a client copy erase all the data in the target client, or everything except the user master records?
    regards
    rajendra.

    It depends on the profile you use for the client copy.
    If you use SAP_ALL, all data is deleted from the target client and the source client data is copied.
    You should therefore review the profiles and understand what each one copies:
    SAP_ALL    All Client-Specific Data w/o Change Documents    
    SAP_APPL   Customizing and Application Data w/o Change Docs 
    SAP_APPX   SAP_APPL w/o Authorization Profiles and Roles    
    SAP_CUST   Customizing                                      
    SAP_CUSV   Customizing and User Variants                    
    SAP_CUSX   Customizing w/o Authorization Profiles and Roles 
    SAP_PROF   Only Authorization Profiles and Roles            
    SAP_UCSV   Customizing, User Master Records and User Variants
    SAP_UCUS   Customizing and User Master Records              
    SAP_UONL   User Without Authorization Profiles and Roles    
    SAP_USER   User Master Records and Authorization Profiles   
    Hope it helps.
    Thanks
    Amit

  • Copy data from one Table to another Table

    How can I copy data from one Oracle table to another Oracle table on a different server? Question 2: How can I clear all of the data in one table with a single SQL script?
    Thanks...

    Question 1:
    I assume you have the privileges; if you don't, ask the DBA to grant them to you. Then:
    1. Log in to database_source. (The link could be created on either side; let's assume the source.)
    2. Create a database link to database_target (note the single quotes):
    CREATE DATABASE LINK link_to_database_target CONNECT TO myuserid IDENTIFIED BY mypassword USING 'database_target';
    3. Copy the table data:
    INSERT INTO targetowner.mytable@link_to_database_target SELECT * FROM sourceowner.mytable;
    COMMIT;
    Question 2:
    You have two options, but you may not have privileges for both.
    Option 1:
    DELETE FROM tableowner.tablename;
    COMMIT;
    Advantage: Since this is a DML (Data Manipulation Language) statement, the deletion only takes effect once you commit the transaction. The data will be gone, but the table size is NOT changed, so it is ready to accept replacement data. DML statements can be executed not only from SQL scripts but from PL/SQL scripts as well.
    Disadvantage: Slow, because every record deletion is logged; that logging is what lets you recover by issuing a ROLLBACK instead of the COMMIT above. Since the table size is not changed, if you are short of disk or tablespace space, you have not resolved the issue.
    Option 2:
    TRUNCATE TABLE tableowner.tablename;
    Advantage: Since this is a DDL (Data Definition Language) command, you do NOT have to commit the transaction (DDL commands automatically commit both before and after their execution). The table size is changed back to the initial extent size, which is the minimum size a table can have and can only be set when the table is created; if it needs to be changed, the table has to be dropped and recreated with a different initial extent size. The execution of this command is not logged, so it is much faster than DELETE.
    Disadvantage: No rollback. Being DDL, this command cannot be executed straight from PL/SQL; if you need to issue it within PL/SQL, you will have to use dynamic SQL.
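    For completeness, a minimal sketch of the dynamic-SQL variant mentioned above (owner and table name are illustrative):
    BEGIN
      EXECUTE IMMEDIATE 'TRUNCATE TABLE tableowner.tablename';
    END;
    /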

  • Maximum package size for data packages was exceeded and Process terminated

    Hello Guru,
    When I execute the process chain, I get the message "Maximum package size for data packages was exceeded" and the process terminates. Can anybody help me with how to proceed in this case?
    Thanks & Regards,
    Suresh.

    Hi,
    When the load is not getting processed due to a huge volume of data, or too many records per data packet, please try the options below:
    1) Reduce the IDoc size to 8000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to the PSA.
    3) Once the load is successful, push the data to the targets.
    In this way you can overcome the issue.
    You can also try the RSCUSTV* transactions (where * is an integer) to change the data load settings:
    to change the data package size for extraction, use transaction RSCUSTV6;
    to change the data package size for uploads from an R/3 system, set the value in R/3 Customizing: transaction SBIW -> General settings -> Maintain Control Parameters for Data Transfer (the setting is source-system specific).
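    For illustration, the control parameters maintained in SBIW are stored per source system in table ROIDOCPRMS; typical (purely illustrative, not prescriptive) values look like:
    MAXSIZE (kB): 20000, MAXLINES: 100000, STATFRQU: 10, MAXPROCS: 3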
    Hope this helps.
    Thanks,
    JituK

  • Copy data of posting level greater 10 in EHP2

    Dear all,
    We installed EHP2 (enhancement package 2) for our BCS system.
    According to the documentation, we should be able to copy data with posting level > 10.
    We are trying to copy data (all data types) from one group currency to another.
    First, we created a copy method with data type "All data types" and one currency translation method.
    We also created a copy task with that method.
    When we run the task, the system keeps showing:
    =====================================================
    Data stream for activity numbers is incomplete
    Message no. UCF6884
    Diagnosis
    The data stream for activity numbers does not contain the Source System and Referenced Activity fields. Therefore, the system cannot store the link between the new activity number to be assigned and the activity number of the source.
    System Response
    The data cannot be copied or loaded.
    Procedure
    Add the missing fields to the data stream for activity numbers.
    ======================================================
    Two fields are missing: Source System and Referenced Activity.
    But I don't know where, or in which data model, we should assign those two fields.
    We tried adding 0BCS_COINRR and 0LOGSYS to the activity number DSO, but the system still shows the same message.
    Does anyone know how to make the copy work with posting level > 10?
    Please advise.
    Thx,
    Jeff
    Edited by: Jeff Huang on Sep 16, 2008 3:25 AM

    Jeff, are you still having this problem?
    After we have implemented EHP2, I hope to copy PL 20 and 30 too.
    Please let me know if you have succeeded or have found a workaround.

  • Unable to bulk copy data. - Random failure, running as SQL Agent with Admin rights and timeouts=0

    Hello,
    My setup is SQL Server 2012 (11.0.5522) and SSIS, still running the packages created in 2008 R2. The server is Windows Server 2008 R2 with 32 GB of memory.
    I am running a control package which calls 4 packages at once, to run them simultaneously for performance reasons. It runs every day without issues, but maybe once a month I get the failure:
    'Unable to bulk copy data. You may need to run this package as an administrator.'
    The SQL Agent account is set up as an admin, it has the right to create global objects, the source and destination databases are on the same server, and the timeout is set to zero. Each package has the standard batch size and takes about 3-4 minutes to complete.
    It's not easily reproducible, and the job always runs fine when I restart the package.
    Any help would be appreciated.
    Kind regards, Graham.

    see
    https://popbi.wordpress.com/2012/09/03/ssis-unable-to-bulk-copy-data-you-may-need-to-run-this-package-as-administrator/
    https://support.microsoft.com/kb/2216489?wa=wsignin1.0
    Visakh

  • Copy data between cubes

    I need to copy data between two cubes (through a business rule). Can I do it using partitioning/replication? If so, does anyone have an example of how it is done? I'm currently using @XREF, but that does not transfer data for blocks that don't already exist in the target database.
    I'm very new to this so a detailed description will help.
    Thanks for your help.

    Yes, partitions are great. I like to use replicated partitions because I can control the data and deal with integrity issues, etc. Your usage may vary.
    Basically, you go to your "source" database, go to the partitions menu, and choose "Create New Partition". You then walk through each of the tabs in the partition dialog:
    Type tab: choose the type; let's say a "replicated" partition.
    Connection tab: choose which databases will be connected by the partition.
    Areas tab: I like to check "Use Text Editor" to just type out the area definitions, and also "Show Cell Count" to gain some confidence the definitions are working as planned. Here you define what data moves from source to target. For example, I might set up the following:
    Source:
    ("Actuals Data"),
    @LEVMBRS("Province",0),
    @GENMBRS("Company",2)
    Target:
    ("Actuals Data"),
    @LEVMBRS("Province",0),
    @GENMBRS("Company",2)
    If the member names don't match, you can adjust that in the advanced properties or mapping properties (if you have multiple slices of data, use the advanced mapping).
    Now validate and save.

  • Data Packages not hitting server

    Hi,
    I have an issue where users log in remotely and submit data packages. When they run a package they get a "successfully submitted" message, but when they look in the View Status box, the package never arrives.
    I have to admit this only happens when they run the packages for a lot of entities, but it should not be happening. Has anyone encountered this problem before?
    Regards,
    Andries

    I have faced this same problem, now running on BPC 5.1 SP8 (on SQL 2005), but first starting back on 5.1 SP3 or thereabouts. Support notes / release notes claim it was fixed sometime around SP4, but I am still facing it on SP8.
    It's sporadic, and I find it happens over both wireless and Ethernet LAN connections from the client. I even see it when I'm running Excel on the server itself, connecting via an RDP connection.
    I've filed a message with SAP support, but haven't yet had the time to reproduce it in ApShell (which is difficult to do, since it's sporadic).
    I've set up Data Manager debugging logs on both client and server, and find they are of very little use: when this problem occurs, exactly nothing is logged to either log.
    Restarting the COM+ applications on the server always seems to resolve the problem; afterward the packages always run on the first attempt.
    I've also noticed that while one user cannot run a package (even after 5 or 10 or 20 attempts), other users can run packages on their first attempt. Then an hour later, when the admin bounces the COM+ applications, the first user can run their package fine.
    The problem seems to occur regardless of the type of package. Most packages are simple script logic, calling a custom logic file (normally LGF, not LGX), but some call the standard copy package, or custom DTS packages that run a stored procedure directly against the database.

  • Wbadmin Insufficient storage available to create either the shadow copy storage file or other shadow copy data

    Wbadmin throws the following error when run on Windows 8.1 (upgraded from Windows 8) with the -allCritical flag:
    'Insufficient storage available to create either the shadow copy storage file or other shadow copy data.'
    I want to capture the operating system's state volumes without hardcoding -include:c:, for cases where the OS is not on drive C.
    Any suggestions?

    Hi Yanivac,
    Regarding the issue: the target volume for a critical-volume backup can be a local drive, but it cannot be any of the volumes that are included in the backup. This is important.
    This error also means that there is not enough space on the target volume, so I suggest you check the free space on it.
    I would like to share the article with you:
    http://technet.microsoft.com/en-us/library/cc742130.aspx
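    A minimal command sketch (the drive letter E: is illustrative; the target must not be one of the volumes included in the backup):
    wbadmin start backup -backupTarget:E: -allCritical -quiet
    The -allCritical flag automatically includes every volume containing operating system state, so no hardcoded -include:c: is needed.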
    Regards,
    Kelvin hsu
    TechNet Community Support

  • Copying data between models

    Hi Gurus,
    I need help in copying data between models. Below is the scenario.
    Model A (Source model)
    It contains plan data:
    Important dimensions: Facility, Product, Time
    Model B (Target model)
    Important dimensions: Facility, Product, Time; Additional dimensions: Cost Center, Cost Element
    Model C (For lookup while copying data)
    Important Dimensions: Facility, Product, Time, Cost Center, Cost Element
    Task:
    Copy data from A to B and populate Cost Center and Cost Element in B by looking them up in Model C (the lookup key is Facility, Product, Time).
    Please let me know the required steps to do the job.
    Can we do it using logic script and avoid BADI?
    Which data package should I use?
    Thanks.

    Hi Vadim,
    It seems I can move the data from A to B at the moment the data gets saved in A via the input screen.
    So far I have written the script below, which is working fine. For now I am hardcoding the cost center; next I need to know how to look the value up from the other model. Please review and let me know. Thanks.
    *XDIM_MEMBERSET P_ACCOUNT = VOLUME, REVENUE
    *XDIM_MEMBERSET P_CATEGORY = Forecast
    *DESTINATION_APP = PLANNING_RETRACTION
    *ADD_DIM COST_CENTER = 3000011
    *WHEN P_ACCOUNT
    *IS "VOLUME", "REVENUE"
    *REC(EXPRESSION = %VALUE%)
    *ENDWHEN
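    For reference, a minimal sketch of the *LOOKUP syntax (MODEL_C and the member VOLUME are assumptions based on the scenario above; dimensions that exist in both models, such as FACILITY, PRODUCT and TIME, are matched on the current member automatically):
    *LOOKUP MODEL_C
    *DIM LKP:P_ACCOUNT="VOLUME"
    *ENDLOOKUP
    *WHEN P_ACCOUNT
    *IS "VOLUME", "REVENUE"
    *REC(EXPRESSION = %VALUE% * LOOKUP(LKP))
    *ENDWHEN
    Note that *LOOKUP returns signed data values for use inside *REC expressions; it cannot return a member ID to drive *ADD_DIM, so deriving the target cost center from Model C's records generally still needs a dimension property or a BADI.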
