External Data Transfer to FI-SL Using RGUREC00 - Is Plan Data Supported?

Hi,
I copied the standard program RGUREC00 to a Z program (to load plan data into FI-SL) and changed it as per the client's requirement, but it doesn't work for plan data; it works only for actuals.
Is there another program to load plan data?
Please guide me.

Check RKEPCU20.
Rob

Similar Messages

  • IDOC data transfer SAP to Java using SAP JCo

    Dear Experts,
    The challenging requirement we have is to create an interface for data transfer between an SAP system and a Java system. The data will be transferred from SAP to Java, and once some processing is done in Java, the details need to be transferred back from Java to SAP.
    For this data transfer we are planning to use the IDoc process, and for the interface we are planning to use the SAP Java Connector (JCo). We have some doubts about this:
    1. The data from SAP is going to be transferred from a custom transaction (Z tcode). An outbound IDoc will be triggered and will carry the details. The doubt is whether the data/details will be transferred to the Java system automatically, or whether we need to perform any other steps in the SAP ABAP coding (such as converting into a flat file, XML file, etc.)?
    2. We are planning to install SAP JCo on the Java server. Is this correct?
    3. Does any other software need to be installed besides SAP JCo?
    4. Since we are going to trigger the outbound IDoc from the custom transaction, we are planning to develop a function module in SE37. Do we need to develop any other program besides this?
    If anybody has detailed steps or an explanation, please share them with us.
    Warm Regards,
    VEL

    Hi All,
    For the above-mentioned issue, we implemented the JCo software on the Java system and created a Java program containing the SAP logon credentials (client, user name, password, and language); a minimal sketch of this logon step is included below.
    Once this Java program was compiled and run successfully, the non-SAP system appeared in the SAP gateway transaction.
    Once the non-SAP system starts appearing in the SAP gateway, it means the SAP and non-SAP systems are connected.
    Regards,
    Velmurugan P
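    As a rough illustration of the Java side described above, here is a minimal JCo 3.x logon sketch. It assumes sapjco3.jar plus the native JCo library are available on the Java server, and every connection value shown (host, system number, client, user, password, language) is a placeholder. Actually receiving outbound IDocs additionally needs a registered JCo server program (for example via the separate SAP Java IDoc library), which is not shown here.

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.ext.DestinationDataProvider;

import java.io.File;
import java.io.FileOutputStream;
import java.util.Properties;

public class SapLogonSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder logon data -- replace with your own host, client, user, password, etc.
        Properties props = new Properties();
        props.setProperty(DestinationDataProvider.JCO_ASHOST, "sap-host.example.com");
        props.setProperty(DestinationDataProvider.JCO_SYSNR,  "00");
        props.setProperty(DestinationDataProvider.JCO_CLIENT, "100");
        props.setProperty(DestinationDataProvider.JCO_USER,   "RFC_USER");
        props.setProperty(DestinationDataProvider.JCO_PASSWD, "secret");
        props.setProperty(DestinationDataProvider.JCO_LANG,   "EN");

        // JCo 3 reads destinations from <name>.jcoDestination files by default.
        try (FileOutputStream out = new FileOutputStream(new File("ABAP_AS.jcoDestination"))) {
            props.store(out, "SAP destination properties (sketch)");
        }

        // Open the connection and ping the SAP system.
        JCoDestination destination = JCoDestinationManager.getDestination("ABAP_AS");
        destination.ping();
        System.out.println("Connected to SAP system " + destination.getAttributes().getSystemID());
    }
}
```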

  • Can the Data Transfer Workbench copy the Sales & Purchase modules' data?

    Hi,
    Just wondering, is there any way to copy the Sales & Purchase modules' data from an existing company to a new company? Copy Express can't do it, but can the Data Transfer Workbench do the copy?
    Thanks
    Regards,
    Danny
    Edited by: Danny Gan on Mar 18, 2008 5:21 AM

    Danny,
    How many fields from ORDR or RDR1 have to be exported depends on the business logic implemented and what was actually used. No one can really provide a standard set of 5 or 10 fields, as you know the client's business better than anyone in the forum.
    If the client used user-defined fields, then data from those fields should also be exported.
    The base fields from ORDR would be CardCode, DocDate, DocDueDate, Transpcode (shipping code), SlpCode, GroupNum (payment terms code), etc.
    Similarly from RDR1: ItemCode, Quantity, Price, and any additional row-level information that may pertain to their business.
    Suda

  • Which DataSources do I need to use for an Actual & Plan comparison of G/L accounts?

    Could you please let me know which DataSources I need to use for an actual vs. plan comparison of G/L account data?
    Current SAP BI version is BI 7.0.
    As per my knowledge, I am thinking of using the datasources 0CO_OM_CCA_1 for Plan data & 0FI_GL_10 for Actuals.
    Is this right?
    I have one more question here:
    While extracting the data from DataSource 0CO_OM_CCA_1, I am getting only the profit & loss account data, whereas the balance sheet account data is not coming.
    There is also some confusion about VTYPE, because the standard DataSource 0CO_OM_CCA_1 is giving the data with value type 10 and version 000.
    As we all know, VTYPE '10' stands for actuals, so how can we say that this DataSource 0CO_OM_CCA_1 gives plan data?
    Please clarify.

    Hi,
    Basically, the 0CO_OM_CCA_1 DataSource is used to extract actuals, plan data, and commitments.
    This data is differentiated by value type; see SAP Note 523742 for more details on value types.
    For balance sheet accounting data check the below link,
    [http://help.sap.com/saphelp_nw04s/helpdata/en/5d/ac7d8082a27b438b026495593821b1/content.htm]
    Regards,
    Durgesh.

  • Can we use actual and plan data in the same cube?

    hi friends,
    Can we use actual data and plan data in the same cube in BPS?
    Thanking u
    suneel.

    Hi,
    Let us take the standard cube 0SD_C03, where we store billing info, sales order info and delivery info. Still, in reporting we can get values without any mismatch in their respective key figures. This is possible because of a characteristic called Document Category, which gets its value in the update rules as a constant.
    Similarly, we can have two types of data (plan and actual) in a single cube by having different values for a characteristic. The most frequently used characteristics for this are 0VERSION and Value Type.
    With rgds,
    Anil Kumar Sharma .P

  • Error in Data Transfer : From CO-PA to SOP Flexible Planning

    Hi Friends
    I am working on a scenario of transferring CO-PA data to flexible planning and am encountering a few problems. I have done the following steps; please check them and get back to me.
    The following configuration steps are done, with an example:
    Characteristic assignment
    • Sales Organization
    • Region
    • Plant
    • Material Number
    • Material Group
    Key figure(s) assignment
    • Sales Quantity
    Assignment: Information structure – operating concern
    S555 – Z001
    Define activities: Z1
    Mass Processing Job: ZSO-PA
    We are able to transfer data from CO-PA to standard SOP, but we also tried transferring to flexible planning SOP by creating a mass processing job, using the configuration path below.
    Production -> Sales & Operations Planning (SOP) ->Functions -> Mass Processing -> Transfer LIS / CO-PA
    After executing the job via MC8G (Schedule), we checked in transaction MC8I (Check); the status is "Processed with errors". Also, no data is displayed in transaction MC95 for the material and active version.
    We couldn't find a log for the error status.
    What is the procedure for transferring data to flexible planning SOP?
    Please get back to me.

    Dear,
    Status: Processed with errors - OK.
    Have you checked the activities? What is the error message - something like:
       COPY
           Planning type in activity and job not identical
           Planning type of job:  SOPKAPAM
           Plng. type of activity: Z004PLAN
    When checking the mass processing job in SOP/flexible planning (transaction MC8I), does the log show the error message "Activity and CO-PA profile have different planning versions", or is it the planned/actual indicator in the CO-PA profile?
    The version should be A00.
    Please check MC8S, then the activity in MC8T, after you create the job in MC8D with the activity you have defined for your info structure.
    After that, check MD62 for the plan values.
    Regards,
    R.Brahmankar

  • CATS - Data Transfer to Target Components via CATA - Error "No Data for Trf

    Hello
    I am recording hours via CAT2 on Network Activity. These hours are then approved.
    However, when I run CATA to transfer the hours to CO and PS, I get the popup message "No Data for Transfer".
    When I proceed further, I get another popup saying all records were read and saved successfully, along with the number of confirmations generated.
    However, when I run CJI3, I do not see any actual cost line items.
    Please help
    JG

    Hello Joy,
    (1) Perhaps you have the "Immediate Transfer to HR" flag set in your data entry profile in CAC1.
    This way, together with the "Release on Saving" flag, every time you save data it gets transferred directly.
    (2) But since you are having the issue with CO and PS, I strongly suggest you check whether the data is in the interface tables:
      a. First, make sure that the data exists in CATSDB with status 30.
      b. Then check the data in CATSPS (interface table for PS) and CATSCO (interface table for CO).
      c. If you don't have the data in these tables, then there are no records to transfer.
    Regards,
    Bentow.

  • Data transfer from R/3 and BW to an external system

    Hi experts
    I have a question for which I couldn't find a good answer.
    It concerns data transfer from both the R/3 system and BW (here the data is master data) to an external system.
    What are the various options to do this, both from the BW side and from the R/3 side?
    Which of them offers the best performance?
    Thanks
    BR
    Amanda
    Edited by: amanda ciders on Nov 20, 2008 2:17 PM

    Hi
    I'm aware that I can use the Open Hub functionality for this. Can anyone explain how to do this, as I am unable to implement it for this situation?
    Does anybody know other solutions, like creating views or function modules?
    It would be of great help if you could explain any solution in greater detail (including Open Hub).
    Thanks
    BR
    Amanda
    Edited by: amanda ciders on Nov 20, 2008 2:26 PM

  • Help me find ghost files after botched data transfer

    I just bought a new MacBook. While setting up the OS upon first boot, I hooked up my old machine (which had a dead LCD, hence the new machine) for the data transfer. Well, after a solid hour+ of data transfer over firewire, there was an error, after which the old machine didn't seem to be a valid source of data transfer for the OS.
    So I boot up the new machine and none of my files appear to have transferred. Ok, I can transfer important things by hand, so it's no problem. I boot the old machine holding down T and it appears as a mounted volume, from which I copy my files.
    Here's the issue: that first, failed attempt to transfer files left "ghost" files somewhere eating up my HD space. I.e., if I sum up the size of all my files and folders in the / directory, they add up to around 60 GB, which seems reasonable given all my music and such. But if I look at my main partition volume in Disk Utility, it claims I'm using a full 179 GB, which is totally unreasonable.
    So there's around 120 GB of "ghost" usage lying around. How do I track it down?

    Log out and then back into the newly created user account. To transfer them, choose Go to Folder from the Finder's Go menu, provide /Users/Shared/ as the path, and drag them there.

  • JPA - Best Practice For Data Transfer?

    I've been considering an alternative method for data transfer between applications: serializing or encoding JPA entities to file (either binary or XML).
    I know this procedure may have several drawbacks compared to traditional exported SQL queries or data manipulation statements; however, I want to know if anyone has considered or used this process for data transfer.
    The process would be to:
    - query the database and load the JPA Entities
    - Serialize or Encode them to file
    - zip up the entire folder with the JPA entities
    - transfer the data to destination machine
    - extract the data to a temp directory
    - reload the JPA entities by de-serializing and persisting them to the database
    The reason I'm considering this process is basically that I have a desktop application (managing member records, names, dates, contributions, etc.) used by different organisations in different locations (which are not related except by purpose, i.e. clubs or churches), and I would like a simple way of transporting all data associated with a single profile (information about a member of the organisation) from one location to another, i.e. users interact only with the application, without the need for any database management tool or such.
    I'm also considering this because it is not easy to generate an SQL Script file without using a dedicated Database Management Tool, which I do not want the application users to have to learn how to use.
    I would appreciate ANY suggestions and probable alternative solutions for this problem. FYI: I'm using a Java DB database.
    ICE

    jschell wrote: "In summary you are moving data from one database to another." True.
    "You only discussed flow one way." Well, the process is meant to be bi-directional. Basically, what I envision would be something like:
    - the user goes to File -> Export Profile...
    - then selects one or more profiles to export
    - the export process completes and the zip archive is created and transferred (copied, mailed, etc.) to the destination location
    then, on the destination PC:
    - the user goes to File -> Import Profile
    - selects the profile package
    - the app extracts, processes and imports the data (JPA serialized, for example)
    "Flow both ways is significantly more complicated in general." Well, if done well it shouldn't be.
    "And what does this have to do with users creating anything?" Well, as shown above, the user would be generating the zip archive (assuming that is the final format).
    Does this make the problem clearer?
    ICE
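    For what it's worth, a minimal sketch of the export/import round trip described above might look like the following. It assumes a hypothetical Serializable @Entity named MemberProfile and an EntityManagerFactory created elsewhere; zipping the resulting file and copying it between machines is left out.

```java
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class ProfileTransfer {

    // Export: load the entities for one profile and serialize them to a file.
    // MemberProfile is a hypothetical Serializable @Entity defined elsewhere.
    public static void exportProfile(EntityManagerFactory emf, long profileId, Path file) throws Exception {
        EntityManager em = emf.createEntityManager();
        try {
            List<MemberProfile> profiles = em
                .createQuery("SELECT p FROM MemberProfile p WHERE p.id = :id", MemberProfile.class)
                .setParameter("id", profileId)
                .getResultList();
            try (ObjectOutputStream out = new ObjectOutputStream(Files.newOutputStream(file))) {
                // Entities must implement Serializable; fetch any lazy associations before this point.
                out.writeObject(profiles);
            }
        } finally {
            em.close();
        }
    }

    // Import: deserialize the entities and merge them into the target database.
    @SuppressWarnings("unchecked")
    public static void importProfile(EntityManagerFactory emf, Path file) throws Exception {
        EntityManager em = emf.createEntityManager();
        try (ObjectInputStream in = new ObjectInputStream(Files.newInputStream(file))) {
            List<MemberProfile> profiles = (List<MemberProfile>) in.readObject();
            em.getTransaction().begin();
            for (MemberProfile p : profiles) {
                em.merge(p); // merge() copes with records that already exist on the target
            }
            em.getTransaction().commit();
        } finally {
            em.close();
        }
    }
}
```

    One caveat worth keeping in mind: binary-serialized entities are tied to the class version, so the XML encoding route mentioned in the original post is more robust if the entity classes evolve.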

  • Unable to connect via "Data Transfer" Mode On N95 ...

    It's been happening for a little while now. When I plug the USB cable in, it asks if I want to connect via PC Suite, Data Transfer, etc.
    When I click Data Transfer it says the following:
    "Unable to activate Data Transfer Mode. Mass Memory is in use by another application"
    Does anyone know the fix for this problem?
    Many thanks
    Message Edited by tiff777 on 06-Nov-2008 07:56 PM
    Message Edited by tiff777 on 06-Nov-2008 07:57 PM

    Hi There
    I'm not sure this will work for you all, but I came across this discussion after suffering the same problem, and after pulling my hair out I finally solved it and thought it was only fair I posted how.
    I had tried all the options on here apart from deleting everything off my memory card (I wasn't keen on that option). I checked the running apps and it just showed that standby was running.
    So I thought I would try uninstalling recently installed items one by one. I had recently installed some software called CALL RECORDER, and although it was not showing as running, it was running in the background; after deleting it, the problem was solved.
    So try to think whether there are any apps you have installed that may be running in the background but not showing up. I would advise deleting these one by one, as I did; if it still doesn't work, you can reinstall them.
    Hope this helps.
    BUT EVEN AFTER ALL THIS I STILL LOVE MY N95 8GB !!!!!!!!!!!!!!!!!!!!!!!

  • Establish a connection through RF modem's on client & server side & to set up PPP communication for data transfer

    hi
    Can anyone here help me out with how to establish a connection between two RF modems for data transfer between client and server using LabVIEW?
    I want to establish a connection between two PCs through RF modems on the client and server side, and to set up PPP communication for data transfer.
    (I have tried data transfer through RS-232 using TCP/IP when the two PCs are connected over Ethernet, which is working.
    I also tried connecting a loopback cable between the two PCs' COM ports and getting data transfer using VISA Configure Serial Port and the other VISA functions, which is working.)
    Can you guide me on how to establish a connection between two RF modems using LabVIEW?
    And how does the data transfer take place between two RF modems through RS-232? Is it using TCP/IP?
    If you have any links or examples for this issue, please do send them.
    I am currently using LabVIEW version 8.
    Waiting in anticipation; please reply ASAP.
    thanking you
    Regards
    Yogan..

    Howdy yogan,
    Maybe you could clarify a few things for me, and we'll see how we can help ya. TCP/IP protocol occurs through an ethernet connection; RS-232 communication occurs through an RS-232 serial connection, typically through a cable that has a DB9 connector on both ends. Do you mean that the RF modems in question have the option to communicate via RS-232 and/or via TCP/IP ethernet? Specific information like the manufacturer of your RF modems, the model number of your RF modems, and how you connect the modems to the PC would enable us to give you more efficient support.
    You can check our Instrument Driver Network (IDNet) to see if a plug-and-play/IVI driver already exists for your RF modem. (You'll need to know its manufacturer and model number.) In the case that you do find an IDNet driver for your modem, you can use this KnowledgeBase article for instructions on how to use the driver.
    Another excellent resource to consider is the NI Example Finder. You can access this within LabVIEW by navigating to Help»Find Examples and then searching for serial or TCP/IP examples.
    Message Edited by pBerg on 03-10-2008 04:35 PM
    Warm regards,
    pBerg

  • HCM Best Practices version 1.500 (HCM Data Transfer Toolbox)

    Hi,
    Does anyone have the HCM Data Transfer Toolbox (BPHRDTT, HR-DTT, HRDTT)? It seems to have been discontinued and replaced by LSMW, which does not provide anything even close to this tool!
    I wish this would have been posted in the Downloads on SDN.
    Thanks
    Check out the following link to see what I mean:
    http://help.sap.com/bp_hcmv1500/HRBP_DTT_V1_500/index.htm
    Edited by: Andreas Mau on Jan 30, 2009 8:42 AM

    Hi Uwe,
    Thanks for giving it a try, but my question is very specific for the reason that SAP has discontinued a perfectly good tool and replaced it with some do-over.
    LSMW - is not always suitable, and for large loads the performance is out of whack. You always have to start from scratch creating everything.
    PU12 - The Interface Toolbox is nice, but it could offer something more like XML or MHTML formatting so you can actually reuse the data without pain. It is also not useful for payroll extraction of retro results for deltas only.
    HR-DTT - was a tool that had all you really needed and was working just fine. It had everything, not just infotypes but also the T558B/C/E load (although I prefer T558D over T558C). Here SAP is totally lacking in support. The two BAPIs for BUS1023 do not cover everything: one is only for outsourcing, the other only for the B/D tables. In the code, the E-table update is commented out, since they forgot to actually add the corresponding structure for the load as an importing parameter...
    SXDA - a rather unknown Data Transfer Workbench which also uses projects to organize the data transfer and hooks up to LSMW as well, but again, that is where the tools end. There are no example projects for any area that are of any use. SCM/SRM has some minor examples, but nothing that could be called exhaustive.
    The link: thanks to SAP support, this link has been discontinued so that no one can ask questions about this issue any more. You may still find the documentation on the Service Marketplace in 50076489.ZIP for HRBP_USA_V1500.
    Edited by: Andreas Mau on Apr 23, 2009 10:36 PM
    Edited by: Andreas Mau on Apr 24, 2009 1:39 AM

  • Data transfer limits -- Confusing info

    When I click on the page on dot mac that tells me how much data transfer I have used, I see two numbers.
    For instance, one line says:
    You are currently using 260.1 MB of 5 GB bimonthly data transfer limit.
    Then below, it lists the date I opened my dot mac account (two days ago), and says, "Amount used: 1.94 GB"
    I can't figure out which number I should be keeping an eye on. I don't want to exceed my 5 GB bimonthly limit.
    Can anyone explain? Thanks, in advance.

    The "data transfer" is how much bandwidth (from your limits) has been used for the bimonthly period.
    The 260.1 MB number is the one to watch. To increase your monthly bandwidth limit from the standard 10 GB you need to purchase additional storage space.
    Doubling the default (1 GB to 2 GB) for about $50 a year moves your data transfer amount from 10 GB to 25 GB.

  • DataS. Default Data Transfer options

    Hi
    What do the below-mentioned terms in the InfoPackage (Scheduler -> DataS. Default Data Transfer) mean?
    1. Maximum size of a data packet in kBytes
    2. Maximum number of dialog processes for sending data
    3. Number of data packets per Info IDoc
    Please help me with an example, and explain what effect increasing or decreasing these three values has. Are the three interconnected, or are they all independent?
    Thanks
    Puneet

    Hello Puneet,
    These are some standard BW Settings done in transaction SPRO.
    SPRO ->SAP Customizing Implementation Guide->SAP NetWeaver->SAP Business Information Warehouse->Links to Other Systems->Maintain Control Parameters for the data transfer
    Maximum size of a data packet in kilo bytes.
    The individual records are sent in packages of varying sizes in the data transfer to the Business Information Warehouse. Using these parameters you determine the maximum size of such a package and therefore how much of the main memory may be used for the creation of the data package.
    SAP recommends a data package size between 10 and 50 MB.
    Frequency with which status IDocs are sent
    With this frequency you establish how many data IDocs should be sent in an Info IDoc.
    Maintain Control Parameters for the data transfer
    Standard settings
    For SAP source systems, you change the control parameter settings in the transaction SBIW (Customizing for Extractors), under Business Information Warehouse -> General Settings -> Control Parameters -> Maintain Control Parameters for Data Transfer .
    Activities
    1. Maximum size of data packages
    For data transfer into BW, the individual data records are sent in packages of variable size. You use these parameters to control how large such a data package typically is. If no entry is maintained, the data is transferred with a standard setting of 10,000 kbytes per data package. The memory requirement depends not only on the setting for data package size, but also on the width of the transfer structure, and the memory requirement of the  relevant extractor.
    2. Frequency
    With the specified frequency, you determine after how many data IDocs an Info IDoc is sent, or how many data IDocs are described by an Info IDoc.
    The frequency is set to 1 by default. This means that an Info IDoc follows every data IDoc. Generally, choose a frequency of between 5 and 10, but not greater than 20.
    The larger the package size of a data IDoc, the lower you must set the frequency. In this way you ensure that, when loading data, you receive information on the current data load status at relatively short intervals.
    In the BW Monitor you can use each Info IDoc to see whether the loading process is running without errors. If this is the case for all the data IDocs in an Info IDoc, then the traffic light in the Monitor is green. One of the things the Info IDocs contain information on, is whether the current data IDocs have been loaded correctly.
    3. Size of a PSA partition
    Here, you can set the number of records at which a new partition is generated. This value is set to 1,000,000 records as standard.
    When you are integrating with Other SAP Components then
    SPRO ->SAP Customizing Implementation Guide->Integration with Other SAP Components->Data Transfer to the SAP Business Information Warehouse->General Settings->Maintain Control Parameters for the data transfer
    Maintain Control Parameters for Data Transfer
    Activities
    1. Source System
    Enter the logical system of your source client and assign the control parameters you selected to it.
    You can find further information on the source client in the source system by choosing the path Tools -> Administration -> Management -> Client Maintenance.
    2. Maximum Size of the Data Package
    When you transfer data into BW, the individual data records are sent in packages of variable size. You can use these parameters to control how large a typical data packet like this is.
    If no entry was maintained then the data is transferred with a default setting of 10,000 kBytes per data packet. The memory requirement not only depends on the settings of the data package, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
    3. Maximum Number of Rows in a Data Package
    With large data packages, the memory requirement mainly depends on the number of data records that are transferred with this package. Using this parameter you control the maximum number of data records that the data package should contain.
    By default a maximum of 100,000 records are transferred per data package.
    The maximum main memory requirement per data package is approximately 2 x 'Max. Rows' x 1,000 bytes.
    4. Frequency
    The specified frequency determines after how many data IDocs an Info IDoc is sent, or how many data IDocs an Info IDoc describes.
    Frequency 1 is set by default. This means that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20.
    The bigger the data IDoc packets, the lower the frequency setting should be. In this way, when you upload, you can obtain information on the respective data load in relatively short spans of time.
    With the help of every Info IDoc, you can check the BW monitor to see if there are any errors in the loading process. If there are none, then the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
    5. Maximum number of parallel processes for the data transfer
    An entry in this field is only relevant from release 3.1I onwards.
    Enter a number larger than 0. The maximum number of parallel processes is set by default at 2. The ideal parameter selection depends on the configuration of the application server, which you use for transferring data.
    6. Background job target system
    Enter the name of the application server on which the extraction job is to be processed.
    To determine the name of the application server, choose Tools -> Administration -> Monitor -> System monitoring -> Server. The name of the application server is displayed in the column Computer.
    7. Maximum Number of Data Packages in a Delta Request
    With this parameter, you can restrict the number of data packages in a delta request or in the repetition of a delta request.
    Only use this restriction when you expect delta requests with a very high data volume, so that, despite sufficiently large data package sizes, more than 1000 data packages can result in a request.
    With an initial value or when the value is 0, there is no restriction. Only a value larger than 0 leads to a restriction in the number of data packages. For reasons of consistency, this number is not generally exactly adhered to. The actual restriction can, depending on how much the data is compressed in the qRFC queue, deviate from the given limit by up to 100.
    Thanks
    Chandran
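    Just to make the arithmetic above concrete, here is a tiny sketch (not SAP code; the total record count is an arbitrary example) of how the quoted defaults and the ~2 x 'Max. Rows' x 1,000 bytes rule of thumb play together:

```java
public class DataPackageEstimate {
    public static void main(String[] args) {
        // Default quoted above: 100,000 records per data package.
        long maxRowsPerPackage = 100_000;

        // Rule of thumb from the text: ~2 x 'Max. Rows' x 1,000 bytes of main memory per package.
        long approxMemoryBytes = 2L * maxRowsPerPackage * 1_000L;
        System.out.printf("Approx. main memory per data package: %d MB%n",
                approxMemoryBytes / (1024 * 1024));

        // Number of data packages needed for a hypothetical load of 2,500,000 records.
        long totalRecords = 2_500_000;
        long packages = (totalRecords + maxRowsPerPackage - 1) / maxRowsPerPackage;
        System.out.println("Data packages for " + totalRecords + " records: " + packages);

        // With the default frequency of 1, every data IDoc is followed by an Info IDoc;
        // with a frequency of 10, one Info IDoc describes 10 data IDocs.
        long frequency = 10;
        long infoIdocs = (packages + frequency - 1) / frequency;
        System.out.println("Info IDocs at frequency " + frequency + ": " + infoIdocs);
    }
}
```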

Maybe you are looking for

  • Issue with payment run

    hi guru's We have one vendor in two company code one is India and one is USA. The bank account number  is different in vendor master data but both accounts in the same country. When we make the payment run it has to split to different account based o

  • Front panel indicators used as controllers and viceversa

    It is possible to use indicators also as controllers? I would like to load data from a file and then to be able to edit that data. Kind regards

  • Creating the first planning app in 9.2.1

    I've created PLANAPP1 in planning desktop, and regenerated the properties file. This is the first planning application I've created using 9.2.1 Using the planning web page I can not login. I get an error message. Can someone tell me how shared servic

  • Broken iphone screen

    Apple with all the knowledge and technology they have has done a very poor job on their crystal like screens. My family and I have 4 iphone4s and mine fell in an apple bumper from my hand and the screen got shattered. Apple store is asking $199 for r

  • Macbook only shuts down when battery is flat and date and time revert to 2008

    Hey Guys My Macbook only shuts down when the battery goes flat. When I switch it back on everything seems to be fine though the date and time are incorrect. Seems to go to default date which is 2008. Other things seem to affect my browser, Facebook w