SAP R/3 data flow

Is there any in-depth documentation on extracting from R/3, i.e. how RFC is used and the general architecture of how the R/3 connection is established (the involvement of RFC)?
And a question that I had:
We use the Shared Directory Access option on the SAP datastore:
                         SAP Working Directory - Work\Dir
                         Application Shared Directory - \\servername\foldername
In the architecture we are currently using, we have a middleware that handles all the transport.
So, for example, an R/3 data flow writes a .dat file to the SAP working directory. Once it is done, we run an ABAP program (via /oSE38) to move the file from the working directory to the out directory, and a shuttle then moves the file to the application shared directory, so it is not a one-step process.
One step does the extract, and after the shuttle moves the file to the shared directory, the other job reads the flat file from the shared directory.
Now they want to change the architecture and use a one-step process.
I want to know what kind of access is needed on the working directory to move the files to the application shared directory. Any special access?
Does the working directory need write access to the shared directory?
Does DI handle the movement from the working directory to the shared directory, or is there a function that needs to be run on the SAP side to make this work?

This sounds like a case for the custom transfer method, provided you can find an executable you can call which itself transports the file.
https://wiki.sdn.sap.com/wiki/display/BOBJ/ChosingtheTransport+Method
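For illustration, a minimal sketch of what such a custom transfer program could look like, written here in Python rather than a .bat file: it copies the generated .dat file from the application shared directory to the DI job server's local directory. The paths and the assumption that DI passes the file name as the first argument are placeholders; check the datastore's custom transfer settings for the exact substitution variables it supplies.

    # Hypothetical custom transfer step (sketch): copy the .dat file that the shuttle
    # has already placed on the application shared directory into the DI local directory.
    # The UNC path, local directory and argument convention are assumptions for this sketch.
    import shutil
    import sys
    from pathlib import Path

    SHARED_DIR = Path(r"\\servername\foldername")    # application shared directory (placeholder)
    LOCAL_DIR = Path(r"C:\DI\local")                 # DI job server local directory (placeholder)

    def transfer(filename: str) -> int:
        source = SHARED_DIR / filename
        target = LOCAL_DIR / filename
        if not source.exists():
            print(f"File not found on shared directory: {source}")
            return 1                                 # non-zero exit code -> DI treats the transfer as failed
        shutil.copy2(source, target)
        return 0

    if __name__ == "__main__":
        sys.exit(transfer(sys.argv[1]))              # assumed: the .dat file name is passed as an argument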

Similar Messages

  • Need to know how to check the outbound queue in SAP for data flow into CRM

Hi Specialists,
I have changed an entry in the customer master data in SAP R/3, and this change has come through fine to CRM. I checked the inbound BDoc in CRM (transaction SMW01) and it has arrived fine in the CRM system. I am also able to see the affected field in CRM with the new value.
I need to know how to check this from the SAP side, i.e. I have checked the outbound IDoc list (WE02) but could not find the record.
Is there any other way in SAP that I can check?
Also, please let me know if there is any way other than IDocs to send data from R/3 to CRM.
Useful answers will definitely be rewarded with points.
    Thanks,
    Abhinav.

    Hi Abhinav,
There are no IDocs generated in R/3 for this replication. The R/3 system remotely calls BAPIs and function modules on the CRM system which are part of the adapters provided for the middleware.
Try deregistering the outbound queues in R/3 (transaction SMQS) and see whether you get an entry in the outbound queue (transaction SMQ1). I have not tried this.
If you don't get any entry in the outbound queue, then there is definitely some FM at the R/3 end that calls the CRM system remotely.
Reward points if it helps.
    Regards,
    Amit Mishra

  • How to derive month/year from date in SAP BW 3.5 data flow

    Hi
How can we derive calendar year/month and fiscal year/month from a date in an SAP BW 3.5 data flow (we're using transfer and update rules)?
    Thanks,
    PK

    Hi,
If you have a date field on the source side, you can simply map it to any time characteristic; the system will automatically convert it to the target objects.
Please look at the screenshot for reference (it is from 7.x, not 3.x).
    Thanks,
    Phani.
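To make the derivation itself concrete, here is a minimal sketch in Python of the logic a transfer/update-rule routine would implement; the April fiscal-year start is only an assumed example and must be replaced by your fiscal year variant.

        # Sketch: derive calendar month and a fiscal year/period from a date.
        # Assumptions: the date arrives as an 8-character 'YYYYMMDD' string (like SAP DATS),
        # and the fiscal year starts in April (example only; depends on the fiscal year variant).
        def derive_periods(dats: str, fiscal_start_month: int = 4):
            year, month = int(dats[:4]), int(dats[4:6])
            calmonth = f"{year:04d}{month:02d}"                 # calendar year/month, e.g. '200910'
            if month >= fiscal_start_month:
                fiscal_year, fiscal_period = year, month - fiscal_start_month + 1
            else:
                fiscal_year, fiscal_period = year - 1, month + 12 - fiscal_start_month + 1
            fiscper = f"{fiscal_year:04d}{fiscal_period:03d}"   # fiscal year/period, e.g. '2009007'
            return calmonth, fiscper

        print(derive_periods("20091023"))   # ('200910', '2009007') with an April fiscal-year start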

  • SAP R/3 data flow

    Hi all,
I am working on the Accounts Payable Rapid Mart. Can I have one job that first creates all the .dat files in the SAP working directory, and another job that processes the .dat files from the application shared directory, without having to run the R/3 data flow again?
In other words:
The 1st job gets the data from the SAP R/3 table and puts it in the data transport (i.e. it writes the .dat file to the working directory of the SAP server).
The 2nd job picks up the .dat file from the application shared directory without having to run the first job again.
Is the above method possible, and if so, how?
I would really appreciate any comments or explanations.
    Thanks
    OJ

    Imagine the following case:
    You execute your regular job.
It starts a first data flow.
A first ABAP program is started... runs for a while... then finishes.
Now the system knows there is a data file on the SAP server and wants to get it.
Because we configured the datastore to use a custom transfer program for the download, the tool expects our batch file to download the file from the SAP server to the DI server.
Our custom transfer program does nothing other than wait for 15 minutes, because we know the file will be copied automatically without our intervention. So we wait, and after 15 minutes we return with "success".
DI then assumes the file has been copied and starts reading it from the local directory...
The entire trick is to use the custom transfer batch script as a way to wait for the file to be transported automatically. In a real implementation the batch script would not simply wait, but check whether the file is actually available... something along those lines.
So only one job execution, no manual intervention.
Got it? Will it work?
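A minimal sketch of that trick, assuming DI hands the custom transfer program the full path of the expected file as its first argument and interprets a zero exit code as success (the timeout and polling interval are placeholders):

        # Sketch of a custom transfer program that copies nothing itself but waits for the
        # externally managed transport to deliver the file to the DI local directory.
        # Assumption: the expected file path is passed as the first argument; exit code 0 = success.
        import os
        import sys
        import time

        def wait_for_file(path: str, timeout_seconds: int = 900, poll_seconds: int = 30) -> bool:
            """Poll until the file exists and its size has stopped growing, or the timeout expires."""
            deadline = time.time() + timeout_seconds
            last_size = -1
            while time.time() < deadline:
                if os.path.exists(path):
                    size = os.path.getsize(path)
                    if size > 0 and size == last_size:
                        return True              # file present and stable: transport has finished
                    last_size = size
                time.sleep(poll_seconds)
            return False

        if __name__ == "__main__":
            sys.exit(0 if wait_for_file(sys.argv[1]) else 1)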

  • Help Required Regarding - SAP Job names using R3 data flows

We call a set of SAP jobs using R/3 data flows in Data Services. Whenever a job fails, we first kill the active SAP jobs by logging into SAP and then restart the jobs.
There are about 100-odd SAP jobs that we call through these Data Services jobs, so we wanted to kill the jobs using reusable code on the SAP side, by passing the job name just before every R/3 flow in case it is still active.
So I wanted to know whether there are any shortcuts to retrieve the set of associated SAP job names, because it will be a tedious process to hard-code the SAP job names and pass them as parameters for all 100+ SAP jobs in the custom reusable code.
Any help or advice on this, please!

"The program is not meeting the expectations, and the problem is due to reflection." Do we know this for certain?
"... my application gets the class name, field name etc. from an XML file, so I don't know their method names beforehand. Now, since every class instance corresponds to a row in the database and I have to call the get and set methods of each class instance, the performance keeps degrading as the number of columns and rows increases. Can somebody suggest some improvement regarding this, and regarding creating multiple instances of the same object?"
Class.forName() will be using a hash already, so there is probably not much room for improvement.
Class.newInstance() probably does not take significantly more processing than a simple "new Fubar();".
Umpteen reflective method invocations (one per column) for each row/instance - are you saying these are the problem?
You can test this easily enough.
If you comment out the reflective method invocations and leave the rest of your code untouched, does your application's processing speed up significantly?

Oracle to SAP BI with BCS – Best Data Flow Design

    Hi,
We are an SAP implementation team. We are at the blueprint stage. My client is a retail business giant. The client has 50% of its transaction data and master data in an Oracle database. Now we are moving to BI 7.0 and also plan to use SAP BCS.
We would like to map all the existing Oracle tables to BI. Please provide any clues regarding the best data flow (from Oracle 10g to BI 7.0).
Your quick and valuable suggestions/links are highly appreciated.
Warm Regards,
Bab

    Hi Ashok,
You have mentioned that you have an Oracle 10g system as the data source, which is a perfect platform for extracting the data from the Oracle system into the SAP BW system.
This can be done through DB Connect, through which you can select the necessary tables and expose them as a DataSource in the BI system, and then create the usual BI objects on top of the DataSource.
Once the data is in BI, we can pull from the BI cube, or form a replica of the cube in BCS application format to use it in the BCS environment.
    Hope this helps,
    Regards,
    Rajesh.

  • Data flow in SAP XI

Can anyone provide me with the data flow in SAP XI?
Thanking you,
    sridhar

    Hi sridhar
    1) Sender Adapter & Sender agreement & communication channel
    2) receiver determination
    3) interface determination
    4) message branch
    5) receiver agreement & communication channel
    6) call inbound adapter
    Regards Mario

  • Regarding data flow in SAP XI

    Hello Friends,
In a landscape where XI is connected to other system(s), is there any way to find out how the data flows through, or how it is designed to connect to other systems in SAP XI, other than via the SLD?
    thanks

If you are looking at a specific landscape and trying to work out how the messages flow, then I think you need to get the technical and functional specification documents; based on those, you can get a clear picture of how the different interfaces are designed and which systems they connect.
Regards,
Srinivas

  • Data Flow from SAP Source (ECC) system to SAP BI system

    Hi All,
I want to know how data flows from an SAP source system to an SAP BI system. The description of the data flow should cover:
1) Is the data transferred using IDocs?
2) Which interfaces are involved while the data is being transferred?
3) What happens exactly when you execute the PSA?
If you have any info on this, could you please post it here?
    Regards,
    K.Krishna Chaitanya.

    Hi Krishna,
    Please go through  this article :
    "http://www.trinay.com/C6747810-561C-4ED6-B85C-8F32CF901602/FinalDownload/DownloadId-C2EB7035A229BFC0BB16C09174241DC8/C6747810-561C-4ED6-B85C-8F32CF901602/SAP%20BW%20Extraction.pdf".
    Hope this answers all the mentioned questions.
    Regards,
    Sarika

  • Data Flow in SAP BI - Please Help

    Hello All,
I have to prepare a data flow in SAP BI for an insurance system application.
There are 4 flat files coming from the source. Each flat file has records with insurance details such as policies and so on.
Each flat file represents a line of business. The data model is newly being built. There will also be three years of historic data for reporting, and current-year data is loaded on a monthly basis.
Also, the error records will go to a different target for error reporting, for each line of business.
Please suggest the data flow design that would work best.

  • Display Data Flow - Short Dump

    Hi all,
When I select "display data flow" for any cube, it results in a short dump.
I have searched for the answer in previous forum questions. I could find answers only for previous BW versions, but not for BI 7.
Could you please let me know the solution for this issue?
    Thanks & Regards,
    Eswari

    Hi All,
Thank you very much for all of your responses.
    I am working on Support Package 10.
    Here is the detailed description of the short dump.
    Short text
        The current application triggered a termination with a short dump.
    What happened?
        The current application program detected a situation which really
        should not occur. Therefore, a termination with a short dump was
        triggered on purpose by the key word MESSAGE (type X).
    What can you do?
        Note down which actions and inputs caused the error.
        To process the problem further, contact you SAP system
        administrator.
        Using Transaction ST22 for ABAP Dump Analysis, you can look
        at and manage termination messages, and you can also
        keep them for a long time.
    Error analysis
        Short text of error message:
        GP: Control Framework returned an error; contact system administrator
        Long text of error message:
         Diagnosis
             The Graphical Framework is based on the basis technology known as
             the Control Framework. A method in the Control Framework returned
             an error.
         Procedure
             It probably involves a programming error. You should contact your
             system administrator.
         Procedure for System Administration
             Check the programming of the graphics proxy especially for the
             parameters that were sent and, if necessary, correct your program.
        Technical information about the message:
        Message class....... "APPLG"
        Number.............. 229
        Variable 1.......... " "
        Variable 2.......... " "
        Variable 3.......... " "
        Variable 4.......... " "
    How to correct the error
        Probably the only way to eliminate the error is to correct the program.
        If the error occures in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "MESSAGE_TYPE_X" " "
        "CL_AWB_OBJECT_NET_SAPGUI======CP" or "CL_AWB_OBJECT_NET_SAPGUI======CM005"
        "PBO"
        If you cannot solve the problem yourself and want to send an error
        notification to SAP, include the following information:
        1. The description of the current problem (short dump)
           To save the description, choose "System->List->Save->Local File
        (Unconverted)".
        2. Corresponding system log
           Display the system log by calling transaction SM21.
           Restrict the time interval to 10 minutes before and five minutes
        after the short dump. Then choose "System->List->Save->Local File
        (Unconverted)".
        3. If the problem occurs in a problem of your own or a modified SAP
        program: The source code of the program
           In the editor, choose "Utilities->More
        Utilities->Upload/Download->Download".
       4. Details about the conditions under which the error occurred or which
       actions and input led to the error.
    Thanks,
    Eswari.

  • R/3 data flow is timing out in Data Services

I have created an R/3 data flow to pull some AP data from SAP into Data Services. This data flow outputs to a query object to select columns and then to a table in the repository. However, the connection to SAP is not working correctly. When I try to process the data flow, it just idles for an hour until the SAP timeout throws an error. Here is the error:
    R/3 CallReceive error <Function Z_AW_RFC_ABAP_INSTALL_AND_RUN: connection closed without message (CM_NO_DATA_RECEIVED)
I have tested authorizations by adding SAP_ALL to the service account I'm using, and the problem persists.
Also, the transports have all been loaded correctly.
My thought is that it is related to the setting that controls how the ABAP code for the data flow is generated and executed, but I can't find any good documentation that describes this, and my trial-and-error approach so far has not produced results.
    Any help is greatly appreciated.
    Thanks,
    Matt

You can't find any good documentation??? I am working my butt off just... just kidding.
I'd suggest we divide the question into two parts:
My data flow takes a very long time; how can I prevent the timeout after an hour? Answer:
Edit the datastore; there is a flag called "execute in background" to be enabled. With that, the ABAP is submitted as a background spool job and hence does not hit the dialog-mode timeout. Another advantage is that you can watch it running by browsing the spool jobs from the SAP GUI.
The other question seems to be: why does it take that long at all? Answer:
Either the ABAP takes that long because of the data volume,
or the ABAP is not performing well, e.g. a join done via ABAP loops with the wrong table as the inner one.
Another typical reason is using direct_download as the transfer method. This is fine for testing, but it takes a very long time to download data via the GUI_DOWNLOAD ABAP function, and the download time is part of the ABAP execution.
So my first set of questions would be:
a) How complex is the data flow - is it just source - query - data_transfer, or are there joins, lookups, etc.?
b) What is the volume of the table(s)?
c) What is your transfer method?
d) Have you had a look at the generated ABAP? (In the R/3 data flow, open the menu Validation -> Generate ABAP.)
    btw, some docs: https://wiki.sdn.sap.com:443/wiki/display/BOBJ/ConnectingtoSAP

  • Data Flow terminated due to error 120307

    Hi.
I get this error when executing the project.
Source system: Sybase IQ.
Target system: SAP HANA.
Some of the tables copied successfully, but the job terminated anyway.
I have attached screenshots of the Progress screen and Monitoring.
The error log is empty.
In the trace log I see the errors from the subject line.
I also have another strange message in the trace log:
    Cache statistics determined that data flow <SYBASE_IQ_2_HOD_DBA_FACT_FINAL> uses 0 caches with a total size of 0 bytes, which is less than (or equal to) 3757047808 bytes available for caches in virtual memory. Data flow will use IN MEMORY cache type.

I executed this job from Data Services Designer and got another error:
main Bufman: An error was detected on a database page. You may have a damaged index. For additional information, please check your IQ message file or run sp_iqcheckdb
I am trying to find the issue on Google.

  • Changing Character set in SAP BODS Data Transport

    Hi Experts,
I am facing an issue extracting data from SAP.
Job details: I am using an ABAP data flow which fetches the data from SAP and loads it into an Oracle table using a data transport.
It gives me the error below while executing my job:
    (12.2) 05-06-11 11:54:30 (W) (3884:2944) FIL-080102: |Data flow DF_SAP_EXTRACT_QMMA|Transform R3_QMMA_EXTRACT__AL_ReadFileMT_Process
                                                         End of file was found without reading a complete row for file <D:/DataService/SAP/Local/Z_R3_QMMA>. The expected number of
                                                         columns was <30> while the number of columns actually read was <10>. Please check the input file for errors or verify the
                                                         schema specification for the file format. The number of rows processed was <8870>.
Reason: when I analyzed it, I found that the cause is the presence of special characters in the data. While generating the data file in the SAP working directory (which is available on the SAP application server), the SAP code page is 1100, due to which the file's delimiter and the special characters are both represented with #. So once the ABAP is executed and the data is read from the file, the # is treated as a delimiter, which throws the above error.
I tried to replace the special characters in the ABAP data flow, but the ABAP data flow does not support the replace_substr function. I also tried changing the code page value to UTF-8 in the SAP datastore properties, but this didn't work either.
Please let me know what needs to be done to resolve this issue. Is there any way we can change the character set while reading the generated data file in BODS, i.e. convert code page 1100 to UTF-8?
    Thanks in advance.
    Regards,
    Sudheer.

Unfortunately, I am no longer working on this particular project/problem. What I did discover, though, is that /127 actually refers to the character <control>+<backspace> (http://en.wikipedia.org/wiki/Delete_character).
In SAP, this and any other unknown characters get converted to #.
The conclusion I came to at the time was that these characters had made their way into the actual data and were causing the issue. In fact, I think it is still causing the issue, since no one takes responsibility for changing the records, even after being told exactly which records need to be updated ;-)
I think I did try to make the changes to the above-mentioned file, but without success.
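If correcting the source records is not an option, one possible workaround is to clean the generated file before the Data Services file format reads it. The sketch below assumes code page 1100 can be treated as Latin-1 and that stray control characters (such as the /127 delete character) are the only problem; the file path is taken from the error message above, and the cleaning rule itself is an assumption:

        # Hypothetical pre-processing step: re-read the .dat file written by the ABAP data flow,
        # strip ASCII control characters, and re-write the file as UTF-8.
        # Assumption: code page 1100 is treated as ISO-8859-1 (Latin-1); tab/newline/CR are kept
        # so the row and column structure of the transport file survives.
        import re

        SOURCE = r"D:/DataService/SAP/Local/Z_R3_QMMA"          # file named in the error message
        TARGET = r"D:/DataService/SAP/Local/Z_R3_QMMA_clean"    # cleaned copy for the file format

        CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]")

        with open(SOURCE, "r", encoding="latin-1") as src, \
             open(TARGET, "w", encoding="utf-8", newline="") as dst:
            for line in src:
                dst.write(CONTROL_CHARS.sub("", line))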

  • Automatic creation of BW data flow documentation

    Dear Gurus,
I need to write documentation of the data flow of a huge project which I haven't implemented myself.
The documentation should contain a mapping of the objects in each DataProvider to the objects in the source system(s),
possibly with information about which DataProviders the objects are included in, e.g. between the MultiProvider and the source system.
Details of transformations can be ignored; at most, mention that a routine is involved.
With the metadata repository, I can get the content of the cubes in a graphical overview, but it doesn't really give me useful information.
As you can imagine, I would prefer an automatic way to create this documentation.
Does anybody know a solution, even if it only covers part of the purpose?
Any solution via a query, standard SAP, or a customized program, ...
Recommendations would be very highly appreciated!
    Thx & Rgds, sam

Documentation is written on SAP BW projects worldwide, but there has been no reply on automatic documentation.
A lot of time must be lost by manually creating documentation that maps objects to source system fields.
==> SAP, please, work out a solution.
I didn't find a satisfying solution, but I've done it the following way:
List all objects for a MultiProvider via the metadata repository and paste them into an Excel document.
Then list all objects for the underlying DataProviders and paste them into separate sheets of this Excel file.
Compare the objects of the MultiProvider with the objects on the other sheets using Excel functions, and mark when a DataProvider contains a certain object.
For the DataSources, I checked whether an object is present and, if yes, noted the original source field.
In summary, not an optimal or complete solution, but it prevents mistakes.
Rgds, sam
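The comparison step that the Excel functions do can also be scripted. A minimal sketch, assuming the object lists have already been exported from the metadata repository; the object and provider names below are made-up placeholders:

        # Sketch: cross-reference MultiProvider objects against the objects of the underlying
        # DataProviders, producing the same matrix that was built manually in Excel.
        # The lists below are placeholders; in practice they would come from the
        # metadata repository export.
        import csv

        multiprovider_objects = ["0MATERIAL", "0CALMONTH", "0AMOUNT"]
        dataprovider_objects = {
            "CUBE_SALES": {"0MATERIAL", "0CALMONTH", "0AMOUNT"},
            "CUBE_STOCK": {"0MATERIAL", "0CALMONTH"},
        }

        with open("mp_object_matrix.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["Object"] + list(dataprovider_objects))
            for obj in multiprovider_objects:
                writer.writerow([obj] + ["X" if obj in objs else "" for objs in dataprovider_objects.values()])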
