Run an ABAP program using Data Services/Integrator

Hi,
Is it possible to run an ABAP program from Data Services?
There is an ABAP program already created; what it does is move a file from a working directory to another directory.
Can I execute the ABAP program from Data Integrator? If it's possible,
any suggestions on how to do it?
Thanks

Yes, it might be possible.
First, create a batch file that runs the ABAP program from the command line, as described in the following link, to execute an ABAP from Windows:
http://www.sap-basis-abap.com/abap/how-can-we-run-abap-program-from-command-line.htm
After doing so, call the batch file from Data Services with the exec() function: exec(command_file, parameter_list, flag).
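For example, a minimal sketch in a Data Services script. The batch file path and the $GV_EXEC_OUTPUT variable are illustrative, and flag 8 is used here on the assumption that you want the command's output returned; check the exec() documentation for the flag value that fits your error handling:
# Run the batch file that submits the ABAP program; the path is illustrative.
$GV_EXEC_OUTPUT = exec('C:\\scripts\\run_abap_move_file.bat', '', 8);
print('exec returned: ' || $GV_EXEC_OUTPUT);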
Thanks

Similar Messages

  • Error while calling ABAP program from Data Services

    Hi All,
    We have an ABAP program which accepts two parameters: 1) a date and 2) a string of comma-separated ARTICLE numbers.
    We have used an ABAP transform in an ABAP dataflow which refers to this ABAP program.
    If I pass a string of 6 articles as the second parameter, the job executes successfully.
    But if I pass 9 articles as follows:
    $GV_ITEM_VALUES='3564785,1234789,1234509,1987654,1234567,2345678,3456789,4567890,5456759';
    I get the following error:
    ABAP program syntax error: <Literals that take up more than one line are not permitted>.
    The error occurs immediately after the ABAP dataflow starts, i.e. even before the ABAP job gets submitted to ECC.
    I am using BODS 4.2. The datatype of $GV_ITEM_VALUES is varchar(1000).
    The ABAP program that gets generated by the DS job has the following datatype for this parameter:
    PARAMETER $PARAM2(1000) TYPE C
    Is there a different way to pass string values to the ABAP transform in Data Services?
    I have attached screenshots of the trace log and the error.
    Regards,
    Sharayu

    Hi Sharayu,
    The error you are getting occurs because the literal exceeds 72 characters: the string you pass in the second parameter is longer than one 72-character editor line.
    Can you check the following in the ECC GUI:
    Go to transaction SE38 => Utilities => Settings => ABAP Editor => Editor => 'Downwards-Comp. Line Length (72)'.
    When this checkbox is selected, lines are limited to 72 characters, which is why the error appears. Can you uncheck the checkbox and then try passing the parameter $GV_ITEM_VALUES from the BODS job again?
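    To illustrate why the generated program trips this check, a contrived ABAP sketch with illustrative names (not the code DS generates): an ABAP literal may not span lines, so a value longer than the 72-column compatibility limit cannot be written as one literal; hand-written code would split it with the && operator (ABAP 7.02+; CONCATENATE works on older releases):
    DATA LV_ITEMS(1000) TYPE C.
    * A single literal longer than the editor line limit fails the syntax
    * check - this is what happens to the generated value of $PARAM2.
    * Splitting the value into several one-line literals is valid:
    LV_ITEMS = '3564785,1234789,1234509,1987654,' &&
               '1234567,2345678,3456789,4567890,5456759'.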
    Regards
    Arun Sasi

  • Running ABAP programs using macros

    In our DP implementation we have faced several issues in extracting data from Demand Planning to one of the legacy systems. We have written an ABAP program and will be using the Read Planning Book BAPI. I want to run the ABAP program from a macro: could someone please let me know the structure to use in the macro builder to fire the ABAP program? It is fired from a macro because a comparison has to be done before the program runs.

    Hi,
    Some more information on the macro function REPORT_SUBMIT():
    REPORT_SUBMIT( 'program_name' ; <'job_name'> ; <'job_number'> ; <'newmode'> ) causes the specified program to be executed. Use the optional arguments, job name and job number, if you wish the program to run in the background. If you set the argument 'newmode', the results are displayed in a new window.
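    For example, following the signature above with hypothetical report and job names, a background call might look like:
    REPORT_SUBMIT( 'ZDP_EXTRACT' ; 'ZDP_JOB' ; '0001' )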
    Hope this helps you.
    Regards,
    Sunitha.

  • How do we call Smartforms in an ABAP program or web services

    How do we call Smartforms in an ABAP program or in web services?
    How many types of Smartforms are there?
    Points will be rewarded.

    Hi
    See this sample program. The Smartform is called from the program via the generated function module whose name is returned by 'SSF_FUNCTION_MODULE_NAME'.
    REPORT ZSMARTFORM.
    * Calling SMARTFORMS from your ABAP program:
    * collect all the table data in your program, then pass it to SMARTFORMS once.
    * In the Smartform itself, declare your table type in:
    *   Global Settings -> Form Interface
    *   Global Definitions -> Global Data
    *   Main Window -> Table -> DATA
    * Source: http://sapr3.tripod.com
    TABLES: MKPF.
    DATA: FM_NAME TYPE RS38L_FNAM.
    DATA: BEGIN OF INT_MKPF OCCURS 0.
            INCLUDE STRUCTURE MKPF.
    DATA: END OF INT_MKPF.

    SELECT-OPTIONS S_MBLNR FOR MKPF-MBLNR MEMORY ID 001.

    SELECT * FROM MKPF WHERE MBLNR IN S_MBLNR.
      MOVE-CORRESPONDING MKPF TO INT_MKPF.
      APPEND INT_MKPF.
    ENDSELECT.

    * At the end of your program, pass the data to SMARTFORMS.
    * First determine the name of the generated function module for the form:
    CALL FUNCTION 'SSF_FUNCTION_MODULE_NAME'
      EXPORTING
        FORMNAME           = 'ZSMARTFORM'
    *   VARIANT            = ' '
    *   DIRECT_CALL        = ' '
      IMPORTING
        FM_NAME            = FM_NAME
      EXCEPTIONS
        NO_FORM            = 1
        NO_FUNCTION_MODULE = 2
        OTHERS             = 3.
    IF SY-SUBRC <> 0.
      WRITE: / 'ERROR 1'.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.

    * Then call the generated function module to output the form:
    CALL FUNCTION FM_NAME
      EXPORTING
    *   ARCHIVE_INDEX        =
    *   ARCHIVE_INDEX_TAB    =
    *   ARCHIVE_PARAMETERS   =
    *   CONTROL_PARAMETERS   =
    *   MAIL_APPL_OBJ        =
    *   MAIL_RECIPIENT       =
    *   MAIL_SENDER          =
    *   OUTPUT_OPTIONS       =
        USER_SETTINGS        = 'X'
    * IMPORTING
    *   DOCUMENT_OUTPUT_INFO =
    *   JOB_OUTPUT_INFO      =
    *   JOB_OUTPUT_OPTIONS   =
      TABLES
        GS_MKPF              = INT_MKPF
      EXCEPTIONS
        FORMATTING_ERROR     = 1
        INTERNAL_ERROR       = 2
        SEND_ERROR           = 3
        USER_CANCELED        = 4
        OTHERS               = 5.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    Reward points for useful answers.
    Regards
    Anji

  • Can We Run an ABAP Program in the Background

    Hi,
    How do we run an ABAP program in the background?
    Points will be awarded.
    Regards,
    Jayasimha

    Hi,
    Please see this document as well:
    http://help.sap.com/saphelp_nw04/helpdata/en/fa/096ccb543b11d1898e0000e8322d00/content.htm
    Easy Job Scheduling Using BP_JOBVARIANT_SCHEDULE
    To schedule a job from within a program using the express method, you need only call the BP_JOBVARIANT_SCHEDULE function module.
    The express method has the following characteristics:
    Simplified job structure: The function module schedules a job that includes only a single job step.
    The function module uses default values for most job-processing options. You cannot, for example, specify a target printer as part of the call to the function module. Instead, the job step uses the print defaults of the scheduling user.
    Only ABAP reports can be scheduled. You must use the "full-control" method to start external programs (see the sketch after the example below).
    The range of start-time options is restricted. Event-based scheduling is not supported.
    The function module works as follows:
    You name the report that is to be scheduled in your call to the function module.
    The function module displays a list of variants to the user. The user must select a variant for the report.
    You must ensure that the variants required by your users have already been defined.
    The user picks either "immediate start" or enters a start date and start time. Optionally, the user can also make the job restart periodically. The job is then scheduled.
    Example
    You could use the following code to let users schedule report RSTWGZS2 for checking on the status of online documentation:
    call function 'BP_JOBVARIANT_SCHEDULE'
      exporting
        title_name     = 'Documentation Check'  " Displayed as title of
                                                " the scheduling screens
        job_name       = 'DocuCheck'            " Name of background
                                                " processing job
        prog_name      = 'RSTWGZS2'             " Name of ABAP report to be
                                                " run -- used also to
                                                " select variants
      exceptions
        no_such_report = 01.                    " PROG_NAME program not found

    call function 'BP_JOBVARIANT_OVERVIEW'      " List the jobs that
      exporting                                 " have been scheduled
        title_name  = 'Documentation Check'     " Displayed as title of
                                                " the overview screen
        job_name    = 'DocuCheck'               " Jobs with this name are listed
        prog_name   = 'RSTWGZS2'
      exceptions
        no_such_job = 01.
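    For comparison, the "full-control" method mentioned above builds the job explicitly with JOB_OPEN, SUBMIT ... VIA JOB, and JOB_CLOSE. A minimal sketch for scheduling an ABAP report this way, with illustrative report and job names (ZMY_REPORT, ZMY_JOB):

    data: lv_jobname  type tbtcjob-jobname value 'ZMY_JOB',
          lv_jobcount type tbtcjob-jobcount.

    * Open a new background job
    call function 'JOB_OPEN'
      exporting
        jobname          = lv_jobname
      importing
        jobcount         = lv_jobcount
      exceptions
        cant_create_job  = 1
        invalid_job_data = 2
        jobname_missing  = 3
        others           = 4.

    * Add the report as a job step
    submit zmy_report via job lv_jobname number lv_jobcount
           and return.

    * Close the job and release it for immediate start
    call function 'JOB_CLOSE'
      exporting
        jobname   = lv_jobname
        jobcount  = lv_jobcount
        strtimmed = 'X'
      exceptions
        others    = 1.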
    Regards, ABY

  • Schema Mapping in Data Service Integrator

    Hi,
    I'm just examining some schema mapping programs like BizTalk, Altova and IBM Rational Data Architect, and have now found Oracle Data Service Integrator, which might be a similar tool. Actually, I found out about Data Service Integrator via BEA AquaLogic, which had probably been a schema mapping tool before Oracle acquired it and now offers it as Data Service Integrator.
    My question is whether Oracle Data Service Integrator is really applicable for schema mapping/matching, such as creating mappings between XML, CSV/flat files or database schemas. I have already downloaded it and tried it out, but had trouble creating a map. According to a tutorial you create a physical data service if you want to do something like mapping, but after I did this there was only a "map" with a source schema. There was no way to add a target schema and map it to the source schema.
    So can I create mappings in Data Service Integrator, or are there other products which would be more convenient (Oracle Warehouse Builder, for instance)? If so, does anyone know whether there is a good tutorial on how to map simple schemas such as XML files in Oracle Data Service Integrator?
    Thank you in advance.

    After you create your physical data services, create a logical data service using your target schema as the 'return type'. Then add functions and use the XQuery mapper to map your physical data services (CSV, database, XML, web service, etc.) to your target schema. You can also use logical data services as the input to a logical data service.

  • Extract data from ECC to Oracle using Data Services 4.0

    How do we extract data from ECC 6.0 business content extractors to Oracle using SAP BO Data Services 4.0?

    Are you trying to use the SAP BW Business Content to extract data out of ECC and load it into Oracle tables with Data Services? If that's the case, then you cannot do that. The SAP BW Business Content was developed to be used only in conjunction with SAP BW. When using Data Services to access the extractors in ECC, it has to have an SAP BW InfoPackage associated with it to execute. In this architecture, Data Services is only a pass-through from ECC to BW and provides the ability to do some transformations of data prior to loading into the EDW layer (staging tables, basically) on SAP BW.
    To connect ECC to Oracle, you're going to have to have all of the SAP BusinessObjects-supplied function modules loaded onto ECC, along with a non-dialog logon account that has the ability to pass dynamic ABAP programs, generate the programs and schedule them. Depending on how you want to process the output, you may also have to have the ability to write to files on the ECC application servers and have an FTP account created on the application servers that can GET flat files and potentially DELETE them (you're going to need to delete periodically, otherwise your jobs will crash when the file space allocation has been consumed).

  • Error when scheduling an ABAP program that uses OPEN DATASET

    Issue: I have an ABAP program which uses "OPEN DATASET ... FOR INPUT ..." to read a file.
    - When I run it manually, I receive the message "dataset_not_open".
    - When I schedule it, I receive the same message.
    I am attempting to run the ABAP program as part of a process chain (i.e. a scheduled background job) in BI.
    The ABAP performs the following functions:
    1) reads a file on the server
    2) removes the delimiter and renames the file
    3) rewrites the file onto the server
    I initially used WS_UPLOAD for reading and WS_DOWNLOAD for writing the file.
    - Both function modules worked fine when the program was run manually, but failed in the background (as part of the process chain).
    - Note 7925 states that WS_UPLOAD and WS_DOWNLOAD cannot be used in background jobs,
    - so I switched to OPEN DATASET.
    Any suggestions as to why OPEN DATASET does not work are greatly appreciated.
    B.A.

    Thank you for all the responses. Here is more info about the error message:
    sy-subrc = 8
    'invalid argument'
    I looked up the invalid argument in note 99155: it is due to "The destination file is no longer available during repeated file access." So, the following steps were taken:
    - the file was regenerated, and
    - the file was placed on the server to be read.
    I have the following code:
    OPEN DATASET FILENAME FOR OUTPUT IN TEXT MODE ENCODING DEFAULT
                 MESSAGE D_MSG_TEXT.
    I have also tried the following:
    OPEN DATASET D1 FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    OPEN DATASET D1 FOR OUTPUT IN TEXT MODE ENCODING NON-UNICODE.
    OPEN DATASET D1 FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
    None of them worked. System --> Status shows a non-Unicode system.
    Thanks again for any suggestions.
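    A useful way to narrow this down is to check sy-subrc and the operating-system message immediately after the OPEN. A minimal sketch, assuming an illustrative server-side path:

    DATA: LV_FILE TYPE STRING VALUE '/usr/sap/trans/data/myfile.txt',
          LV_MSG  TYPE STRING,
          LV_LINE TYPE STRING.

    OPEN DATASET LV_FILE FOR INPUT IN TEXT MODE ENCODING DEFAULT
         MESSAGE LV_MSG.
    IF SY-SUBRC <> 0.
      " The OS error text usually pinpoints the cause: a missing file, missing
      " authorizations for the background user, or a path that does not exist
      " on the application server the background job happens to run on.
      WRITE: / 'OPEN DATASET failed:', LV_MSG.
    ELSE.
      DO.
        READ DATASET LV_FILE INTO LV_LINE.
        IF SY-SUBRC <> 0.
          EXIT.
        ENDIF.
        " process LV_LINE here
      ENDDO.
      CLOSE DATASET LV_FILE.
    ENDIF.

    Also note that in a process chain the job may run on a different application server than the one where the file was placed, so a server-specific path can fail only in the background.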

  • Urgent: Calling ABAP Program using JMS

    Hi,
    I have a scenario where a legacy system passes messages to an ABAP program, and this program can handle only one message at a time (it is written that way).
    The receiver communication channel is configured to use the J2SE adapter. The J2SE adapter stores the message on the R/3 system and triggers the ABAP program located in the R/3 system.
    When multiple messages arrive at the same time in SAP XI, they are processed successfully and handed over to the J2SE adapter. But the J2SE adapter triggers the ABAP program for all of the messages, and here is the problem: the ABAP program does not support multi-threading.
    My idea is to use the JMS adapter. Can you suggest how to achieve this, and how to configure the JMS (or any other) adapter to call the ABAP program so that only one message is passed to the ABAP program at a time?
    Regards,
    Gourav Khare

    Hi,
    First find out where your ABAP program writes its data. You need to write it to a spool; only then can you see it.
    You can view the spool in transaction SP01, and you can use the function module 'GET_PRINT_PARAMETERS' in your ABAP program to write to the spool.
    Thanks,
    Ravi

  • How to run an ABAP program in a batch job

    How do we run an ABAP program in a batch job?

    Hello Manish,
    Using transaction SM36 you can define the batch job along with the start conditions for that job:
    1. Go to transaction SM36.
    2. Give the Z name of the job in the 'Job Name' input field.
    3. Click the 'Steps' button in the application toolbar.
    4. In the 'Create Step 1' dialog box, give the name of the ABAP program in the 'ABAP Program' section, along with the variant.
    5. Click the 'Check Input' button in the dialog box.
    6. Once the check is successful, click the 'Save' button in the dialog box.
    7. A list will be shown. Click the Back button in the standard toolbar.
    8. Click the 'Start Conditions' button in the application toolbar, specify the start condition (e.g. immediate), and click Save.
    9. The job status is now 'scheduled'.
    10. Click the Save button in the standard toolbar of SM36. The job status changes to 'released'.
    Using SM37 you can monitor the status of the job.
    This will sort out your problem.
    PS: If the answer solves your query, please reward points.
    Regards

  • Integrating Analytics model with Data Services/Integrator

    We are trying to integrate a regression model (an analytics result) in PMML format into our data solution. We are using Data Services, and our clients also have Data Services/Data Integrator.
    What is the method to apply the analytics model over the complete data solution? We could be flagging or scoring data.
    All help in this regard is appreciated.

    No offense intended. I know neither you in person nor your background, so I thought I had better cover all the possibilities, even the trivial ones. I am sure you have had a conversation like the one below with a customer of yours:
    "My computer is not working!" and after hours you figured out the solution: "Turn on your monitor."
    The problem I was having was the inconsistent information, or how I interpreted it:
    Fact 1: you see swapping and huge memory consumption.
    Fact 2: the sum of the virtual memory of all processes is less than the physical memory, or at least does not account for the swapping.
    Fact 3: hence DI has to have a memory leak.
    That didn't make sense to me. And I am still not sure I got your point about the core reason for it and where my understanding was wrong. Was the al_engine still living as a zombie process within Windows? Or did the SQL Server not release the session memory? I have no idea how to tell either one anyway...
    In regards to row-level locking in the SQL Server database, I can possibly give you some background that might help: SQL Server does row-level locking until it figures that to be inefficient because of the number of changes, and then automatically switches to page-level locking. If somebody else, or even your own process in a different session, is touching any row inside that page, you get an error. This has caused us some trouble with DI, e.g. when you have a Table Comparison in row-by-row mode: a few rows pass through the TC transform, SQL Server switches to page-level locking, and suddenly the TC cannot read a row anymore. The concept of an insert/update prohibiting a read (!!), even a read of a row that was never touched, is something we as Oracle DBAs would never have dreamt of initially.
    Are you intending to go to the Business Objects User Conference this year? [http://www.myboc.org/?extcmp=salesflash_global_2008_2107]
    If yes, let me know. I buy you a beer. From Oracle DBA to Oracle DBA.

  • Error while loading data into BW (BW as Target) using Data Services

    Hello,
    I'm trying to extract data from SQL Server 2012 and load it into BW 7.3 using Data Services. Data Services shows that the job finished successfully. But when I go into BW, I see the below/attached error:
    Error while accessing repository Violation of PRIMARY KEY constraint 'PK__AL_BW_RE_
    Please let me know what this means and how to fix it. I am not sure if I have given sufficient information; please let me know if you need anything else.
    Thanks
    Pradeep

    Hi Pradeep,
    Regarding your query, please refer to the below SCN threads for the same issue:
    FIM10 to BW 73- Violation of PRIMARY KEY -table AL_BW_REQUEST
    Error in loading data from BOFC to BW using FIM 10.0
    Thanks,
    Daya

  • Using Data Services (AMF) with Android

    Hello,
    I'm working on an Android application which needs to invoke an existing AMF service.
    I've read the following post:
    Creating an Android application that invokes Data Services
    ...which states that you need flex-messaging-client-android.jar as part of your build to be able to use Data Services.
    This took me to the following post:
    Create a Data Services application for the Experience Server that returns data
    From there I managed to set up the Experience Server and connect to the packages, but nowhere can I find the "dataservices-sdk-pkg.zip" which is supposed to contain flex-messaging-client-android.jar. All I can find is "dataservices-pkg.zip", which does not have "flex-messaging-client-android.jar" but has a bunch of other JARs. I've also spent a reasonable amount of time searching through Package Share, but no luck.
    I've also Googled and read through all the other posts related to this topic, and none of them seems to lead to success.
    So my question is:
    when connecting an Android application to an AMF backend, do we still need to use "flex-messaging-client-android.jar", or is there a new JAR that has replaced it?
    If "flex-messaging-client-android.jar" needs to be used, could you help me find it?
    Thanks

    Found it...
    In Package Share I found the following package: es-sdk-pkg.
    I downloaded it, saved it to a local drive, and unzipped it.
    In the "es-sdk-pkg" package you can find "dataservices-sdk-pkg" in the following location:
    ...\jcr_root\etc\packages\adobe\aep\platform\sdk
    Unzip "dataservices-sdk-pkg" and "flex-messaging-client-android.jar" is in the following location:
    ...\jcr_root\etc\aep\sdks\riaservices\dataservices\4.5.1\android
    Hope this helps someone else.

  • Push Data into SAP Tables using Data Services

    Hi
    Let's suppose that I have used Data Services 4.0 to pull data from the KNA1 table and load it into MDM, where data cleansing, enrichment and deduplication are done, and MDM then writes the result to a file.
    How can Data Services then push the cleansed data back into the SAP table (KNA1)? [Or can it not be done, for whatever reasons?]
    Rishi

    Hi
    To update an SAP table, import the metadata for the relevant BAPI or IDoc and call it via a query transform.
    M

  • Extract data from a BW 7.0 cube to a SQL DB using Data Services XI

    Hi Gurus,
    We are trying to extract data from a BW 7.0 cube to a SQL DB using Data Services XI. The issue is that we cannot read texts without making joins between the SIDs in the fact table and the master data tables. Do you know if it is possible to read the texts in a natural way?
    Best Regards

    Thanks Wondewossen,
    As you know, datastores in Data Services provide access to:
    1.-Tables
    2.-Functions
    3.-IDOCs
    4.-Open Hub Tables
    We are trying to extract data using the first one (Tables), not using Open Hub.
    Best Regards
