Background job output to file on desktop

Hi All,
My requirement: a query (SQ01) has been created with an infoset and user group.
The query has a selection option for the file path; the user can enter a path (shared drive, application server, or desktop).
When the query is run in background, the file should be saved to that path, but this is not happening.
In foreground we can save the file at the required location; the problem is that when I run it in background, the file is not saved.
Logic: get the spool request, convert it to an internal table using an FM, then OPEN DATASET .......
If anyone has come across this situation, please let me know the solution.
Thank you in advance.
Regards,
Madhavi
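
For reference, a minimal sketch of the spool-to-internal-table step described above, using the standard FM RSPO_RETURN_SPOOLJOB; the spool number is an illustrative assumption, and the line type and exception names are from memory and should be verified in SE37:

DATA: lt_spool   TYPE TABLE OF soli,                     " 255-character line type (assumption)
      lv_spoolid TYPE tsp01-rqident VALUE '0000012345'.  " illustrative spool request number

CALL FUNCTION 'RSPO_RETURN_SPOOLJOB'
  EXPORTING
    rqident              = lv_spoolid
  TABLES
    buffer               = lt_spool
  EXCEPTIONS
    no_such_job          = 1
    job_contains_no_data = 2
    OTHERS               = 3.

IF sy-subrc = 0.
  " lt_spool now holds the spool content as text lines,
  " ready to be written with OPEN DATASET ... TRANSFER (see below in the thread).
ENDIF.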

Hi,
Please check the options below:
To open a file for reading, use the FOR INPUT addition to the OPEN DATASET statement.
To open a file for writing, use the FOR OUTPUT addition to the OPEN DATASET statement. (If the file does not already exist, it is created automatically.)
Try using the logic below.
data: begin of gs_input,
        wa_string type string,
      end of gs_input,
      gt_input like table of gs_input,
      lv_filename type string.

lv_filename = '<input file path>'.   " e.g. a path on the application server

open dataset lv_filename for input in text mode encoding default.
if sy-subrc = 0.
  do.
    read dataset lv_filename into gs_input-wa_string.
    if sy-subrc eq 0.
      append gs_input to gt_input.
    else.
      exit.
    endif.
  enddo.
  close dataset lv_filename.
endif.
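
Since the original requirement is to write the file from a background job, the OUTPUT direction would look roughly like this (a sketch: the path and the table gt_output are illustrative assumptions, and the path must be on the application server or a share mounted on it, because a background job cannot reach the user's desktop):

data: lv_outfile type string,
      lv_line    type string,
      gt_output  type table of string.

lv_outfile = '<output file path on the application server>'.   " assumption for illustration

open dataset lv_outfile for output in text mode encoding default.
if sy-subrc = 0.
  loop at gt_output into lv_line.
    transfer lv_line to lv_outfile.
  endloop.
  close dataset lv_outfile.
endif.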
Please revert for further questions.
Thanks and Regards,
P.Bharadwaj

Similar Messages

  • Why the background job for downloading file failed?

    I have a background job that downloads a file using a logical path, with
    'OPEN DATASET l_out_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT'.
    The program runs fine in foreground, but when it runs in background it shows 'open file error'.
    I do not know what is happening.
    Please help ~~

    Hi,
    The logical path you have provided might be incorrect, or you might not have the necessary authorization to read/write in that folder.
    Please check the folder in transaction AL11 and check whether you have the authorization to read/write; a BASIS consultant would be able to tell you about it. (See the sketch below for surfacing the exact operating system error.)
    Best regards,
    Prashant
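
    To see exactly why the OPEN fails in background, the MESSAGE addition of OPEN DATASET returns the operating system error text. A minimal sketch, assuming l_out_file already holds the resolved physical path (if you only have the logical file name, FM FILE_GET_NAME can resolve it to the physical path first):

    DATA: lv_oserr TYPE string.

    OPEN DATASET l_out_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT
         MESSAGE lv_oserr.
    IF sy-subrc <> 0.
      " lv_oserr now contains the OS error text, e.g. 'Permission denied'
      WRITE: / 'OPEN DATASET failed:', lv_oserr.
    ENDIF.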

  • Regarding Background Job scheduling for file to file scenario

    Hi Gurus,
    Good morning to all of you.
    I have one doubt:
    can we do background job scheduling for a file-to-file scenario?
    Please give me a response as early as possible.
    Thanks and regards,
    sai

    Background job scheduling for the file adapter:
    You can schedule the file adapter according to your requirement as follows:
    In Communication Channel Monitoring, locate the link Availability Time Planning. In Availability Time Planning, choose the availability time as daily and choose Create, then enter the time details. To select the communication channel (in your case the file adapter), go to the Communication Channels tab, filter and add the respective channel, and save it.
    /people/shabarish.vijayakumar/blog/2006/11/26/adapter-scheduling--hail-sp-19-

  • Background job output to Presentation server

    Hi,
    I am executing a report as a background job.
    After the job completes, the output should be available in Excel format on the front end (presentation server).
    Regards
    Naga

    Hi,
    We cannot perform any action on the presentation server in a program that is being executed in background. What you can do is write the output to the application server and afterwards move the file to the presentation server using transaction CG3Y. (A sketch of the application-server write is shown below.)
    Regards,
    Ravi
    Note : Please mark the helpful answers
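
    A minimal sketch of that approach, assuming a hypothetical row structure and application server path; the tab-delimited file can later be pulled to the PC with CG3Y and opened in Excel:

    TYPES: BEGIN OF ty_out,            " hypothetical row structure, for illustration only
             matnr TYPE char18,
             maktx TYPE char40,
             menge TYPE char17,
           END OF ty_out.
    DATA: gt_output TYPE TABLE OF ty_out,
          ls_out    TYPE ty_out,
          lv_line   TYPE string,
          lv_file   TYPE string VALUE '/usr/sap/tmp/report_output.txt'.   " illustrative path

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc = 0.
      LOOP AT gt_output INTO ls_out.
        " one tab-delimited line per row, so Excel splits it into columns
        CONCATENATE ls_out-matnr ls_out-maktx ls_out-menge
               INTO lv_line SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
        TRANSFER lv_line TO lv_file.
      ENDLOOP.
      CLOSE DATASET lv_file.
    ENDIF.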

  • Background job output

    I scheduled a job in background processing (Smart Form).
    When I click on the spool to see the output, there is an error "NO FRONTEND AVAILABLE".
    How can I see the output of the background job (Smart Form)?
    I am executing the process like this:
    1. run the tcode
    2. F9 (for background processing)
    3. SM37 to see the job processing
    4. click on the spool
    5. but I cannot see any output
    I don't know why.

    Hi Ajay,
    Please search the forum for 'Multiple Purchase Order Print in SAP Script'; the thread is shown below.
    If your requirement is to print multiple POs, then I think you need to develop a program.
    In it you can loop over your final internal table and print.
    These settings are also needed for a background job.
    If you use Smart Forms, you can set the following parameters. Fill CONTROL and OUTPUT_OPTIONS before calling the generated function module:
    CONTROL-PREVIEW = 'X'.            "Preview the output of the Smart Form
    CONTROL-NO_DIALOG = 'X'.          "Don't show the print dialog (needed in background)
    OUTPUT_OPTIONS-TDDEST = 'LP01'.   "Spool: output device; 'LP01' is just an example
    OUTPUT_OPTIONS-TDNOPRINT = ' '.   "No printing from print preview
    OUTPUT_OPTIONS-TDCOPIES = 1.      "Number of print copies
    OUTPUT_OPTIONS-TDIMMED = 'X'.     "Print immediately (print parameters)
    OUTPUT_OPTIONS-TDDELETE = 'X'.    "Delete the request after printing
    OUTPUT_OPTIONS-TDLIFETIME = '5'.  "Spool retention period
    OUTPUT_OPTIONS-TDCOVER = ' '.     "No cover sheet
    CALL FUNCTION LV_FM_NAME
      EXPORTING
    *   ARCHIVE_INDEX        =
    *   ARCHIVE_INDEX_TAB    =
    *   ARCHIVE_PARAMETERS   =
        CONTROL_PARAMETERS   = CONTROL
    *   MAIL_APPL_OBJ        =
    *   MAIL_RECIPIENT       =
    *   MAIL_SENDER          =
        OUTPUT_OPTIONS       = OUTPUT_OPTIONS
        USER_SETTINGS        = ' '
        VEHICLE              = VEHICLE   "form-specific parameters from the original post
        LV_BAR               = LV_BAR
    * IMPORTING
    *   DOCUMENT_OUTPUT_INFO =
    *   JOB_OUTPUT_INFO      =
    *   JOB_OUTPUT_OPTIONS   =
      TABLES
        IT_FINAL             = IT_FINAL
      EXCEPTIONS
        FORMATTING_ERROR     = 1
        INTERNAL_ERROR       = 2
        SEND_ERROR           = 3
        USER_CANCELED        = 4
        OTHERS               = 5.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    If you want immediate output when scheduling in background, also set the printer's time of print to "Immediate" in the job's print parameters.
    Hope this will help you.
    Thanks,
    Hari

  • Background Job : To download file to windows server

    Hi Gurus,
    Here we have to develop a program that creates a CSV file in a network shared folder (a Windows folder), and we have to schedule it as a background process. We can't use the GUI upload/download function modules, since they do not work in background jobs.
    Please share your ideas.
    Any idea about ftp?
    Thanks in advance,
    Anuroop

    Hi,
    I think you can mount a network shared folder on the application server.
    You said that it is a comma-separated file.
    1) Using transaction CG3Z, upload the flat file to the application server; remember to copy the path (it contains the file path as well as the name).
    2) Using OPEN DATASET/READ DATASET with that path, read each line into a work area and split it at ',' into fields.
    3) Then append those fields to an internal table.
    Using the FMs JOB_OPEN, JOB_SUBMIT and JOB_CLOSE you can go for background scheduling; a short scheduling sketch follows below.
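
    A minimal scheduling sketch; the parameter and exception names are those of the standard FMs (verify in SE37), and the report name Z_CREATE_CSV is a hypothetical placeholder for the program that writes the CSV:

    DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_CSV_EXPORT',   " hypothetical job name
          lv_jobcount TYPE tbtcjob-jobcount.

    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname          = lv_jobname
      IMPORTING
        jobcount         = lv_jobcount
      EXCEPTIONS
        cant_create_job  = 1
        invalid_job_data = 2
        jobname_missing  = 3
        OTHERS           = 4.

    IF sy-subrc = 0.
      " schedule the CSV-writing report as a step of the new job
      SUBMIT z_create_csv VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.

      CALL FUNCTION 'JOB_CLOSE'
        EXPORTING
          jobcount             = lv_jobcount
          jobname              = lv_jobname
          strtimmed            = 'X'        " start immediately
        EXCEPTIONS
          cant_start_immediate = 1
          invalid_startdate    = 2
          jobname_missing      = 3
          job_close_failed     = 4
          job_nosteps          = 5
          job_notex            = 6
          lock_failed          = 7
          OTHERS               = 8.
    ENDIF.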
    Please go through the following link for sample code on FTP.
    http://abap4.tripod.com/FTP_Using_SAP_Functions.html
    Reward points if helpful.
    Thanks and regards

  • Background job - failing when file not on app server

    Hi Experts,
    I am facing an issue while trying to run a program in background.
    Let me explain the situation.
    My program runs in background, picks one file from a predefined folder on the application server, and processes that file. If the file is not in the folder, the batch job goes into cancelled status. What I want is: if the expected file is not available, the batch job should still run and finish; it should not go into cancelled status.
    Waiting for valuable inputs,
    Prarthan.
    Edited by: Julius Bussche on Feb 20, 2009 1:06 PM
    Please use meaningful subject titles

    Hi,
    Below is my piece of code for the application server.
    Thanks.
    FORM application_server .
      TYPE-POOLS: kcde.
      DATA : lt_intern TYPE  kcde_cells OCCURS 0 WITH HEADER LINE.
      DATA wa_src01 TYPE string.
      DATA: BEGIN OF wa_src0,
           material         LIKE bapi2017_gm_item_create-material,
           consumption_qty(13) TYPE c,"  like bapi2017_gm_item_create-entry_qnt,
           base_uom         LIKE bapi2017_gm_item_create-entry_uom_iso,
           movement_type    LIKE bapi2017_gm_item_create-move_type,
           cost_center      LIKE bapi2017_gm_item_create-costcenter,
           doc_date         LIKE bapi2017_gm_head_01-doc_date,
           post_date        LIKE bapi2017_gm_head_01-pstng_date,
           plant            LIKE bapi2017_gm_item_create-plant,
           storage_location LIKE bapi2017_gm_item_create-stge_loc,
           END OF wa_src0.
      DATA : lt_src0 LIKE TABLE OF wa_src0.
      DATA : file1 TYPE string.
      DATA : tmp0(20). " type string.
      DATA : intern1 TYPE  kcde_intern.
      DATA : tmp_date LIKE sy-datum.
      path1 = path.
      replace '.txt' in PATH1 with ''.
      IF path IS INITIAL.
        path = '/usr/local/interface/globalone/SFA/SFA_WORK/SFA_'.
        tmp_date+0(2) = sy-datum+6(2). "date dd
        tmp_date+2(2) = sy-datum+4(2). "month mm
        tmp_date+4(4) = sy-datum+0(4). "year yyyy
        CONCATENATE path plant tmp_date '.txt' INTO path.
       concatenate path  tmp_date  into path1.
      ENDIF.
      file1 = path.
      file_nm = path.
    FILE_NM2 = PATH1.
      OPEN DATASET file_nm FOR INPUT IN TEXT MODE ENCODING NON-UNICODE.
      IF sy-subrc NE 0.
        " An E-type message cancels a background job; raise an I- or S-type
        " message here instead if the job should finish cleanly when the file is missing.
        MESSAGE e000(zmm002).
        EXIT.
      ELSE.
        DO.
          READ DATASET file_nm INTO wa_src01.
          IF sy-subrc = 0.
             CLEAR wa_src0.
            REPLACE cl_abap_char_utilities=>cr_lf(1) IN wa_src01 WITH ''.
            SPLIT wa_src01 AT cl_abap_char_utilities=>horizontal_tab
                                INTO  wa_src0-material
                                      tmp0
                                      wa_src0-base_uom
                                      wa_src0-movement_type
                                      wa_src0-cost_center
                                      wa_src0-doc_date
                                      wa_src0-post_date
                                      wa_src0-plant
                                      wa_src0-storage_location.
            SHIFT tmp0 LEFT DELETING LEADING '0'.
            wa_src0-consumption_qty = tmp0.
            APPEND  wa_src0 TO lt_src0.
          ELSE.
            EXIT.
          ENDIF.
        ENDDO.
        CLEAR wa_src0.
        LOOP AT lt_src0 INTO wa_src0.
          REPLACE ',' IN wa_src0-consumption_qty WITH '.'.
          MOVE:  wa_src0-material         TO wa_srcdata-material,
                 wa_src0-consumption_qty  TO wa_srcdata-consumption_qty,
                 wa_src0-base_uom         TO wa_srcdata-base_uom,
                 wa_src0-movement_type    TO wa_srcdata-movement_type,
                 wa_src0-cost_center      TO wa_srcdata-cost_center,
                 wa_src0-doc_date         TO wa_srcdata-doc_date,
                 wa_src0-post_date        TO wa_srcdata-post_date,
                 wa_src0-plant            TO wa_srcdata-plant,
                 wa_src0-storage_location TO wa_srcdata-storage_location.
          APPEND wa_srcdata TO gt_srcdata.
          CLEAR wa_srcdata.
          CLEAR wa_src0.
        ENDLOOP.
      ENDIF.
      DELETE DATASET file_nm.
    ENDFORM.                    " application_server

  • FinancialUtilService, downloadESSJobExecutionDetails method is not uploading ESS job output/log files to UCM.

    Hi,
    We are invoking the financialUtilService web service using an HTTP proxy client to upload data files, submit an ESS job, and get the ESS job log files. We are able to successfully upload the file and submit the ESS job, but downloadESSJobExecutionDetails is not uploading the log/out files to UCM.
        List<DocumentDetails> docDetails = financialUtilService.downloadESSJobExecutionDetails(requestId.toString(), "log");
        // List<DocumentDetails> docDetails1 = financialUtilService.downloadExportOutput(requestId.toString());
        System.out.println("Ess Job output: " + docDetails);
        for (DocumentDetails documentDetails : docDetails) {
            System.out.println("Account: " + documentDetails.getDocumentAccount().getValue());
            System.out.println("File Name: " + documentDetails.getFileName().getValue());
            System.out.println("Document Title: " + documentDetails.getDocumentTitle().getValue());
            System.out.println("DocumentName: " + documentDetails.getDocumentName().getValue());
            System.out.println("ContentType: " + documentDetails.getContentType().getValue());
        }
    The following output is returned:
    Ess Job status:SUCCEEDED
    Ess Job output:[com.oracle.xmlns.apps.financials.commonmodules.shared.financialutilservice.DocumentDetails@5354a]
    Account: fin$/payables$/import$
    File Name: null
    Document Title: Uma Test Import
    DocumentName:  84037.zip
    ContentType zip

    Hey,
    We have the same problem. On calling `downloadESSJobExecutionDetails`, a zip file is returned, but it contains only my original upload file, not any logs. However, if I call `downloadESSJobExecutionDetails` on a dependent child subprocess, the log is included.
    I also reported this to Oracle Support (SR 3-10267411981); they referred me to a known bug, to be fixed in v11: https://bug.oraclecorp.com/pls/bug/webbug_edit.edit_info_top?rptno=20356187 (not publicly accessible though).
    Is there any workaround available?

  • Download Background job output to Excel automatically

    Dear Experts,
    Hi there... Please note the following scenario and give me your expert solutions on it.
    I have created a SALES QUANTITY REPORT for exports. Many conditions are applied to derive the data in several columns of this report. If we run it for 2-3 months it is not a problem; the output comes after a marginal wait time. But the moment it is run for a whole year, it just goes on and on. I cannot fine-tune it further because of the conditions; for better performance and fast retrieval I have used views in this report.
    The thing is, when I put it to execute in background it completes within an hour or so, but the output is not readable. So here is what I want:
    1. First of all, why does the report execute so quickly when run in background, whereas in foreground it takes more than 4 hours with no guarantee of output? This is to satisfy my curiosity.
    2. I want the output to be automatically converted to Excel and stored in a specific folder the moment the background job finishes, without the format getting disturbed. The report is an ALV report.
    Please, gurus, help me out.
    Thanks...
    Jitesh

    It is giving me the following error:
    17.02.2009 11:24:41 Job started                                                                             00           516          S
    17.02.2009 11:24:41 Step 001 started (program Z_SD_SQR_EXPORT, variant &0000000000007, user ID JBABAP)      00           550          S
    17.02.2009 11:24:45 Incorrect Print Parameter                                                               0K           091          I
    17.02.2009 11:24:45 Spool request (number 0000014520) created without immediate output                      SY           355          S
    17.02.2009 11:24:45 Excel file c:\sqr\sqrexport.xls cannot be processed                                     UX           893          E
    17.02.2009 11:24:45 Job cancelled after system exception ERROR_MESSAGE                                      00           564          A
    FORM download.
    * SAP_CONVERT_TO_XLS_FORMAT writes the file to the presentation server (frontend),
    * so a path like 'c:\sqr\sqrexport.xls' cannot be reached from a background job.
      CALL FUNCTION 'SAP_CONVERT_TO_XLS_FORMAT'
        EXPORTING
    *     i_field_seperator        =
    *     i_line_header            =
          i_filename               = 'c:\sqr\sqrexport.xls'
    *     i_appl_keep              = ' '
        TABLES
          i_tab_sap_data           = it_final
    * CHANGING
    *     i_tab_converted_data     =
        EXCEPTIONS
          conversion_failed        = 1
          OTHERS                   = 2.
      IF sy-subrc <> 0.
    *   MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    *           WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      ENDIF.
    ENDFORM.                    " DOWNLOAD
    Please reply..
    Thanks,
    Jitesh

  • How to view background job output

    Hi experts,
      I executed one of my Z reports in background mode and it finished successfully. I want to view the output. How can I view it?
      Or can this output be viewed only by Basis people? If so, I will contact the Basis person. Otherwise, please give me some idea how to view this output.
    Thnx,
    Mohana

    Mohana,
    Go to SM37.
    Give the job name or *.
    Give the username.
    Give the program name.
    Execute it.
    Now tick the job that you executed in the background.
    Click on the spool button at the top.
    You will get an ABAP list under the Type field.
    Click it.
    You will be able to see the output of the program that you executed in the background. When a program is executed in background mode, you see the output as a list display, and at most only 19 fields are shown when a report is executed in the background.
    You can check the jobs that are running in the background and their status in table TBTCO; a small sketch follows below.
    Thanks,
    K.Kiran.
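
    If you want to check the status programmatically, here is a small sketch against table TBTCO; the field names and status codes are from memory and should be verified in SE11:

    DATA: lt_jobs TYPE TABLE OF tbtco.

    " jobs scheduled by the current user (SDLUNAME = scheduling user, assumption)
    SELECT * FROM tbtco INTO TABLE lt_jobs
      WHERE sdluname = sy-uname.

    " STATUS is e.g. 'F' = finished, 'A' = cancelled, 'R' = active, 'S' = released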

  • Log job output to file - information missing? dbcc checkdb

    Hello
    Not sure where to put this question.. feel free to move it if necessary.
    I have a job which runs DBCC CHECKDB WITH PHYSICAL_ONLY on every read_write database on the instance. The problem is that I want to capture the output to make sure that every database is actually running this command. When viewing the normal history by right-clicking the job and selecting "view history", the information gets cut off due to the limited space allowed (1000 chars by default, I think).
    So I tried to log it to a table and view it via msdb.dbo.sp_help_jobsteplog, but the information there does not cover all databases either. Then I tried to log it to a file, but I get the same information there, and not all databases are logged.
    So I am starting to wonder whether the DBCC CHECKDB job is not being executed on the other databases. If I look in the current SQL Server logs, I only see DBCC CHECKDB WITH PHYSICAL_ONLY executed on the same databases that are listed in my output file.
    What can I do? The instance contains over 400 databases, but only approximately 70 are logged as doing a DBCC CHECKDB.
    This is my command:
    SET NOCOUNT ON
    EXEC sp_MSforeachdb @command1='
    IF NOT(SELECT DATABASEPROPERTYEX(''?'',''Updateability''))=''READ_ONLY''
    BEGIN
    DBCC CHECKDB (?) WITH PHYSICAL_ONLY
    END'

    There is a known issue with sp_MSforeachdb where under heavy load the procedure can actually miss databases with no errors. That can be the case in your environment. Aaron Bertrand wrote about the issue and solutions for the problems in the article:
    Making a more reliable and flexible sp_MSforeachdb.
    Ana Mihalj

  • Background job downloading missing some rows in sm37

    Hi,
    I am downloading a background job's output through the spool request in Excel format. Out of 2000 records, some 50 records in the middle are not displayed, but those 50 records are there in the spool output.
    When downloaded in text format, all 2000 records are downloaded correctly.
    Please give me a solution.
    Thanks and Regards,
    Jenifer

    Jenifer,
    When you download a spool as "Spreadsheet", SAP really downloads the data in tab-delimited text format, even though the file extension is .XLS.
    So just open the Excel file in Notepad (or any other text editor) and see if the missing rows are present. If they are, then you have to figure out how to import the file properly into Excel without losing the rows.

  • Call transaction in Background job

    Hi,
    I am executing a program in background. In that program I have a CALL TRANSACTION as below:
    CALL TRANSACTION 'ME22'
        USING g_t_bdctab
        MODE 'N'
        UPDATE 'S'
        MESSAGES INTO g_t_bdcmsg.
    But it is not working. The same works when the program is executed in foreground.
    Does this mean we cannot use CALL TRANSACTION in background?
    If yes, then what could be the solution for it?
    Thanks,
    Pankaj.

    Hello Pankaj,
    CALL TRANSACTION works in background, provided you are not picking the file from the presentation server.
    If you are picking the file from the presentation server, place the same file on the application server and, during the background job, pick the file from the application server; then your program will work perfectly. (A small sketch of that check is below.)
    Regards,
    Tarun
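
    A small sketch of choosing the file source depending on how the program runs; the paths and the table gt_raw are illustrative assumptions:

    DATA: gt_raw   TYPE TABLE OF string,
          gs_raw   TYPE string,
          lv_path  TYPE string VALUE '/usr/sap/tmp/bdc_input.txt',   " illustrative app server path
          lv_front TYPE string VALUE 'C:\temp\bdc_input.txt'.        " illustrative frontend path

    IF sy-batch = 'X'.
      " running in background: read from the application server
      OPEN DATASET lv_path FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      IF sy-subrc = 0.
        DO.
          READ DATASET lv_path INTO gs_raw.
          IF sy-subrc <> 0.
            EXIT.
          ENDIF.
          APPEND gs_raw TO gt_raw.
        ENDDO.
        CLOSE DATASET lv_path.
      ENDIF.
    ELSE.
      " running in dialog: the presentation server file can be read here
      " (not possible in background)
      cl_gui_frontend_services=>gui_upload(
        EXPORTING filename = lv_front
        CHANGING  data_tab = gt_raw ).
    ENDIF.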

  • SPOOL data gets truncated when bckground job output sent to spool recepient

    Hi All,
    I am facing a strange problem.
    I have a background job scheduled and have specified a distribution list as the spool recipient.
    The background job's output is sent to the recipients as an attachment, but the data is not complete.
    When I check the spool request for that background job it contains 80 pages, but the attachment in the recipients' inbox contains just 17 pages.
    SP01/SP02 have settings to display a range of pages, but the email attachment does not show the full range.
    Can anyone please help me with this?
    We are on ECC 6.0 (Windows 2003 / MSSQL 2005 SP3).
    Regards,
    Yatin

    Hi,
    When the spool content is sent immediately after the background job, its size is 14 KB.
    I tested sending the spool content via mail using transaction SP01: after the job is completed, I go to SP01 and send the spool content manually. This time the recipients receive the correct content of 80 pages, and the size of the document is 73 KB.
    Regards,
    Yatin

  • Background Job spool / output file in different app server

    We are working with 2 ECC app servers, A and B.
    I've defined in app server A a background job to run an ABAP report, and this report creates and submits 4 other background jobs.
    Each of these 4 jobs runs an ABAP report which outputs a text file on the server.
    The question is: why are the jobs generating the files on different app servers randomly?
    The same file was generated on app server A in the 1st execution and, without any change, on app server B in the job's 2nd execution.
    Thank you!

    Hi,
    You can specify the server name while creating a batch job, so that it gets executed on that server only.
    There is a field (in SM36 during creation, or SM37 while modifying the batch job) called "Exec. Target", where you can specify the server name.
    Hence you can plan to run some of your batch jobs on server A or B.
    Regards,
    Rupali
