Problem in delta job

Hi gurus,
The delta job is not running.
Yesterday the delta failed because of an incorrect data format. I corrected the data in the PSA and loaded it to the DSO, but the request in the PSA is still showing red even though the data was loaded to the DSO successfully.
Today the delta job did not start. What could be the reason, and what corrective actions should I take?
Regards,
Siddes

Hi Sid,
Are the QM status of the InfoPackage (IP) and the status of the request in the DSO green? Since you corrected the data in the PSA and updated it to the target, both the QM status of the IP and the status of the request in the DSO should be green.
If both are green and the load is still failing, please post the error message.
Also, it is better to correct the data in the source system and reload the delta, if possible.

Similar Messages

  • Problem in Background Job. Job Completed Successfully but data not Posted

    Hi All,
    There is a problem with a background job in the production server (600).
    The job completed successfully, but the data was not posted.
    The same job was working perfectly until the beginning of May,
    but now it is not working and the rebate has not been posted.
    Where and what should I check?
    Please suggest.
    Regards,
    P Kamal

    Hi ,
    Please first check the status of the job. If it completed successfully, debug it (enter JDBG in the command field of SM37 with the job selected) and
    check whether the values passed to the job (possibly via EXPORT/IMPORT memory or SET/GET parameter IDs) are actually arriving in the job program. These values might not be getting handed over properly; see the sketch below.
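    To illustrate what "passed via EXPORT/IMPORT or SET/GET parameter IDs" means, here is a minimal, hedged sketch (the parameter ID 'ZRB' and the variable are hypothetical). Neither mechanism is guaranteed to reach a separately scheduled background job, which is exactly why the hand-over needs to be checked in the debugger:
    DATA lv_rebate TYPE p DECIMALS 2.
    " Caller side: put the value into SAP memory (SET/GET parameter ID)
    " and into ABAP memory (EXPORT/IMPORT).
    SET PARAMETER ID 'ZRB' FIELD lv_rebate.
    EXPORT lv_rebate FROM lv_rebate TO MEMORY ID 'ZRB'.
    " Job program side: read the value back. If the job runs in a different
    " logon/session, these reads can come back empty, which would explain
    " data not being posted although the job finishes successfully.
    GET PARAMETER ID 'ZRB' FIELD lv_rebate.
    IMPORT lv_rebate FROM MEMORY ID 'ZRB'.
    IF sy-subrc <> 0.
      " Nothing was passed - handle the missing value here.
    ENDIF.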
    If there is an error in the job log, let me know what it says.
    Regards,
    Uma

  • Problem while scheduling job in apex

    Hi All ,
    I have a problem while creating a job in APEX. Let me explain my problem clearly. I have created a page at
    url: http://apex.oracle.com/pls/otn/
    Workspace: PREETHI_WORKSPACE
    User ID: [email protected]
    Password: bowova1
    Application: Parse and upload csv file - 21007
    Page No: 1
    Page Name: Parse csv file
    The main functionality of this page is to upload a CSV file into an Oracle table. First the user browses a file from the local machine and uploads it in the upload region. After that, the user enters a table name and clicks the "Create table" button, which creates a table in the database and also shows a report for it. The table is created by the "create table" page process when the user clicks the button. The code is working fine. One more thing: in the CSV file, the second row must contain the data types.
    My requirement is that I have to schedule "htmldb_tools.parse_file" in the create table process as a job. In this "create table" process we are calling htmldb_tools.parse_file(:P1_FILENAME,'P1_COLLECTION','P1_HEADINGS','P1_COLUMNS','P1_DDL',:P1_TABLENAME);
    The problem I am facing is that the value of item :P1_FILENAME is lost when it is passed to the procedure inside the job. I think the problem is that the job runs in a database session that is different from the APEX session.
    I have tried various ways but have not found a solution.
    Please help...

    Hi Marco,
    I have tried the setting you mentioned in the htmldb_tools.parse_file procedure: wwv_flow_api.set_security_group_id(<workspace_id>);
    But the job is failing with the error: ORA-01400: cannot insert NULL into ("FLOWS_020200"."WWV_FLOW_COLLECTIONS$"."SESSION_ID")
    ORA-06512: at "FLOWS_020200.WWV_FLOW_COLLECTION", line 319
    ORA-06512: at "DIMPLE3_HD_DEV.HTMLDB_SID_50", line 196
    ORA-06512: at line 2
    Please help me out with this issue.
    Regards
    Dhan

  • Problem with delta load urgent!!

    Hi,
    I have a problem with a delta load.
    We have an InfoPackage that loads data from the R/3 system daily. It is a delta load to the ODS, and it updates to the cube with a selection on company codes and 0FISCPER.
    We are on a 3.5 system.
    For a couple of company codes, A and B, the init was done for the period 07.2010 to 12.2099, and after that the deltas are loaded from 07.2010 to 12.2099 with no problem.
    Now there was an update in the R/3 system and data was maintained for company codes A and B for 0FISCPER 001.2010 to 006.2010, for which no init was maintained.
    How do we load the init in this case? Don't ask how the postings were done in R/3; my problem is purely about loading the data into BW from R/3, as it is very important for the customer to see the data from 01.2010 to 12.2010 in reports, but the data is only available in reports from 07.2010 onwards.
    Do I need to create another InfoPackage and maintain the init for 01.2010 to 06.2010, so that this selection automatically appears in the ODS delta-loading InfoPackage?
    Please share your inputs as soon as possible.
    Thank you.

    Hi Prince,
    There is no need to maintain another init for this data. It is enough to load the data from January 2010 to June 2010 into your InfoCube.
    Follow the steps below:
    1) Create a full-load InfoPackage for this DataSource.
    2) Give the selection as company codes A and B and 0FISCPER 001.2010 to 006.2010.
    3) In the menu bar, choose Scheduler --> Repair Full Request.
    4) In the next screen, tick the checkbox and save.
    5) Now execute the InfoPackage; it will load the data without disturbing your daily delta.
    Follow the same procedure to load the data up to your InfoCube.
    Please revert if you have any questions.
    Regards,
    Venkatesh

  • How to delete logistic delta jobs on R3 side?

    Hello
    I would like to delete the delta jobs (stop the system from releasing these jobs) on the R/3 side.
    How can I do it?

    Hi
    If you delete the delta, the init on the BW side will also be deleted.
    Follow these steps if you need to delete the setup tables and fill them back:
    1. Delete the data in the DSO/cube.
    2. Stop the scheduled jobs that transfer data on the R/3 side and in BI; check in SM37 and in LBWE -> Job Control.
    3. Delete the setup tables using LBWG (based on the application component).
    4. Delete the delta initialization for your InfoPackage in BI using RSA1 -> Scheduler -> "Delete Initialization Options for Source System".
    5. Delete the extraction queue with LBWQ.
    6. Check with RSA7 and LBWQ that the queues contain no data.
    7. Run OLI*BW (the statistical setup) in the background, with restrictions if required.
    8. Start the delta init InfoPackage in BI when the initialization job in R/3 has finished.
    Santosh

  • WHAT IS DELTA JOB IN BODS ?? WHERE WE USE DELTA JOBS ??

    WHAT IS DELTA JOB IN BODS ?? WHERE WE USE DELTA JOBS ??

    There is no 'Delta Job' object in Data Services. In any ETL tool, if you load data incrementally based on certain conditions (for example, extract only yesterday's changed data and load it), that is a delta load, and the job that does it can be called a delta job.
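    As a rough, generic illustration of that idea (an ABAP-style sketch, not Data Services syntax; the table ZSALES_DOC, its CHANGED_ON field and the last-run variable are hypothetical), a delta extraction simply restricts on a change date:
    DATA: lv_last_run TYPE d,
          lt_delta    TYPE STANDARD TABLE OF zsales_doc.
    " Read only the rows changed since the last successful run -
    " this restriction is what makes the load a "delta" load.
    SELECT * FROM zsales_doc
      INTO TABLE lt_delta
      WHERE changed_on >= lv_last_run.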
    Please don't use ALL CAPS; it is like yelling in the forum.

  • Delta jobs

    Hi,
    Please provide me with more information about the types of delta update modes, i.e. V1, V2 and V3.
    Also, how are these delta jobs affected during an upgrade from 3.5 to 7.0?
    Thanks,
    Vijaya

    Hi ,
    V1 - synchronous update
    V2 - asynchronous update
    V3 - batch asynchronous update
    These are different update work processes on the application server that take the update LUW (which may contain various database-manipulating SQL statements) from the running program and execute it. They are separated to optimize transaction processing.
    Taking an example:
    If you create or change a purchase order (ME21N/ME22N), when you press 'Save' and see a success message ("PO ... changed"), the update to the underlying tables EKKO/EKPO has already happened (before you saw the message). This update was executed in the V1 work process.
    There are also statistics-collecting tables in the system that capture data for reporting. For example, LIS table S012 stores purchasing data (the same data as EKKO/EKPO, stored redundantly but in a different structure optimized for reporting). These tables are updated for the transaction you just posted in a V2 process. Depending on system load, this may happen a few seconds later (after you saw the success message). You can see the update requests in SM13.
    In a local update, the update program is run by the same work process that processed the request. The dialog user has to wait for the update to finish before entering further data. This kind of update is useful when you want to reduce the amount of access to the database. The disadvantage of local updates is their parallel nature: the updates can be processed by many different work processes, unlike asynchronous or synchronous update, where the update is serialized because there are fewer update work processes (and maybe only one).
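    As a hedged sketch of how such an update is registered from application code (the function module name Z_UPDATE_S012 and the variables are hypothetical; whether it runs as V1 or V2 is decided by the processing type maintained in the function module's attributes, "update with immediate start" vs. "update with delayed start", not by the call itself):
    DATA: ls_ekko TYPE ekko,
          lt_ekpo TYPE STANDARD TABLE OF ekpo.
    " The update FM is not executed here; it is only registered and
    " bundled into the update LUW of the current SAP transaction.
    CALL FUNCTION 'Z_UPDATE_S012' IN UPDATE TASK
      EXPORTING
        is_ekko = ls_ekko
        it_ekpo = lt_ekpo.
    " COMMIT WORK hands the bundled LUW over to an update work process;
    " that is when the V1/V2 update actually runs.
    COMMIT WORK.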
    Points are appreciated if this solves your problem.

  • Delta records are not getting extracted. Problem with the delta queue

    Hello friends,
    Could you please help me with this scenario?
    It is related to the delta load from application 12, DataSource 2LIS_12_VCITM.
    Somehow the data is not getting transferred from LBWQ to RSA7. There are more than 200,000 entries in LBWQ, but the job is not able to transfer any entries to RSA7.
    For all other applications the data is flowing correctly.
    The job status shown looks correct, e.g. "XXXXX LUWs transferred"; there is no error in the job log, so technically it is correct.
    I found a note in the SAP Service Marketplace, but it does not match my job status: my job status is NOTESID, whereas according to the note it should be WKEDPI*** (something like that).
    I have taken the steps below to solve the problem, but it still does not work:
    1. Removed the job from the schedule (the job that moves data from LBWQ to RSA7) and executed the job manually.
    2. Executed program RMSBW12.
    3. Deleted the delta queue in RSA7 for application 12 and regenerated it by running the InfoPackage "early delta initialization without data transfer". Once the delta queue was generated, ran the job once again.
    I believe I have tried all the correct steps, but with all the steps above the data is still not transferred. The daily InfoPackage runs as scheduled and brings only 0 records. If this still does not work, I will have to re-initialize, and that is a big cost for us, as we have to lock the system for the users and so on.
    Could you please give me some tips on how I can transfer the data from LBWQ to the delta queue (RSA7)? If it works, it will definitely save me a lot of time.
    Please suggest how I can proceed. Please save me from this situation.
    Many thanks in advance.
    Regards,

    Hi Akshay,
    Go to SMQ1 and check the MCEX12 queue. If your delta records are piling up in SMQ1 and not being transferred to RSA7, you need to check your background job control settings once again.
    The background collector job collects the records from SMQ1 and pushes them to RSA7.
    Try to execute program RMBWV312 in SE38 and manually push all the records to RSA7. Once all the records are available in RSA7, you will find that the MCEX12 queue in SMQ1 has been cleared.
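    If you prefer to run it in the background rather than in dialog, here is a hedged sketch of scheduling that program once as a job (the job name is arbitrary; the program name is the one mentioned above, assumed to be executable without mandatory parameters):
    DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_LIS_COLLECTIVE_RUN_12',
          lv_jobcount TYPE tbtcjob-jobcount.
    " Open the job, submit the collective run into it, then release it
    " for immediate start.
    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname  = lv_jobname
      IMPORTING
        jobcount = lv_jobcount
      EXCEPTIONS
        OTHERS   = 4.
    CHECK sy-subrc = 0.
    SUBMIT rmbwv312 VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.
    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobcount  = lv_jobcount
        jobname   = lv_jobname
        strtimmed = 'X'        " start immediately
      EXCEPTIONS
        OTHERS    = 8.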
    Cheers
    Praveen

  • Problems creating background job for program (job open, submit and close)

    Hi gurus,
    I am trying to start a background job using the function module BP_START_DATE_EDITOR to let the user choose the start date of the job, or whether it should start immediately. This FM works fine; after calling it I open a job, submit the report to it and call JOB_CLOSE, which creates the job.
    The problem is that when I go to SM37 to check the job status, the job has been cancelled, and the job log says that I have to give the job a start date.
    What I don't understand is that, whether the job is immediate or I choose a start date, it always gives me this error.
    My code is below.
    Any ideas will be rewarded.
      CLEAR: stdt_modify_type, stdt_output.
      CALL FUNCTION 'BP_START_DATE_EDITOR'
           EXPORTING
                stdt_dialog                    = 'Y'
                stdt_input                     = stdt_input
                stdt_opcode                    = 14
           IMPORTING
                stdt_modify_type               = stdt_modify_type
                stdt_output                    = stdt_output
           EXCEPTIONS
                fcal_id_not_defined            = 1
                incomplete_last_startdate      = 2
                incomplete_startdate           = 3
                invalid_dialog_type            = 4
                invalid_eventid                = 5
                invalid_opcode                 = 6
                invalid_opmode_name            = 7
                invalid_periodbehaviour        = 8
                invalid_predecessor_jobname    = 9
                last_startdate_in_the_past     = 10
                no_period_data_given           = 11
                no_startdate_given             = 12
                period_and_predjob_no_way      = 13
                period_too_small_for_limit     = 14
                predecessor_jobname_not_unique = 15
                startdate_interval_too_large   = 16
                startdate_in_the_past          = 17
                startdate_is_a_holiday         = 18
                startdate_out_of_fcal_range    = 19
                stdt_before_holiday_in_past    = 20
                unknown_fcal_error_occured     = 21
                no_workday_nr_given            = 22
                invalid_workday_countdir       = 23
                invalid_workday_nr             = 24
                notbefore_stdt_missing         = 25
                workday_starttime_missing      = 26
                no_eventid_given               = 27
                OTHERS                         = 28.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        MOVE 'X' TO gv_flag.
      ENDIF.
      DATA jobname LIKE tbtcjob-jobname.
      DATA jobcount LIKE tbtcjob-jobcount.
      DATA job_release LIKE  btch0000-char1.
      DATA job_imediate TYPE c.
      CLEAR: jobname, jobcount, job_release.
      CONCATENATE 'MAPAEXEC' sy-uname sy-datum
                   INTO jobname SEPARATED BY space.
      CALL FUNCTION 'JOB_OPEN'
           EXPORTING
                jobname          = jobname
           IMPORTING
                jobcount         = jobcount
           EXCEPTIONS
                cant_create_job  = 1
                invalid_job_data = 2
                jobname_missing  = 3
                OTHERS           = 4.
      IF sy-subrc <> 0.
        MESSAGE i003(zmapas).
        EXIT.
      ENDIF.
      SUBMIT z_mapa_execucao_orcamental
             VIA JOB jobname NUMBER jobcount
             WITH ano EQ ano
             WITH so_perio IN so_perio
             WITH so_date IN so_date
             WITH so_org EQ so_org
             WITH so_num IN so_num
             AND RETURN.
      IF stdt_output-startdttyp EQ 'I'.
        CLEAR job_imediate.
        job_imediate = 'X'.
      ENDIF.
      CALL FUNCTION 'JOB_CLOSE'
           EXPORTING
                calendar_id                 = stdt_output-calendarid
                event_id                    = stdt_output-eventid
                event_param                 = stdt_output-eventparm
                event_periodic              = stdt_output-periodic  "?
                jobcount                    = jobcount
                jobname                     = jobname
                laststrtdt                  = stdt_output-laststrtdt
                laststrttm                  = stdt_output-laststrttm
                prddays                     = stdt_output-prddays  "??
                prdhours                    = stdt_output-prdhours  "?
                prdmins                     = stdt_output-prdmins  "??
                prdmonths                   = stdt_output-prdmonths
                prdweeks                    = stdt_output-prdweeks  "?
                predjob_checkstat           = stdt_output-checkstat
                pred_jobcount               = stdt_output-predjobcnt
                pred_jobname                = stdt_output-predjob
                sdlstrtdt                   = stdt_output-sdlstrtdt
                sdlstrttm                   = stdt_output-sdlstrttm
                strtimmed                   = job_imediate
                targetsystem                = stdt_output-instname
                start_on_workday_not_before = stdt_output-notbefore
                start_on_workday_nr         = stdt_output-wdayno
                workday_count_direction     = stdt_output-wdaycdir
           IMPORTING
                job_was_released            = job_release
           EXCEPTIONS
                cant_start_immediate        = 1
                invalid_startdate           = 2
                jobname_missing             = 3
                job_close_failed            = 4
                job_nosteps                 = 5
                job_notex                   = 6
                lock_failed                 = 7
                OTHERS                      = 8.
      IF sy-subrc <> 0.
        MESSAGE i003(zmapas).
        EXIT.
      ELSE.
        MESSAGE i004(zmapas) WITH jobname.
      ENDIF.
    Thanks in advance,
    Best Regards
    João Martins

    Hello João.
    In debug mode, check the values of the variables you pass to the parameters sdlstrtdt and sdlstrttm.
    As additional info, I usually achieve your goal without FM BP_START_DATE_EDITOR.
    Check this code:
    CALL FUNCTION 'JOB_OPEN'
          EXPORTING
               jobname          = w_jobname
          IMPORTING
               jobcount         = w_jobcount
          EXCEPTIONS
               cant_create_job  = 1
               invalid_job_data = 2
               jobname_missing  = 3
               OTHERS           = 4.
    CHECK sy-subrc = 0.
    CLEAR seltab_wa.
    MOVE: t_jobs-param TO seltab_wa-selname,
    t_processar-line+34 TO seltab_wa-low.
    APPEND seltab_wa TO seltab.
    seltab_wa-selname = 'P_LOJA'.
    seltab_wa-low = t_processar-ficheiro+7(4).
    APPEND seltab_wa TO seltab.
    *** Submit the program to the JOB
    SUBMIT (t_jobs-repid)
      WITH  SELECTION-TABLE seltab
      USER sy-uname
       VIA JOB w_jobname NUMBER w_jobcount
       AND RETURN.
    *** Close the JOB
      l_hora = sy-uzeit.
      ADD 60 TO l_hora.
    CALL FUNCTION 'JOB_CLOSE'
       EXPORTING
          jobcount           = w_jobcount
          jobname           = w_jobname
          sdlstrtdt            = sy-datum
          sdlstrttm           = l_hora
          targetserver       = w_servidor
       IMPORTING
          job_was_released     = l_liberado
       EXCEPTIONS
          cant_start_immediate = 1
          invalid_startdate    = 2
          jobname_missing      = 3
          job_close_failed     = 4
          job_nosteps          = 5
          job_notex            = 6
          lock_failed          = 7
          OTHERS               = 8.
    Regards.
    Valter Oliveira.

  • Problem with background jobs in RAR 5.3

    Hi All,
    I scheduled background jobs in RAR 5.3, but the job status has been showing as "Running" for the past 7 days. Please help me with this issue. It is very urgent, because my SOD report for this month has been delayed by this problem.
    Every month it takes 2 to 3 days to complete the jobs, but this time it has taken 7 days, so I terminated the jobs and asked the Basis team to restart the background job service to stop the RAR jobs.
    In previous months we used the same variant, but at that time it was RAR 5.3 with SP13 and 2 threads. Now the setup has changed as per the details below:
    I am using RAR 5.3 SP15 with 4 threads.
    Below is the job log display (unable to paste the complete log):
    =======================
    Nov 8, 2011 12:14:46 AM com.virsa.cc.common.util.ExceptionUtil logError
    SEVERE: null
    Nov 8, 2011 12:14:58 AM com.virsa.cc.xsys.bg.BatchRiskAnalysis getBAPIRoleData
    INFO: -- Last Run Date is 2011-11-07
    Nov 8, 2011 12:14:58 AM com.virsa.cc.xsys.bg.BatchRiskAnalysis getBAPIRoleData
    INFO: -- Current Date is 2011-11-08
    Nov 8, 2011 12:14:58 AM com.virsa.cc.common.util.ExceptionUtil logError
    SEVERE: null
    java.lang.NullPointerException
         Risk Analysis Time: Started @:Tue Nov 08 04:30:36 UTC 2011
    performActPermAnalysis
    INFO: Detailed Analysis Time:
    Risk Analysis Time: Started @:Tue Nov 08 04:30:36 UTC 2011
    Rule Load Time: Started @:Tue Nov 08 04:30:36 UTC 2011
    Rule Load Time:14millisec
    Org Rule Loop Time: Started @:Tue Nov 08 04:30:36 UTC 2011
    Rule Loop Time: Started @:Tue Nov 08 04:30:36 UTC 2011
    Rule Loop Time:5303millisec
    Org Rule Loop Time:5303millisec
    Risk Analysis Time:5481millisec
    Nov 8, 2011 4:30:41 AM com.virsa.cc.xsys.riskanalysis.AnalysisEngine riskAnalysis
    INFO: End Analysis Engine->Risk Analysis .....  memory usage: free=2024M, total=6144M
    Nov 8, 2011 4:30:42 AM com.virsa.cc.common.RiskAnalysisReport render
    INFO: RiskAnalysisReport render: memory changed=0M,free=2020M, total=6144M
    Please help me regarding this issue.
    thanks in advance,
    suresh kumar

    Hi  Ashish,
    My background job is ad hoc. This is not the first time I have done this; every month I do this activity for the SOD report (13 background jobs) and other reports. Last month they updated SP13 to SP15, and now I am facing this issue. Once the background job service was restarted, all the jobs ran properly. Why is that? I need the root cause for this.
    please help me regarding this.
    thanks,
    suresh kumar

  • Problems with background job

    Hi,
    I have problems when creating a job that is supposed to run once in the background. I use the common steps as shown below. My problem is that the report is executed twice:
    - First it is executed synchronously when the job is created.
    - Then it is executed in the normal job step, as I want it to be.
    I don't want it to be executed the first time, because it creates data!
    Code:
    CALL FUNCTION 'JOB_OPEN'
         EXPORTING
              jobname  = w_jobid
         IMPORTING
              jobcount = w_jobnr
              sdlstrtdt = sy-datum
              sdlstrttm = sy-uzeit.
    SUBMIT (p_prog)
    WITH p_idocnr = p_idocno
               USER            p_user
               VIA   JOB       w_jobid
                     NUMBER  w_jobnr
                     AND       RETURN.
    CALL FUNCTION 'JOB_CLOSE'
         EXPORTING
              jobcount         = w_jobnr
              jobname          = w_jobid
              strtimmed        = 'X'
         IMPORTING
              job_was_released = w_jobrel.
    Does anyone have a clue of what I should do to prevent this?
    //  Regards  Hans

    Hi again,
    1. You are right.
    2. It will happen if we use SUBMIT.
    3. As per the help documentation, it will run the program in a separate session (as soon as the SUBMIT statement is reached)
    4. and run the INITIALIZATION event.
    The VIA JOB addition also loads the accessed program in a separate internal mode when the SUBMIT statement is executed, and the system performs all the steps specified before START-OF-SELECTION. This means the events LOAD-OF-PROGRAM and INITIALIZATION are triggered and selection screen processing is performed. If the selection screen is not processed in the background when VIA SELECTION-SCREEN is specified, the user of the calling program can edit it and schedule the accessed program in the background request using the function "Place in Job". If the user cancels selection screen processing, the program is not scheduled in the background job. In both cases, execution of the accessed program is completed after selection screen processing and the system returns to the calling program because of the AND RETURN addition.
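    In practice this means the accessed report should not change data in the events that already run during scheduling; here is a minimal sketch of how the accessed report could be structured (report and parameter names are hypothetical):
    REPORT z_called_report.
    PARAMETERS p_idocnr TYPE edidc-docnum.
    INITIALIZATION.
      " Already runs in the caller's session while the job is being
      " scheduled via SUBMIT ... VIA JOB, so keep only selection-screen
      " defaults here - no database changes.
    START-OF-SELECTION.
      " Runs only when the released background job is executed, so
      " data-creating logic belongs here and runs exactly once.
      WRITE: / 'Processing IDoc', p_idocnr.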
    regards,
    amit m.

  • Problem in creating job via submit

    SAP 4.7
    I am opening/scheduling a job from another program/job, and for some reason I get two jobs with the same name.
    One works fine, and the other one stays in status "Scheduled".
    My problem is: why do I get two jobs and not only one?
    And why does the one that I don't want and don't need at all stay as "Scheduled"?
    This is my code:
      CONCATENATE  EVT_ID JOBCOUNT  INTO JOBNAME
                                               SEPARATED BY SPACE.
      LCL_TIME = SY-UZEIT + 1000 .  " + 15 minute
      CALL FUNCTION 'JOB_OPEN'
        EXPORTING
          JOBNAME   = JOBNAME
        IMPORTING
          JOBCOUNT  = JOBCOUNT.  " AND RETURN
         CALL FUNCTION 'JOB_OPEN'
           EXPORTING
              JOBNAME          = JOBNAME
             DELANFREP        = SPACE
           IMPORTING
              JOBCOUNT         = JOBCOUNT
           EXCEPTIONS
              CANT_CREATE_JOB  = 1
              INVALID_JOB_DATA = 2
              JOBNAME_MISSING  = 3
              OTHERS           = 4.
       if sy-subrc <> 0 .
         MESSAGE  I099  WITH 'cant craete job' .
       endif.
            SUBMIT YITF_MOVE_TO_BCKP
             VIA JOB JOBNAME NUMBER JOBCOUNT
                TO SAP-SPOOL   IMMEDIATELY ' '
                DESTINATION    'LOCL'
                KEEP IN SPOOL  'X'
                WITHOUT SPOOL  DYNPRO
                WITH EVT_PARM =  EVT_PARM
                AND RETURN.
      CALL FUNCTION 'JOB_CLOSE'
        EXPORTING
          JOBCOUNT  = JOBCOUNT
          JOBNAME   = JOBNAME
          SDLSTRTDT              = SY-DATUM     "i_strtdt
          SDLSTRTTM              =  LCL_TIME
         STRTIMMED = 'X'
    This is the status in SM37:
    YITF_MFHBHY_LOAD 03304501           LEGACY          Scheduled
    YITF_MFHBHY_LOAD 03304501           LEGACY          Complete        04.03.2009 03:49:05

    Hi, use this code to fix the issue...
    DATA : v_jobhead LIKE tbtcjob.
    DATA : v_jobcount LIKE tbtcjob-jobcount.
    DATA : v_eventparm LIKE tbtcjob-eventparm.
    DATA : v_flg_released TYPE c.
    DATA: e_error.
    DATA: running LIKE tbtcv-run.
    TYPES: esp1_boolean LIKE boole-boole.
    CONSTANTS: esp1_false TYPE esp1_boolean VALUE ' ',
               esp1_true  TYPE esp1_boolean VALUE 'X'.
    CONSTANTS: true  TYPE boolean VALUE esp1_true,
               false TYPE boolean VALUE esp1_false.
    PARAMETERS: v_jobnam LIKE tbtcjob-jobname,
                v_report LIKE sy-repid,
                v_varian LIKE  raldb-variant,
                v_uname  LIKE sy-uname.
    START-OF-SELECTION.
    * add the new job
      CALL FUNCTION 'JOB_OPEN'
           EXPORTING
    *            delanfrep        = 'X'
                jobname          = v_jobnam
           IMPORTING
                jobcount         = v_jobcount
           EXCEPTIONS
                cant_create_job  = 1
                invalid_job_data = 2
                jobname_missing  = 3
                OTHERS           = 4.
  IF sy-subrc <> 0.
        e_error = true.
      ELSE.
        CALL FUNCTION 'JOB_SUBMIT'  " or you can use SUBMIT statement as well.
             EXPORTING
                  authcknam               = v_uname
                  jobcount                = v_jobcount
                  jobname                 = v_jobnam
                  report                  = v_report
                  variant                 = v_varian
             EXCEPTIONS
                  bad_priparams           = 1
                  bad_xpgflags            = 2
                  invalid_jobdata         = 3
                  jobname_missing         = 4
                  job_notex               = 5
                  job_submit_failed       = 6
                  lock_failed             = 7
                  program_missing         = 8
                  prog_abap_and_extpg_set = 9
                  OTHERS                  = 10.
    IF sy-subrc <> 0.
          e_error = true.
        ELSE.
          CALL FUNCTION 'JOB_CLOSE'
               EXPORTING
    *               EVENT_ID                    = IC_WWI_WORKPROCESS_EVENT
    *               EVENT_PARAM                 = V_EVENTPARM
    *               EVENT_PERIODIC              = 'X'
                    jobcount                    = v_jobcount
                    jobname                     = v_jobnam
                    strtimmed                   = 'X'
               IMPORTING
                    job_was_released            = v_flg_released
               EXCEPTIONS
                    cant_start_immediate        = 1
                    invalid_startdate           = 2
                    jobname_missing             = 3
                    job_close_failed            = 4
                    job_nosteps                 = 5
                    job_notex                   = 6
                    lock_failed                 = 7
                    OTHERS                      = 8.
      IF sy-subrc <> 0.
            e_error = true.
          ELSE.
            DO.
              CALL FUNCTION 'SHOW_JOBSTATE'
                EXPORTING
                  jobcount               = v_jobcount
                  jobname                = v_jobnam
            IMPORTING
*             aborted                =
*             finished               =
*             preliminary            =
*             ready                  =
              running                = running
*             scheduled              =
               EXCEPTIONS
                 jobcount_missing       = 1
                 jobname_missing        = 2
                 job_notex              = 3
                 OTHERS                 = 4.
          IF sy-subrc <> 0.
                e_error = true.
                MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                        WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
              ENDIF.
              IF running = space.
                EXIT.
              ENDIF.
            ENDDO.
          ENDIF.
        ENDIF.
      ENDIF.

  • Problem with background job using SUBMIT under different user

    Hi All,
    I am submitting a background job under a different user name. I am submitting this job using JOB_OPEN, SUBMIT report, and JOB_CLOSE. But the SUBMIT statement returns sy-subrc 8, and because of this JOB_CLOSE fails. Can you please help me solve this problem?
    Thanks in advance.
    Tjgupta

    Hi,
    The user has all authorizations. Is there any difference between user types?
    This user has the user type "Communications Data".
    Thanks,
    Tjgupta

  • Problem in delta load with Z field

    Hello Experts,
    We have a CRM 5.0 system, and as per the requirement we have added a field ZFIELD to the BUT000 table. This field was added to BUT000 via EEWB (Easy Enhancement Workbench). The field is also BW-enabled, which means it is available to the BW extractor for further analysis.
    We update this ZFIELD with a custom program using FM BUPA_CENTRAL_CI_CHANGE.
    When a full load is done, the data flows to BW correctly and the ZFIELD values are reflected correctly. But when we do a delta load it does not work.
    If we use transaction BP, the delta works perfectly, so an issue on the BW side is ruled out. FM BUPA_CENTRAL_CI_CHANGE is not triggering the change record for this BP, but I don't see any other FM/BAPI to use so that a change record is also created.
    Could you please provide any pointers or checks to overcome this problem?
    Thanks in advance.

    Hi Ashtankar,
    When you change the attribute (say ZFIELD) and save the transaction in BP, does this update the change date (CHDAT) in BUT000?
    If the change date is being updated, then the delta should not be a problem, assuming you are using DataSource 0BPARTNER_ATTR.
    To check whether the delta is being captured or not, you can use transaction RSA7. Also, since you say the full load works while the delta doesn't, try re-initializing the DataSource from BW after DataSource replication, because changes have been made in the source system that the BW system/delta queue may need to take into account.
    Regards,
    Thejas K

  • Data upload problem in delta update from 1st ODS to 2nd ODS

    Dear Friends,
    I am loading data from one ODS to another. The update mode was full upload. Some time back an error occurred during activation of the first ODS. The error was: "Full updates already available in ODS, cannot update init./delta". So currently the daily records are pulled but not added, i.e. transferred records = 4000 but added records = 0.
    When I looked for a solution on SDN, I found that running program RSSM_SET_REPAIR_FULL_FLAG for the 2nd ODS resets all full uploads to repair full requests, which I have already done for the 2nd ODS. Then initialize once and pull deltas.
    But the problem is that I cannot set the update mode to delta, because I am pulling some 80,000 records into the 2nd ODS from the 1st ODS, which holds around 1.4 million records daily, based on some data-selection filters in the InfoPackage, and I do not see any parameters for data selection in delta mode.
    Please suggest.
    Regards,
    Amit Srivastava

    Dear Sirs,
    Due to this error in activation in the 2nd ODS, the daily data upload in the 1st ODS is failing.
    To correct this, I converted all full upload requests in the 2nd ODS to repair full requests.
    But now, when I scheduled the InfoPackage today with a full upload, the data was again transferred but not added.
    I know I cannot have init./delta, so what can be done in this scenario? Please help.
    Regards,
    Amit Srivastava
