Export SCHEDULE problem

Hello,
please help me. I want to schedule an export and I have created a job with this specification:
BEGIN
  DBMS_SCHEDULER.CREATE_SCHEDULE (
    schedule_name   => 'EXPORT_SCHEDULE1',
    start_date      => SYSTIMESTAMP,
    end_date        => SYSTIMESTAMP + INTERVAL '30' day,
    repeat_interval => 'FREQ=MINUTELY; INTERVAL=1',
    comments        => 'Every 1 HOUR');
END;

BEGIN
  DBMS_SCHEDULER.CREATE_PROGRAM (
    program_name   => 'EXPORT_PROGRAM1',
    program_action => ' exp system/ali120 FILE=/home/oracle/alifiles/exp1.dmp TABLES=alit1',
    program_type   => 'EXECUTABLE',
    comments       => 'My comments here',
    enabled        => TRUE);
END;

BEGIN
  DBMS_SCHEDULER.CREATE_JOB (
    job_name      => 'EXP_JOB1',
    program_name  => 'EXPORT_PROGRAM1',
    schedule_name => 'EXPORT_SCHEDULE1',
    enabled       => TRUE);
END;
My job executes, but the export file (/home/oracle/alifiles/exp1.dmp) is not created. It makes no difference whether I use exp or expdp.
When I query DBA_SCHEDULER_JOBS, the STATE of EXP_JOB1 stays RUNNING forever, and when I use dbms_scheduler.run_job to run the job, it does not return and the export file is not created.

Hi,
A couple of things:
- 'FREQ=MINUTELY; INTERVAL=1' is actually once every minute, not once an hour, but I think you figured that out.
- 'exp system/ali120 FILE=/home/oracle/alifiles/exp1.dmp TABLES=alit1'
There are several problems here. You need to give the full path to exp, and exp probably requires several environment variables to be set. You are much better off putting the entire call, together with the environment variable settings, into a shell script and calling that.
Only the binary or script should be in program_action; the other elements should be defined as program arguments ("system/ali120", "FILE=/home/oracle/alifiles/exp1.dmp", "TABLES=alit1"). See the sketch below.
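For illustration only, here is a minimal sketch of that layout. It assumes a hypothetical wrapper script /home/oracle/alifiles/run_exp.sh (the name is made up) that sets ORACLE_HOME, ORACLE_SID and PATH and then calls exp by its full path; the argument values are taken from your original command. A program with arguments has to be created disabled and enabled only after all its arguments are defined:

BEGIN
  DBMS_SCHEDULER.CREATE_PROGRAM (
    program_name        => 'EXPORT_PROGRAM1',
    program_type        => 'EXECUTABLE',
    program_action      => '/home/oracle/alifiles/run_exp.sh',  -- script only, no arguments here
    number_of_arguments => 2,
    enabled             => FALSE,
    comments            => 'Wrapper script that sets the environment and runs exp');

  -- arguments are passed separately, not embedded in program_action
  DBMS_SCHEDULER.DEFINE_PROGRAM_ARGUMENT (
    program_name      => 'EXPORT_PROGRAM1',
    argument_position => 1,
    argument_type     => 'VARCHAR2',
    default_value     => 'FILE=/home/oracle/alifiles/exp1.dmp');

  DBMS_SCHEDULER.DEFINE_PROGRAM_ARGUMENT (
    program_name      => 'EXPORT_PROGRAM1',
    argument_position => 2,
    argument_type     => 'VARCHAR2',
    default_value     => 'TABLES=alit1');

  DBMS_SCHEDULER.ENABLE ('EXPORT_PROGRAM1');
END;

With this layout the script receives the defined arguments as $1 and $2 and passes them on to exp.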
I am not sure why the job hangs (you can use stop_job to force it to stop; see the one-liner below). Try calling extjob from the command line and make sure it returns without any errors.
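For reference, a forced stop of the stuck job would look something like this (force => TRUE normally requires the MANAGE SCHEDULER privilege):

BEGIN
  DBMS_SCHEDULER.STOP_JOB (job_name => 'EXP_JOB1', force => TRUE);
END;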
Hope this helps,
Ravi.

Similar Messages

  • Production Orders: Scheduling Problem

    Hello,
    I have a scheduling problem. I want the system to calculate the production order dates.
    I have maintained the following:
    1. Work Center: a formula to calculate the duration. I have marked the "relevant to finite scheduling" indicator in the Capacity Header tab and have also maintained the available capacity as 24 hrs.
    2. Routing: I have maintained the values for the standard value keys.
    3. SPRO: in the scheduling parameters for the order type, I have marked "Generate cap requirements" and "Scheduling"; in the production scheduling profile, I have marked "Schedule order" (on release) & "Finite Scheduling" (availability check).
    But the production order is not getting scheduled, and the start & end times are the same. I get the error "No req/capacity exists or not relevant for finite scheduling".
    Can anyone guide me as to what I am missing?
    Best regards
    Tom

    Dear Tom,
    Please check the following:
    1. Your operation control key: whether it allows scheduling.
    2. The scheduling parameters for the production order in OPU3: please select the proper combination of plant and order type, and tick the "Scheduling" and "Generate capacity requirements" indicators in the detailed scheduling tab.
    Please revert if the problem persists.
    Best Regards
    Uday

  • Exporting scheduled financial report batches with LCM

    Hi folks. Running 11.1.1.3 EPM here.
    I'm trying to use LCM to export scheduled Financial Reporting batches. When I am in Shared Services under Reporting and Analysis, under the Scheduled Objects drilldown, I don't see any scheduled objects yet I have about 200 FR batch reports scheduled on my dev server.
    Am I looking in the wrong place to export these or is there another way?
    Thanks,
    Tom

    Sorry, I wasn't aware the folder structure was different in 11.1.2. I found this in the LCM admin guide:
    Default value: This parameter is commented out and the Lifecycle Management engine uses the default file system location on the Shared Services computer; for example:
    MIDDLEWARE_HOME/user_projects/epmsystem1/import_export/username@ProviderName.
    Do you see it now?

  • Scheduling problem in LSMW for Sales Order

    I am doing an LSMW for Sales Orders using the BAPI method.
    I am able to upload the item details along with their quantities, but the problem I am now facing is with scheduling. I am unable to schedule the item quantities: only the last quantity passed is taken as the final quantity, and the consolidated quantity is also getting ignored.
    I am using the E1BPSCHDL structure; in this structure I am mapping Itm_number, req_date and req_qty (scheduled qty).
    In the structure E1BPSCHDLX, the same fields are mapped as 'X'.
    In the structure E1BPSDITM, I am mapping the item info along with the consolidated qty (TARGET_QTY), but this is getting ignored; the last scheduled qty is taken as the final consolidated qty and the rest of the scheduled qty data is getting rejected.
    Please do help. I don't know why this problem is occurring; I have tried any number of alternatives with no luck.
    with regards,
    Daya.

    Check the format of your date against the format expected.
    Sometimes you can have differences like YYYYMMDD, MM.DD.YYYY, ...
    Look at the converted data.
    Hope this helps,
    Erwan

  • JOB Scheduling problem

    Hi All,
    My problem is the following:
    I would like to schedule two jobs from one function module, but I cannot get the second job to run after the first. I wrote the code below using the PREDJOB_CHECKSTAT parameter of JOB_CLOSE, but when the mother function module is called, the two jobs start immediately and in parallel. What is wrong in the following code?
    Thanks /and points :)/ for your help!
    Tamas
          L_JOBNAME = 'REQUEST_COPY'.
          CALL FUNCTION 'JOB_OPEN'
            EXPORTING
              JOBNAME  = L_JOBNAME
            IMPORTING
              JOBCOUNT = L_JOBCOUNT.
          SUBMIT RSSEM_REQUEST_COPY
                       WITH RNR = I_RNR
                       WITH S_CUBE = LS_CUBES-SOURCE_CUBE
                       WITH T_CUBE = LS_CUBES-TARGET_CUBE
                       USER SY-UNAME VIA JOB L_JOBNAME NUMBER L_JOBCOUNT
                       AND RETURN.
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              JOBCOUNT  = L_JOBCOUNT
              JOBNAME   = L_JOBNAME
              STRTIMMED = 'X'.
          L_JOBNAME2 = 'REQUEST_CLOSE'.
          CALL FUNCTION 'JOB_OPEN'
            EXPORTING
              JOBNAME  = L_JOBNAME2
            IMPORTING
              JOBCOUNT = L_JOBCOUNT2.
         SUBMIT Z_REQUEST_CLOSE_ZSD_P25
                   USER SY-UNAME VIA JOB L_JOBNAME2 NUMBER L_JOBCOUNT2
                   AND RETURN.
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              JOBCOUNT  = L_JOBCOUNT2
              JOBNAME   = L_JOBNAME2
              STRTIMMED = 'X'
              PREDJOB_CHECKSTAT = 'X'
              PRED_JOBCOUNT = L_JOBCOUNT
              PRED_JOBNAME = L_JOBNAME.

    Hi Thomas,
    Thanks for all the help, I found the solution!
    I inserted a select on the job status table into the second job's definition.
    Thanks for the ideas!
    Tamás
    The final code is the following:
          L_JOBNAME = 'REQUEST_COPY'.
          CALL FUNCTION 'JOB_OPEN'
            EXPORTING
              JOBNAME  = L_JOBNAME
            IMPORTING
              JOBCOUNT = L_JOBCOUNT.
          SUBMIT RSSEM_REQUEST_COPY
                       WITH RNR = I_RNR
                       WITH S_CUBE = LS_CUBES-SOURCE_CUBE
                       WITH T_CUBE = LS_CUBES-TARGET_CUBE
                       USER SY-UNAME VIA JOB L_JOBNAME NUMBER L_JOBCOUNT
                       AND RETURN.
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              JOBCOUNT  = L_JOBCOUNT
              JOBNAME   = L_JOBNAME
              STRTIMMED = 'X'.
          L_JOBNAME2 = 'REQUEST_CLOSE'.
          CALL FUNCTION 'JOB_OPEN'
            EXPORTING
              JOBNAME  = L_JOBNAME2
            IMPORTING
              JOBCOUNT = L_JOBCOUNT2.
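          " wait until the predecessor job REQUEST_COPY has finished (TBTCO-STATUS = 'F')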
          DO.
            SELECT SINGLE STATUS FROM TBTCO INTO L_STATUS
              WHERE JOBNAME = L_JOBNAME
              AND JOBCOUNT = L_JOBCOUNT.
            IF L_STATUS = 'F'.
              EXIT.
            ELSE.
              WAIT UP TO 1 SECONDS.
            ENDIF.
          ENDDO.
          SUBMIT Z_REQUEST_CLOSE_ZSD_P25
                    USER SY-UNAME VIA JOB L_JOBNAME2 NUMBER L_JOBCOUNT2
                    AND RETURN.
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              JOBCOUNT          = L_JOBCOUNT2
              JOBNAME           = L_JOBNAME2
              STRTIMMED         = 'X'
              PREDJOB_CHECKSTAT = 'X'
              PRED_JOBCOUNT     = L_JOBCOUNT
              PRED_JOBNAME      = L_JOBNAME.

  • Scheduling Problem for uploading Data from Flat file to SAP

    Hi guys,
    I am facing a weird problem uploading some leave records into a Z table. The code works fine if we run it through SE38 after selecting the file from a shared location on the production server, which has all the access rights.
    This folder lies in the \usr folder of SAP Production.
    I have kept all the flat files in the shared path \\Tis-mum-iz-s1\migration\SAP-INT\leave\ ...
    To give you the exact directory structure:
    Tis-mum-iz-s1 is the server name.
    usr is the SAP system folder used for uploads and downloads:
    usr -> Migration -> SAP-INT -> leave -> (flat files)
    The Migration folder is shared with all rights.
    Obviously, we cannot give the shared drive as the variant in the scheduler, so I use the system path, i.e. \usr\sap\tmp\migration\sap-int\leave\, as the variant.
    All my other download programs are working fine with this path as a variant, but this particular upload program does not work with it.
    I am giving you my code:
    * TATA INTERACTIVE SYSTEMS (A Division of TATA INDUSTRIES LIMITED)
    * REPORT      :  ZMIGRATE_ZLEAVE
    * DESCRIPTION :  To upload the leave data (ZLEAVE)
    * CREATED BY  :  Abhishek Bachhawat
    * CREATED ON  :  01.09.2005
    * CONSULTANT  :  ANAND
    REPORT  ZMIGRATE_ZLEAVE.
    TABLES: ZLEAVE.
    data: begin of wtab,
              MANDT(3),
              ZLVID(8),
              PERNR(8),
              ZSTDT(8),
              ZENDT(8),
              ZDAYS(4),
              AEDAT(8),
              ERDAT(8),
          end of wtab,
          itab like WTAB occurs 0 WITH HEADER LINE.
    data: temp like zleave occurs 0 WITH HEADER LINE.
    SELECTION-SCREEN BEGIN OF BLOCK file
                   WITH FRAME TITLE text-005.
    parameters: file like rlgrap-filename Obligatory.
    Concatenate File SY-DATUM '_Leave.txt' into File.
    SELECTION-SCREEN END OF BLOCK file.
    at SELECTION-SCREEN ON VALUE-REQUEST FOR file .
      CALL FUNCTION 'WS_FILENAME_GET'
        IMPORTING
          FILENAME = file.
      IF SY-SUBRC <> 0.
      ENDIF.
    start-of-selection.
      if file ne space.
        CALL FUNCTION 'WS_UPLOAD'
          EXPORTING
            FILENAME = FILE
            FILETYPE = 'DAT'
          TABLES
            DATA_TAB = ITAB.
      else.
        message e000(zps) with 'Specify a file'.
      endif.
      SORT ITAB BY ZLVID.
      LOOP AT ITAB.
        REFRESH TEMP.
        CLEAR TEMP.
        TEMP-MANDT = sy-mandt.
        TEMP-ERDAT = SY-DATUM.
        TEMP-ZLVID = ITAB-ZLVID.
        TEMP-PERNR = ITAB-PERNR.
        TEMP-ZSTDT = ITAB-ZSTDT.
        TEMP-ZENDT = ITAB-ZENDT.
        TEMP-ZDAYS = ITAB-ZDAYS.
        TEMP-AEDAT = ITAB-AEDAT.
        TEMP-ERDAT = ITAB-ERDAT.
        APPEND TEMP.
        SELECT SINGLE *
               FROM   ZLEAVE
               WHERE  ZLVID = TEMP-ZLVID
               AND    PERNR = TEMP-PERNR.
        IF SY-SUBRC = 0.
          UPDATE ZLEAVE SET ZSTDT = TEMP-ZSTDT
                            ZENDT = TEMP-ZENDT
                            ZDAYS = TEMP-ZDAYS
                            AEDAT = TEMP-AEDAT
                            ERDAT = TEMP-ERDAT
                 WHERE ZLVID = TEMP-ZLVID
                 AND   PERNR = TEMP-PERNR.
        ELSE.
          INSERT ZLEAVE FROM TABLE TEMP.
          COMMIT WORK.
        ENDIF.
      ENDLOOP.

    Hi,
    " In a background job there is no frontend PC, so WS_UPLOAD cannot work.
    " Read the file from the application server with OPEN DATASET instead:
    OPEN DATASET file FOR INPUT IN TEXT MODE.
    CHECK sy-subrc = 0.
    WHILE sy-subrc = 0.
      READ DATASET file INTO wa.   " wa = work area with the same layout as your WTAB
      IF sy-subrc = 0.
        APPEND wa TO itab.
      ELSE.
        EXIT.
      ENDIF.
    ENDWHILE.
    CLOSE DATASET file.
    regards
    Siggi
    PS: check also the F1-help for open, read and close statements!

  • SES Crawler Export schedule?

    Hi
    I have set up a SES Crawler Export on my UCM and have SES crawl this export, and it works fine. However, sometimes SES does not crawl newly created documents in UCM, and the only way I can get this done is by manually triggering a SES Crawler Export from UCM (taking a new snapshot) and then letting SES crawl the export again.
    When does UCM trigger the SES Crawler Export? Does it run on a schedule? I have found no configuration or documentation for this; can I change the schedule?
    Thank you
    Søren

    I don't understand the question, so I imagine it should be posted in the UCM forum rather than the Secure Enterprise Search forum.
    If you think it is SES related, please try to explain your problem more clearly. For example I don't know what "IBR" is.

  • OIM11gR2 - Deployment Manager - Export Scheduled Task?

    Hello,
    I've developed two scheduled tasks and they work as expected.
    Now I'm trying to export them via deployment manager to migrate to another environment.
    Problem: when I select "Scheduled Task" in the deployment manager wizard my two scheduled tasks are not listed, therefore I can't export them.
    Any clue why two scheduled tasks would exist, work, but not be listed in the deployment manager for export?
    Thanks,
    Adr.

    Does it delete the c:\GALexport.csv file? If not, then it's not even executing the .ps1 script.
    - Open a cmd prompt and run the command below to confirm that there isn't a typo or any other small error:
    C:\Windows\System32\WindowsPowerShell\v1.0\PowerShell.exe -version 2.0 -NonInteractive -WindowStyle Hidden -command ". 'C:\Program Files\Microsoft\Exchange Server\V14\bin\RemoteExchange.ps1'; Connect-ExchangeServer -auto; d:\Scripts\GalExportReport.ps1"
    - If the above works, then something is wrong with the task scheduler configuration.

  • Adobe Export pdf Problems

    I have subscribed to Adobe PDF Export on my iPad Air; I have received a confirmation e-mail, but I am asked to subscribe again when I try to export. Please help.

    Thank you for the answer, I got it!
    Dionyzio

  • CRM 7.0 EHP3 Installation Export CD problem

    Dear all,
    we are installing SAP CRM 7.0 EHP3 on MSSQL 2012.
    We are encountering a problem with the Installation Export CD.
    SWPM is SWPM10SP05_2-20009707.SAR - SP5 PL 2, and the kernel is 741 (Material Number: 51048107_8).
    The Installation Export Material Numbers are 51047869_2 and 51047869_3 respectively.
    When asked for the Installation Export CD, we enter the right path, but SWPM tells us that it is expecting another CD:
    You entered: E:\CD SAP\EXPORT\DATA_UNITS\EXP1 Found the label SAP:CRM:SR170EHP3:EXPORT(1/4):SAP CRM 700 EHP 3 SR1 Installation Export CD 1/4:CDxxxxxxx_1 but need the label SAP:CRM:70EHP3:EXPORT(1/4):*:*
    It seems that it is asking for an older version of the Export CD.
    Has anyone already had this issue and can advise how to solve it?
    Thanks in advance.
    Kind regards,
    Andrea Visentin

    Dear all,
    ok, I'll answer myself:
    as per note 1998693 (Sapinst Error: Found the label SAP:UT:SR1740:SP05:*:* but need the label SAP:UT:740), for a similar "issue" we had installing Process Integration today, we have just realized that there is a specific section in SWPM for SR1.
    Sorry for having bothered you with such a silly question :-)
    Kind regards,
    PS: changing the LABEL.ASC CD label to the non-SR1 package also works.
    Andrea Visentin

  • Background job scheduling problem

    Hi. Can anyone check the program below and correct it?
    I am unable to see the output in SP01 (spool request).
    My program:
    REPORT  zh_test4.
    TABLES : mara, TBTCO.
    DATA : BEGIN OF itab OCCURS 0,
          matnr LIKE mara-matnr,
          END OF itab.
    ****background data declarations
    data : job_name like TBTCO-JOBNAME.
    data : job_num like TBTCO-JOBCOUNT,
           rep like sy-repid.
    ***selection screen
    PARAMETERS : p_matnr LIKE mara-matnr default '1500-610'.
    SELECT matnr FROM mara INTO TABLE itab WHERE matnr EQ p_matnr.
    job_name = 'HARI'.
    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
      DELANFREP              = ' '
      JOBGROUP               = ' '
        jobname                = job_name
      SDLSTRTDT              = NO_DATE
      SDLSTRTTM              = NO_TIME
    IMPORTING
       jobcount               = job_num
    EXCEPTIONS
       CANT_CREATE_JOB        = 1
       INVALID_JOB_DATA       = 2
       JOBNAME_MISSING        = 3
       OTHERS                 = 4
    IF sy-subrc <> 0.
    write :/ ' Job opening problem'.
    else.
    write :/ 'Job succesfully opened', sy-subrc.
    ENDIF.
    MOVE SY-UNAME TO TBTCO-AUTHCKNAM.
    rep = sy-repid.
    job_name = 'HARI'.
    CALL FUNCTION 'JOB_SUBMIT'
      EXPORTING
      ARCPARAMS                         =
        authcknam                         = SY-UNAME
        jobcount                          = job_num
        jobname                           = job_name
      LANGUAGE                          = SY-LANGU
      PRIPARAMS                         = ' '
       REPORT                            = 'ZH_TEST4'
      VARIANT                           = 'VAR'
    IMPORTING
      STEP_NUMBER                       =
    EXCEPTIONS
       BAD_PRIPARAMS                     = 1
       BAD_XPGFLAGS                      = 2
       INVALID_JOBDATA                   = 3
       JOBNAME_MISSING                   = 4
       JOB_NOTEX                         = 5
       JOB_SUBMIT_FAILED                 = 6
       LOCK_FAILED                       = 7
       PROGRAM_MISSING                   = 8
       PROG_ABAP_AND_EXTPG_SET           = 9
       OTHERS                            = 10
    IF sy-subrc <> 0.
    WRITE :/ 'JOB SUBMIT PROBLEM',
              job_name,
              job_num,
              rep,
              sy-subrc.
    else.
    write :/ 'Job succesfully submitted in background', sy-subrc.
    ENDIF.
    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobcount                          = job_num
        jobname                           = job_name
      LASTSTRTDT                        = NO_DATE
      LASTSTRTTM                        = NO_TIME
      PRDDAYS                           = 0
      PRDHOURS                          = 0
      PRDMINS                           = 0
      PRDMONTHS                         = 0
      PRDWEEKS                          = 0
      PREDJOB_CHECKSTAT                 = ' '
      PRED_JOBCOUNT                     = ' '
      PRED_JOBNAME                      = ' '
      SDLSTRTDT                         = datum
      SDLSTRTTM                         = uzeit
      STARTDATE_RESTRICTION             = BTC_PROCESS_ALWAYS
       STRTIMMED                         = 'X'
      TARGETSYSTEM                      = ' '
      START_ON_WORKDAY_NOT_BEFORE       = SY-DATUM
      START_ON_WORKDAY_NR               = 0
      WORKDAY_COUNT_DIRECTION           = 0
      RECIPIENT_OBJ                     =
      TARGETSERVER                      = ' '
      DONT_RELEASE                      = ' '
      TARGETGROUP                       = ' '
    IMPORTING
      JOB_WAS_RELEASED                  = 'X'.
    EXCEPTIONS
       CANT_START_IMMEDIATE              = 1
       INVALID_STARTDATE                 = 2
       JOBNAME_MISSING                   = 3
       JOB_CLOSE_FAILED                  = 4
       JOB_NOSTEPS                       = 5
       JOB_NOTEX                         = 6
       LOCK_FAILED                       = 7
       INVALID_TARGET                    = 8
       OTHERS                            = 9
    IF sy-subrc <> 0.
    write :/ 'Unable to close the Job', rep, sy-subrc.
    else.
    write :/ 'Succesfully closed the job', sy-subrc.
    ENDIF.

    Here is an example, slightly different from your version.
    REPORT ztest.
    PARAMETERS: p_vbeln LIKE vbak-vbeln,
                p_bkrun NO-DISPLAY.
    DATA: ls_vbak LIKE vbak.
    DATA: v_answer,
          v_jobcount LIKE tbtcjob-jobcount.
      IF p_bkrun IS INITIAL.
    *-- not background processing
        CALL FUNCTION 'POPUP_TO_CONFIRM_STEP'
             EXPORTING
                  textline1      = 'This may time out.'
                  textline2      = 'Do you want to run in background?'
                  titel          = 'Warning!!!'
                  cancel_display = space
             IMPORTING
                  answer         = v_answer.
        IF v_answer = 'J'.
    *-- run in the background
          CALL FUNCTION 'JOB_OPEN'
               EXPORTING
                    jobname          = 'ZTEST'
               IMPORTING
                    jobcount         = v_jobcount
               EXCEPTIONS
                    cant_create_job  = 1
                    invalid_job_data = 2
                    jobname_missing  = 3
                    OTHERS           = 4.
          IF sy-subrc <> 0.
            MESSAGE ID sy-msgid TYPE 'E' NUMBER sy-msgno
                    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
            EXIT.
          ENDIF.
    *-- submit the program in the background
          SUBMIT ztest
            WITH p_bkrun = 'X'
            WITH p_vbeln = p_vbeln
            USER sy-uname
            VIA JOB 'ZTEST' NUMBER v_jobcount AND RETURN.
    *-- close the job
          CALL FUNCTION 'JOB_CLOSE'
               EXPORTING
                    jobcount             = v_jobcount
                    jobname              = 'ZTEST'
                    strtimmed            = 'X'
               EXCEPTIONS
                    cant_start_immediate = 1
                    invalid_startdate    = 2
                    jobname_missing      = 3
                    job_close_failed     = 4
                    job_nosteps          = 5
                    job_notex            = 6
                    lock_failed          = 7
                    OTHERS               = 8.
          IF sy-subrc <> 0.
            MESSAGE ID sy-msgid TYPE 'W' NUMBER sy-msgno
                    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
          ENDIF.
          EXIT.
        ELSE.
          CLEAR v_answer.
        ENDIF.
      ENDIF.
      CHECK v_answer IS INITIAL.
      SELECT SINGLE * FROM vbak
                      INTO ls_vbak
                     WHERE vbeln = p_vbeln.
      IF sy-subrc <> 0.
        WRITE:/ 'Invalid Order Id.'.
      ENDIF.
    END-OF-SELECTION.
      CHECK v_answer IS INITIAL.
      IF p_bkrun IS INITIAL.
        WRITE:/ 'Here is the result running the program in foreground.'.
      ELSE.
        WRITE:/ 'Here is the result running the program in background.'.
      ENDIF.
      WRITE:/ ls_vbak-vbeln,
              ls_vbak-vkorg.

  • Background job scheduling problem in APO

    Hi fellow SDNers,
    I am going through a peculiar problem with background job scheduling.
    The scenario is: I have a CSV (Excel) file on the application server which loads data into the InfoSource, and I have scheduled the InfoPackage to run in the background after an event is triggered (the scheduling option in the InfoPackage's Scheduling tab).
    Now everything seems to be fine, but the data is not getting loaded. Could you please help me out: how do I load data from the Excel file in the background after an event gets triggered?
    thanks in advance,
    Rohan

    hi Alexander,
    I am triggering the event with the BP_EVENT_RAISE FM in APO by passing the event ID. This automatically raises the event, just like SM64.
    Thanks
    Rohan

  • Scheduling problem with delivery date in purch.req and requirement date

    Dear all,
    I have the problem that the calculated delivery date in the purchase requisition does not correspond with the requirement date of the production or planned order.
    I would like to use planned delivery times in the material master.
    Example:
    Planned delivery time: 20 days
    Today's date: 01.10
    Case 1)
    Requirement date of planned order: 10.10
    Case 2)
    Requirement date of planned order: 20.09
    This situation can happen in our environment, as the engineering process continues after the start of production. In this case I would like to have the following results:
    Case 1)
    Delivery date is 10.10 minus the goods receipt time
    Case 2)
    Delivery date is day of MRP or one or two days more
    I made the setting that dates in the past are not allowed; this solves my first small problem. I tried setting the scheduling margin key, but those settings do not help me, as they apply to all materials, and I of course need individual planned delivery dates per material.
    So I want to change the standard MRP procedure so that, in the case of the wished requirement date, the wished date is taken rather than the calculated date. Usually you receive an exception for this case and would need to resolve it manually, but we do not want to do that.
    Can someone please help me? Thank you all very much, I really don't know what to do in this case.
    With best regards,
    Bjoern

    As I see it, the question cannot be answered by the SAP standard. We will go with a workaround.

  • Background Scheduling Problem

    Hi everyone,
    I am facing a problem with BEx reporting. After loading new data into the respective data targets, it is not reflected in the BEx Analyzer. I have created the process chain, and the loading is done via background scheduling. I can see the updated request time and date in the monitor, and it shows that the data was updated successfully in the data targets.
    But only when I execute the report after about 2 hours is the data reflected in the query and the latest updated records shown.
    If anybody knows the reason why it takes time to be reflected in the query, please give me some pointers.
    Thanks,
    Pandey

    Hi Manoranjan,
    If your data target is an ODS, you can find the updated records for reporting only after activation. If it is an InfoCube, there could be some processes defined after the data load, like index creation, statistics and rollup if aggregates are present. Maybe this could be the reason.
    Hope this helps..
    Best Regards,
    DMK
    *Assign points if it serves your purpose...

  • Premiere CS4 to Encore CS4 Export  Quality Problems

    I recently purchased a non-linear editing machine with an RTX2 Matrox card and CS4 Professional. After six or seven projects, I finally noticed that all my imported .avi files, exported through Adobe Media Encoder and burned through Encore CS4, are not as crisp or sharp as the original high-quality SD footage once the DVD is burned, the way they always looked in CS2 (I jumped from CS2 to CS4).
    These DVDs were of plays, football games, and basketball games. It seems that when the camera shoots a close-up, the video looks fine, but when you shoot from a distance, there is a considerable difference in the footage (almost blurry). I just caught this and nobody has complained, but I watched one of my older DVDs produced with CS2 and it had no quality loss whatsoever. There is a night-and-day difference.
    For the sequences, I am using Matrox SD, NTSC, Standard, 720 x 486  and capturing with firewire through the Matrox card. The raw footage is beautiful. The Video Preview Option is NTSC Standard and the Codec is NTSC Matrox DV/DV Cam.
    I can playback a timeline from the Matrox card and burn directly to a DVD burner and the footage is perfect, but when I export, I have chosen Microsoft .avi file, Matrox DV/DV Cam/ MPEG2, and MPEG2DVD.
    I have even changed the sequences codec to Matrox MPEG-2 I-Frame and tried exporting using that out of Adobe Media Encoder and am getting the same results.
    None of my customers have complained yet, but I definitely do not want to continue producing almost blurry DVDs.
    I am using Adobe Premiere CS4 Version 4.1.0 and Encore CS4 Version 4.0.1.048 and Matrox Version 4.1.0.23
    Can this problem be related to, and eliminated by, updating my CS4 Production Suite and my Matrox software?
    I was wondering if anyone has experienced this same issue and what solution they came up with, or if anyone knows whether the updates would correct this important issue. I just filmed a lengthy play that I have to author through Encore CS4, and I cannot release any more of my productions until this problem is solved. This is a big-dollar production and I need assistance on this matter as soon as possible. Thanks, Jimi White

    Thanks for taking the time to help me out. I just purchased my non-linear with CS4 Production Suite and the Matrox RTX2. I have several projects that I am working on right now and I was advised by Adobe, Matrox, and my non-linear vendor, 1 Beyond, to finish those projects before performing updates.
    I also found out that If I export using the MPEG2  8Mbps, it cleared up the video dramatically. We had to go through a number of export presets in order to find out what worked best in Encore. I am going to try to finish the project now and will advise on the results.
    This is a basketball highlight DVD, and that's when I noticed the problem. My other productions were photos set to music and non-sporting events, and I didn't notice any problems with those DVDs (they actually looked decent). The worst part about it was that I finished the project, burned 18 DVDs, and then reviewed one of them and found the problem. I have to start over and just eat the DVDs, printed and all.
    Thanks again, it was nice of you to get back to me.
    Jimi
