Import - Validation

Hello All,
I have a query regarding import workflow validation.
I want to validate records while importing through the Import Manager. I have created an import workflow, but I am making some mistake, so the workflow is not functioning properly.
Requirement: I have 3 fields in the main table, of which 1 is mandatory. If I import records in bulk and configure validation through an import workflow, what are the steps, and how will it function or show results if I leave this mandatory field empty in the source file? (step-by-step workflow)
Also one more query: validations cannot be applied to tuples. Correct me if I am wrong.
Rgds
Himanshu
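As a side note, the mandatory-field check itself is simple to reason about. A hypothetical sketch in Python of what the import-workflow validation decides per record (the field name and the pass/fail split are invented for illustration; in MDM this is configured in Data Manager and triggered by the import workflow):

```python
MANDATORY = "Field1"   # hypothetical: the one mandatory field of the three

def split_import(records):
    """Partition a bulk import into records that pass and records that fail
    the mandatory-field validation. In an import workflow, failing records
    would be held at the validation step instead of being checked in."""
    passed = [r for r in records if str(r.get(MANDATORY, "")).strip()]
    failed = [r for r in records if not str(r.get(MANDATORY, "")).strip()]
    return passed, failed
```

Records with the mandatory field missing or empty in the source file end up in the failed list, which mirrors how the workflow would surface them as validation errors.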

Hi,
First go to Matching mode in the Data Manager; there, select the tuple name from the dropdown available in the left-side pane of the Validation tab. After selecting the tuple, you can create validations at the tuple level. This applies to SP02. In SP01, you can find validations through the menu path "Records-Validations-Tuple Validations".
Regards,
Prashant
Edited by: Prashant Malik on Oct 1, 2009 10:06 AM

Similar Messages

  • Import Validated invoices in Payables

    Hi All,
We are using Payables Open Interface Import to import invoices and then running Invoice Validation to validate these invoices. But functionally, the amount of validation being done is minimal: we do not intend to match these invoices, and there is just 1 invoice line. Once validated, we intend to pay them.
For us, the bottleneck is the performance of the Invoice Validation program. So we were wondering if there is any way we could skip this validation, i.e. could we import invoices in an already-validated state? Any other suggestions are also welcome.
    Regards,
    Ravi

I haven't heard about importing invoices in a validated state, but I import my invoices via a PL/SQL program and validate them as soon as they are imported. This makes the manual process of looking for an imported invoice and then validating it by hand unnecessary.

  • Order Import - Validation failed for the field - End Customer Location

    I am trying to import an order using standard import program and it is failing with below error:
    - Validation failed for the field - End Customer Location
I have checked Metalink to debug this and found nothing; the values I have used to populate the interface table are valid.
Can someone please post your views on where the problem might be?
    Thanks!

    Hello,
Try the following:
1) Check whether any processing constraints are applied to the customer/location.
2) Try to create a new customer and associate an internal location, then create a new order to reproduce the issue.
    Thanks
    -Arif

  • Doc Import Param to suppress XML import validation message

    Hi,
    I'm running a script where the first part is to import an xml document, then to make some changes to the imported xml.
    Problem is, the xml may not always be valid. If it isn't I would like to still make the changes regardless.
Unfortunately, my code does not run through, as I end up with the "Validation of XML file failed. Continue?" prompt.
Can I somehow suppress this, or is there a way to send the OK key press so my code can continue without requiring any user interaction?
var importParams = GetImportDefaultParams();
var i = GetPropIndex(importParams, Constants.FS_ImportAsType);
importParams[i].propVal.ival = Constants.FV_TYPE_XML;
i = GetPropIndex(importParams, Constants.FS_FileIsXmlDoc);
importParams[i].propVal.ival = Constants.FV_DoOK;
i = GetPropIndex(importParams, Constants.FS_AlertUserAboutFailure);
importParams[i].propVal.ival = 0;
i = GetPropIndex(importParams, Constants.FS_DontNotifyAPIClients);
importParams[i].propVal.ival = 1;
i = GetPropIndex(importParams, Constants.FS_HowToImport);
importParams[i].propVal.ival = Constants.FV_DoByCopy;
var returnParams = new PropVals();
alert("importing");
doc.Import(doc.TextSelection.end, filename, importParams, returnParams);
app.ActiveDoc = doc;
do_something_else();
    thanks!
    Tracey

    Hi,
Have you already tried suppressing the alert?
There is an example of this in the FrameMaker installation: FM_Suppress_Alerts.jsx, under ScriptsAndUtilities\Suppress Alerts.
Regards,
Apollo

  • How to Query/Lookup Data Target HFM Application in FDMEE Import / Validation Step

    Hi Experts,
We are using FDMEE 11.1.2.3.500 to load 2 sets of flat files - ICP Trial Balances and Non-ICP Trial Balances. The ICP Trial Balances are loaded to the target HFM application first using FDMEE, and then the remaining Non-ICP Trial Balance load follows.
We have a complex business requirement wherein we want to check whether the Non-ICP Trial Balances for a given ENTITY, ACCOUNT, [ICP None] combination in the Trial Balance file match the already loaded ICP Trial Balances for ENTITY, ACCOUNT, [ICP Total]. If they match, the record has to be skipped from our data load; otherwise it has to be loaded. To elaborate with an example:

                                     Entity   Account   ICP          Amount
File 1 (ICS_TB.csv) - Record 1 ->    E1       A1        P1           100
File 1 (ICB_TB.csv) - Record 2 ->    E1       A1        P2           300

The file above gets loaded first.

File 2 (Non_ICP_TB) - Record 1 ->    E1       A1        [ICP None]   400   (record to be skipped - it matches the already loaded file above)

Is it possible to validate this during the Import or Validate step in FDMEE? If yes, can you please throw some light on how this can be done?
    Thanks!
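For what it's worth, the matching rule described above can be sketched in plain Python (the record layout and function names are hypothetical; in FDMEE this logic would live in a Jython import or event script working against the staging data):

```python
# Hypothetical sketch of the skip rule: aggregate the already-loaded ICP
# records to an [ICP Total] per (entity, account), then drop any Non-ICP
# record whose amount matches that total.
from collections import defaultdict

def icp_totals(icp_rows):
    """Sum ICP amounts per (entity, account) -- the [ICP Total] view."""
    totals = defaultdict(float)
    for entity, account, icp, amount in icp_rows:
        totals[(entity, account)] += amount
    return totals

def filter_non_icp(non_icp_rows, totals):
    """Keep only Non-ICP rows that do NOT match the loaded ICP total."""
    return [row for row in non_icp_rows
            if totals.get((row[0], row[1])) != row[3]]

icp_file = [("E1", "A1", "P1", 100.0), ("E1", "A1", "P2", 300.0)]
non_icp_file = [("E1", "A1", "[ICP None]", 400.0),   # matches 100 + 300 -> skipped
                ("E1", "A1", "[ICP None]", 500.0)]   # no match -> loaded
loadable = filter_non_icp(non_icp_file, icp_totals(icp_file))
```

Whether this is feasible in the Import or Validate step depends on being able to query the already-loaded ICP data at that point, which is what the reply below discusses.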

As far as I am aware, there is no separate Java API reference manual for FDMEE; you are limited to the information available in the Admin Guide for now. If you feel that it is not comprehensive enough, perhaps you could raise a ticket or enhancement request via Oracle Support.
The syntax for the statement should be fdmAPI.executeDML(String query, Object[] parameters) according to the Admin Guide. I wouldn't put 100% trust in that, as it can often be incorrect, but give it a try. Ensure the 2nd parameter is a Jython list.
I don't think the method supports passing a stored procedure, and there doesn't appear to be another native API method listed that would do that. You could initiate the connection using standard Python or Java libraries and then use a cursor and prepared statements to execute the SQL, or use the callproc method of the Python DB API.

  • Important Validations for MTL Material Transaction

    Hi All,
I am creating an OAF page for material transactions -- material issue and receipt.
For this, I insert a record into the MTL transaction interface table and then run the transaction manager API in Oracle, which populates the mtl_material_transactions table.
But after the insertion I found that the record gets errored out due to Oracle validation. The item may be lot- or serial-controlled.
Please let me know the validations performed by this program, so that I can handle them before inserting into the interface tables.
    Any help/document will be appreciated.
    You can mail me to [email protected] as well.
    Regards
    Riyas

    Riyas,
You have to insert the serial number if it is a serial-controlled item, or it will always fail. Before inserting into the interface table, you can validate whether the serial number exists in wsh_serial_numbers or mtl_serial_numbers.
You can compare the values using inventory_item_id to get the serial numbers.
You can use the queries below to validate the records. They might be helpful.
1st: select * from oe_order_headers_all where header_id = 4838351
2nd: select * from wsh_delivery_details where source_header_id = 4839902
3rd: select * from wsh_serial_numbers where delivery_detail_id = 5088694
INSERT INTO mtl_serial_numbers_interface
       (source_code, source_line_id,
        transaction_interface_id,
        last_update_date, last_updated_by,
        creation_date, created_by,
        last_update_login, fm_serial_number,
        to_serial_number, product_code
        --product_transaction_id
       )
VALUES ('Miscellaneous issue',      --SOURCE_CODE
        7730351,                    --SOURCE_LINE_ID
        71737725,                   --TRANSACTION_INTERFACE_ID (e.g. mtl_material_transactions_s.NEXTVAL)
        SYSDATE,                    --LAST_UPDATE_DATE
        fnd_global.user_id,         --LAST_UPDATED_BY
        SYSDATE,                    --CREATION_DATE
        fnd_global.user_id,         --CREATED_BY
        fnd_global.login_id,        --LAST_UPDATE_LOGIN
        '168-154-701',              --FM_SERIAL_NUMBER
        '168-154-701',              --TO_SERIAL_NUMBER
        'RCV'                       --PRODUCT_CODE
        --l_rcv_transactions_interface_s / v_txn_interface_id --PRODUCT_TRANSACTION_ID
       );

  • Cannot import valid MP3 because system says file is empty

    Hello all,
I am trying to import an MP3 recording I made during one of my classes into GarageBand. I need to split the file (roughly 300-and-some MB, with a duration of about 3 hours and 49 minutes) into two separate files.
However, after adding the file, GarageBand says it is importing the file and then, BAM, an error message says the operation cannot be completed because the file is empty.
Does anyone know what the issue could be here?
    Thank you in advance for your help!
    Tiago

    but it is the first time I get that error ("filename.aiff" is empty)
    The only difference here is that this specific file is much larger (and longer) than what I usually work with.
This looks like your file is not an MP3 but an AIFF file. Is the filename extension set to .aiff or .mp3?
If it says ".mp3", try changing it to ".aiff".

  • Import Manager Usage : Approaches for developing Import file structure and text validations

    Hi Experts,
We have 50+ import maps. We have provided an option for users to drop files for data import. Currently, Import Manager (7.1 SP08) does not have the capability of pre-import validation of file data, such as:
a. file structure - number of columns specific to the import map
b. file text validations - special characters, empty lines, empty cells
c. uniqueness of the records in the file
For this, we are planning to build a temporary folder (port specific) into which the user drops the file. We will use custom development to perform the above validations and then move the files to the actual import ports.
    Please let us know if you have worked on similar requirements and how you have fulfilled such requirements.
    Regards,
    Ganga
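A minimal sketch of checks a-c as they might look in a custom pre-import step, assuming comma-separated files (the expected column count and the character rule are hypothetical placeholders for per-import-map configuration):

```python
import csv
import io
import re

EXPECTED_COLUMNS = 3                      # hypothetical: columns for this import map
BAD_CHARS = re.compile(r"[^\x20-\x7E]")   # hypothetical: reject non-printable characters

def validate_file(text):
    """Return a list of (line_no, problem) tuples; an empty list means the file passes."""
    problems, seen = [], set()
    for line_no, row in enumerate(csv.reader(io.StringIO(text)), start=1):
        if not row:                                    # (b) empty line
            problems.append((line_no, "empty line"))
            continue
        if len(row) != EXPECTED_COLUMNS:               # (a) wrong structure
            problems.append((line_no, "expected %d columns" % EXPECTED_COLUMNS))
        if any(cell.strip() == "" for cell in row):    # (b) empty cell
            problems.append((line_no, "empty cell"))
        if any(BAD_CHARS.search(cell) for cell in row):
            problems.append((line_no, "special character"))
        key = tuple(row)
        if key in seen:                                # (c) duplicate record
            problems.append((line_no, "duplicate record"))
        seen.add(key)
    return problems
```

Files that come back with an empty problem list would then be moved from the temporary folder to the actual import port; the rest would be rejected with the collected problems as the error report.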

    Hi Ganga,
Assuming you have a well-defined XSD and are getting valid XMLs from the source in the inbound port of MDM, and you have a primary key in the form of an External ID (say):
Just by making and defining an XSD you get most of what you want in your questions a and b.
Now, if you wish to use PI to drop files in the inbound port, then you can build all the validations in PI itself and you would not need a staging table.
Otherwise, you can have another table (preferably a main table) in the same repository, or in a dummy repository, where records are created on import based on the External ID.
Here you can launch an MDM workflow on import of these records and run assignments to replace unwanted characters, and validations to raise errors and reject records based on the desired data-quality level. Once unwanted characters are removed and the data is validated, it can be syndicated using a syndication step in the workflow, so records that fail are not sent and records that pass go to an outbound port.
From the outbound port, PI or some other job can pick the file from this outbound folder and drop it into the inbound folder of the same repository, which imports into the required primary main table. Here again you have the option to leverage validations in PI and further check whether the data is fine.
Once this activity is done, you can delete the records from the staging table.
    Thanks,
    Ravi

  • Problem with IMPORT CATALOG

    I have a question concerning the move of one recovery catalog from one server to another using the IMPORT CATALOG command.
We have three Windows 2008 servers on which we are running Oracle 11.1.0.7.0. On server A we have nine Oracle instances that we back up with RMAN using a recovery catalog (rcat1) on server B. Now we have installed a new server, C, to which we would like to move rcat1. We created a new instance, rcat2, on server C and did the following:
    rman catalog rman2/xxx@rman2
    RMAN> create catalog;
    RMAN> import catalog rman1/xxx@rman1;
    We get the following error message (sorry but some of the text is in Swedish):
    Starting import catalog at 2010-10-15
    connected to source recovery catalog database
    import validation complete
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of import catalog command at 10/15/2010 10:13:01
RMAN-06004: ORACLE error from recovery catalog database: ORA-02085: databaslänken DBLINK_ENO3LA.RMAN är ansluten till RCAT1.WORLD (Swedish: the database link DBLINK_ENO3LA.RMAN is connected to RCAT1.WORLD)
    It complains about some database links (can be different links, always concerning rcat1.world), but there are no database links in rcat1, rcat2 or any of the source databases on server A.
    All the databases are in the tnsnames.ora file on all servers, and we have tried to perform the command from all three servers, but we get the same error and are now running out on ideas… Has anyone had the same problem?
    Would it work to create a new database on server C with the same structure and name as the one on server B and just copy all the files from rcat1 on server B to the new rcat1 on server C? Or, is RMAN somehow aware of the servername/ip-address of the server of the recovery catalog?
    Thanks in advance!
    Helen

    Hi Helen,
Your ORA- error:
ORA-02085: databaslänken DBLINK_ENO3LA.RMAN är ansluten till RCAT1.WORLD
has the following explanation:
ORA-02085: database link string connects to string
Cause: a database link connected to a database with a different name. The connection is rejected.
Action: create a database link with the same name as the database it connects to, or set global_names = false.
You write that there are no database links in rcat1, rcat2 or any of the source databases on server A; probably the link is dynamically created by the IMPORT CATALOG command.
Set global_names to FALSE on rcat1 and rcat2 (for example, ALTER SYSTEM SET global_names = FALSE;) and retry.
    Regards,
    Tycho

  • Order Import Failure - Ship To

    Hi All,
I am trying to copy an order from 11i to R12, but the order is failing in the Order Import API with "Validation failed for the field - Ship To".
    Could you please help me out to resolve this issue.
    Thanks,
    Venu

    user3674413 wrote:
    Hi All,
    I am trying copy the Order from 11i to R12, but the order is failing in Order Import API saying "Validation failed for the field - Ship To".
    Could you please help me out to resolve this issue.
    Thanks,
Venu
Please see the following docs.
    Internal Orders Get Error Validation Failed For The Field Ship To [ID 470660.1]
    INTERNAL REQUISITIONS FAIL ORDER IMPORT - VALIDATION FAILED FOR THE FIELD - SHIP TO [ID 273441.1]
    EDI 850 POI Order Import Errors: ‘VALIDATION FAILED FOR THE FIELD- SHIP TO’ [ID 396536.1]
    OEXOEORD "Validation failed for the field - Ship To" while using Account Relationship [ID 737581.1]
    Internal Orders Are Stuck In Interface Table With Error 'Validation failed for the field - Ship To' [ID 782048.1]
    Unable To Create A New Line To An Existing Order: Validation Failed for the field - Ship to [ID 1391307.1]
    Thanks,
    Hussein

  • Validations and workflows

Hi, can anyone tell me what is meant by validations and workflows, and what the difference between them is? Please also explain their functions and their syntax.

    Hi,
Validations are used in MDM Data Manager for data compliance, i.e. to check whether the structure of the data conforms to business standards. Validations are written in Data Manager and applied, and the result (true/false) is used for further actions.
For example, consider a record with an Email ID field. A validation is written to check whether "@", ".com", ".in", etc. are present. If not, the email ID is not proper, so we make the email ID value blank by calling an assignment based on the result of the validation.
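Outside MDM, the email example above can be sketched in Python (the regex, field name, and accepted endings are hypothetical illustrations; real MDM validations use Data Manager's own expression syntax):

```python
import re

def looks_like_email(value):
    """Rough check mirroring the example: require an '@' and a known ending."""
    return bool(re.match(r"^[^@\s]+@[^@\s]+\.(com|in|org|net)$", value))

def cleanse(record):
    """Blank out Emailid when the validation fails -- the 'assignment' step."""
    if not looks_like_email(record.get("Emailid", "")):
        record["Emailid"] = ""
    return record
```

The validation returns true/false per record, and the assignment acts only on the false results, which is exactly the pattern described above.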
    Have a look at the following links.They will help you to understand them better.
    Validations:
    MDM Validations
    New Webinar About Validations with MDM 5.5
Workflow, as the word suggests, makes the business solution work by including all the processes of data add/update/import, validation, de-duplication, merging, etc. (as per business requirements).
A workflow is used to automate the whole process.
    Workflow links
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b70eb612-0b01-0010-70b4-a01122d3fd17
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/60559952-ff62-2910-49a5-b4fb8e94f167
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/60f28084-b90e-2b10-3eb6-d6565367048a
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/90990743-91c0-2a10-fd8f-fad371c7ee40
    Rgds
    Ankit

  • Artifact harvest failed due to: Unexpected error import: Invalid jar file

    Hi,
I am trying to submit the .biz and .proxy files to OER using Eclipse,
but I am getting the following error.
Can anybody help me with it?
What am I missing?
    8332 [Main Thread] INFO com.bea.alsb.harvester.plugin.reader.OSBReader - OSB Config Jar Import / Validation starting.
    <Mar 13, 2012 10:28:16 AM IST> <Warning> <ConfigFwk> <BEA-000000> <Setting transaction '6' as rollback only. Rollback reason:
    java.io.IOException: Invalid jar file
    at com.bea.wli.config.importexport.ConfigJar$LogicalJarForm.<init>(ConfigJar.java:1341)
    at com.bea.wli.config.task.impl.UploadJarTask._execute(UploadJarTask.java:46)
    at com.bea.wli.config.task.impl.SessionedTask$1.execute(SessionedTask.java:233)
    at com.bea.wli.config.transaction.TransactionalTask._doExecute(TransactionalTask.java:217)
    at com.bea.wli.config.transaction.TransactionalTask._doExecuteWithRetry(TransactionalTask.java:162)
    at com.bea.wli.config.transaction.TransactionalTask.doExecute(TransactionalTask.java:142)
    at com.bea.wli.config.task.impl.SessionedTask.doExecute(SessionedTask.java:236)
    at com.bea.wli.config.task.impl.SessionedTask.doExecute(SessionedTask.java:191)
    at com.bea.wli.config.task.impl.UploadJarTask.uploadJar(UploadJarTask.java:36)
    at com.bea.wli.config.mbeans.Config.uploadJarFile(Config.java:442)
    at com.bea.alsb.harvester.utils.ConfigJarUtils.importConfigJar(ConfigJarUtils.java:79)
    at com.bea.alsb.harvester.plugin.reader.OSBReader.readQuery(OSBReader.java:196)
    at com.bea.alsb.harvester.plugin.reader.OSBReader.readQueries(OSBReader.java:112)
    at com.bea.alsb.harvester.plugin.reader.OSBReader.read(OSBReader.java:87)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.flashline.util.classloader.ContextClassLoaderHandler.invoke(ContextClassLoaderHandler.java:39)
    at $Proxy0.read(Unknown Source)
    at com.oracle.oer.sync.framework.MetadataManager.start(MetadataManager.java:630)
    at com.oracle.oer.sync.framework.Introspector.<init>(Introspector.java:204)
    at com.oracle.oer.sync.framework.Introspector.main(Introspector.java:430)
    8426 [Main Thread] ERROR com.oracle.oer.sync.framework.MetadataManager - Artifact harvest failed due to: Unexpected error import: Invalid jar file
    com.oracle.oer.sync.framework.MetadataIntrospectionException: Unexpected error import: Invalid jar file
    at com.bea.alsb.harvester.utils.ConfigJarUtils.importConfigJar(ConfigJarUtils.java:97)
    at com.bea.alsb.harvester.plugin.reader.OSBReader.readQuery(OSBReader.java:194)
    at com.bea.alsb.harvester.plugin.reader.OSBReader.readQueries(OSBReader.java:112)
    at com.bea.alsb.harvester.plugin.reader.OSBReader.read(OSBReader.java:87)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.flashline.util.classloader.ContextClassLoaderHandler.invoke(ContextClassLoaderHandler.java:39)
    at $Proxy0.read(Unknown Source)
    at com.oracle.oer.sync.framework.MetadataManager.start(MetadataManager.java:630)
    at com.oracle.oer.sync.framework.Introspector.<init>(Introspector.java:204)
    at com.oracle.oer.sync.framework.Introspector.main(Introspector.java:430)
    Caused by: java.io.IOException: Invalid jar file
    at com.bea.wli.config.importexport.ConfigJar$LogicalJarForm.<init>(ConfigJar.java:1341)
    at com.bea.wli.config.task.impl.UploadJarTask._execute(UploadJarTask.java:46)
    at com.bea.wli.config.task.impl.SessionedTask$1.execute(SessionedTask.java:233)
    at com.bea.wli.config.transaction.TransactionalTask._doExecute(TransactionalTask.java:217)
    at com.bea.wli.config.transaction.TransactionalTask._doExecuteWithRetry(TransactionalTask.java:162)
    at com.bea.wli.config.transaction.TransactionalTask.doExecute(TransactionalTask.java:142)
    at com.bea.wli.config.task.impl.SessionedTask.doExecute(SessionedTask.java:236)
    at com.bea.wli.config.task.impl.SessionedTask.doExecute(SessionedTask.java:191)
    at com.bea.wli.config.task.impl.UploadJarTask.uploadJar(UploadJarTask.java:36)
    at com.bea.wli.config.mbeans.Config.uploadJarFile(Config.java:442)
    at com.bea.alsb.harvester.utils.ConfigJarUtils.importConfigJar(ConfigJarUtils.java:79)
    at com.bea.alsb.harvester.plugin.reader.OSBReader.readQuery(OSBReader.java:196)
    ... 11 more
    Please help...
    Regards,
    yogesh

Hi Yogesh - when you say "i don't think the file is corrupt, because it works fine in eclipse", do you mean you are actually able to publish them to a server and test them out?
I don't know your exact scenario, but since you took the sbconfig from another machine and used it on your local system, can't we just go ahead with that? I mean, clean up your code base, then import this latest "working" sbconfig and start afresh.
    -Swagat

  • JOB_OPEN JOB_SUBMIT JOB_CLOSE

    Hello Experts,
I have a requirement where I want to execute a portion of my code in the background in order to reduce the performance impact.
This can be done through the JOB_OPEN, JOB_SUBMIT and JOB_CLOSE function modules, which generate a new job in background mode.
    JOB_OPEN
    JOB_SUBMIT
    JOB_CLOSE
Question: How can I pass an internal table with values through JOB_SUBMIT?
    Thanking you all in advance,
    Warm Regards,
    gayatri.

    Hi,
    * Get Print Parameters
      CALL FUNCTION 'GET_PRINT_PARAMETERS'
        EXPORTING
          no_dialog      = 'X'
        IMPORTING
          valid          = g_valid
          out_parameters = gs_params.
    * Open Job
    *  g_stim = '185800'.
      CONDENSE g_var6.
      CONCATENATE g_var8+6(2)
                  g_var8+3(2)
                  g_var8(2)
              INTO g_stim.
      g_sdat = sy-datum .
      CALL FUNCTION 'JOB_OPEN'
        EXPORTING
          jobname          = g_job
          sdlstrtdt        = sy-datum
          sdlstrttm        = g_stim
        IMPORTING
          jobcount         = g_jobcount
        EXCEPTIONS
          cant_create_job  = 1
          invalid_job_data = 2
          jobname_missing  = 3
          OTHERS           = 4.
      IF sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
             WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
      CONDENSE : g_var1,
                 g_var2,
                 g_var3,
                 g_var4,
                 g_var5,
                 g_var6,
                 g_var7,
                 g_var8.
      gs_rspar-selname = 'P_USER'.
      gs_rspar-kind = 'P'.
      gs_rspar-low  = g_var1.
      APPEND gs_rspar TO gt_rspar.
      gs_rspar-selname = 'P_PWD'.
      gs_rspar-kind = 'P'.
      gs_rspar-low  = g_var2.
      APPEND gs_rspar TO gt_rspar.
      gs_rspar-selname = 'P_HOST'.
      gs_rspar-kind = 'P'.
      gs_rspar-low  = g_var3.
      APPEND gs_rspar TO gt_rspar.
      gs_rspar-selname = 'P_FILE'.
      gs_rspar-kind = 'P'.
      gs_rspar-low  = g_var4+2.
      APPEND gs_rspar TO gt_rspar.
      gs_rspar-selname = 'P_FNAME'.
      gs_rspar-kind = 'P'.
      gs_rspar-low  = g_var5.
      APPEND gs_rspar TO gt_rspar.
      gs_rspar-selname = 'P_FILE1'.
      gs_rspar-kind = 'P'.
      gs_rspar-low  = g_var6.
      APPEND gs_rspar TO gt_rspar.
      gs_rspar-selname = 'P_FILE2'.
      gs_rspar-kind = 'P'.
      gs_rspar-low  = g_var7.
      APPEND gs_rspar TO gt_rspar.
      gs_rspar-selname = 'P_DEST'.
      gs_rspar-kind = 'P'.
      gs_rspar-low  = 'SAPFTPA'.
      APPEND gs_rspar TO gt_rspar.
  gs_rspar-selname = 'P_COMP'.   " parameter value to be passed to the background program
      gs_rspar-kind = 'P'.
      gs_rspar-low  = 'N'.
      APPEND gs_rspar TO gt_rspar.
      SUBMIT ztest123 VIA JOB  g_job
           NUMBER  g_jobcount
    *                  USING SELECTION-SCREEN
                      WITH SELECTION-TABLE gt_rspar
                      TO SAP-SPOOL WITHOUT SPOOL DYNPRO
           SPOOL PARAMETERS gs_params
                      AND RETURN.
      CALL FUNCTION 'JOB_CLOSE'
          EXPORTING
            jobcount                          = g_jobcount
            jobname                           = g_job
            sdlstrtdt                         = g_sdat
            sdlstrttm                         = g_stim
    *     strtimmed                         = 'X'
          EXCEPTIONS
            CANT_START_IMMEDIATE              = 1
            INVALID_STARTDATE                 = 2
            JOBNAME_MISSING                   = 3
            JOB_CLOSE_FAILED                  = 4
            JOB_NOSTEPS                       = 5
            JOB_NOTEX                         = 6
            LOCK_FAILED                       = 7
            INVALID_TARGET                    = 8
        OTHERS                            = 9.

  • Background Job is not creating the List ID

    Hello Experts,
I am scheduling a program as a background job. When I check the tables TBTCP and TBTCO, I see the List ID field (TBTCP-LISTIDENT or TBTCO-LISTIDENT) value is 0.
Hence I am not able to get the spool ID for this list.
Can anybody please let me know what the problem is?
    Thanks a lot.

    Hi,
    See, if this piece of code can help you.
    data:   sdate type sy-datum,
            stime type sy-uzeit,
            l_valid,
            ls_params like pri_params,
            l_jobcount like tbtcjob-jobcount,
            l_jobname  like tbtcjob-jobname.
    start-of-selection.
* Get Print Parameters
      call function 'GET_PRINT_PARAMETERS'
           exporting
                no_dialog      = 'X'
           importing
                valid          = l_valid
                out_parameters = ls_params.
* Open Job
      l_jobname = 'THIS_JOB'.
      call function 'JOB_OPEN'
           exporting
                jobname  = l_jobname
           importing
                jobcount = l_jobcount.
* Submit report to job
  submit <your_program_name>
           via job     l_jobname
               number  l_jobcount
           to sap-spool without spool dynpro
               spool parameters ls_params
                  and return.
* Kick job off 10 seconds from now.
      sdate = sy-datum.
      stime = sy-uzeit + 10.
* Schedule and close job.
      call function 'JOB_CLOSE'
           exporting
                jobcount  = l_jobcount
                jobname   = l_jobname
                sdlstrtdt = sdate
                sdlstrttm = stime.
    <REMOVED BY MODERATOR>
    Edited by: Alvaro Tejada Galindo on Mar 3, 2008 11:09 AM

  • Extra job getting created in dynamic handling of jobs

I have the code below, and I notice that an extra job is being created in SM35. Any reasons/clues, please?
bdcjob will have A/P_ACCOUNTS_BDC
adrjob will have A/P_ACCOUNTS_ADDRESS
The name of the batch input session is A/P.
The 3rd, extra job that comes up is 'A/P', and I did not open any job by that name.
    Thanks for your help.
    Kiran
    DATA: bdcjob TYPE tbtcjob-jobname,
          bdcnum TYPE tbtcjob-jobcount,
          adrjob TYPE tbtcjob-jobname,
          adrnum TYPE tbtcjob-jobcount,
          params LIKE pri_params,
          l_valid   TYPE c.
    CHECK fileonly IS INITIAL.
    MOVE: jobname TO bdcjob.
    adrjob = 'A/P_ACCOUNTS_ADDRESS'.
    IF NOT logtable[] IS INITIAL.
      IF NOT testrun IS INITIAL.
    *   If its a test run, Non Batch-Input-Session
        SUBMIT rfbikr00 AND RETURN
                        USER sy-uname
                        WITH ds_name EQ file_o
                        WITH fl_check EQ 'X'.         "X=No Batch-Input
      ELSE.
    *   Create a session which will be processed by a job
        SUBMIT rfbikr00 AND RETURN
                        USER sy-uname
                        WITH ds_name EQ file_o
                        WITH fl_check EQ ' '.
    *   Open BDC Job
        CALL FUNCTION 'JOB_OPEN'
          EXPORTING
            jobname          = bdcjob
          IMPORTING
            jobcount         = bdcnum
          EXCEPTIONS
            cant_create_job  = 1
            invalid_job_data = 2
            jobname_missing  = 3
            OTHERS           = 4.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ELSE.
    *     Submit RSBDCSUB to trigger the session in background mode
          SUBMIT rsbdcsub
            VIA  JOB    bdcjob
                 NUMBER bdcnum
            WITH von     = sy-datum
            WITH bis     = sy-datum
            WITH z_verab = 'X'
            WITH logall  = 'X'
            AND RETURN.
          IF sy-subrc EQ 0.
    *       Export data to a memory id. This data will be used by the program
    *       that updates the address & email id
            EXPORT t_zzupdate TO SHARED BUFFER indx(st) ID 'MEM1'.
    *       Get Print Parameters
            CALL FUNCTION 'GET_PRINT_PARAMETERS'
              EXPORTING
                no_dialog      = 'X'
              IMPORTING
                valid          = l_valid
                out_parameters = params.
    *       Open a second job to trigger a program which updates addresses & email ids
            CALL FUNCTION 'JOB_OPEN'
              EXPORTING
                jobname          = adrjob
              IMPORTING
                jobcount         = adrnum
              EXCEPTIONS
                cant_create_job  = 1
                invalid_job_data = 2
                jobname_missing  = 3
                OTHERS           = 4.
            IF sy-subrc <> 0.
              MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
            ELSE.
    *         submit the program to update email id & long addresses
              SUBMIT zfpa_praa_address_update
                     VIA JOB adrjob
                     NUMBER  adrnum
                     TO SAP-SPOOL WITHOUT SPOOL DYNPRO
                         SPOOL PARAMETERS params
                            AND RETURN.
              IF sy-subrc EQ 0.
    *           First close the dependent job(address update job). Dependency
    *           is shown by using pred_jobcount & pred_jobname parameters
                CALL FUNCTION 'JOB_CLOSE'
                  EXPORTING
                    jobcount             = adrnum
                    jobname              = adrjob
                    pred_jobcount        = bdcnum
                    pred_jobname         = bdcjob
                  EXCEPTIONS
                    cant_start_immediate = 1
                    invalid_startdate    = 2
                    jobname_missing      = 3
                    job_close_failed     = 4
                    job_nosteps          = 5
                    job_notex            = 6
                    lock_failed          = 7
                    invalid_target       = 8
                    OTHERS               = 9.
                IF sy-subrc <> 0.
                  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
                ENDIF.
              ENDIF.
            ENDIF.
          ENDIF.
    *     Close the main job(BDC Job)
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              jobcount             = bdcnum
              jobname              = bdcjob
              strtimmed            = 'X'
            EXCEPTIONS
              cant_start_immediate = 1
              invalid_startdate    = 2
              jobname_missing      = 3
              job_close_failed     = 4
              job_nosteps          = 5
              job_notex            = 6
              lock_failed          = 7
              invalid_target       = 8
              OTHERS               = 9.
          IF sy-subrc <> 0.
            MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
          ENDIF.
        ENDIF.
      ENDIF.
    Edited by: kiran dasari on Jul 9, 2010 12:58 AM

    I tried changing the tags, but that did not help.
    Since there are only two JOB_OPEN statements, I expect to see ONLY two jobs in SM37, yet three jobs appear as mentioned. That is the problem: I want to know how, and from where, the third job is getting created.
    Let me try again and paste the code inside the code tags:
    DATA: bdcjob TYPE tbtcjob-jobname,
          bdcnum TYPE tbtcjob-jobcount,
          adrjob TYPE tbtcjob-jobname,
          adrnum TYPE tbtcjob-jobcount,
          params LIKE pri_params,
          l_valid   TYPE c.
    CHECK fileonly IS INITIAL.
    MOVE: jobname TO bdcjob.
    adrjob = 'A/P_ACCOUNTS_ADDRESS'.
    IF NOT logtable[] IS INITIAL.
      IF NOT testrun IS INITIAL.
    *   If it's a test run, submit without creating a batch-input session
        SUBMIT rfbikr00 AND RETURN
                        USER sy-uname
                        WITH ds_name EQ file_o
                        WITH fl_check EQ 'X'.         "X=No Batch-Input
      ELSE.
    *   Create a session which will be processed by a job
        SUBMIT rfbikr00 AND RETURN
                        USER sy-uname
                        WITH ds_name EQ file_o
                        WITH fl_check EQ ' '.
    *   Open BDC Job
        CALL FUNCTION 'JOB_OPEN'
          EXPORTING
            jobname          = bdcjob
          IMPORTING
            jobcount         = bdcnum
          EXCEPTIONS
            cant_create_job  = 1
            invalid_job_data = 2
            jobname_missing  = 3
            OTHERS           = 4.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ELSE.
    *     Submit RSBDCSUB to trigger the session in background mode
          SUBMIT rsbdcsub
            VIA  JOB    bdcjob
                 NUMBER bdcnum
            WITH von     = sy-datum
            WITH bis     = sy-datum
            WITH z_verab = 'X'
            WITH logall  = 'X'
            AND RETURN.
          IF sy-subrc EQ 0.
    *       Export data to a memory id. This data will be used by the program
    *       that updates the address & email id
            EXPORT t_zzupdate TO SHARED BUFFER indx(st) ID 'MEM1'.
    *       Get Print Parameters
            CALL FUNCTION 'GET_PRINT_PARAMETERS'
              EXPORTING
                no_dialog      = 'X'
              IMPORTING
                valid          = l_valid
                out_parameters = params.
    *       Open a second job to trigger a program which updates addresses & email ids
            CALL FUNCTION 'JOB_OPEN'
              EXPORTING
                jobname          = adrjob
              IMPORTING
                jobcount         = adrnum
              EXCEPTIONS
                cant_create_job  = 1
                invalid_job_data = 2
                jobname_missing  = 3
                OTHERS           = 4.
            IF sy-subrc <> 0.
              MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
            ELSE.
    *         submit the program to update email id & long addresses
              SUBMIT zfpa_praa_address_update
                     VIA JOB adrjob
                     NUMBER  adrnum
                     TO SAP-SPOOL WITHOUT SPOOL DYNPRO
                         SPOOL PARAMETERS params
                            AND RETURN.
              IF sy-subrc EQ 0.
    *           First close the dependent job(address update job). Dependency
    *           is shown by using pred_jobcount & pred_jobname parameters
                CALL FUNCTION 'JOB_CLOSE'
                  EXPORTING
                    jobcount             = adrnum
                    jobname              = adrjob
                    pred_jobcount        = bdcnum
                    pred_jobname         = bdcjob
                  EXCEPTIONS
                    cant_start_immediate = 1
                    invalid_startdate    = 2
                    jobname_missing      = 3
                    job_close_failed     = 4
                    job_nosteps          = 5
                    job_notex            = 6
                    lock_failed          = 7
                    invalid_target       = 8
                    OTHERS               = 9.
                IF sy-subrc <> 0.
                  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
                ENDIF.
              ENDIF.
            ENDIF.
          ENDIF.
    *     Close the main job(BDC Job)
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              jobcount             = bdcnum
              jobname              = bdcjob
              strtimmed            = 'X'
            EXCEPTIONS
              cant_start_immediate = 1
              invalid_startdate    = 2
              jobname_missing      = 3
              job_close_failed     = 4
              job_nosteps          = 5
              job_notex            = 6
              lock_failed          = 7
              invalid_target       = 8
              OTHERS               = 9.
          IF sy-subrc <> 0.
            MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
          ENDIF.
        ENDIF.
      ENDIF.
    ELSEIF NOT t_zzupdate[] IS INITIAL.
    * some other process
    ENDIF.
    Thanks,
    Kiran
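
One way to trace the extra job is to read the background-processing tables directly and see which program each job step executes. The sketch below is only an illustration, assuming the standard tables TBTCO (job headers) and TBTCP (job steps); it lists today's jobs for the current user with the report behind each step. Note that RSBDCSUB itself typically schedules one job per batch-input session it releases (running RSBDCBTC), which is a likely source of the third job seen in SM37.

```abap
* Sketch (assumes standard tables TBTCO / TBTCP): list today's jobs
* scheduled by the current user and the program behind each step,
* to identify which submit created the unexpected third job.
DATA: lt_jobs  TYPE TABLE OF tbtco,
      ls_job   TYPE tbtco,
      lt_steps TYPE TABLE OF tbtcp,
      ls_step  TYPE tbtcp.

SELECT * FROM tbtco INTO TABLE lt_jobs
  WHERE sdlstrtdt = sy-datum
    AND sdluname  = sy-uname.

LOOP AT lt_jobs INTO ls_job.
  SELECT * FROM tbtcp INTO TABLE lt_steps
    WHERE jobname  = ls_job-jobname
      AND jobcount = ls_job-jobcount.
  LOOP AT lt_steps INTO ls_step.
*   PROGNAME shows which report the step runs (e.g. RSBDCBTC for a
*   session job scheduled by RSBDCSUB)
    WRITE: / ls_job-jobname, ls_job-jobcount, ls_step-progname.
  ENDLOOP.
ENDLOOP.
```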
