Input to data service job

Hi Experts,
Is it possible to provide input to a Data Services job from BW at runtime?

Hi,
you can follow the steps below to execute a BODS Job from BW.
Goto BW, create infopackage for respective datasource and fill the "3rd party selection" details as below.
        Repository    : BODS repository name
        JobServer     : BODS running jobserver name
        JobName     : BODS job name
Save and Execute infopackage.It will trigger BODS job which will load the data into BW datasource.
Add this infopackage into process chain and schedule it.
Now for passing the values into the job you can try as follows:-
You need to add Global Variables to your Job.
Then, if you refresh the 3rd party selections, you'll see your variables after Advanced_Parameters.
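For illustration, a minimal sketch of how such a variable might be consumed inside the job (the variable name $G_EXTRACT_DATE and the table column are hypothetical; the global variable itself has to be defined in the Designer before it shows up in the 3rd party selections):

    # Script step at the start of the job: log the value handed over by the BW InfoPackage
    print('Extract date received from BW: ' || $G_EXTRACT_DATE);
    # The same variable can then be referenced in a query transform, e.g. in its WHERE clause:
    #   SALES.LOAD_DATE >= $G_EXTRACT_DATE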

Similar Messages

  • Data Services job fails while inserting data into SQL Server from Linux

    The SAP Data Services (data quality) server runs on both a Linux server and a Windows server. The Data Services job that uses the ODBC driver to connect to SQL Server fails after selecting a few thousand records, with the reason below as per the Data Services log on the Linux server. We can run the same Data Services job from the Windows server; the only difference is that it uses the SQL Server drivers provided by Microsoft. Of the possible errors listed below, #1 and #4 are probably not the reason for the job failure. The DBA checked the other errors and confirmed that the transaction log size is unlimited and the system has space.
    Why does the same job run from the Windows server but fail from Linux? Is it because the ODBC drivers on Windows and Linux work in different ways, or is there a conflict between the Data Services job and the ODBC driver?
    ===== Error Log ===================
    8/25/2009 11:51:51 AM Execution of <Regular Load Operations> for target <DQ_PARSE_INFO> failed. Possible causes: (1) Error in the SQL syntax;
    (2)6902 3954215840 RUN-051005 8/25/2009 11:51:51 AM Database connection is broken; (3) Database related errors such as transaction log is full, etc.; (4) The user defined in the
    6902 3954215840 RUN-051005 8/25/2009 11:51:51 AM datastore has insufficient privileges to execute the SQL. If the error is for preload or postload operation, or if it is for
    ===== Error Log ===================

    This is another method:
    http://www.mssqltips.com/sqlservertip/2484/import-data-from-microsoft-access-to-sql-server/

  • Issue while using views in Data Services jobs

    Hi,
    In a Data Services job, I am trying to pull data from a view into a table. The view points to a table in another database.
    The problem is that when I import the source view into Data Services and view the data, I find one row with wrong data. The values in that row are wrong/corrupted, while the same row in the source table has correct values.
    I queried the view from TOAD for that record. The values are valid.
    The data comes out wrong only in Data Services. Any row in the table can get corrupted; there is no specific row.
    Hence, while running the jobs I am getting errors.
    Any idea what could cause corrupted data in the view when the same view queried from TOAD gives correct values?

    Hi,
    There is a possibility of a data type that is not supported by Data Services. Please share the Data Services version, the database type, and the data type of the column that got corrupted.
    Regards,
    M Ramesh

  • Data Services job server crashed and won't start back up

    Hello,
    I was running some jobs on Data Services 4.2 SP3 on Windows Server 2012 R2, and they all failed and the job server went down. None of the jobs that failed had a trace file or error log in the Management Console. Now I am unable to open Data Services Designer or Data Services Server Manager; when I try to open them, nothing happens. Also, the SAP Data Services job service cannot be started. The job server had been running fine for a few weeks before this. This has happened twice already today; the first time, the only way I was able to fix it was to run the repair on the Data Services install. Can someone please help me understand what is causing this and how it can be fixed?

    Hi Tyler,
    This is a Windows-specific issue; please refer to the link and KBAs below.
    How To Fix Windows Service Error 1053
    http://windows-exe-errors.com/how-to-fix-windows-service-error-1053/
    1986247 - Error "Windows could not start the BusinessObjects Data Services service on local computer" occurs in Data Services 4.1
    https://service.sap.com/sap/support/notes/1986247
    1992260 - Error: Windows could not start the SAP Data Services service on local computer, after upgrading SAP data services and deleting job servers SAP Data Services 4.2
    https://service.sap.com/sap/support/notes/1992260
    Hope this will help!!!!
    Thanks,
    Daya

  • I would like to set a job trigger on a Data Services job

    Hi Experts,
    I would like to know how to set a job trigger on a Data Services job. Our system currently has separate jobs to get data from the FI module, the HR module, and others. I want the jobs to execute as follows:
    Start --> FI job runs --> FI job finishes --> trigger --> HR job runs --> HR job finishes --> trigger --> others, etc.
    Or, if you have any other idea, please advise me.
    Thank you for your advice.

    Hi,
    You can do this from the Data Services Management Console. Follow the steps below to create and execute a batch (.BAT) file:
    1. Select Batch > repository.
    2. Click the Batch Job Configuration tab.
    3. For the batch job to configure, click the Export Execution Command link.
    4. On the Export Execution Command page, enter the desired options for the batch job command file that you want the Administrator to create (the extension is added automatically: .sh for UNIX, .bat for Windows).
    5. Click Export. The batch file for the job will be created under the <DS_COMMON_DIR>\log directory.
    6. Create a new DS job in the Designer and write a script to execute your batch files, as in the sketch below.
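    One way to chain the jobs is a script step in the new job that runs the exported command files in sequence via the exec() function (a sketch: the paths are hypothetical, and flag value 8 is assumed here to wait for the command and return its return code and output, so check the exec() reference for the flag that fits your error handling):

        # Run the FI job first; the next line does not run until this command returns
        print(exec('C:\\DSExport\\FI_Job.bat', '', 8));
        # Then trigger the HR job, and so on for the remaining jobs
        print(exec('C:\\DSExport\\HR_Job.bat', '', 8));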
    Hope this is helpful for your requirement.
    Regards
    M Ramesh

  • Data Services job rolling back Inserts but not Deletes or Updates

    I have a fairly simple CDC job that I'm trying to put together. My source table has a record type code of "I" for Inserts, "D" for deletes, "UB" for Update Before and "UP" for Update After. I use a Map_CDC_Operation transform to update the destination table based on those codes.
    I am not using the Transaction Control feature (because it just throws an error when I use it)
    My issue is as follows.
    Let's say I have a set of 10,000 insert records in my source table, and record number 4000 happens to be a duplicate of record number 1. The job processes the records in order starting with record 1 and begins happily inserting records into the destination table. Once it gets to record 4000, however, it runs into a duplicate key issue; my try/catch block catches the error and the data flow exits. All records that were inserted prior to the error are rolled back in the destination.
    But the same is not true for updates or deletes. If I have 10,000 deletes and 1 insert in the middle that happens to be an insert of a duplicate key, any deletes processed before the insert are not rolled back. This is also the case for updates.
    And again, I am not using Transaction Control, so I'm not sure why the inserts are being rolled back but, more curiously, updates and deletes are not. I'm not sure why there isn't a consistent result regardless of the type of operation. Does anyone know what's going on here, or what I'm doing wrong / what my misconception may be?
    Environment information: both source and destination are SQL Server 2008 databases and the Data Services version we use is 14.1.1.460.
    If you require more information, please let me know.

    Hi Michael,
    Thanks for your reply. Here are all the options on my source table:
    My Rows per commit on the table is 10,000.
    Delete data table before loading is not checked.
    Column comparison - Compare by name
    Number of loaders - 1
    Use overflow file - No
    Use input keys - Yes
    Update key columns - No
    Auto correct load - No
    Include in transaction - No
    The rest were set to Not Applicable.
    How can I see the size of the commits for each opcode? If they are in fact different from my Rows per commit (10,000) that may solve my issue.
    I'm new to Data Services so I'm not sure how I would implement my own transaction control logic using a control column and script. Is there a guide somewhere I can follow?
    I can also try using the Auto correct load feature.  I'm guessing "upsert" was a typo for insert? Where is that option?
    Thank you very much!
    Riley

  • Data Services 4.0 error when running job - not updating repository table

    Hi All,
    I am hoping someone might have come across this error before.
    When running a Data Services job from within Data Services Designer and trying to re-run the same job, I get the following error messages:
    9332     2572     DBS-070300     17/10/2011 10:38:39 AM     |Data flow DF_XXXX_Data_DW1Date
    9332     2572     DBS-070300     17/10/2011 10:38:39 AM     SQL submitted to Oracle Server <BO3> resulted in error <ORA-00001: unique constraint (DS3_ADMIN.SYS_C005632) violated
    9332     2572     DBS-070300     17/10/2011 10:38:39 AM     >. The SQL submitted is <INSERT INTO AL_BW_REQUEST (REQUESTID,TYPE,VALUE,DF_NAME,CREATION_TIME) VALUES (
    9332     2572     DBS-070300     17/10/2011 10:38:39 AM     'REQU_4NIVIZ1G9S5B7LUY0HHP65ZJU', 3 , '29', 'DF_XXXX_Data_DW1Date',  to_date('2011.10.17 10:38:39', 'yyyy.mm.dd
    9332     2572     DBS-070300     17/10/2011 10:38:39 AM     hh24:mi:ss') )>.
    9332     2572     REP-100109     17/10/2011 10:38:39 AM     |Data flow DF_XXXX_Data_DW1Date
    9332     2572     REP-100109     17/10/2011 10:38:39 AM     Cannot save <RequestID info> into the repository. Additional database information: <SQL submitted to Oracle Server <BO3>
    9332     2572     REP-100109     17/10/2011 10:38:39 AM     resulted in error <ORA-00001: unique constraint (DS3_ADMIN.SYS_C005632) violated
    When I look in the Data Services repository AL_BW_REQUEST table, the first time the job is run all five fields in this table are populated successfully. When re-running the job, only 3 of the five fields in the table are populated; the VALUE and DF_NAME fields are not, hence the primary key violation.
    Here is the data from the AL_BW_REQUEST table:
    REQUESTID                          TYPE  VALUE  DF_NAME                CREATION_TIME
    REQU_4NIVIZ1G9S5B7LUY0HHP65ZJU     3                                   10/17/2011 10:38:38 AM
    REQU_4NJ503RVIFD3VLBJJEKBJ3OE2     3                                   10/17/2011 10:33:37 AM
    REQU_4NJ4ZHSP9F93L3OAUDV6RHCMY     3                                   10/17/2011 10:32:31 AM
    REQU_4NIV1NKP9645E394E7H12Q796     3                                   10/17/2011 9:53:53 AM
    REQU_4NEYFNI2U68MBJOEWFJWVWWU2     3     24     DF_XXXX_Data_DW1Date   10/17/2011 9:47:58 AM
    If I truncate this table and re-run the DS job, it will not populate the VALUE and DF_NAME fields. 
    Does anyone know what could be causing this? The target source is BW.
    Seems like a bug to me.
    Thanks,
    Ainsley

    Hi Ramesh,
    I am a HANA apps consultant in India and would like to get in touch with you, but I don't know how.
    regards,
    Tilak

  • SAP BO Data Services XI 3.2 - Cannot Handle Multithreaded RFC Connection?

    Hi Guys,
    Just want to ask for your inputs: is it possible that Data Services cannot handle multiple RFC connection requests to a BW system?
    The scenario is:
    One BODI job uses an RFC connection and triggers a 2nd job at the same time, and it happens that the 2nd job fails.
    The version of SAP BO Data Services XI 3.2 that we are using is 12.2.2.1.
    Thanks in advance,
    Randell

    Arpan,
    One way to get to the multiprovider data is to use Open Hub with a DTP that gets the data from the multiprovider and exposes it as an open hub destination to Data Services. With Data Services XI 3.2 we now fully support Open Hub where Data Services will (1) start the process chain to load the data (2) read the data when process chain ended and (3) notify Open Hub when done so that the data can be purged again.
    More info on Open Hub here : http://help.sap.com/saphelp_nw04/helpdata/en/1e/c4463c6796e61ce10000000a114084/content.htm
    But I will also look into why we show the multiproviders when browsing the metadata but get an error when trying to extract using the ABAP method (not via Open Hub). You could be right in your assumptions below, and we might just need to hide the multiproviders when browsing metadata.
    Thanks,
    Ben.
    Edited by: Ben Hofmans on Jan 5, 2010 6:06 PM - added link to Open Hub documentation which references multiproviders as possible source.

  • Would a BW functional developer be able to support data services?

    Hi gurus,
    Our company cannot justify having a full-time Data Services functional developer to support our existing master data address cleanse / address validation process in Data Services. There is light work customizing Data Services jobs and occasional support work for address validation issues. We have a full staff of SAP developers in most areas and are looking to train an existing employee to do two roles. Not being well versed in what is involved with Data Services support, can you help with what skill set would be a good complement for also learning Data Services? Would it be a BW functional developer? ABAP developer? Basis? Master Data functional developer?
    Warm Regards,
    CM

    Hello
    In general, the most effective Data Services developers have a 'proper' development background and understand set theory, relational databases, and SQL.
    You'll also need to find someone with an understanding of data quality theory.
    Michael

  • Help with a Data Services web service in Xcelsius

    Hi,
    I have a problem:
    1. I add a web service connection to run a Data Services job.
    2. I put in the WSDL URL, and after that I click the Import button.
    3. When I click the Import button, it generates a WEB SERVICE URL.
    The problem is: if I copy this WEB SERVICE URL into Internet Explorer, it shows me the XML data, but if I open this URL on the server where I will publish the SWF file, the URL doesn't open.
    Why does it open on my machine but not on the other machine?
    Please, I need help. Thanks.

    Hi,
    Need some more information:
    1) Do you have a problem viewing data when you import it into the SWF on InfoView?
    2) Or do you have problems importing that WSDL link at the data connection level?
    If you have an issue with the first one, then you have to place a crossdomain.xml file in the ROOT path of your BO Tomcat server; a minimal example is shown below.
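    For reference, a minimal crossdomain.xml looks like this (a permissive example for testing; restrict the domain attribute for production):

        <?xml version="1.0"?>
        <cross-domain-policy>
            <!-- allow the Flash/SWF client to call this server from any domain -->
            <allow-access-from domain="*" />
        </cross-domain-policy>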
    Apart from that, you have to allow the local disk to access external sources in the Flash Global Settings Manager.
    Is the link generated with the QaaWS tool, or with Java or .NET?
    Let me know if you need more information.
    Regards,
    Anjani Kumar C.A.

  • Can Data Services take a .gz (zipped) file?

    Hello, I am trying to figure out how to set up a Data Services job to process an .xml file that is stored in a .gz (zipped) file. Can Data Services process .gz files? I don't think it can. I need to figure out how to unzip the file in the job and then process it.
    Thanks.

    From reading this thread, I'm guessing you're fairly new to UNIX?
    Before getting too involved in the Data Services part of this, make sure you aren't getting problems because of other issues.
    First, write your script to call gzip, taking a filename as a parameter.
    I suggest you create a standard directory to keep all the scripts you will call from Data Services in.
    While logged in to the account Data Services runs under,
    type:
    cd ~
    (This should change to your home directory if you're not already there.) Then press return.
    next type:
    pwd
    then press return
    This should display the home directory for Data Services. Let's assume it is '/home/dataservices'.
    Next create a directory for your script:
    mkdir /home/dataservices/scripts
    change into that directory:
    cd  /home/dataservices/scripts
    write your script:
    vi gunzip.ksh
    (Replace vi with the editor of your choice.)
    Your script should at its most basic look something like this:
    #!/bin/ksh
    # Uncompress the file passed as the first argument; quotes guard against spaces in the name
    gzip -d "$1"
    NB: If you don't understand shell scripting, Google is your friend. A link I found quickly:
    http://www.dartmouth.edu/~rc/classes/ksh/
    next set the script as executable:
    chmod u+x gunzip.ksh
    finally verify it works from the command line:
    /home/dataservices/scripts/gunzip.ksh myfile.gz
    If it uncompresses the file, you should be able to run it from Data Services with no trouble; a sketch of the call is below.
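    For reference, the call from a Data Services script step might look like this (a sketch: the .gz path is hypothetical, and flag value 8 is assumed to wait for the command and return its return code and output, so check the exec() reference):

        # Unzip the incoming file before the data flow reads the XML inside it
        print(exec('/home/dataservices/scripts/gunzip.ksh', '/data/incoming/myfile.gz', 8));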
    If you don't understand these UNIX commands, you really need to forget about Data Services for a while and go and learn UNIX. There is plenty of good material online and millions of books for beginners, like the Dummies guides, etc.
    Good luck !

  • Exception in get_Monitor_Log/get_Error_Log after Data Services Batch error

    I have a Data Services Job that has an error (intentional to test getting logs). I submit and monitor the job through the Web Service SDK. When the job completes with error and I call either get_Monitor_Log or get_Error_Log, these methods raise an exception as follows:
    org.xml.sax.SAXParseException: The reference to entity "ERROR_STEP" must end with the ';' delimiter.
    I can see both logs on the DS server under Administrator/Batch.
    Why can't I retrieve the error log?

    YES!!! I see the problem... the SAX parser must not like the &. But how do I solve this, since it's the server that sends this in a SOAP message? It probably needs to be packaged as CDATA, not text (in XML, a bare & must be escaped as &amp; or wrapped in <![CDATA[...]]>).
    (12.1) 02-23-09 08:44:11 (E) (3920:0176) VAL-030159: |SESSION WF_CustomerAddress
                                                         Found erroneous expression < &ERROR_STEP('  sesleep(60000)
                                                         >. Please check its syntax and fix this expression.
    Running version 3.1 of Data Services (12.1.0)

  • Job hangs when a call is made to a PL/SQL function in Data Services XI

    Hi,
    I am facing the issue below after migrating from BODI 11.7 to BODS XI 3.1.
    The job does not proceed past the statements below:
    print('before call');
    $is_job_enable=DS_TEST.TEST.MY_PKG.IS_JOB_ENABLED(job_name());
    print($is_job_enable);
    The MY_PKG.IS_JOB_ENABLED PL/SQL function returns a NUMBER.
    $is_job_enable is a global variable declared as decimal(10, 0).
    This job works fine in Data Integrator 11.7.3 and hangs in Data Services XI 3.1.
    I tried changing the global variable $is_job_enable to int and creating new datastores, but that doesn't solve the problem. Can anyone tell me what the issue is?
    Thanks & Regards
    Maran MK
    The trace file says
    5260     3284     JOB     5/5/2009 4:43:17 AM     Job <TEST_JOB> is started.
    5260     3284     PRINTFN     5/5/2009 4:43:17 AM     before call
    5260     3284     SP     5/5/2009 4:43:18 AM     Stored procedure call <MY_PKG.IS_JOB_ENABLED> is started.
    5260     3284     SP     5/5/2009 4:43:18 AM     SQL query submitted for stored procedure call <MY_PKG.IS_JOB_ENABLED> is: <BEGIN :AL_SP_RETURN :=
    5260     3284     SP     5/5/2009 4:43:18 AM     "TEST"."MY_PKG"."IS_JOB_ENABLED"("P_JOB_NAME" => :P_JOB_NAME); END;
    5260     3284     SP     5/5/2009 4:43:18 AM     >.
    5260     3284     SP     5/5/2009 4:43:18 AM     Stored procedure call <E> input parameter <P> has value of <TEST_JOB>.
    5260     3284     SP     5/5/2009 4:43:18 AM     Stored procedure call <E> return value is <1.0000000>.
    5260     3284     SP     5/5/2009 4:43:18 AM     Stored procedure call <MY_PKG.IS_JOB_ENABLED> is done.
    The below error occurs only in Windows and not in Linux environment.
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     |Session TEST_JOB
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     System Exception <ACCESS_VIOLATION> occurred. Process dump is written to <E:\Program Files\Business Objects\Data
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Services\log\BODI_MINI20090505044318_5260.DMP> and <E:\Program Files\Business Objects\Data
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Services\log\BODI_FULL20090505044318_5260.DMP>
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Process dump is written to <E:\Program Files\Business Objects\Data Services\log\BODI_MINI20090505044318_5260.DMP> and
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     <E:\Program Files\Business Objects\Data Services\log\BODI_FULL20090505044318_5260.DMP>
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Call stack:
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00CA9EAB, ActaDecimalImpl<RWFixedDecimal<RWMultiPrecisionInt<3> >,RWMultiPrecisionInt<3>,ActaDecimal28,char
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     [29]>::operator=()0315 byte(s), x:\src\rww\actadecimalimpl.cpp, line 13140004 byte(s)
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00D8A267, Convert()+0999 byte(s), x:\src\eval\calc.cpp, line 0303
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBF9E0, XVal_cast::compute()+0272 byte(s), x:\src\core\compute.cpp, line 1664
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBC239, XStep_assn::execute()+0057 byte(s), x:\src\core\step.cpp, line 0069
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBB30D, XStep_sblock::execute()+0029 byte(s), x:\src\core\step.cpp, line 0707
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBB30D, XStep_sblock::execute()+0029 byte(s), x:\src\core\step.cpp, line 0707
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBE0BC, XPlan_spec::execute()+0348 byte(s), x:\src\core\plan.cpp, line 0082
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DC5EA0, XPlan_desc::execute()+0336 byte(s), x:\src\core\xplan.cpp, line 0153
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBD68E, XPlan_spec::compute()0206 byte(s), x:\src\core\plan.cpp, line 01450011 byte(s)
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBD891, XPlan_spec::compute()+0225 byte(s), x:\src\core\plan.cpp, line 0244
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:0074533A, AE_Main_Process_Options()+31498 byte(s), x:\src\xterniface\actamainexp.cpp, line 3485
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00747EDA, AE_Main()1498 byte(s), x:\src\xterniface\actamainexp.cpp, line 07680030 byte(s)
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:004029F9
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Registers:
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     EAX=0000000E  EBX=03E392E0  ECX=04B455A0  EDX=012346D8  ESI=02B75D88
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     EDI=04B455A0  EBP=00212738  ESP=002124BC  EIP=00CA9EAB  FLG=00210206
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     CS=001B   DS=0023  SS=0023  ES=0023   FS=003B  GS=0000
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Exception code: C0000005 ACCESS_VIOLATION
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Fault address:  00CA9EAB 01:00585EAB E:\Program Files\Business Objects\Data Services\bin\acta.dll

    Hi Manoj & Tiji,
    Thanks for your comments. Please find the outcome below:
    print($is_job_enable); -- this is not executed if the PL/SQL function is called.
    I changed $is_job_enable to VARCHAR; still the same issue.
    I created a new project and executed the same logic in a new job; still the same issue (all objects are new except the datastore).
    The dump happens only when the PL/SQL function is called. I commented out the function call; the execution proceeds further but hangs in another PL/SQL function call (a different one from the 1st).
    Is this a bug in 12.1?
    Can you tell me whether any hot fix is available? If possible, please give me the SAP Note number.
    Is there any other way to execute PL/SQL functions/procedures in 12.1?
    Thanks
    Maran MK

  • Change source path stored in a global variable in a Data Services batch job

    Hi Experts,
    My organization has created a job in Data Services 3.2 to cleanse data read from Excel flat files. The folder path was stored in a global variable (I think), and now the directories have changed, hence it is throwing the error below:
    Error, Input file  does not exist please confirm existence and restart job, 16 ) >
    failed, due to error <50316>: <>>> Error, Input file  does not exist please confirm existence and restart job>. I want to update the folder path. I am sure it is easy, but I am very new to BODS.
    (12.2) 07-15-14 16:10:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Sleeping for 35.000000 seconds...  '
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Waking up......  '
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : 'Starting the timer loop number 6...'
    (12.2) 07-15-14 16:10:43 (14232:12656) WORKFLOW: Work flow <WF_Metadata_Files> is started.
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> $G_FILENAME_IN : ALL_Metadata_SALES.xls...'
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> looking for input file name
                                                     \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls'
    (12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>>  Input file Name is '
    (12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB ERROR' : '>>> Error, Input file  does not exist please confirm existence and restart job'
    I want to update the folder path from \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls to \\Home\BIData\finance\production\sales\Metadata\ALL_Metadata_SALES.xls.
    When I investigated WF_Metadata_Files, I saw there is a global variable called INPUT_DIR, so I assume I have to change the path there. I tried to find the old directory in the batch job but I can't find it, and even when I give a value to the global variable, it still points to the old path.
    Can anybody please help me.
    Thanks
    Tim

    Hi Tim,
    If it still points to the old path after you have specified the value in the global variable, a couple of scenarios may apply:
    1. A different global variable is being used for the file path.
    2. The file path is hardcoded in the file format or Excel file definition despite the declaration of the global variable.
    Are you getting this error when running a data flow within this workflow, or in a script? It is best to run the workflow in debug mode and step through the stages to find out where exactly it fails. You can also override the variable in a script step, as in the sketch below.
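    For example, a script step at the start of the job could set the variable explicitly (a sketch that assumes $INPUT_DIR is the variable the file name is actually built from, and that backslashes in DS string literals must be doubled because backslash is the escape character):

        # Point the input directory at the new share before any data flow runs
        $INPUT_DIR = '\\\\Home\\BIData\\finance\\production\\sales\\Metadata';
        print('Input dir set to: ' || $INPUT_DIR);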
    kind regards
    Raghu

  • How to pass data from an input table to an RFC data service?

    Hi,
    I am doing a prototype with VC, and I'm wondering how VC passes data from a table view to a backend data service. For example, I have an RFC in the backend system with a table-type importing parameter; now I want to pass all the data from an input table view to the RFC. I guess it's possible, but I don't know how to do it.
    I tried to create some events between the input table and the data service, but it seems there is no system event that can export the whole table to the backend data service.
    Thanks for your answer.

    Thanks for your answer. I tried solution 2: I created a "Submit" button and set the mapping scope to "All data rows". It only works when I select at least one row; otherwise the data is not passed.
    Another question: I have several imported table parameters, and for each table I have one "Submit" event. I want these tables to be submitted at the same time, but if I click the Submit button in one table's toolbar, only the data of the table whose button was clicked is submitted; for the other tables the data is not passed. How can I achieve this?
    Thanks.
