Scheduling automatic jobs in Data Services

Dear Experts,
Could you please explain how to schedule an automatic job in Data Services?
I would like to be able to schedule a daily, weekly or monthly job.
Is there any document which explains the steps?
Thanks
Pat


Similar Messages

  • .bat or .sh file for executing jobs in data services

    Are there any command-line utilities with which I can import the .atl files and execute the jobs in Data Services?
    I was able to import the .atl files using 'al_engine', but I could not find the arguments to execute the jobs.
    Here is my requirement:
    There would be 2 master scripts :
    Master_Initial_Script.bat/.sh - This would have all the pre/post checks and would call the DS Initial Job
    Master_Delta_Script.bat/.sh - This would have all the pre/post checks and would call the DS Delta Job
    Pre-Checks for Delta Job -
    If the initial job is not loaded then
                    Do not move further to execute the Delta job
    else
                    Execute the Delta job
    Post-Checks for Delta Job:
    Print a statement when the job starts successfully,
    check the error/return code, and print a job-success message in a log file.

    This looks more like scheduling the job with dependencies.
    Unfortunately, the BODS scheduler doesn't support setting dependencies between jobs.
    So the best way forward is to export the jobs as batch files to a particular location,
    then use the batch files with an external scheduling tool and set the dependency there.
    A scheduling tool such as Control-M or Redwood Cronacle should be able to execute the batch files;
    set the dependency in the scheduling tool accordingly. A rough sketch of such a wrapper is below.
    Let me know if you need any more details.
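
    For illustration, a minimal sketch of the Master_Delta_Script wrapper described above, assuming the delta job has already been exported from the Management Console as a launcher script and that the initial job drops a flag file on success; every path and file name here is hypothetical:

        #!/bin/sh
        # Master_Delta_Script.sh -- hypothetical wrapper around an exported DS launcher.
        # Pre-check: the initial job is assumed to write /data/flags/initial_load.done.
        LOG=/var/log/ds/master_delta.log

        if [ ! -f /data/flags/initial_load.done ]; then
            echo "`date` Initial load not complete - delta job not started" >> "$LOG"
            exit 1
        fi

        echo "`date` Starting DS delta job" >> "$LOG"
        /opt/ds/batch/DS_Delta_Job.sh        # launcher exported from the Management Console
        RC=$?

        # Post-check: log success, or the return code on failure.
        if [ $RC -eq 0 ]; then
            echo "`date` Delta job finished successfully" >> "$LOG"
        else
            echo "`date` Delta job failed with return code $RC" >> "$LOG"
        fi
        exit $RC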
    Regards,
    Kishore Reddy L.

  • Error while running a job in Data services

    We are using the BO Data Services XI R3 tool.
    We have created a datastore for an Oracle 8i database using a Microsoft ODBC DSN. This is the source datastore.
    When the job pulls the data from the source into the 10g target, we get the below error in the log file:
    25219  1  CON-120902  6/26/2009  Data Services ODBC Driver Manager unable to find data source <...> in $LINK_DIR/bin/odbc.ini file.
    Is this a UNIX/Windows error?
    Please let us know how this can be resolved.
    Thanks in advance!
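
    Since the error names $LINK_DIR/bin/odbc.ini, the job server is resolving the DSN on UNIX, so the DSN referenced by the datastore has to be declared in that file. A minimal sketch of such an entry, with a hypothetical DSN name and driver path; the exact keys depend on the ODBC driver in use:

        [my_oracle_dsn]
        # hypothetical entry in $LINK_DIR/bin/odbc.ini; the section name must
        # match the DSN configured in the Data Services datastore, and the
        # driver path and remaining keys depend on the ODBC driver in use
        Driver = /opt/odbc/lib/oracle_driver.so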

    I restarted the job server and it worked, for a moment :P, but now it says more...
    1. SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select VERSION, SECURITYKEY, GUID from AL_VERSION where NAME = 'Repository Version' >.
    2. Cannot retrieve <Version> from the repository. Additional database information: <SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select VERSION, SECURITYKEY, GUID from AL_VERSION where NAME = 'Repository Version' >.>.
    3. SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select VERSION from AL_VERSION>.
    4. Cannot retrieve <Version> from the repository. Additional database information: <SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select VERSION from AL_VERSION>.>.
    5. Cannot retrieve <Version> from the repository. Additional database information: <SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select VERSION from AL_VERSION>.>.
    I checked that the MySQL DB has those tables (they were created when I set up DS), but I don't know why this error says 'No database selected'; the odbc file looks correct...
    Thank you,
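
    'No database selected' from MySQL usually means the DSN entry has no default schema. A hedged sketch of what the repository DSN in odbc.ini could look like; the schema and account names here are made up:

        [Prova7]
        # hypothetical odbc.ini entry for the DS repository; names are made up
        Driver   = /usr/lib/libmyodbc3.so
        Server   = localhost
        Port     = 3306
        # without a Database line MySQL reports "No database selected"
        Database = ds_repo
        User     = ds_user
        Password = secret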

  • Unable to run a Real_time job in Data services.

    Hi guys,
    I have a scenario where I am reading data from SAP into Data Services using an IDoc, but I am getting an error while configuring the real-time services, and when we add the job server under Service Provider the real-time services do not start. Could you suggest possible ways to troubleshoot this?

    I'm not sure I understand you on the CTI topic. CTI is a fundamental protocol when it comes to CUCM/UCCX, which is the future of telephony. No old-school PBXs here, just packets.
    I was just looking at the UCCX 9.0(2) REST API and it looks like there are some stats to be farmed there; however, it's so new that I cannot really speak to it from experience.
    Check it out:
    http://developer.cisco.com/documents/11218258/11219470/Cisco-uccx-developer-guide-902.pdf?redirect=http%3A%2F%2Fdeveloper.cisco.com%2Fweb%2Fuccxapi%2Fhome%3Fp_p_id%3Ddoc_library_summary_portlet_WAR_doclibrarysummaryportlet_INSTANCE_b8N60ctZVvHc%26p_p_lifecycle%3D0%26p_p_state%3Dnormal%26p_p_mode%3Dview%26p_p_col_id%3Dcolumn-5%26p_p_col_pos%3D1%26p_p_col_count%3D2
    Anthony Holloway
    Please use the star ratings to help drive great content to the top of searches.

  • Importing .atl DI jobs into data services.

    Hi Experts,
    I started learning Data Services a month ago; I practiced and did some assignments using the Data Integrator tool. But now we are using the Data Services tool in the project.
    So, is it possible to import the .atl files of those Data Integrator jobs into the current Data Services version?
    Please suggest the necessary steps in order to do this.
    Thanks in advance.

    If you created your job in version 11.7 and are now using Data Services XI R3 (12.x), you can upgrade the 11.7 repository using the Repository Manager from the Data Services installation:
    open Repository Manager, enter the 11.7 repository details and click Upgrade; this will upgrade your 11.7 jobs and data flows to 12.x.
    Alternatively, if you have already created your 12.x repository and done some development in it, you can still import the 11.7 ATL into 12.x, although some options may not be set to the optimized values they would receive during the upgrade process.

  • Schedule an automatic job

    Hey friends
    Please give me an idea of how to do this using workflow or SM36; if possible, provide code.
    As soon as a user enters a goods receipt in MB01 or MIGO, an automatic mail should be sent to [email protected] stating:
    Material # XXXX, quantity XXX, has been received into Plant XXXX (field WERKS), Storage Location XXXX (field LGORT), and the date received.
    Please tell me how to make this automatic.
    I have found that the data is stored in the MSEG table, but how do I use it so that an automatic mail is sent?
    Thanks
    KVC

    Below is a program to send an email to a list of email IDs.
    REPORT Z_NOTIFY_USER .
    *   Function          :  Send Workflow Message
    *   Author            :  Raj Ramamoorthy
    *   Date              :  07/09/2002
    *   Transaction code  :  SE38
    *
    *   PROGRAM CHANGE HISTORY
    *   CHG DTE   PROJ     INT     DESCRIPTION OF PROGRAM CHANGE
    *   === ===   ====     ===     ============================
    * Table declarations
    TABLES: SOLI.
    * Data declarations
    SELECTION-SCREEN BEGIN OF BLOCK B1 WITH FRAME TITLE TEXT-001.
    PARAMETERS: SAPID     RADIOBUTTON GROUP ADDR,
                EMAIL_ID  RADIOBUTTON GROUP ADDR.
    SELECTION-SCREEN END OF BLOCK B1.
    SELECTION-SCREEN BEGIN OF BLOCK B2 WITH FRAME TITLE TEXT-002.
    SELECT-OPTIONS: ID  FOR SOLI-LINE NO INTERVALS.
    SELECT-OPTIONS: CC  FOR SOLI-LINE NO INTERVALS.
    SELECT-OPTIONS: BCC FOR SOLI-LINE NO INTERVALS.
    PARAMETERS:     SENDER LIKE SOUD-USRNAM.   " needed: SENDER is referenced below
    SELECTION-SCREEN END OF BLOCK B2.
    SELECTION-SCREEN BEGIN OF BLOCK B3 WITH FRAME TITLE TEXT-009.
    PARAMETERS: SUB_LINE(60) TYPE C.
    SELECTION-SCREEN END OF BLOCK B3.
    SELECTION-SCREEN BEGIN OF BLOCK B4 WITH FRAME TITLE TEXT-008.
    SELECT-OPTIONS:  TEXT1 FOR SOLI-LINE NO INTERVALS.
    SELECTION-SCREEN END OF BLOCK B4.
    SELECTION-SCREEN BEGIN OF BLOCK B5 WITH FRAME TITLE TEXT-009.
    SELECTION-SCREEN BEGIN OF LINE.
    parameters: P_ATTACH as checkbox.
    selection-screen comment 3(30) text-010.
    SELECTION-SCREEN END OF LINE.
    SELECTION-SCREEN END OF BLOCK B5.
    DATA: MAIL_CONTENT LIKE SOLI OCCURS 0 WITH HEADER LINE,
          SUBJECT_LINE LIKE SOOD1-OBJDES.
    * Start of program processing
    START-OF-SELECTION.
    * Get the body of the message from the selection screen or from
    * the calling program
      LOOP AT TEXT1.
        MOVE   TEXT1-LOW TO MAIL_CONTENT-LINE.
        APPEND MAIL_CONTENT.
      ENDLOOP.
    * Subject of the Message
      MOVE  SUB_LINE TO SUBJECT_LINE.
    * Call a routine to send the workflow message
      PERFORM SEND_EMAIL
              TABLES MAIL_CONTENT
              USING  SUBJECT_LINE.
    *&      Form  SEND_EMAIL
    *       Send Workflow message
    FORM SEND_EMAIL TABLES OBJCONT STRUCTURE MAIL_CONTENT
                    USING  TITLE   LIKE SOOD-OBJDES.
      DATA: RECEIVERS LIKE SOOS1 OCCURS 0 WITH HEADER LINE,
            TSOOD1 LIKE SOOD1,
            PACKING_LIST LIKE SOXPL OCCURS 0 WITH HEADER LINE,
            OBJCONT1 LIKE MAIL_CONTENT OCCURS 0 WITH HEADER LINE.
      DATA: BEGIN OF AT_HEADER OCCURS 1.
              INCLUDE STRUCTURE SOLI.
      DATA: END OF AT_HEADER.
      CLEAR: TSOOD1,
             RECEIVERS.
      REFRESH RECEIVERS.
      MOVE:  SY-LANGU       TO TSOOD1-OBJLA,
             'Email Notice' TO TSOOD1-OBJNAM,
             'C'            TO TSOOD1-OBJSNS,
             TITLE          TO TSOOD1-OBJDES.
    *        'SCHIAVONIR'   TO TSOOD1-OWNNAM.
    * Loop through each ID and move them to the recipient table
      LOOP AT ID.
        TRANSLATE ID-LOW TO UPPER CASE.
        IF SAPID = 'X'.
          MOVE: SY-DATUM     TO RECEIVERS-RCDAT,
                SY-UZEIT     TO RECEIVERS-RCTIM,
                ' '          TO RECEIVERS-RECESC,
                 ID-LOW      TO RECEIVERS-RECNAM,
                'X'          TO RECEIVERS-SNDEX.
        ELSE.
          MOVE: SY-DATUM     TO RECEIVERS-RCDAT,
                SY-UZEIT     TO RECEIVERS-RCTIM,
                'U'          TO RECEIVERS-RECESC,
                'U-'         TO RECEIVERS-RECNAM,
                ID-LOW       TO RECEIVERS-RECEXTNAM.
        ENDIF.
        APPEND RECEIVERS.
        CLEAR RECEIVERS.
      ENDLOOP.
    * Loop through each CC and move them to the recipient table
      LOOP AT CC.
        TRANSLATE CC-LOW TO UPPER CASE.
        IF SAPID = 'X'.
          MOVE: SY-DATUM     TO RECEIVERS-RCDAT,
                SY-UZEIT     TO RECEIVERS-RCTIM,
                ' '          TO RECEIVERS-RECESC,
                 CC-LOW      TO RECEIVERS-RECNAM,
                'X'          TO RECEIVERS-SNDEX,
                'X'          TO RECEIVERS-SNDCP.
        ELSE.
          MOVE: SY-DATUM     TO RECEIVERS-RCDAT,
                SY-UZEIT     TO RECEIVERS-RCTIM,
                'U'          TO RECEIVERS-RECESC,
                'U-'         TO RECEIVERS-RECNAM,
                CC-LOW       TO RECEIVERS-RECEXTNAM,
                'X'          TO RECEIVERS-SNDCP.
        ENDIF.
        APPEND RECEIVERS.
        CLEAR RECEIVERS.
      ENDLOOP.
    * Loop through each BCC and move them to the recipient table
      LOOP AT BCC.
        TRANSLATE BCC-LOW TO UPPER CASE.
        IF SAPID = 'X'.
          MOVE: SY-DATUM     TO RECEIVERS-RCDAT,
                SY-UZEIT     TO RECEIVERS-RCTIM,
                ' '          TO RECEIVERS-RECESC,
                 BCC-LOW     TO RECEIVERS-RECNAM,
                'X'          TO RECEIVERS-SNDEX,
                'X'          TO RECEIVERS-SNDBC.
        ELSE.
          MOVE: SY-DATUM     TO RECEIVERS-RCDAT,
                SY-UZEIT     TO RECEIVERS-RCTIM,
                'U'          TO RECEIVERS-RECESC,
                'U-'         TO RECEIVERS-RECNAM,
                BCC-LOW      TO RECEIVERS-RECEXTNAM,
                'X'          TO RECEIVERS-SNDBC.
        ENDIF.
        APPEND RECEIVERS.
        CLEAR RECEIVERS.
      ENDLOOP.
      AT_HEADER = SY-DATUM.
      APPEND AT_HEADER.
      AT_HEADER = SY-UZEIT.
      APPEND AT_HEADER.
    * Default the sender to the current user if not supplied
      IF SENDER EQ SPACE.
        SENDER = SY-UNAME.
      ENDIF.
      IF P_ATTACH EQ 'X'.
        PACKING_LIST-HEAD_START = 1.
        PACKING_LIST-HEAD_NUM   = 2.
        PACKING_LIST-BODY_START = 1.
        PACKING_LIST-BODY_NUM   = 9999.
        PACKING_LIST-FILE_EXT   = 'TXT'.
        APPEND PACKING_LIST.
        CLEAR  PACKING_LIST.
        APPEND LINES OF OBJCONT TO OBJCONT1.
        REFRESH OBJCONT.
      ENDIF.
      CALL FUNCTION 'SO_OBJECT_SEND'
           EXPORTING
                OBJECT_HD_CHANGE           = TSOOD1
                OBJECT_TYPE                = 'RAW'
           TABLES
                OBJCONT                    = OBJCONT
                RECEIVERS                  = RECEIVERS
                ATT_HEAD                   = AT_HEADER
                ATT_CONT                   = OBJCONT1
                PACKING_LIST               = PACKING_LIST
           EXCEPTIONS
                ACTIVE_USER_NOT_EXIST      = 1
                COMMUNICATION_FAILURE      = 2
                COMPONENT_NOT_AVAILABLE    = 3
                FOLDER_NOT_EXIST           = 4
                FOLDER_NO_AUTHORIZATION    = 5
                FORWARDER_NOT_EXIST        = 6
                NOTE_NOT_EXIST             = 7
                OBJECT_NOT_EXIST           = 8
                OBJECT_NOT_SENT            = 9
                OBJECT_NO_AUTHORIZATION    = 10
                OBJECT_TYPE_NOT_EXIST      = 11
                OPERATION_NO_AUTHORIZATION = 12
                OWNER_NOT_EXIST            = 13
                PARAMETER_ERROR            = 14
                SUBSTITUTE_NOT_ACTIVE      = 15
                SUBSTITUTE_NOT_DEFINED     = 16
                SYSTEM_FAILURE             = 17
                TOO_MUCH_RECEIVERS         = 18
                USER_NOT_EXIST             = 19
                X_ERROR                    = 20
                OTHERS                     = 21.
    ENDFORM.                               " SEND_EMAIL
    To schedule the job:
    First maintain a variant for the above program.
    Then go to transaction SM36 -> Define Background Job.
    Give the job name and job class, define the job step (program and variant), and set the start condition; there you can choose the period (daily, weekly or monthly) for the background job.
    Reward Points if it is helpful
    Thanks
    Seshu

  • Error starting batch job through Data Services Web Service

    I have a customer running our web service interface to BODS. We have not been able to successfully start any jobs due to the following error:
    Services sent the request to start batch job 'Job_Test_Web_Services' on server 'null'. The Job Server refused the request with error: Job Server not found
    The same interface works against other environments (all XI 3.1), so I need some pointers as to where it gets the server name from. It sounds like a configuration issue, but I am not a BO DS admin and don't know where to look.
    Please help.
    Thanks.

    I was able to reproduce this issue in our own lab by going to the BO DS Server Manager and removing the servers from the configuration. Not sure if this is what is happening at this customer site, but I will check it out and post back.
    Not sure why they would not have a Job Server defined if they want to run jobs!

  • Regarding rescheduling jobs in data services management console

    Hi friends
    There are some issues which are leading to job failure. Is it possible to take the chance of rerunning the job once again, i.e. can we rerun the job in production for a brief investigation? If so, what will be the consequences?
    1)     Will it duplicate data?
    2)     Or is there any possibility of losing data...
    Please give suggestions on the consequences if we do.
    Thanks in advance

    I would not suggest doing anything to your production data. Make the following change in the job instead. Add a Map_Operation transform just before the point where you write to the target table, and inside it set every operation code to Discard; make sure Discard is selected for every operation code. Then, to see what data your job is trying to write to the target table, create a template table and join it to the transform that sits just before the Map_Operation: supposing you have a Query transform at the end, just before the target table, connect that Query transform both to the Map_Operation transform and to the template table.
    You can now see what data your job is going to write to the target without actually writing anything to the target.
    Test this in dev first before trying it in production.
    REMEMBER: PRODUCTION IS A SACRED SPACE AND WE DON'T WRITE UNNECESSARY DATA OR DO ANY MANIPULATION IN PROD TABLES.
    Thanks,

  • I would like to set job trigger on Data Service job.

    Hi Expert,
    I would like to know how to set a job trigger on a Data Services job, because our system currently runs separate jobs to get data from the FI module, the HR module and others. I want the jobs to execute as follows:
    Start --> FI job runs --> FI job finishes --> trigger --> HR job runs --> HR job finishes --> trigger --> others, etc.
    Or, if you have another idea, please advise me.
    Thank you for your advice.

    Hi,
    You can do this from the Data Services Management Console; follow the steps below to create and execute a batch (.bat/.sh) file:
    1. Select Batch > repository.
    2. Click the Batch Job Configuration tab.
    3. For the batch job to configure, click the Export Execution Command link.
    4. On the Export Execution Command page, enter the desired options for the batch job command file that you want the Administrator to create (the extension is added automatically: .sh for UNIX, .bat for Windows).
    5. Click Export. The batch file for the job will be created under the <DS_COMMON_DIR>\log directory.
    6. Create a new DS job in the Data Services Designer and write a script that executes your batch file, as in the sketch below.
    Hope this will be helpful for your requirement.
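
    For step 6, the job script would typically call the exported launcher with the DS exec() function. A minimal, hypothetical sketch for Windows; the export paths are made up, and the flag value (8) should be checked against the exec() flag table in the Reference Guide for your version:

        # Hypothetical DS script step: run the exported FI launcher, then the HR launcher.
        # Flag 8 is assumed here to make exec() wait for completion and raise an error
        # on a non-zero return code, so the HR job only starts if the FI job succeeded.
        exec('cmd.exe', '/c "D:\ExportedJobs\FI_Job.bat"', 8);
        exec('cmd.exe', '/c "D:\ExportedJobs\HR_Job.bat"', 8);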
    Regards
    M Ramesh

  • Change source path in batch Job in global variable in data services

    Hi Experts,
    My organization created a job in Data Services 3.2 to cleanse data read from Excel flat files. The folder path was stored in a global variable (I think), and now the directories have changed, hence it is throwing me the below error:
    Error, Input file  does not exist please confirm existence and restart job, 16 ) >
    failed, due to error <50316>: <>>> Error, Input file  does not exist please confirm existence and restart job>. I want to update the folder path. I am sure it would be easy, but I am very new to BODS.
    (12.2) 07-15-14 16:10:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Sleeping for 35.000000 seconds...  '
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Waking up......  '
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : 'Starting the timer loop number 6...'
    (12.2) 07-15-14 16:10:43 (14232:12656) WORKFLOW: Work flow <WF_Metadata_Files> is started.
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> $G_FILENAME_IN : ALL_Metadata_SALES.xls...'
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> looking for input file name
                                                     \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls'
    (12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>>  Input file Name is '
    (12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB ERROR' : '>>> Error, Input file  does not exist please confirm existence and restart job'
    I want to update the folder path \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls to \\Home\BIData\finance\production\sales\Metadata\ALL_Metadata_SALES.xls.
    When I investigated WF_Metadata_Files I saw there is a global variable called INPUT_DIR, so I assume I have to change the path there. I tried to find the old directory in the batch job but I can't find it, and even when I give a value to the global variable it still points to the old path.
    Can anybody please help me?
    Thanks
    Tim

    Hi Tim,
    If, having specified the value in the global variable, it is still pointing to the old path, there are a couple of possible scenarios:
    1. A different global variable is being used for the file path.
    2. The file path is hard-coded in the file format or Excel file definition despite the declaration of the global variable.
    Are you getting this error when running a data flow within this workflow, or in a script? It would be better to run the workflow in debug mode and look through the stages to find out where exactly in the workflow it fails. If the path really does come from the global variable, repointing it is a one-line change in the job's initialization script, as in the sketch below.
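
    A sketch, where the variable names follow the post's trace log and everything else is assumed:

        # Hypothetical init script: repoint the input directory to the new share.
        # Backslashes in single-quoted DS strings are assumed to be literal.
        $INPUT_DIR = '\\Home\BIData\finance\production\sales\Metadata\';
        $G_FILENAME_IN = 'ALL_Metadata_SALES.xls';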
    kind regards
    Raghu

  • Data Services job server crashed and won't start backup

    Hello,
    I was running some jobs on Data Services 4.2 SP3 (Windows Server 2012 R2) when they all failed and the job server went down. None of the failed jobs has a trace file or error log in the Management Console. Now I am unable to open Data Services Designer or Data Services Server Manager; when I try to open them, nothing happens. Also, the SAP Data Services job service cannot be started. The job server had been running fine for a few weeks before this. This has happened twice already today; the first time, the only way I was able to fix it was to run a repair on the Data Services install. Can someone please help me understand what is causing this and how it can be fixed?

    Hi Tyler,
    This is a Windows-specific issue; please refer to the link and KBAs below:
    How To Fix Windows Service Error 1053
    http://windows-exe-errors.com/how-to-fix-windows-service-error-1053/
    1986247 - Error "Windows could not start the BusinessObjects Data Services service on local computer" occurs in Data Services 4.1
    https://service.sap.com/sap/support/notes/1986247
    1992260 - Error: Windows could not start the SAP Data Services service on local computer, after upgrading SAP data services and deleting job servers SAP Data Services 4.2
    https://service.sap.com/sap/support/notes/1992260
    Hope this will help!!!!
    Thanks,
    Daya

  • Roles of a Data services Administrator

    Hi Techies,
    I am a Data Services developer and have had the chance to work as a BODS administrator, so could you please let me know the roles and duties of a Data Services administrator in detail: scheduling, monitoring, checking performance optimization, Repository Manager, Server Manager and the job server...
    Thanks in advance,,
    Best Regards,
    Amjad.

    Essentially,
    a. You can find the actual performer of a Task by using output/execData/systemAttributes/updatedBy/id
    b. You can use ids:isUserInRole to test a user's membership in the desired role.
    Note that since we have a bug of not exposing the identity XPath functions in BPMN, you have to edit the source of the BPM project. What I did was create an expression in the editor: ora:addQuotes(concat(role, users)) (any dummy expression, just to get everything set up), then find the expression in the .bpmn file and replace it with:
    <from language="http://www.w3.org/1999/XPath" xmlns:bpmn="http://www.omg.org/bpmn20" xmlns:ora="http://schemas.oracle.com/xpath/extension">ora:addQuotes(concat(bpmn:getDataObject('role'), bpmn:getDataObject('user')))</from>
    This will work with Groups but not with Swimlane roles (app roles). It is supposed to work with Swimlane roles as well, and we will fix it in PS.
    Please let me know if you would like the sample code.

  • SAP BW and Data Services

    I need some guidance from the experts. I am loading from Data Services into SAP BW; at both ends the job is successful, but there is no data in SAP BW. I am loading into an SAP BW InfoSource as the target.
    When I run the job from Data Services it creates the InfoPackage automatically.
    If someone has done this before, please help me out.
    I'll appreciate it.

    Hi Binish,
    If you want to load a characteristic in BW from Data Services, do the following:
    1. Create a DataSource in BW containing the master data fields.
    2. Create the required data flow and workflow in DS, with the DataSource as the target.
    3. In the InfoProvider view of transaction RSA1, right-click your InfoArea, select "Insert InfoObject as InfoProvider" and then select your InfoObject.
    4. Once the data is loaded after executing this data flow in DS, create a transformation in BW from this DataSource, with the master data of this InfoProvider as the target.
    Try all these steps and let me know if you face any problems.
    Regards,
    Balajee.

  • Data services with SQL Server 2008 and Invalid time format variable

    Hi all
    Recently we switched from DI on SQL Server 2005 to DS (Data Services) on SQL Server 2008. However, I am facing an odd error in a query that ran successfully in DI.
    I validate my query output using a Validation transform to fill either the Target table (if it passes) or the Target_Fail table (if it fails). Before sending data to the Target_Fail table, I map the columns to it in a Query transform. I have a column called 'ETL_Load_Date' in that table, which I fill with a global variable called 'Load_Date'. I set this global variable in a script at the very beginning of the job. It is a date variable type:
    $Load_Date = to_char(sysdate(),'YYYY.MM.DD');
    When I assign this global variable to a datetime column in my table and run the job using Data Services, I get this error:
    error message for operation <SQLExecute>: <[Microsoft][ODBC SQL Server Driver]Invalid time format>.
    However, I didn't have this problem when I was running the job on SQL Server 2005 using Data Integrator. The strange thing is that when I debug this job, it runs completely successfully!!
    Could you please help me to fix this problem?
    Thanks for your help in advance.

    Thanks for your reply.
    The ETL_Date is a datetime column and the global variable is of date data type. I have to use the to_char() function to get just the date part of the current system datetime. Earlier I had tried the date_part function, but it returns an int, which didn't work for me.
    I found what the issue was: for some reason there were some little squares next to the name of the global variable that I had mapped to ETL_Date in the Query transform!!! The format and everything else was OK, as I had the same mapping in other tables and it had worked successfully.
    When I deleted the column in the Query transform and added it again, my problem was solved.
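
    As an aside, a less error-prone pattern may be to keep the variable a true date instead of a formatted string, so the ODBC driver receives a clean date value. A sketch using documented DS functions; whether it suits the job depends on how the variable is used elsewhere:

        # Hypothetical init script: truncate the current datetime to a date
        # by formatting and re-parsing it with matching format strings.
        $Load_Date = to_date(to_char(sysdate(), 'YYYY.MM.DD'), 'YYYY.MM.DD');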

  • Data Services result as a .txt file

    I am trying to output the results of a job in Data Services to a tab-delimited .txt file. Is there a way to write out a .txt file? The job currently outputs to a template table, and I would like it to be a .txt file instead.
    Please help,
    Travis

    Hi Travis.
    Yes, it is possible. Create a flat file format and use it as a target.
    Designer Guide: http://help.sap.com/businessobject/product_guides/boexir4/en/sbo401_ds_designer_en.pdf (6.2 File format editor).
    Should you have any questions, please let me know.
    Thank you,
    Viacheslav.
