Join In BODS job

Hi All,
I am new to the BODS tool. I am trying to read data from two databases (HANA and MS SQL Server) and do an inner join in a Query transform (From tab). The tables are huge and I don't need all of their fields, so I am using SQL transforms to write custom selects for both tables with some where conditions. These select queries work fine and run quickly when I check them in their respective consoles.
But the BODS job takes a very long time to execute. The log of the Query transform shows that it performs a Cartesian join and then filters on the join condition. The row count keeps growing because of the Cartesian join, and I have to kill the job.
How can I make this join run faster? Is there any other option?
Regards,
Suman.

Hi Suman,
I would suggest creating a database link, since you are using two different database servers, and making sure that your DF is a full pushdown. An inner join should not take much time if it is fully pushed down.
You may refer to this article on how to create a database link in SQL Server. URL
Another article covers performance optimization. URL
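For illustration, a minimal T-SQL sketch of the SQL Server side of such a link (the linked server name and the ODBC DSN are placeholders; HANA would be reached through its ODBC driver):
-- hypothetical linked server pointing at HANA via a generic ODBC DSN
EXEC sp_addlinkedserver
    @server     = 'HANA_LINK',   -- placeholder name for the link
    @srvproduct = '',
    @provider   = 'MSDASQL',     -- Microsoft OLE DB provider for ODBC drivers
    @datasrc    = 'HANA_DSN';    -- placeholder ODBC DSN pointing at the HANA box
With a link like this in place, the inner join can run on one server instead of Data Services pulling both tables across the network and joining them in memory.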
Hope this is helpful.
Regards,
MSA

Similar Messages

• Pass BODS job parameters as HANA View parameters

    Hi experts,
I use a HANA Calculation View as a data source and need to pass the BODS job's parameters into the Calculation View's parameters. I have tried to use a SQL transform with code like this (see below), but it doesn't work.
    select RRCTY, RYEAR, FISCAL_PERIOD as POPER, FISCAL_YEAR_PERIOD, RBUKRS, RACCT, RCNTR, WBS, RZZPERNR, PLEVEL, ROW, MEASURE, REP_ID
         from SOURCE_1.DIM_EMPLOYEE as emp
         inner join "_SYS_BIC"."packeage/CalcView" (PLACEHOLDER."$$IP_SNAPSHOT_TIMESTAMP$$" => {$IP_SNAPSHOT_TIMESTAMP}, PLACEHOLDER."$$IP_REP_ID$$" => {$IP_REP_ID}) as mce1
         on emp.EMPLOYEE_CHAR = mce1.RZZPERNR
    How can I pass job parameters into HANA Calculation View from BODS?
    Thank you!
    With best regards,
    Sergey

    Hello
There's nothing wrong with your theory; it will work, as all Data Services is doing is constructing a SQL string and passing it to HANA. Which version of HANA are you using? The PLACEHOLDER syntax changed slightly; however, I suspect you are missing some single quotes around your parameter values.
Get the SQL working successfully via HANA Studio (or hdbsql), then reverse engineer the parameterised statement for the SQL transform.
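For example, the same join with the substituted values quoted (a guess at the intended fix, reusing the identifiers from your post):
inner join "_SYS_BIC"."packeage/CalcView" (PLACEHOLDER."$$IP_SNAPSHOT_TIMESTAMP$$" => '{$IP_SNAPSHOT_TIMESTAMP}', PLACEHOLDER."$$IP_REP_ID$$" => '{$IP_REP_ID}') as mce1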
    Michael

• Error while executing a UNIX script in BODS job - exit code 127

I am executing a UNIX script from a BODS job, and it gives an error message saying 'Program terminated because of exit code 127'. I am able to execute the same script directly in UNIX. I am not able to find a solution for this.

Using a print around exec() with flag 8 should give more insight into the error, as it returns the command's return code together with its output. Exit code 127 usually means the shell could not find the command, so check the paths:
print(exec('/bin/sh', '<your command>', 8));
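If the script lives at a known location, calling it by its absolute path avoids any PATH lookup (the path here is hypothetical):
print(exec('/home/bods/scripts/myscript.sh', '', 8));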

  • Get a connection error when running a BODS job

    Hi All,
I have a BODS system set up in my environment. There are quite a few jobs that run every day. The jobs have stopped running for the last few days; they aren't failing, but they do not run.
When I try to execute a job, I get the error "Connection Refused: connection". When I test the connection from BODS to the CMS, it works fine. Also, from the BODS/BO server I am able to connect/ping to the datastore servers. My datastores are SQL Server and ECC.
I do not know what the resolution for this issue is.
    Thanks,
    Amrita

    Hi Manoj,
We found that the issue was at the OS level, in the "Data Services" OS service.
    The BODS jobs are running fine now.
    Thanks,
    Amrita

• Error while running BODI job

    Hi,
We are executing the BODI job from my machine; the job server is on a separate machine. When we execute the job, we get an error called "TNS adoptive error". But I am able to see the records if I click the table in my dataflow.
The same job runs fine if we execute it from another machine, even though we use the same job server from both machines.
Can anyone please help us resolve this issue? Does anything need to be configured on my local machine?
    Thanks,
    Vino

When you click the "view data" button for a table in a dataflow, you use the local client to connect to the database and get the results. Since this works fine, your local client seems to be correctly configured.
When you execute a job on a remote job server, it is the job server that connects to the database to extract and load the data. So the error you get seems to indicate that your job server cannot connect to the source/target table.
What's strange, however, is that you say that from another client you can successfully execute the job on the same job server. Is it exactly the same job? Maybe the TNS name connection parameter in the datastore is different in the two repositories, which could explain why the job fails for one client?
Also, could you clarify exactly what error you get? I didn't get any match when I googled "TNS adoptive error"...

  • Batch file not executing in BODS Job

    Hi friends,
I have created a batch file that creates a text file in a directory. I ran it manually and it worked fine, i.e. a txt file was created. However, when I used the batch file in a script in a BODS job, it was not executed.
The script being used is as follows (I have tried both; both worked manually but not in the job, and the job ran successfully without the desired output):
print(exec('C:\xxx\testdir.bat', '', 0));
exec('C:\xxx\testdir.bat', '', 8);
    testdir.bat has the below txt code
cd C:\xxx\yyy
dir *.xml /b > dir.txt
    Thanks and Regards
    Anil

What has changed between
"However, when I used the batch file in a script in a BODS job, it was not executed"
and
"The job was successfully executed, but an empty dir.txt was created with no files"?
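One way to see what is actually happening (a sketch, reusing the path from your post) is to run the batch file through cmd.exe and print the return code together with any output:
print(exec('cmd.exe', '/c C:\xxx\testdir.bat', 8));
Also note that if the cd in the batch file fails, dir.txt ends up in the job server's working directory rather than in C:\xxx\yyy.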

  • How to create a validation rule in SAP BODS Job

    Hi Experts
I have created a BODS job, and in that job I have to create a validation rule such that if cust_id is null, the load must stop.
I have no idea where to define this validation rule in the job, or how to stop the load job if the validation rule fails.
My job is defined as in the image below.
Please guide me on where to define the validation rule and how to stop the load job.
    Thanks in advance
    PrasannaKumar

Hi Samatha B,
Thanks for your response. I have done as you said, and now I can raise the exception.
I have another requirement: per the Validation transform, the data is loaded into a Pass table and a Fail table after job execution. If any data goes into the Fail table, I have to delete the data loaded into the Pass table.
Here I am facing a problem, as my target tables are MySQL tables; in my script I wrote
    sql('database','delete from <tablename>');
but the SQL query execution raises an exception.
How can I delete the data loaded into the MySQL target table using scripts?
Please guide me on this error as well.
    Thanks in Advance
    PrasannaKumar

  • HOW TO RUN BODS JOB THROUGH UNIX SCRIPT

    Dear Experts
Please show me how to call a job from a script.
I have used the Export Execution Command as recommended by the links below:
    http://scn.sap.com/docs/DOC-34648
    http://scn.sap.com/community/data-services/blog/2012/08/22/sap-bods--running-scheduling-bods-jobs-from-linux-command-line-using-third-party-scheduler
But I am not able to locate the .sh file on the UNIX server.
This is required to call a job after the completion of a parent job.
    Thanks
    Anupam

You can check the status in the SAP BusinessObjects Data Services Management Console. Below are the steps:
Log in to the SAP BusinessObjects Data Services Management Console
Click on Batch Job
Select the local repository
Then check your job execution status from there
For your second query there are two ways; one is to do the same activity that you did in DEV. Below are the steps:
Log in to the SAP BusinessObjects Data Services Management Console and export the jobs using the export execution command from Batch Job
The shell script will be exported to the job server log directory
Move that shell script to your desired location
Update the user environment variables the same as in DEV; the only difference is that the SID changes
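For reference, on a UNIX job server the exported launcher normally lands under $LINK_DIR/log and is named after the job, e.g. (JOB_TEST is a placeholder):
$LINK_DIR/log/JOB_TEST.sh
If the .sh file is not there, check that directory on the job server host itself rather than on the client machine.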
    Thanks,
    Daya

  • SAP BW structure/table name change issue with BODS Jobs promotion

Dear All, one of my clients has an issue with the promotion of BODS jobs from one environment to another. They move SAP BW projects/tables along with BODS jobs (separately) from DEV to QA to Prod.
In SAP BW, the structures and tables get a different postfix when they are transported to the next environment. The promotion (transport) in SAP BW is an automated process, and it is the BW deployment mechanism that causes the postfixes to change. As a result, with the transport from DEV to QA in SAP BW we end up with table name changes (even if they are only suffixes), which means that when we deploy our Data Services jobs we immediately have to change them for the target environments.
    Please let me know if someone has deployed some solution for this.
    Thanks

This is an issue with the SAP BW promotion process. The SAP BASIS team should not turn on the setting that suffixes the system ID onto the table names during promotion.
    Thanks,

  • Load HANA Data to Netezza through BODS Job

    Hi Experts,
I am new to BODS and Netezza. I have a requirement to load HANA table data into a Netezza table through a BODS job.
We created datastores for HANA and Netezza, and we created the Netezza table the same as the HANA source table.
I don't have any query transform logic in the job; it's a direct 1:1 mapping. But when I execute the job, I get an error saying "Cross Database Access not supported for this type of command".
I don't have any idea about this error.
Please share your thoughts and ideas if anyone has faced similar issues.
    Thanks,
    Best Regards,
    Santhosh.

    Hi Manoj,
    Thanks for your reply.
Finally, after making some changes to the Netezza table properties (Bulk Loader Options tab), the job now executes successfully.
Here are the Netezza table property details:
    Thanks,
    Best Regards,
    Santhosh Kumar.

  • Scheduling BODI Jobs

    Hi,
Does anyone have some good information on how to schedule a BODI job via the Windows scheduler, using a script? I'm particularly interested in being able to pass the current_system_configuration parameter.
    I appreciate your help.
    Regards
    Azeem

Set the schedule to active when you create it from the Management Console; there is a check box below the schedule name.
The script will be created as %LINK_DIR%\log\<JobServerName>\<GUID of Job>.bat on the machine of the selected job server.
For example, the file will be something like:
"%LINK_DIR%\log\<JobServerName>\27275a80_bbc4_41e4_89d6_d08cee02bf92_5.bat"
Open Windows Scheduled Tasks from the Control Panel; you will see a new entry named "At" followed by a number, like "At1", with the schedule time you selected. Right-click on it and select Properties, and you can see the script that is called to run the job.
If you are using some other scheduler and need only the script for launching the job, use the export execution command; this creates the script on the selected job server machine in the %LINK_DIR%\log folder, named after the job.
For example, if your job name is JOB_TEST then you will see the following two files on the job server machine:
    %LINK_DIR%\log\JOB_TEST.bat and %LINK_DIR%\log\JOB_TEST.txt
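If you prefer to register the exported script with the Windows scheduler yourself, a minimal sketch using schtasks (the task name and time are placeholders; as far as I recall, the system configuration you pick when exporting the execution command is baked into the generated .bat):
schtasks /create /tn "Run_JOB_TEST" /tr "%LINK_DIR%\log\JOB_TEST.bat" /sc daily /st 02:00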

  • Joins in BODS

    Hi All,
I am new to BODS. My requirement is to read a big flat file (more than 180 GB in size) and join it with a small database table, so I cannot use a pushdown operation.
I tested the job with a small test file, and I can see that the Query transform does a cross join even though I specified an inner join in the Query transform. I have tested with many files and tables; the job always goes for the cross join (according to the row count of the Query transform in the job log) and takes a long time to finish.
I am not sure why BODS behaves like this. Is some configuration needed, or does it just work this way?
    Is there any better way of handling this requirement?
    Regards,
    Suman

1 TB and 180 GB are not exactly the same, are they?
Now, it all depends on what your join is meant for.
1. The join is for adding additional columns to your file data (lookup functionality). In that case, you'll need the storage anyway. Load your file into a table that already contains the extra columns. Then, in a second dataflow, join the big table with the small one and update.
2. The join is for filtering. Then check the join-rank settings for your inputs, for the file and for the table (the original screenshots are not included here).
The input with the highest join rank will be the driving one. Let us know how this works out.

  • Aborted al_engine error while creating BODI Job Server in Linux

    Hi,
I have installed BODI on Linux (in the directory /opt/app/DataIntegrator).
While configuring the job server, I got the following error:
Updating the repository <bodi@tdw__Oracle>. Please wait...
sh: line 1: 15850 Aborted al_engine -Ubodi -Pbodi -Stdw -NORACLE -ZMACHINEINFOI/opt/app/DataIntegrator/bin/machine.sql 2>/tmp/al_machine.out.15678
UNABLE to Update the repository <bodi@tdw__Oracle>. Please check the /tmp/al_machine.out.15678 for more details!
SNMP is disabled.
The log file /tmp/al_machine.out.15678 is 0 KB, with no info inside it.
I was able to solve this issue by giving chmod 777 -Rf /opt permissions. Does BODI need 777 permissions for the job server to be created?
    Oracle is installed in /opt/app/oracle.
    Can you help on this issue?
    Thanks & Regards
    Maran MK

Please test the Oracle repository connectivity from the al_engine directory, so that you can rule out any connectivity issues before looking at the chmod settings.
    Let me know what you find.
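For example, from the job server host (a sketch using the connection details visible in your log; tnsping and sqlplus are the standard Oracle client utilities):
tnsping tdw
sqlplus bodi/bodi@tdw
If either of these fails, the problem is Oracle client configuration rather than file permissions.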
    Thanks & Regards
    Tiji

  • Regarding BODS job running slowly

    Hi,
When I execute a job, it takes 2 minutes to start running, and it creates dump files during that time. I uninstalled BODS and installed it again, but I am still getting the same problem.
Earlier I faced the same type of issue; at that time I uninstalled BODS and installed it again, and it worked fine.
But this time doing the same thing has not worked.
    Thanks & regards,
    Ramana.

Hi Diego,
Most of the time my jobs take extra time (about 2 minutes before the start of execution). While this happens, dump files are created in the log folder. Sometimes saving a job also takes time and creates dump files.
Please let me know how I can resolve this issue.
    Thanks & Regards,
    Ramana

• BODI - Job error "blank message, possibly due to insufficient memory"

I am ending up with the error below while executing a DI job. The job uses a 37 MB file as source. As the error looks like something memory-related, I tried splitting up the input file and executing the job, and it completed successfully without any error.
Could someone help me find any memory setting at the dataflow level that needs to be investigated, to get a permanent solution?
Hoping for help!
(11.7) 03-04-11 08:18:06 (E) (21097:0001) RUN-050406: |Session SM_DM_ACCESS_LOG_DTL_F_JOB|Workflow SM_DM_ACCESS_LOG_DTL_F_WF|Dataflow SM_DM_ACCESS_LOG_DTL_F_DF
Data flow <SM_DM_ACCESS_LOG_DTL_F_DF> received a bad system message. Message text from the child process is <blank message, possibly due to insufficient memory>. The process executing data flow <SM_DM_ACCESS_LOG_DTL_F_DF> has died abnormally. For NT, please check errorlog.txt. For HPUX, please check stack_trace.txt. Please notify Customer Support.
(11.7) 03-04-11 08:18:06 (E) (21097:0001) RUN-050409: |Session SM_DM_ACCESS_LOG_DTL_F_JOB|Workflow SM_DM_ACCESS_LOG_DTL_F_WF
The job process could not communicate with the data flow <SM_DM_ACCESS_LOG_DTL_F_DF> process. For details, see previously logged error <50406>.

    Hi,
Loading a 37 MB file shouldn't be a problem without splitting it; I've loaded GB-size flat files without problems.
Did you check the errorlog.txt as stated in the message? What's in there?
If you can load the file after splitting it, you have enough space in your DB.
Please check the memory utilization of your server while executing the job with the single file. Maybe the server is too busy... which would be strange with a 37 MB file.
    Regards
    -Seb.
