Trigger SQL Agent job upon data load completion

Hi all,
I insert/update data from server A to server B.
Once the above step is successful, I want to start the SQL Agent job on server B.
How can I set up the SQL Agent job on server B to check that the data load has completed for that day, and then start executing the job steps?
Please guide.
nik

Hello,
Yes, there are different ways of making connections to the remote SQL Server and running T-SQL commands. Some examples include:
OPENQUERY
Linked Servers
Powershell through agent
The best place to do it would be at the end of your import/export script/process.
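For example, a minimal sketch of starting the job from the end of the load process, assuming a linked server named [ServerB] is already configured with RPC Out enabled (the job name is hypothetical):

-- Run as the last step of the load process on server A.
-- Assumes the caller's login maps to one allowed to run jobs in msdb on server B.
EXEC [ServerB].msdb.dbo.sp_start_job @job_name = N'DailyLoadJob';

Alternatively, if the job must stay scheduled on server B, its first step can check a control table that the load process updates (table and column names are made up for illustration):

-- Job step 1 on server B: fail, and thereby stop the job, if today's load isn't done.
IF NOT EXISTS (SELECT 1
               FROM   dbo.LoadStatus
               WHERE  LoadDate = CONVERT(date, GETDATE())
                      AND Status = 'Complete')
    RAISERROR('Data load for today has not completed yet.', 16, 1);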
Sean Gallardy | Blog | Twitter

Similar Messages

  • Include Data Load completion time in OBIEE

    Hi all,
    We are using DAC (Data Warehouse Administration Console) for our data load activity. My client wants the data load completion time shown in all the dashboards.
    May I know how to do this?

    Hi,
    You should add the DAC tables (out of the DAC repository) to Oracle BI. There you will be able to report on the ETL data.
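    For example, a hedged sketch of such a report query; the DAC run table and column names below (W_ETL_DEFN_RUN, END_TS, STATUS) are assumptions and may differ by DAC version, so check your repository:

    -- Latest completed ETL run, for display on the dashboards
    SELECT MAX(END_TS) AS last_load_completed
    FROM   W_ETL_DEFN_RUN
    WHERE  STATUS = 'COMPLETED';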
    Good Luck,
    Daan Bakboord
    http://obibb.wordpress.com

  • Hyperion 11.1.2.2 SQL Server and Essbase data load error

    Gentlemen,
    I have an issue loading data into Essbase via SQL Server. The load was running fine, and then all of a sudden I get:
    network error 10054: failed to receive data / send data
    unexpected Essbase error 1042013
    Even when I try to load manually it fails with: Data load buffer [9999] does not exist, Unexpected Essbase error 1270040
    and in the logs:
    Received client request: MaxL: Execute (from user [admin@Native Directory])
    *Error writing to server*
    I also tried with MaxL; it is the same issue.
    Is the .esm file locked?
    Is there any issue with the ODBC connection?
    Or is there an orphaned connection to SQL Server that is preventing Hyperion from processing another?
    Let me know your thoughts.
    Thanks.

    Can you try limiting the query to return fewer records? Try adding a WHERE clause and see whether that works.
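    For instance, a minimal way to test that (table and column names are hypothetical):

    -- Load a small, filtered batch first to see whether it goes through cleanly
    SELECT TOP (1000) *
    FROM   dbo.fact_table
    WHERE  load_date >= '2013-01-01';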
    Regards
    Celvin
    http://www.orahyplabs.com

  • Error while opening SQL source for a Data Load Rules File

    Hi, I have created a data load rules file. When I try to open a SQL source for this rules file (File -> Open SQL), I get an error saying "Your server does not have a SQL connection option, please check with your system administrator". Then I get a message "There are no data sources defined. Please create one to continue." I have created a DSN on my Essbase server. What is the problem? What needs to be done to open SQL files? Thanks.

    I have Essbase 7.1. I guess for version 7.1 the SQL interface option is installed with the Analytic Server itself. Am I right? I have set up the DSN also. Please help to resolve this issue. Thanks.

  • Data Load Best Practice.

    Hi,
    I need to know the best way to load data from a source. Is a SQL load the best way, or is using data files better? What are the inherent advantages and disadvantages of the two approaches?
    Thanks for any help.

    I faced a scenario that I will explain here.
    I had an ASO cube with data being loaded from a txt file on a daily basis, and the data was huge. There were problems in the data file as well as in the master file (the file used for dimension building).
    The data and master files contained special characters like ' , : ~ ` # $ % plus blank spaces and tab spaces; even the ETL process could not remove these, because they came within the data.
    Sometimes comments or database errors were also present in the data file.
    I had problems building a rule file with different delimiters; most of the time I found the same character within the data that was used as the delimiter, so it increased the number of data fields and Essbase gave an error.
    So I used a SQL table for the data load: a launch table is created and the data is populated into this table. All errors are removed there before the data load into Essbase, as in the sketch below.
    This was my scenario (in this case I found the SQL load better).
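    A minimal sketch of that launch-table idea (all names are made up for illustration):

    -- Stage the raw feed, then strip the troublesome characters before Essbase sees it
    CREATE TABLE stg_essbase_load (
      product   VARCHAR2(80),
      geography VARCHAR2(80),
      account   VARCHAR2(80),
      amount    NUMBER
    );

    -- TRANSLATE keeps 'x' and deletes every other listed character, including tabs
    UPDATE stg_essbase_load
    SET    product = TRANSLATE(product, 'x'',:~`#$%' || CHR(9), 'x');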
    Thanks
    Dhanjit G.

  • Data Load log #'s from 1003000 - 1003999

    Hi, does anyone know what each number in this range means? We are looking for the numbers that indicate a data load has started and a data load has completed. We want to search the log file for those numbers. Thanks, Minash.

    I have written a parser for just this function. Do you do VB? Let me know if you want the code.
    Jeff McAhren
    Dallas, Texas

  • EIS data load problem

    Hi,
    I use user-defined SQL to do a data load using EIS 7x.
    When I run the SQL in Toad just to see the data, I see 70000 rows.
    But when I use the same SQL as the user-defined SQL and try to load data, I don't know why, EIS says it loaded around 69000 rows, and I see no rejections.
    I even made some SQL changes to find out which records are not being loaded: there are rows I can see when I run the SQL in Toad that EIS does not load for the very same SQL.
    This is very strange. Can anyone help me with this?

    Glenn-
    I don't think there is anything unique about the missing rows.
    Actually, a part of the code was added to the previously working view (which I use to load data) to bring in some additional data.
    I took that part and tried to load data, and I see no data being loaded or rejected.
    It just says records loaded "0", rejected 0,
    but the same part brings in the required additional data when executed in Toad.
    And about the Excel sheet lock-and-send: I did that a week ago and, as you said, to my surprise it loads everything.
    This was the test by which I figured out that EIS is not able to load even a part of the data, and I found the exact part of the data by going through it closely.
    So I think this is something to do with the SQL interface.
    And I did not understand the last lines in your post:
    "I know when I do the regular SQL interface some data types don't load and I have to convert them to varchar to get them to work. If the Excel file and flat file loads work, look at that."
    What did you convert to varchar?
    The Excel load works fine, so I think it is something similar to what you mentioned in your post.
    Can you please explain it?
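    (For reference, the kind of conversion Glenn described usually looks like this in the load view; column names are hypothetical:)

    -- Wrap datatypes the SQL interface mishandles in an explicit cast
    SELECT CAST(fiscal_period AS VARCHAR(20)) AS fiscal_period,
           CAST(amount        AS VARCHAR(32)) AS amount
    FROM   dbo.my_load_view;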

  • Data loading delay

    Hi Friends,
    I hope you can help me with one issue.
    The issue is: every day I load to one InfoCube. Whatever the cube, each load takes 2 hours, but once it took 5 hours. What might be the reason? I am confused by that; can anybody help me clarify it?
    Regards,
    Balaji Reddy K.

    Reddy,
    1. Is the time taken loading to the PSA, or loading from the PSA to the cube? If it is the load to the PSA, then usually the problem lies at the extractor.
    2. If it is the load to the cube, then check whether statistics are being maintained for the cube; they will give an accurate picture of where the data load is taking up the most time.
    Do an SQL trace during the data load. If you find a lot of master data lookups, make sure that the master data is loaded; and if there are a lot of lookups to table NRIV, check whether number range buffering is on, so that dim IDs get generated faster.
    Check whether the data load happens faster if you drop any existing indexes.
    Are you loading any aggregates after the data load? Check whether the aggregates are necessary, or whether they have been enabled for delta loads.
    If you have indexes active and there is a huge data load, then depending on the index, the data load can get delayed.
    If the cube is not compressed, the data load can sometimes get delayed.
    Also, while the data load is going on, check in SM50 and SM37 to see if the jobs are active - this means that the data load is active from both sides.
    Always update the statistics for the cube before the load and after the load; this helps in deciphering the time the data load takes. After activating the statistics, check table RSDDSTAT or the standard reports available as part of the BW technical content.
    Hope it helps.
    Arun

  • Essbase 7.1 - Incremental data load in ASO

    Hi,
    Is there an incremental data loading feature in ASO version 7.1? Let's say I have the following data in the ASO cube:
    P1 G1 A1 100
    Now I get the following 2 rows as the incremental data from the relational source:
    P1 G1 A1 200
    P2 G1 A1 300
    So, once I load these rows using a rule file with the "override existing values" option, will I have the following dataset in ASO?
    P1 G1 A1 200
    P2 G1 A1 300
    I know there is a data load buffer concept in ASO 7.1, and that it is the only way to improve data load performance. But I just wanted to check whether we can implement incremental loading in ASO or not.
    And one more thing: can 2 load rules run in parallel to load data into ASO cubes? As per my understanding, when we start loading data the cube is locked for any other insert/update. Please correct me if I am wrong!
    Thanks!

    Hi,
    I think features such as incremental data loads became available in version 9.3.1.
    The What's New for Essbase 9.3.1 contains:
    Incrementally Loading Data into Aggregate Storage Databases
    The aggregate storage database model has been enhanced with the following features:
    • An aggregate storage database can contain multiple slices of data.
    • Incremental data loads complete in a length of time that is proportional to the size of the incremental data.
    • You can merge all incremental data slices into the main database slice, or merge all incremental data slices into a single data slice while leaving the main database slice unchanged.
    • Multiple data load buffers can exist on a single aggregate storage database. To save time, you can load data into multiple data load buffers at the same time.
    • You can atomically replace the contents of a database or the contents of all incremental data slices.
    • You can control the share of resources that a load buffer is allowed to use, and set properties that determine how missing, zero, and duplicate values in the data sources are processed.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Comparison of Data Loading techniques - Sql Loader & External Tables

    Below are 2 techniques by which data can be loaded from flat files into Oracle tables.
    1)     SQL*Loader:
    a.     Place the flat file (.txt or .csv) in the desired location.
    b.     Create a control file, e.g.:
    LOAD DATA
    INFILE 'Mytextfile.txt'      -- file containing the table data; specify the path correctly, it could be a .csv as well
    APPEND                       -- or TRUNCATE, based on the requirement
    INTO TABLE oracle_table
    FIELDS TERMINATED BY ','     -- or whatever delimiter the input file uses
    OPTIONALLY ENCLOSED BY '"'
    (field1, field2, field3)
    c.     Now run Oracle's sqlldr utility from the command prompt:
    sqlldr username/password control=mycontrolfile.ctl
    d.     The data can be verified by selecting it from the table:
    Select * from oracle_table;
    2)     External Table:
    a.     Place the flat file (.txt or .csv) in the desired location.
    abc.csv
    1,one,first
    2,two,second
    3,three,third
    4,four,fourth
    b.     Create a directory object pointing at the path where the source file is kept:
    create or replace directory ext_dir as '/home/rene/ext_dir';
    c.     After granting appropriate permissions to the user, we can create the external table like below:
    create table ext_table_csv (
    i Number,
    n Varchar2(20),
    m Varchar2(20)
    )
    organization external (
    type oracle_loader
    default directory ext_dir
    access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
    )
    location ('abc.csv')
    )
    reject limit unlimited;
    d.     Verify the data by selecting it from the external table:
    select * from ext_table_csv;
    The external tables feature is a complement to the existing SQL*Loader functionality.
    It allows you to –
    •     Access data in external sources as if it were in a table in the database.
    •     Merge a flat file with an existing table in one statement.
    •     Sort a flat file on the way into a table you want compressed nicely.
    •     Do a parallel direct path load without splitting up the input file.
    Shortcomings:
    •     External tables are read-only.
    •     No data manipulation language (DML) operations or index creation is allowed on an external table.
    Because an external table is read with ordinary SQL, you can also –
    •     Load the data from a stored procedure or trigger (a plain INSERT, which SQL*Loader cannot do).
    •     Do multi-table inserts.
    •     Flow the data through a pipelined PL/SQL function for cleansing/transformation.
    Comparison for data loading
    To make the loading operation faster, the degree of parallelism can be set to any number, e.g. 4.
    With that setting on the external table, the database divides the file among four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize the same load using SQL*Loader, you would have had to manually divide your input file into multiple smaller files.
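    For example, a sketch of enabling that parallelism on the external table from step 2 (the degree of 4 matches the example above):

    ALTER TABLE ext_table_csv PARALLEL 4;

    -- For a parallel direct-path insert, parallel DML must also be enabled
    ALTER SESSION ENABLE PARALLEL DML;
    INSERT /*+ APPEND PARALLEL(oracle_table, 4) */ INTO oracle_table
    SELECT * FROM ext_table_csv;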
    Conclusion:
    SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables into ordinary Oracle tables (even on another database, over DB links).

    Please let me know your views on this.

  • How to put a SQL Agent Job wait in the trigger while the job is running.

    Hello!
    I am not a geek at writing T-SQL code, so I am seeking the forum's help in completing my task.
    I have a trigger which fires upon an action, and within that code I am starting a job via T-SQL like:
    EXEC msdb.dbo.sp_start_job N'JobName';
    Now I want to add a condition: if the trigger is invoked while the job is running, it should wait until the successful completion of the job.
    Thanks for your time and help on this.

    You can use CONTEXT_INFO together with WAITFOR DELAY.
    Below are some useful links:
    http://www.sqlservercentral.com/articles/T-SQL/2765/
    http://technet.microsoft.com/en-us/library/ms180125(v=sql.110).aspx
    http://ask.sqlservercentral.com/questions/42786/how-to-avoid-a-stored-procedure-to-be-executed-par.html
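    A minimal polling sketch of that idea (the job name is from your post; the 10-second delay is an arbitrary choice). Bear in mind that waiting inside a trigger holds the calling transaction open, so keep the wait short:

    DECLARE @job_id uniqueidentifier;

    SELECT @job_id = job_id
    FROM   msdb.dbo.sysjobs
    WHERE  name = N'JobName';

    -- Wait until no unfinished execution of the job remains, then start it again
    WHILE EXISTS (SELECT 1
                  FROM   msdb.dbo.sysjobactivity
                  WHERE  job_id = @job_id
                         AND start_execution_date IS NOT NULL
                         AND stop_execution_date IS NULL)
    BEGIN
        WAITFOR DELAY '00:00:10';
    END;

    EXEC msdb.dbo.sp_start_job @job_name = N'JobName';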
    Thanks,
    Saurabh 
    http://www.linkedin.com/in/sbhadauria http://www.experts-exchange.com/M_6313078.html

  • Unable to bulk copy data. - Random failure, running as SQL Agent with Admin rights and timeouts=0

    Hello,
    My setup is SQL Server 2012 (11.0.5522) and SSIS, still running the packages created in 2008 R2. The server is Windows Server 2008 R2 with 32 GB of memory.
    I am running a control package which calls 4 packages at once, to run simultaneously for performance reasons. It runs every day without issues, but maybe once a month I get the failure:
    'Unable to bulk copy data. You may need to run this package as an administrator.'
    The SQL Agent is set up as admin, it has the right to create global objects, the source and destination databases are on the same server, and the timeout is set to zero. Each package uses the standard batch size and takes about 3-4 minutes to complete.
    It is not easily reproducible, and it always runs fine when I restart the package.
    Any help would be appreciated.
    Kind regards, Graham.

    see
    https://popbi.wordpress.com/2012/09/03/ssis-unable-to-bulk-copy-data-you-may-need-to-run-this-package-as-administrator/
    https://support.microsoft.com/kb/2216489?wa=wsignin1.0
    Visakh

  • SQL Agent Job failing - not using credentials in the config file for Data source

    Hi
    We have an SSIS pkg that is scheduled as a SQL Agent job using a proxy account. The pkg contains data sources for connecting to different SQL Servers, and the proxy account does not have access to the external DBs. The data source credentials are stored in the config file.
    Why is the job not using the credentials in the config file, and why does it try to use the proxy account and fail?
    Does the proxy account need access to all the external DBs in the pkg? And then what is the purpose of the config file?
    I am sorry, I am not an SSIS person, just trying to understand. If anyone can explain, that will be great!!
    Thank you!
    VR

    Please take a look at these URLs:
    Schedule a Package by using SQL Server Agent
    SSIS package does not run when called from a SQL Server Agent job step
    Cheers,
    Saeid Hasani
    Database Consultant

  • Clear data in sql report region on page load....

    I have a SQL report region. When I run the page, the data gets populated on the screen because of the SQL query. On page load I don't want the data populated on the screen.
    I have created a page process to clear the page cache, but the data is not cleared from the screen.
    How can I resolve this?
    Thanks & regards,
    Skud.

    Skud,
    you can create a condition in the report region query, e.g. WHERE 1 = :PXX_ITEM, and populate that item on click and then refresh the report.
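    A minimal sketch of that idea (the item name PXX_SHOW is hypothetical):

    -- The region returns no rows until the page item is set, e.g. by a button
    select col1, col2
    from   my_table
    where  :PXX_SHOW = 'Y'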
    Bt,
    Marko

  • Sql loader error in offline data load

    Hi,
    I have done an offline schema creation using an existing tablespace.
    I am trying to do an offline data load using SQL*Loader. The CTL and DAT files were generated by the workbench.
    This is the CTL file code generated by the workbench:
    load data
    infile 'Import.dat' "str '<EORD>'"
    into table IMPORT
    fields terminated by '<EOFD>'
    trailing nullcols
    When I run the CTL file with the DAT file in SQL*Loader, I get the following error:
    SQL*Loader-350: Syntax error at line 4.
    Expecting single char, found "<EOFD>".
    fields terminated by '<EOFD>'
    ^
    My SQL*Loader version is release 8.0.6.3.0.
    Please help if anyone has come across this issue.
    Thanks in advance.
    Regards
    Saravanan.B

    Saravanan,
    It's a long time since I have seen the version 8 SQL*Loader. Check the docs. Is it restricted to a single-character delimiter?
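    If it is, one workaround sketch: pre-process the file to turn the multi-character markers into single characters, then terminate on those (the '|' delimiter here is an arbitrary choice):

    -- e.g. after replacing <EOFD> with '|' and <EORD> with a newline:
    load data
    infile 'Import.dat'
    into table IMPORT
    fields terminated by '|'
    trailing nullcols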
    Barry
