How to check unused admin scripts through a SQL query?

Can someone help me with a SQL query that would return all admin scripts (not the routing scripts) that have gone unused for the past few months?
Thanks in advance for your help and concern!

You can use the queries below.
Step 1:
Query to pull the scripts used between 17/12/2011 and 18/01/2012:
select MasterScriptID, EnterpriseName from Master_Script where MasterScriptID in (
    select MasterScriptID from Script where ScriptID in (
        select distinct ScriptID from Route_Call_Detail
        where DateTime between '2011-12-17 00:00:00' and '2012-01-18 23:59:59'))
Step 2:
Query to pull all the currently configured scripts:
select MasterScriptID,EnterpriseName from Master_Script
Step 3:
Once we have both result sets, we can use Excel's VLOOKUP functionality to find the unused scripts.
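Alternatively, the three steps can be collapsed into a single query so the unused scripts come straight from the database. This is only a sketch against the same ICM tables used above; note that NOT IN returns no rows if the subquery ever yields a NULL MasterScriptID:

```sql
-- Master scripts with no routing activity in the window = unused admin scripts
select MasterScriptID, EnterpriseName
from Master_Script
where MasterScriptID not in (
    select s.MasterScriptID
    from Script s
    where s.ScriptID in (
        select distinct ScriptID
        from Route_Call_Detail
        where DateTime between '2011-12-17 00:00:00' and '2012-01-18 23:59:59'))
```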
Regards,
Senthil

Similar Messages

  • Issue in creation of group in oim database through sql query.

hi guys,
i am trying to create a group in the OIM database through a SQL query:
insert into ugp(ugp_key,ugp_name,ugp_create,ugp_update,ugp_createby,ugp_updateby) values(786,'dbrole','09-jul-12','09-jul-12',1,1);
It inserts the group into the ugp table, but the group does not show up in the admin console.
After that I also tried this query:
insert into gpp(ugp_key,gpp_ugp_key,gpp_write,gpp_delete,gpp_create,gpp_createby,gpp_update,gpp_updateby) values(786,1,1,1,'09-jul-12',1,'09-jul-12',1);
but still no luck.
I also tried to assign a user to the group:
insert into usg(ugp_key,usr_key,usg_priority,usg_create,usg_update,usg_createby,usg_updateby) values(4,81,1,'09-jul-12','09-jul-12',1,1);
But still the same problem: the row is inserted in the database, but nothing is listed in the admin console.
thanks,
hanuman.

    Hanuman Thota wrote:
    hi vladimir,
i didn't find this 'ugp_seq'. Is it a table or a column? Where is it?
    It is a sequence.
    See here for details on oracle sequences:
    http://www.techonthenet.com/oracle/sequences.php
    Most of the OIM database schema is created with the following script, located in the RCU distribution:
    $RCU_HOME/rcu/integration/oim/sql/xell.sql
    there you'll find plenty of sequence creation directives like:
    create sequence UGP_SEQ
    increment by 1
    start with 1
    cache 20
    to create a sequence, and
    INSERT INTO UGP (UGP_KEY, UGP_NAME, UGP_UPDATEBY, UGP_UPDATE, UGP_CREATEBY, UGP_CREATE,UGP_ROWVER, UGP_DATA_LEVEL, UGP_ROLE_CATEGORY_KEY, UGP_ROLE_OWNER_KEY, UGP_DISPLAY_NAME, UGP_ROLENAME, UGP_DESCRIPTION, UGP_NAMESPACE)
    VALUES (ugp_seq.nextval,'SYSTEM ADMINISTRATORS', sysadmUsrKey , SYSDATE,sysadmUsrKey , SYSDATE, hextoraw('0000000000000000'), 1, roleCategoryKey, sysadmUsrKey, 'SYSTEM ADMINISTRATORS', 'SYSTEM ADMINISTRATORS', 'System Administrator role for OIM', 'Default');
    as a sequence usage example.
    Regards,
    Vladimir

  • Help needed in Exporting tables data through SQL query

    Hi All,
I need to write a shell script (ksh) to back up the data of some tables.
The table list is not static; the tables are selected through a dynamic SQL query.
Can anybody tell me how to write the export command to export tables
which are selected dynamically through a SQL query?
I tried this:
exp ------ tables = query \" select empno from emp where ename\= \'SSS\' \"
but it throws the following error:
EXP-00035: QUERY parameter valid only for table mode exports
Thanks in advance,

    Hi,
You can dynamically generate the parameter file for the export utility using a shell script. This parameter file can contain whatever table list you want each time. Then simply run the command
    $ exp parfile=myfile.txt
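A minimal sketch of such a script follows. The `get_tables` function, file paths, and credentials are placeholders; in practice `get_tables` would run `sqlplus -s` with your dynamic query and emit a comma-separated table list:

```shell
#!/bin/ksh
# Sketch only: generate an export parameter file from a dynamic table list,
# then hand it to exp. get_tables is a placeholder for whatever dynamic SQL
# actually produces your table list.
get_tables() {
    # In real use this would be something like:
    #   sqlplus -s user/pass @pick_tables.sql
    echo "EMP,DEPT,BONUS"
}

PARFILE=/tmp/exp_tables.par
TABLES=$(get_tables)

# Write the parameter file that exp will consume.
cat > "$PARFILE" <<EOF
tables=$TABLES
file=/tmp/tables.dmp
log=/tmp/tables.log
EOF

echo "wrote $PARFILE:"
cat "$PARFILE"
# Finally (not run here): exp user/pass parfile=$PARFILE
```

Because the table list goes into TABLES mode (not QUERY mode), this avoids the EXP-00035 error above.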

Row values need to be changed to columns through a SQL query

    HI ALL
    I have a table like below
    ID Month VALUES
    1 01-jan 10
    1 01-feb 20
    2 01-jan 10
    2 01-feb 20
    I need the output like below
    ID 01-jan 01-feb
    1 10 20
    2 10 20
How can I get this through a SQL query? Please help me; I have urgent work depending on it.

In effect, because you want to take X rows and squish them down to 1 row per id, you are grouping on the ID, so typically a GROUP BY clause is the best way to do this.
If you really wanted to do it without aggregate functions and a GROUP BY clause, you would be looking for something like this...
    SQL> ed
    Wrote file afiedt.buf
      1  WITH t as (select 1 as id, '01-jan' as month, 10 as val from dual union all
      2             select 1, '01-feb', 20 from dual union all
      3             select 1, '01-mar', 30 from dual union all
      4             select 2, '01-jan', 10 from dual union all
      5             select 2, '01-feb', 30 from dual union all
      6             select 2, '01-mar', 60 from dual)
      7  --
      8  select id, jan, feb, mar
      9  from (
    10    select id
    11          ,row_number() over (partition by id order by to_date(month,'dd-mon')) as rn
    12          ,val as jan
    13          ,lead(val) over (partition by id order by to_date(month,'dd-mon')) as feb
    14          ,lead(val,2) over (partition by id order by to_date(month,'dd-mon')) as mar
    15    from t
    16    )
    17* where rn = 1
    SQL> /
            ID        JAN        FEB        MAR
             1         10         20         30
             2         10         30         60
SQL>
Although this will only work if you can guarantee that there is a '01-jan' value for each id. If there could be missing values then you'll have to use aggregate functionality.
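For completeness, the aggregate approach mentioned above, which copes with missing month values, would look something like this sketch (reusing the same sample data; a missing month simply comes back as NULL):

```sql
WITH t as (select 1 as id, '01-jan' as month, 10 as val from dual union all
           select 1, '01-feb', 20 from dual union all
           select 2, '01-jan', 10 from dual union all
           select 2, '01-feb', 20 from dual)
select id,
       max(case when month = '01-jan' then val end) as "01-jan",
       max(case when month = '01-feb' then val end) as "01-feb"
from t
group by id
order by id
```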

  • Term is not recognized when executing PowerShell Script through SQL Agent using CMDEXEC

I am trying to execute a PowerShell script, stored in a file on a network drive, through SQL Agent as a job. The script is a basic copy from one directory to another. I have run it successfully from a command prompt outside
of SQL Agent. When I execute it through SQL Agent as an Operating System (CmdExec) step, I get an error that the term is not recognized as the name of a cmdlet, function, script file or operable program. I have tried executing it many different ways,
and found an article suggesting double quotes in the network path, which I tried with no success.
I am executing the following command as a job in SQL Agent:
    PowerShell H:\"\PowerShell""\PS_Scripts\"\batchcopyFFLWithProgress.ps1 through SQL Agent job
    I get the following error message:
     04/21/2015 10:01:09,Copy FFL Files,Error,1,NY11266-LTW7E\JPLAPTOPSQL,Copy FFL Files,Copy FFL files,,Executed as user: NT Service\SQLAgent$JPLAPTOPSQL. H:\PowerShell\PS_Scripts\batchcopyFFLWithProgress.ps1 : The term   'H:\PowerShell\PS_Scripts\batchcopyFFLWithProgress.ps1'
    is not recognized as   the name of a cmdlet<c/> function<c/> script file<c/> or operable program. Check the   spelling of the name<c/> or if a path was included<c/> verify that the path is   correct and try again.
     At line:1 char:1  + H:"\PowerShell\PS_Scripts"\batchcopyFFLWithProgress.ps1  + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~      + CategoryInfo          : ObjectNotFound:
    (H:\PowerShell\P...ithProgress.p      s1:String) []<c/> CommandNotFoundException      + FullyQualifiedErrorId : CommandNotFoundException.  Process Exit Code 1.  The step failed.,00:00:03,0,0,,,,0
    Content of batchcopyFFLWithProgress.ps1 which has the PowerShell script:
    $source=ls H:\SQLTest\Script\TestData\*.*
    $i=1
    $source| %{
        [int]$percent = $i / $source.count * 100
        Write-Progress -Activity "Copying ... ($percent %)" -status $_ -PercentComplete $percent -verbose
        copy $_.fullName -Destination H:\test -Recurse
    $i++
}
    I have searched the internet and have not found any resolution to my error.  If someone has experienced this error and found the resolution I would greatly appreciate your help.

I have changed the service account for SQL Agent to my domain account, as I have local admin rights on my laptop. I stopped and started the SQL Agent service and then started the job, which copies locally to minimize any network drive
issues. I am still getting the same error message, even though it shows the job executing under my domain account. Any thoughts on what it could be?
    ErrorMsg
    04/23/2015 11:21:06,Copy FFL Files,Error,1,ServerName\InstanceName,Copy FFL Files,Copy FFL files,,Executed as user: Domain\DomainAccount. \\ServerName\Test\PS_Script\batchcopyFFLWithProgress.ps1 : The term '\\ServerName\Test\PS_Script\batchcopyFFLWithProgress.ps1'
    is   not recognized as the name of a cmdlet<c/> function<c/> script file<c/> or operable program. Check the spelling of the name<c/> or if a path was   included<c/> verify that the path is correct and try again.  At
    line:1 char:1  + \\ServerName\Test\PS_Script\batchcopyFFLWithProgress.ps1  + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~      + CategoryInfo          : ObjectNotFound: (\\ServerName...ithProgress.ps1:String)
    []<c/> CommandNotFoundException      + FullyQualifiedErrorId : CommandNotFoundException.  Process Exit Code 1.  The step failed.,00:00:02,0,0,,,,0
    Script
    $source=ls "\\ServerName\Test\TestData\*.*"
    $i=1
    $source| %{
        [int]$percent = $i / $source.count * 100
        Write-Progress -Activity "Copying ... ($percent %)" -status $_ -PercentComplete $percent -verbose
        copy $_.fullName -Destination "\\ServerName\Test\test" -Recurse
    $i++
}
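One thing worth checking in the CmdExec step itself: drive letters such as H: mapped in an interactive session are not visible to a service account, so the script path must be a UNC path, and the usual invocation passes it to powershell.exe through -File with the whole path quoted once (the path below is illustrative):

```
powershell.exe -ExecutionPolicy Bypass -File "\\ServerName\Test\PS_Script\batchcopyFFLWithProgress.ps1"
```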

  • How to check when admin or some other account has been logging in to the server

    Hi
How can I check when the admin or some other particular account has logged in to the server, so that I can check login times for users?

    Hi,
You can try to edit the XML filter to select events by logon type.
In addition, explicit credentials mean that we enter our user name and password at a computer through the keyboard (or other, more advanced means). After that, the Windows system packs our credentials into tickets for further
purposes, such as accessing network resources; the tickets are implicit credentials.
    More information for you:
    Advanced XML filtering in the Windows Event Viewer
    http://blogs.technet.com/b/askds/archive/2011/09/26/advanced-xml-filtering-in-the-windows-event-viewer.aspx
    Best Regards,
    Amy

  • How to check unusable index

    Hi all
I am getting the error
Index ORVETL.NU_1_761 or some [sub]partitions of the index have been marked unusable
How do I check which indexes are unusable (partitioned, non-partitioned, all)?
Please help me.
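In case it helps, unusable indexes can be listed straight from the data dictionary; a sketch (run as a privileged user) covering non-partitioned indexes, partitions, and subpartitions:

```sql
select owner, index_name, null as partition_name, status
from   dba_indexes            where status = 'UNUSABLE'
union all
select index_owner, index_name, partition_name, status
from   dba_ind_partitions     where status = 'UNUSABLE'
union all
select index_owner, index_name, subpartition_name, status
from   dba_ind_subpartitions  where status = 'UNUSABLE';
```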

I don't know which query the user was running; when I find out I will update.
Here is the alert log:
    ORACLE Instance IDEARADB - Can not allocate log, archival required
    Sun Jun 20 13:54:03 2010
    Thread 1 cannot allocate new log, sequence 44150
    All online logs needed archiving
    Current log# 2 seq# 44149 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 13:54:08 2010
    Thread 1 advanced to log sequence 44150 (LGWR switch)
    Current log# 1 seq# 44150 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 13:56:47 2010
    Thread 1 advanced to log sequence 44151 (LGWR switch)
    Current log# 3 seq# 44151 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 14:00:34 2010
    Thread 1 advanced to log sequence 44152 (LGWR switch)
    Current log# 2 seq# 44152 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 14:06:55 2010
    Thread 1 advanced to log sequence 44153 (LGWR switch)
    Current log# 1 seq# 44153 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 14:09:31 2010
    Thread 1 advanced to log sequence 44154 (LGWR switch)
    Current log# 3 seq# 44154 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 14:12:07 2010
    Thread 1 advanced to log sequence 44155 (LGWR switch)
    Current log# 2 seq# 44155 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 14:14:30 2010
    Thread 1 advanced to log sequence 44156 (LGWR switch)
    Current log# 1 seq# 44156 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 14:17:09 2010
    Thread 1 advanced to log sequence 44157 (LGWR switch)
    Current log# 3 seq# 44157 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 14:19:42 2010
    Thread 1 advanced to log sequence 44158 (LGWR switch)
    Current log# 2 seq# 44158 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 14:22:19 2010
    Thread 1 advanced to log sequence 44159 (LGWR switch)
    Current log# 1 seq# 44159 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 14:24:45 2010
    Thread 1 advanced to log sequence 44160 (LGWR switch)
    Current log# 3 seq# 44160 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 14:27:15 2010
    Thread 1 advanced to log sequence 44161 (LGWR switch)
    Current log# 2 seq# 44161 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 14:29:45 2010
    Thread 1 advanced to log sequence 44162 (LGWR switch)
    Current log# 1 seq# 44162 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 14:32:21 2010
    Thread 1 advanced to log sequence 44163 (LGWR switch)
    Current log# 3 seq# 44163 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 14:34:58 2010
    Thread 1 advanced to log sequence 44164 (LGWR switch)
    Current log# 2 seq# 44164 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 14:37:35 2010
    Thread 1 advanced to log sequence 44165 (LGWR switch)
    Current log# 1 seq# 44165 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 14:40:08 2010
    Thread 1 advanced to log sequence 44166 (LGWR switch)
    Current log# 3 seq# 44166 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 14:43:52 2010
    Thread 1 advanced to log sequence 44167 (LGWR switch)
    Current log# 2 seq# 44167 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 14:50:46 2010
    Thread 1 advanced to log sequence 44168 (LGWR switch)
    Current log# 1 seq# 44168 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 14:51:47 2010
    Thread 1 advanced to log sequence 44169 (LGWR switch)
    Current log# 3 seq# 44169 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 14:53:05 2010
    Thread 1 advanced to log sequence 44170 (LGWR switch)
    Current log# 2 seq# 44170 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 14:56:59 2010
    Thread 1 advanced to log sequence 44171 (LGWR switch)
    Current log# 1 seq# 44171 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 15:07:42 2010
    Thread 1 advanced to log sequence 44172 (LGWR switch)
    Current log# 3 seq# 44172 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 15:17:31 2010
    Thread 1 advanced to log sequence 44173 (LGWR switch)
    Current log# 2 seq# 44173 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 15:24:32 2010
    Thread 1 advanced to log sequence 44174 (LGWR switch)
    Current log# 1 seq# 44174 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 15:32:49 2010
    Thread 1 advanced to log sequence 44175 (LGWR switch)
    Current log# 3 seq# 44175 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 15:41:28 2010
    Thread 1 advanced to log sequence 44176 (LGWR switch)
    Current log# 2 seq# 44176 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 15:45:28 2010
    Index ORVETL.NU_1_761 or some [sub]partitions of the index have been marked unusable
    Sun Jun 20 15:45:49 2010
    Index ORVETL.NU_2_761 or some [sub]partitions of the index have been marked unusable
    Sun Jun 20 15:48:24 2010
    Index ORVETL.NU_1_762 or some [sub]partitions of the index have been marked unusable
    Sun Jun 20 15:49:03 2010
    Index ORVETL.NU_2_762 or some [sub]partitions of the index have been marked unusable
    Sun Jun 20 15:51:11 2010
    Thread 1 advanced to log sequence 44177 (LGWR switch)
    Current log# 1 seq# 44177 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 16:01:10 2010
    Thread 1 advanced to log sequence 44178 (LGWR switch)
    Current log# 3 seq# 44178 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 16:06:20 2010
    Index ORVETL.NU_1_751 or some [sub]partitions of the index have been marked unusable
    Sun Jun 20 16:11:04 2010
    Thread 1 advanced to log sequence 44179 (LGWR switch)
    Current log# 2 seq# 44179 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 16:16:40 2010
    Index ORVETL.NU_1_753 or some [sub]partitions of the index have been marked unusable
    Sun Jun 20 16:17:54 2010
    Index ORVETL.NU_2_753 or some [sub]partitions of the index have been marked unusable
    Sun Jun 20 16:20:28 2010
    Thread 1 advanced to log sequence 44180 (LGWR switch)
    Current log# 1 seq# 44180 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 16:30:11 2010
    Thread 1 advanced to log sequence 44181 (LGWR switch)
    Current log# 3 seq# 44181 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 16:38:45 2010
    Thread 1 advanced to log sequence 44182 (LGWR switch)
    Current log# 2 seq# 44182 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 16:40:40 2010
    Memory Notification: Library Cache Object loaded into SGA
    Heap size 54067K exceeds notification threshold (51200K)
    Sun Jun 20 16:41:08 2010
    Memory Notification: Library Cache Object loaded into SGA
    Heap size 54016K exceeds notification threshold (51200K)
    Details in trace file /oracle/oracle/Oracle_10gr2_DB/admin/IDEARADB/udump/idearadb_ora_1081756.trc
    Sun Jun 20 16:41:17 2010
    Thread 1 advanced to log sequence 44183 (LGWR switch)
    Current log# 1 seq# 44183 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 16:41:32 2010
    Memory Notification: Library Cache Object loaded into SGA
    Heap size 54066K exceeds notification threshold (51200K)
    Details in trace file /oracle/oracle/Oracle_10gr2_DB/admin/IDEARADB/udump/idearadb_ora_1081756.trc
    Sun Jun 20 16:41:56 2010
    Memory Notification: Library Cache Object loaded into SGA
    Heap size 54015K exceeds notification threshold (51200K)
    Details in trace file /oracle/oracle/Oracle_10gr2_DB/admin/IDEARADB/udump/idearadb_ora_1081756.trc
    Sun Jun 20 16:49:45 2010
    Thread 1 advanced to log sequence 44184 (LGWR switch)
    Current log# 3 seq# 44184 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 16:58:01 2010
    Thread 1 advanced to log sequence 44185 (LGWR switch)
    Current log# 2 seq# 44185 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 17:00:15 2010
    Thread 1 advanced to log sequence 44186 (LGWR switch)
    Current log# 1 seq# 44186 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 17:02:37 2010
    Thread 1 advanced to log sequence 44187 (LGWR switch)
    Current log# 3 seq# 44187 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 17:05:13 2010
    Thread 1 advanced to log sequence 44188 (LGWR switch)
    Current log# 2 seq# 44188 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 17:07:37 2010
    Thread 1 advanced to log sequence 44189 (LGWR switch)
    Current log# 1 seq# 44189 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 17:13:36 2010
    Thread 1 advanced to log sequence 44190 (LGWR switch)
    Current log# 3 seq# 44190 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 17:19:16 2010
    Thread 1 advanced to log sequence 44191 (LGWR switch)
    Current log# 2 seq# 44191 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 17:25:15 2010
    Thread 1 advanced to log sequence 44192 (LGWR switch)
    Current log# 1 seq# 44192 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 17:32:18 2010
    Thread 1 advanced to log sequence 44193 (LGWR switch)
    Current log# 3 seq# 44193 mem# 0: +REDO_LOG/redo03.log
    Sun Jun 20 17:37:41 2010
    Memory Notification: Library Cache Object loaded into SGA
    Heap size 53686K exceeds notification threshold (51200K)
    Sun Jun 20 17:38:02 2010
    Memory Notification: Library Cache Object loaded into SGA
    Heap size 53898K exceeds notification threshold (51200K)
    Details in trace file /oracle/oracle/Oracle_10gr2_DB/admin/IDEARADB/udump/idearadb_ora_1241490.trc
    Sun Jun 20 17:38:21 2010
    Memory Notification: Library Cache Object loaded into SGA
    Heap size 54025K exceeds notification threshold (51200K)
    Details in trace file /oracle/oracle/Oracle_10gr2_DB/admin/IDEARADB/udump/idearadb_ora_1241490.trc
    Sun Jun 20 17:38:40 2010
    Memory Notification: Library Cache Object loaded into SGA
    Heap size 54012K exceeds notification threshold (51200K)
    Details in trace file /oracle/oracle/Oracle_10gr2_DB/admin/IDEARADB/udump/idearadb_ora_1241490.trc
    Sun Jun 20 17:39:21 2010
    Thread 1 advanced to log sequence 44194 (LGWR switch)
    Current log# 2 seq# 44194 mem# 0: +REDO_LOG/redo02.log
    Sun Jun 20 17:45:28 2010
    Thread 1 advanced to log sequence 44195 (LGWR switch)
    Current log# 1 seq# 44195 mem# 0: +REDO_LOG/redo01.log
    Sun Jun 20 17:51:21 2010

  • Check data before loading through SQL *Loader

    Hi all,
    I have a temp table which is loaded through SQL*Loader.This table is used by a procedure for inserting data into another table.
I get error ORA-01722 frequently during the procedure's execution.
    I have decided to check for the error data through the control file itself.
    I have few doubts about SQL Loader.
    Will a record containing character data for column declared as INTEGER EXTERNAL in ctrl file get discarded?
Does declaring a column as INTEGER EXTERNAL take care of NULL values?
    Does a whole record gets discarded if one of the column data is misplaced in the record in input file?
    Control File is of following format:
LOAD DATA
APPEND INTO TABLE Temp
FIELDS TERMINATED BY "|" optionally enclosed by "'"
trailing nullcols
(
FILEDATE DATE 'DD/MM/YYYY',
ACC_NUM INTEGER EXTERNAL,
REC_TYPE,
LOGO,            -- Data: Numeric, Declared: VARCHAR
CARD_NUM INTEGER EXTERNAL,
ACTION_DATE DATE 'DD/MM/YYYY',
EFFECTIVE_DATE DATE 'DD/MM/YYYY',
ACTION_AMOUNT,   -- Data: Numeric, Declared: NUMBER
ACTION_STORE,    -- Data: Numeric, Declared: VARCHAR
ACTION_AUTH_NUM,
ACTION_SKU_NUM,
ACTION_CASE_NUM
)
    What changes do I need to make in this file regarding above questions?

Is there any online document for this?
    Here it is

  • How to pass a variable for a SQL query in OLEDB source?

Hi All,
I am new to SSIS and have been working with it for the past few days. Can anyone please help me with a scenario where I need to pass a variable into the SQL statement of an OLE DB source connection? Please find the details below.
    eg:
    1) I have a SQL table with the columns SerialNumber, Name, IsValid, FileName with multiple rows.
    2) I have the file Name in a variable called Variable1.
    3) I want to read the data from my SQL table filtering based on the FileName (Variable1) within a data flow task and pull that data to the destination table.
Question: In the data flow task, I added source and destination DB connections with a script component in between to perform my validations. When trying to retrieve the data from the source using the variable (i.e. a SQL query with a variable), I am not able to add
the query because the SQL statement box is disabled. How do I filter the data based on the variable in the source DB?
    Any help/suggestions would be of great help.
    Thanks,
    Sri

Just to add to Vaibhav's comment:
SQL Command: a SQL query, either using a SQL variable, with a condition, or as a simple SQL statement,
like:
Select * from dimcustomer
SQL Command from Variable:
Sometimes we build our dynamic query in a variable and use that variable name directly in the OLE DB source.
If your SQL query needs a condition based on an SSIS variable,
you can find examples here:
    http://www.toadworld.com/platforms/sql-server/b/weblog/archive/2013/01/17/ssis-replace-dynamic-sql-with-variables.aspx
    http://www.select-sql.com/mssql/how-to-use-a-variable-inside-sql-in-ssis-data-flow-tasks.html
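As a concrete sketch of the parameterised form (the table name here is illustrative; the column and variable names come from the question above): in the OLE DB Source, set Data access mode to "SQL command", use a ? placeholder, and map it to User::Variable1 via the Parameters... button:

```sql
SELECT SerialNumber, Name, IsValid, FileName
FROM   dbo.MyTable       -- table name is illustrative
WHERE  FileName = ?      -- ? is mapped to User::Variable1
```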
    Thanks
    Please Mark This As Answer or vote for Helpful Post if this helps you to solve your question/problem. http://techequation.com

  • Checking correct data format using sql query

    1) I got column date of joining which accepts date in below format
    DD-MON-YYYY
    DD-MON-YY
    MON-DD-YYYY
    MON-DD-YY
    Month DD,YYYY
Question: how do I check, using a SQL query, whether all dates in the date-of-joining column are in one of the above formats?
    2) I got one more date column which accepts date in below format
    MMDDYYYY
    YYYYMMDD
    MM/DD/YYYY
    MM/DD/YY
    YYYY/DD/MM
Question: how do I check, using a SQL query, whether all dates in the date column are in one of the above formats?
Sorry if this is a very simple question; I am new to SQL and trying to learn. Thanks for the answers from the group.

    In short, NO, it's not possible.  If you store dates correctly in the database as DATE datatype then you don't have this problem.  If you store them as VARCHAR2 you have a problem.
    So, you get a date of 20092012
    Is that 20th September 2012?  or is it 20th December 2009?
    What about...
    11-Jan-12
    Is that 11th January 2012 or 12th January 2011?
    Dates should never be stored on the database as strings.  That is why Oracle gives you a DATE datatype so you can store them properly.
Also, when dates are passed from an application to the database, the application should pass them as DATE datatype. The application interface should be designed to accept dates from the user in a format specific to their country/locality; it then knows what that format is and can automatically convert it to a DATE datatype before it gets anywhere near the database or any SQL.

  • Determining the parameters passed in a Discoverer Report through SQL query

    Hi,
    I want to know the parameters passed in a Discoverer Report through a SQL query.
    i.e if we pass the Report name (Workbook Name) then we get the paramaters used .
    Is there any way we can do this.
    Any help will be really appreciated.
    Thanx in advance
    Ankur

    Hi
    You can indeed get the parameters from the EUL5_QPP_STATS table, although they are extremely difficult to get at.
    Look at this script:
    SELECT
    QS.QS_DOC_OWNER    USER_NAME,
    QS.QS_DOC_NAME     WORKBOOK,
    QS.QS_DOC_DETAILS  WORKSHEET,
    TRUNC(QS.QS_CREATED_DATE) DOC_DATE,
(LENGTH(TO_CHAR(EUL5_GET_ITEM_NAME(QS.QS_ID)))+1)/9 ITEMS,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),1,  6)) ITEM1,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),10, 6)) ITEM2,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),19, 6)) ITEM3,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),28, 6)) ITEM4,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),37, 6)) ITEM5,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),46, 6)) ITEM6,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),55, 6)) ITEM7,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),64, 6)) ITEM8,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),73, 6)) ITEM9,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),82, 6)) ITEM10,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),91, 6)) ITEM11,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),100,6)) ITEM12,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),109,6)) ITEM13,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),118,6)) ITEM14,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),127,6)) ITEM15,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),136,6)) ITEM16,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),145,6)) ITEM17,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),154,6)) ITEM18,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),163,6)) ITEM19,
    EUL5_GET_ITEM(SUBSTR(EUL5_GET_ITEM_NAME(QS.QS_ID),172,6)) ITEM20
    FROM
    EUL5_QPP_STATS QS--,
    --   APPS.FND_USER          USR
    WHERE
    --   QS.QS_DOC_OWNER = '#' || USR.USER_ID AND
(LENGTH(TO_CHAR(EUL5_GET_ITEM_NAME(QS.QS_ID)))+1)/9 < 21
    AND QS.QS_CREATED_DATE > '01-JAN-2007'
What this does is return the first 20 items used in a worksheet. It does this by passing 6 characters at a time out of a cursor made up by concatenating QS_DBMP0 to QS_DBMP7 to get the Dimensions, and then again by concatenating QS_MBMP0 to QS_MBMP7 to get the Measures. Having got that cursor, it takes each 6 characters and passes them to a nibble algorithm to decode the actual item. The code is extremely difficult to follow.
    I mention this because other fields in the same table are QS_JBMP0 to QS_JBMP7 which I believe are Joins, and QS_FBMP0 to QS_FBMP7 which look like Filters (aka parameters) being used. I think the QS stands for Query Statistics and BMP for bitmap. Somewhere in the EUL5.SQL script is the key to unlocking this.
    Good luck. The reason I say this will become apparent when you look inside EUL5.SQL.
Best wishes
Michael

  • Possibility to show opening balance of a GL account through SQL query?

    Hi experts,
    Is it possible to run an SQL query in SBO that shows the opening balance
    of a GL account based on a date entered by the user?
    This is similar to how the general ledger shows the opening balance of a GL account based on the posting dates entered.
    Thanks for your ideas.

    Hi Gordon,
    Thanks for that idea.
    I am thinking of the following:
    1. Create a virtual table (#TEST) from a SELECT statement that gets all journal entries for the specified GL account BEFORE the specified posting date.
    2. Perform a SELECT statement that sums the debit and credit from (#TEST).
    This should show the opening balance of the GL account right?
    Regards,
    Eric
    Edited by: eceguerra on May 18, 2011 7:02 AM
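For SAP Business One, those two steps can be collapsed into one statement. This sketch assumes the journal entry rows live in JDT1 with Account, Debit, Credit, and RefDate columns (verify against your schema); the [%0]/[%1] tokens are SBO query-generator prompts:

```sql
SELECT SUM(T0.Debit) - SUM(T0.Credit) AS OpeningBalance
FROM   JDT1 T0
WHERE  T0.Account = '[%0]'   -- GL account code
  AND  T0.RefDate < '[%1]'   -- opening-balance date entered by the user
```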

  • SAP Crystal Report (2008) through SQL Query parameters

    Hi,
I created a report in Crystal Reports (2008) based on a SQL query. I created parameters in the SQL query prompt only (all single-valued).
Based on this I need to show the data in the detail section.
For the graph, I used another SQL query and added one subreport to the main report's header section. This subreport also uses the SQL query prompts to take its parameters.
Since both reports use the same set of parameters, I mapped the main report parameters to the subreport parameters.
So now the structure is like this:
in the report there is one subreport.
The main report's SQL query generates the parameter prompt, and the same parameters are passed to the subreport.
In the subreport there are SQL-generated prompts, and we mapped those prompts to the main report's parameter prompts.
Now it runs fine from Crystal Reports, but not from the CMC: it does not show the graph.
I made two experiments:
1. To check whether the subreport runs at all, I manually put my name in the subreport: working fine (CMC).
2. To check that the parameters are correctly mapped, I printed the values in the subreport: it shows all the parameter values from the CMC,
but it does not run the SQL query of that subreport.
Any suggestions? Is there any option to check, or a CMC setting?
Please suggest and respond.
Thank you in advance!

    Hi Rajeev,
This is the Crystal Reports development community. You said the report works fine in Crystal and the problem appears in the CMC, so the issue is on the BI server side.
I think you need to create the thread in the community below.
    BI Platform
    --Naga

  • How to Suppress The Output of a SQL Query In Oracle 11gR2

    Hi Friends,
I am using Oracle version 11.2.0.1. I have set up a cron job which runs every 15 minutes and gives us a log file recording the execution time taken by this SQL query:
    For example:
    SQL> set timing on;
    SQL> SELECT objProp FROM aradmin.arschema WHERE (schemaId = 175);
    OBJPROP+
    --------------------------------------------------------------------------------+
    *6\60006\4\0\\60008\40\0\60009\4\0\\60010\4\0\\60018\4\0\\600*
    *22\4\68\1\63\AR:jRL#*
    Elapsed: 00:00:00.00
The above query returns the output as well as the time taken to execute the query. I want to suppress the output of the query and print only the time taken. Is this possible with SET commands? I have marked the output as bold and made it italic.
Please help me at the earliest.
    Regards,
    Arijit

    >
    I am using oracle version 11.2.0.1, I have set a cronjob which will run on every 15 minutes and give us a log file mentioning the execution time taken for that SQL query:-
    The above query will return the output as well as the time taken for execution of the query. I want to suppress the output of the query and only want the time taken to be printed. Is it possible by set commands. I have marked the output as bold and made it Italic.
    >
    How would that even be useful?
A query from a tool such as SQL*Plus is STILL going to send the output to the client. You can keep SQL*Plus from actually displaying the data by setting autotrace to trace only.
But that TIME TAKEN will still include the network time it takes to send ALL the rows that the query returns across the network.
That time is NOT the same as the actual execution time of the query. So unless you are trying to determine how long it takes to send the data over the network, your 'timing' method is rather flawed.
    Why don't you tell us WHAT PROBLEM you are trying to solve so we can help you solve it?
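To make the trace-only suggestion concrete, in SQL*Plus it looks like this (AUTOTRACE TRACEONLY STATISTICS needs the PLUSTRACE role; the query is the one from the question):

```sql
set timing on
set autotrace traceonly statistics
-- rows are still fetched from the server, but not rendered by SQL*Plus
SELECT objProp FROM aradmin.arschema WHERE (schemaId = 175);
```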

  • How can I use the Rownum/Customized SQL query in a Mapping?

    Hi,
* I need to use ROWNUM to populate one of the target fields. How do I create a mapping with ROWNUM?
* How can I use the DUAL table in an OWB mapping?
* Can I write a customized SQL query in OWB? How can I achieve this in a mapping?
    Thanks in Advance
    Kishan

    Hi Niels,
    As I'm sure you know, the conundrum is that Reports doesn't know how many total pages there will be in the report until it is all done formatting, which is too late for your needs. So, one classical solution to this problem is to run the report twice, storing the total number of pages in the database using a format trigger, and throwing away the output from the first run when you don't know the total number of pages.
    Alternatively, you could define a report layout so that the number of pages in the output is completely predictable based upon, say, the number of rows in the main query. E.g., set a limit of one, two, ... rows per page, and then you'll know how many pages there will be simply because you can count the rows in a separate query.
    Hope this helps...
    regards,
    Stewart
