Need to test whether a column has unique values or not

Hi all,
in an ETL process I need to check, using SQL or PL/SQL, whether some columns have unique values or not.
Suppose we initially load a big data file through an external table and then have to test whether one or more columns contain only unique values before proceeding with the ETL process.
What is the fastest test I can run to verify that a column has unique values or not?
Which is better for ETL performance:
a. techniques regard constraints like described on Ask tom forum
"ENABLE NOVALIDATE validating existing data"
(Ask Tom "ENABLE NOVALIDATE validating existing da...")
b. "simply" query on the data?
like this:
select count(count(*)) distinct_count,
         sum(count(*)) total_count,
         sum(case when count(*) = 1 then 1 else null end) non_distinct_groups,
         sum(case when count(*) > 1 then 1 else null end) distinct_groups
from hr.employees a
group by A.JOB_ID
c. using analytic functions? (a sketch follows below)
d. using some feature directly on the external table?
Thanks in advance
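For option (c), a minimal analytic-function sketch on the same HR sample table used above (JOB_ID is deliberately non-unique there, so this returns a row; for a truly unique column it returns no rows):

-- sketch only, not a benchmark: the analytic COUNT flags every value that
-- occurs more than once; ROWNUM = 1 keeps the output to a single offending value
SELECT job_id, cnt
  FROM (SELECT job_id,
               COUNT(*) OVER (PARTITION BY job_id) AS cnt
          FROM hr.employees)
 WHERE cnt > 1
   AND ROWNUM = 1;

If no row comes back, the column is unique. Whether this beats the GROUP BY query or the constraint-based approach on a freshly loaded external table is something only a test on your own data volumes can answer.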

Here is an example of handling the errors using the LOG ERRORS INTO concept. Please check it and let me know if you have any doubts.
DATAFILE (unique_check.csv):
1000,ANN,ZZ105
1001,KARTHI,ZZ106
1002,PRAVEEN,ZZ109
1002,PARTHA,ZZ107
1003,SATHYA,ZZ108
1000,ANN,ZZ105
----- Original table with a unique constraint
SQL> CREATE TABLE tab_uniqtest(student_id   NUMBER(10) UNIQUE,
                               student_name VARCHAR2(15),
                               course_name  VARCHAR2(15)
                              );
Table created.
----- External table
SQL> CREATE TABLE tab_extuniqtest(student_id   NUMBER(10),
                                  student_name VARCHAR2(15),
                                  course_name  VARCHAR2(15)
                                 )
     ORGANIZATION EXTERNAL
     (
       DEFAULT DIRECTORY ann_dir
       ACCESS PARAMETERS
       (
         RECORDS DELIMITED BY NEWLINE
         BADFILE 'tabextuniqtest_badfile.txt'
         LOGFILE 'tabextuniqtest_logfile.txt'
         FIELDS TERMINATED BY ','
         MISSING FIELD VALUES ARE NULL
         REJECT ROWS WITH ALL NULL FIELDS
         (student_id, student_name, course_name)
       )
       LOCATION ('unique_check.csv')
     )
     REJECT LIMIT UNLIMITED;
Table created.
---- Error logging table to log the errors
SQL> CREATE TABLE dmlerrlog_uniqtest(ora_err_number$ NUMBER,
                                     ora_err_mesg$   VARCHAR2(2000),
                                     ora_err_rowid$  ROWID,
                                     ora_err_optyp$  VARCHAR2(2),
                                     ora_err_tag$    VARCHAR2(4000),
                                     inserted_dt     VARCHAR2(50) DEFAULT TO_CHAR(SYSDATE,'YYYY-MM-DD'),
                                     student_id      VARCHAR2(10)
                                    );
Table created.
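For reference, such an error logging table can also be generated with the standard DBMS_ERRLOG package, which creates the mandatory ORA_ERR_* columns plus VARCHAR2 copies of the target table's columns (the inserted_dt column above is a manual extra). A sketch, relying on the package's default ERR$_ naming:

BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'TAB_UNIQTEST');
END;
/

This creates ERR$_TAB_UNIQTEST, which would then be named in the LOG ERRORS INTO clause instead of dmlerrlog_uniqtest.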
---- Procedure to insert from external table
SQL> CREATE OR REPLACE PROCEDURE proc_uniqtest
     AS
       v_errcnt NUMBER;
     BEGIN
       -- Load from the external table; rows that violate the unique constraint
       -- are diverted to the error log instead of failing the whole statement.
       INSERT INTO tab_uniqtest
       SELECT * FROM tab_extuniqtest
       LOG ERRORS INTO dmlerrlog_uniqtest('PROC_UNIQTEST@TAB_UNIQTEST') REJECT LIMIT UNLIMITED;
       -- Error-log rows are written autonomously, so they are still there
       -- to be counted even after the rollback below.
       SELECT COUNT(1) INTO v_errcnt
         FROM dmlerrlog_uniqtest
        WHERE ora_err_tag$ = 'PROC_UNIQTEST@TAB_UNIQTEST';
       IF (v_errcnt > 0) THEN
         ROLLBACK;
       ELSE
         COMMIT;
       END IF;
       DBMS_OUTPUT.PUT_LINE('Procedure PROC_UNIQTEST is completed with ' || v_errcnt || ' errors');
     EXCEPTION
       WHEN OTHERS THEN
         RAISE;
     END proc_uniqtest;
     /
Procedure created.
SQL> SET SERVEROUTPUT ON
SQL> EXEC proc_uniqtest;
Procedure PROC_UNIQTEST is completed with 2 errors
PL/SQL procedure successfully completed.
SQL> SELECT student_id, ora_err_mesg$ FROM dmlerrlog_uniqtest;

STUDENT_ID   ORA_ERR_MESG$
----------   ----------------------------------------------------------
1002         ORA-00001: unique constraint (SCOTT.SYS_C0037530) violated
1000         ORA-00001: unique constraint (SCOTT.SYS_C0037530) violated
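As a comparison with option (a) from the original question: if the file is first loaded into a staging table without the constraint, the constraint machinery itself can report the duplicates. A sketch only, with a hypothetical staging table tab_stagetest; the EXCEPTIONS table comes from $ORACLE_HOME/rdbms/admin/utlexcpt.sql, and the non-unique index lets the NOVALIDATE constraint be created even though duplicates are already present:

ALTER TABLE tab_stagetest
  ADD CONSTRAINT uq_stage_student_id UNIQUE (student_id)
  USING INDEX (CREATE INDEX uq_stage_student_id_ix ON tab_stagetest (student_id))
  ENABLE NOVALIDATE;

ALTER TABLE tab_stagetest
  ENABLE VALIDATE CONSTRAINT uq_stage_student_id
  EXCEPTIONS INTO exceptions;

-- the rows that break uniqueness
SELECT t.*
  FROM tab_stagetest t
 WHERE t.rowid IN (SELECT row_id FROM exceptions);

The second ALTER fails with ORA-02299 when duplicates exist, and the EXCEPTIONS table then holds the ROWID of every offending row.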

Similar Messages

  • Error 1000: AFx Library library exception: Sql encountered an error: These columns don't currently have unique values.

    Hi everyone,
    Using the Writer block, configured with a correct SQLAzure instance, I experience this error:
    - Error 1000: AFx Library library exception: Sql encountered an error: These columns don't currently have unique values.
I have read that this error can be related to inheritance problems, but my table does not use inheritance of any kind:
    CREATE TABLE [dbo].[my_table](
    [my_float] [float] NULL
    ) ON [PRIMARY]
    GO
    Writer block parameters:
    Comma separated list of columns to be saved:
    temp      (from Bike Rental UCI Dataset)
    Data Table name: [dbo].[my_table]
    Comma separated list of datatable columns:
    [my_float]
The experiment from which I extracted the writing part was able to write to the DB correctly until a few days ago. Is it possible that some changes in the platform are causing this issue, or am I doing something wrong?
    Thank you and best regards,
    FV

    Hi Maureen,
Yes, the scenario is correct. In SQL Profiler I can't see any SQL operation that could lead to that kind of error but, as you said, it was worth trying anyway given that error message.
    New test scenario:
CREATE TABLE [dbo].[my_table_identity](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [my_float] [float] NULL,
    CONSTRAINT [PK_my_table_identity] PRIMARY KEY CLUSTERED (
        [ID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
) ON [PRIMARY]
    -- These lines work fine on SQL Management Studio
    INSERT INTO [dbo].[my_table_identity] (my_float) VALUES (1.23)
    INSERT INTO [dbo].[my_table_identity] (my_float) VALUES (1.23)
    INSERT INTO [dbo].[my_table_identity] (my_float) VALUES (1.23)
    -- I can see all new rows with each one a different ID
    SELECT * FROM [dbo].[my_table_identity]
    dbo is the default schema for my Azure login.
    New writer block parameters:
    Comma separated list of columns to be saved: temp  
       (from Bike Rental UCI Dataset)
    Data Table name: my_table_identity
    Comma separated list of datatable columns: my_float
    I still get the same error: Error 1000: AFx Library library exception: Sql encountered an error: These columns don't currently have unique values.
    I have also tried to pass the "instant" column instead of "temp", as it hasn't any duplicated values, but still no luck.
    I attach the log file (purged of sensitive information) because I see some error / warnings there.
    https://dl.dropboxusercontent.com/u/40411467/azure_log.txt
    Thank you again and best regards,
    FV

  • How to determine how many times result set columns have same value

    Hi -
    I'm doing a report which will be used for payment trend analyses.
    My initial result set looks like this:
    HOUSEHOLD_ID     JAN_PMT     FEB_PMT     MAR_PMT     APR_PMT     MAY_PMT     JUN_PMT     JUL_PMT     AUG_PMT     SEP_PMT     OCT_PMT     NOV_PMT     DEC_PMT
90026845409     1     1     1     1     2     1     1     1     1     0     1     0
(many rows, of course; the result set is pivoted)
    I need to determine the households that have a > 0 value in three or more consecutive months.
    I'm hoping someone will have some suggestions because the only solutions I'm coming up with right now would be a coding nightmare (lots of "OR's"), and I'm assuming (hoping) there's a better solution out there.
    Thanks!
    Christine

    Hi Frank,
    I'm not sure I'm understanding how I would use those analytic functions. Here is my select statement:
    SELECT HOUSEHOLD_ID,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     =  1 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 =  1 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS JAN_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     =  2 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 =  2 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS FEB_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     =  3 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 =  3 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS MAR_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     =  4 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 =  4 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS APR_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     =  5 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 =  5 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS MAY_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     =  6 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 =  6 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS JUN_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     =  7 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 =  7 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS JUL_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     =  8 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 =  8 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS AUG_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     =  9 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 =  9 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS SEP_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     = 10 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 = 10 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS OCT_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     = 11 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 = 11 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS NOV_PMT,
           SUM(CASE WHEN ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM'))     = 12 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 ) OR
                         ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'MM')) - 1 = 12 AND TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6  )
                    THEN 1 ELSE 0 END) AS DEC_PMT
      FROM MONETARY_TRANS
    WHERE MONETARY_TRANS_TYPE_ID    = 1                             --payment
    --TESTING
    AND HOUSEHOLD_ID = 90026845409
       AND RESPONSIBLE_PARTY_TYPE_ID = 1                             --household
       AND RECEIVED_DATE > '01-JAN-2008'
       AND ( TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) < 6
        OR   TO_NUMBER(TO_CHAR(RECEIVED_DATE, 'DD')) > 19 )
       AND PREMIUM_AMOUNT            > 0
       AND BILLING_TRANS_TYPE_ID     = 6
       AND MONETARY_TRANS_TYPE_ID    = 1
       AND RESPONSIBLE_PARTY_TYPE_ID = 1
GROUP BY HOUSEHOLD_ID
And from this I get the results originally posted. From there I need to figure out the households that have values greater than 0 for three or more consecutive months.
    Thanks for your help........
    -Christine
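Frank's suggestion of analytic functions can be applied roughly like this (a sketch only: it works on plain calendar months and ignores the 20th-to-5th payment window encoded in the query above, so the month bucketing would need to be adapted):

WITH monthly AS (
  SELECT household_id,
         TRUNC(received_date, 'MM') AS pmt_month
    FROM monetary_trans
   WHERE monetary_trans_type_id = 1
     AND premium_amount > 0
   GROUP BY household_id, TRUNC(received_date, 'MM')
),
grouped AS (
  SELECT household_id,
         pmt_month,
         -- consecutive months collapse to the same anchor value
         ADD_MONTHS(pmt_month,
                    -ROW_NUMBER() OVER (PARTITION BY household_id
                                        ORDER BY pmt_month)) AS grp
    FROM monthly
)
SELECT DISTINCT household_id
  FROM grouped
 GROUP BY household_id, grp
HAVING COUNT(*) >= 3;

Each household/month pair appears once, so a run of consecutive months shares one grp value and HAVING COUNT(*) >= 3 keeps only runs of three or more.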

How to validate if a column has a NULL value and not show that row with MDX

    Hello,
I have this situation: I have an MDX result that returns rows with NULL values in some columns. I tried NON EMPTY and NONEMPTY but the result is the same. What I want to do is check whether a column has a NULL value and, if so, discard the row, but I don't know how to implement it. Could somebody help me, please?
Thanks a lot.
Sukey Nakasima

    Hello,
    I found the answer in this link https://social.technet.microsoft.com/Forums/sqlserver/en-US/f9c02ce3-96b2-4cd6-921f-3679eb22d790/dont-want-to-cross-join-with-null-values-in-mdx?forum=sqlanalysisservices
    Thanks a lot.
    Sukey Nakasima

Possible bug with SqlDataAdapter.FillSchema: DataColumn.Unique values are not populated.

    Hi all,
I have done a lot of searching for an answer to this problem this evening and so far have not found a solution, but I have found multiple people with the same issue.
When using SqlDataAdapter.FillSchema() on a table that contains a primary key and columns with unique constraints, the only DataColumn.Unique value to be populated is the primary key's.
From reading the MSDN entry for DbDataAdapter, this is not the intended functionality.
    Can anyone suggest avenues to try or where to raise this issue if the method truly is not functioning as intended?

    Hello HarborneD,
    >>When using SqlDataAdapter.FillSchema() from a table that contains a Primary Key and columns with Unique constraints the only DataColumn.Unique value to be populated is the Primary Key.
From your description it is not very clear how your table is defined. I tried to reproduce this issue with the table below (not sure if it is similar to yours; if not, please share it with us):
CREATE TABLE [dbo].[T20141230] (
    [ID] INT NOT NULL,
    [Name] NCHAR (10) NULL,
    [Birthday] DATETIME NULL,
    PRIMARY KEY CLUSTERED ([ID] ASC),
    CONSTRAINT uc_PersonID UNIQUE ([Name])
)
And the code used to fill the schema:
SqlConnection con = new SqlConnection(@"Server=(localdb)\Projects;Database=ADO.NET;Trusted_Connection=True;");
try
{
    con.Open();
    SqlCommand cmd = new SqlCommand("select * from [T20141230]", con);
    SqlDataAdapter da = new SqlDataAdapter(cmd);
    DataTable dt = new DataTable();
    da.FillSchema(dt, SchemaType.Mapped);
    da.Fill(dt);
}
catch (Exception)
{
    // swallowed for this demo
}
finally
{
    con.Close();
}
However, I could see all columns populated with values from the database. I used VS2013, .NET 4.5 and Windows 8.1; you could have a try with my demo.
    >>Can anyone suggest avenues to try or where to raise this issue if the method truly is not functioning as intended
    If it is an exact issue which is not reported yet, you could post it to this site below:
    https://connect.microsoft.com/VisualStudio
    Regards.

  • REST API - Can't expand Title of lookup columns "System.ArgumentException : Value does not fall within the expected range"

    I have a list that has more than 12 columns of type lookup with more than 5000 items and I am using a date field as an indexed column so I can restrict the results to meet the list view threshold.  I am also using $select to restrict the columns to
    meet the lookup column threshold.
    If I run the query below it all works as expected:
    _api/web/Lists/MyList/items/?$Filter=Created gt '2015-03-10T05:00:00.000Z' and Created lt '2015-03-17T18:25:00.712Z'&$select=Lookup1/Id&$expand=Lookup1
    If I run this query it does not work:
    _api/web/Lists/MyList/items/?$Filter=Created gt '2015-03-10T05:00:00.000Z' and Created lt '2015-03-17T18:25:00.712Z'&$select=Lookup1/Title&$expand=Lookup1
    Error:
    <m:error xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
    <m:code>-2147024809, System.ArgumentException</m:code>
    <m:message xml:lang="en-US">Value does not fall within the expected range.</m:message>
    </m:error>
    The lookup column "Lookup1" is using Title as the column's information and this should be working.  What is interesting is that changing the lookup threshold has no effect on this but raising the list item view limit does make it work.  The
    date range is selecting less than 5000 items.  So what am I missing here? I just want the Title for the lookup column and my query has about 10 items in it.

    Looking through the logs the error is thrown due to the following error messages:
03/19/2015 08:26:12.12  w3wp.exe (0x6D34)  0x7F38  SharePoint Foundation  Fields  84h8  High
    Field with internal name 'LookupFieldName_x005f_Title' already exists in the field cache
    3446f49c-f2d2-d0a2-89f4-59dee19b2f45
03/19/2015 08:26:12.12  w3wp.exe (0x6D34)  0x7F38  SharePoint Foundation  Fields  ki9p  High
    Unable to add join related fields to the Query. [Error 0x80070057]
    3446f49c-f2d2-d0a2-89f4-59dee19b2f45
03/19/2015 08:26:12.12  w3wp.exe (0x6D34)  0x7F38  SharePoint Foundation  General  xxpm  High
    Unable to execute query: Error 0x80070057
    3446f49c-f2d2-d0a2-89f4-59dee19b2f45
03/19/2015 08:26:12.12  w3wp.exe (0x6D34)  0x7F38  SharePoint Foundation  General  8e2s  Medium
    Unknown SPRequest error occurred. More information: 0x80070057
    3446f49c-f2d2-d0a2-89f4-59dee19b2f45
03/19/2015 08:26:12.12  w3wp.exe (0x6D34)  0x7F38  SharePoint Foundation  General  aix9j  High
    SPRequest.GetListItemDataWithCallback2: UserPrincipalName=<removed>, AppPrincipalName= ,pSqlClient=<null> ,bstrUrl=siteUrl ,bstrListName=<removed> ,bstrViewName=<null> ,bstrViewXml=<View Scope="RecursiveAll"><Query><Where><Eq><FieldRef Name="ID" /><Value Type="Counter">5481</Value></Eq></Where></Query><ViewFields><FieldRef Name="LookupFieldName" LookupId="TRUE" /><FieldRef Name="LookupFieldName_x005f_Title" /></ViewFields><ProjectedFields><Field Name="LookupFieldName_x005f_Title" Type="Looku ,fSafeArrayFlags=SAFEARRAYFLAG_DATES_IN_UTC
    3446f49c-f2d2-d0a2-89f4-59dee19b2f45
The log messages state that it is unable to add join related fields; what could be the possible cause of this? As you can see, even filtering on a single ID, rather than specifying a date range which I know contains fewer than 5000 items, gives the same result.

<?split-column-width-unit:value?> not supported

    Hi,
    I tried to use the following:
<?split-column-width-unit:value?> in my template, but when I ran the preview I got an exception saying "org.xml.sax.SAXException: element xdofo:split-column-width-unit is not supported yet."
    If it is not supported, then why is it in the XML Publisher manual?

It's supported ... you may have some other problem in your template causing the parser to complain. Log a TAR, get it routed to XMLP, and we can help. Be sure to upload the template and sample data.
    Tim

  • How to generate a dynamic column with unique value in AMDP

Hi Colleagues,
In an AMDP I have a table with material and plant, and I have to assign a unique number to each unique combination of material and plant in a dynamic column, say SEQUENCE.
Please suggest how to proceed.
    Regards,
    Saurabh

    hi
    Firstly, have a look at the following code to see how this can be implemented -
    REPORT ZTEST.
    perform test.
    class test definition.
      public section.
        methods: create_screen.
    endclass.
    class test implementation.
      method create_screen.
        data:  report_line(72),
               report_source like table of report_line.
        data: err_message(240),
              err_line type i,
              err_word(100).
        report_line = 'REPORT TEST.'.
        append report_line to report_source.
        report_line = 'PARAMETERS: P_TEST TYPE I.'.
        append report_line to report_source.
        report_line = 'START-OF-SELECTION.'.
        append report_line to report_source.
        report_line = 'WRITE : P_TEST.'.
        append report_line to report_source.
        syntax-check for report_source message err_message
                                       line    err_line
                                       word    err_word.
        if err_message is initial.
          INSERT REPORT 'ZZZTESTZZZ' FROM REPORT_SOURCE.
          SUBMIT ZZZTESTZZZ VIA SELECTION-SCREEN AND RETURN.
        endif.
      endmethod.
    endclass.
    form test.
      data test type ref to test.
      CREATE OBJECT TEST.
      call method test->create_screen.
    endform.
As you can see, the report is being written dynamically. Once the INSERT REPORT statement is executed, the program is available. You can use external subroutine calls to pass the data between the programs now.
    Regards,
    ravish
Please don't forget to reward points if helpful.
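Coming back to the original question (a unique number for each material/plant combination), a plain window function is usually enough inside an AMDP's SQLScript body. A sketch with hypothetical column and table names MATNR, WERKS and material_plant_tab:

-- every row of the same material/plant combination gets the same number,
-- and each new combination gets the next number
SELECT matnr,
       werks,
       DENSE_RANK() OVER (ORDER BY matnr, werks) AS seq_no
  FROM material_plant_tab;

Here material_plant_tab stands in for whatever table or intermediate result the AMDP actually works on.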

Different tables' columns have the same value

    Hi,
How can I find out which columns in the database are storing a given value, e.g. 'Tier'?
I need to know which tables' columns in the database have the value 'Tier':
table1.col1 = 'Tier'
table2.col16 = 'Tier'
table3.col21 = 'Tier'
Thanks
    Sandy

One possible solution is below. Take care with the performance, as the block checks every schema.table.column of type VARCHAR2 or CHAR. The number of columns can easily run into the thousands and affect the overall performance of your DBMS.
The example explicitly skips tables in the SYS and SYSTEM schemas.
    set serveroutput on size 999999
declare
  cnt      number;
  sql_comm varchar2(1000);
begin
  for TRec in (select owner, table_name, column_name, data_length
                 from all_tab_columns
                where data_type in ('VARCHAR2', 'CHAR')
                  and owner not in ('SYS', 'SYSTEM')
                  and data_length >= 4
                order by 1, 2, 3) loop
    sql_comm := 'select count(*) from '||TRec.owner||'.'||TRec.table_name||
                ' where '||TRec.column_name||' = ''Tier''';
    execute immediate sql_comm into cnt;
    if cnt > 0 then
      dbms_output.put_line(TRec.owner||'.'||TRec.table_name||'.'||TRec.column_name||' has '||cnt||' rows.');
    end if;
  end loop;
end;
/
    Miguel

  • Trying to connect apple tv 3 to wifi at school says I need to create profile but i have a pc and not a mac

    When I select other wifi options... the home sharing still does not work!!!! VERY ANNOYING. I want to smash this to pieces right now.

It's very unclear what you are saying.
Create a profile for what?
If you mean create an Apple ID, then that does not require a Mac.
If it's something else which seems to require you to have a Mac, tell us.
I never had any problems when I only had a PC.

  • Need help - can't run a report cause value set not found

    APP-FND-00738: error while loading value SET
Posted: Apr 12, 2007 11:16 PM
Hi, I'm trying to run a report (in the parameters form) and I'm getting this error:
APP-FND-00738: error loading value set SETUP_DESCR_VSETS.
APP-FND-01564: error ORACLE 1403 in FDFAVS.
Reason: Fail in program FDFAVS with reason ORA-01403: no data found.
Executed SQL instruction from file &ERRFILE.
Can you advise which tool I can use to determine which value set didn't load? Maybe logs or log tables?

I didn't get your problem exactly.
Please mention the steps you are following and where the error occurs, so that we can see.
--Basava.S

Need PL/SQL code to find out the column or combination of columns from a given table which will have unique values

Given a table with some columns and the data associated with it, I need to find a column or a combination of columns such that the values (or combinations of values) are unique in the table.
The table, the number of columns and the columns themselves will be dynamic.
Can you please help me with a solution?

    f8d0dcea-cdf0-4935-8734-632fe021456c wrote:
    No key is defined in the table.
    Suppose a table contains 20 columns then I need the unique combinations of all columns.
    Example: A table 'Employee' has 4 columns: Emp_No, Emp_Name,Passport_No,Emp_Designation. No key is defined for Employee table. Need to find out which column(single column and combination if columns) have unique values. Like, First check if Emp_No is unique then check Emp_No+Emp_Name is unique, then check if Emp_No+Emp_Name+Passport_No is unique, then check Emp_No+Emp_Name+Passport_No+Emp_Designation is unique.
    Then again try with the combination of Emp_No+Passport_No and so on. In this way I need to find out all the combinations having unique values.
As Paul says, that will be a waste of time, as it will take a lot of processing to cover all the possibilities, checking all the data each time to determine which combinations of columns provide uniqueness.  What happens if someone inserts or deletes some data whilst you're doing it?
    You'd be quicker to manually look at the tables, and make an educated guess and then test for uniqueness with a quick query on that guess.
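As a concrete version of that "quick query on that guess", a sketch using the hypothetical EMPLOYEE columns from the post; if it returns no rows, the guessed combination is unique:

SELECT emp_no, passport_no, COUNT(*) AS cnt
  FROM employee
 GROUP BY emp_no, passport_no
HAVING COUNT(*) > 1;

Adding more columns to an already unique combination keeps it unique, so checking the small candidate sets first keeps the manual approach cheap.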

  • I need an application for my iPad that is compatible with excel.  I appreciate there is a number of options, However i need to be able filter columns! is this possible

    I need an application for my iPad that is compatible with excel.  I appreciate there is a number of options, However i need to be able filter columns! is this possible.

    Note that I have not tested either of these and they will not be supported by Microsoft. 
    1. Copy your new version of Silverlight.exe to C:\Program Files\Microsoft Configuration Manager\Client\i386
    (you'll have to re-distribute the client package after this).
    OR
    2. You could edit the ccmsetup.xml file with an alternative location for silverlight.exe
    </Item>
     <Item FileName="i386/Silverlight.exe" FileHash="417B442E128D821119008ACEEEE6CDC2A41224377A829B6EC52BABA2724F0151">
      <Applicability Platform="ALL" OS="ALL">
       <Skip>Embedded</Skip>
      </Applicability>
    Gerry Hampson | Blog:
    www.gerryhampsoncm.blogspot.ie | LinkedIn:
    Gerry Hampson | Twitter:
    @gerryhampson

  • LOV's auto refresh and having unique values.

Hi, I have LOVs connected to a few attributes of a table, and those fields are used in the view criteria of the table.
This view criteria is used in the af:query component as search fields.
To avoid duplicates in the LOVs, I created a separate view for each attribute using the DISTINCT keyword, and connected them to the attributes via view accessors, so that the search-field drop-downs of the af:query component show the unique values of each column.
There is another requirement: to refresh the LOVs of the search fields when there is a change in the database. To use the auto-refresh property of the view's query, I added the PK of the table to each view. By doing this I can see the latest values from the database in the UI, i.e. the LOVs of af:query pick up the database changes.
But this causes duplicate values in the LOVs, because the DISTINCT keyword in the view's query no longer has any effect once the PK of the table is added to the query.
I tried different ways to query, like GROUP BY, but with no success.
I need both auto refresh and unique values in the LOVs of the af:query. Could someone point me to a reference to solve this? Thanks.
    Edited by: user642477 on Oct 7, 2010 10:52 AM

    user642477 wrote:
Hi, I have LOVs connected to a few attributes of a table, and those fields are used in the view criteria of the table.
This view criteria is used in the af:query component as search fields.
To avoid duplicates in the LOVs, I created a separate view for each attribute using the DISTINCT keyword, and connected them to the attributes via view accessors, so that the search-field drop-downs of the af:query component show the unique values of each column.
Why have you done that; what is your reason? It seems odd to me. You could easily add DISTINCT to the query of the view object.

  • How to create surrogate key in dimension without unique value

Hi, I have a dimension where no column has unique values. I want to add a surrogate key to replace the existing primary key, which is derived by concatenating 3 columns (e.g. 'A'||'B'||'C'). I'm thinking of using a sequence, but this won't allow me to link the dimension to the fact table. How do I come up with a surrogate key in this situation? Thanks. ~Tracy

    I'm actually trying to accomplish something similar myself.
    In my sources I've got two sorts of customers, ones that are directly reported, and ones whose information is provided with sales records (this is stored in module ODS).
    Of course identification is different, but in the datamart (module DWH) I'm sort of forced to use an equivalent way of loading (due to the way it first used to work). To accelerate lookups on dimensions, I copy the ODS surrogate key to DWH dimensions, but this does not work for the 'inbuilt' customers because they do not have a surrogate key in the ODS.
    They DO have means of unique identification, and at first I thought I could concatenate these (also 3) columns to use as identification code. Unfortunately this is VARCHAR2, where the surrogate key is (naturally) NUMBER.
    So now it looks like I'm forced to first build a table in ODS especially for these 'inbuilt' customers and assign a surrogate key (by sequence) to it, this way it conforms to how 'normal' customers are loaded into DWH.
I guess you'll have to pull off the same trick, i.e. create a table with either only the 'translation' of D-code to a surrogate key, or all the information that is fed into the dimension, which can then be used as a lookup or as the complete source when loading data into your datamart.
    Good luck, Patrick
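A sketch of that mapping-table trick, with made-up names: a sequence hands out the surrogate key once per natural-key combination, and both the dimension load and the fact lookups then join through this table.

CREATE SEQUENCE dim_customer_seq;

CREATE TABLE dim_customer_map (
    customer_sk NUMBER       PRIMARY KEY,
    nat_col_a   VARCHAR2(30),
    nat_col_b   VARCHAR2(30),
    nat_col_c   VARCHAR2(30),
    CONSTRAINT uq_dim_customer_nk UNIQUE (nat_col_a, nat_col_b, nat_col_c)
);

-- assign a surrogate key to every natural-key combination not mapped yet
INSERT INTO dim_customer_map (customer_sk, nat_col_a, nat_col_b, nat_col_c)
SELECT dim_customer_seq.NEXTVAL, t.nat_col_a, t.nat_col_b, t.nat_col_c
  FROM (SELECT DISTINCT s.nat_col_a, s.nat_col_b, s.nat_col_c
          FROM ods_customer_src s
         WHERE NOT EXISTS (SELECT 1
                             FROM dim_customer_map m
                            WHERE m.nat_col_a = s.nat_col_a
                              AND m.nat_col_b = s.nat_col_b
                              AND m.nat_col_c = s.nat_col_c)) t;

The DISTINCT sits in an inline view because NEXTVAL cannot appear in the same query block as DISTINCT.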
