Need help to pivot the data with hierarchy

Dear Team,
I have hierarchical data in the table as shown below
SQL> Create Table t As
  2  (
  3  Select 97250528 col1, 'TSD' col2,  97250528 col3,   'WAV' col4 From dual Union All
  4   Select 75020160     , 'ASD'     ,  97250528     ,   'TSD'      From dual Union All
  5   Select 69150832     , 'PA'      ,  75020160     ,   'ASD'      From dual Union All
  6   Select 49150538     , 'DB'      ,  69150832     ,   'PA'       From dual Union All
  7   Select 49150538     , 'WAV'     ,  49150538     ,   'DB'       From dual Union All
  8   ------------
  9   Select 97251520 col1, 'TSD' col2,  97251520 col3,   'WAV' col4 From dual Union All
10   Select 75020689     , 'ASD'     ,  97251520     ,   'TSD'      From dual Union All
11   Select 69151039     , 'PA'      ,  75020689     ,   'ASD'      From dual Union All
12   Select 49150672     , 'DB'      ,  69151039     ,   'PA'       From dual Union All
13   Select 49150672     , 'WAV'     ,  49150672     ,   'DB'       From dual
14  );
Table created
SQL> select *
  2  from t;
      COL1 COL2       COL3 COL4
  97250528 TSD    97250528 WAV
  75020160 ASD    97250528 TSD
  69150832 PA     75020160 ASD
  49150538 DB     69150832 PA
  49150538 WAV    49150538 DB
  97251520 TSD    97251520 WAV
  75020689 ASD    97251520 TSD
  69151039 PA     75020689 ASD
  49150672 DB     69151039 PA
  49150672 WAV    49150672 DB
10 rows selected
I am using a CONNECT BY PRIOR clause to get the data in the correct order.
Below is the SQL I am using:
SQL> Select Level lvl,
  2          col3,
  3          col4
  4   From t
  5   Start With col3 In ('97250528','97251520')
  6   And        col4 In ('WAV')
  7   connect by Nocycle Prior col1 = col3
  8              And     Prior col2 = col4;
       LVL       COL3 COL4
         1   97250528 WAV
         2   97250528 TSD
         3   75020160 ASD
         4   69150832 PA
         5   49150538 DB
         1   97251520 WAV
         2   97251520 TSD
         3   75020689 ASD
         4   69151039 PA
         5   49150672 DB
10 rows selected
The data I am getting is correct. However, I want the output in the following format:
   WAV        TSD        ASD        PA         DB
1  97250528   97250528   75020160   69150832   49150538
2  97251520   97251520   75020689   69151039   49150672
When I use MAX + CASE I get incorrect results, as there is no GROUP BY key.
Kindly give me some hints or tips to pivot the data, or should I go to PL/SQL to pivot it?
Regards
nic

Hi Dbb,
CONNECT_BY_ROOT did the trick, I guess:
Select max(Case When col4 = 'WAV' Then col3 End) wav,
       max(Case When col4 = 'TSD' Then col3 End) tsd,
       max(Case When col4 = 'ASD' Then col3 End) asd
From  (Select CONNECT_BY_ROOT col1 col,
              col3,
              col4
       From t
       Start With col3 In ('97250528','97251520')
       And        col4 In ('WAV')
       Connect By Nocycle Prior col1 = col3
                  And     Prior col2 = col4)
Group By col;
       WAV        TSD        ASD
  97250528   97250528   75020160
  97251520   97251520   75020689
Let me execute this over bulk sets of data...
Thanks for this pointer and hint.
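For completeness, a sketch extending the same MAX + CASE pattern to all five levels from the desired output (assuming the PA and DB columns follow the same pattern as WAV/TSD/ASD):

```sql
Select max(Case When col4 = 'WAV' Then col3 End) wav,
       max(Case When col4 = 'TSD' Then col3 End) tsd,
       max(Case When col4 = 'ASD' Then col3 End) asd,
       max(Case When col4 = 'PA'  Then col3 End) pa,
       max(Case When col4 = 'DB'  Then col3 End) db
From  (Select CONNECT_BY_ROOT col1 col,  -- root of each tree keys the group
              col3,
              col4
       From t
       Start With col3 In ('97250528','97251520')
       And        col4 In ('WAV')
       Connect By Nocycle Prior col1 = col3
                  And     Prior col2 = col4)
Group By col
Order By col;
```

On Oracle 11g and later, the PIVOT clause over the same CONNECT_BY_ROOT inline view should give the same result with less repetition.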

Similar Messages

  • Need help in formatting the Date - Date does not

Need help in formatting the date. The date does not format, and gives a "not a valid month" error in the scenario below.
select oc.ST_PGM_MGR, r.ag_dnum, get_major_work_type(r.perf_eval_rtng_id) "v_work_code", r.ag_dnum_supp "supp", r.intfinal, to_char(r.formdate,'MM/DD/YYYY') "formdate", to_char(r.servfrom,'MM/DD/YYYY') "srv_from", to_char(r.servto,'MM/DD/YYYY') "srv_to", descript, add_months(to_char
--- The line of code below, trying to format it to mm/dd/yyyy, gives the error
(r.formdate, 'DD-MON-YYYY'),12) "formdate2"
from table r

Your syntax is wrong - look at the post above where this syntax is given:
to_char(add_months(r.formdate, 12), 'MM/DD/YYYY') "formdate2"
Look at the formula from a logical perspective - "inside out" - to read what is happening.
Take formdate, add 12 months:
add_months(r.formdate, 12)
then apply the to_char format mask - basic syntax:
to_char(date, 'MM/DD/YYYY')
Compare to your syntax:
to_char(add_months(r.formdate, 'MM/DD/YYYY'), 12) "formdate2"
You will see your format string inside the call to add_months, and your 12 inside the call to to_char.
    Good luck!
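A minimal sketch of the corrected nesting, runnable against DUAL (SYSDATE standing in for r.formdate):

```sql
-- add 12 months first, then format the resulting DATE
select to_char(add_months(sysdate, 12), 'MM/DD/YYYY') "formdate2"
from   dual;
```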

  • Need help on resolving the issue with adobe output server - error MSG256 & MSG 210 not in .ini file

    Hi,
I am using Adobe Output Designer 5.5 for designing the label template, and Adobe Output Server for the printing process.
In Jfmerge.ini we set the condition "DiscardUnknownFields=Yes" to ignore the unwanted fields in the .dat file.
During the process, I faced an issue with the Output Server when printing the labels.
When the .dat file is placed in the Data folder of Adobe, the label does not get printed on the printer.
The file is moved to the error folder, and an error file is generated which contains the error messages given below:
    090826 02:59:02 D:\Program Files\Adobe\Central\Bin\jfmerge: [256]** Message Msg256 not in .ini file **
    090826 02:59:02 D:\Program Files\Adobe\Central\Bin\jfmerge: [210]** Message Msg210 not in .ini file **
    2009/08/26 02:59:02 D:\Program Files\Adobe\Central\Bin\jfserver.exe: [314]Agent exit message: [210]** Message Msg210 not in .ini file **
The Output Server is a new installation, and I verified the Jfmerge.ini file. It contains the message details for Msg256 and Msg210.
I also verified the license, and it is valid.
    Kindly help me out in solving this issue.
    Thanks
    Senthil

I assume this is too late to help you, but others might need a hint. I had the same problem, and found some possible causes that I posted on http://codeznips.blogspot.com/2010/02/adobe-output-server-message-msg210-not.html.
It is quite likely that you are missing double quotes around the path specifying the ini file (-aii), if it is installed under "Program Files".
    Hope this helps anyone....
    Vegard

  • Calendar in iCloud deleted - need help to get the data back

    Hi,
    I just deleted one of my calendars in iCloud. Is there any chance to get this calendar (with all the data) back into iCloud?
    Thanks for your help.
    Pierre

How? I should have a Time Machine backup from earlier today. Do I need to completely restore my MBP to also get my calendar data back? Note that I only deleted one calendar by accident (my others are OK).
    thank you

  • Need help in optimizing the query with joins and group by clause

I am having a problem executing the query below; it is taking a lot of time. To simplify, I have added the two tables: FILE_STATUS stores the file load details, and COMM is the actual business commission table showing which records were successfully processed and which were transmitted to the other system. Records with status = 'T' have been transmitted to the other system, and transactions with 'P' are pending.
    CREATE TABLE FILE_STATUS
    (FILE_ID VARCHAR2(14),
    FILE_NAME VARCHAR2(20),
    CARR_CD VARCHAR2(5),
    TOT_REC NUMBER,
    TOT_SUCC NUMBER);
    CREATE TABLE COMM
    (SRC_FILE_ID VARCHAR2(14),
    REC_ID NUMBER,
    STATUS CHAR(1));
    INSERT INTO FILE_STATUS VALUES ('12345678', 'CM_LIBM.TXT', 'LIBM', 5, 4);
    INSERT INTO FILE_STATUS VALUES ('12345679', 'CM_HIPNT.TXT', 'HIPNT', 4, 0);
    INSERT INTO COMM VALUES ('12345678', 1, 'T');
    INSERT INTO COMM VALUES ('12345678', 3, 'T');
    INSERT INTO COMM VALUES ('12345678', 4, 'P');
    INSERT INTO COMM VALUES ('12345678', 5, 'P');
COMMIT;
Here is the query that I wrote to give me the details of the file that has been loaded into the system. It reads the file status and commission tables to show the file name, total records loaded, total records successfully loaded to the commission table, and the number of records finally transmitted (status = 'T') to other systems.
    SELECT
        FS.CARR_CD
        ,FS.FILE_NAME
        ,FS.FILE_ID
        ,FS.TOT_REC
        ,FS.TOT_SUCC
        ,NVL(C.TOT_TRANS, 0) TOT_TRANS
    FROM FILE_STATUS FS
LEFT JOIN (
    SELECT SRC_FILE_ID, COUNT(*) TOT_TRANS
        FROM COMM
        WHERE STATUS = 'T'
        GROUP BY SRC_FILE_ID
    ) C ON C.SRC_FILE_ID = FS.FILE_ID
WHERE FILE_ID = '12345678';
In production this query has more joins and takes a lot of time to process. The main culprit for me is the join on the COMM table to get the count of transmitted transactions. Please can you give me tips to optimize this query to get results faster? Do I need to remove the GROUP BY and use a partition, or something else? Please help!

    I get 2 rows if I use my query with your new criteria. Did you commit the record if you are using a second connection to query? Did you remove the criteria for file_id?
    select carr_cd, file_name, file_id, tot_rec, tot_succ, tot_trans
      from (select fs.carr_cd,
                   fs.file_name,
                   fs.file_id,
                   fs.tot_rec,
                   fs.tot_succ,
                   count(case
                            when c.status = 'T' then
                             1
                            else
                             null
                          end) over(partition by c.src_file_id) tot_trans,
                   row_number() over(partition by c.src_file_id order by null) rn
              from file_status fs
              left join comm c
                on c.src_file_id = fs.file_id
             where carr_cd = 'LIBM')
    where rn = 1;
    CARR_CD FILE_NAME            FILE_ID           TOT_REC   TOT_SUCC  TOT_TRANS
    LIBM    CM_LIBM.TXT          12345678                5          4          2
LIBM    CM_LIBM.TXT          12345677               10          0          0
Using RANK can potentially produce multiple rows, though your data may prevent this. ROW_NUMBER will always prevent duplicates. The ordering of the analytic function is irrelevant in your query if you use ROW_NUMBER. You can remove the outermost query and inspect the data returned by the inner query:
    select fs.carr_cd,
           fs.file_name,
           fs.file_id,
           fs.tot_rec,
           fs.tot_succ,
           count(case
                    when c.status = 'T' then
                     1
                    else
                     null
                  end) over(partition by c.src_file_id) tot_trans,
           row_number() over(partition by c.src_file_id order by null) rn
    from file_status fs
    left join comm c
    on c.src_file_id = fs.file_id
    where carr_cd = 'LIBM';
    CARR_CD FILE_NAME            FILE_ID           TOT_REC   TOT_SUCC  TOT_TRANS         RN
    LIBM    CM_LIBM.TXT          12345678                5          4          2          1
    LIBM    CM_LIBM.TXT          12345678                5          4          2          2
    LIBM    CM_LIBM.TXT          12345678                5          4          2          3
    LIBM    CM_LIBM.TXT          12345678                5          4          2          4
    LIBM    CM_LIBM.TXT          12345677               10          0          0          1
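If the goal is just the transmitted count per file, a simpler sketch using a scalar subquery against the same two tables may be worth comparing against the analytic version above:

```sql
-- count of status = 'T' rows per file, without analytic functions
SELECT fs.carr_cd,
       fs.file_name,
       fs.file_id,
       fs.tot_rec,
       fs.tot_succ,
       (SELECT COUNT(*)
          FROM comm c
         WHERE c.src_file_id = fs.file_id
           AND c.status      = 'T') tot_trans
  FROM file_status fs
 WHERE fs.carr_cd = 'LIBM';
```

Which variant performs better depends on the indexes and row counts involved, so compare the execution plans on your own data.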

  • Need help in triggering the Data stream load  using process chain

    Hi Guru's
    is it possible to trigger a data stream load using process chain?
    Any help is highly appreciated.
    Thanks
    Indiran

    Hi Indiran and welcome aboard!
Don't think this is possible. SAP BW & SAP SEM-BCS are rather independent systems. Though BCS lives on top of the BI-BW stack, it may even have master data different from that in BW.
Process chains, AFAIK, are completely a BW feature. Certainly, you may use process chains on the BW side, loading the ODS/DSO and cubes involved in the BCS data model.
The main con here is the lost transparency -- you don't control everything from the consolidation monitor.
The pro side is also rather obvious to me. Since there is very often a huge difference between data quality at the data source and in the BCS totals cube, I need to do a lot of data transformation: not only calculations or cleaning, but also transformation of the data model (key figure model -> account model). For me, it's much easier to do in BW.
I even call the ODS/cubes/routines involved in such transformation an intermediate layer: the layer between the data source and SEM-BCS.
And this layer lives rather independently from BCS.
    Hope this helps.

  • Need Help to Transfer Specific Data with dml_condition

I have two databases on two different servers, named db1 and db2. I want to transfer specific data with a DML condition. Below is my code.
For the server that has db1:
    --------------Sys----------------------
    create user strmadmin identified by strmadmin;
    grant connect, resource, dba to strmadmin;
    begin dbms_streams_auth.grant_admin_privilege
    (grantee => 'strmadmin',
    grant_privileges => true);
    end;
    grant select_catalog_role, select any dictionary to strmadmin;
    alter system set global_names=true;
    alter system set streams_pool_size = 100 m;
    ----------------------end--------------------
    -----------------------StrmAdmin--------------------------
    create database link db2
    connect to strmadmin
    identified by strmadmin
    using 'DB2';
    EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE();
EXEC DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION(table_name => 'scott.emp');
    -- Configure capture process at the source database
    begin dbms_streams_adm.add_table_rules
    ( table_name => 'scott.emp',
    streams_type => 'capture',
    streams_name => 'capture_stream',
    queue_name=> 'strmadmin.streams_queue',
    include_dml => true,
    include_ddl => true,
    inclusion_rule => true);
    end;
    -- -- Configure Sub Set Rules capture process at the source database
    begin dbms_streams_adm.add_subset_rules
    ( table_name => 'scott.emp',
    dml_condition=>'deptno=50',
    streams_type => 'capture',
    streams_name => 'capture_stream',
    queue_name=> 'strmadmin.streams_queue',
    include_tagged_lcr => true);
    end;
    --     Configure the propagation process at Sources Database
    begin dbms_streams_adm.add_table_propagation_rules
    ( table_name => 'scott.emp',
    streams_name => 'DB1_TO_DB2',
    source_queue_name => 'strmadmin.streams_queue',
    destination_queue_name => 'strmadmin.streams_queue@DB2',
    include_dml => true,
    include_ddl => true,
    source_database => 'DB1',
    inclusion_rule => true);
    end;
    --     Configure the Subset propagation Rule process at Sources Database
    begin SYS.dbms_streams_adm.add_subset_propagation_rules
    ( table_name => 'scott.emp',
    dml_condition=>'deptno=50',
    streams_name => 'DB1_TO_DB2',
    source_queue_name => 'strmadmin.streams_queue',
    destination_queue_name => 'strmadmin.streams_queue@DB2',
    include_tagged_lcr => true);
    end;
    --      Set the instantiation system change number (SCN)
    declare
    source_scn number;
    begin
    source_scn := dbms_flashback.get_system_change_number();
    dbms_apply_adm.set_table_instantiation_scn@DB2
    ( source_object_name => 'scott.emp',
    source_database_name => 'DB1',
    instantiation_scn => source_scn);
    end;
    --      Start the capture processes
    begin dbms_capture_adm.start_capture
    ( capture_name => 'capture_stream');
    end;
    ---------------------------End----------------------------------------------------------
For server 2, which has db2:
    --------------------------Sys---------------------------------------------------------------
    CREATE USER strmadmin IDENTIFIED BY strmadmin;
    GRANT CONNECT, RESOURCE, DBA TO strmadmin;
    BEGIN
    DBMS_STREAMS_AUTH.grant_admin_privilege (grantee => 'strmadmin',
    grant_privileges => TRUE);
    END;
    GRANT SELECT_CATALOG_ROLE, SELECT ANY DICTIONARY TO strmadmin;
    ALTER SYSTEM SET global_names=TRUE;
    ALTER SYSTEM SET streams_pool_size = 100 M;
    -----------------------------------------------------------End-----------------------------
    ---------------------------------Stream user--------------------------------------------------------------
    CREATE DATABASE LINK db1
    CONNECT TO strmadmin
    IDENTIFIED BY strmadmin
    USING 'DB1';
    EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE();
EXEC DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION(table_name => 'scott.emp');
    -- add table Level rule on target Database.
    BEGIN
    DBMS_STREAMS_ADM.add_table_rules (
    table_name => 'scott.emp',
    streams_type => 'apply',
    streams_name => 'apply_stream',
    queue_name => 'strmadmin.streams_queue',
    include_dml => TRUE,
    include_ddl => TRUE,
    source_database => 'DB1',
    inclusion_rule => TRUE);
    END;
    -- add table Level Sub Set rule on target Database.
    BEGIN
    DBMS_STREAMS_ADM.add_subset_rules (
    table_name => 'scott.emp',
    dml_condition => 'deptno=50',
    streams_type => 'apply',
    streams_name => 'apply_stream',
    queue_name => 'strmadmin.streams_queue',
    include_tagged_lcr => TRUE);
    END;
    -- Start the apply processes
    BEGIN
    DBMS_APPLY_ADM.set_parameter (apply_name => 'apply_stream',
    parameter => 'disable_on_error',
    VALUE => 'n');
    END;
    BEGIN
    DBMS_APPLY_ADM.start_apply (apply_name => 'apply_stream');
    END;
    ---------------------------------End---------------------------------------------------------------------------------
Please help me.

    below is the Result.
    RULE_NAME RULE_TYPE RULE_SET_TYPE RULE_SET_NAME STREAMS_TYPE STREAMS_NAME
    RULE$_7 POSITIVE RULESET$_8 DEQUEUE SCHEDULER_PICKUP
    RULE$_11 POSITIVE RULESET$_8 DEQUEUE SCHEDULER_PICKUP
    RULE$_3 POSITIVE RULESET$_4 DEQUEUE SCHEDULER_COORDINATOR
    EMP122 DDL POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP121 DML POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP124 DML POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP125 DML POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP126 DML POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP115 DML POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
    EMP116 DDL POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
    EMP118 DML POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
    EMP119 DML POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
    EMP120 DML POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
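The rule listing above looks like the output of a query over the Streams rules dictionary view; a sketch (the exact column list shown is assumed from the posted header):

```sql
-- list the rules attached to capture, propagation, and apply components
SELECT rule_name,
       rule_type,
       rule_set_type,
       rule_set_name,
       streams_type,
       streams_name
  FROM dba_streams_rules
 ORDER BY streams_type, rule_name;
```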

  • Need help to resolve the Date time issue(Urgent)

    Hello All,
I am running the simple code below to get the date and time, and I am getting a difference between 2 environments. Both environments are AIX.
import java.util.Calendar;

public class DateTest {
    public static void main(String[] args) {
        java.util.Date now = Calendar.getInstance().getTime();
        System.out.println(new StringBuilder("Calendar.getInstance().getTime():").append(now).toString());
    }
}
The problematic environment gives the following output:
    bash-4.0$ java -cp . DateTest
    Calendar.getInstance().getTime():Wed Feb 03 00:59:40 EST 2010
    bash-4.0$ date
    Wed Feb  3 05:59:42 EST 2010
    bash-4.0$
Another environment gives the proper output:
    bash-4.0$ java -cp . DateTest
    Calendar.getInstance().getTime():Wed Feb 03 05:01:45 GMT-05:00 2010
    bash-4.0$ date
    Wed Feb 3 05:01:48 CST 2010
Can someone help me out with this?

    What were the results of your investigations based on the pointers you got from [your earlier thread|http://forums.sun.com/thread.jspa?threadID=5423811] ?

  • Need help in deleting the data load request

    Hi All,
In the system, one erroneous data load request is being generated (whose status is "monitor without request"); because of this I am unable to activate other requests.
I want to delete this request. I am deleting it, but it still remains in the system, and I am unable to activate it as it is an erroneous one. I have deleted that request from the backend table RSMONICDP as well, but still no successful results.
    Please give ur suggestions on this.
    Thanks,
    vinay

    Hi
There is a possible solution for these kinds of issues:
1. Note down the bad request number.
2. Go to SE16.
3. Open table RSODSACTREQ. Filter the contents by that request number. There will be one entry for that request; delete that entry.
4. Open table RSICCONT. There will be one entry for that request; delete that entry.
5. Open table RSMONICDP. Filter the contents by that request number. There will be 3-4 entries for that request, depending on the number of data packages in the request; delete all those entries.

I need help to remove the wrong email on my iPhone. I can't download apps because my Apple ID does not match the wrong email. Please help

I need help to remove the wrong email on my iPhone. The representative who set up my phone put in the wrong email, and now I can't download apps on my phone because the Apple ID does not match.

    So sign out of the Apple ID under Settings > iTunes & App Store, then sign in with your own.

  • I have tried everything I know to retrieve the iPhoto Library app.  I detest this new Photo app, which obviously wasn't designed with photographers in mind.  I desperately need help in retrieving the old app and have not been able to do it so far.

    I have tried everything I know to retrieve the iPhoto Library app.  I detest this new Photo app, which obviously wasn't designed with photographers in mind.  I desperately need help in retrieving the old app and have not been able to do it so far.  I have gone to my app folder and tried to update my iPhoto Library.  I have gone to my trash and brought it over to my desktop and still cannot use it.  Please help!

    Try this fix discovered by User photosb189:
    From Finder, hold down the option key on your keyboard and click on the Go menu at the top of the screen
    Select Library from the drop down menu
    Then navigate to Application Support > iLifeAssetManagement > assets
    Right click on the 'sub' folder and choose Compress
    Drag the resulting zip file to your Desktop (as a backup)
    Go to your System Preferences and choose iCloud
    Turn OFF the iCloud Photos option
    Go back to Library > Application Support and DELETE the iLifeAssetManagement folder
    Now, in System Preferences, turn the iCloud Photos option ON
iPhoto should now be able to launch. Give it enough time to re-download your Photo Stream content. If you are missing any of your My Photo Stream photos (more than 30 days old), unzip the 'sub' folder on your desktop and drag that into iPhoto.

  • I need help opening up the pdf doc that i just saved. i need to open it up with excel?

I need help opening up the PDF doc that I just saved. I need to open it up with Excel?

    Yes, I need help configuring the settings.

Need help on displaying the caller's name on an IP phone with CME using an external directory

    Hello Guys,
I need help displaying the caller's name on an IP phone with CME while using an external directory.
    Thank you,
    Khaja

    Thanks for your help,
Does it 100% work with CME? We use SIP and 2Ring for the external directory. Thank you.

Need help knowing the different versions of Forms and their compatibility

I need help knowing the different versions of Forms and their compatibility with different databases and operating systems.
Kindly give the details or any suitable link for the same.
Please consider this as urgent.
    regards,
    rajesh

    Rajesh,
    the certification matrix is available on metalink.oracle.com. You need a support contract to access this information.
    Frank

  • Need help in optimizing the process

I need help optimizing my process. Here is what I am doing:
I have two tables, table A and table B. Table A has a list of names, and table B has name and ID. Right now I load the names into a cursor from table A, and for each name from the cursor I try to match it with a name in table B to get the ID. This process takes very long to complete. I need help replacing the PL/SQL with SQL if possible. I am using Oracle UTL_MATCH to find the best possible record in table B. Table B sometimes returns multiple matching records; I take the top nearest matching name and ID for my use (there is not always a one-to-one match).
    Any idea or help will be appreciated. Thanks

Always provide CREATE TABLE and sample data INSERT statements when asking a question. Also, provide your database version.
Because you're using UTL_MATCH, you can't really use an index; there must be a Cartesian join on the two tables, so it may always be slow.
However, here's a sample SQL that you may be able to make use of:
    with n as (select 'KATHERINE' person_name from dual
                 union all
                 select 'STEVE' from dual
                 union all
                 select 'BRUCE' from dual)
        ,nid as (select 'CATHERINE' person_name, 1 NID FROM DUAL
                 union all
                 select 'STEFAN', 2 from dual
                 UNION ALL
                 select 'STEVEN', 2 from dual
                 union all
                 select 'CATHY',3 from dual)
select n_name, nid
  from (select n.person_name n_name
              ,nid.person_name
              ,nid.nid
              ,row_number() over (partition by n.person_name
                                  order by utl_match.jaro_winkler_similarity(n.person_name, nid.person_name) desc) jws_order
          from n
          join nid on (utl_match.jaro_winkler_similarity(n.person_name, nid.person_name) > 70))
 where jws_order = 1;
