Analytical functions approach for this scenario?

Here is my data:
SQL*Plus: Release 11.2.0.2.0 Production on Tue Feb 26 17:03:17 2013
Copyright (c) 1982, 2010, Oracle.  All rights reserved.
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL> select * from batch_parameters;
       LOW         HI MIN_ORDERS MAX_ORDERS
        51        100          6          8
       121        200          1          5
       201       1000          1          1
SQL> select * from orders;
ORDER_NUMBER LINE_COUNT   BATCH_NO
        4905        154
        4899        143
        4925        123
        4900        110
        4936        106
        4901        103
        4911        101
        4902         91
        4903         91
        4887         90
        4904         85
        4926         81
        4930         75
        4934         73
        4935         71
        4906         68
        4907         66
        4896         57
        4909         57
        4908         56
        4894         55
        4910         51
        4912         49
        4914         49
        4915         48
        4893         48
        4916         48
        4913         48
        2894         47
        4917         47
        4920         46
        4918         46
        4919         46
        4886         45
        2882         45
        2876         44
        2901         44
        4921         44
        4891         43
        4922         43
        4923         42
        4884         41
        4924         40
        4927         39
        4895         38
        2853         38
        4890         37
        2852         37
        4929         37
        2885         37
        4931         37
        4928         37
        2850         36
        4932         36
        4897         36
        2905         36
        4933         36
        2843         36
        2833         35
        4937         35
        2880         34
        4938         34
        2836         34
        2872         34
        2841         33
        4889         33
        2865         31
        2889         30
        2813         29
        2902         28
        2818         28
        2820         27
        2839         27
        2884         27
        4892         27
        2827         26
        2837         22
        2883         20
        2866         18
        2849         17
        2857         17
        2871         17
        4898         16
        2840         15
        4874         13
        2856          8
        2846          7
        2847          7
        2870          7
        4885          6
        1938          6
        2893          6
        4888          2
        4880          1
        4875          1
        4881          1
        4883          1
        4879          1
        2899          1
        2898          1
        4882          1
        4877          1
        4876          1
        2891          1
        2890          1
        2892          1
        4878          1
107 rows selected.
SQL>
The batch_parameters columns:
low - minimum number of lines in the batch.
hi - maximum number of lines in the batch.
min_orders - minimum number of orders in the batch.
max_orders - maximum number of orders in the batch.
The issue is to create optimally sized batches for us to pick the orders. Usually you have to stay within the given low to hi line count, but there is a leeway of roughly 5% on the batch size (the number of lines in the batch).
For the number of orders in a batch, however, the leeway is zero.
So I have to assign these orders to the optimal mix of batches. For any given run, if I don't find the mix I am looking for, the last batch could be as small as one order with one line. But every order HAS to be batched in that run. No exceptions.
I have a procedure that does roughly this, but it leaves non-optimal orders alone. There is a risk of orders not getting batched because they didn't fall into the optimal mix, potentially missing our required dates. (I can write another procedure to clean up afterwards.)
I was thinking, given what analytic functions can do these days, that somebody might come up with the SQL that assigns the batch number (think of it as a sequence starting at 1); even a general direction would be enough.
Also, the batch_parameters limits are not hard and fast. Those numbers can change, but they give you the general idea.
Any ideas?
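One direction, sketched below under simplifying assumptions: order the rows and compute a running total of line_count with an analytic SUM, then derive a batch number by dividing by a single target size (200 here, standing in for the hi limit). The min_orders/max_orders limits are not enforced, so a pure analytic query like this only gets you partway; a full fit probably needs PL/SQL, the MODEL clause, or recursive subquery factoring.

```sql
-- Sketch only: greedy batch numbering by running line_count total.
-- Assumes a single target of 200 lines per batch; it does not enforce
-- the min_orders/max_orders limits from batch_parameters.
select order_number,
       line_count,
       ceil(sum(line_count)
              over (order by line_count desc, order_number
                    rows between unbounded preceding and current row)
            / 200) as batch_no
  from orders;
```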

OK, sorry about that; those were just guesstimates. I ran the program and here are the results.
SQL> SELECT SUM(line_count) no_of_lines_in_batch,
  2         COUNT(*) no_of_orders_in_batch,
  3         batch_no
  4    FROM orders o
  5   GROUP BY o.batch_no;
NO_OF_LINES_IN_BATCH NO_OF_ORDERS_IN_BATCH   BATCH_NO
                 199                     4     241140
                  99                     6     241143
                 199                     5     241150
                 197                     6     241156
                 196                     5     241148
                 199                     6     241152
                 164                     6     241160
                 216                     2     241128
                 194                     6     241159
                 297                     2     241123
                 199                     3     241124
                 192                     2     241132
                 199                     6     241136
                 199                     5     241142
                  94                     7     241161
                 199                     6     241129
                 154                     2     241135
                 193                     6     241154
                 199                     5     241133
                 199                     4     241138
                 199                     6     241146
                 191                     6     241158
22 rows selected.
SQL> select * from orders;
ORDER_NUMBER LINE_COUNT   BATCH_NO
        4905        154     241123
        4899        143     241123
        4925        123     241124
        4900        110     241128
        4936        106     241128
        4901        103     241129
        4911        101     241132
        4903         91     241132
        4902         91     241129
        4887         90     241133
        4904         85     241133
        4926         81     241135
        4930         75     241124
        4934         73     241135
        4935         71     241136
        4906         68     241136
        4907         66     241138
        4896         57     241136
        4909         57     241138
        4908         56     241138
        4894         55     241140
        4910         51     241140
        4914         49     241142
        4912         49     241140
        4915         48     241142
        4916         48     241142
        4913         48     241142
        4893         48     241143
        2894         47     241143
        4917         47     241146
        4919         46     241146
        4918         46     241146
        4920         46     241146
        2882         45     241148
        4886         45     241148
        2901         44     241148
        2876         44     241148
        4921         44     241140
        4891         43     241150
        4922         43     241150
        4923         42     241150
        4884         41     241150
        4924         40     241152
        4927         39     241152
        2853         38     241152
        4895         38     241152
        4931         37     241154
        2885         37     241152
        4929         37     241154
        4890         37     241154
        4928         37     241154
        2852         37     241154
        2843         36     241156
        2850         36     241156
        4932         36     241156
        4897         36     241156
        4933         36     241158
        2905         36     241156
        2833         35     241158
        4937         35     241158
        4938         34     241158
        2880         34     241159
        2872         34     241159
        2836         34     241158
        2841         33     241159
        4889         33     241159
        2865         31     241159
        2889         30     241150
        2813         29     241159
        2902         28     241160
        2818         28     241160
        4892         27     241160
        2884         27     241160
        2820         27     241160
        2839         27     241160
        2827         26     241161
        2837         22     241133
        2883         20     241138
        2866         18     241148
        2849         17     241161
        2871         17     241156
        2857         17     241158
        4898         16     241161
        2840         15     241161
        4874         13     241146
        2856          8     241154
        2847          7     241161
        2846          7     241161
        2870          7     241152
        2893          6     241142
        1938          6     241161
        4888          2     241129
        2890          1     241133
        2899          1     241136
        4877          1     241143
        4875          1     241143
        2892          1     241136
        4878          1     241146
        4876          1     241136
        2891          1     241133
        4880          1     241129
        4883          1     241143
        4879          1     241143
        2898          1     241129
        4882          1     241129
        4881          1     241124
106 rows selected.

As you can see, my code is a little buggy in that it may not strictly follow the batch_parameters, but being in the general area is acceptable.
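For what it's worth, a check along these lines could list which batches fall outside the configured ranges. This is a sketch: it assumes the roughly 5% leeway applies to the line count only, as described above.

```sql
-- Flag batches whose totals match no batch_parameters row,
-- allowing 5% leeway on the line count but none on the order count.
select b.batch_no, b.no_of_lines, b.no_of_orders
  from (select batch_no,
               sum(line_count) no_of_lines,
               count(*)        no_of_orders
          from orders
         group by batch_no) b
 where not exists
       (select 1
          from batch_parameters p
         where b.no_of_lines  between p.low * 0.95 and p.hi * 1.05
           and b.no_of_orders between p.min_orders and p.max_orders);
```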

Similar Messages

  • What would be the approach for this scenario

    If I choose data sources like 2LIS_02_HDR, 2LIS_02_ITM, and 2LIS_03_BF, should I go for LO-Cockpit extraction:
    Activate each DataSource and replicate it on the BW side.
    Activate the BI Content for the corresponding DataSource.
    Use a MultiProvider to include the cubes.
    Generate reports on those cubes.
    Is my approach correct?
    Otherwise, can I go for generic extraction? I need three fields from each cube, so I could create a view for the DataSource.
    Is this the right way?

    Hi,
    Also check the below link,
    Generic DataSources allow you to extract data which cannot be supplied to the BW either with the DataSources delivered within Business Content or with customer-defined DataSources of the application.
    http://help.sap.com/saphelp_nw04/helpdata/en/3f/548c9ec754ee4d90188a4f108e0121/frameset.htm
    Regards,
    Satya

  • Need suggestions or possible approach for this problem

    Hello,
    I have a scenario and I want to develop an apex application for this scenario.
    The thing is I have multiple report regions on a page, all querying the same table, 'loans'. I have a button named 'Assign loan' at the top of the page, and at the bottom I have 'Save' and 'Complete' buttons. Initially the report regions should not be displayed; only after the user clicks the 'Assign loan' button should they become visible, and similarly for the 'Save' and 'Complete' buttons. After a loan is assigned to a user, the 'Assign loan' button should be disabled.
    Now consider a situation where a user logs in to the application and clicks 'Assign loan'. At this point I will update the table and set a flag so that the user gets that particular record in the report regions. That record should be locked for the user and unavailable to other users. This has to work in a multi-user environment: each user should get a different loan when they click 'Assign loan'. Once assigned a loan, the user changes the report regions' editable items and can either save the changes by clicking 'Save' or click 'Complete', which means the loan was reviewed and is finished. The user shouldn't be assigned another loan until he or she reviews and completes the loan already assigned. Likewise, if the user makes changes and clicks 'Save' (meaning pending), the changes should be saved and no new loan assigned until the already assigned loan is completed.
    Can anyone please help me with their possible suggestions or an approach to this kind of problem.
    thanks,
    Orton

    It looks to me that the trickiest part is preventing more than one user from getting assigned the same loan. I've seen DBMS_LOCK used for a situation like this. It's been years so I'm a bit fuzzy on the exact details but it goes something like this:
    When the user clicks 'Assign a loan', try to get an exclusive generic lock:
       dbms_lock.request(lockhandle => 'LOAN_LOCK',
                         lockmode   => dbms_lock.x_mode,
                         timeout    => 10);
    If another user already has a lock with this name, try again after the timeout until the lock is obtained (and probably only try a maximum number of times).
    Once the lock is obtained, get the next available unassigned loan and set the flag in the table. Now release the lock so the next user can get a loan assigned.
       dbms_lock.release(lockhandle => 'LOAN_LOCK');
    As long as everyone uses the same process for getting a loan assigned, only the user with a lock can modify the table. The rest of it (the logic around what buttons to show, requiring an assigned loan to be completed before getting another one assigned, etc.) should be relatively straightforward.
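Filling in the fuzzy details, a fuller sketch might look like the following. Note that DBMS_LOCK.REQUEST takes a lock handle obtained from DBMS_LOCK.ALLOCATE_UNIQUE rather than the raw lock name; 'LOAN_LOCK' is just an example name, and the loan-assignment step is a placeholder.

```sql
declare
  l_handle varchar2(128);
  l_status integer;
begin
  -- Turn the agreed-upon lock name into a handle (safe to call repeatedly).
  dbms_lock.allocate_unique(lockname => 'LOAN_LOCK', lockhandle => l_handle);

  -- Try for up to 10 seconds to get the exclusive lock.
  l_status := dbms_lock.request(lockhandle        => l_handle,
                                lockmode          => dbms_lock.x_mode,
                                timeout           => 10,
                                release_on_commit => false);

  if l_status = 0 then
    -- Placeholder: pick the next unassigned loan and set its flag here.
    null;
    l_status := dbms_lock.release(lockhandle => l_handle);
  end if;  -- l_status = 1 means the request timed out; retry or give up.
end;
/
```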

  • Anyway to "UN" Ken Burns stills globally or as default bringing in jpeg's?  New to iMovie. I have 3 titles that build.  First is logo, add prod't name second line, add date 3rd line.  They shift when dissolving between them. Title function NG for this.

    The people at the Apple store gave me a long, tedious workaround: open Pages and bring in the logo, export it as a PDF, open the PDF in Preview and then "Save As" a jpg to the desktop, and drag and drop it into iMovie. Even with this process, the text will sometimes shift. When it works, it looks good: the logo comes up, then the second line fades in under it as part of the build, and then the third line completes the graphic. You would think something this easy and straightforward would be easy.
    HarveyV

    I did the work around for the titles/jpg's in this short video. 
    I'll try your fix on the next one.  I have 22 short 3-5 minute videos to produce
    and I'm on number 5.  This should save me a bunch of time. 
    It would appear that you know more than the guy at the Apple Store. 
    Thank goodness for on-line forums.
    I already tried fixing them individually as you suggested,
      by going the crop/fit route.  However, even then,
      the second in the series had always shifted. 
    The first and the third images lined up, but not number 2.. 
    So  I brought in all the images, (1, 2 and 3).  
    and then the 2nd image again and put it in the third position
    in the sequence on the time line. I deleted the "original" 2nd image.
    So I had 1, 3, then 2.  Then I dragged and dropped the correct number two
    into position two and it worked.  So what happens is that the
    client logo appears, then that stays put and the next line dissolves in
      and then the third dissolves in.  They are three separate titles, but they
    are aligned perfectly so that they build as I dissolve between them.
    I think the problem with the edit copy route would be that while
    I could edit/copy the logo, I couldn't position text through the
    iMovie text function exactly as it only gives you middle or bottom third etc.
      Each title with the product name is positioned individually depending on
    length and number of lines. By going through Pages, I can position the logo
    and the text exactly.  Also, some logos are horizontal, while others
    are square etc., so text positioning is crucial.Anyhow,
    I'll try the global approach for the next one and see how that works.
      Thanks very much for your insight and especially for your prompt reply.
    I trust you're inside on this gorgeous July 4th because you're
    making money on your computer. ;-)HarveyVOn 7/4/2011 3:57 PM, Apple Support Communities Updates wrote:
    > Apple Support Communities <https://discussions.apple.com/index.jspa>>>>       Re: Anyway to "UN" Ken Burns stills globally or as default>       bringing in jpeg's? New to iMovie. I have 3 titles that build.>       First is logo, add prod't name second line, add date 3rd line.>       They shift when dissolving between them. Title function NG for...>> created by AppleMan1958 > <https://discussions.apple.com/people/AppleMan1958> in /iMovie/ - View > the full discussion > <https://discussions.apple.com/message/15552380#15552380>> ------------------------------------------------------------------------>> I am baffled by what you are asking...but if you are asking how to > turn off the Ken Burns effect, you can do it for future photos you add > by going to File/Project Preferences and setting the default initial > photo placement to FIT or CROP rather than Ken Burns.>> For a global change on photos that are already in your project, pick a > photo. Open the Rotate, Crop, Ken Burns Tool. Select FIT or CROP.  > Exit the tool. Now select this same photo and EDIT/COPY.>> Now, EDIT/SELECT ALL.>> Finally, EDIT/PASTE ADJUSTMENTS.../CROP ADJUSTMENTS.

  • Help required specifying Transation attributes for this scenario

    Hi ,
    I am trying to create/update rows in a database using BMP and CMP beans.
    A business method (Method1) in a session bean calls a non-business method (Method2) in the same session bean, which in turn calls an entity bean (EB1, BMP). EB1 can throw a certain business exception, upon which Method1 in the session bean calls another entity bean (EB2, CMP) in a loop.
    The problem is that when EB1 throws the business exception, I get an exception (part of the stack trace is attached below).
    Could anyone please explain what transaction attributes should be specified for this scenario?
    Using RequiresNew for the entity beans would not work (or would it?) because the entity bean is called in a loop and the commit or rollback should happen for all the methods.
    I feel the problem could be solved by specifying the transaction attribute Required for Method2 (the non-business method in the session bean), but I guess this is not possible.
    How exactly will the transaction behave in this scenario? Is the exception caused because EB1 has thrown an exception and I am trying to continue the transaction?
    Could someone please suggest a solution or workaround for this problem.
    Regards,
    Harsha
    ---- Begin backtrace for nested exception
    java.lang.IllegalStateException
    at com.ibm.ws.Transaction.JTA.TransactionImpl.enlistResource(TransactionImpl.java:1694)
    javax.ejb.EJBException: nested exception is: com.ibm.ws.ejbpersistence.utilpm.PersistenceManagerException: PMGR6022E: Error using adapter to create or execute an Interaction. com.ibm.ws.rsadapter.cci.WSInteractionImpl@28d16547
    .

    Try the business method (Method1) in the session bean with the transaction attribute RequiresNew.
    Catch the exception originating from EB1 (BMP) in Method2.
    Make the method in EB2 (CMP) Required or Mandatory.
    This is a guess, so just try it out and let me know if it works.
    It is also recommended not to mix BMPs and CMPs in one application; use one or the other.

  • What’s the best practice for this scenario?

    Hi,
    My users want the ability to change the WHERE and/or ORDER BY clause at runtime. They may define user preferences on each screen ( which is bind to a view object). They want to see the same records based on WHERE/ORDER BY defined on the last visit. That is why I keep the users preferences and load the screen based on that, using :
    View.setWhereClause(...);
    View.setOrderByClause(...);
    View.executeQuery();
    This works well when only one user is working with the application, but I faced low performance when more than one user is working with it.
    What can be done to increase performance, and what is the best practice for this scenario?
    Thanks for your help in advance.

    Sung,
    I am talking about only 2 users in my testing. I am sure I missed something but could not figure out what.
    This page is my custom query page. It includes a tag to instantiate the application module in stateful mode at the top (<jbo:ApplicationModule..>), a tag to instantiate the data source (<jbo:Datasource...>), a release tag at the bottom (<jbo:ReleasePageResources..>), and some Java code in the middle (body). The Java code constructs the query statement and then fires the query to set up the view object, using the methods above.
    I am seeing very slow performance when two clients load this page at the same time. It looks like the entire application locks for other users while one client loads this page and fires the query. I realized the bottleneck is where executeQuery() executes.
    What do you think?
    Thanks in advance for your comments.

  • Design Patterns, best approach for this app

    Hi all,
    I am starting with design patterns, and I would like to hear your opinion on what would be the best approach for this app.
    This is basically an app for data monitoring, analysis, and logging (voltage, temperature & vibration).
    I am using 3 devices for N channels (NI 9211A, NI 9215A, NI PXI 4472), all running at different rates, asynchronously.
    Signals are being processed and monitored for logging at a rate specified by the user, and in real time as well.
    Individual devices can be initialized or stopped at any time.
    Basically I'm using 5 loops:
    *1.- GUI: Stop App, Reload Plot Names  (Event handling)
    *2.- Chart & Log:  Monitors Data and Start/Stop log data at a specified time in the GUI (State Machine)
    *3.- Temperature DAQ monitoring @ 3 S/s  (State Machine)   NI 9211A
    *4.- Voltage DAQ monitoring and scaling @ 1K kS/s (State Machine) NI 9215A
    *5.- Vibration DAQ monitoring and Analysis @ 25.6 kS/s (State Machine) NI PXI 4472
    i have attached the files for review, thanks in advance for taking the time.
    Attachments:
    V-T-G Monitor_Logger.llb ‏355 KB

    mundo wrote:
    thanks Will for your response,
    So, basically, could I apply a producer/consumer architecture to just the vibration analysis loop, or to all data being collected by the Monitor/Logger loop?
    Is it OK to have individual loops for every DAQ device, as shown?
    thanks.
    You could use the producer/consumer architecture to split the areas where you are doing both the data collection and the analysis in the same state machine. If one of these processes is not time critical, or the data rate is slow enough, you could leave it in a single state machine. I admit that I didn't look through your code, but based purely on the descriptions above I would imagine that you could change the three collection state machines to use a producer/consumer architecture. I would leave your UI processing in its own loop, as well as the logging process. If the logging is time critical you may want to split that as well.
    Mark Yedinak
    "Does anyone know where the love of God goes when the waves turn the minutes to hours?"
    Wreck of the Edmund Fitzgerald - Gordon Lightfoot

  • Need inputs from you for this scenario?

    Hi all,
    I have one scenario and I need inputs from you all.
    My scenario is like this:
    I get data from online transactions. I need to collect all of the day's transactions into one folder and upload them into the SAP system at one particular time.
    Which are the best adapters for this scenario?
    If I have a standard IDoc I will go with the IDoc adapter on the receiver side; if not, a proxy.
    But how do I collect all the data into one folder and schedule the process at a particular time?
    How do I do this, and which adapters can I use?
    Thanks and Regards,
    Phani Kumar.

    Hi,
    For online transactions, and for tracking and saving them, you can write a script that prepares an XML file collecting all transactions for a particular order, always appending to the end of that XML file. At the end of the day you will have a complete transaction list in the XML file you are creating. You can name the file as you like, but I would prefer a date-based name. After the transaction-recording step completes, put the file in some inbound folder from where your XI system will take its input.
    Now this is a file-to-IDoc scenario (or whatever you want to use to post data to the SAP system). Then schedule this scenario to run at night or at your desired time. After processing the XML file, move it to some other 'completed' folder so it is not picked up again.
    For collecting transactions you could also use a database; then your scenario would be JDBC to IDoc (or whatever you use to post data into the SAP system).
    Hope this helps.
    Regards
    Aashish Sinha
    PS: reward points if helpful

  • Oracle features available for this scenario

    What is the best method for replicating an Oracle database from a production database at a 15-minute interval?
    What Oracle features are available for this scenario?
    Thanks
    Shiju

    orashiju wrote:
    What is the best method for replicating an Oracle database from a production database at a 15-minute interval?
    What Oracle features are available for this scenario?
    Thanks
    Shiju

    Is there any specific reason to open a new thread for the same discussion that you have started here?
    Please suggest a suitable method
    Aman....

  • Programming logic for this scenario

    hi all,
    kindly help me with this scenario:
    I have an internal table with fields like this (among others):
    OBJEK                ATINN       CHAR                       CHARG        CHAR1
    000000000000000031   0000000188  Batchnumber: WEEK NO. 9    0000000052
    000000000000000031   0000000189  Visualinspection: OK       0000000052
    Now what I need to do is, for the SAME batch number, concatenate the values of CHAR into CHAR1.
    That is to say, CHAR1 should have the value "Batchnumber: WEEK NO. 9 Visualinspection: OK".
    Right now I've done it using 2 different internal tables and concatenating the values. I want to know if there is an easier and simpler way.
    Any pointers, guys?
    pk

    Solved it myself,
    thanks to Sujatha Reddy's post in the following thread:
    Re: AT END OF statement
    pk
    Edited by: prashanth kishan on Jul 11, 2008 9:19 AM

  • Best approach for this problem

    Hi there Experts,
    I have an async-sync bridge BPM to go from delivery IDocs to web service calls; after that I map the response to an ALEAUD IDoc structure and send it back to ERP to update the delivery status.
    Currently it is working as intended on the production system, but the delivery volume is growing exponentially, raising the response times for the whole process.
    I found a solution for this problem: instead of making one web service call per delivery, I can group deliveries and send multiple deliveries per web service call. The problem appears when I need to map this response back to individual ALEAUD IDocs.
    To update the original delivery IDoc I need that IDoc number in the ALEAUD, but the web service returns the response keyed by delivery number. Therefore I need a mechanism, maybe a temporary table or something, that lets me store the IDoc number with the matching delivery number so I can use it as a reference to map the large response to individual ALEAUD IDocs. I'd also like to know if this is possible using graphical mapping, or what would be the best way to do it.
    Thanks for all the input on the matter,
    Regards,
    Roberto.

    Maybe you can write 2 RFC function modules: one to maintain the values in ECC and the other to read them. These RFCs can be called from within the mapping.
    I don't know if it will work or not, but it can be tried.

  • Does ODI fit for this scenario...?

    hi all,
    I have 200 pieces of equipment (systems) per site. The number of files/messages can be 2 per minute per piece of equipment, which implies 400 files/messages per minute.
    The size of each file/message can be 3 MB to 20 MB per minute.
    I want to take all this data and perform some validations, aggregations, and so on. Then I'll put this aggregated data into a database, from where
    reporting tools, query builders, and other analytical tools will take the data and generate reports, dashboards, etc.
    Can I solve this problem using the ODI tool? I know I can't provide the solution for this problem using ODI alone; I may have to use some other tools along with ODI.
    I guess getting the data from the equipment, performing the validations and aggregations, and loading the aggregated data into the database can be done using ODI?
    And for analytics I would have to use other analytical tools?
    So ODI + analytical tools could be the solution?
    /mishit

    Hi
    My scenario is like this: I am transferring data from one database to another (some amount of data). I want detailed information about how much data has been transferred and how much is pending, over some period of time, like a weekly or daily report. Right now I get only a little information through the operator, and I need a report of this. Is there any way to do that? Someone mentioned something like a web client.
    What is meant by 'web client' here? Do I need to install anything for that?
    Can anyone help?
    Regards
    Raghav

  • How to write the pl/sql code for this scenario

    Hi,
    Here is the data in a table with the below columns:
    ID  FIRSTNAME  LASTNAME    CODE
    1   scott      higgins     001
    2   Mike       Ferrari     001
    3   Tom        scottsdale  002
    4   Nick       pasquale    003
    1   scott      higgins     004
    I want to trap the following exceptions into an error table using PL/SQL:
    1. The same person having multiple codes, for example:
    ID  FIRSTNAME  LASTNAME  CODE
    1   scott      higgins   001
    1   scott      higgins   004
    2. Multiple persons having the same code:
    ID  FIRSTNAME  LASTNAME  CODE
    1   scott      higgins   001
    2   Mike       Ferrari   001
    Could you please help me capture the exceptions above using PL/SQL?
    And what should the structure of the error table be? It should capture an effective date and an end date.

    or using analytic functions like this (note the WITH clause must come before the main SELECT, and "same person" should partition by both first and last name):

    -- 1. Same person having multiple codes
    with t as (
        select 1 id, 'scott' fname, 'higgins' lname, 001 code from dual
        union all
        select 2 id, 'Mike' fname, 'Ferrari' lname, 001 code from dual
        union all
        select 3 id, 'Tom' fname, 'scottsdale' lname, 002 code from dual
        union all
        select 4 id, 'Nick' fname, 'pasquale' lname, 003 code from dual
        union all
        select 5 id, 'scott' fname, 'higgins' lname, 004 code from dual
    )
    select id, fname, lname, code
    from (select t.*, count(*) over (partition by fname, lname) row_count from t)
    where row_count > 1;

    -- 2. Multiple persons having the same code
    with t as (
        select 1 id, 'scott' fname, 'higgins' lname, 001 code from dual
        union all
        select 2 id, 'Mike' fname, 'Ferrari' lname, 001 code from dual
        union all
        select 3 id, 'Tom' fname, 'scottsdale' lname, 002 code from dual
        union all
        select 4 id, 'Nick' fname, 'pasquale' lname, 003 code from dual
        union all
        select 5 id, 'scott' fname, 'higgins' lname, 004 code from dual
    )
    select id, fname, lname, code
    from (select t.*, count(*) over (partition by code) row_count from t)
    where row_count > 1;
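    To actually capture these rows into an error table from PL/SQL, one possible sketch is below. The table and column names (error_log, persons) are assumptions, since the original post does not name the source table; eff_date/end_date are filled with SYSDATE and NULL as placeholders:

    ```sql
    -- Hypothetical error table; names and lengths are assumptions, not from the post
    create table error_log (
        id         number,
        firstname  varchar2(50),
        lastname   varchar2(50),
        code       varchar2(10),
        error_type varchar2(40),
        eff_date   date,
        end_date   date
    );

    -- Assuming the source table is called PERSONS
    begin
        -- Exception 1: same person (first + last name) with more than one code
        insert into error_log (id, firstname, lastname, code, error_type, eff_date, end_date)
        select id, firstname, lastname, code,
               'SAME PERSON, MULTIPLE CODES', sysdate, null
        from (select p.*,
                     count(distinct code) over (partition by firstname, lastname) code_cnt
              from persons p)
        where code_cnt > 1;

        -- Exception 2: same code shared by more than one person
        insert into error_log (id, firstname, lastname, code, error_type, eff_date, end_date)
        select id, firstname, lastname, code,
               'MULTIPLE PERSONS, SAME CODE', sysdate, null
        from (select p.*,
                     count(distinct firstname || '~' || lastname) over (partition by code) person_cnt
              from persons p)
        where person_cnt > 1;

        commit;
    end;
    /
    ```

    A scheduled job could run this block periodically; end_date could later be set when a row is corrected in the source table.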

  • What is the best Skype setup for this scenario?

    I live in the United States and am about to be deployed to Afghanistan for about one year.  I will have a laptop and an internationally capable Android phone, both with Skype software installed.  I want to be able to make Skype-to-Skype calls via the laptop, and also be able to call both landlines and cell phones in the United States while I am in Afghanistan.  What would be the best Skype subscription plan in this scenario to keep costs to a minimum?

    Hi,
    As you probably know, Skype-to-Skype calls are free, but if you wish to call phones then you need a subscription that covers the calling destination.
    You might find the Unlimited US & Canada subscription here: http://www.skype.com/go/subscriptions
    A fair usage policy applies, but 6 hours per day should be more than enough to call your family and friends back home in the US:
    http://www.skype.com/go/terms.fairusage
    Skype also offers Online Numbers for the US. For example, if you purchase an Online Number in the US, then your friends who don't have Skype can call that number. They will be charged local rates, and the call will be directed to your Skype account when you are online on your phone or laptop.
    Tip: if you do decide to purchase an Online Number for 12 months, purchase the calling subscription first and you will get a 50% discount on the Online Number.
    https://support.skype.com/en/faq/FA331/What-is-an-Online-Number
    Andre

  • What Permission Set Is Required For This Scenario?

    Hello,
    I currently have an environment where users are limited to creating and deleting only the databases they themselves created.  I did this by creating a login with public access and granting the CREATE ANY DATABASE permission to that
    login. This works great, as users can create a database and delete their own database, but not anyone else's. The issue I have is the following scenario:
    Database A is backed up from Server 1 (a different server than the restore server) and the DB was created by User 1
    Database A is restored to Server 2 by User 2 who has CREATE ANY DATABASE permission (successfully).
    User 2 can see the restored database but cannot access it; a "The database is not accessible" error is raised.
    When I view the database I see that the DB Owner is listed as User 1.
    I've been trying to figure out how to be able to allow the user to become the new owner so they can edit/delete the database, but still not affect other databases that they do not own.
    I've played with various permission sets, but they all end up allowing users to delete other users' databases, which I'm trying to avoid.  I also don't want to have to change the owner myself.
    Is there a permission set that I can grant that will allow this scenario?

    That's a valid statement; let me explain my scenario, and perhaps there is a better way to construct what I'm after.
    I have a QA server where different developers create databases.  All the databases backed up on the QA server are stored in a shared folder.  This shared folder is accessible on our DEV server.  We have a different set of developers
    who at times need to restore one of the QA backups on the DEV server for various testing reasons.  The permission set was applied simply to prevent someone from accidentally deleting someone else's database.  Restoring someone else's database in
    our environment is not a concern.
    With this in mind, I was hoping there was an additional permission I could grant that would allow the user restoring the database to become the new owner. I don't want the burden of approving it, as it will always be approved.
    I clicked on the Connect link, but it returns as invalid:
    "The system has encountered an unexpected error. We apologize for the inconvenience. The issue will be addressed as quickly as possible."
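    For reference, the usual statement for transferring ownership after the restore is ALTER AUTHORIZATION. A sketch is below; the database and login names (DatabaseA, DOMAIN\User2) are placeholders, not from the original post. Note that the caller needs TAKE OWNERSHIP permission on the database, and must either be the new owner or hold IMPERSONATE on the new owner's login, so granting TAKE OWNERSHIP per database is one way to let the restoring user claim ownership themselves:

    ```sql
    -- Placeholder names; run after the restore completes.
    -- Requires TAKE OWNERSHIP on the database, and the caller must be the
    -- new owner (or have IMPERSONATE permission on that login).
    ALTER AUTHORIZATION ON DATABASE::DatabaseA TO [DOMAIN\User2];
    ```

    The older sp_changedbowner procedure does the same thing but is deprecated in recent SQL Server versions.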
