How to relax the constraints of the dependent package

Updating Multiple Packages
Packages might have cross-dependencies and thus require updating at the same time. The packaging software enforces a consistent image on the system: if it cannot satisfy all of the dependencies, none of the packages are modified. If such a dependency blocks an update, you need to relax the version constraints on the affected packages.
The message reported by the packaging software will be of the following form:
# pkg update [email protected]
Creating Plan \               
pkg update: No matching version of package_name can be installed:
   Reject:  pkg://solaris/[email protected],5.11-0.175.0.2.0.3.0:...
   Reason:  All versions matching 'require' dependency
   pkg:/[email protected],5.11-0.175.0.2.0.3.0 are rejected
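The usual fix is to clear the version-lock facet that the incorporation places on the package and then retry the update. A rough sketch, reusing the placeholder package name from the message above:
# pkg change-facet facet.version-lock.package_name=false
# pkg update package_name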

You might also want to review the Solaris 11 documentation:
Relaxing Version Constraints Specified by Incorporations - Adding and Updating Oracle Solaris 11.1 Software Packages
-- Alan

Similar Messages

  • I have an early 2008 MacBook Pro and need to upgrade the hard drive so that I can add a Windows environment. What is the largest drive I can put in this computer? Is the storage size the constraint, or the physical size of the drive, or a combination?

    The largest capacity HDD that you can install is the HGST 1.5 TB HDD:
    http://eshop.macsales.com/item/HGST/0J28001/
    Any 2.5" 9.5mm thick SATA HDD will fit and operate in your MBP, regardless of capacity.
    Ciao.

  • Not able to find the constraint in the database

    When I am trying to update a table I am getting an error like:
    ORA-00001: unique constraint (IRP_DXX.SECURITY_UK) violated
    ORA-06512: at "IRP_DXX.AU_COMPANY_ABBREVIATION_TR", line 20
    ORA-04088: error during execution of trigger 'IRP_DXX.AU_COMPANY_ABBREVIATION_TR'
    When I try to query for the constraint name in the all_cons_columns table, I am not able to find the constraint
    select * from all_cons_columns
    where constraint_name = 'SECURITY_UK';
    no rows selected.
    Can anybody help in finding what may be the issue here?

    Yes, this is the name of a unique constraint on a particular column. Thanks for the pointer; it is strange to me.
    So we can enforce uniqueness on a column without actually creating a unique constraint, but indirectly, by creating a UNIQUE INDEX on the column.
    Is my understanding right?
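    If the name does not show up in ALL_CONS_COLUMNS, it is worth checking whether SECURITY_UK is a plain unique index rather than a declared constraint, since a unique index on its own also raises ORA-00001 under its own name. A quick dictionary check:
    -- Is SECURITY_UK a unique index without a matching constraint?
    SELECT owner, index_name, uniqueness
      FROM all_indexes
     WHERE index_name = 'SECURITY_UK';
    SELECT index_owner, table_name, column_name
      FROM all_ind_columns
     WHERE index_name = 'SECURITY_UK';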

  • How to solve the ORA-00054 error while dropping a constraint

    I am trying to drop a constraint on a table, but it gives the error below:
    ORA-00054: resource busy and acquire with NOWAIT specified or timeout expired
    Can you please tell me how to solve this problem? On my PC I am not using that table anywhere in the system.
    ALTER TABLE EIIS_JBWSTOCK
    DROP CONSTRAINT CHK_TRAN_JOB_TYPE;
    This is the statement I am using to drop the constraint.
    thanks

    You can find the <sid, serial#> of the session holding the lock and kill it:
    SELECT c.owner,
           c.object_name,
           c.object_type,
           b.SID,
           b.serial#,
           b.status,
           b.osuser,
           b.machine
      FROM v$locked_object a, v$session b, dba_objects c
     WHERE b.SID = a.session_id AND a.object_id = c.object_id; -- You may add an extra condition for your table.
    ALTER SYSTEM KILL SESSION '<sid>,<serial#>';
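    On 11g and later, an alternative to hunting down and killing sessions is to let the DDL wait for the lock instead of failing immediately. A rough sketch reusing the statement from the question; the 60-second timeout is an arbitrary choice:
    -- Wait up to 60 seconds for the lock instead of raising ORA-00054
    ALTER SESSION SET ddl_lock_timeout = 60;
    ALTER TABLE EIIS_JBWSTOCK
    DROP CONSTRAINT CHK_TRAN_JOB_TYPE;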

  • RI - What if Parent is refreshed - What happens to the constraints

    Hello
    I have a parent-child scenario. What happens if the parent table is completely refreshed? What happens to the child table (that has cascade delete)?
    If RI exists, will a complete refresh of parent be possible?
    What is the best way to handle this scenario?
    Thx!

    One of the quickest ways of identifying duplicate records in a table is to enable a unique (or primary key) constraint knowing ahead of time that it will choke on the duplicates, but using the EXCEPTIONS INTO EXCEPTIONS clause which will cause the rowids of the duplicates to be written into an EXCEPTIONS table (created ahead of time with the rdbms\admin\utlexcpt.sql script, for example).
    Therefore, imagine a million row table into which a fresh bulk load of another million rows from some external source is to be performed.
    You declare the constraint, you disable it, you perform the load, you enable it exceptions into exceptions. You now can clear out the duplicate data quite easily, and then enable the constraint normally. Next load, you disable the constraint all over again and repeat the process.
    That's one reason you might want to repeatedly disable/enable a constraint.
    Consider another: a unique or primary key constraint, by default, when created, will cause a unique index to be created on the table. Doing any sort of bulk load into a table, even of data you know to be 'clean' of duplicates, will be a lot slower as a result of each insert having to update the index. Therefore, one of the things you commonly do to speed up bulk loads is to disable constraints, therefore removing any associated unique indexes, and therefore saving the time that would normally be spent validating data and updating indexes.
    In the old days before they invented deferrable constraints, too, you might well be engaged in a complex data loading process that involves quite a few steps before finishing. If looked at during those steps, you'd say the data violated constraints... but if you just let me get to the end, you'll find that the data all comes good. So you had to be able to disable constraint checking at the start of the process, and re-enable it at the end.
    A good example of that is Scott's EMP table: it has a manager column and the data in that column refers to the EMPNO column. A manager has to be an employee, after all! So there's a self-referencing foreign key on that MGR column, which means you can't load a record containing a manager number that isn't already an employee. But if you were doing a bulk load of data into that table, starting with it completely empty, that would mean you'd never be able to load any data into it at all: no employees exist to begin with, so there are no managers, so the first employee record that mentions a manager gets rejected. But if only you'd let me do the initial load of data, you'd see that every manager mentioned DOES turn out to be an employee! At the end of the load, it all comes good, and everything is properly related to everything else.
    Short of being very careful about the order in which you load such data (presumably, you'd have to start with Mr King, who runs the company, then deal with the first lot of managers, and then you could load the clerks and salesmen), the simplest way of resolving that sort of Catch-22 is to disable the constraint, do the load, and then enable the constraint when all the data is in place and therefore all the relationships between records have come good.
    (As I say, since version 8.0 of Oracle, you can also defer the constraint so that it only gets checked at the point you commit all the data, but leaving that subtlety aside, it's still a classic example of where switching constraint checking on and off proves very useful).
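    For concreteness, a rough sketch of that exceptions-into workflow, using placeholder table and constraint names (the EXCEPTIONS table comes from the utlexcpt.sql script mentioned above):
    -- Create the EXCEPTIONS table once; the script ships with the database
    @?/rdbms/admin/utlexcpt.sql
    ALTER TABLE orders DISABLE CONSTRAINT orders_pk;
    -- ... bulk load runs here ...
    -- This ENABLE fails if duplicates exist, but their rowids are written to EXCEPTIONS
    ALTER TABLE orders ENABLE CONSTRAINT orders_pk EXCEPTIONS INTO exceptions;
    -- Inspect (and clear out) the duplicates, then enable the constraint normally
    SELECT * FROM orders WHERE rowid IN (SELECT row_id FROM exceptions);
    ALTER TABLE orders ENABLE CONSTRAINT orders_pk;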

  • Creating Constraint for a Dependency

    Hi all,
    I am basically an ABAPer. There is a requirement to create a constraint for a dependency. When I go to transaction CU01 to create a dependency and try to add the code for the constraint, the screen does not show me any means of adding the code. There are options for adding PRECONDITION, ACTION, SELECTION CONDITION and PROCEDURE. But when I go to transaction CU03 I see an additional option on the screen, RULES & CONSTRAINTS. Could anyone help me out in creating the constraint for a dependency?
    Thanks in Advance
    Shiv

    Hi Gurav,
    Thanks for your previous reply. The requirement is that I have to compare the weight assigned to the material while creating a sales order with the max and min weight set for the material in variant configuration.
    Example:
                 Material     -     312.
                 Min Weight  -   100 Gm
                 Max Weight -   1000 Gm
    Variant Configuration details:
    Variant Tables - ZZDIS_FMT_MINWGT & ZZDIS_FMT_MAXWGT
    Dependency Net - ZZDIS_CON_FMT_MINWGT & ZZDIS_CON_FMT_MAXWGT
    Dependency (constraint) -   ZZDIS_CON_FMT_MIN01 &   ZZDIS_CON_FMT_MAX01.
    Characteristics : ZZDIS_FMT_MINWGT & ZZDIS_FMT_MAXWGT
    Class: ZZDIS_FMT_MINWGT & ZZDIS_FMT_MAXWGT.
            I need to write the code in the constraint with the above details. Could you please help me on that.
    My official mail Id - [email protected]
    Thanks in Advance
    Shiv

  • How to disable all Constraints for a Table

    Hi There,
    So I have a table that I need to delete a significant number of records from. Following some advice I found, it is better to select the records that I want to keep into a temporary table and then truncate the original table. After that I insert the contents of the temp table back into the original table.
    So now I am thinking I could speed this up even more if I disable all the constraints on the original table.
    Is there an easy way to do this or do I need to disable each constraint individually?
    thanks
    John

    http://forums.oracle.com/forums/search.jspa?threadID=&q=disable+all+constraints+&objID=c84&dateRange=all&userID=&numResults=15
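    For reference, a common way to do this without typing each statement is to generate the ALTER statements from the data dictionary. A rough sketch with a placeholder table name (note that TRUNCATE is still blocked by enabled foreign keys in other tables that reference this one):
    -- Generate DISABLE statements for every constraint on the table
    SELECT 'ALTER TABLE ' || table_name || ' DISABLE CONSTRAINT ' || constraint_name || ';'
      FROM user_constraints
     WHERE table_name = 'MY_TABLE';
    -- After reloading the data, generate the matching ENABLE statements
    SELECT 'ALTER TABLE ' || table_name || ' ENABLE CONSTRAINT ' || constraint_name || ';'
      FROM user_constraints
     WHERE table_name = 'MY_TABLE';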

  • How to apply the constraint ONLY to new rows

    Hi, Gurus:
       I have one question as follows:
    We need to migrate a legacy system to a new production server. I am required to add two columns to every table in order to record who updated the row most recently, through triggers, and I should apply a NOT NULL constraint to the columns. However, since the legacy system already has data in every table, and the old data has no values for the two new columns, applying the constraint would make all of the existing rows raise an exception. I wonder if there is a way to apply the constraint ONLY to new rows that come in the future.
    Thanks.
    Sam

       We need to migrate a legacy system to a new production server. I am required to add two columns to every table in order to record who updates the row most recently through triggers, and  I should apply not null constraint to the columns .
    The best suggestion I can give you is to make sure management documents the name of the person that came up with that hare-brained requirement, so they can be suitably punished in the future for the tremendous waste of human and database resources they caused, for which they got virtually NOTHING in return.
    I have seen many systems over the past 25+years that have added columns such as those: CREATED_DATE, CREATED_BY, MODIFIED_DATE, MODIFIED_BY.
    I have yet to see even ONE system where that information is actually useful for any real purpose. Many systems have application/schema users and those users can modify the data. Also, any DBA can modify the data and many of them can connect as the schema owner to do that.
    Many tables also get updated by other applications or bulk load processes and those processes use generic connections that can NOT be tied back to any particular system.
    The net result is that those columns will be populated by user names that are utterly useless for any auditing purposes.
    If a user is allowed to modify a table they are allowed to modify a table. If you want to track that you should implement a proper security strategy using Oracle's AUDIT functionality.
    Cluttering up ALL, or even many, of your tables with such columns is a TERRIBLE idea. Worse, adding triggers that serve no purpose other than capturing useless information, and that cause performance impacts because they are PL/SQL, just aggravates the total impact.
    It is certainly appropriate to be concerned about the security and auditability of your important data. But adding columns and triggers such as those proposed is NOT the proper solution to achieve that security.
    Before your organization makes such an idiotic decision you should propose that the same steps be taken before adding that functionality that you should take before the addition of ANY MAJOR structural or application changes:
    1. document the actual requirement
    2. document and justify the business reasons for that requirement
    3. perform testing that shows the impact of that requirement on the production system
    4. determine the resource cost (people, storage, etc) of implementing that requirement
    5. demonstrate how that information will actually be used EFFECTIVELY for some business purpose
    As regards items #1 and #2 above the requirement should be stated in terms of the PROBLEM to be solved, not some preconceived notion of the solution that should be used.
    Your org should also talk to other orgs or other depts in your same org that have used your proposed solution and find out how useful it has been for them. If you do this research you will likely find that it hasn't met their needs at all.
    And in your own org there are likely some applications with tables that already have such columns. Has anyone there EVER used those columns and found them invaluable for identifying and resolving any actual problem?
    If you can't use them and their data for some important process why add them to begin with?
    IMHO it is a total waste of time and resources to add such columns to ALL of your tables. Any such approach to auditing or security should, at most, be limited to those tables with key data that needs to be protected and only then when you cannot implement the proper 'best practices' auditing.
    A migration is difficult enough without adding useless additional requirements like those. You have FAR more important things you can do with the resources you have available:
    1. Capture ALL DDL for the existing system into a version control system
    2. Train your developers on using the version control system
    3. Determining the proper configuration of the new server and system. It is almost a CERTAINTY that settings will get changed and performance will suffer even though you don't think you have changed anything at all.
    4. Validating that the data has been migrated successfully. That can involve extensive querying and comparison to make sure data has not been altered during the migration. The process of validating a SINGLE TABLE is more difficult if the table structures are not the same. And they won't be if you add two columns to every table; every single query you do will have to specify the columns by name in order to EXCLUDE your two new columns.
    5. Validating the performance of the app on the new system. There WILL BE problems where things don't work like they used to. You need to find those problems and fix them
    6. Capturing the proper statistics after the data has been migrated and all of the indexes have been rebuilt.
    7. Capturing the new execution plans to use as a baseline for when things go wrong in the future.
    If it is worth doing it is worth doing right.
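    That said, if the requirement survives the scrutiny above, Oracle does provide a way to enforce a rule for new and changed rows only: a constraint enabled with NOVALIDATE, which skips checking the existing data. A rough sketch with placeholder table and column names:
    ALTER TABLE legacy_table ADD (modified_by VARCHAR2(30));
    -- ENABLE NOVALIDATE: new and updated rows must satisfy the check,
    -- but the pre-existing rows are not validated
    ALTER TABLE legacy_table
      ADD CONSTRAINT legacy_table_modified_by_nn
      CHECK (modified_by IS NOT NULL) ENABLE NOVALIDATE;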

  • How to disable the link to subreports depending upon value

    Hi,
    We have a table in an SSRS report where we show Failed, Success and Cancelled statuses based on some condition.
    For Failed we go to a subreport, as we pass some parameters from the parent report.
    But for Success and Cancelled we don't want to go to that subreport (the problem is that it still goes to the subreport).
    These statuses appear in the same cell depending on the condition.
    Does anybody have an idea how to do it?
    simanta

    Hi fanu987,
    As per my understanding, there is a field includes three values: failed, success and cancelled. If the value of field is failed, you want to jump to the subreport when you click the field. Otherwise, it will not jump to the subreport. If that is the case,
    we can use Go to URL action to achieve your goal. For detail information, please refer to the following steps:
      1. In design mode, right-click the text box and open Text Box Properties dialog box.
      2. Click Action in left pane, select Go to URL.
      3. Click (fx) button, and type the expression like below:
    =iif(Fields!FieldName.Value="failed","http://ServerName/reportserver?/folderName/SubreportName","")
      4. Right-click the subreport in the project, then click deploy to publish the report to report manager.
      5. Right-click the master report in the project, then deploy the report to report manager. When you preview the report and click failed value, it will jump to the subreport.
    If there is any question, please feel free to ask.
    Thanks,
    Wendy Fu

  • How to generate a PWM whose duty cycle has to be varied depending on the frequency of the input trigger?

    Sir/madam,
    I am quite new to the LabVIEW FPGA module.
    I am currently working on an application where I have to generate a PWM signal and control its ON-time duration. The PWM ON time depends on the frequency of an input signal. I want to retrieve the ON-time value from a lookup table which contains the duty-cycle values for the PWM at different input frequencies. This input also has to work as a trigger: the input is pulsed, and the PWM has to be generated at every rising edge. I am using a PCI-7831R RIO series board. I am urgently in need of some help and have hardly a couple of weeks before I meet the deadline.
    I am trying hard to learn the basics from the shipping examples but have not been able to quite make my VI work.
    Kindly lend a helping hand; I would be very grateful to anybody who can help. Thanks a lot in advance.
    Do ask in case any clarifications are required. I think I have explained my problem quite well.

    Hello Manu,
    You can refer to:
    Developing a PWM Interface using LabVIEW FPGA or PWM Output with LabVIEW FPGA
    Hope this helps.
    Best Regards, 
    Hardik Asawa
    AE
    National Instruments  

  • How to switch off the constraints of database tables

    hi
    How can I switch off the constraints of database tables?

    But do remember that once you disable the constraints, enter some data, and then try to enable them again, Oracle will complain if there is conflicting data and will not allow you to enable the constraints.
    Sidhu
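    A minimal sketch of the disable/enable cycle being described, with placeholder table and constraint names:
    ALTER TABLE orders DISABLE CONSTRAINT orders_customer_fk;
    -- ... load or change data while the constraint is off ...
    -- Re-enabling fails (for example with ORA-02298 for a foreign key) if the data now violates the rule
    ALTER TABLE orders ENABLE CONSTRAINT orders_customer_fk;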

  • HT4927 This may take a long time, depending on the size of the library. How long is a long time? Mine has been rebuilding now for over 24hrs?

    "This may take a long time, depending on the size of the library". How long is a long time? Mine has been rebuilding now for over 24hrs?

    "This may take a long time, depending on the size of the library"
    How big is your library?
    OT

  • How to search with multiple constraints in the new Java API?

    I'm having a problem using the new MDM API to do searches with multiple constraints.  Here are the classes I'm trying to use, and the scenario I'm trying to implement:
    Classes:
    SearchItem: Interface
    SearchGroup: implements SearchItem, empty constructor,
                 addSearchItem (requires SearchDimension and SearchConstraint, or just a SearchItem),
                 setComparisonOperator
    SearchParameter: implements SearchItem, constructor requires SearchDimension and SearchConstraint objects
    Search: extends SearchGroup, constructor requires TableId object
    RetrieveLimitedRecordsCommand: setSearch method requires Search object
    FieldDimension: constructor requires FieldId object or FieldIds[] fieldPath
    TextSearchConstraint: constructor requires string value and int comparisonOperator(enum)
    BooleanSearchConstraint: constructor requires boolean value
    Scenario:
    Okay, so say we have a main table, Products.  We want to search the table for the following:
    field IsActive = true
    field ProductColor = red or blue or green
    So the question is how to build this search with the above classes?  Everything I've tried so far results in the following error:
    Exception in thread "main" java.lang.UnsupportedOperationException: Search group nesting is currently not supported.
         at com.sap.mdm.search.SearchGroup.addSearchItem(Unknown Source)
    I can do just the ProductColor search like this:
    Search mySearch = new Search(<Products TableId>);
    mySearch.setComparisonOperator(Search.OR_OPERATOR);
    FieldDimension myColorFieldDim = new FieldDimension(<ProductColor FieldId>);
    TextSearchConstraint myTextConRed = new TextSearchConstraint("red",TextSearchConstraint.EQUALS);
    TextSearchConstraint myTextConBlue = new TextSearchConstraint("blue",TextSearchConstraint.EQUALS);
    TextSearchConstraint myTextConGreen = new TextSearchConstraint("green",TextSearchConstraint.EQUALS);
    mySearch.addSearchItem(myColorFieldDim,myTextConRed);
    mySearch.addSearchItem(myColorFieldDim,myTextConBlue);
    mySearch.addSearchItem(myColorFieldDim,myTextConGreen);
    the question is how do I add the AND of the BooleanSearchConstraint?
    FieldDimension myActiveFieldDim = new FieldDimension(<IsActive FieldId>);
    BooleanSearchConstraint myBoolCon = new BooleanSearchConstraint(true);
    I can't just add it to mySearch because mySearch is using OR operator, so it would return ALL of the Products records that match IsActive = true.  I tried creating a higher level Search object like this:
    Search topSearch = new Search(<Products TableId>);
    topSearch.setComparisonOperator(Search.AND_OPERATOR);
    topSearch.addSearchItem(mySearch);
    topSearch.addSearchItem(myActiveFieldDim,myBoolCon);
    But when I do this I get the above "Search group nesting is currently not supported" error.  Does that mean this kind of search cannot be done with the new MDM API?

    I'm actually testing a pre-release of SP05 right now, and it still is not functional.  The best that can be done is to use a PickListSearchConstraint to act as an OR within a field.  But PickList is limited to lookup Id values, text attribute values, numeric attribute values and coupled attribute values.  It works for me in some cases where I have lookup Id values, but not in other cases where the users want to search on multiple text values within a single field.

  • How to call a function with the dependency editor

    Dear all,
    I am trying to develop a dependency for characteristics with the dependency editor (t-code CU01).
    But it always raises the error "E28152 Function myfunction is not declared".
    The following is a snippet of my code. Please provide me with a solution, documents, or a reference link
    about how to write code with the dependency editor.
    pfunction myfunction(
      ......
    )
    Thanks & Regards,
    Red

    I did not get any errors doing this. Refer to:
    http://help.sap.com/saphelp_erp60_sp/helpdata/EN/92/58c3fc417011d189ec0000e81ddfac/frameset.htm
    Check if you have the relevant access / authorizations.
    Regards,
    Srini

  • How to extract all keys (PK, FK) and constraints from the schema?

    hi,
    How to extract all keys (PK, FK) and constraints from the schema? Thanks!

    I have no idea about any tool which extracts only the DDL of constraints.
    In Oracle 9i or above you can use DBMS_METADATA to extract the DDL of a table, which also shows the constraint definitions.
    You can take a table export without data and use the DataBee tool to extract the DDL.
    www.databee.com
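    A rough sketch of pulling the constraint DDL directly with DBMS_METADATA; the schema name is a placeholder, and REF_CONSTRAINT is the object type used for foreign keys:
    -- Primary key, unique and check constraints
    SELECT dbms_metadata.get_ddl('CONSTRAINT', constraint_name, owner)
      FROM all_constraints
     WHERE owner = 'MY_SCHEMA'
       AND constraint_type IN ('P', 'U', 'C');
    -- Foreign keys
    SELECT dbms_metadata.get_ddl('REF_CONSTRAINT', constraint_name, owner)
      FROM all_constraints
     WHERE owner = 'MY_SCHEMA'
       AND constraint_type = 'R';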
