Database level settings

Please advise on database-level settings for all our databases for the following items:
- Virtual Log Files (VLFs)
- Database file growth settings
- Also, what best practices around these items should we follow for future new databases? What are the different things to consider?
- And what would we need to do before making these changes on the current databases?
Thanks,

Can you refer to the links below?
https://www.simple-talk.com/sql/database-administration/sql-server-database-growth-and-autogrowth-settings/
http://www.sqlskills.com/blogs/kimberly/8-steps-to-better-transaction-log-throughput/
--Prashanth
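
Along the lines of those articles, a minimal T-SQL sketch for checking VLF counts and moving from percentage autogrowth to a fixed increment (assuming SQL Server 2016 SP2 or later for sys.dm_db_log_info; the database and logical file names are placeholders):

-- Count VLFs per database (thousands of VLFs usually means the log grew in many small increments)
SELECT d.name AS database_name, COUNT(*) AS vlf_count
FROM sys.databases AS d
CROSS APPLY sys.dm_db_log_info(d.database_id) AS li
GROUP BY d.name
ORDER BY vlf_count DESC;

-- Replace percentage autogrowth with fixed increments for the data and log files
ALTER DATABASE [YourDb] MODIFY FILE (NAME = N'YourDb_Data', FILEGROWTH = 256MB);
ALTER DATABASE [YourDb] MODIFY FILE (NAME = N'YourDb_Log',  FILEGROWTH = 512MB);

On older versions, DBCC LOGINFO returns one row per VLF, so its row count gives the same figure per database.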

Similar Messages

  • Database-level error reported by JDBC driver

    Hi everyone,
    I get an error in my Sender JDBC adapter. I am executing a stored procedure every minute. In total I have 34 JDBC senders for every different database. The error is like below. The problem is that in the Alert Inbox I can't see the rest of the message. This error occurs just once every hour (on average). I don't know why, and I don't know what the rest of the error is saying.
    Any ideas how I can see the rest of the message? And where the error might be?
    Note: In communication channel monitoring everything is green.
    Database-level error reported by JDBC driver while executing statement 'EXEC [TB

    Hi,
    Obviously there are concurrent accesses to the table your stored procedure is accessing. This is nothing serious, because SAP XI will retry the execution of the stored procedure according to the settings of the communication channel (3 times every 5 minutes is the default). Additionally, you can set the number of retries in case of SQL errors in the extended tab of the communication channel configuration.
    Tuning the transaction isolation levels might help to avoid the error, but it can decrease the consistency of the data as well. The default isolation level is always the highest available in a database (in the case of SQL Server it should be SERIALIZABLE), so locks might easily occur. I would suggest having a look at other applications accessing the tables and determining their isolation level.
    All in all, I guess there shouldn't be real problems, because of the repeated execution of your stored procedure, but it's worth having a look at the isolation level.
    Regards Sven
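    For reference, a quick way to see which isolation level other sessions are using on the SQL Server side (a sketch, assuming SQL Server 2005 or later; in sys.dm_exec_sessions the transaction_isolation_level values 0-5 map to Unspecified, ReadUncommitted, ReadCommitted, Repeatable, Serializable, and Snapshot):
    SELECT session_id, login_name, program_name, transaction_isolation_level
    FROM sys.dm_exec_sessions
    WHERE is_user_process = 1;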

  • NLS_DATE_FORMAT on database level?

    Hi all,
    I would like to change the NLS_DATE_FORMAT parameter so that the change is reflected in the NLS_DATABASE_PARAMETERS view. I always thought it would be "alter system", but that only changes it for all newly connecting sessions. Apparently this can also be set at a database level. What is this database level? What does it mean if the two are different? See details below.
    Thanx in advance,
    Lennert
    SQL> select * from nls_database_parameters where parameter like 'NLS_DA%';
    PARAMETER                      VALUE
    NLS_DATE_FORMAT                DD-MON-RR
    NLS_DATE_LANGUAGE              AMERICAN
    SQL> select * from nls_session_parameters where parameter like 'NLS_DA%';
    PARAMETER                      VALUE
    NLS_DATE_FORMAT                DD-MM-RR
    NLS_DATE_LANGUAGE              DUTCH
    SQL> alter system set nls_date_format='DD-MM-YYYY' scope = spfile;
    System altered.
    SQL> shutdown immediate;
    Database closed.
    Database dismounted.
    ORACLE instance shut down.
    SQL> startup open
    ORACLE instance started.
    Database mounted.
    Database opened.
    SQL> select * from nls_database_parameters where parameter like 'NLS_DA%';
    PARAMETER                      VALUE
    NLS_DATE_FORMAT                DD-MON-RR      ---> still the same, why?
    NLS_DATE_LANGUAGE              AMERICAN     ---> still the same, why?
    SQL> select * from nls_session_parameters where parameter like 'NLS_DA%';
    PARAMETER                      VALUE
    NLS_DATE_FORMAT                DD-MM-YYYY   ---> changed
    NLS_DATE_LANGUAGE              DUTCH           ---> changed
    SQL>

    SQL> alter system set nls_date_format='DD-MM-YYYY' scope = spfile;
    SQL> select * from nls_database_parameters where parameter like 'NLS_DA%';
    NLS_DATE_FORMAT DD-MON-RR ---> still the same, why?
    It is because "alter system" changed the instance parameter, not the database parameter.
    When you query nls_instance_parameters then you will see that nls_date_format parameter has changed.
    Note there are three views:
    1. nls_session_parameters - settings at session level; changes can be made using "alter session".
    2. nls_instance_parameters - settings at instance level, taken from the parameter file (pfile/spfile); changes can be made using "alter system".
    3. nls_database_parameters - settings at database level, set during database creation; you can't change them.
    What is this database level? What does it mean if the two are different? See details below.
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14220/intro.htm#sthref11
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14225/ch3globenv.htm#sthref217
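    For example, the instance-level change can be confirmed the same way (a hypothetical check in the same SQL*Plus session); after the restart, nls_instance_parameters should show DD-MM-YYYY for NLS_DATE_FORMAT even though nls_database_parameters still shows DD-MON-RR:
    SQL> select * from nls_instance_parameters where parameter like 'NLS_DA%';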

  • How to remove Leading zeros from MATNR "at Database Level"

    Hi,
    When we create a material in MM01, we can either create the material with an internal material number or an external material number (by explicitly entering a material number).
    Our requirement is: when we create an internal material, there should NOT be any leading zeros in MATNR,
    and for an external material number, leading zeros should be there (in the database).
    1) Now, we can create the number range for the material in such a way that it allows only alphanumeric entries for internal materials, and only numeric entries for external material numbers.
    This should solve the purpose.
    But according to the client's requirement, the material number can be numeric OR alphanumeric for internal as well as external material numbers.
    2) We can add or remove leading zeros from MATNR at the application level, i.e. we can find some BAdI/enhancement where we use the conversion FM (CONVERSION_EXIT_MATN1_OUTPUT or CONVERSION_EXIT_ALPHA_OUTPUT) and remove leading zeros from MATNR. But these changes will NOT be reflected at the database level, because the MATNR domain has a conversion routine that adds leading zeros to MATNR while saving to the database.
    3) While displaying the material in SE11, we can also set the "Display Format" to with leading zeros OR without leading zeros.
    But we don't want to just "display" material with or without leading zeros; we want to actually create materials that way.
    What can we do so that our changes are reflected at the database level?
    Thanks and Regards,
    Anand Gore
    Edited by: anandgore on May 18, 2011 3:47 PM

    That will Add Leading Zeros in MATNR While saving in the Database.
    That is because you have it configured that way.  You can configure the storage as you desire.  The default is the behavior you are describing.  You, or your functional analyst, need to review the documentation on the MM config settings for material master number storage.
    Never use CONVERSION_EXIT_ALPHA_OUTPUT for material conversion.  Also keep in mind that the MATN1 functions have their own BADI for extended formatting...

  • Error while Assigning database level role (db_datareader) to SQL login (Domain Account)

    Team,
    I got an error while creating a user for a domain account. Below is the screen shot of the error (error: 15401).
    The database instance is SQL 2000 SP3. (I know it is out of support, but the customer is reluctant to upgrade.)
    A Google search found the article below, which is the best match for this error:
    http://support.microsoft.com/kb/324321
    I have followed each troubleshooting step, but the issue still persists.
    Step 1. The login does not exist == The login does exist in the domain, as I am able to add the same domain ID to other database instances.
    Step 2. Duplicate security identifiers == I used this query to find a duplicate SID:
    /*  SELECT name FROM syslogins WHERE sid = SUSER_SID ('YourDomain\YourLogin') */
    But only one row was returned, with today's create date.
    Step 3. Authentication failure == The domain is available. The user is able to log in on other servers via an RDP connection.
    Step 4. Case sensitivity == The database collation is set to case insensitive (CI).
    The other two, 5. Local Accounts and 6. Name resolution, are not applicable to me.
    I tried other ways as well.
    A. Creating the login and granting the permission in one go = the user account is not created.
    B. Instead of the GUI, using a query to create the login and grant the required permission = same error.
    Has anybody faced such a situation?
    Chetan
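    For reference, option B on SQL Server 2000 is typically done with the following T-SQL (a sketch only; the domain, login, and database names are placeholders):
    -- Create the Windows login at the server level
    EXEC sp_grantlogin 'YourDomain\YourLogin'
    GO
    -- Add the login as a user in the target database and grant the fixed role
    USE YourDatabase
    GO
    EXEC sp_grantdbaccess 'YourDomain\YourLogin'
    EXEC sp_addrolemember 'db_datareader', 'YourDomain\YourLogin'
    GO
    If sp_grantlogin itself already fails with error 15401, the problem is resolving the account against the domain rather than anything at the database level.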

    See the output below (one row from syslogins, shown as column = value):
    srvid      = NULL
    sid        = 0x010500000000000515000000A1F66E1BFC1DC75D26E72530A2B80400
    xstatus    = 14
    xdate1     = 20:25.9
    xdate2     = 57:33.4
    name       = UKBAA\LHRAPPMuttavarapuS
    password   = NULL
    dbid       = 1
    language   = us_english
    isrpcinmap = 0
    ishqoutmap = 0
    selfoutmap = 0
    Chetan

  • Table does not refresh after database level action

    We have defined an Advanced table on a VO to represent invoice lines. The VO is referenced by a VL from the invoice header.
    At one point during our process, a database-level package is executed on the current lines. We have a "Requery" button to refresh the lines; however, the lines do not refresh unless we exit the invoice entirely and come back to it via the VL.
    How can we refresh the lines on a current table after a database level transaction occurs?
    Thanks, Jerry.
    Our AM code is as follows...
    // handleRefreshInvoiceLines()
    public void handleRefreshInvoiceLines()
    {
        OAViewObject vo = (OAViewObject)getInvoiceLinesVO1();
        if (!vo.isPreparedForExecution())
        {
            getTransaction().validate();
        }
        vo.executeQuery();
        clearVOCaches("InvoiceLinesEO", true); // troubleshooting
    } // handleRefreshInvoiceLines()

    Thank you. I've moved clearCache around a bit. Apparently vo.isPreparedForExecution() is false. I wonder why?
    --Jerry.

  • How to use JNDI datasource instead of database connection settings JDev 10g

    Hi,
    In order to use a different database in each environment, we are not able to use the JNDI datasource configuration; every time we need to configure the database connection settings from JDeveloper by changing the connectivity settings for each environment separately. We need a solution for making the database connectivity uniform, using a JNDI datasource name for all environments and configuring it through the application server console, rather than changing the database adapter configuration in JDeveloper.
    Please provide an update at the earliest. Your help is greatly appreciated. Thanks in advance.

    What are you not clear on?
    What you need to do is get your developers to conform to a database naming standard, as stated above. So if you have an Oracle database that is for eBusiness Suite, you get all developers to create a DB connection in JDev called, for example, ora_esb.
    When the developer creates a DB adapter, this will create a JNDI name of eis/DB/ora_ebs. When the BPEL project is deployed, it looks for the JNDI name in the oc4j-ra.xml file to see its connection details. If they don't exist, then it uses the developer's connection details. The issue with this is that they generally always point to the development DB. It is best practice for the developers to remove the mcf settings in the DB adapter WSDL. This way, if the JNDI name has not been configured, it will fail.
    So when you migrate from dev-test-prod what you have is the JNDI name eis/DB/ora_ebs. The dev points to the dev instance of ebs, test points to the test instance and so on. This means that you don't need to adjust any code in the BPEL projects.
    cheers
    James

  • SYS user connects at database level, is it correct?

    My senior colleague has given me the following information about the SYS user. I want to know: is it correct?
    Since the SYS user connects at the database level, on killing the active session of the SYS user only the current statement is cancelled. The database session does not disconnect; instead it continues to run the remaining statements in the script file, in case we are running a script file containing a lot of SQL statements.
    Moazzam

    Moazzam wrote:
    My senior colleague has given me following information about the sys user. I want to know, is it correct?
    Since SYS user connects at the database level, therefore, on killing the active session of the SYS user, only the current statement is cancelled. The database session does not disconnect. Instead it continues to run the remaining statements in the script file in case we are running a script file containing a lot of SQL statements.
    Running a SQL script very likely means SQL*Plus is used. One of two types of Oracle sessions will be created via sqlplus: a dedicated session, or a shared server session.
    A dedicated session can also be local (sqlplus connects "directly" to the dedicated server process), or remote (sqlplus connects via tcp/ip to the dedicated server process).
    A server session is usually "killed" using the alter system kill session command. Despite the differences between shared and dedicated server connections, the end result is the same. The session terminates abnormally (session UGA will be released, session will be cleaned up, rolled back, etc) - and the session ceases to exist.
    So irrespective of how that sqlplus session runs that script - the session, when killed, will cause a sqlplus failure. And no subsequent script commands would be executed by that Oracle session.
    What can happen is that sqlplus continues running, continues reading the script, and continues submitting commands to be executed. However, with the server session killed, there is no server process to service the commands submitted by the sqlplus client. In this case, sqlplus will throw the error "+SP2-0640: Not connected+" after each of the commands it tries to execute after the server session was killed.
    The only time when sqlplus will be able to continue is when the session is not killed, but interrupted. The Oracle Call Interface (OCI) supports a OCIBreak() call - allowing the client to interrupt-and-abort the request that its server session is currently executing.
    For example, sqlplus sends a OCIBreak() while it waits for a server response (e.g. waiting for the answer to a SQL select query), when the user presses Ctrl-Break to abort that request.
    In this case, the session still exists - and the client can issue a new request that the session will service. But an OCIBreak() cannot be triggered (to my knowledge) externally from another Oracle session. You need to send the client process a "break request" (like a Ctrl-Break keystroke) in order to trigger that client process to make an OCIBreak() call to Oracle and interrupt its server process.
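    For reference, the kill mentioned above is normally issued like this (a sketch; substitute the real SID and SERIAL# returned by the first query, and note that killing a SYS session is rarely a good idea):
    SQL> SELECT sid, serial#, status FROM v$session WHERE username = 'SYS';
    SQL> ALTER SYSTEM KILL SESSION '<sid>,<serial#>';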

  • How To Find Out Table Name At Database Level Trigger

    Table Name
    How will I come to know the name of the table on which a DML statement is currently fired?
    E.g. my current user is SCOTT, so SCOTT should realize the event of a DML statement being fired. Hence my question is how a trigger at user level is to be written, and how to catch the event and the table name on which the statement is being executed.
    Suppose my table is as follows
    Table Name : EMP
    EMPNO number
    ENAME varchar2(10)
    JOB varchar2(9)
    MGR number
    HIREDATE date
    SAL number
    COMM number
    DEPTNO number
    Bcd number
    Brcd number
    Rec_id number
    Scn_no number
    Sequences for above table
    1. EMP_REC_ID - minimum value is 1 increment by 1 & max value is unlimited
    2. EMP_SCN_NO - minimum value is 1 increment by 1 & max value is unlimited
    I have written a trigger as follows :
    CREATE OR REPLACE TRIGGER UPD_EMP_REC_SCN
    BEFORE INSERT OR UPDATE OR DELETE ON EMP
    REFERENCING NEW AS NEW OLD AS OLD
    FOR EACH ROW
    DECLARE
    BEGIN
      IF (INSERTING) THEN
        SELECT EMP_REC_ID.NEXTVAL INTO :NEW.REC_ID FROM DUAL;
      ELSIF (UPDATING) THEN
        SELECT EMP_SCN_NO.NEXTVAL INTO :NEW.SCN_NO FROM DUAL;
      ELSIF (DELETING) THEN
        INSERT INTO DELETED_ROWS (TAB_NAME, BCD, BRCD, REC_ID)
        VALUES ('EMP', :OLD.BCD, :OLD.BRCD, :OLD.REC_ID);
      END IF;
    END;
    Hence my problem is:
    If my database user contains 800 tables, then I have to write 800 triggers (i.e. one for each table).
    I want to write only one trigger at the database level so that I can take actions dynamically; for this I need the table name and the event for which the DML has taken place.
    Any help in this matter will be greatly appreciated.
    Regards
    Vikrant

    You cannot write a single trigger that applies to multiple tables. If you need a trigger for every table, you will need to generate 800 triggers.
    Rather than manually generating 800 triggers, you might write a code generator that generates triggers for each of the tables by querying the data dictionary (e.g. DBA_TABLES, DBA_TAB_COLS, etc.).
    Justin
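    For illustration, a minimal sketch of that generator approach in SQL*Plus (hypothetical: it assumes an audit table DML_AUDIT(TAB_NAME, CHANGED_ON) exists, that every table gets the same logging body, and that the generated trigger names stay within the identifier length limit; review the spooled file before running it):
    SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF
    SPOOL gen_triggers.sql
    SELECT 'CREATE OR REPLACE TRIGGER AUD_' || table_name || CHR(10) ||
           'BEFORE INSERT OR UPDATE OR DELETE ON ' || table_name || CHR(10) ||
           'FOR EACH ROW' || CHR(10) ||
           'BEGIN' || CHR(10) ||
           '  INSERT INTO DML_AUDIT (TAB_NAME, CHANGED_ON) VALUES (''' || table_name || ''', SYSDATE);' || CHR(10) ||
           'END;' || CHR(10) || '/'
    FROM user_tables
    ORDER BY table_name;
    SPOOL OFF
    -- Then review and run the generated file: @gen_triggers.sql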

  • In ecatt - how to check at database level using ABAP

    Hi,
    How do I check something at the database level using ABAP in the eCATT tool?
    Say, for example, I want to check whether a particular sales order is invoiced or not at the database level, and if it is invoiced I have to stop proceeding with invoicing of that sales order number.
    Could anybody suggest on this with an example?
    thanks.

    Hi,
    you can use the command GETTAB to access single DB records.
    Fully or partially specified keys can be used with GETTAB. It always returns only one record, even if several could match your selection.
    For more advanced scenarios you can also use eCATT's inline ABAP. In a block between the commands ABAP. and ENDABAP. you can code ABAP statements, e.g. SELECT ... INTO TABLE ...
    eCATT script parameters of type 'V' defined in a script using ABAP/ENDABAP will be transferred into the ABAP block and back to the script after the ABAP block has run.
    Best regards
    Jens

  • HOW TO GET INFORMATION ABOUT THE CLIENT MACHINE AT DATABASE LEVEL

    HOW TO GET INFORMATION ABOUT THE CLIENT MACHINE AT DATABASE LEVEL USING 10g Database and 10g Application Server
    We have developed an application using Oracle Forms 10g with Oracle Database 10g and Application Server 10g.
    The application uses a single Oracle username to connect to the database,
    whereas at the application level there are different users (these are not database users).
    Now how can we get information about the user/their machine etc. at the database level? Earlier, in 6i/8i, we used to get it by using
    USERENV('TERMINAL')
    We had written triggers on tables on insert/update where we used to update a database field "last user terminal" with USERENV('TERMINAL'),
    but now this information comes out as the machine name of the application server, whereas we want it to be the machine name of the client. Any way out?
    thanks
    Chaand Kackria

    hi, you can use the sys_context function, like this:
    select sys_context('userenv','current_user'),
         sys_context('userenv','os_user'),
         sys_context('userenv','host'),
         sys_context('userenv','ip_address'),
         sys_context('userenv','instance'),
         sys_context('userenv','sessionid'),
         sys_context('userenv','terminal')
    from dual;
    Is this what you're looking for?

  • Database Level Security not working ???

    The 10 g (10.1.2.1) documentation states the following:
    Chapter 7 Controlling access to information:
    "Regardless of the access permissions and task privileges that you set in Discoverer Administrator, a Discoverer end user only sees folders if that user has been granted the following database privileges (either directly or through a database role):
    ex: SELECT privilege on all the underlying tables used in the folder "
    So how come a folder (a view in my case, not a table) cannot be queried directly by a user, but the folder still shows up as a choice when building a report using Plus? Am I misreading the above? It sounds to me like if the user account does not have the SELECT privilege, then they will not see the folder in Discoverer.
    Has anyone run into the same issue, or have an explanation?
    thanks
    OBX

    I think the user has access to see all the folders in the business area in Discoverer if he has permission to do so. This is Discoverer-level security to filter out people who should not have access to the business area at all. You'll find that although they can see these Discoverer folders because the permission is set in Discoverer Administrator, the database tables they are based on will not allow the users to see any of the data if they don't have those rights at the database level.

  • Web Database Level Security

    Are there any tools in an Oracle database, or from Oracle, to encrypt data in the database in an 8.0.5 database or earlier?
    How do I secure my database from hackers?
    If anybody has any ideas about database level security, please send a message to my [email protected] id...

    Hi Soni,
    Yes, you could.
    Two options I can think of:
    - Program WS-Security for your specific web service
    - Use Oracle Web Services Manager
    With respect to the latter: I believe there is no OWSM policy step to handle your requirement out of the box. OWSM comes with policy steps for file-based authentication as well as for authenticating against Active Directory or a generic LDAP directory.
    In this case you could opt for the creation of a custom policy step.
    Hope this helps.
    Sjoerd

  • Updates to the table from the database level.

    Hi Dear All,
    If we do some updates to a table at the database level, for example deleting some records from the table at the Oracle level, I am still able to see the same deleted records from the Data Dictionary (SE11) at the application level.
    Can you please explain the mechanism, how this is possible and why?
    best regards
    Mahesh

    Transparent tables store data directly. If you delete some data from transparent tables, the same is reflected in the database (Oracle), but the reverse is not true: if you modify the database table contents directly, the dictionary table remains intact.
    Transparent tables have a one-to-one relationship with the database tables.
    Hope that clarifies a bit.
    (somebody correct me if i am horribly wrong)

  • Batch level settings conversion from plant-to-plant to material-level

    If someone has worked on converting batch level settings within SAP from plant level to material-level batch management, please let me know what challenges are involved in doing this. Thanks.

    Check help !!!
    To change the batch level, you have to start a conversion program. This program first checks whether conversion is possible and outputs an error log containing all batch numbers that occur more than once at the new level. Now you need to manually transfer these batch records to batch numbers within Inventory Management using a transfer posting 'material to material'.
    If you change the level from plant level to a higher level, it is possible that batches with the same batch numbers in different plants are actually identical.  In this case, all you have to do is remove stocks (including previous period stocks) so that all batches with the same batch numbers, except one, can be reorganized.
    When you convert from plant level to material level, the material is then to be handled in batches in all plants in which it is defined.
    Standard settings
    In the standard delivery batches are uniquely defined at material level.
    Activities
    To change the batch level, proceed as follows:
    1. Choose the level at which you want your batches to be unique.
    2. Save the new settings and choose Back.
    3. Choose Batch level -> Conversion.
    4. If necessary, carry out the conversion in test mode first.
    Further notes
    Note that you cannot reset conversion from plant level to a higher level in the standard.
    If you create a client by copying an existing client, initially, there are no settings in the target client at batch level. The system makes the settings in the target client only when you have carried out an activity (maintaining master data or posting a goods movement, for example). The client level is not transported. If you copy a client with client level, you have to manually set the client level in the target system.
