Requirements for High-Load OLTP Database

Hi guys!
I need your best practices!
I will be installing and configuring a high-load OLTP database:
5 million users
500 transactions per second
What are the requirements? Do you have any papers or documents?

Denis :) wrote:
> I need your best practices!
> I will be installing and configuring a high-load OLTP database:
> 5 million users
Concurrent users?
> 500 transactions per second
SQL> select 500*60*60/5000000 from dual;

500*60*60/5000000
-----------------
              .36

So each user does about one transaction every three hours.
How big is a single transaction?
How much redo is generated every day?
> What are the requirements?
More hardware is more better!
> Do you have any papers or documents?
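
To answer the redo question on an existing system, here is a hedged sketch (it assumes the database runs in archivelog mode; v$archived_log is a standard view):

-- approximate redo generated over the last 24 hours, in MB
select round(sum(blocks * block_size)/1024/1024) as redo_mb_last_24h
from   v$archived_log
where  completion_time > sysdate - 1;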

Similar Messages

  • Struts not good for high-load requests?

    I need some advice from somebody with experience writing high-load web applications. I have worked with Java servlets and done a bit of Struts, but this is my first larger project without a team behind me, so now I need to decide on the technology.
    I have to build a J2EE project with simple business logic that will spend some time on database and I/O operations, and the maximum number of users is expected to be large: a few hundred at the same time. The Struts + Hibernate combination is very comfortable to work with, but I'm now unsure whether I can use it for this purpose, because I heard that a Struts Action is single-threaded. If the Action is single-threaded, does that mean a hundred requests will queue up and wait for each other's I/O to finish completely before the next request is handled? If so, should I use plain servlets instead of Struts?
    Any hint will be very useful, thanks in advance!

    Same problem here (I also do panoramas).
    Actually, I can't even import my files into Aperture.
    Other, slightly smaller files (PSD or TIFF) show problems similar to yours, except that my Previews or Thumbnails decide to turn blue and display the "Unsupported format" message - sometimes after weeks of working fine.
    But anyway, most of my files are usually too large to save out of Photoshop as PSD or even TIFF. PSB is unfortunately not supported by Aperture (according to the tech specs), and Aperture indeed cannot see .psb files …
    I hope that larger files in general, and the large file format in particular, will be reliably supported by v3 (likely out right after Snow Leopard).
    My current workaround:
    For processing large files I avoid Aperture altogether, using Bridge/Photoshop (+ Photomatix and/or PTGui) instead.
    Frankly, for me this combination works pretty well.
    The final image is saved out as a high-quality JPEG for Aperture, solely for the purpose of file management.
    That JPEG is stored in the same folder as the .psb and the other files (bracketing and/or pano images), so by using "Show in Finder" I can quickly find the Photoshop original.

  • Is license required for Sql Loader?

    I find that SQL*Loader could save me a lot of time on bulk inserts.
    But I'm not sure whether a license is required to use it in business software development.
    Or is there an API that provides functionality similar to SQL*Loader?

    The sqlldr utility, expdp and impdp, and sqlplus come with the RDBMS license. Earlier client software had the option of installing sqlldr, exp, and imp onto the client machine for use against a remote database. Since expdp and impdp only run on the database server now, those two utilities are probably not included, but sqlldr may still be available as part of the client.
    HTH -- Mark D Powell --
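
    On the "similar functionality" question: external tables use the same loader driver and are driven from plain SQL. A minimal sketch, where the directory, file, table and column names are assumptions:

    -- requires a DIRECTORY object pointing at the folder holding the flat file
    create table employees_ext (
      emp_id    number,
      emp_name  varchar2(100)
    )
    organization external (
      type oracle_loader
      default directory data_dir
      access parameters (
        records delimited by newline
        fields terminated by ','
      )
      location ('employees.csv')
    );

    -- then bulk load with a plain INSERT ... SELECT (table employees is assumed to exist)
    insert /*+ append */ into employees select * from employees_ext;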

  • What are the REAL system requirements for High Definition Editing?

    According to Apple, you can edit HD in iMovie with: a G4 1GHz or faster processor, 512MB RAM or higher, and OSX 10.3.6 or higher.
    I am running a G4 iMac 1.25GHz with 512MB DDR SDRAM and Tiger 10.4.3
    My machine can barely import HD from my new Sony HDV 1080i camcorder; it imports at anywhere from 1/8th to 1/2 speed.
    But more importantly, the playback in iMovie of my HD clips is not smooth -- the playback looks jagged, as if the computer cannot quite keep up at full speed.
    Would upgrading from 512MB RAM to 1G RAM likely solve my problems with the slow import and the jagged playback?

    > Would upgrading from 512MB RAM to 1G RAM likely solve my problems with the slow import and the jagged playback?
    No. A faster Mac with a faster video card will help.
    Playback in iMovie is never as good as it will be on TV, even with DV. The difference must be even bigger with HDV.

  • Is a DB license required for EM 10G repository database?

    We purchased OEM 10g Grid Control. We're planning to deploy the OMS and Repository on a standalone Linux box (which has no other RDBMS running on it), while agents will be spread across the other target Unix boxes. OEM 10g comes with a repository database. Do we need a separate license for this repository database on Linux? I cannot locate the answer on the Oracle Home site.

    My impression from what I have read in the manuals is that you can run the OMS and repository on a separate machine and not pay a license (RDBMS or Grid Control) for it. You will be paying the Grid Control license for all the monitored machines where your business applications run. Talk to your sales rep before you proceed, because my interpretation will not protect you when Oracle comes after you for more money.

  • Information required for e-load report

    I would like to gather more information on the items displayed in the e-Load session report, under "Performance by Profile and Timer", like:
    Avg?
    Std Dev?
    90th %?
    I think I know about the following two items:
    Min: minimum time a VU took to navigate a particular page
    Max: maximum time a VU took to navigate a particular page
    I have attached the sample report.
    Thanks

    The e-Load help says the following:
    Avg - the average performance for the virtual user profile or server response timer in seconds.
    Std Dev - the number of seconds that the performance of virtual user profiles or server response timers deviated from the mean (average value).
    90th % - shows the number of seconds that the performance of ninety percent of the virtual users or server response timers was at or below.
    For some reason your attachment had question marks in it, so I cannot see exactly what you were pointing out.
    -GateCity_QA
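
    For what it's worth, those three statistics map directly onto SQL aggregates if you log timer results to a database. A hedged sketch, with a hypothetical table timer_results and column elapsed_secs:

    select avg(elapsed_secs)                                         as avg_secs,
           stddev(elapsed_secs)                                      as std_dev,
           percentile_disc(0.9) within group (order by elapsed_secs) as pct_90
    from   timer_results;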

  • Coding Required for Data Load...

    Hi All,
    I am stuck on an issue where I know the logic but am unable to write the code. My requirement is as follows:
    I have the records in Excel as given below:
    Fiscal Year          2004
    Key                  004
    Capitalization       A
    Norm Debts           B = 0.7*A
    Rate of Interest     C
    Repayment            D = B/10
    Op. Balance          E = B
    Cl. Balance          F = E - D
    Total Interest       G = (E + F)/2 * C
    The above record is the data I am loading the first time, and it is only one record. The cube should be populated with data until the Closing Balance becomes zero.
    So my logic will be:
    The first record comes from Excel; from the second record onwards:
    Fiscal Year         2004 + 1 (it should add 1 for every record until the Closing Balance becomes zero)
    Key                 004 (constant until the Closing Balance becomes zero)
    Capitalization      A (constant until the Closing Balance becomes zero)
    Norm Debts          B (constant until the Closing Balance becomes zero)
    Rate of Interest    C (constant until the Closing Balance becomes zero)
    Repayment           D (constant until the Closing Balance becomes zero)
    Opening Balance     (the previous Closing Balance becomes the Opening Balance here) = B - D
    Closing Balance     (again reduced by the repayment, i.e. B - 2D)
    Total Interest      = (Opening Balance + Closing Balance) / 2 * Rate of Interest
    So the next record will have Opening Balance = B - 2D and Closing Balance = B - 3D, and records should be generated automatically until the Closing Balance becomes zero.
    Any help will be rewarded with points.
    Thanks and Regards,
    Sangini Mathur.

    Hi Sangini,
    I think in your case you have to LOOP over SOURCE_PACKAGE in the start routine; only then can you meet your requirement. Something like:
    DATA: w_temp1 LIKE source_package-cl_balance.
    * seed w_temp1 from the first record's closing balance, then
    * keep generating records while it is still positive
    WHILE w_temp1 GT 0.
    *   put all your calculations here, append the derived record,
    *   and recompute w_temp1 (the new closing balance)
    ENDWHILE.
    Hope it helps
    Bhaskar
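    If the schedule can instead be generated on the database side, here is a hedged Oracle SQL sketch of the same logic (the starting values are made up; since D = B/10, the closing balance reaches zero after exactly 10 rows):

    with params as (
      select 2004 as fiscal_year, '004' as rec_key,
             1000 as a,   -- Capitalization
             0.10 as c    -- Rate of Interest
      from dual
    )
    select fiscal_year + level - 1                as fiscal_year,
           rec_key,
           0.7*a                                  as norm_debts,
           0.7*a/10                               as repayment,
           0.7*a - (level-1)*(0.7*a/10)           as opening_balance,
           0.7*a - level*(0.7*a/10)               as closing_balance,
           ((0.7*a - (level-1)*(0.7*a/10))
          + (0.7*a - level*(0.7*a/10))) / 2 * c   as total_interest
    from   params
    connect by level <= 10;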

  • Computer specs required for high def video editing???

    My current/new laptop won't edit high-def video. It's an Acer Aspire S732Z: Pentium processor, 2.2 GHz, 3 GB RAM, up to 1759 MB of Intel® Dynamic Video Memory Technology 5.0 (64 MB of dedicated video memory).
    What specs do I need to edit high-def video??? Any specific computer recommendations? Price is a factor.
    Thank you

    Laptop Video Editing PC
    -http://forums.adobe.com/message/4717373
    -http://forums.adobe.com/message/4578948
    Buy a Desktop Video Editing PC
    http://www.adkvideoediting.com/
    -ADK Kudos http://forums.adobe.com/thread/877201
    Build a Desktop Video Editing PC
    -ideas inside http://www.pacifier.com/~jtsmith/ADOBE.HTM
    -http://forums.adobe.com/thread/947698
    -http://forums.adobe.com/thread/1104182
    -http://ppbm7.com/index.php/intro-part-1
    -http://forums.adobe.com/thread/1098759
    -http://forums.adobe.com/thread/878520

  • R3 Table Required for data load status

    Hi all,
    I am on version 3.x, so RSSTATMANPART (the fast table, only available in BI 7) won't work.
    I want the number of records added and transferred for a specific cube on a specific date.
    Thanks in advance.

    Check Tables RSMONICTAB, RSMONFACT, RSMONICDP
    Hope this helps..
    /pradeep

  • List of Manual Setup required for iSetup to work

    Hi All,
    This is Mugunthan from iSetup development. Based on my interactions with customers and Oracle functional experts, I have documented the list of manual setups that are required for smooth loading of selection sets. I am sharing it here. Please let me know if anyone has had to enter a manual setup that is not listed while using iSetup.
    Understanding iSetup
    iSetup is a tool to migrate and report on your configuration data. Various engineering teams at Oracle develop the APIs/programs which migrate the data across EBS instances, so all your data is validated for all business cases and data consistency is guaranteed. Using the tool requires a good amount of functional setup knowledge and a bit of technical knowledge.
    Prerequisite setup for Instance Mapping to work
    ·     The ATG patch set level should be the same across all EBS instances.
    ·     Copy the DBC files of the other EBS instances participating in the migration under the $FND_SECURE directory (refer to the note below for details).
    ·     Edit sqlnet.ora to allow connections between the DB instances (tcp.invited_nodes=(<source>,<central>)).
    ·     Make sure that the same user name with the iSetup responsibility exists in all EBS instances participating in the migration.
    Note: the iSetup tool is capable of connecting to multiple EBS instances. To do so, it uses the DBC file information available under the $FND_SECURE directory. Consider three instances A, B & C, where A is the central instance, B is the source instance and C is the target instance. After copying the DBC files onto all nodes, the $FND_SECURE directory would look like this on each machine:
    A => A.dbc, B.dbc, C.dbc
    B => A.dbc, B.dbc
    C => A.dbc, C.dbc
    Prerequisite for registering Interface and creating Custom Selection Set
    The iSetup super role is mandatory to register and create a custom selection set. It is not sufficient to register an API on the central/source instance alone; you must register the API on all instances participating in the migration/reporting.
    Understanding how to access/share extracts across instances
    Sharing iSetup artifacts
    ·     Only the exact same user can access extracts, transforms, or reports across different instances.
    ·     The “Download” capability offers a way to share extracts, transforms, and loads.
    Implications for Extract/Load Management
    ·     Option 1: Same owner across all instances
    ·     Option 2: Same owner in Dev, Test, UAT, etc – but not Production
    o     Extract/Load operations in non-Production instances
    o     Once thoroughly tested and ready to load into Production, download to desktop and upload into Production
    ·     Option 3: Download and upload into each instance
    Security Considerations
    ·     iSetup does not use SSH to connect between instances. It uses the Concurrent Manager framework to launch concurrent programs on the source and target instances.
    ·     iSetup does not write passwords to any files or tables.
    ·     It uses JDBC connectivity obtained through the standard AOL security layer.
    Common Incorrect Setups
    ·     Failure to complete/verify all of the steps in “Mapping instances”.
    ·     The DBC file must be copied again if an EBS instance has been refreshed or autoconfig has been run.
    ·     Custom interfaces must be registered in all EBS instances; registering them on Central/Source alone is not sufficient.
    ·     A standard Concurrent Manager must be up to pick up iSetup concurrent requests.
    ·     The iSetup Financials and SCM modules are supported from 12.0.4 onwards.
    ·     iSetup is not certified on RAC. However, you may still work with iSetup if you copy the DBC file onto all nodes with the same name as was registered through the Instance Mapping screen.
    Installed Languages
    iSetup cannot Load or Report if the number and type of installed languages and the DB charset differ between the Central, Source and Target instances. If that is your case, there is a workaround: download the extract zip file to your desktop and unzip it, edit AZ_Prevalidator_1.xml to match your target instance's languages and DB charset, then zip it back up and upload it to the iSetup repository. You will then be able to load to the target instance. You must ensure that this does not corrupt data in the DB; this is considered a customization, and any data issue coming out of this modification is not supported.
    Custom Applications
    Application data is a prerequisite for most Application Object Library setups such as Menus, Responsibilities, and Concurrent Programs. iSetup does not currently migrate Custom Applications. So, if you have created any custom application on the source instance, please create it manually on the target instance before moving Application Object Library (AOL) data.
    General Foundation Selection Set
    Setup objects in the General Foundation selection set support filtering, i.e. the ability to extract specific setups. Since most AOL setup data such as Menus, Responsibilities and Request Groups is shipped by Oracle itself, it does not make sense to migrate all of it, since it is already available on the target instance. Hence, it is strongly recommended to extract only those setup objects which were edited or added by you; this also improves performance. iSetup uses FNDLOAD (the seed data loader) to migrate most AOL setups. The default behavior of FNDLOAD is given below.
    Case 1 – Shipped by Oracle (Seed Data)
    FNDLOAD checks the last_update_date and last_updated_by columns to decide whether to update a record. If a record is shipped by Oracle, its default owner is Oracle, and FNDLOAD skips records that are identical; it won't change the last_updated_by or last_update_date columns.
    Case 2 – Shipped by Oracle and customized by you
    If a record was customized in the source instance, FNDLOAD updates the record based on the last_update_date column. If the last_update_date in the target is more recent, FNDLOAD does not update the record, so last_updated_by is unchanged. Otherwise, it updates the record with the user who customized it in the source instance.
    Case 3 – Created and maintained by customers
    If a record was newly added or edited in the source instance by you, FNDLOAD updates the record based on the last_update_date column. If the last_update_date of the record in the target is more recent, FNDLOAD does not update the record, so last_updated_by is unchanged. Otherwise, it updates the record with the user who customized it in the source instance.
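    Conceptually (an illustration only, not FNDLOAD's actual code), the decision in all three cases boils down to an update guarded by the timestamp; the table and bind names below are made up:

    update some_seed_table t
    set    t.attribute_value  = :source_value,
           t.last_updated_by  = :source_user,
           t.last_update_date = :source_update_date
    where  t.entity_key        = :entity_key
    and    t.last_update_date <= :source_update_date;  -- a newer target row wins and is left alone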
    Profiles
    HR: Business Group => Set to the name of the Business Group for which you would like to extract data from the source instance. After loading the Business Group onto the target instance, make sure that this profile option is set appropriately.
    HR: Security Profile => Set to the name of the Business Group for which you would like to extract data from the source instance. After loading the Business Group onto the target instance, make sure that this profile option is set appropriately.
    MO: Operating Unit => Set to the name of the Operating Unit for which you would like to extract data from the source instance. After loading the Operating Unit onto the target instance, make sure that this profile option is set if required.
    Navigation path to do the above setup:
    System Administrator -> Profile -> System.
    Query for the above profiles and set the values accordingly.
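    To verify what a session actually resolves, here is a hedged sketch using the standard FND_PROFILE API (the internal profile option names are assumptions; confirm them in the System Profile form):

    select fnd_profile.value('PER_BUSINESS_GROUP_ID') as business_group_id,
           fnd_profile.value('ORG_ID')                as operating_unit_id
    from   dual;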
    Descriptive & Key Flex Fields
    You must compile and freeze the flexfield values before extracting with iSetup.
    Otherwise, the result is a partial migration of data. Please verify that all the data has been extracted by reporting on your extract before loading, to ensure data consistency.
    You can load KFF/DFF data to the target instance even when the structures in the source and target instances differ, but only in the cases below.
    Case 1:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1, Loc2, Loc3 (Mandate), Loc4, Loc5 and Loc6
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3 and Loc4, the locations will be loaded to the target instance without any issue. If you do not provide a value for Loc3, the API will fail, as Loc3 is a mandatory field in the target.
    Case 2:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1 (Mandate), Loc2
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3 and Loc4 and load the data to the target instance, the API will fail, as Loc3 and Loc4 do not exist in the target instance.
    It is always recommended that the KFF/DFF structure be the same in both the source and target instances.
    Concurrent Programs and Request Groups
    The Concurrent Program API migrates the program definition (definition + parameters + executable registration) only. It does not migrate the physical executable files under APPL_TOP; please use a custom solution to migrate those. Load Concurrent Programs prior to loading Request Groups; otherwise, the associated concurrent program metadata will not be moved, even though the Request Group extract contains the associated Concurrent Program definition.
    Locations - Geographies
    If you have any custom Geographies, iSetup does not have an API to migrate that setup. Enter them manually before loading the Locations API.
    Currency Types
    iSetup does not have an API to migrate currency types. Enter them manually on the target instance after loading the Currency API:
    GL Fiscal Super user --> Setup --> Currencies --> Rates --> Types
    Associating Employee Details with a User
    The extract process does not capture the employee details associated with users. So, after loading the employee data successfully on the target instance, you have to configure the associations again on the target instance.
    Accounting Setup
    Make sure that all Accounting Setups that you wish to migrate are in status “Complete”. In-progress or incomplete Accounting Setups will not be migrated successfully.
    Note: currently iSetup does not migrate Sub-Ledger Accounting (SLA) methods. Oracle ships some default SLA methods such as Standard Accrual and Standard Cash; you may make use of these two. If you want to use your own SLA method, you need to create it manually on the target instance, because iSetup does not have an API to migrate SLA. If a Primary Ledger is associated with Secondary Ledgers using a different Chart of Accounts, the mapping rules should be defined manually in the target instance, and the mapping rule name should match the XML tag “SlCoaMappingName”. After that, you will be able to load the Accounting Setup to the target instance.
    Organization API - Product Foundation Selection Set
    All organizations defined in the HR module are extracted by this API. It does not extract Inventory Organizations or Business Groups. To migrate Inventory Organizations, use the Inventory Organization API under the Discrete Mfg. and Distribution selection set. To extract Business Groups, use the Business Group API.
    Inventory Organization API - Discrete Mfg & Distribution Selection Set
    The Inventory Organization API extracts Inventory Organization information only. Use the Inventory Parameters API to move parameters such as accounting information. The Inventory Organization API supports Update, which means you can update existing header-level attributes of an Inventory Organization on the target instance. The Inventory Parameters API does not support update; to update Inventory Parameters, use the Inventory Parameters Update API.
    We have a known issue where the Inventory Organization API migrates non-process-enabled organizations only. If your inventory organization is process enabled, you can migrate it with a simple workaround. Download the extract zip file to your desktop and unzip it. Navigate to the Organization XML and edit the tag <ProcessEnabledFlag>Y</ProcessEnabledFlag> to <ProcessEnabledFlag>N</ProcessEnabledFlag>. Zip the extract back up and upload it to the target instance; you can load the extract now. After successful completion of the load, you can manually enable the flag through the Forms UI. We are working on this issue and will update you once a patch is released to Metalink.
    Freight Carriers API - Product Foundation Selection Set
    The Freight Carriers API in the Product Foundation selection set requires Inventory Organization and Organization Parameters as prerequisite setups; these two APIs are available under the Discrete Mfg. and Distribution selection set. The Freight Carriers API is also available under the Discrete Mfg. and Distribution selection set under the names Carriers, Methods, Carrier-ModeServ, and Carrier-Org. So, use the Discrete Mfg. selection set to load Freight Carriers. In the next rollup release, the Freight Carriers API will be removed from the Product Foundation selection set.
    Organization Structure Selection Set
    It is highly recommended to set a filter and extract and load data related to one Business Group at a time. For example, setup objects such as Locations, Legal Entities, Operating Units, Organizations and Organization Structure Versions support filtering by Business Group. So, set the filter for a specific Business Group, then extract and load the data to the target instance.
    List of mandatory iSetup Fwk patches*
    8352532:R12.AZ.A - 1OFF:12.0.6: Ignore invalid Java identifier or Unicode identifier characters from the extracted data
    8424285:R12.AZ.A - 1OFF:12.0.6:Framework Support to validate records from details to master during load
    7608712:R12.AZ.A - 1OFF:12.0.4:ISETUP DOES NOT MIGRATE SYSTEM PROFILE VALUES
    List of mandatory API/functional patches*
    8441573:R12.FND.A - 1OFF:12.0.4: FNDLOAD DOWNLOAD COMMAND IS INSERTING EXTRA SPACE AFTER A NEWLINE CHARACTER
    7413966:R12.PER.A - MIGRATION ISSUES
    8445446:R12.GL.A - Consolidated Patch for iSetup Fixes
    7502698:R12.GL.A - Not able to Load Accounting Setup API Data to target instance.
    Appendix
    How to read logs
    ·     Logs are very important for diagnosing and troubleshooting iSetup issues. Logs contain both functional and technical errors.
    ·     To find the log, navigate to the View Detail screens of Extracts/Transforms/Loads/Standard/Comparison Reports and click the View Log button.
    ·     Generic Loader (FNDLOAD, or seed data loader) logs are not printed as part of the main log. To view the actual log, take the request_id specified in the concurrent log and search for it in the Forms Request Search window on the instance where the request was launched.
    ·     Functional errors are mainly due to:
    o     Missing prerequisite data – you did not load one or more prerequisite APIs before loading the current API. For example, trying to load “Accounting Setup” without loading “Chart of Accounts” results in this kind of error.
    o     Business validation failure – the setup is incorrect per a business rule. For example, a start date cannot be greater than the end date.
    o     The API does not support updating records – if there is a matching record in the target instance and the API does not support update, you get this kind of error.
    o     Update Records was unselected when launching the load – if there is a matching record in the target instance and you did not select Update Records, you get this kind of error.
    Example – business validation failure
    o     VONAME = Branches PLSQL; KEY = BANKNAME = 'AIBC'
    o     BRANCHNAME = 'AIBC'
    o     EXCEPTION = Please provide a unique combination of bank number, bank branch number, and country combination. The 020, 26042, KA combination already exists.
    Example – business validation failure
    o     Tokens: VONAME = Banks PLSQL
    o     BANKNAME = 'OLD_ROYAL BANK OF MY INDIA'
    o     EXCEPTION = End date cannot be earlier than the start date
    Example – missing prerequisite data.
    o     VONAME = Operating Unit; KEY = Name = 'CAN OU'
    o     Group Name = 'Setup Business Group'
    o     ; EXCEPTION = Message not found. Application: PER, Message Name: HR_ORG_SOB_NOT_FOUND (Set of books not found for ‘Setup Business Group’)
    Example – technical or fwk error
    o     OAException: System Error: Procedure at Step 40
    o     Cause: The procedure has created an error at Step 40.
    o     Action: Contact your system administrator quoting the procedure and Step 40.
    Example – technical or fwk error
    o     Number of installed languages on source and target does not match.

    Mugunthan
    Yes, we have applied 11i.AZ.H.2. I am still getting several errors that we are trying to resolve.
    One of them is
    ===========>>>
    Uploading snapshot to central instance failed, with 3 different messages
    Error: An invalid status '-1' was passed to fnd_concurrent.set_completion_status. The valid statuses are: 'NORMAL', 'WARNING', 'ERROR'FND     at oracle.apps.az.r12.util.XmlTransmorpher.<init>(XmlTransmorpher.java:301)
         at oracle.apps.az.r12.extractor.cpserver.APIExtractor.insertGenericSelectionSet(APIExtractor.java:231)
    please assist.
    regards
    girish

  • Replacing / Compiling Database Triggers in a HA/High Load system

    Hi there,
    My colleague has just asked me whether downtime needs to be scheduled to replace an AFTER insert/update/delete database trigger in which I've made a minor change.
    If this were a package or procedure, the answer would be an obvious yes, as I'd be wary of my users getting the dreaded ORA-04068: existing state of packages has been discarded.
    What about when working with database triggers?
    If the trigger is firing at the same time that I run my CREATE OR REPLACE TRIGGER DDL, will everything fall into a screaming heap?
    Notes:
    Oracle 10g R2 database.
    High DML rates on the table the AFTER IUD trigger is attached to.
    It's a near certainty that when the CREATE OR REPLACE TRIGGER DDL runs, the trigger will be active at the time.
    The trigger is 'simple' in that its only use is to call a DB package with some of the :new values as parameters.
    (I'm happy to RTFM, if someone can tell me where!)
    Cheers -
    Ron Marks
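
    For reference, a minimal sketch of the kind of trigger described above (the table, package and column names are made up):

    create or replace trigger orders_aiud
    after insert or update or delete on orders
    for each row
    begin
      -- hand off :new values to a package; note that on DELETE the :new columns are null
      audit_pkg.record_change(:new.order_id, :new.status);
    end;
    /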

    > If this were a package or procedure, the answer would be an obvious yes, as I'd be wary of my users getting the dreaded ORA-04068.
    Not true. That exception is thrown only if sessions have some state held in package variables. If there is no associated session state, no exception is thrown after a recompile.
    > If the trigger is firing at the same time that I run my CREATE OR REPLACE TRIGGER DDL, will everything fall into a screaming heap?
    You'll wait - to lock the object in the library cache which represents your trigger. I'm not sure whether this also requires a library cache lock on the underlying table, but I would guess that it does. BTW, locking the table would make no sense: CREATE TRIGGER is DDL, which means that before it begins it issues a COMMIT, so all locks are released, and it is quite possible that the TM lock for CREATE TRIGGER won't be acquired because the resource is busy.
    Oracle 11gR2 claims to address the problem of application upgrades in high-load environments with a feature called "Editions" (not a good name - try finding anything about it via Google...).
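    A minimal sketch of that 11gR2 feature (the user and edition names are assumptions):

    -- one-time: allow the application schema to own editioned objects
    alter user app_owner enable editions;

    -- create a new edition and compile the changed code in it
    create edition app_v2;
    alter session set edition = app_v2;
    -- sessions still on the old edition keep running the old code
    -- until they switch editions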

  • Hardware requirements for SQL database used with TestStand

    We want to set up a SQL Server instance to store the data from TestStand.
    How do we determine the hardware requirements for this server? It will be used with 10-30 machines running tests and logging data, and another 5-10 machines running queries to pull the data back out for analysis. The result data size will range from 50 to 25,000 results per run (run times are 1 minute for the 50-result tests and 5 hours for the 25,000-result tests).
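
    As a hedged back-of-envelope check (in the spirit of the select-from-dual arithmetic earlier in this thread), those numbers imply fairly modest sustained insert rates even in the worst case:

    select 30*50/60           as rows_per_sec_short_runs,  -- 30 stations x 50 results / 1 min = 25
           30*25000/(5*60*60) as rows_per_sec_long_runs    -- 30 stations x 25,000 results / 5 h ~ 42
    from dual;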

    Hi,
    database design and hardware requirements are never easy. There are a lot of scientific papers on workload tests and requirement assumptions. I cannot give a short answer as to which machine to use, just some ideas and starting points.
    The most important parts of a database system with large data sets are network bandwidth, RAM and storage bandwidth. With smaller data sets and more complex transactions, the CPU becomes more important.
    In this TestStand case the data sets are usually rather small. If the queries are not too complex, the requirements seem not too high.
    Database performance is usually measured in transactions per minute (tpm). Specialized database servers can perform several thousand tpm and have costs starting at about $15 per tpm. See MS' ad page for a good starting point: http://www.microsoft.com/sql/evaluation/compare/benchmarks.asp
    You may also visit the TPC.org homepage.
    To be more specific:
    I'd choose a modern Intel-based system like a P4 3 GHz (which has virtual multiprocessors) with at least 512 MB RAM and a RAID 5 hard disk storage system (not necessarily SCSI) with at least 3 single HDs. Use at least a 100 Mbit LAN connection, ideally with a switch. Don't forget backup!
    Check also the pages of your preferred database provider.
    I am at a starting point here too. We have chosen MySQL, which runs (at least for now) on the very same machine where TestStand & LabVIEW are running. We plan to test this setup under increasing load to get a practical assessment of the hardware requirements. The planned final setup will have up to 5 test stations and 5 query stations. We'll run about 50 rather complex tests of about 4 hours each that operate in parallel on the test stations.
    HTH and
    Greetings from Germany
    Uwe

  • What are considerations for Highly Transactional Database

    Hi,
    Can anyone please tell me about the considerations for a highly transactional database? Is Oracle 10g RAC better, or Oracle DB with Data Guard?
    Thanks.
    Regards,
    RJiv.

    I'm still not understanding what your question is... Load characteristics are quite irrelevant when discussing the necessity of Data Guard, though the amount of redo generated obviously impacts how much bandwidth is required between the primary and standby sites.
    Bare transaction numbers are somewhat irrelevant when discussing the necessity or advisability of RAC, since the amount of work a "transaction" does depends wildly on the application, the number of "transactions" a server can handle depends wildly on the hardware, and the business's need for scalability/load balancing/surviving node failure is independent of the transaction load.
    Justin

  • SharePoint Online : Unable to load type Microsoft.SharePoint.Upgrade.SPUpgradeCompatibilityException required for deserialization.

    Since this morning, we have been having an issue on our SharePoint Online site.
    No WebPart loads; each one displays this line:
    Unable to load type Microsoft.SharePoint.Upgrade.SPUpgradeCompatibilityException required for deserialization.
    Please respond quickly with a fix at
    [email protected]

    Hi,
    According to your post, my understanding is that your SharePoint Online site is unable to load web parts and gets the “Microsoft.SharePoint.Upgrade.SPUpgradeCompatibilityException” error.
    To my knowledge, the SPUpgradeCompatibilityException occurs during upgrade when the front-end web server attempts to connect to an incompatible database.
    Please check whether the database is compatible.
    Regarding SharePoint Online, for quick and accurate answers to your questions, it is recommended that you start a new thread in the Office 365 forum:
    Office 365 forum
    http://community.office365.com/en-us/forums/default.aspx
    Best Regards,
    Linda Li
    TechNet Community Support

  • In an Oracle RAC environment with an OLTP database: load balancing options and advantages

    In an Oracle RAC environment with an OLTP database, what are the options for load balancing, along with their advantages?

    You can use a software load balancer.
    https://forums.oracle.com/forums/search.jspa?threadID=&q=Software+AND+Load+AND+Balancer&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    Installing and Configuring Web Cache 10g and Oracle E-Business Suite 12 [ID 380486.1]
    Thanks,
    Hussein
