Best Performance for apex over Linux+HW RAID

Dear Experts,
Given:
-- Server Specs:
---- HP ProLiant ML370 G6 E5540
---- Processor name: Intel® Xeon® E5540 (4 core, 2.53 GHz, 8MB L3, 80W)
---- Storage Controller: (1) Smart Array P410i/256MB
---- 8 SAS hard disks, 320 GB each
---- 12 GB RAM
Required:
1) What are the best practices to get the best performance from Apex with the following solution stack:
---- Oracle Linux 6.3 x86_64
---- Grid Infrastructure 11g R2
---- Database 11g R2
---- Oracle Apex 4.2.1
2) What is the best hardware RAID configuration?
3) What is the maximum number of concurrent users for applications on Apex, given the above specs and software stack?
Regards
Mahmoud

Dear Alvaro
Thank you for your response.
The current status:
When I entered the HP ACU from the bootable SmartStart CD, I found the following under the configuration of the Smart Array P410i in the Embedded Slot:
-Smart Array P410i in Embedded Slot
---Internal Drive Cage at Port 1I : Box 1
------300 GB 2-Port SAS Drive at Port 1I : Box 1 : Bay 1
------300 GB 2-Port SAS Drive at Port 1I : Box 1 : Bay 2
------300 GB 2-Port SAS Drive at Port 1I : Box 1 : Bay 3
------300 GB 2-Port SAS Drive at Port 1I : Box 1 : Bay 4
---Internal Drive Cage at Port 2I : Box 1
------300 GB 2-Port SAS Drive at Port 2I : Box 1 : Bay 5
------300 GB 2-Port SAS Drive at Port 2I : Box 1 : Bay 6
------300 GB 2-Port SAS Drive at Port 2I : Box 1 : Bay 7
------300 GB 2-Port SAS Drive at Port 2I : Box 1 : Bay 8
The questions now:
1) Do you recommend the following configuration for the RAID logical drives?
Using Logical View:
SAS Array A - 4 Logical Drive(s)
--- Logical Drive 1 (50.0 GB, RAID 1+0) ---> for OS
--- Logical Drive 2 (24.0 GB, RAID 0) ---> for SWAP
--- Logical Drive 3 (200.0 GB, RAID 1+0) ---> for ASM DATA
--- Logical Drive 4 (296.7 GB, RAID 1+0) ---> for ASM FRA
SAS Array B - 1 Logical Drive(s)
--- Logical Drive 5 (1.1 TB, RAID 0) ---> for non-critical applications and sources
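As a sanity check on the proposed layout, here is a minimal sketch (Python; the four-disk arrays and 300 GB drive size come from the ACU listing above, while the overhead formulas are the generic ones for each RAID level, not anything P410i-specific):

```python
# Rough usable-capacity check for the proposed Smart Array layout.
# RAID 0 stripes with no redundancy; RAID 1+0 mirrors pairs and then
# stripes them, so half the raw capacity is usable; RAID 5 loses one
# disk's worth of capacity to parity.

def usable_gb(n_disks: int, disk_gb: float, level: str) -> float:
    """Approximate usable capacity in GB for common RAID levels."""
    if level == "0":
        return n_disks * disk_gb
    if level == "1+0":
        return n_disks * disk_gb / 2
    if level == "5":
        return (n_disks - 1) * disk_gb
    raise ValueError(f"unsupported RAID level: {level}")

# Four 300 GB disks per array, as in the ACU listing
print(usable_gb(4, 300, "1+0"))  # 600.0 GB usable per 4-disk array
print(usable_gb(4, 300, "0"))    # 1200.0 GB usable, but no redundancy
```

Note that mixing RAID 1+0 and RAID 0 logical drives on the same physical array (as in Logical Drive 2 above) keeps the RAID 0 portion unprotected: losing any one disk loses that logical drive.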
2) What steps do you recommend for getting Oracle Apex 4.2 installed?
Best Regards
Mahmoud

Similar Messages

  • Which iOS gives the best performance for iP4 & 3GS

    Hi all,
    How are you all? I'm new here.
    I'm currently running iOS 4.1 on both my iPhones (3GS and 4), but I want the new features.
    Can anyone recommend the best iOS for the iP3GS and iP4 (for the best performance), without draining the battery or making my iPhones slower?
    I'd greatly appreciate it.
    Thanks!

    Doesn't really matter, since the only iOS available for either of your phones is iOS 4.3.5, if you choose to update. Users have complained about every single iOS update since the first update for the original iPhone was released. The fact is, the vast majority of users have zero issues and happily go about their way.

  • Best practices for Voice over MetroRing

    Hi, We have installed a MetroRing Gigabit Ethernet using 3550 and 6500 Catalyst switches. Today, only data is running, but looking at tomorrow, when voice/video be requested, I am trying to find some best practices for QoS or traffic classification. If you can point me to some of them will be great.

    Hi,
    You might also find those useful: " LAN QoS"
    http://www.cisco.com/web/about/ac123/ac147/ac174/ac176/about_cisco_ipj_archive_article09186a00800c83cd.html
    and "Cisco AutoQoS White Paper"
    http://www.cisco.com/en/US/tech/tk543/tk759/technologies_white_paper09186a00801348bc.shtml
    as well as "Configuring QoS" for Catalyst 6500 switches
    http://www.cisco.com/en/US/products/hw/switches/ps708/products_configuration_guide_chapter09186a008007fb2b.html
    The most comprehensive starting point will be: "Quality of Service (QoS)"
    http://www.cisco.com/en/US/tech/tk543/tsd_technology_support_category_home.html
    Did this help?
    Martin

  • X4500 RAID Configuration for best performance for video storage

    Hello all:
    Our company is working with a local university to deploy IP video security cameras. The university has an X4500 Thumper that they would like to use for the storage of the video archives. The video management software (VMS) will run on an Intel based server with Windows 2003 server as the OS and connect to the Thumper via iSCSI. The VMS manages the permissions, schedules and other features of the cameras and records all video on the local drives until the scheduled archive time. When the archive time occurs, the VMS transfers the video to the Thumper for long term storage. It is our understanding that when using iSCSI and Windows OS there is a 2TB limit for the storage space - so we will divide the pool into several 2TB segments.
    The question is: Given this configuration, what RAID level (0, 1, Z or Z2) will provide the highest level of data protection without compromising performance to a level that would be noticeable? We are not writing the video directly to the Thumper; we are transferring it from the drives of the Windows server to the Thumper, and we need that transfer to be very fast, since the VMS stops recording during the archiving and restarts when complete, creating down time for the cameras.
    Any advice would be appreciated.

    I'd put as many disks as possible into a RAID 5 (striping with distributed parity) set. This will provide the highest level of performance with the ability to sustain a single disk failure.
    With striping, some data is written to all the disks in the stripe set. So, if you have 10 disks in the set, then instead of writing data to a single disk, which is slow, 1/10th of the data is written to each disk simultaneously, which is very fast. In effect, the more disks you write to, the faster the operation completes.
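The round-robin write pattern described above can be illustrated with a toy sketch (Python; the chunk size and payload are arbitrary, and real RAID 5 also writes a parity chunk per stripe, which this omits):

```python
# Toy striping: split a payload into fixed-size chunks and deal them
# round-robin across the disks, so each disk holds ~1/N of the data
# and all disks can be written to in parallel.

def stripe(data: bytes, n_disks: int, chunk: int = 4) -> list[bytes]:
    disks = [bytearray() for _ in range(n_disks)]
    for i, off in enumerate(range(0, len(data), chunk)):
        disks[i % n_disks].extend(data[off:off + chunk])
    return [bytes(d) for d in disks]

parts = stripe(b"0123456789ABCDEFGHIJ", n_disks=5)
print(parts)  # 20 bytes spread as 4 bytes per disk
```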

  • Best Practice for Apex Implementation

    Hello,
    I'm looking for some guidance in best practices on implementing Apex across our enterprise. Do we install it on many databases based on whether an application gets most of its data from that database? And if so, could we use one 10gAS web server to serve up all of the instances?
    We currently have Apex installed on RAC databases in each environment (Dev, Int, QA, Prod), and then use dblinks to connect to the many remaining databases. Each RAC environment then uses an appropriate 10gAS web server (one web server per Apex installation). I'm wondering if this is a good approach or not? Any suggestions are appreciated.
    Julie

    Hi,
    Some of the standards will depend on your end-user capabilities/environments but I have a good standards document we use when developing UPK content for the eBus suite which might help.
    Email me at [email protected] and I'll send it over.
    Jon

  • Best NAS for full-fledged Linux?

    Now I know this thread's title might be a little ambiguous.
    I'm trying to find the best network storage which will allow me to host multiple types of data, like movies, tv shows, etc. and stream it through my LAN, so that it can be accessed with any of my TVs with appropriate hardware (with access to LAN).
    The thing is, I'd also like it to be able to download data from Bittorrent and Rapidshare, then unrar them.
    That's why I think I need a NAS which will be able to run Linux with a lightweight GUI session to use many programs available on Linux, like JDownloader (will it handle Java?)
    I don't even know if drives like these are available. I know about drives like WD My Book which could have such functionality (progams like pyLoad), but AFAIK, they only allow to login to them using ssh, and that's full CLI, as I understand it.
    It doesn't need to be NAS, it can also be a router with an external hard drive, I'd just like it to have the functionality as listed above.
    I have some basic Linux knowledge, and am not afraid of installing custom software, as long as it's not so hard.
    PS: It would be also great to be able to log into it from the Internet, not only inside the LAN.
    PPS: I don't know if hardware requirements are that important, but they'd include:
    *) Gigabit Ethernet preferred, 100 Mbit otherwise
    *) At least 1 TB of storage
    Thanks in advance!
    Last edited by warnec (2010-07-08 13:32:06)

    .:B:. wrote:
    GUI sessions... Excuse me?
    A NAS should run SSH, maybe some services with web frontends. A GUI is a waste of RAM and CPU cycles.
    I agree with you about the GUI for a NAS.... but the OP also wants to run JDownloader, which requires Java, and I believe, a GUI.  I'm looking for the same sort of thing, and the Java and the GUI are where I get stuck.
    I do have an NSLU2, but haven't set it up... I don't think it has the horsepower or RAM to do what I want, which means hacking the hardware -- yuck!
    The OP may want to consider Marvell's PlugPC platform, which will do much of what he's looking for, minus the Java/JDownloader (I think).  It's a nice compact, low-power solution, and there are companies that have taken the platform and added software and hardware to make it a more finished product.  I believe the PogoPlug is an example of this, other ones can be found on Marvell's website, or by Googlizing "PlugPC"
    An old PC and FreeNAS is another alternative that I've considered.  That, or a lightweight version of Linux, would provide the GUI and Java features to run JDownloader for not much more money than a stand-alone NAS.  Less, actually, if you have an old PC with Gigabit or 100 Megabit Ethernet available.  The downside to this alternative is space and power.  An Atom-based one might solve those issues (Asus makes a nice one about the size of an external USB HD for $200), but even that would not have the low low power consumption of a PlugPC or an NSLU2.
    I've been playing around with my WD TV and Patriot Box Office Media Players... the WDTV can be hacked to host torrents (the Patriot does it natively) and they can sorta handle fileserving as well.  No Java or JDownloader solution, unfortunately... but they are a reasonable stopgap measure for me right now.
    The beauty of all these various solutions is that they've largely been enabled by Linux and the FOSS movement.  It's truly remarkable what Torvalds started and the community has embraced and extended...
    If I come across other solutions I'll report back...

  • Best approach for roll over in BPC?

    Hi All,
    We are currently looking for the best approach in SAP BPC for the roll
    forward of closing balances into opening balances from the previous
    year to the current period.
    Our current approach takes the closing balance account lines from the
    previous year , copies them into specific opening year members (f_init
    and ob_init) using business transformation rules then every month there
    are business transformation rules which takes these values in local and
    base currency to calculate the fx gain\loss and also copies over the
    closing balance at the historic rate into the opening balance of the
    current period. This approach takes both input data and journal data
    into account.
    We also need to take into account now the fact that we need to pull
    through any journals which were posted to adjustment companies and some
    (but not all) legal entities for traditional lines which do not have
    typical opening balance accounts (e.g. cash, stock, accruals etc.). The
    approach above can work but we need to add the relevant opening balance
    accounts.
    Please could you advise whether there is a better approach than this?
    Kind Regards,
    Fiona

    I normally prefer saving images in LocalFolder and save file name in database table. I prefer this because saving just file name will keep size of SQLite database small so will load faster.
    Gaurav Khanna | Microsoft .NET MVP | Microsoft Community Contributor

  • Best Setup for Working over a Network (What is it)

    Curious,
    I've been using Dreamweaver for a while now, both at work and at home as a hobbyist. My preferred setup when working on a site is to be connected via FTP to the site, so that I can work on the documents directly without having local and remote versions of the documents.
    If I choose a local directory and also a remote directory with a Local/Network connection for the remote files, that works fairly well, but I end up getting local copies of the files and folders that I'm working on, which is good for performance but not so good when there is getting and putting of documents over a slow connection.
    My preferred method using a Local/Network connection for the remote files is to do that and not have a local folder or local files at all.
    Like I said, the only remote connection that seems to work well for me is the FTP connection.
    Wondering what others' experience is and what they recommend. I guess I may have to change the way I work, but I would prefer not to. I have used other programs as well (e.g. BBEdit (which I like; I'm working on a Mac), TextMate). These are a little faster, but there is still some performance issue when working over the LAN.

    As a professional, I would never work in that configuration. One silly mistake and you have just clobbered your only copy of the file. Of course, I never make such mistakes, but I've seen others do it! 8)
    I will always work from a local site, and upload to a remote site. I test from a local networked MySQL/PHP server. My sites are all on a shared network drive so that I can a) easily back them up, and b) access the same sites easily from several locations within my peer LAN.
    When you elect not to have a local site, you lose the option to use some of DW's basic capabilities, like Templates and Library items. While I rarely use the latter, I use Templates all the time (along with a healthy dose of SSI).
    Does that help?
    Murray --- ICQ 71997575
    Adobe Community Expert
    (If you *MUST* email me, don't LAUGH when you do so!)
    ==================
    http://www.projectseven.com/go - DW FAQs, Tutorials & Resources
    http://www.dwfaq.com - DW FAQs, Tutorials & Resources
    ==================
    "An Ent" <[email protected]> wrote in
    message
    news:[email protected]...
    > Curious,
    >
    > I'm been using Dreamweaver for awhile now, both at work
    and at home as a
    > hobbyist. My preferred setup when working on a set is to
    be connected via
    > FTP
    > to the set so that I can work on the documents directly
    without have local
    > and
    > remote versions of the documents.
    >
    > If I do choose a local directory and then also a remote
    directory with and
    > Local/Network connection for the remote files that works
    failry well but I
    > end
    > up getting local copies of files and folders that I'm
    working on, which is
    > good
    > for performance but not so good when there is getting
    and putting of
    > documents
    > and a slow connection.
    >
    > My preferred method using a Local/Network connection for
    the remote files
    > is
    > to do that and not even have a local folder and no local
    files.
    >
    > Like I said, the only remote connection for me that
    seems to work well is
    > the
    > FTP connection.
    >
    > Wondering what others experience is and what they
    recommend. I guess I
    > may
    > have to change the way that I work but I would prefer
    not to. I have used
    > other programs as well (e.g. BBEdit (which I like, I'm
    working on a Mac),
    > TextMate). These are a little faster but there is still
    some performance
    > issue
    > when working over the LAN.
    >

  • Best practice for Video over IP using ISDN WAN

    I am looking for the best practice to ensure that the WAN has sufficient active ISDN channels to support the video conference connection.
    Reliance on a load threshold either:
    - takes too long for the ISDN calls to establish, causing problems for video setup, or
    - is too quick to place additional ISDN calls when only data is using the line.
    What I need is for the ISDN calls to be pre-established just prior to the video call. I have done this in the past with the "ppp multilink links minimum" command, but this manual intervention isn't the preferred option in this case.
    thanks


  • Best practice for credentials over the net?

    I'm developing an AIR app which requests information about a user's account from a webserver. Currently in the AIR app, I get the user to input the same username and password they use on the website. I MD5 the password and send it to my server, where I check it against an MD5 password stored in a DB.
    Is this secure enough? Is there a better way?
    I was thinking about doing away with the password and using an application key, but that's an extra step for the user to hurdle before using the app.
    What are you guys and girls doing? How do all the twitter apps authenticate on the webserver?

    This method is as secure as the password: an attacker can see the hashed value, and you must assume that they know what has been hashed, and with what algorithm. Therefore, the challenge in attacking this system is simply to hash lots of passwords until you get one that gives the same value. Rainbow tables may make this easier than you assume.
    Why not use SSL to send the login request? That encrypts the entire conversation, making snooping pointless.
    You should still MD5 the password so you don't have to store it unencrypted on the server, but that's a side issue.
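As a concrete sketch of the "hash server-side, but do it properly" point (Python stdlib only; the per-user salt, iteration count, and use of PBKDF2 instead of plain MD5 are my additions, not something the original posters described):

```python
import hashlib
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # A salted, iterated hash (PBKDF2) makes precomputed rainbow
    # tables useless and brute force far more expensive than a
    # single unsalted MD5 of the password.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)           # store the per-user salt alongside the hash
stored = hash_password("s3cret", salt)

assert hash_password("s3cret", salt) == stored   # correct password verifies
assert hash_password("wrong", salt) != stored    # wrong password does not
```

This only protects the stored value; as the reply says, the login request itself should still travel over SSL/TLS.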

  • Best program for voice over in slideshow?

    I want to make a 30 minute slide show with voiceover.  Should I make it in iDVD, iPhoto, iMovie, or something else?

    A GarageBand Podcast would work pretty well, too.
    http://www.apple.com/education/reachall/learn/docs/GarageBand_Podcast_Step_Card.pdf

  • How to connect multiple Xserve Raid for Best Performance

    I'd like to get an idea of how to connect multiple Xserve RAIDs to get the best performance for FCP doing multiple-stream HD.

    Again, for storage (and retrieval), FireWire 400 should be fast enough. If you are encoding video directly to the external drive, then FireWire 800 would probably be beneficial. But as long as the processing of the video is taking place on the fast internal SATA drive, and then you are storing files on the external drive, FireWire 400 should be fine.
    Instead of speculating about whether it will work well or not, you need to set it up and try your typical work flow. That is the only way you will know for sure if performance is acceptable or not.
    For Time Machine, you should use a single 1.5TB drive. It is likely that by the time your backup needs comes close to exceeding that space, you will be able to buy a 3TB (or larger) single drive for the same cost. Also, I would not trust a RAID where the interaction between the two drives is through two USB cables and a hub. If your primary storage drive fails, you need your backup to be something that is simple and reliable.
    Oh, and there should be no problem with the adapter, if you already have it and it works.
    Edit: If those two external drives came formatted for Windows, make sure you have use Disk Utility Partition tab to repartition and reformat the drive. When you select the drive in the Disk Utility sidebar, at the bottom of the screen +Partition Map Scheme+ should say *GUID Partition Table*. When you select the volume under the drive in the sidebar, Format should say *Mac OS Extended (Journaled)*.

  • Encoding: Best Performance vs. Best Quality

    So I have a project that if I select Best Performance it shows that the DVD will be 3.4GB, and if I select Best Quality it shows the DVD will be 2.0GB.
    So my question is: isn't that contrary to logic? Why wouldn't Best Quality be the largest size? Is Best Quality going to give me the best possible video quality?

    Since you have posted to the iDVD 6 forum, I assume that is what you have.
    iDVD 6 has two encoding modes: 'Best Performance' and 'Best Quality'. iDVD '08 adds: 'Professional Quality'.
    People misunderstand the names and I wish Apple had used different names for all three modes.
    The below applies to a single layer disc (double the times for a double layer disc):
    'Best Performance' uses a fixed video encoding bit-rate that produces a DVD with a data playback bit-rate just about as high as a set-top DVD player can handle. This limits content to 60 minutes or less.
    'Best Quality' uses a fixed video encoding bit-rate that is BASED ON THE TOTAL AMOUNT OF CONTENT BEING COMPRESSED and is best suited for content between 60 and 120 minutes. Note that all the content is encoded at the same bit-rate so that it can fit on a single layer disc. (Apple calls this single-pass variable bit-rate encoding because 120 minutes of content gets compressed more than 60 minutes of content.)
    The new 'Professional Quality' uses a variable video encoding bit-rate that is BASED ON THE INFORMATION IN THE CONTENT BEING COMPRESSED. It uses a two-pass process to first evaluate the content and then encode it based on the motion/detail of individual clips. It is best suited for content between 60 and 120 minutes. Note that not all the content is encoded at the same bit-rate BUT the maximum data bit-rate on playback can not exceed the playback capability of a set-top DVD player. (This is two-pass, variable bit-rate encoding.) This means the BEST encoded quality should be about what is obtained with 'Best Performance' for content under 60 minutes.
    If your content is under 60 minutes, use 'Best Performance'. If your content is between 60 minutes and 120 minutes, use 'Professional Quality' if your processor is fast enough and you don't mind waiting about twice the time required for 'Best Quality'.
    About the only thing Apple can do to further improve the quality of DVD encoded video is to offer compressed audio instead of just the present uncompressed PCM audio because the audio 'eats up' part of the playback bit-rate a set-top DVD player can handle. Compressed audio would make more of the maximum playback bit-rate available for video.
    In your case, with iDVD 6, use 'Best Performance' for content under 60 minutes and 'Best Quality' for content over 60 minutes. Remember that your menu content counts against the total time limit.
    F Shippey
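The trade-off described above is just capacity arithmetic; here is a rough sketch (Python; the 4.38 GiB single-layer capacity and 1536 kbps PCM audio figures are nominal values I am assuming, not Apple's exact encoder parameters):

```python
# Why longer content must be compressed harder: a single-layer DVD
# holds a fixed number of bits, so the video bit-rate available is
# (capacity / duration) minus the audio's fixed share.

DVD_CAPACITY_BITS = 4.38 * 1024**3 * 8   # ~4.38 GiB single-layer disc
PCM_AUDIO_KBPS = 1536                    # uncompressed stereo PCM audio

def video_kbps(minutes: float) -> float:
    """Video bit-rate budget (kbps) for content of the given length."""
    total_kbps = DVD_CAPACITY_BITS / (minutes * 60) / 1000
    return total_kbps - PCM_AUDIO_KBPS

print(round(video_kbps(60)))   # 60 minutes leaves a high video bit-rate
print(round(video_kbps(120)))  # 120 minutes gets far less per second
```

This also shows the reply's closing point: compressing the audio would shrink the fixed PCM_AUDIO_KBPS term and hand that bit-rate back to the video.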

  • Apex on linux - when and how to install http server

    1. I installed 10g Release 2 Enterprise Edition on VMware Linux on my laptop,
    2. I patched the database to 10.2.0.3,
    3. then I downloaded and installed Apex 3.0.1 on it.
    I thought the HTTP Server would come with the Apex 3 installation, but I understand that I needed to install it between the first and second steps?
    So I cannot be sure whether I will ruin everything if I install Oracle HTTP Server from the companion CD as the fourth step.
    Also, I would be glad to have a reference like the Dizwell installation guides for Apex on Linux, if there is any; I couldn't find one up to now :(
    Thank you for your guidance.
    Message was edited by:
    FENERBAHCE
    I started the installer from the companion CD and chose the 10g Companion Products (not htmldb) option, since the HTTP server is listed at the bottom.
    The installation finished successfully into a new Oracle home, but I cannot find the *.conf files or the opmn/bin folder under this new Oracle Home.
    I must be missing something :)

    APEX from version 2.0 works in two modes:
    1) using the Oracle HTTP Server, which as of version 3.x is no longer installed as part of the APEX installation
    2) using the XML DB feature, which is installed as part of a standard Oracle installation (the Oracle XML DB HTTP listener with the embedded PL/SQL gateway)
    You can launch APEX 3.x via these links (the port would be 8080):
    http://host:port/apex/apex_admin
    http://host:port/apex
    Try it.

  • Best slot for new PCIe FW-expansion card

    Hi there,
    I just upgraded my PM Quad with a NVIDIA 7800 GTX 512 card which occupies PCIe slots 1 and 2.
    Now I will have to install a FW 400/800 expansion card for operation of an external audio interface. I know that slot 3 would probably offer the best performance for the next card to be installed (8-lane)- but is likely to obstruct proper air circulation for the fan of my graphics card.
    So slot 4 (4-lane) would be the only alternative. I cannot judge if this would be sufficient for proper audio performance.
    So, any comments from the audience?
    Regards,
    Oliver
    PPC mini, PM Quad 4,5 GB RAM   Mac OS X (10.4.7)  

    Great - glad to hear that.
    Just inserted my card into slot 4 - real plug 'n play - smooth operation - no complaints so far!
    O.
