InfoSet - Performance and Design Question

Hello BW Gurus,
I have a question regarding InfoSets.
We have a backlog report designed as follows:
SD_C03 - with around 125 characteristics and key figures - the daily load will bring in around 10K records at a time.
A custom cube - say ZCUST01 - with around 50 characteristics and key figures - a daily full load of around 15K records.
My questions:
1. We have an InfoSet on top of these 2 cubes for reporting. Can we use an InfoSet for reporting? I have used InfoSets for master data and DSO reporting. Do you see any performance issue with using an InfoSet instead of a MultiProvider? Is there any alternative to reporting from the InfoSet?
2. Also, I executed the SAP_INFOCUBE_DESIGNS program for the above cubes, and some dimensions are more than 25% - like 58%, 75%, even 102%. So this has to be fixed for sure - is that correct? And if we don't change the design, what will the consequences be?
Please advise. We are in development and the objects have not been moved to production yet. Thanks again for your help in advance.
Senthil

Hi,
1. We have an InfoSet on top of these 2 cubes for reporting. Can we use an InfoSet for reporting? I have used InfoSets for master data and DSO reporting. Do you see any performance issue with using an InfoSet instead of a MultiProvider? Is there any alternative to reporting from the InfoSet?
MultiProviders are a great tool that can be used freely to "combine" data without adding much overhead to the system. You may need to combine data from different applications (like Plan and Actual). Another good use is from a data modeling point of view: you can enable "partitioning" of the data by using separate cubes for separate years of data. This way the cubes are more manageable, and with a MultiProvider on top the data can be combined for reports if required (instead of having one huge cube with all the data).
Also, by using a MultiProvider as an 'umbrella' InfoProvider, you can easily make changes to the underlying InfoProviders with minimal impact on the queries.
You should go for a MultiProvider when the requirement is to report on combined data coming from different InfoProviders.
A MultiProvider combines data using a union operation, which means all values of the underlying InfoProviders are taken into consideration. An InfoSet, on the other hand, works on the principle of a join: the join forms the intersection, and only the common data from the underlying InfoProviders is considered. In terms of performance the InfoSet can be better, since it fetches fewer records.
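The difference is easy to see in plain SQL terms (a rough sketch only - the table and field names below are invented stand-ins for the two cubes, not the real generated /BIC/ tables):

    -- MultiProvider = UNION: every record from both providers is kept
    SELECT doc_number, order_qty, 0 AS custom_kf FROM sd_c03_data
    UNION ALL
    SELECT doc_number, 0 AS order_qty, custom_kf FROM zcust01_data;

    -- InfoSet = JOIN: only records with matching keys survive
    SELECT a.doc_number, a.order_qty, b.custom_kf
    FROM   sd_c03_data a
           INNER JOIN zcust01_data b ON b.doc_number = a.doc_number;

So if a backlog record exists in SD_C03 but has no counterpart yet in ZCUST01, the MultiProvider still shows it while the InfoSet silently drops it - check which behaviour your backlog report actually needs before choosing.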
Check this:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/2f5aa43f-0c01-0010-a990-9641d3d4eef7
2. Also, I executed the SAP_INFOCUBE_DESIGNS program for the above cubes, and some dimensions are more than 25% - like 58%, 75%, even 102%. So this has to be fixed for sure - is that correct? And if we don't change the design, what will the consequences be?
If the dimension tables are large relative to the fact table, it affects query performance, because every query has to join those big dimension tables to the fact table. A common rule of thumb is to keep each dimension table well below 10-20% of the fact table size; for ratios like yours, consider redistributing the characteristics across dimensions or flagging a high-cardinality characteristic as a line-item dimension. It's better to change the design now, while you are still in development.
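You can approximate what SAP_INFOCUBE_DESIGNS reports with a simple count (a sketch only - the table names follow the usual BW convention of /BIC/D&lt;cube&gt;&lt;dim. no.&gt; for dimension tables and /BIC/F&lt;cube&gt; for the fact table; ZCUST01 and dimension 1 are just examples):

    -- dimension-to-fact ratio in percent for one dimension of ZCUST01
    SELECT (SELECT COUNT(*) FROM "/BIC/DZCUST011") * 100.0 /
           (SELECT COUNT(*) FROM "/BIC/FZCUST01") AS dim_to_fact_pct
    FROM dual;

A ratio near or above 100% means the dimension table has about as many rows as the fact table itself, so the dimension join buys you nothing - that characteristic is a prime candidate for a line-item dimension.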
Hope this helps.
Regards,
Debjani

Similar Messages

  • SAP BPC PERFORMANCE AND DESIGN SUBJECT

    Hello experts, I have an issue about the design of an application in SAP BusinessObjects BPC 7 for NW 7. Since there are three applications that have to be within the same appset, and they use almost the same dimensions, there are two options:
    1. Keep the 3 applications separate within the same appset, which implies 3 cubes in the BI7 backend that will ask each other for data.
    2. Unify the 3 applications in the same appset, and within the same application, which implies one cube in the BI7 backend with all the information inside it.
    What implications does each of these options have for performance and complexity of implementation? Which would be better?
    Best regards

    Hi,
    I would suggest you post the question in the EPM / BPC area, as this forum is about the BI part of BusinessObjects.
    ingo

  • GroupWise 8 on Server 2008 R2 SP1 Performance and Defrag Questions

    I am running GroupWise 8.0.2 on an HP DL360 G7 with 24 GB of RAM and dual Xeon X5650 processors under Server 2008 R2 SP1. The Post Office is using roughly 562 GB of an 819 GB disk located on an HP P2000 G3 SAS-connected enclosure comprising 12 x 146 GB (15k rpm) SAS disks in a RAID 10 configuration. There are typically never more than 700 users connected to the Post Office at any time. I am experiencing client slowness at regular intervals, roughly every three weeks. The solution tends to be to restart the server and give it a fresh start, which holds for a while.
    Concerns:
    1. When the server restarts I see minimal memory utilization, maybe 1.5 GB. Within half an hour the used memory climbs a bit to about 2 GB, but free memory is quickly pre-allocated to standby memory. Within about 2 hours the free memory is all consumed and I have about 20+ GB of standby memory. Running RAMMap indicates that the memory is being used by Metafile and Mapped File, which tells me that the Post Office's files are being indexed and cached into RAM. Then, after a couple of weeks go by, the amount of active RAM exceeds 8 GB (still mostly Metafile, not so much Mapped File), and standby RAM still consumes the remaining RAM between Metafile and Mapped File, leaving no free memory. Typically, once I reach about 8 GB of memory actively used by the system (mostly Metafile), it's time for performance to drop for the clients.
    Also, I'm seeing increases in disk queue length for the Post Office volume - typically below 2, rarely as high as 5.
    I suspect my best solution is to start a regular defrag process, as my volume is now 29% fragmented [yeah NTFS :( ].
    Question:
    I am concerned a defrag could take as much as, if not longer than, 10 hours, which is too long for the agents to be down. So I was wondering if anyone has used Diskeeper or alternative third-party defrag utilities that can defrag open files in the background, or if anyone has run defrag with the agents running to get the defraggable files, then shut down the agents for a second pass, which should be considerably shorter. Any advice, or other suggestions for the issue described, would be greatly appreciated.
    Thank You!

    In article <[email protected]>, Matt Karwowski wrote:
    > I am running GroupWise 8.0.2 on an HP DL360 G7 with 24 GB of RAM and Dual Xeon X5650
    > processors under Server 2008 R2 SP1. Post Office is using roughly 562 GB of an 819
    > GB Disk ... I am experiencing client slowness at regular intervals. Roughly every
    > three weeks. The solution tends to be to restart the server and give it a fresh start,
    A) Updating to the latest code may help, as this could be a memory-fragmentation-type issue or similar that has already been fixed.
    B) Perhaps even more RAM might help. How much space do the ofuser/*.db and ofmsg/*.db files take up? The more mail flows, the more of those DB files' content is held in memory. A few versions ago you tended to need as much RAM as the total size of the DB files, and while those days are gladly past, it is still a good number to watch.
    C) Explore Caching mode for at least some of the users, as that significantly reduces the load on the server.
    > I suspect my best solution is to start a regular defrag process as my volume is now
    > 29% fragmented [yeah NTFS :( ]
    Still way better than old FAT ;)
    This is one of the reasons why GroupWise runs better on Linux with either EXT3 or NSS
    volume types, and is why SLES is provided along with your GroupWise license.
    As for running defragmentation, even running it for just a few hours a week will gradually help, especially if you can fit in one big burst near the beginning. So if you can automate the process of shutting down the agents, running defrag for X hours, and then restarting the agents, you may clear this all up. Just having the agents down an hour a week might clear the memory usage issue for you.
    I would be very hesitant to run any defrag on open database files of any email system
    unless the defrag tool knew explicitly about that email system. But a smart defragger that
    can keep the *.db files in their own (fastest) section of the drive and the rest off to the
    side would go a long way to making the defragmentation process much more efficient.
    I haven't directly run a GroupWise system on any flavor of Windows since OS/2, so this is more a combination of all my platform knowledge, but I hope it gets you closer to a smoothly running system. And if the other GW-on-Windows admins can pipe in, all the better.
    Andy Konecny
    KonecnyConsulting.ca in Toronto
    Andy's Profile: http://forums.novell.com/member.php?userid=75037

  • Performance and Compatibility questions

    Hi everybody,
    I'm hoping no one minds me asking these basic questions.
    These past few years I've been using a Linksys WRT54GS (I'm pretty sure that's it), and recently it has been cutting out almost non-stop, almost every 15 minutes or less (which tends to be annoying). So I decided to start looking into new routers, and I thought that perhaps the AirPort Extreme might be a good alternative.
    My first question is: my family is mixed between Macs and PCs. I was wondering if anyone has had any experience using both systems on this router, and whether they have had any major problems because of it (cutting out, or not getting signals at all)? The other thing is, I'm usually in the basement, with the Linksys on the first floor, which comes to roughly 70 feet away.
    My next question is that the Macs in my house are first-generation Intel Macs, so I don't believe any of them have wireless-n support (I think you need an Intel Core 2 Duo processor, which ours are not). Will that cause any problem with the wireless links? I had read that wireless-n routers have to be backwards compatible, but I don't know if that's true or not, so would our Macs be able to connect?
    The final question I have is: will the AirPort Extreme work with a PS3? The PS3 can pick up the Linksys router (when the router isn't crashing, which is rare), but is it compatible with the AirPort as well?
    I've read some reviews and it seems to get pretty good ratings, but I figured I'd ask my specific questions before making a purchase.
    Thanks.

    My first question is: my family is mixed between Macs and PCs. I was wondering if anyone has had any experience using both systems on this router, and whether they have had any major problems because of it (cutting out, or not getting signals at all)?
    Of course everyone's situation is different, but I currently have no issues connecting both Macs and PCs to my 802.11n AirPort Extreme Base Station (AEBSn). I recently replaced a Linksys 802.11a/g wireless access point with the AEBSn. I left it in its default radio mode, which is "802.11n (b/g compatible)", and my Macs & PCs are a mix of 802.11n and g devices.
    The other thing is, I'm usually in the basement, with the Linksys on the first floor, which comes to roughly 70 feet away.
    The issue here is really wireless signal strength and the effects of Wi-Fi interference - specifically, building materials may make it difficult to get a clean signal through, and that will affect any manufacturer's router, not just the AirPorts. One test is to use a free utility like iStumbler to measure the signal strength of your current router from your spot in the basement. Any new router may perform just as poorly if this signal level is very low.
    My next question is that the Macs in my house are first-generation Intel Macs, so I don't believe any of them have wireless-n support (I think you need an Intel Core 2 Duo processor, which ours are not). Will that cause any problem with the wireless links?
    No, the AEBSn will support 802.11a/b/g/n clients.
    The final question I have is: will the AirPort Extreme work with a PS3?
    The PS3 has built-in 802.11g, so it should be compatible as well.

  • Streaming Video and Design Question

    Hi,
    I'm coding a screen that displays about 10 components that update with textual information at a pretty low rate (1 Hz). Another component on this screen has to display a video stream (I receive pixel data over a network connection; each byte contains a grey-scale value).
    I have looked into various Java2D-related ways to do this. I believe I want to take an active rendering approach using full-screen exclusive mode, with perhaps a separate thread for the video stream. A significant class for the video stream seems to be BufferedImage.
    I'm looking for some feedback on the design approach (active rendering in full-screen exclusive mode, with a separate thread to build and render the video stream), as well as any advice on how to build the displayable image.
    Thanks for any advice.

    The Curve 8330 handles some streaming formats... With the MobiTV app I'm told you can watch streaming video from various cable channels, including news and Discovery channels and such. You can also view some video clips from www.youtube.com - not the Flash clips, of course! As far as streaming audio goes, you can download Slacker Radio and FlyCast and have a semblance of streaming audio. I'm hoping for Winamp/RealAudio-type support of streaming media very soon, to listen to various internet radio stations that aren't included in the aforementioned radio apps. Side note: money rules everything, everywhere.

  • Performance and Server questions

    Can anyone tell me how the performance of Dev 6 really is over the internet? We would really like to have our application run over the web. Also, what would be the recommended OAS server requirements - memory etc. - to get acceptable speed? Has anyone had any problems with this? Has anyone successfully deployed their application with the 3-tier architecture? Are there any white papers that address this that I can look at?
    Thanks.

  • How to improve query performance at the report level and designer level

    How can I improve query performance at the report level and at the designer level?
    Please give me some detail on this.

    First, it's all based on the design of the database, the universe and the report.
    At the universe level, you have to check your contexts very well to get the optimal performance out of the universe, and also your joins; keeping your joins on key fields will give you the best performance.
    At the report level, try to make the reports as dynamic as you can (parameters and so on).
    And when you create a parameter, try to make it match the key fields in the database.
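    As a generic illustration of the join advice (the table and column names here are invented):

        -- good: join on the indexed key field
        SELECT o.order_id, c.customer_name
        FROM   orders o
               JOIN customers c ON c.customer_id = o.customer_id;

        -- slower: joining on a descriptive text column usually defeats the indexes
        SELECT o.order_id, c.customer_name
        FROM   orders o
               JOIN customers c ON UPPER(c.customer_name) = UPPER(o.customer_ref_text);

    The same applies to parameters: a prompt that filters on customer_id can use an index directly, while one that filters on free text forces a scan.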
    good luck
    Amr

  • Performance issue and functional question regarding updates on tables

    A person at my site wrote some code to update a custom field on the MARC table that was being copied from the MARA table. Here is what I would have expected to see as the code. Assume that both sets of code have a parameter called p_werks, which is the plant in question.
    data : commit_count type i.
    select matnr zfield from mara into (wa_marc-matnr, wa_marc-zfield).
      update marc set zfield = wa_marc-zfield
         where werks = p_werks and matnr = wa_marc-matnr.
      commit work and wait.
    endselect.
    I would have committed every 200 rows instead of every single row, but here's the actual code, and my question isn't about the commits but about something else. In this case an internal table was built with two elements - MATNR and WERKS - which could have been done above too, but that's not my question.
                DO.
                  " Lock the record that needs to be update with material creation date
                  CALL FUNCTION 'ENQUEUE_EMMARCS'
                    EXPORTING
                      mode_marc      = 'S'
                      mandt          = sy-mandt
                      matnr          = wa_marc-matnr
                      werks          = wa_marc-werks
                    EXCEPTIONS
                      foreign_lock   = 1
                      system_failure = 2
                      OTHERS         = 3.
                  IF sy-subrc <> 0.
                    " Wait, if the records not able to perform as lock
                    CALL FUNCTION 'RZL_SLEEP'.
                  ELSE.
                    EXIT.
                  ENDIF.
                ENDDO.
                " Update the record in the table MARC with material creation date
                UPDATE marc SET zzdate = wa_mara-zzdate
                           WHERE matnr = wa_mara-matnr AND
                                 werks = wa_marc-werks.    " IN s_werks.
                IF sy-subrc EQ 0.
                  " Save record in the database table MARC
                  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
                    EXPORTING
                      wait   = 'X'
                    IMPORTING
                      return = wa_return.
                  wa_log-matnr   = wa_marc-matnr.
                  wa_log-werks   = wa_marc-werks.
                  wa_log-type    = 'S'.
                  " text-010 - 'Material creation date has updated'.
                  wa_log-message = text-010.
                  wa_log-zzdate  = wa_mara-zzdate.
                  APPEND wa_log TO tb_log.
                  CLEAR: wa_return,wa_log.
                ELSE.
                  " Roll back the record(un save), if there is any issue occurs
                  CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'
                    IMPORTING
                      return = wa_return.
                  wa_log-matnr   = wa_marc-matnr.
                  wa_log-werks   = wa_marc-werks.
                  wa_log-type    = 'E'.
                  " 'Material creation date does not updated'.
                  wa_log-message = text-011.
                  wa_log-zzdate  = wa_mara-zzdate.
                  APPEND wa_log TO tb_log.
                  CLEAR: wa_return, wa_log.
                ENDIF.
                " Unlock the record from data base
                CALL FUNCTION 'DEQUEUE_EMMARCS'
                  EXPORTING
                    mode_marc = 'S'
                    mandt     = sy-mandt
                    matnr     = wa_marc-matnr
                    werks     = wa_marc-werks.
              ENDIF.
    Here's the question - why did this person enqueue and dequeue explicit locks like this? They claimed it was to prevent issues - what issues? Is there something special about updating tables that we don't know about? We've actually seen the system run out of these ENQUEUE locks.
    Before you all go off the deep end and ask why not just do the update, keep in mind that you don't want to update a million-plus rows and then do a commit either - that locks up the entire table!
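    To be clear, by "just do the update" I mean a single set-based statement, something like this (a plain SQL sketch only; ZZDATE is the custom field from the code above):

        UPDATE marc m
           SET m.zzdate = (SELECT a.zzdate FROM mara a WHERE a.matnr = m.matnr)
         WHERE m.werks = :p_werks;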

    The ENQUEUE lock ensures that a program run by another user will not update the data at the same time, preventing loss of database coherence. Without it, another user in a standard SAP transaction may already have read and locked the record; when that user saves, your modifications would be lost, or you could overwrite modifications made by another user in another LUW.
    You cannot use COMMIT WORK inside a SELECT - ENDSELECT, because COMMIT WORK closes each and every open database cursor, so your first idea would dump after the first update (which is why the internal table is mandatory).
    Go through some documentation like [Updates in the R/3 System (BC-CST-UP)|http://help.sap.com/printdocu/core/Print46c/en/data/pdf/BCCSTUP/BCCSTUP_PT.pdf]
    Regards

  • Method design question... and passing an object as a parameter to a web service

    I am new to web services... one design question.
    I am writing a web service to check whether a user is a valid user or not. The users are categorized as Member, Admin and Professional, and for each user type I have to hit a different data source to verify.
    I get the user type as a parameter. What is the best approach to define the method?
    Should I have one single method isValidUser that all the web service clients call, passing the user type, or should I define a method for each type, like isValidMember, isValidAdmin?
    One more thing: in the future the requirements for Professional may grow to include more required fields, in which case the parameter needs more attributes. But on the client side not much changes if I have a single isValidUser method - all they have to do is pass additional values.
    boolean isValidUser(String username, String usertype, String[] userAttributes) {
        if ("member".equals(usertype)) {
            return isValidMember(username, userAttributes);       // call member code
        } else if ("professional".equals(usertype)) {
            return isValidProfessional(username, userAttributes); // call professional code
        } else if ("admin".equals(usertype)) {
            return isValidAdmin(username, userAttributes);        // call admin code
        } else {
            throw new IllegalArgumentException("unknown user type: " + usertype);
        }
    }
    or
    boolean isValidMember(String username, String[] userAttributes) {
        // member-specific data source check goes here
        return true; // placeholder
    }
    One last question: can a parameter be passed as an object in a web service, like a USER object?

    First of all, here is my code
    CREATE OR REPLACE
    TYPE USERCONTEXT AS OBJECT
    user_login varchar2,
    user_id integer,
    CONSTRUCTOR FUNCTION USERCONTEXT (
    P_LOGIN IN INTEGER
    P_ID_ID IN INTEGER
    ) RETURN SELF AS RESULT
    Either your type won't compile, or this is not the real code.

  • SCA design question - PIX and SCA with dual logical SSL server.

    I have an SCA design question; please correct or verify my solution.
    1. connectivity.
    <Client with port 443>--<ISP>--<PIX>--<SCA>--<SERVER(two IP on single NIC and each IP associates to WEB server) with port 81>
    * client will access WEB server with x.x.1.100 or x.x.1.101
    2. physical IP address
    - PIX outside=x.x.1.1
    - PIX inside=x.y.1.1
    - SCA device=x.y.1.2
    - SERVER NIC1=x.y.1.10
    - SERVER NIC2=x.y.1.11
    3. PIX NAT
    - static#1=x.x.1.100 map to x.y.1.10
    - static#2=x.x.1.101 map to x.y.1.11
    4. SCA configuration.
    mode one-port
    no mode one-port
    ip address x.y.1.2 netmask 255.255.255.0
    ip route 0.0.0.0 0.0.0.0 x.y.1.1
    ssl
    server SERVER1
    ip address x.y.1.10
    localport 443
    remoteport 81
    server SERVER2
    ip address x.y.1.11
    localport 443
    remoteport 81
    Thanks,

    The document http://www.cisco.com/univercd/cc/td/doc/product/webscale/css/scacfggd/ has a link to a page which describes how to use the configuration manager command line interface to configure the Secure Content Accelerator. Several configuration examples are also included in this page.

  • What version of OS X would give an iMac 24'' 2.3 GHz Intel Core 2 Duo, with 4 GB of DDR2 SDRAM, the best performance? To do illustration and design jobs.

    What version of OS X would give an iMac 24'' 2.3 GHz Intel Core 2 Duo, with 4 GB of DDR2 SDRAM, the best performance, for illustration and design jobs?
    Would the more recent OS versions increase the overall performance? Is it worth it?

    Hello!
    Snow Leopard is what I am currently using, but I guess it does need some cleanup, and maybe a clean install to see if it gets faster.
    I believe 4 GB is the limit for this model, although I've seen some with 6 GB…
    Thank you for the help

  • Does Final Cut Express 4 work with the Mac mini? (and other questions...)

    I have a few questions I need help answering... I would appreciate opinions on the following:
    I am about to purchase a new Mac. I already have a Mac mini PPC, but it is too slow...
    Here is what I am thinking.
    A Mac mini, 2 GB of memory, 160 GB hard drive. Will this new Mac mini work with Final Cut Express 4?
    Or should I go with the more powerful iMac?
    I like to make movies on my Mac, but I hate iMovie 08, so I constantly use iMovie HD 06. Should I go to Final Cut Express 4 or just stick with iMovie HD 06?

    There is no doubt that FCE 4 will run on an Intel mini - the GMA graphics processor is compatible. There is also no doubt that an iMac would give you somewhat better overall performance, though it's worth noting that, as complex as FCE is, it is actually a little less demanding on hardware than iMovie has traditionally been. So the difference between a mini with 2 GB RAM and a similarly equipped iMac will not be all that great in FCE, though it will become a little more noticeable if you continue to use iMovie 6.
    The main reason for the performance difference between iMovie and FCE is that, in order to provide a reasonably sophisticated video editing package that will run on very basic hardware, Apple relied heavily on caching techniques: while FCE basically spools video and relies on CPU power, iMovie constantly moves data to and from the hard drive. The mini's hard drive is a (relatively) slow device in comparison to the iMac's - thus the iMac runs iMovie rather more seamlessly. In FCE the difference is, as said, rather less clear unless you were to go for the top-end model.
    If your budget can stretch to an iMac, it makes sense to go that route simply because the extra power and overall performance (and the better graphics processor) will give you greater flexibility down the road. On the other hand, if funds are tight and you need to spend as little as reasonably possible, a mini will do fine, and it is sufficiently inexpensive that in a couple of years it could be sold off and replaced without severe financial implications.
    In terms of the choice between FCE and iMovie, beware that if you haven't used FCE before, the transition is not easy, and the learning curve is pretty steep. That said, it has far greater power and flexibility in terms of what you can achieve than has ever been possible with iMovie.
    For casual and home use, though, iMovie (version 6 at least) has plenty of power and tools for the majority of projects. iMovie 8 is, in many ways, deeply flawed for those who want that sort of power and workflow, being designed for quick and easy creation of movies rather than creative and considered productions. It has its uses, but if you have experience with iMovie already, v8 is really not intended for you at all.
    All that said, I've used iMovie 6 on systems with a lot less power than the current Intel minis and have found it stable and workable, so I have no doubt it would work well on any current Mac.

  • EvDRE: Design Questions

    A couple of "best practices" questions regarding for a simple EVDRE report.
    1. Whis is a better design for performance:
    A. Five single DREs on one worksheet, or
    B. One DRE on one worksheet that uses five pipes (|)?
    2. I have a 1x1 report where column IDs need to be dynamically updated and rows are static. Which is better:
    A. Use a dynamic EVDRE with column expansion for column IDs and data, or
    B. Use an EVEXP for the column IDs and a static EVDRE for the data.

    I'll answer your questions using your numbering after covering the basic difference between EVDRE and the legacy queries.
    EVDRE is superior in performance and scalability to the legacy query functions (EVGET, EVGTS, EVSEN, EVSND) for a number of reasons:
    - EVDRE executes as a single query, while the legacy functions execute as individual queries. So if you have a 10x10 data area, you'll have 1 query to populate the cells with EVDRE and 100 queries to populate them with the other functions.
    - EVDRE can query either the relational database or the OLAP cube, depending on which is more efficient for the given request. The other functions can only access the OLAP database.
    - EVDRE is the only function that will be enhanced in future releases; the others remain only for backwards support. This includes all supporting functions such as EVEXP, EVNXP, etc.
    - The EVDRE data range only contains values that get populated by the query, rather than actual functions, so you have more flexibility if you need to put XLS functions within the range, and you cannot run into issues if someone types directly into a data cell.
    1. It depends on the query. If you are querying various specific combinations of data, you're generally better off splitting them into separate EVDREs. For example, if you have two blocks of data, Actuals/2008 and Budget/2007, you're better off making these separate because of the query that would be formed. If you do it as a single EVDRE, your query would select the categories Budget & Actual and the years 2007 & 2008; this means you'd get Budget 2007 & 2008 and Actual 2007 & 2008, and after all that data is returned, the values for Actuals/2007 & Budget/2008 would be discarded. By breaking them into two EVDREs (which would execute in series) you would only query the data you actually want. A separate example: you want Actuals 2007 & 2008 and Budget 2008. You would want two EVDREs in one of two (equally performing) setups: Actuals/2007 & 2008 plus Budget/2008, or Actuals & Budget/2008 plus Actuals/2007. I would probably use the first just for clarity.
    2. Use the EVDRE to perform the column expansion. It is poor practice to mix EVDRE and the legacy functions; you can get both performance issues and execution-ordering issues if you mix them.

  • SOA real-time design question

    Hi All,
    We are currently working with SOA Suite 11.1.1.4. I have a SOA application requirement to receive a real-time feed for six data tables from an external third party. The implementation consists of five one-way operations in the WSDL to populate the six database tables.
    I have a design question. The organization plans to use this data across various departments, which requires replicating or supplying the data to other internal databases.
    In my understanding there are two options:
    1) Within the SOA application, fork the data hitting the web service out to the different databases.
    My concern with this approach is: what if departments keep coming with such requests, and I keep forking and supplying multiple internal databases with the same data? This feed has to be real-time; too much forking will impact performance and create unwanted dependencies on this critical data-supply link.
    2) I could tell other internal projects to get the data from the populated main database.
    My concern here is that, firstly, the data is pushed into this database flat, without any constraints, and it is difficult to query for specific data - this design was purposely put in place to facilitate real-time performance. Also, asking every internal project to get data from the main database will affect its performance.
    Please suggest which approach I should take (advantages/disadvantages), and apart from the above two solutions, whether there is any other recommended solution to mitigate the risks. This link between our organization and the external party is somewhat like a lifeline for BAU, so we certainly don't want to create more dependencies and overhead.
    Thanks

    I tried implementing the JMS publisher/subscriber pattern before; unfortunately the performance was not as good as writing directly via the DB adapter. I feel the organization's SOA infrastructure is not set up to cope with the number of messages coming through from the external third party. Our current setup consists of three WebLogic servers (Admin, SOA, BAM), all running on only 8 GB of physical RAM on one machine. Is there an Oracle guideline for setting up infrastructure for a SOA application receiving roughly 600,000 messages a day? I am using SOA 11.1.1.4. The JMS publisher/subscriber pattern just does not cope, and I see a significant performance lag after a few hours of running. The JMS server used was WebLogic JMS.
    Thanks
    Edited by: user5108636 on Jun 13, 2011 4:19 PM
    Edited by: user5108636 on Jun 13, 2011 7:03 PM

  • SQL Performance and Security

    Help needed here, please. I am new to this concept and I am working on a tutorial based on SQL performance and security. I have worked my head around this, but now I am stuck.
    Here are the questions:
    1. Analyse possible performance problems, and suggest solutions, for each of the following transactions against the database:
    a) A manager of a project needs to inspect total planned and actual hours spent on a project, broken down by activity.
    e.g.
    Project: xxxxxxxxxxxxxx
    Activity Code    Planned    Actual (to date)
    1                20         25
    2                30         30
    3                40         24
    Total            300        200
    Note that actual time spent on an activity must be calculated from the WORK UNIT table.
    b) On several lists (e.g. list or combo boxes) in the on-line system it is necessary to identify completed, current, or future projects.
    2. Security: Justify and implement solutions at the server that meet the following security requirements:
    (i)Only members of the Corporate Strategy Department (which is an organisation unit) should be able to enter, update and delete data in the project table. All users should be able to read this information.
    (ii)Employees should only be able to read information from the project table (excluding the budget) for projects they are assigned to.
    (iii)Only the manager of a project should be able to update (insert, update, delete) any non-key information in the project table relating to that project.
    Here are the project tables:
    set echo on
    -- Changes
    -- 4.10.00
    -- manager of employee on a project included in the employee_on_project table
    -- activity table now has a compound key, based on ID dependence between
    -- project and activity
    drop table org_unit cascade constraints;
    drop table project cascade constraints;
    drop table employee cascade constraints;
    drop table employee_on_project cascade constraints;
    drop table employee_on_activity cascade constraints;
    drop table activity cascade constraints;
    drop table activity_order cascade constraints;
    drop table work_unit cascade constraints;
    -- org_unit
    -- type - for example in lmu might be FACULTY, or SCHOOL
    CREATE TABLE org_unit (
      ou_id            NUMBER(4)    CONSTRAINT ou_pk PRIMARY KEY,
      ou_name          VARCHAR2(40) CONSTRAINT ou_name_uq UNIQUE
                                    CONSTRAINT ou_name_nn NOT NULL,
      ou_type          VARCHAR2(30) CONSTRAINT ou_type_nn NOT NULL,
      ou_parent_org_id NUMBER(4)    CONSTRAINT ou_parent_org_unit_fk
                                    REFERENCES org_unit
    );
    -- project
    CREATE TABLE project (
      proj_id                NUMBER(5)    CONSTRAINT project_pk PRIMARY KEY,
      proj_name              VARCHAR2(40) CONSTRAINT proj_name_uq UNIQUE
                                          CONSTRAINT proj_name_nn NOT NULL,
      proj_budget            NUMBER(8,2)  CONSTRAINT proj_budget_nn NOT NULL,
      proj_ou_id             NUMBER(4)    CONSTRAINT proj_ou_fk REFERENCES org_unit,
      proj_planned_start_dt  DATE,
      proj_planned_finish_dt DATE,
      proj_actual_start_dt   DATE
    );
    -- employee
    CREATE TABLE employee (
      emp_id       NUMBER(6)    CONSTRAINT emp_pk PRIMARY KEY,
      emp_name     VARCHAR2(40) CONSTRAINT emp_name_nn NOT NULL,
      emp_hiredate DATE         CONSTRAINT emp_hiredate_nn NOT NULL,
      ou_id        NUMBER(4)    CONSTRAINT emp_ou_fk REFERENCES org_unit
    );
    -- activity
    -- note each activity is associated with a project
    -- act_type is the type of the activity, for example ANALYSIS, DESIGN, BUILD,
    -- USER ACCEPTANCE TESTING ...
    -- each activity has a people budget, in other words an amount to spend on
    -- wages
    CREATE TABLE activity (
      act_id               NUMBER(6),
      act_proj_id          NUMBER(5)    CONSTRAINT act_proj_fk REFERENCES project
                                        CONSTRAINT act_proj_id_nn NOT NULL,
      act_name             VARCHAR2(40) CONSTRAINT act_name_nn NOT NULL,
      act_type             VARCHAR2(30) CONSTRAINT act_type_nn NOT NULL,
      act_planned_start_dt DATE,
      act_actual_start_dt  DATE,
      act_planned_end_dt   DATE,
      act_actual_end_dt    DATE,
      act_planned_hours    NUMBER(6)    CONSTRAINT act_planned_hours_nn NOT NULL,
      act_people_budget    NUMBER(8,2)  CONSTRAINT act_people_budget_nn NOT NULL,
      CONSTRAINT act_pk PRIMARY KEY (act_id, act_proj_id)
    );
    -- employee_on_project
    -- when an employee is assigned to a project, an hourly rate is set
    -- remember that the person's manager depends on the project they are on,
    -- the implication being that the manager needs to be assigned to the project
    -- before the 'managed'
    CREATE TABLE employee_on_project (
      ep_emp_id      NUMBER(6)   CONSTRAINT ep_emp_fk REFERENCES employee,
      ep_proj_id     NUMBER(5)   CONSTRAINT ep_proj_fk REFERENCES project,
      ep_hourly_rate NUMBER(5,2) CONSTRAINT ep_hourly_rate_nn NOT NULL,
      ep_mgr_emp_id  NUMBER(6),
      CONSTRAINT ep_pk PRIMARY KEY (ep_emp_id, ep_proj_id),
      CONSTRAINT ep_mgr_fk FOREIGN KEY (ep_mgr_emp_id, ep_proj_id)
        REFERENCES employee_on_project
    );
    -- employee_on_activity
    CREATE TABLE employee_on_activity (
      ea_emp_id        NUMBER(6),
      ea_proj_id       NUMBER(5),
      ea_act_id        NUMBER(6),
      ea_planned_hours NUMBER(3) CONSTRAINT ea_planned_hours_nn NOT NULL,
      CONSTRAINT ea_pk PRIMARY KEY (ea_emp_id, ea_proj_id, ea_act_id),
      CONSTRAINT ea_act_fk FOREIGN KEY (ea_act_id, ea_proj_id) REFERENCES activity,
      CONSTRAINT ea_ep_fk FOREIGN KEY (ea_emp_id, ea_proj_id)
        REFERENCES employee_on_project
    );
    -- activity_order
    -- only need a prior activity. If activity A is followed by activity B then
    -- B is the prior activity of A
    CREATE TABLE activity_order (
      ao_act_id       NUMBER(6),
      ao_proj_id      NUMBER(5),
      ao_prior_act_id NUMBER(6),
      CONSTRAINT ao_pk PRIMARY KEY (ao_act_id, ao_prior_act_id, ao_proj_id),
      CONSTRAINT ao_act_fk FOREIGN KEY (ao_act_id, ao_proj_id)
        REFERENCES activity (act_id, act_proj_id),
      CONSTRAINT ao_prior_act_fk FOREIGN KEY (ao_prior_act_id, ao_proj_id)
        REFERENCES activity (act_id, act_proj_id)
    );
    -- work_unit
    -- remember that DATE includes time
    CREATE TABLE work_unit (
      wu_emp_id   NUMBER(5),
      wu_act_id   NUMBER(6),
      wu_proj_id  NUMBER(5),
      wu_start_dt DATE CONSTRAINT wu_start_dt_nn NOT NULL,
      wu_end_dt   DATE CONSTRAINT wu_end_dt_nn NOT NULL,
      CONSTRAINT wu_pk PRIMARY KEY (wu_emp_id, wu_proj_id, wu_act_id, wu_start_dt),
      CONSTRAINT wu_ea_fk FOREIGN KEY (wu_emp_id, wu_proj_id, wu_act_id)
        REFERENCES employee_on_activity (ea_emp_id, ea_proj_id, ea_act_id)
    );
    -- enter data
    start ouins
    start empins
    start projins
    start actins
    start aoins
    start epins
    start eains
    start wuins
    start pmselect
    I have the scripts containing ouins and the rest. Email me on [email protected] if you want to have a look at the tables.
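    For 1(a) I am thinking along these lines - just a sketch, assuming the actual time of a work unit is simply its elapsed span converted from days to hours:

        SELECT a.act_id,
               a.act_planned_hours,
               NVL(SUM((w.wu_end_dt - w.wu_start_dt) * 24), 0) AS actual_hours
        FROM   activity a
               LEFT JOIN work_unit w
                      ON w.wu_act_id  = a.act_id
                     AND w.wu_proj_id = a.act_proj_id
        WHERE  a.act_proj_id = :proj_id
        GROUP  BY a.act_id, a.act_planned_hours;

    Is something like this on the right track, and where would it run into performance problems?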

    The answer to your 2nd question is easy. Create database roles for the various groups of people who are allowed to access the tables or perform various DML actions.
    Then assign the various users to these roles. The users will be restricted to whatever the roles are restricted to.
    Look up roles if you are not familiar with them.
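    For example, here is a rough sketch for (i), plus the view trick that covers the "excluding the budget" part of (ii). The role, view and user names are made up, and the "only projects they are assigned to" part of (ii) would additionally need a join against employee_on_project keyed to the logged-in user:

        -- (i) only Corporate Strategy may change the project table
        CREATE ROLE corp_strategy;
        GRANT SELECT, INSERT, UPDATE, DELETE ON project TO corp_strategy;
        GRANT corp_strategy TO cs_user1;      -- made-up member of that department
        GRANT SELECT ON project TO PUBLIC;    -- everyone may read

        -- (ii, partially) hide the budget behind a view and grant that instead
        CREATE VIEW project_public AS
          SELECT proj_id, proj_name, proj_ou_id,
                 proj_planned_start_dt, proj_planned_finish_dt, proj_actual_start_dt
          FROM   project;
        GRANT SELECT ON project_public TO PUBLIC;

    Note that the two PUBLIC grants conflict as written: if ordinary employees must not see the budget, drop the GRANT SELECT ON project TO PUBLIC and leave them only the view.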
