Best PHP approach

Hi,
We use the dbxml 2.4.13 PHP API (with php_db4). The app runs intensive XQuery (read only - no xml_modify) against documents and performs whole-document updates: getDocument() => setContent() => updateDocument(). In addition, we work on 2 containers with 1 manager.
The first approach we took works well, without using a DbEnv or transactions. I then tried enabling the environment and transactions, and at first sight it was perfect: I could see log files and use the XQuery trace function and transactions (although transactions are not required for us). But regularly the openContainer function hangs, and restarting Apache does not clear the hang.
If I call db_deadlock -v, it reports "125 lockers" and "rejected 0 locks". After that I still can't openContainer from PHP, and db_verify hangs as well. The only solution is to run db_recover, which makes everything work again.
During this testing phase I was using only one web browser, so I don't think it's a concurrent-write deadlock problem.
My questions are:
1. What's wrong with my DbEnv setup?
2. Can I have a DbEnv without transactions (just log files), as my main concern is the reliability of the stored data?
3. Can I rely on just using the PHP API without logs and transactions (no concurrent updateDocument problems)?
Thanks for your replies,
Willy
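
For reference, a minimal sketch of the whole-document update under a transactional environment. Method and constant names follow the BDB XML 2.4 API as mirrored by the php_db4/dbxml bindings, so verify them against your build; the path, container, document name and $newXml are placeholders:

<?php
$env = new Db4Env();
$env->open("/path/to/dbhome",
           DB_CREATE | DB_INIT_LOCK | DB_INIT_LOG |
           DB_INIT_MPOOL | DB_INIT_TXN, 0);
$mgr = new XmlManager($env, 0);
$con = $mgr->openContainer("docs.dbxml");   // the call that hangs for you
$txn = $mgr->createTransaction();
$doc = $con->getDocument($txn, "mydoc.xml");
$doc->setContent($newXml);                  // $newXml: the replacement XML string
$uc  = $mgr->createUpdateContext();
$con->updateDocument($txn, $doc, $uc);
$txn->commit();   // a request that dies before commit/abort leaves stale lockers
                  // behind - the kind db_deadlock counts and only db_recover
                  // clears, which matches the hang described above
?>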

If you are on Windows:
XAMPP and WAMP are excellent choices to get it set up quickly. I personally prefer WAMP.
If you are on Ubuntu:
Use the Synaptic Package Manager to install packages for Apache, MySQL and PHP.

Similar Messages

  • Best possible approach to add fields in an FPM WDA application

    Hi guys,
    Our SRM 7 has FPM-based WDA views. Extending the customer include by appending fields alone will not work for us, as some fields are dropdowns that perform actions and display/hide other custom fields.
    I can think of the enhancement framework to enhance the views. Since I am new to WDA and FPM, I am thinking there may be an easier way to do this.
    What is the best possible approach to fulfil this requirement?
    Regards
    Ali

    Thanks to responses from Chaitanya and Thomas.
    I managed to create additional fields(with action assigned) in the standard shopping cart screens.
    I extended the SRM header structure, added the custom fields of this structure to the header node of the context, and by binding the property of these fields to the attributes, I don't have to code the fields' enable/disable when the transaction is called in edit/create/display mode.
    However, I am still struggling with the dropdowns (with actions). They remain enabled, even when the data has been saved or the transaction is in display mode. Hints?
    Regards, Ali

  • What is best performing approach to report building in my case?

    Hi all,
    I want to know what the best performing approach is in the case of an overloaded system,
    understood as a large number of concurrent operations.
    Each operation is a query that, in most cases, returns a large amount of data.
    I am interested in the approach that does not create bottlenecks or slow down and block the system for long periods.
    The alternatives that I would like more information about are:
    1) reports built with JDBC (JNDI), specifying "java:jdbc/xxxxdatasource"
    (taken from the oracle-ds.xml jndi-name tag) as "Connection name (optional)",
    with the query written into the .rpt file and run by Crystal Reports, which I think makes a direct connection to the DB,
    integrated into Java with the Java Reporting Component.
    This approach also has thread limits, depending on the version of the report engine.
    2) reports built with "Field definition only", with the query written and run in my application, which calls the report and passes only the ResultSet to be displayed
    (reportClientDoc.getDatabaseController().setDataSource(resultSet, tableName , tableName);)
    My concern with this approach is that it seems to require loading all results into memory
    and generating the report in one big step.
    Is there a way to avoid this, somehow paging through the report data?
    I've also read that Crystal Reports can work with any data provider that implements ResultSet.
    Is this true? If so, could I create my own custom ResultSet implementation that would let me
    page through my results without loading everything into memory at once?
    If possible, please point me to the documentation for this approach.
    I haven't been able to find any examples.
    If there is a better approach that I haven't mentioned, please let me know.
    Thanks in advance

    The first option is the best one for performance.  The only time you should use result sets is when you need to do runtime manipulation of the data in your application and it is not achievable in a stored procedure.
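
    A minimal sketch of option 2 with the JRC, assuming the standard ReportClientDocument API (the report name, JDBC URL, query and table name are placeholders). It also shows where the memory concern above comes from:

    import com.crystaldecisions.reports.sdk.ReportClientDocument;
    import java.sql.*;

    public class PushResultSet {
        public static void main(String[] args) throws Exception {
            ReportClientDocument doc = new ReportClientDocument();
            doc.open("report.rpt", 0);   // a "Field definition only" report
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//host:1521/db", "user", "pw");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT * FROM orders")) {
                // The JRC copies the ResultSet rows into its in-memory rowset here,
                // which is exactly the "one big step" concern raised above.
                doc.getDatabaseController().setDataSource(rs, "orders", "orders");
                // ... view or export the report ...
            }
            doc.close();
        }
    }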

  • Best practice approach for seperating Database and SAP servers

    Hi,
    I am looking for a best-practice approach/strategy for setting up a distributed SAP landscape, i.e. separating the database and SAP servers. If anyone has some strategies to share.
    Thanks very much

    The easiest way I can imagine:
    Install a dialog instance on a new server and make sure it can connect nicely to the database. Then shut down the CI on the database server, copy the profiles (and adapt them) and start your CI on the new server. If that doesn't work the first time, you can always restart the CI on the database server again.
    Markus

  • Which would be the best NAC approach with thin clients up today?

    Hello guys,
    Which would be the best NAC approach using CAM/CAS infrastructure for thin clients? I guess the NAC agent is still not supported on thin clients/virtual desktops, right?
    Would MAC address authentication be the best option?
    Regards,
    Emilio

  • Best as3 approach for this website...

    Hi everyone!
    This post will be a little bit long, and I thank you in advance for your patience and help. Sorry for any (or all) English errors; I'm Brazilian.
    Well, I want to rebuild a website I have, but now using as many external .swf and .as files as I can. I also need to change the way it works.
    I have problems with loaders but I'm decided to dive into the subject and put it into practice here, for the first time.
    The website will be really simple, and here I have a scheme I just made to help you understand what my doubts are.
    How is this going to work?
    - When the user clicks a button, the current section will slide out (maybe unload?) while the new section slides in. This is the basic functionality.
    - The website will have a home page when the user clicks the company logo.
    - In the scheme you see all 4 sections but only one will be visible. The others will be hidden by a  mask.
    To start, I guess the best thing I can do is build each section in a separate .fla file, to have 4 external .swfs to load.
    Then I thought about loading all 4 sections inside a sprite container. Then, "onSectionClick", slide the sprite to the right X position to show the corresponding section.
    Now, what's the best approach to achieve that? The obvious part is that each section should load only when the user clicks a button. Should each section have its own loader? When is the user going to see the "loading section" message or progress bar? Should I slide the section in onLoadingComplete, or should I slide it in while showing the message/progress bar? Should I unload the current section when it slides out?
    I also thought about working with two sprites: one to load a section and slide in, and another to load the next section. Then, onSectionClick, the current sprite slides out while the other slides in, and in the background the first sprite unloads its content, changes its X position, and gets ready to load another section when requested.
    I think that's all.
    I have this working, but it's terrible because all sections load at once and there's no loading progress or anything.
    And thanks again!
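
    A minimal AS3 sketch of the one-loader-per-section idea above - "section1.swf", the parking position and the tween call are placeholders:

    import flash.display.Loader;
    import flash.events.Event;
    import flash.net.URLRequest;

    var sectionLoader:Loader = new Loader();
    sectionLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onSectionLoaded);
    sectionLoader.load(new URLRequest("section1.swf"));   // load only on button click

    function onSectionLoaded(e:Event):void {
        sectionLoader.x = stage.stageWidth;   // park the new section off-stage right
        addChild(sectionLoader);
        // tween sectionLoader.x to 0 here; once the outgoing section has slid
        // out, free its memory with that loader's unloadAndStop()
    }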

    thanks kglad!
    So every section will become a distinct part of the site.
    I was really thinking about making every section an external .swf instead of exporting movieClips from stage (linkage option). Which one is better?
    As I said, Andrei helped me with the loading process. Actually he's still helping me but with what he already taught me I think I'll be able to do the loading thing.
    Anyway, to make my life easier, imagine that all sections will tween onto the stage from its right side... to do that I'll have an onLoad method that will run when the loading process has finished. This method will call two other methods, slideOut and slideIn.
    slideIn will bring the loaded .swf on stage (from the right side), while slideOut will push the current .swf off stage (to the left side), unload it, and finally position it at the other side of the stage (right) to receive the next section when requested. Is that right?
    Is this the best way to do this?
    I think this is so confusing that not even you will understand what I'm trying to say.

  • Best Possible Approach : Service Broker or replication?

    I have a business scenario. I have an application whose DB resides on SQL Server A. My application resides on SQL Server B. Both SQL Servers are on the same network. Server A has a table that is constantly updated via a web application, and we want that
    table on Server B. At present it is updated nightly from Server A to Server B.
    We want the changes replicated to Server B in near real time; a 30-minute delay could work.
    Now, considering this business case, what would be my best options:
    1. Implement CDC on Server A for the table of interest (please note the CDC would be for 1 table only; I'm not sure if this is possible). Bring the table from Server A once to Server B using a linked server. Then use the CDC output from Server A to merge and update
    the table on Server B using an SP, and schedule this SP every 30 minutes.
    OR
    2. Use a CDC and Service Broker combination.
    OR
    3. use Replication.
    OR  ...???
    Any help will greatly be appreciated.

    Snapshot replication is an OK solution here.  You would need to set the snapshot period to 30 minutes or less. If you want changes populated immediately, use transactional replication. Service Broker is the best approach, since you wouldn't
    have to maintain the infrastructure associated with establishing a replication facility, but using Service Broker will require some expertise. Also take a look at the SnipeDB framework (SnipeDB.com). It would not replicate the
    table, but would allow any subscriber application to get an update on any change in your data.
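
    On the doubt in option 1: CDC is enabled per table, so tracking just that one table on Server A is possible. A minimal sketch, assuming an edition that supports CDC and placeholder names (SourceDb, dbo.MyTable):

    USE SourceDb;
    GO
    EXEC sys.sp_cdc_enable_db;
    GO
    EXEC sys.sp_cdc_enable_table
         @source_schema = N'dbo',
         @source_name   = N'MyTable',   -- only this one table is tracked
         @role_name     = NULL;         -- no gating role
    GO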

  • What is the best MVC approach for designing Swing application

    Hi...
    I am designing a client/server application. I am using Java Swing for the front-end development. We decided on using an MVC approach for the front-end design. I found that there is more than one approach to designing with MVC.
    Which is the best way to model: creating your model taking the view into consideration, or creating the model taking the tables into consideration?
    Can anybody help me out with this? It is urgent. Thanks
    sai

    I'm not sure what you are asking, so I'll just ramble a bit and hope it helps.
    Create your model taking the view and the data into consideration. :-)
    Design a class to hold instances of the rows in your database that correspond (more or less) to the structure of the data. Add any needed methods to that data object to support the data that you might need for the view.
    For example, suppose your database stores a name in two columns (firstname and lastname). You pull that data from the database into a collection of Person objects into your model. Your Person class has two fields (and set/get methods) to hold that data. Now, in your view, you want to display the name as one string, so add a method to the Person object to get the full name (just by concatenating the two data elements).
    You have to take everything into consideration. How you are going to view the data as well as how it might be structured.
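
    A minimal sketch of the Person object described above, with the view-support method added:

    // Model object: mirrors the two database columns, plus one method for the view.
    public class Person {
        private final String firstName;
        private final String lastName;

        public Person(String firstName, String lastName) {
            this.firstName = firstName;
            this.lastName = lastName;
        }

        public String getFirstName() { return firstName; }
        public String getLastName()  { return lastName; }

        // View support: the name as the single string the UI displays.
        public String getFullName()  { return firstName + " " + lastName; }
    }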

  • Best Query Approach

    I have a scenario to display data from 2 tables combined. The first table, named 'DayTable', consists of daily plan and actual. The second table, named 'MonthTable', consists of monthly plan and actual. I need to display the last 6 months' data and the current month's daily
    data.
    My output is like
    For this I wrote the query in 2 ways: one using temp tables and another without a temp table. Please suggest the best approach. Details are below.
    Approach 1: Using Temp Table
    Declare @startdate date = CONVERT(DATE, DATEADD(dd, -DAY(DATEADD(MONTH, 0, GETDATE())) + 1, DATEADD(MONTH, 0, GETDATE())))
    Declare @endDate date = DATEADD(DAY, -DAY(DATEADD(MONTH, 1, GETDATE())), DATEADD(MONTH, 1, GETDATE()))
    CREATE TABLE #TEMP
    (
    PlanDate NVARCHAR(100),
    [PastTrend - Plan] INT,
    [PastTrend - Actual] INT,
    [Current - Plan] INT,
    [Current - Actual] INT
    )
    ;With cte
    as
    Select @startdate sDate
    Union All
    Select DATEADD(day,1,sDate) From cte where DATEADD(day,1,sDate) <= @endDate
    INSERT INTO #TEMP
    SELECT
    REPLACE(CONVERT(CHAR(6), A.sDate, 106),' ',' - ') PlanDate
    ,NULL AS [PastTrend - Plan]
    ,NULL AS [PastTrend - Actual]
    ,SUM(B.PlanQuantity) AS [Current - Plan]
    ,SUM(B.Actual) AS [Current - Actual]
    FROM cte A
    LEFT OUTER JOIN DayTable B
    ON A.sDate = CONVERT(DATE,B.PlanDate)
    GROUP BY A.sDate
    SELECT *
    FROM
    (
    SELECT
    CONVERT(CHAR(3), datename(month,PlanMonth)) + ' ' + RIGHT(CONVERT(VARCHAR(4), YEAR(PlanMonth)), 2) AS PlanDate
    ,SUM(PlanQuantity) AS [PastTrend - Plan]
    ,SUM(Actual) AS [PastTrend - Actual]
    ,NULL AS [Current - Plan]
    ,NULL AS [Current - Actual]
    FROM
    MonthTable
    WHERE CONVERT(DATE, PlanMonth) >= CONVERT(DATE, DATEADD(dd, -DAY(DATEADD(MONTH, 0, GETDATE())) + 1, DATEADD(MONTH, -6, GETDATE())))
    group by PlanMonth
    UNION ALL
    SELECT
    PlanDate
    ,[PastTrend - Plan]
    ,[PastTrend - Actual]
    ,[Current - Plan]
    ,[Current - Actual]
    FROM
    #TEMP
    ) T1
    DROP TABLE #TEMP
    When I use the actual execution plan, Query Cost (relative to the batch): 90%
    Approach 2: Without temp table
    Declare @startdate date = CONVERT(DATE, DATEADD(dd, -DAY(DATEADD(MONTH, 0, GETDATE())) + 1, DATEADD(MONTH, 0, GETDATE())))
    Declare @endDate date = DATEADD(DAY, -DAY(DATEADD(MONTH, 1, GETDATE())), DATEADD(MONTH, 1, GETDATE()))
    ;With cte
    as
    Select @startdate sDate
    Union All
    Select DATEADD(day,1,sDate) From cte where DATEADD(day,1,sDate) <= @endDate
    SELECT
    T1.PlanDate
    ,T1.[PastTrend - Plan]
    ,T1.[PastTrend - Actual]
    ,T1.[Current - Plan]
    ,T1.[Current - Actual]
    FROM
    (
    SELECT
    A.sDate AS OriginalDate
    ,REPLACE(CONVERT(CHAR(6), A.sDate, 106),' ',' - ') PlanDate
    ,NULL AS [PastTrend - Plan]
    ,NULL AS [PastTrend - Actual]
    ,SUM(B.PlanQuantity) AS [Current - Plan]
    ,SUM(B.Actual) AS [Current - Actual]
    FROM cte A
    LEFT OUTER JOIN DayTable B
    ON A.sDate = CONVERT(DATE,B.PlanDate)
    GROUP BY A.sDate
    UNION ALL
    SELECT
    PlanMonth AS OriginalDate
    ,CONVERT(CHAR(3), datename(month,PlanMonth)) + ' ' + RIGHT(CONVERT(VARCHAR(4), YEAR(PlanMonth)), 2) AS PlanDate
    ,SUM(PlanQuantity) AS [PastTrend - Plan]
    ,SUM(Actual) AS [PastTrend - Actual]
    ,NULL AS [Current - Plan]
    ,NULL AS [Current - Actual]
    FROM
    MonthTable
    WHERE CONVERT(DATE, PlanMonth) >= CONVERT(DATE, DATEADD(dd, -DAY(DATEADD(MONTH, 0, GETDATE())) + 1, DATEADD(MONTH, -6, GETDATE())))
    group by PlanMonth
    ) T1
    ORDER BY T1.OriginalDate
    Here, Query Cost (relative to the batch): 100%
    So can you suggest the best one? Actually, I wrote the second method to avoid temp tables: if anything in the query fails after the temp table is created, it never gets dropped. To avoid such problems, and for simplicity, I prefer the second one. But now I am confused about
    which one is best performance-wise as well.
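
    On the cleanup worry: a local temp table is dropped automatically when the session ends anyway, so a leftover #TEMP mostly matters with pooled or reused connections. If you want to guard it explicitly, a minimal sketch (THROW needs SQL Server 2012+; use RAISERROR on older versions):

    BEGIN TRY
        CREATE TABLE #TEMP
        (
        PlanDate NVARCHAR(100),
        [PastTrend - Plan] INT,
        [PastTrend - Actual] INT,
        [Current - Plan] INT,
        [Current - Actual] INT
        );
        -- INSERT INTO #TEMP ... (the daily query from Approach 1 goes here)
    END TRY
    BEGIN CATCH
        IF OBJECT_ID('tempdb..#TEMP') IS NOT NULL
            DROP TABLE #TEMP;   -- clean up before re-raising
        THROW;
    END CATCH;
    IF OBJECT_ID('tempdb..#TEMP') IS NOT NULL
        DROP TABLE #TEMP;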

    I will try to change as per your suggestions. Actually, here the presentation is fully dynamic and the header always comes from the column name; that is the only reason I give the names with hyphens.
    I added the schema design of the tables. The application is multilingual, so we use nvarchar. The tables are aggregated tables; the data is populated daily or monthly using a SQL job.
    CREATE TABLE [dbo].[MonthPlan](
    [IPCMonthlyVsActualId] [int] IDENTITY(1,1) NOT NULL,
    [IPCPlanMonth] [datetime] NOT NULL,
    [ModelId] [int] NOT NULL,
    [ModelName] [nvarchar](50) NOT NULL,
    [PlanQuantity] [int] NOT NULL,
    [Actual] [int] NOT NULL,
    [DifferenceQuantity] [int] NOT NULL,
    [ReasonCode] [nvarchar](max) NULL,
    [Remarks] [nvarchar](max) NULL,
    [Createdby] [int] NOT NULL,
    [CreatedDate] [datetime] NOT NULL,
    [CreatedIP] [nvarchar](30) NOT NULL,
    [Approval] [int] NULL,
    [UnApproval] [int] NULL,
    [ModelTypeName] [varchar](100) NULL,
    [ModelCategoryName] [varchar](100) NULL,
    [ModelColorName] [varchar](100) NULL,
    [ModelGroupName] [varchar](100) NULL,
    [ModelLocationName] [varchar](100) NULL,
    [ModelTypeId] [int] NULL,
    [ModelCategory] [int] NULL,
    [ModelColor] [int] NULL,
    [ModelGroup] [int] NULL,
    [ModelLocation] [int] NULL
    ) ON [PRIMARY]
    GO
    CREATE TABLE [dbo].[DayTable](
    [IPCDailyVsActualId] [int] IDENTITY(1,1) NOT NULL,
    [IPCPlanDate] [datetime] NOT NULL,
    [ModelID] [int] NOT NULL,
    [ModelName] [nvarchar](50) NOT NULL,
    [PlanQuantity] [int] NOT NULL,
    [Actual] [int] NOT NULL,
    [DifferenceQuantity] [int] NOT NULL,
    [ValuesetparameterId] [int] NULL,
    [ReasonCode] [nvarchar](20) NULL,
    [Remarks] [nvarchar](200) NULL,
    [CreatedBy] [int] NOT NULL,
    [CreatedDate] [datetime] NOT NULL,
    [CreatedIP] [nvarchar](30) NOT NULL,
    [Cell1Cell2Map] [int] NULL,
    [Cell3Actual] [int] NULL,
    [FB] [int] NULL,
    [Remarks2] [nvarchar](max) NULL,
    [Remarks3] [nvarchar](max) NULL,
    [Remarks4] [nvarchar](max) NULL,
    [Remarks5] [nvarchar](max) NULL,
    [Remarks6] [nvarchar](max) NULL,
    [Remarks7] [nvarchar](max) NULL,
    [Remarks8] [nvarchar](max) NULL,
    [Remarks9] [nvarchar](max) NULL,
    [Remarks10] [nvarchar](max) NULL,
    [Remarks11] [nvarchar](max) NULL,
    [Remarks12] [nvarchar](max) NULL,
    [UpdatedBy] [int] NULL,
    [UpdatedDate] [datetime] NULL,
    [UpdatedIP] [nvarchar](30) NULL,
    [ClosureFlag] [bit] NULL,
    [UnApproval] [int] NULL,
    [ModelTypeName] [varchar](100) NULL,
    [ModelCategoryName] [varchar](100) NULL,
    [ModelColorName] [varchar](100) NULL,
    [ModelGroupName] [varchar](100) NULL,
    [ModelLocationName] [varchar](100) NULL,
    [ModelTypeId] [int] NULL,
    [ModelCategory] [int] NULL,
    [ModelColor] [int] NULL,
    [ModelGroup] [int] NULL,
    [ModelLocation] [int] NULL
    ) ON [PRIMARY]

  • MDS Best Practice Approach - Sample HR Scenario

    Thanks for taking the time to read my MDS requirement... just looking for a better way to go about it.
    Here is the requirement: 
    Every month the CEO releases an Excel list of approved employment positions that can be filled, to HR.   The HR dept wants to be able to add positions that the CEO approves and remove positions that the CEO feels are
    no longer necessary.  The recruiting group wants to track/modify this master list of positions per the CEO's discretion and assign employees to potentially each position as people are hired/terminated.
    The HR data steward must be able to:
    - when a position is filled, assign employees to the positions for org chart reporting
    - assign/reassign parent-child relationships for any position, i.e. the Director position manages multiple Manager positions, which manage multiple Register Clerk positions.
    I am new to MDS and am initially not sure how to approach this problem... do I create one entity for 'Positions' and another for 'Employees'?   I'm thinking with that approach I can create Employee as a domain-based attribute of Position, then
    create a derived hierarchy for the Position parent/child relationships... just wondering if this is a good approach.
    Are there other things I should be taking into consideration?  Thanks!

    If your material list document is not excessively long, it probably wouldn't be too much overhead to add a few extra columns using the CalculatedColumn action block.  These extra columns, even though the number would be the same for all rows, could contain a column for each of your aggregate functions.
    Then in your iGrid just set the column width for these additional fields to zero, and on the UpdateEvent of the grid you could use JavaScript to copy them from row number 1 to your desired HTML elements, etc.

  • Looking for best design approach for moving data from one db to another.

    We have a very simple requirement to keep 2 tables synched up that live in 2 different databases. There can be up to 20K rows of data we need to synch up (nightly).
    The current design:
    The BPEL process queries the Source DB, puts the results into memory, and inserts into the Target DB. An out-of-memory exception occurs (no surprise).
    I am proposing a design change to get the data in 1000 row chunks, something like this:
    1. Get next 1000 records from Source DB. (managed through query)
    2. Put into memory (OR save to file).
    3. Read from memory (OR from a file).
    4. Save into Target DB.
    Question is:
    1. Is this a good approach, and if so, does SOA have any built-in mechanisms to handle this? I would think so, since I believe this is a common problem and we don't want to reinvent the wheel.
    2. Is it better to put records into memory or to write them to a file before inserting into the Target DB?
    The implementation team told me this would have to be done with Java code, but I would think this would be out-of-the-box functionality. Is that correct?
    I am a SOA newbie, so please let me know if there is a better approach.
    Thank you very much for your valued input.
    wildeman

    Hi,
    After going through your question, the first thing that came to my mind is what the size of the 20K records would be.
    If it is going to be huge, then even the 1000-row logic might take significant time to do the transfer, and I think even writing to a file will not be efficient enough.
    If the size is not huge, then your solution should work. But I think you will need to decide on the chunk size based on how well your BPEL process performs. Possibly you can try different sizes and test the performance to arrive at an optimal value.
    But in case the size is going to be huge, you might want to consider an ETL implementation. Oracle ODI provides such features out of the box with high performance.
    On the other hand, implementing the logic using the DBAdapter should be more efficient than Java code.
    Hope this helps. Please do share your thoughts/suggestions.
    Thanks,
    Patrick
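
    For illustration only, a minimal sketch of the 1000-row chunking idea in plain JDBC, outside BPEL (as the reply above says, the DBAdapter or ODI is the recommended route; URLs, credentials and table/column names are placeholders):

    import java.sql.*;

    public class ChunkedCopy {
        public static void main(String[] args) throws Exception {
            try (Connection src = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//srchost:1521/SRC", "user", "pw");
                 Connection tgt = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//tgthost:1521/TGT", "user", "pw")) {
                tgt.setAutoCommit(false);
                try (Statement st = src.createStatement()) {
                    st.setFetchSize(1000);   // stream from the source, 1000 rows at a time
                    try (ResultSet rs = st.executeQuery("SELECT id, payload FROM src_table");
                         PreparedStatement ins = tgt.prepareStatement(
                             "INSERT INTO tgt_table (id, payload) VALUES (?, ?)")) {
                        int n = 0;
                        while (rs.next()) {
                            ins.setLong(1, rs.getLong(1));
                            ins.setString(2, rs.getString(2));
                            ins.addBatch();
                            if (++n % 1000 == 0) {   // flush and commit one chunk at a
                                ins.executeBatch();  // time, so memory stays bounded
                                tgt.commit();
                            }
                        }
                        ins.executeBatch();   // final partial chunk
                        tgt.commit();
                    }
                }
            }
        }
    }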

  • Looking for best Cutover Approach.

    Hi Experts,
    We are in the testing phase and are looking at approaches for cutover. Please note the current setup in PRD is a 3.5 backend running on a BI 7 frontend. There is only 1 server and client each for DEV, TST and PRD, with a spare server targeted for setup as an application server in the near future:
    i) Option 1 - parallel 3.5 and 7 environments (if technically feasible; preferred by the client)
    -  use the application server in the interim as a backup server with a copy of current PRD while cutover takes place on the main server. The team is still unsure technically how much effort and what level of difficulty is involved to either (i) update R/3 to BI 7 and then update BW 3.5, or (ii) set up R/3 to update both 3.5 and 7. Detailed investigation is required, but both cases need additional DEV and TST boxes to simulate the chosen option.
    ii) Option 2 - big bang to BI 7 (medium risk, recommended by us)
    -  once the BI 7 objects are migrated to TST and DEV, the support team and users are no longer able to access old reports to perform fixes or data validation.
    Would you be able to advise which approach is typically taken during 3.5 to 7 upgrades and, if Option 1, what complexities and considerations are involved?
    Your time is much appreciated.
    Thanks,
    Chandu

    The way it usually works is:
    Take a copy of PRD and migrate it. Test this system completely - usually takes about 2 to 4 weeks of full testing and signoff.
    Then you go into a DEV FREEZE - this means that no requests can be transported in this period.
    Upgrade DEV and test - takes about 2 weeks, since you already know the critical objects to test.
    Then upgrade QA - test for 2 to 3 days, since most of the functionality has already been tested twice.
    Upgrade PRD and go live on 7.0.
    You can also look at splitting the upgrade: database upgrade first, and then, once there are no issues, go ahead with the NW upgrade.

  • Best Slideshow Approach....

    Hello There,
    I was just wondering what the best way would be to create a slideshow of several band photos for a web site.
    I have iLife, QuickTime Pro, an older version of Macromedia Dreamweaver MX 2004, RapidWeaver 3.6.7, and I'm still using 10.4.11.
    What's the most compatible across the board: Flash, QuickTime, or what?
    Of course it will have music, etc.
    I'd like to have the photos fairly good sized, yet not too big file-size wise.
    Thanks for any help you can offer.
    Eric

    Easiest and smallest file size would be a QuickTime file made with Pro.
    Simply open your image files, copy, switch to the audio track, make a selection and add the image file "Scaled to the selection".
    QuickTime .mov files can have up to 99 "tracks" and each can have video "masks", "transparency" levels and "layers". This allows you to have a video "background" image and 97 other images tracks positioned anywhere over the background.
    QuickTime .mov files can also have "text" tracks. Real, scalable text that can be scrolled, colored and even use "karaoke" styles.
    QuickTime .mov files can also contain "links". A click on the file could send the viewer to a Web page for more information (a Web page that sells your music for example). Some of mine as examples:
    http://homepage.mac.com/kkirkster/03war/
    http://homepage.mac.com/kkirkster/64/
    http://homepage.mac.com/kkirkster/Lemon_Trees/
    http://homepage.mac.com/kkirkster/RedneckTexasXmas/
    http://homepage.mac.com/kkirkster/Nationals/ (made using iPhoto).

  • Best architectural approach ?

    Hi,
    We are in the process of creating a PoC to integrate WebLogic Web Services, Oracle Coherence and Oracle Composite Database. We have followed the approach below to integrate them...
    1. Created a CompositeView in Oracle Composite by connecting to three different databases / flat files ( Oracle, MySQL & XML ).
    2. Designed a Distributed Cache using Oracle Coherence on top of Oracle Composite ( with DB Cache Store implemented as read only ).
    3. Have a WebLogic Web Service ( implemented using JAX-WS ) which queries Oracle Coherence for data.
    Currently we have a single method exposed in our web service that takes an orderId as input and returns a complex datatype (a plain Java bean) with one row of data. In the DB cache store, we have implemented only the load() method, to load each row from Oracle Composite in case of a cache miss. Being a PoC, we have stuck to the basics of Coherence and web services. Do you have a better approach to what we are currently trying?
    Thanks
    Karthik
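
    A minimal sketch of the read-only store described above, assuming the classic Coherence CacheLoader interface (load-only, so no write methods are needed); the composite-view lookup is a placeholder:

    import com.tangosol.net.cache.CacheLoader;
    import java.util.Collection;
    import java.util.HashMap;
    import java.util.Map;

    // Read-only cache store: Coherence calls load() on a cache miss.
    public class OrderCacheLoader implements CacheLoader {
        public Object load(Object key) {
            String orderId = (String) key;
            return findOrder(orderId);   // one row from the composite view, as a plain bean
        }

        public Map loadAll(Collection keys) {
            Map result = new HashMap();
            for (Object k : keys) {
                result.put(k, load(k));
            }
            return result;
        }

        private Object findOrder(String orderId) {
            // placeholder: query the Oracle Composite view for this orderId
            return null;
        }
    }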

    Have you looked into relative paths? That would solve the issue of different OS path names.
    Also, you might want to consider the deployment descriptors you can use when compiling; you can set environment-specific variables like paths in there.

  • With 2008 - What would be the 'best practice' approach for giving a principal access to system views

    I want to set up a job that runs a few SELECT statements against several system management views, such as those listed below.  It's basically going to gather various metrics about the server, a few different databases, and jobs.
    msdb.dbo.sysjobs
    msdb.dbo.sysjobhistory
    sys.dm_db_missing_index_groups
    sys.dm_db_missing_index_group_stats
    sys.dm_db_missing_index_details
    sys.databases
    sys.dm_exec_query_stats
    sys.dm_exec_sql_text
    sys.dm_exec_query_plan
    dbo.sysfiles
    sys.indexes
    sys.objects
    So, there are a number of instance-level permissions needed, mainly VIEW SERVER STATE:
    https://msdn.microsoft.com/en-us/library/ms186717.aspx
    Granting these permissions to a single login seems like introducing a maintenance headache later.  What about a server role?
    Correct me if I'm wrong, but this is a new feature of 2012 and above: the ability to create user-defined server roles.
    Prior to version 2012, I will just have to settle for granting these instance-level permissions to individual logins. There won't be many logins that need this kind of permission, but I'd rather assign it at the role level and then add logins to that role.
    Then again, there is little point in creating a separate role if there are only 1, maybe 2, logins that might need it?
    New for 2012
    http://www.mssqltips.com/sqlservertip/2699/sql-server-user-defined-server-roles/

    As any Active Directory administrator will tell you, you should indeed stick to the rule "user in role - permissions to role" - in AD terms, A-G/DL-P. And since this is very much possible as of SQL Server 2012, why not just do that? You
    lose nothing if you never change that one single user. In the end you would expect only roles to have permissions, and you save some time when searching for permission problems.
    i.e.
    USE [master]
    GO
    CREATE SERVER ROLE [role_ServerMonitorUsers]
    GO
    GRANT VIEW SERVER STATE TO [role_ServerMonitorUsers]
    GO
    ALTER SERVER ROLE [role_ServerMonitorUsers]
    ADD MEMBER [Bob]
    GO
    In security, standardization is just as key as in administration in general. So even if it does not really matter now, it may matter in the long run. :)
    Andreas Wolter (Blog |
    Twitter)
    MCSM: Microsoft Certified Solutions Master Data Platform, MCM, MVP
    www.SarpedonQualityLab.com |
    www.SQL-Server-Master-Class.com
