Concurrent writing to a model

Hello,
We are trying to find the best way to write concurrently to the same semantic model. At times we have more than 20 threads requesting to update triples in the model. We currently wrap the update operation in a synchronized block that all of the threads go through, so only one thread at a time has access to the editable model. When we did not use the synchronized block, the INS and DEL triggers on the application table caused deadlocks.
Is there a best practice for writing to the same model at the same time? Any guidance would be helpful.
Thanks
-MichaelB
Edited by: MichaelB on Feb 8, 2012 9:55 AM

The scenario that causes the deadlock is exactly the same as what occurs during concurrent DML into a relational table with a unique constraint.
Underneath, the semantic store also has a uniqueness constraint defined to prevent duplicate RDF triples in the same RDF model. So, suppose one transaction T1 inserts triple (a,b,c) and a second transaction T2 inserts triple (d,e,f). Then T1 tries to insert the triple (d,e,f), but it blocks because this triple has already been inserted by the yet-to-be-committed transaction T2. Now, when T2 tries to insert triple (a,b,c), it blocks as well, because this triple has already been inserted by the yet-to-be-committed transaction T1. This circularity brings us to a deadlock, and Oracle raises ORA-00060: deadlock detected while waiting for resource.
Since this is exactly how deadlock can occur in any relational table with a unique constraint defined on it, the best practices for avoiding deadlock on such a table apply equally to concurrent inserts into the same RDF model: either serialize the inserts on the application side (as in the sketch below), or have concurrent transactions insert their triples in a consistent order and commit promptly.
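As one illustration of the application-side option, here is a minimal sketch that funnels all inserts through a single writer thread. Triple and TripleWriter are hypothetical placeholders for whatever insert/commit code you already have (Jena adapter, SEM_APIS over JDBC, etc.); they are not part of any Oracle API.

    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    /** Hypothetical value object for an RDF triple; substitute your own representation. */
    class Triple {
        final String subject, predicate, object;
        Triple(String s, String p, String o) { subject = s; predicate = p; object = o; }
    }

    /** Hypothetical callback wrapping your existing insert + commit logic. */
    interface TripleWriter {
        void insertAndCommit(List<Triple> triples);
    }

    /**
     * Funnels all triple inserts through a single writer thread, so only one
     * transaction is inserting into the RDF model at any time. Same idea as the
     * synchronized block, but worker threads hand off the work and move on
     * instead of blocking while holding a lock.
     */
    public class SerializedModelWriter {
        private final ExecutorService writer = Executors.newSingleThreadExecutor();
        private final TripleWriter tripleWriter;

        public SerializedModelWriter(TripleWriter tripleWriter) {
            this.tripleWriter = tripleWriter;
        }

        /** Called by any worker thread; the insert itself runs serially on the writer thread. */
        public Future<?> submitInsert(final List<Triple> triples) {
            return writer.submit(new Runnable() {
                public void run() {
                    tripleWriter.insertAndCommit(triples); // one transaction at a time
                }
            });
        }

        public void shutdown() {
            writer.shutdown();
        }
    }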
Hope this helps.

Similar Messages

  • Facing issue with concurrent use of session

    Hi All,
    We are facing an issue with concurrent use of session.save() calls in our LDAP sync-up process. We hit the issue while running a performance test with multiple users, and it occurs at the com.day.crx.security.ldap.LDAPUserSync.performUpdate API. Please guide me on what could be wrong or missing in the code, since this out-of-the-box service is showing the error.
    Below are logs:
    02.01.2013 14:32:07 WARN SessionState: Attempt to perform session.save() while another thread is concurrently writing to session-system-1. Blocking until the other thread is finished using this session. Please review your code to avoid concurrent use of a session. (SessionState.java, line 149)
    java.lang.Exception: Stack trace of concurrent access to session-system-1
    at org.apache.jackrabbit.core.session.SessionState.perform(SessionState.java:147)
    at org.apache.jackrabbit.core.SessionImpl.perform(SessionImpl.java:355)
    at org.apache.jackrabbit.core.SessionImpl.save(SessionImpl.java:758)
    at com.day.crx.security.ldap.LDAPUserSync.performUpdate(LDAPUserSync.java:230)
    at com.day.crx.security.ldap.LDAPUserSync.syncUser(LDAPUserSync.java:178)
    at com.day.crx.security.ldap.LDAPLoginModule.commit(LDAPLoginModule.java:266)
    at sun.reflect.GeneratedMethodAccessor36.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.security.auth.login.LoginContext.invoke(LoginContext.java:769)
    at javax.security.auth.login.LoginContext.access$000(LoginContext.java:186)
    Regards,
    Yogesh

    Hey Jorg,
    Your comments are much appreciated!
    A question about the statement "it seems that you import a lot of users concurrently from the LDAP": yes, we do, but should that be problematic? We do see that it is causing issues for us in CQ5.5 with CRX 2.3.15:
    com.day.crx.security.ldap.LDAPLoginModule commit: could not commit: javax.jcr.InvalidItemStateException: property /some/LDAP/group/rep:lastsynced: the property cannot be saved because it has been modified externally.
    However, we had no such issue with CQ5.3 with CRX 2.0.
    -Faraz
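    For reference, the warning above is about sharing one JCR session across threads; sessions are not thread-safe. Below is a minimal sketch of what "avoid concurrent use of a session" usually means in application code (placeholder credentials; it does not apply directly to the internal session-system-1, which is managed by CRX itself).

    import javax.jcr.Repository;
    import javax.jcr.Session;
    import javax.jcr.SimpleCredentials;

    // Minimal sketch: give each worker thread its own short-lived session instead of
    // sharing one. "repository" and the admin credentials are placeholders for however
    // your application obtains them.
    public class PerThreadSessionExample {

        public static void updateUser(Repository repository, String userPath) throws Exception {
            Session session = repository.login(
                    new SimpleCredentials("admin", "admin".toCharArray()));
            try {
                // ... modify nodes under userPath ...
                session.save();   // only this thread ever touches this session
            } finally {
                session.logout(); // always release the session
            }
        }
    }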

  • Problem in populating configuration model

    Hi,
    We are trying to populate a BOM model in Oracle Configurator; for this we have performed the following steps:
    1. Created Items in Item Master
    2. Created BOM structure in Bills of Materials
    3. Exploded the BOM structure from the sales order screen, as we have a different environment for Configurator.
    4. Submitted the concurrent program Populate Configuration Models for the top model (TEST_CONTAINER, which we have set as a container model).
    Now the problem is that only the container item (TEST_CONTAINER) is visible in Oracle Configurator Developer; its structure is missing.
    BOM SETUP:
    TEST_CONTAINER
    XX_TEST_INSTALLATION
    XX_TEST_A
    XX_TEST_B
    XX_TEST_C
    We would appreciate it if anyone could provide valuable input in this regard.
    Thanks,
    Anupam

    That is a little bit odd. EBS installations typically need at least 2 orgs. One org is the master org and then at least one child org where most of the transactions are performed.
    What is the org (warehouse) on the sales order line? Is it the same as master?
    Sandeep Gandhi
    Omkar Technologies Inc.
    Independent Techno-functional Consultant

  • What does Java3d have to help with 3d model rendering?

    Well, basically I spent a few hours the other day writing a basic model rendering application. It uses the java.awt.Graphics class for the drawing and the java.awt.Canvas class for the main applet.
    It was a pretty hard project for me, in my opinion, as I have never attempted anything like it before, and in using the AWT Graphics class you have to work everything out yourself.
    For example:
    You have your model, which has x, y and z vertices, then you have your faces, which are triangles that link up 3 of the vertices to make a face or skin for the model. After this you have your "camera" or "view point", which applies the x, y and z rotations, offset (for zooming) and translations to all of the above data, and then after all this you can finally create the triangle or face by using the Polygon class and inputting the resulting point values.
    Now, I want to know if Java 3D has any of the above stuff already there for you to take advantage of, as well as, for example, lighting effects such as the ability to add shadows.
    I've already looked through the Java 3D tutorial PDF but I couldn't see anything that relates to creating your own models; the only things I saw were pre-programmed shapes such as cubes and other polygons.
    Thanks for any help.
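    For context, the "work everything out yourself" part described above boils down to something like this minimal sketch of a simple perspective projection; the focal length and screen offsets are arbitrary illustration values, not from any library.

    import java.awt.Point;

    // Minimal sketch of the by-hand math described above: rotate a vertex around the
    // Y axis for the camera view, then project it to 2D with a perspective divide.
    public class SimpleProjector {

        public static Point project(double x, double y, double z,
                                    double cameraYaw, double cameraDistance) {
            // rotate the vertex around the Y axis by the camera yaw
            double cos = Math.cos(cameraYaw), sin = Math.sin(cameraYaw);
            double rx = x * cos - z * sin;
            double rz = x * sin + z * cos;

            // perspective divide: points further from the camera shrink toward the centre
            double depth = rz + cameraDistance;   // keep the point in front of the camera
            double focal = 400.0;                 // made-up focal length ("zoom")
            int screenX = (int) (rx * focal / depth) + 320;  // 320,240 = screen centre
            int screenY = (int) (y  * focal / depth) + 240;
            return new Point(screenX, screenY);
        }
    }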

    Hi,
    Java 3D has many effects ready, like shadows, raytracing and so on.
    I think you have 5 areas to consider:
    1. Geometry and object models
    2. Lighting
    3. Camera and rendering
    4. Sound
    5. Animation
    For geometry, the best solution is to use a modeling tool like 3D Studio Max, export the model and import it in your program.
    For lighting it has many light types, but I'm not good at using them.
    For the camera it has lots of useful stuff, even special rendering for 3D glasses!
    For sound, as far as I know it supports a 3D sound engine that renders your sound (I mean if your avatar is close to the sound source, the sound is stronger; if the source is to the left of the avatar, the sound will appear in the left speaker).

  • Select or Select....for update nowait (data write concurrency issue)

    Hello everyone,
    I am working on a JSP/servlet project, and I have a question about the better way to deal with concurrent writing issues.
    The whole scenario is described as follows:
    First, each user views his own list of several records, and each record has a hyperlink through which the user can modify it. After the user clicks that link, a popup window appears, pre-populated with the values of that record, and the user can make his modifications. When he is done, he can either click "Save" to save the change or "Cancel" to cancel it.
    Method1---This is the method I am using right now.
    I did not take any special synchronization measures, so if user1 and user2 click the link for the same record, they will modify the record at the same time, and whose updates take effect depends on who submits the request later. If user1 submitted first, then user2, user1 will not see his updates. I know that with this method we have the problem of "lost updates", but it is the simplest and most efficient way to handle the issue.
    Method2--This is the method I am hesitating about.
    I am considering using "select ... for update nowait" to lock a record when user1 has selected a record and intends to modify it. If user2 then wants to modify the same record, he is not allowed to (by catching the SQL exception). The issue I am concerned about is that the "select ... for update" action and the "update" action are not as consecutive as in many transaction examples. There could be a big interval between the "select" and "update" actions. You cannot predict the user's behavior: maybe after he opens the popup window it takes him a while to make up his mind, or, even worse, he is interrupted by other things and goes away for the whole morning. Then the lock is just held until he releases it.
    Another issue is that if he clicks "Cancel" to cancel his work, with method1 I don't need to interact with the server side at all, but with method2 I still need to interact with the server to release the lock.
    Can someone give me some advice? What do you do to deal with a similar situation? If I have not made the question clear, please let me know.
    Thanks in advance!
    Rachel
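    For what it's worth, Method 2 as described above typically looks something like this in JDBC; the table and column names are made up for illustration, and ORA-00054 is the error Oracle raises when NOWAIT finds the row already locked.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    // Minimal sketch of Method 2: try to lock the row with SELECT ... FOR UPDATE NOWAIT
    // and report back if another user already holds the lock.
    public class RecordLockExample {

        public static boolean tryLockRecord(Connection conn, long recordId) throws SQLException {
            String sql = "SELECT id, title FROM records WHERE id = ? FOR UPDATE NOWAIT";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setLong(1, recordId);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next();          // row exists and is now locked by this transaction
                }
            } catch (SQLException e) {
                if (e.getErrorCode() == 54) {  // ORA-00054: resource busy, NOWAIT specified
                    return false;              // someone else is editing this record
                }
                throw e;
            }
        }
    }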

    Hi Rachel,
    Congratulations, you have found a way to work through your programming business logic.
    Have you considered whether the CachedRowSet concept, due to be included in J2SE 1.5 (Tiger) next year, could prove workable in this scenario? With it you can disconnect from the database after you have executed your query and reconnect later if you have to do transactional activity, so that the loading overhead as well as the data pooling activity can be balanced off.
    Although RowSet is still not an official API, its potential is, to me, worth considering.
    I have written a simple but crude JSP programme, posted on this forum under the heading "Interesting CachedRowset JSP code to share", to demonstrate the concept of CachedRowSet, hoping that a Java guru or developer could provide feedback on how to improve the programming logic or methodology.
    Thanks!!

  • Multiple instances of FileOutputStream

    Hi, I'm learning how to use Java's multithreading features. I created a class called "ObjectMaker". An instance of this class serializes some objects into a file, using a FileOutputStream (inside an ObjectOutputStream) for writing. In the meantime, another object of another class deserializes those objects, reading from the file. Each concurrent access to the file (for writing or reading) is synchronized. Here's the problem: if each writer has its own FileOutputStream, then concurrent writing to the file doesn't work (even though every writer is synchronized on the same object), and the readers aren't able to read anything from the file; if the FileOutputStream is static (every instance shares the same FileOutputStream reference), everything works. I read in the documentation that some OSes "allow a file to be opened for writing by only one FileOutputStream (or other file-writing object) at a time". But it also says that I would receive an exception from FileOutputStream if I try to open for writing a file that is already open by another FileOutputStream... and that doesn't happen; no exceptions are thrown. So here is the question: is it possible to manage multiple instances of FileOutputStream linked to the same file?
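    As a point of reference, the pattern found to work above (one shared, static stream with synchronized writes) can be packaged up roughly like this minimal sketch; class and method names are just illustrative.

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.ObjectOutputStream;

    // Minimal sketch: all writer threads share one FileOutputStream/ObjectOutputStream
    // and synchronize on it, rather than each opening its own stream on the same file.
    public class SharedObjectWriter {
        private final ObjectOutputStream out;

        public SharedObjectWriter(String path) throws IOException {
            this.out = new ObjectOutputStream(new FileOutputStream(path));
        }

        public void write(Object obj) throws IOException {
            synchronized (out) {   // one writer at a time, single file position
                out.writeObject(obj);
                out.flush();
            }
        }

        public void close() throws IOException {
            out.close();
        }
    }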

    Hello Joseph,
    Have you been successful with the installation?
    There were no replies after yours.
    Can you please let me know about installing 2 MDS instances on the same host?
    If you have any document you followed, please mail it to me at [email protected]
    It would be very helpful, as the deadline on this is killing me.
    Regards,
    VishnuM

  • Multiple instances of Crystal Reports Viewer possible in WPF?

    Hi, I've dragged and dropped three Crystal Reports Viewer controls on my WPF application. The goal is to be able to select up to 3 reports from a listbox and click run to generate the reports simultaneously. When I select just one report, it works fine. However when I select multiple reports it throws errors (object not found, etc). It seems there is a problem with multiple threads. Is it possible to have multiple instances of the Crystal Report Viewer display reports simultaneously? I am using version 13.0.9.1312 from the link below along with VS2013 C# WPF.
    SAP Crystal Reports, developer version for Microsoft Visual Studio: Updates & Runtime Downloads
    Thanks,
    Syed

    I was able to quickly cobble together a two-viewer app; the code is like this:
    Public Sub New()
            ' This call is required by the designer.
            InitializeComponent()
            ' Add any initialization after the InitializeComponent() call.
            Dim crReportDocument As New CrystalDecisions.CrystalReports.Engine.ReportDocument
            crReportDocument.Load("C:\tests\formulas.rpt")
            CrystalReportsViewer1.ViewerCore.ReportSource = crReportDocument
            Dim crReportDocument2 As New CrystalDecisions.CrystalReports.Engine.ReportDocument
            crReportDocument2.Load("C:\tests\report1.rpt")
            CrystalReportsViewer2.ViewerCore.ReportSource = crReportDocument2
        End Sub
    For more details see the document WPF Project Using the Crystal Reports WPF Viewer in 8 Easy Steps
    Now a couple of things to keep in mind;
    1) The report engine is based on a 3 Concurrent Processor License (CPL) model, meaning you can process at most three reports at the same time. In my test, doing the above code for four reports worked, but the reports are very, very simple - and with saved data. What "real" world reports will do, I am not sure. I do know that in a web app, reports are queued up until one report is done and a CPL is freed up. You will also need to keep in mind any database connection limits, etc.
    2) There is also a Print Job limit. This by default is set to 75. In a nutshell, almost anything done with a report is a Print Job. E.g.; paging, zooming, drilling, searching, etc., etc. In addition subreports are considered to be Print Jobs. Thus a report with one subreport in a detail section that returns a thousand rows of data and therefore running 1000 subreports will error out.
    You can read more about Print Job limits here:
    Crystal Reports Maximum Report Processing Jobs ... | SCN
    - Ludek
    Senior Support Engineer AGS Product Support, Global Support Center Canada
    Follow us on Twitter

  • Question about new BIOS on Satellite L20-182

    Hi :)
    I've got one question: may I install the new BIOS 2.20 on my Satellite L20-182, model no. PSL2XE?
    If I can't, why can PSL2X users do this? Are there so many differences between the two models?
    Greetings, and sorry for my English - I'm not the best :)

    Hi,
    just for explanation: the BIOS you mentioned is the right BIOS for the machine. Toshiba does not write the full model number since there is a range of model numbers for the same model, so don't worry, they are not different. Maybe the last digit differs, but this is not important for you.
    How do I know this? Because many manufacturers do this, and I was confused many times when updating BIOSes from other manufacturers...
    Greets

  • Time Machine backup FROM multiple drives?

    I'm in the process of choosing a new Mac Pro, and someone on these forums recommended Digilloyd's Mac Performance Guide as a good place to get help setting up a new Mac for speed. The simplified version of what he advocates is replacing the stock internal drive with an SSD, on which you put the OS, apps and home folder. He then recommends creating a RAID 0 stripe from 3 other drives to separate and hold your data, then using the fourth internal bay drive (or another RAID 0 stripe of the leftover, slower portions of various partitions of the drives) for Time Machine.
    My question is, can Time Machine back up both drives (boot and data RAID 0), or would I have to choose one of them?
    The more I read, the more confused I get.

    OK, I've scanned through the various articles. My thoughts are still essentially the same, and they come down to how effective the whole scheme is when combining multiple partitions from several drives into multiple RAIDs. In reality this is not speed-effective if the RAIDs need to be accessed concurrently. A read/write head can only be in one place at a time, meaning that when one partition on a drive is being accessed, the OS cannot concurrently access another partition on the same drive.
    Essentially this is how I understand the configuration at a simplistic level. Let's suppose we have two hard drives that we'll call Drive A and Drive B. Each drive is partitioned into two equally sized volumes that we'll refer to as follows:
    Drive A: Volume 1, Volume 2
    Drive B: Volume 3, Volume 4
    Now, we will make two RAID arrays. RAID A uses Volume 1 and Volume 3, and RAID B uses Volume 2 and Volume 4.
    Suppose you want to copy data from RAID A to RAID B. In order to do this the OS must first copy data from RAID A before it can write the data to RAID B. However, if RAID A was created using two separate drives (say, Drive A and Drive B,) and RAID B was created using two separate drives (say, Drive C and Drive D,) then the OS can copy from RAID A while concurrently writing to RAID B. This is physically possible because two read/write heads are involved instead of one. Theoretically the second construct is going to be much faster than the first construct.
    My second observation is with regard to the reliance on external storage. A 2nd or 3rd generation Mac Pro's SATA bus is capable of a data interface rate of 3.0 Gb/sec. Firewire 800 is capable of 800 Mb/sec. The MP's internal SATA bus can support data transfer rates nearly four times that of Firewire. A modern hard drive is capable of saturating the Firewire bus, but not the internal SATA bus. The higher interface rate of the SATA bus means it's much better suited for truly fast RAID arrays. This is not the case for the Firewire bus.
    External Firewire arrays are better suited for storage that does not require frequent or fast access.
    Now with all this said it makes more sense to fully understand what your overall storage needs are then consider suitable designs. One need not rely on complicated RAID arrays if they aren't required. The focus should be on data access, data storage, and backup needs.
    Although it's nice being able to brag at the cocktail party about having a fast SSD for your boot drive, let's consider how often you even need to boot the computer. I put my computers to sleep when they aren't in use. I never boot the computer unless a software update requires it or the computer has crashed completely. I haven't had the latter occur very often - mainly when I'm experimenting. Literally days, weeks, or months may go by before I reboot the computer. So a fast SSD boot drive would be a huge waste of money for me.
    My 1st generation Mac Pro is set up for my needs. It has four 500 GB fast Hitachi enterprise-level hard drives. I use enterprise-level drives that cost more because the computer is always on, so I want drives that will work reliably. I used to have four Maxtor 300 GB drives that lasted for four years before I replaced them with the Hitachis. My setup has one drive partitioned into a startup volume and a Boot Camp volume. One drive is my 'scratch' drive used for different OS system versions and/or seed testing. Two drives are configured as a mirrored RAID and used as the primary backup for the boot volume. The boot volume is 450 GB and the Windows volume is 50 GB. There's no backup for the Windows volume at the present time. Backups are usually done in the late afternoon using a backup utility. Presently that utility is Synk Standard, but I've also used Synchronize! Pro X and Carbon Copy Cloner. Backups are done on a fixed schedule in the background so they are virtually transparent to me. I use a mirrored RAID for backup to provide redundancy. If one backup drive fails, hopefully the other will still be usable to protect the backup. I also have one external Firewire drive that contains a clone of the startup volume. The clone is updated monthly by incremental backup. It's for security in the worst-case scenario that both drives in the mirrored RAID were to fail simultaneously.
    Now, my need for frequent and fast access to data such as might be needed for streaming music or video is non-existent, so my configuration is one that is well-suited to my needs. You'll note that it's both simple and practical while providing data backup that's doubly secure.

  • Lightroom 4 Tethering Preset Publish -- Automation

    Hello,
    I have a specific need to shoot tethered in Lightroom, apply some presets, then publish (FTP to two locations, publish on Facebook), and save a copy of the file - that is, save the modified file with a new name in a different directory and close that file. Any ideas on how to accomplish this?
    Thanks,
    Vic

    Automator on Mac looks very powerful and can apply a set of actions if a file or folder is added to a specific targeted folder.
    I am not a Mac expert.... but I would try the following.
    Shoot tethered, setting up Lightroom with whatever Lightroom preset settings you wish to apply. You will have defined in Lightroom where the images are to be stored (Session Folder) and defined in your camera whether you are using Raw / JPG.
    Create an Automator action, which is activated for any files added to the  Session Folder above. Get Automator to open this file in Photoshop.
    You now have two options.
    a. Configure Photoshop to apply a set of Photoshop actions you require (possibly triggered by the Open File event), which may include raw presets, and save your file to your choice of folder with relevant settings (i.e. JPG, PSD, resolution, etc.)
    or b. There are a whole bunch of pre-configured Automator Photoshop actions which can be purchased / trialled / downloaded from the internet (and loads of tutorials). Some of these may apply the actions / presets you require. This option has the advantage that you can continue your Automator script delivering the results to FTP / Facebook etc. (Here is an example of such actions: http://www.robotphotoshop.com/?page_id=8 ... p.s. I have no connection to this site or author, it was just the first site I found using Google for Photoshop / Automator.)
    Using AppleScript or JavaScript with the Photoshop scripting language is an extremely powerful alternative... but unless you are familiar with script writing, Photoshop object models, etc., it is not for the faint-hearted.
    Automator with Photoshop should give you a lot of options.
    As I said at the top, I am not aware of a scripting tool within Lightroom, hence the suggestion to use Photoshop and apply raw presets or any other changes using the full power of Photoshop.

  • User action logging

    I'm writing a small model of a GUI which I'm going to perform user tests on via the web. For these tests, I need to log all user actions (button clicks etc.). I have no problem with the actual logging, but how do I submit it from the tester so I'll be able to view the statistics? Note that the tests will be performed on the testers' own computers.
    Thanks a lot in advance.

    Anyone? :(
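    One common approach (a minimal sketch, assuming the log is plain text and you have some endpoint on your server, e.g. a servlet, to receive it) is to POST the collected log back from the tester's machine; the URL here is a hypothetical placeholder.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    // Minimal sketch: send the collected action log back to a server you control.
    // The receiving side could be a simple servlet that appends the posted body
    // to a per-tester file for later analysis.
    public class LogUploader {

        public static int upload(String logText) throws Exception {
            URL url = new URL("http://example.com/logcollector"); // hypothetical endpoint
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "text/plain; charset=UTF-8");
            try (OutputStream out = conn.getOutputStream()) {
                out.write(logText.getBytes(StandardCharsets.UTF_8));
            }
            return conn.getResponseCode(); // 200 means the server accepted the log
        }
    }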

  • Changes made in BOM are not reflected in Configurator.

    Hi all,
    When I update minimum and maximum quantities in the BOM, the new values are not reflected in Configurator.
    I have found a patch (patch# 4410573) for this issue in metalink and applying that patch did not resolve the issue.
    I have logged an SR with Oracle for this and Oracle support have asked me to run "Refresh Single Configuration Model" concurrent program and see if the changes are reflected.
    This program has 2 parameters one for folder and the other for Configuration Model Id.
    When I try to run the program, for any of the folders, the value set for the parameter Configuration Model Id does not show any values, and hence I am unable to run the program.
    We don't have a separate instance for Configurator.
    Do I need to do any setup so that my configuration model is reflected in the LOV?
    Can anyone please help me on this?
    Oracle Apps version: 11.5.10.2
    Configuration software build: 11.5.10.25.43A
    Regards,
    Sreekanth

    Hi Jason,
    Please see my comments below
    Jason said:
    "Support is correct. If you haven't run the Refresh a Single Configuration Model concurrent program for your model, then the new BOM attributes will not be visible in your model. Changes to items/BOMs are not directly reflected in the configuration model. You must 'refresh' the model to see the changes."
    Even without running the "Refresh a Single Configuration Model" concurrent program, in some scenarios, changes are being reflected in Configurator.
    Ex:
    When I change only minimum and maximum quantities, the changes (i.e. the new minimum and maximum values) are not reflected in Configurator.
    But when I also change any of the following fields along with the minimum and maximum quantities, the new values are updated in Configurator without running the refresh program:
    effectivity_date, disable_date, component_quantity, planning_factor, component_yield_factor and optional class
    Can you please explain this?
    Also, I have already tried executing the query with the enclosing_folder check commented out, but the query still does not retrieve any records.
    Basically, the sub-query that fetches devl_project_id itself is not retrieving any records.
    Jason said:
    "Based on the SQL above, the only criteria for importing a model is that the item under the folder must be:
    a. A Model (PRJ = Project)
    b. The Model cannot be deleted (Deleted_flag must be '0')
    c. And the model must have been imported from Oracle Apps, e.g. it cannot be a non-BOM model (orig_sys_ref is not null)"
    Can you please tell me how I can check whether the model is imported into Configurator or not, and if it is imported, where it is imported from?
    Regards,
    Sreekanth
    Edited by: Sreekanth Munagala on May 12, 2011 8:52 AM

  • Things to consider while making a class singleton?

    Sometimes I need to share the state of an object across the application, and I get confused about whether I should declare it a singleton or declare the state variables as static.
    As per my thoughts, if the intention is just to share the state of an object across the application, I should go for static variables. Is this right?
    My understanding about when we should go for a singleton is below:
    We should declare a class as a singleton when we need only one instance across the JVM. But I am unable to find any practical scenario where we may need a singleton. Any help here will be appreciated. On different sites I keep seeing the example of a logger
    class, where the reason is generally given as ensuring a single log file is created, not multiple log files.
    But again, couldn't this have been achieved by declaring the file variable as static in the logger class?
    So the actual reason for declaring the logger as a singleton is that we need a single log file while avoiding issues with concurrent writing,
    which wouldn't be possible if we made a separate instance of the logger for each write.
    Is the above reasoning correct, so that I can proceed in the right direction?
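    For illustration, here is a minimal sketch of the logger example being discussed: a singleton that owns the single log file plus its writer and serializes concurrent writes. The class and method names are illustrative, not from any particular logging library.

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    public final class AppLogger {
        // The one instance per JVM; it owns all the state (file, writer, ...), not just the file name.
        private static final AppLogger INSTANCE = new AppLogger("app.log");

        private final PrintWriter out;

        private AppLogger(String fileName) {
            try {
                out = new PrintWriter(new FileWriter(fileName, true), true); // append, auto-flush
            } catch (IOException e) {
                throw new ExceptionInInitializerError(e);
            }
        }

        public static AppLogger getInstance() {
            return INSTANCE;
        }

        public synchronized void log(String message) { // one writer at a time
            out.println(System.currentTimeMillis() + " " + message);
        }
    }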

    "How will declaring its state as static accomplish that objective?" By declaring the variables as class variables instead of instance variables, there will be a single copy of each variable. In each instance (in this case of the logger, if we don't declare it a singleton) we can check whether the file has already been created or not. I mean it will be visible across all instances.
    "No, because the file name isn't the only state. There is also the output stream/writer, its current position, its charset, the log level, the logger name, its filters, its formatters, ..." Agreed, I just wanted to convey the point. As you said, there will be other parameters; in that case we can declare all of them as static.
    "A configuration file holder is a good example: there is only one configuration file so there should only be one holder instance." Thanks for pointing that out. A configuration file is used mainly to read properties; we usually don't update the values there, so even if we don't make the holder a singleton it may still be correct. The advantage I can see in making it a singleton is that if the configuration file is used frequently, we don't have to create the object again and again.
    "So the actual reason for declaring the logger as a singleton is that we need a single log file while avoiding issues with concurrent writing."
    "No it isn't, and that doesn't follow from anything you said previously, so the 'so' part is meaningless." What I want to say here is that having a single log file should not be the only reason for making the logger a singleton; other reasons can include handling concurrent writing too.
    "Have a look at the Wikipedia article on the Singleton pattern, or buy the book." I have gone through the Singleton pattern in the Head First book and some articles on the net, but they mainly describe how to make a class a singleton, and give the reason that we should declare a class a singleton when we need only one instance. I am looking for actual scenarios where we need a singleton, so I took the example of the logger, which is used in many projects, and I am trying to understand why it is constructed as a singleton so that I can use it in my project if required.
    Edited by: JavaFunda on Aug 28, 2011 3:51 AM
    Edited by: JavaFunda on Aug 28, 2011 3:56 AM

  • Records Management - Custom Attributes

    Hi all,
    I've created a Document Service provider in SRMREGEDIT in my RMS_ID. I archive the created documents with ORGANIZER in Documentum Archive Server.
    I have added a custom attribute to my Document type:
    1. I want this custom attribute to appear in the ORGANIZER search window, so I can search for documents through my custom attribute. How can I do this?
    2. I also want to save my custom attribute in Documentum as an attribute. How can I save SAP attributes in Documentum?
    Thanks in advance,
    Regards.
    Urtzi.

    Hi Urtzi,
    for custom attributes you need to create a Content Model (Customizing IMG). The Content Model description needs to be in the connection parameter values of your element type Document. Then you have to customize your attributes in the DMWB (Document Modelling Workbench): you should find your content model under the entity SRM; there you need to look for your content model ID in documents, take the virtual class marked with "V" in the PHIO and LOIO classes, and in the instance attributes you can finally customize your attributes by clicking the button "more". You have to hide all attributes you don't need except SRM_DOCUMENT_ID. If you want to add your own attributes you need to add them under IO attributes first. When you restart your electronic desk you should see the attributes you customized in the document, and you are also able to search for these attributes now.
    You save these attributes for your documents by writing the Content Model ID into the connection parameter values (Document_Class) of your element type Document. If you have different documents that need different attributes, you need to create a new Content Model.
    Hope that helps!
    Regards, Cornelia

  • Com.tangosol.util.AssertionException

    Getting this exception intermittently in my storage nodes (total of 4):
    23:54:16,092 ERROR [Logger@9260286 3.6.0.2] cluster (log:3) - 2012-05-01 23:54:16.087/2394.969 Oracle Coherence GE 3.6.0.2 <Error> (thread=DistributedCache:MessageDistributedCacheService,
                   member=7): Terminating PartitionedCache due to unhandled exception: com.tangosol.util.AssertionException
    23:54:16,126 ERROR [Logger@9260286 3.6.0.2] cluster (log:3) - 2012-05-01 23:54:16.087/2394.969 Oracle Coherence GE 3.6.0.2 <Error> (thread=DistributedCache:MessageDistributedCacheService,
                   member=7):
    com.tangosol.util.AssertionException:
         at com.tangosol.coherence.Component._assertFailed(Component.CDB:12)
         at com.tangosol.coherence.Component._assert(Component.CDB:3)
         at com.tangosol.coherence.component.net.message.requestMessage.DistributedCacheRequest$Poll.onCompletion(DistributedCacheRequest.CDB:15)
         at com.tangosol.coherence.component.net.Poll.close(Poll.CDB:13)
         at com.tangosol.coherence.component.net.Poll.onResponded(Poll.CDB:32)
         at com.tangosol.coherence.component.net.Poll.onResponse(Poll.CDB:3)
         at com.tangosol.coherence.component.net.message.requestMessage.DistributedCacheRequest$Poll.onResponse(DistributedCacheRequest.CDB:16)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.CDB:26)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:33)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
         at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
         at java.lang.Thread.run(Thread.java:619)
    Version from the manifest:
    Specification-Version: 3.6.0.2
    Implementation-Build: 18470
    This happens under the following scenario:
    I have 4 other nodes that are concurrently writing to the same cache using individual put() operations, about 200k objects in total
    Anybody have any ideas?
    Thanks

    Hi Carl,
    I guess the issue occurs when a limit is being exceeded.
    You could probably try increasing the packet size and running the test again.
    For more info:
    http://wiki.tangosol.com/display/COH33UG/Production+Checklist#ProductionChecklist-LargeClusterConfiguration
    Let me know the results.
    Thanks,
    Ashish
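    Separately from the assertion itself, one common adjustment to the load pattern described above (a hedged sketch, not a confirmed fix) is to batch the ~200k entries into putAll() calls instead of individual put() operations, which reduces the number of requests in flight. The cache name and batch size below are placeholders.

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import java.util.HashMap;
    import java.util.Map;

    // Minimal sketch: group entries into batches and load them with putAll(),
    // cutting the round trips compared to one request per object.
    public class BatchLoader {

        public static void load(Map<String, Object> data) {
            NamedCache cache = CacheFactory.getCache("MessageCache"); // hypothetical cache name
            Map<String, Object> batch = new HashMap<String, Object>();
            for (Map.Entry<String, Object> entry : data.entrySet()) {
                batch.put(entry.getKey(), entry.getValue());
                if (batch.size() >= 500) {   // send 500 entries per request
                    cache.putAll(batch);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                cache.putAll(batch);
            }
        }
    }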
