TopLink Insert Update Issue

Hi,
From my ADF page I am calling a TopLink query to update a record if it exists, or insert a new one otherwise. The first time, if the record doesn't exist, a new record is inserted into the database table; on the next call the record is updated. But if I then manually delete the record from the table and run the query again from my ADF page, the record is never re-inserted, even though it should be, since I deleted it before running the query again. Any idea what might be wrong? It looks like something is cached, because when the query runs again it never creates the record. Likewise, if I don't delete the record but manually change its data and run the query again, it doesn't update the record either. It seems to work only the first time (insert, then update); after that, requests from the ADF page never change anything in the table, although each request should insert or update a record. Here is the code I have:
    Session session = getSessionFactory().acquireSession();
    UnitOfWork uow = session.acquireUnitOfWork();
    ExpressionBuilder builder = new ExpressionBuilder();
    Expression expression = builder.get("stateName").equal("TX")
            .and(builder.get("deptDesc").equal("HR"))
            .and(builder.get("id").equal("1234"))
            .and(builder.get("user").equal("XASW"))
            .and(builder.get("mngCont").equal("3A45"));
    DepartmentDescription dd =
            (DepartmentDescription) uow.readObject(DepartmentDescription.class, expression);
    if (dd == null) {
        dd = new DepartmentDescription();
        dd.setStateName("TX");
        dd.setId("1234");
        dd.setDeptDesc("HR");
        dd.setUser("XASW");
        dd.setMngCont("3A45");
        dd.setValue("Record Inserted");
    } else {
        dd.setValue("Record Updated");
    }
    uow.registerObject(dd);
    uow.commit();
    session.release();
Thanks

This is because the object is still in the TopLink session cache.

You can either invalidate the object in the TopLink cache (session.getIdentityMapAccessor().invalidateObject(object)), or use registerNewObject() to force the insert. If this is just a test, you could also clear the whole cache (session.getIdentityMapAccessor().initializeIdentityMaps()).

You could also disable caching by setting your descriptor to be isolated.

-- James : EclipseLink
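
The failure mode is easier to see in a toy model (this is NOT the TopLink API; the class and method names below are invented purely for illustration): a session-level identity map keeps returning the old instance by primary key, so a read still "finds" the manually deleted row until it is invalidated.

```python
# Toy model of a session-level identity map (NOT the TopLink API;
# names here are invented for illustration only).
class IdentityMap:
    def __init__(self, db):
        self.db = db        # dict standing in for the real table
        self.cache = {}     # pk -> cached object

    def read(self, pk):
        # Like a cached readObject(): the cache is consulted first,
        # so a row deleted directly in the database is still "found".
        if pk in self.cache:
            return self.cache[pk]
        obj = self.db.get(pk)
        if obj is not None:
            self.cache[pk] = obj
        return obj

    def invalidate(self, pk):
        # Like invalidateObject(): the next read goes to the database.
        self.cache.pop(pk, None)

db = {}
session = IdentityMap(db)
db["1234"] = {"id": "1234", "value": "Record Inserted"}
session.read("1234")          # populates the cache
del db["1234"]                # manual DELETE behind the ORM's back
stale = session.read("1234")  # still returns the cached object -> no insert happens
session.invalidate("1234")
fresh = session.read("1234")  # now None -> the insert branch runs again
```

This is why invalidating the object (or isolating the descriptor) makes the insert branch reachable again after a manual delete.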

Similar Messages

  • JDeveloper 10.1.2.3 Row Insert Update issue

    All,
We have a problem we hope you can help solve, using JDeveloper 10.1.2.3.0 with JHeadStart 10.1.3.3.87; it is easy to reproduce and it is a bad one.
    Create a simple web project > Create Entity/VO for one table > Enable JHeadStart >
    Use JHS to create Master Detail drill down single record
Run app > Open browser to the Master list > press the Create X button to bring up a new row. NOW open another browser TAB and do the same, so we have 2 forms open in INSERT mode.
Press Save on the 1st tab to trigger the insert - fine, it reloads the form after commit and the row is in the DB. THEN switch to the other tab and submit that one > this one will overwrite the record you entered first.
This is bad, and it happens often when we load the system with many users even without multiple tabs (multiple tabs just make it easy to test). Trapping doDML, we expect the 2nd insert request to be DML_INSERT but we get DML_UPDATE. Depending on the refresh setting on some attributes we get error JBO-25014 back, but with no refresh on insert/update you still get the same issue: only one row in the DB and the 1st row gets overwritten.
    If you have any ideas what is going on and how to solve this ASAP that would be great.
    Thank you
    Anthony

    Steven,
    Thanks for the entry. Yes, we realized that the two-tab browser scenario was the same session. The thing that baffled us was why our test without JHS allowed two inserts in a two-tab, same session situation whereas with JHS, the second insert did an update on the first record committed. We suspected it had something to do with how the current row pointer in the view cache/entity cache was being handled.
The main symptom we started with was that users in separate sessions in production are overwriting each other's records; and we agree with you, it is surprising, but we can reproduce it when traffic is "heavy" (more than four users usually) or the operations are rapid fire -- less than a minute between User 1 inserting a record and User 2 inserting a record. The second user's insert operation actually updates user 1's record even though the data for both inserts is different.
    We switched EnableTokenValidation to "true" and this "fixes" the two-tab, single session situation -- the second insert presents a row currency JBO error (much better than overwriting the first record). In a two-user session scenario, the effect is worse with this setting: the second user trying to insert actually updates a different record from the record the first user just inserted.
    This seems to point to users sharing a view/entity cache and it seems like that might be an ADF or maybe an OC4J bug -- we are using datasources on OAS and an HTTPS protocol for this app.
    So next we are going to try using a database procedure that overrides the Save operation on new (and updated) records. If the database procedure sees a negative ID (from DBSequence) it will insert regardless of the current record pointer. If the database procedure sees a positive ID, it will update that record. I think we'll also need to roll back the CreateInsert that got us to the new record screen, too. Messy, yes, but if this effect is due to an ADF 10.1.3 bug, rewriting the app in 11g would take much longer.
    Our other ADF JHS 10.1.3 apps do not have this problem as far as we know, but they do not use datasources or HTTPS.
    Peter

  • Weird Radio button issue on Insert/Update form

    I recently installed Dreamweaver CS4 and am having a heck of a time with radio buttons on my Dynamic Insert/Update form.
    For some reason, when I update a record, the radio button I had previously selected, is NOT checked.  However, the value is in the mySQL db.
    I've tried this in several different folders, thinking there was a conflict (css, js or whatever) but have had the same issue every time.  I even updated my DDT includes folder but still no luck.
    Has anyone had an issue like this with Dreamweaver CS4 and know how to solve it?
Also, I should probably add that Dreamweaver CS4 has crashed at least 3 times per day since I installed it less than a week ago.  My computer easily meets the required specs (3.39GHz, 2GB RAM, 75GB hard drive).  Something is obviously wrong; whether it's my computer or CS4, I'm not sure.
    Any help is appreciated!
    Thanks!
    Peter T

    OK, I've figured out what the issue is but not why.
See the code below.  For some reason, where it should be {echo "checked";}, it's {echo "@@checked@@";}.
    Anyone else run into this situation?
                <td><div>
                  <input <?php if (!(strcmp(KT_escapeAttribute($row_rssample_sam['answer_sam']),"1"))) {echo "@@checked@@";} ?> type="radio" name="answer_sam_<?php echo $cnt1; ?>" id="answer_sam_<?php echo $cnt1; ?>_1" value="1" />
                  <label for="answer_sam_<?php echo $cnt1; ?>_1">Yes</label>
                </div>
                  <div>
                    <input <?php if (!(strcmp(KT_escapeAttribute($row_rssample_sam['answer_sam']),"0"))) {echo "@@checked@@";} ?> type="radio" name="answer_sam_<?php echo $cnt1; ?>" id="answer_sam_<?php echo $cnt1; ?>_2" value="0" />
                    <label for="answer_sam_<?php echo $cnt1; ?>_2">No</label>
                  </div>
                  <?php echo $tNGs->displayFieldError("sample_sam", "answer_sam", $cnt1); ?></td>

  • Issue using SQL stored procedure to insert/update

With help I finally managed to execute the stored procedure to insert/update the SQL database, with the below stored procedure:
ALTER PROCEDURE [dbo].[uspInsertorUpdate]
    @dp char(32),
    @dv char(32),
    @e_num char(12),
    @mail varchar(50),
    @emerg char(32),
    @opt1 char(16),
    @stat char(20),
    @e_id char(35),
    @e_tit varchar(64),
    @e_date datetime
AS
BEGIN
    SET NOCOUNT ON;
    IF EXISTS (SELECT 1 FROM [dbo].[sampleemployee] WHERE e_id = @e_id)
    BEGIN
        UPDATE [dbo].[sampleemployee]
        SET dp = @dp,
            dv = @dv,
            e_num = @e_num,
            mail = @mail,
            emerg = @emerg,
            opt1 = @opt1,
            stat = @stat,
            e_tit = @e_tit,
            e_date = @e_date
        WHERE e_id = @e_id
    END
    ELSE
    BEGIN
        INSERT INTO [dbo].[sampleemployee] (dp, dv, e_num, mail, emerg, opt1, stat, e_id, e_tit, e_date)
        VALUES (@dp, @dv, @e_num, @mail, @emerg, @opt1, @stat, @e_id, @e_tit, @e_date);
    END
END;
But the issue here is that it inserts only one row and then keeps updating that same row, even when a number of rows need to be inserted. Not sure why.

    Hi Sid_siv,
    To pass a table value to stored procedure, you can refer to the sample query below.
create type FileDetailsType as table
(
    FileName varchar(50),
    CreatedDate varchar(50),
    Size decimal(18,0)
);
go
create procedure InsertFileDetails
    @FileDetails FileDetailsType readonly
as
insert into FileDetails (FileName, CreatedDate, Size)
select FileName, CreatedDate, Size
from @FileDetails;
    Reference
    http://www.codeproject.com/Articles/22392/SQL-Server-Table-Valued-Parameters
    http://forum.codecall.net/topic/75547-sql-server-2008-passing-table-parameter-to-stored-procedure/
    Regards,
    Charlie Liao
    TechNet Community Support
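
The table-valued-parameter suggestion above is really about making the operation set-based instead of row-by-row. A rough, hedged illustration of the same idea in Python with SQLite standing in for SQL Server (table and column names are simplified stand-ins, and SQLite's ON CONFLICT clause, available since 3.24, replaces the IF EXISTS test):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sampleemployee (e_id TEXT PRIMARY KEY, e_tit TEXT)")

rows = [("1", "Engineer"), ("2", "Analyst"), ("3", "Manager")]

# One set-based call handles the whole batch: each row is inserted,
# or updated in place if its key already exists.
upsert_sql = (
    "INSERT INTO sampleemployee (e_id, e_tit) VALUES (?, ?) "
    "ON CONFLICT(e_id) DO UPDATE SET e_tit = excluded.e_tit"
)
conn.executemany(upsert_sql, rows)

# Re-running with changed data updates instead of failing or duplicating.
conn.executemany(upsert_sql, [("2", "Senior Analyst")])

count = conn.execute("SELECT COUNT(*) FROM sampleemployee").fetchone()[0]
title = conn.execute(
    "SELECT e_tit FROM sampleemployee WHERE e_id = '2'"
).fetchone()[0]
```

Passing the whole set in one call (a TVP in SQL Server, executemany here) is what lets the procedure touch every row instead of just the first one.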

  • Query on Correct Toplink Usage for insert/update

    Hi,
I have a JSF-based web application wherein my backing bean has the TopLink-generated entities as managed properties, i.e. the TopLink entity is populated by JSF value binding.
Now, when I want to insert/update this entity, what approach should be followed (keeping in mind that the entities are detached entities)?
I am using TopLink ORM v10.1.3. I was thinking that I would have to call mergeClone for the update and registerObject for the insert. But can I use mergeClone for both insert and update?
Please also mention the performance implications of the suggested option.
    Regards,
    Ani


  • Insert/Update/Delete Non-PO Invoice Line Item via FM/BAPI?

    Does anyone know of a way to insert/update/delete an Invoice Line item (Non-PO Accounting Invoice - Transaction FB60 or FV60) using a BAPI or Function Module (or set of function modules) using ABAP? I have been trying to find some code to accomplish this and am stuck on a couple of issues.
    I have found PRELIMINARY_POSTING_FB01 and PP_CHANGE_DOCUMENT_ENJ but both seem to submit the details to background processes. This is an issue because it gives the user a success message after execution but later delivers the error to Workflow. This is for an interfacing program so the results should be as real time as possible.
    Has anyone accomplished this via FM or BAPI and if so would you mind sharing your experiences?
    Thank you very much,
    Andy

    SG- Thank you for the reply.
    I have been playing with BAPI_INCOMINGINVOICE_PARK and I'm not sure if it is doing exactly what we want, but it is something that I have considered in the past. I plan on looking into BAPI_ACC_INVOICE_RECEIPT_POST this morning, hopefully that will provide some more for us.
    If possible I'd like to avoid BDC sessions because this program could hypothetically interface with multiple SAP systems with different configurations.
    I will check into those FM's and thank you very much.

  • Insert, update and delete trigger over multiple Database Links

    Hello guys,
    first of all I'll explain my environment.
    I've got a Master DB and n Slave Databases. Insert, update and delete is only possible on the master DB (in my opinion this was the best way to avoid Data-inconsistencies due to locking problems) and should be passed to slave databases with a trigger. All Slave Databases are attached with DBLinks. And, additional to this things, I'd like to create a job that merges the Master DB into all Slave DB's every x minutes to restore consistency if any Error (eg Network crash) occurs.
    What I want to do now, is to iterate over all DB-Links in my trigger, and issue the insert/update/delete for all attached databases.
    This is possible with the command "execute immediate", but requires me to create textual strings with textually coded field values for the above mentioned commands.
    What I would like to know now, is, if there are any better ways to provide these functions. Important to me is, that all DB-Links are read dynamically from a table and that I don't have to do unnecessary string generations, and maybe affect the performance.
    I'm thankful for every Idea.
    Thank you in advance,
    best regards
    Christoph

    Well, I've been using mysql for a long time, yes, but I thought that this approach would be the best for my requirements.
    Materialized View's don't work for me, because I need real-time updates of the Slaves.
    So, sorry for asking that general, but what would be the best technology for the following problem:
I've got n globally spread systems. Each of them can update records in the database. The easiest way would be to provide one central DB, but that doesn't work for me, because when the WAN connection fails, the system isn't available any longer. So I need to provide core information locally at every system (connected via LAN).
    Very important to me is, that Data remain consistent. That means, that it must not be that 2 systems update the same record on 2 different databases at the same time.
    I hope you understand what I'd need.
    Thank you very much for all your replies.
    best regards
    Christoph
PS: I forgot to mention that the databases won't be very large, just about 20k records and about 10 queries per second during peak times, and there's just the need to sync 1 table.
    Edited by: 907142 on 10.01.2012 23:14

  • Oracle 11g: Oracle insert/update operation is taking more time.

    Hello All,
    In Oracle 11g (Windows 2008 32 bit environment) we are facing following issue.
    1) We are inserting/updating data on some tables (4-5 tables and we are firing query with very high rate).
    2) After sometime (say 15 days with same load) we are feeling that the Oracle operation (insert/update) is taking more time.
    Query1: How to find actually oracle is taking more time in insert/updates operation.
    Query2: How to rectify the problem.
    We are having multithread environment.
    Thanks
    With Regards
    Hemant.

    Liron Amitzi wrote:
    Hi Nicolas,
    Just a short explanation:
    If you have a table with 1 column (let's say a number). The table is empty and you have an index on the column.
    When you insert a row, the value of the column will be inserted to the index. To insert 1 value to an index with 10 values in it will be fast. It will take longer to insert 1 value to an index with 1 million values in it.
    My second example was if I take the same table and let's say I insert 10 rows and delete the previous 10 from the table. I always have 10 rows in the table so the index should be small. But this is not correct. If I insert values 1-10 and then delete 1-10 and insert 11-20, then delete 11-20 and insert 21-30 and so on, because the index is sorted, where 1-10 were stored I'll now have empty spots. Oracle will not fill them up. So the index will become larger and larger as I insert more rows (even though I delete the old ones).
The solution here is simply to rebuild the index once in a while.
    Hope it is clear.
    Liron Amitzi
    Senior DBA consultant
    [www.dbsnaps.com]
[www.orbiumsoftware.com]

Hmmm, index space not reused? Index rebuild once in a while? That is what I understood from your previous post, but nothing is less sure.
This is a misconception of how indexes work.
I would suggest reading the following interesting doc; there are a lot of nice examples (including index space reuse) to understand, and in conclusion:
http://richardfoote.files.wordpress.com/2007/12/index-internals-rebuilding-the-truth.pdf
"Index Rebuild Summary
• The vast majority of indexes do not require rebuilding
• 'Oracle B-tree indexes can become unbalanced and need to be rebuilt' is a myth
• 'Deleted space in an index is deadwood and over time requires the index to be rebuilt' is a myth
• 'If an index reaches x number of levels, it becomes inefficient and requires the index to be rebuilt' is a myth
• 'If an index has a poor clustering factor, the index needs to be rebuilt' is a myth
• 'To improve performance, indexes need to be regularly rebuilt' is a myth"
Good reading,
Nicolas.

  • Does the sequence of Insert/Update matter?

    Dear friends
    I wanted to know whether there is any difference in results if I change the sequence of Insert/Update statements. For example, if I'm inserting / updating records in a table, normally I write my code as below
<code>
Begin
  Insert Into table_name (...)
  Values (...);
Exception
  When Dup_Val_On_Index Then
    Update table_name
    Set column_name = value;
  When Others Then
    Dbms_Output.Put_Line(Sqlerrm);
End;
</code>
    Lets assume, if I change the order of statements as below:
<code>
Begin
  Update table_name
  Set column_name = value;
  If Sql%NotFound Then
    Insert Into table_name (...)
    Values (...);
  End If;
Exception
  When Others Then
    Dbms_Output.Put_Line(Sqlerrm);
End;
</code>
    So does this change make any difference in net results?

Yes... until this morning it made no difference to me either, but today I was facing an insertion problem in a procedure where the insert statement had to go into the when_dup_val exception part after the first record and execute the update statement. But surprisingly, it was going into when_dup_val only for the 2nd record: the 1st record was inserted as a new record, then the 2nd record had the same primary key values, so when_dup_val caused the update statement to execute; but for the 3rd record, with the same primary key values, instead of executing the update statement, the insertion was going into the when_others exception part and raising error ORA-01008.
After very detailed hit and trial, I just reversed the operations, put the update statement before the insert in my code, and then everything worked. The procedure is executing fine.
Although I've resolved the issue, I couldn't digest this change; that's why I'm asking you folks about the mechanics behind it.
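
The update-first ordering from the question can be sketched in a self-contained way. This is a hedged Python/SQLite stand-in for the PL/SQL (table and column names are invented): try the UPDATE, and only if it touched no row fall back to the INSERT, so no exception juggling is needed on repeated runs:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (pk TEXT PRIMARY KEY, val TEXT)")

def upsert(pk, val):
    # Update-first ordering: attempt the UPDATE, and only if it touched
    # no row (rowcount == 0) perform the INSERT.
    cur = conn.execute("UPDATE t SET val = ? WHERE pk = ?", (val, pk))
    if cur.rowcount == 0:
        conn.execute("INSERT INTO t (pk, val) VALUES (?, ?)", (pk, val))

upsert("1234", "first")   # no row yet -> insert path
upsert("1234", "second")  # row exists -> update path
upsert("1234", "third")   # update path again, on every subsequent call

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
val = conn.execute("SELECT val FROM t WHERE pk = '1234'").fetchone()[0]
```

This mirrors the Sql%NotFound check: the update-first form behaves the same on the 2nd, 3rd, and nth call, which the insert-first/exception form does not always guarantee.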

  • SQL SERVER BULK FETCH AND INSERT/UPDATE?

    Hi All,
           I am currently working with C and SQL Server 2012. My requirement is to Bulk fetch the records and Insert/Update the same in the other table with some  business logic?
           How do i do this?
           Thanks in Advance.
    Regards
    Yogesh.B

> is there a possibility that I can do a bulk fetch and place it in an array, even inside a stored procedure?
You can use temporary tables or table variables, and have them indexed as well.
> After I have processed my records, tell me a way that I will NOT go RECORD by RECORD, even inside a stored procedure?
As I said earlier, you can perform UPDATEs on these temporary tables or table variables and finally INSERT/UPDATE your base table.
> Arrays are used just to minimize the traffic between the server and the program area. They are used for efficient processing.
In your case you will first have to populate the array (using some of your queries from the server), which means you will first load the array, do some updates, and then send the rows back to the server, hence the network engagement.
So I just gave you some thoughts I feel could be useful for your implementation. Like we say, there are many ways, so pick the one that works well for you in the long run with good scalability.
    Good Luck! Please Mark This As Answer if it solved your issue. Please Vote This As Helpful if it helps to solve your issue

  • Sender JDBC Adapter Select/Update Issue

    Dear All,
    We have configured a Sender JDBC Adapter to Poll data from the DB2 tables. It is working fine and both the select and the update queries written are also getting properly executed and are changing the status of the flag from Y to N once read from database.
    In the communication channel ->
    select * from <table> where flag = 'N'.
    update <table> set flag = 'Y' where flag = 'N'.
But I have one doubt: if, after the select query executes, some new data with status flag 'N' comes into the table, will this unselected data also be updated to 'Y'?
The question is: while we do a select and update from XI on the DB table, and at the same time there is an insert happening into the table from the other end, how will the adapter behave in this case? Will it result in some records being missed during the next select/update transaction from XI?
    Your inputs will be appreciated.
    Regards
    Amit

    Amit
    Did you ever get a solution to your question ?
       Sender JDBC Adapter Select/Update Issue  
    Posted: Apr 24, 2008 2:29 PM           Reply 
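
The race described in this thread can be avoided by claiming rows before reading them: first mark the pending rows with a transient in-process value, then select only the marked rows, so inserts that arrive in between are left for the next poll. A hedged Python/SQLite sketch (the flag column follows the post; the 'P' in-process state is an invented convention, not a JDBC adapter feature):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE msgs (id INTEGER PRIMARY KEY, flag TEXT)")
conn.executemany("INSERT INTO msgs (id, flag) VALUES (?, 'N')", [(1,), (2,)])

# Step 1: claim the current batch by marking it 'P' (in process).
conn.execute("UPDATE msgs SET flag = 'P' WHERE flag = 'N'")

# A new row arrives AFTER the claim but BEFORE the select.
conn.execute("INSERT INTO msgs (id, flag) VALUES (3, 'N')")

# Step 2: read and finalize only the claimed rows.
batch = [r[0] for r in
         conn.execute("SELECT id FROM msgs WHERE flag = 'P' ORDER BY id")]
conn.execute("UPDATE msgs SET flag = 'Y' WHERE flag = 'P'")

# Row 3 keeps flag 'N' and is picked up by the next poll, not lost.
leftover = conn.execute(
    "SELECT COUNT(*) FROM msgs WHERE flag = 'N'"
).fetchone()[0]
```

With the original select-then-update pair, row 3 would have been flipped to 'Y' without ever being selected; claiming first removes that window.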

  • EF Inserts/Updates: Ridiculously Slow

I cracked open the latest version of EF the other day. I had a project where I wanted to insert a lot of data in one hit. The number of records I wanted to insert was about 2,000,000. There are about 8 decimal columns in the table. So, I wrote the code in EF, with an "Add" and then a "SaveChanges". What was taking an hour and a half to write to XML suddenly jumped to an estimated time of 48 hours. (Note: most of the 1 1/2 hours is downloading data from the internet.)
So I put a time trace in to see where each action was spending its time. The "Add" (putting the record of the type I want to insert into the collection) was taking about 300 milliseconds and the "SaveChanges" was taking about 500 milliseconds.
This is simply absurd.
When I changed the code to avoid using EF, i.e. executing a straight INSERT statement against the SQL Server database, the time went back down to about 1 1/2 hours. I found that the insert call was only taking about 2-3 milliseconds.
I thought perhaps EF was doing too much at the database level. So, I traced SQL Server. To my surprise, I found that EF was only executing insert statements in the same way I was executing the insert statement. So, why so slow? What's wrong with EF and when will it be fixed? Is it even possible to use EF?

Several things are important here. First of all: what do you want to do, using which tool, and how do you implement it.
Concerning the what and which tool: several people have already stated that an ORM is not the tool of choice when a bulk insert is what you actually want to do. EF has tons of great features but it's simply not a solution for every problem.
Then there's the question of how you use EF. Two important things that come to mind here are the fact that by default EF tracks all entities that are attached to it, and that a call to SaveChanges will result in the execution of one insert/update statement for each of the changed entities.
Let's assume that EF was applied in this scenario in the most simple - and naive - way imaginable. We create a context, we add 2M entities, we call SaveChanges. Then we can take a very long break... Now let's take a look at what happened here. 2,000,000 entities were attached to the context and it will be tracking all of them for changes - that's a lot to handle! Once all entities are attached, the SaveChanges method is called. At this point all modified/new entities are persisted to the database. But EF will actually do this using 2M single insert statements. Ouch.
Now there are two issues here. One is the fact that EF executes only one update/insert statement at a time. You can deal with this using EF extensions that actually implement the bulk insert you're looking for. The second issue is the number of entities attached to the context. You can significantly increase performance by lowering the number of attached entities. You could go about this by batching: for the first set of 500 entities create a new context, add and save the entities; then create a new context for the next 500, and so on.
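
The batching advice above can be sketched concretely. This is a hedged Python/SQLite stand-in for the EF pattern (the per-batch "context" is reduced to inserting a bounded chunk and committing it; names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE measurements (id INTEGER PRIMARY KEY, v REAL)")

def save_in_batches(rows, batch_size=500):
    # Analog of "new context per batch": persist a bounded chunk,
    # commit it, and move on, instead of tracking 2M pending entities
    # in one unit of work.
    for start in range(0, len(rows), batch_size):
        chunk = rows[start:start + batch_size]
        conn.executemany(
            "INSERT INTO measurements (id, v) VALUES (?, ?)", chunk)
        conn.commit()

rows = [(i, i * 0.5) for i in range(1, 2001)]  # 2000 rows -> 4 batches of 500
save_in_batches(rows)

total = conn.execute("SELECT COUNT(*) FROM measurements").fetchone()[0]
```

The batch size of 500 is only the figure suggested in the reply; the right value depends on row width and change-tracking overhead in the real stack.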

  • Any KM which can insert , update and delete

    hi everyone ,
I have an Oracle target which I want to keep synchronized daily with a SQL Server source, which means that if there is any deletion in the source, I want a deletion in the target too. Hope I am clear.
Is there any KM that can do insert, update and delete? I know the other way is to truncate and load, but I have a few tables with millions of records, so truncating daily is not possible.
Is there any way out for this issue?
    Thanks

    Or
1) create an interface to load only the PK into a temp table (a yellow interface, as a suggestion)
2) create a procedure to delete from your target with a "not exists" condition against the created temp table
3) create a "normal" interface with Incremental Update
    Make any sense?
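
The three steps above can be sketched end to end. A hedged Python/SQLite illustration (table names are invented stand-ins for the ODI objects): load the source PKs into a temp table, delete target rows with no matching PK, then apply an incremental upsert:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (pk INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO target VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (3, "c")])

source = [(1, "a2"), (3, "c")]  # row 2 was deleted in the source

# 1) load only the PKs into a temp table
conn.execute("CREATE TEMP TABLE src_pk (pk INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO src_pk VALUES (?)", [(pk,) for pk, _ in source])

# 2) delete target rows with no matching PK ("not exists")
conn.execute("""DELETE FROM target
                WHERE NOT EXISTS
                  (SELECT 1 FROM src_pk WHERE src_pk.pk = target.pk)""")

# 3) incremental update (upsert) of the remaining source rows
conn.executemany(
    "INSERT INTO target (pk, val) VALUES (?, ?) "
    "ON CONFLICT(pk) DO UPDATE SET val = excluded.val",
    source,
)

rows = conn.execute("SELECT pk, val FROM target ORDER BY pk").fetchall()
```

This keeps the daily sync incremental: only the PK list crosses between systems for the delete step, which avoids truncate-and-load on the million-row tables.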

  • Importing ODI Procedure in Insert/Insert-Update mode

    Can we import an ODI procedure from one project to another in INSERT or INSERT-UPDATE mode?
    We are getting xml import error while doing this. But when we do the import in DUPLICATION mode, it successfully does so.
The issue is that we have an ODI procedure in the INT environment which is being used by several other packages. We changed its code in our DEV environment, and when we tried to import it into the INT environment through INSERT-UPDATE mode so that the changes would take effect, we got an error.
This is quite obviously because the folder/sub-folder IDs are different in the two environments, and this is causing the trouble.
So, is there any other way we can reflect the change in the INT environment with minimum effort? I mean, we don't want to import it in DUPLICATION mode... if we do so, we will have to re-map that procedure in all the packages and regenerate them.

You can use a variable to replace the value at runtime in the query, but you need to either pass the variable value at startup or refresh the variable value. But I don't think you can directly retrieve any topology setting as the variable value to be substituted in the query.

  • Toplink insert sequence

    Hi all:
Is there any sequence in which TopLink fires insert/update queries to the database if there is no object mapping?
Actually, I have domain classes which are mapped using the primary keys but not the actual objects.
    I am using toplink 9037.
    Regards,
    Viral

    TopLink will not insert or update objects that have no descriptor or mappings.
    If more information is required perhaps you could explain, with examples, what your object model looks like and what, exactly, it is that you are attempting to do.
--Gordon
