A question about transaction consistency between multiple target tables

Dear community,
My replication is Oracle 11.2.3.7 -> Oracle 11.2.3.7, both running on Linux x64, and the GG version is 11.2.3.0.
I'm recovering from an error that occurred when a trail file was moved away while the dpump was writing to it.
After moving the file back, the dpump abended with the following error:
2013-12-17 11:45:06  ERROR   OGG-01031  There is a problem in network communication, a remote file problem, encryption keys for target and source do
not match (if using ENCRYPT) or an unknown error. (Reply received is Expected 4 bytes, but got 0 bytes, in trail /u01/app/ggate/dirdat/RI002496, seqno 2496,
reading record trailer token at RBA 12999993).
I googled it and found no suitable solution other than trying "alter extract <dpump>, etrollover".
After rolling over the trail file, the replicat, as expected, ended with:
REPLICAT START 1
2013-12-17 17:56:03  WARNING OGG-01519  Waiting at EOF on input trail file /u01/app/ggate/dirdat/RI002496, which is not marked as complete;
but succeeding trail file /u01/app/ggate/dirdat/RI002497 exists. If ALTER ETROLLOVER has been performed on source extract,
ALTER EXTSEQNO must be performed on each corresponding downstream reader.
So I've issued "alter replicat <repname>, extseqno 2497, extrba 0" but got the following error:
REPLICAT START 2
2013-12-17 18:02:48 WARNING OGG-00869 Aborting BATCHSQL transaction. Detected inconsistent result:
executed 50 operations in batch, resulting in 47 affected rows.
2013-12-17 18:02:48  WARNING OGG-01137  BATCHSQL suspended, continuing in normal mode.
2013-12-17 18:02:48  WARNING OGG-01003  Repositioning to rba 1149 in seqno 2497.
2013-12-17 18:02:48 WARNING OGG-01004 Aborted grouped transaction on 'M.CLIENT_REG', Database error
1403 (OCI Error ORA-01403: no data found, SQL <UPDATE "M"."CLIENT_REG" SET "CLIENT_CODE" =
:a1,"CORE_CODE" = :a2,"CP_CODE" = :a3,"IS_LOCKED" = :a4,"BUY_SUMMA" = :a5,"BUY_CHECK_CNT" =
:a6,"BUY_CHECK_LIST_CNT" = :a7,"BUY_LAST_DATE" = :a8 WHERE "CODE" = :b0>).
2013-12-17 18:02:48  WARNING OGG-01003  Repositioning to rba 1149 in seqno 2497.
2013-12-17 18:02:48 WARNING OGG-01154 SQL error 1 mapping LS.CHECK to M.CHECK OCI Error ORA-00001:
unique constraint (M.CHECK_PK) violated (status = 1). INSERT INTO "M"."CHECK"
("CODE","STATE","IDENT_TYPE","IDENT","CLIENT_REG_CODE","SHOP","BOX","NUM","KIND","KIND_ORDER","DAT","SUMMA","LIST_COUNT","RETURN_SELL_CHECK_CODE","RETURN_SELL_SHOP","RETURN_SELL_BOX","RETURN_SELL_NUM","RETURN_SELL_KIND","INSERTED","UPDATED","REMARKS")
VALUES
(:a0,:a1,:a2,:a3,:a4,:a5,:a6,:a7,:a8,:a9,:a10,:a11,:a12,:a13,:a14,:a15,:a16,:a17,:a18,:a19,:a20).
2013-12-17 18:02:48  WARNING OGG-01003  Repositioning to rba 1149 in seqno 2497.
The report stated the following:
Reading /u01/app/ggate/dirdat/RI002497, current RBA 1149, 0 records
Report at 2013-12-17 18:02:48 (activity since 2013-12-17 18:02:46)
From Table LS.MK_CHECK to LSGG.MK_CHECK:
       #                   inserts:         0
       #                   updates:         0
       #                   deletes:         0
       #                  discards:         1
From Table LS.MK_CHECK to LSGG.TL_MK_CHECK:
       #                   inserts:         0
       #                   updates:         0
       #                   deletes:         0
       #                  discards:         1
At that time I came to the conclusion that using ETROLLOVER was not a good idea. Nevertheless, I had to upload my data to perform a consistency check.
My mapping templates are set up as follows:
LS.CHECK->M.CHECK
LS.CHECK->M.TL_CHECK
(such a mapping is set up for every table that is replicated).
TL_CHECK is what I call a transaction log,
and this peculiar mapping is as follows:
ignoreupdatebefores
map LS.CHECK, target M.CHECK, nohandlecollisions;
ignoreupdatebefores
map LS.CHECK, target M.TL_CHECK ,colmap(USEDEFAULTS,
FILESEQNO = @GETENV ("RECORD", "FILESEQNO"),
FILERBA = @GETENV ("RECORD", "FILERBA"),
COMMIT_TS = @GETENV( "GGHEADER", "COMMITTIMESTAMP" ),
FILEOP = @GETENV ("GGHEADER","OPTYPE"), CSCN = @TOKEN("TKN-CSN"),
RSID = @TOKEN("TKN-RSN"),
OLD_CODE = before.CODE
, OLD_STATE = before.STATE
, OLD_IDENT_TYPE = before.IDENT_TYPE
, OLD_IDENT = before.IDENT
, OLD_CLIENT_REG_CODE = before.CLIENT_REG_CODE
, OLD_SHOP = before.SHOP
, OLD_BOX = before.BOX
, OLD_NUM = before.NUM
, OLD_NUM_VIRT = before.NUM_VIRT
, OLD_KIND = before.KIND
, OLD_KIND_ORDER = before.KIND_ORDER
, OLD_DAT = before.DAT
, OLD_SUMMA = before.SUMMA
, OLD_LIST_COUNT = before.LIST_COUNT
, OLD_RETURN_SELL_CHECK_CODE = before.RETURN_SELL_CHECK_CODE
, OLD_RETURN_SELL_SHOP = before.RETURN_SELL_SHOP
, OLD_RETURN_SELL_BOX = before.RETURN_SELL_BOX
, OLD_RETURN_SELL_NUM = before.RETURN_SELL_NUM
, OLD_RETURN_SELL_KIND = before.RETURN_SELL_KIND
, OLD_INSERTED = before.INSERTED
, OLD_UPDATED = before.UPDATED
, OLD_REMARKS = before.REMARKS), nohandlecollisions, insertallrecords;
As the PK violation fired for CHECK, I changed nohandlecollisions to handlecollisions for the LS.CHECK->M.CHECK mapping and restarted the replicat.
To my surprise, it ended with the following error:
REPLICAT START 3
2013-12-17 18:05:55 WARNING OGG-00869 Aborting BATCHSQL transaction. Database error 1 (ORA-00001:
unique constraint (M.CHECK_PK) violated).
2013-12-17 18:05:55 WARNING OGG-01137 BATCHSQL suspended, continuing in normal mode.
2013-12-17 18:05:55 WARNING OGG-01003 Repositioning to rba 1149 in seqno 2497.
2013-12-17 18:05:55 WARNING OGG-00869 OCI Error ORA-00001: unique constraint (M.PK_TL_CHECK)
violated (status = 1). INSERT INTO "M"."TL_CHECK"
("FILESEQNO","FILERBA","FILEOP","COMMIT_TS","CSCN","RSID","CODE","STATE","IDENT_TYPE","IDENT","CLIENT_REG_CODE","SHOP","BOX","NUM","KIND","KIND_ORDER","DAT","SUMMA","LIST_COUNT","RETURN_SELL_CHECK_CODE","RETURN_SELL_SHOP","RETURN_SELL_BOX","RETURN_SELL_NUM","RETURN_SELL_KIND","INSERTED","UPDATED","REMARKS")
VALUES
(:a0,:a1,:a2,:a3,:a4,:a5,:a6,:a7,:a8,:a9,:a10,:a11,:a12,:a13,:a14,:a15,:a16,:a17,:a18,:a19,:a20,:a21,:a22,:a23,:a24,:a25,:a26).
2013-12-17 18:05:55 WARNING OGG-01004 Aborted grouped transaction on 'M.TL_CHECK', Database error 1
(OCI Error ORA-00001: unique constraint (M.PK_TL_CHECK) violated (status = 1). INSERT INTO
"M"."TL_CHECK"
("FILESEQNO","FILERBA","FILEOP","COMMIT_TS","CSCN","RSID","CODE","STATE","IDENT_TYPE","IDENT","CLIENT_REG_CODE","SHOP","BOX","NUM","KIND","KIND_ORDER","DAT","SUMMA","LIST_COUNT","RETURN_SELL_CHECK_CODE","RETURN_SELL_SHOP","RETURN_SELL_BOX","RETURN_SELL_NUM","RETURN_SELL_KIND","INSERTED","UPDATED","REMARKS")
VALUES
(:a0,:a1,:a2,:a3,:a4,:a5,:a6,:a7,:a8,:a9,:a10,:a11,:a12,:a13,:a14,:a15,:a16,:a17,:a18,:a19,:a20,:a21,:a22,:a23,:a24,:a25,:a26)).
2013-12-17 18:05:55  WARNING OGG-01003  Repositioning to rba 1149 in seqno 2497.
2013-12-17 18:05:55 WARNING OGG-01154 SQL error 1 mapping LS.CHECK to M.TL_CHECK OCI Error
ORA-00001: unique constraint (M.PK_TL_CHECK) violated (status = 1). INSERT INTO "M"."TL_CHECK"
("FILESEQNO","FILERBA","FILEOP","COMMIT_TS","CSCN","RSID","CODE","STATE","IDENT_TYPE","IDENT","CLIENT_REG_CODE","SHOP","BOX","NUM","KIND","KIND_ORDER","DAT","SUMMA","LIST_COUNT","RETURN_SELL_CHECK_CODE","RETURN_SELL_SHOP","RETURN_SELL_BOX","RETURN_SELL_NUM","RETURN_SELL_KIND","INSERTED","UPDATED","REMARKS")
VALUES
(:a0,:a1,:a2,:a3,:a4,:a5,:a6,:a7,:a8,:a9,:a10,:a11,:a12,:a13,:a14,:a15,:a16,:a17,:a18,:a19,:a20,:a21,:a22,:a23,:a24,:a25,:a26).
2013-12-17 18:05:55  WARNING OGG-01003  Repositioning to rba 1149 in seqno 2497.
I expected BATCHSQL to fail, because it does not support HANDLECOLLISIONS, but I really don't understand why any record was inserted into TL_CHECK and caused a PK violation: I thought that GG guarantees transactional consistency, meaning that any transaction that causes an error in _ANY_ of the target tables is rolled back for _EVERY_ target table.
TL_CHECK has its PK set to (FILESEQNO, FILERBA); in addition, I have a special column that captures the replication run number, and it clearly shows that the record causing the PK violation was inserted during the previous run (REPLICAT START 2).
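Something like the following query shows which run inserted the conflicting row (RUN_NO here is just a stand-in for that special run-number column; the real name differs):
SELECT fileseqno, filerba, run_no
  FROM m.tl_check
 WHERE fileseqno = 2497
 ORDER BY filerba;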
BTW, the report for this last run shows:
Reading /u01/app/ggate/dirdat/RI002497, current RBA 1149, 1 records
Report at 2013-12-17 18:05:55 (activity since 2013-12-17 18:05:54)
From Table LS.MK_CHECK to LSGG.MK_CHECK:
       #                   inserts:         0
       #                   updates:         0
       #                   deletes:         0
       #                  discards:         1
From Table LS.MK_CHECK to LSGG.TL_MK_CHECK:
       #                   inserts:         0
       #                   updates:         0
       #                   deletes:         0
       #                  discards:         1
So, could somebody explain how that could happen?

Write the query of the existing table as a function with PRAGMA AUTONOMOUS_TRANSACTION.
Examples here:
http://www.morganslilbrary.org/library.html
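For example, something along these lines (a rough sketch only; the function name and parameter are made up here, while the table and the CODE column come from the SQL in the error above):
CREATE OR REPLACE FUNCTION check_exists (p_code IN NUMBER)  -- datatype of CODE assumed
  RETURN NUMBER
IS
  PRAGMA AUTONOMOUS_TRANSACTION;
  v_cnt NUMBER;
BEGIN
  -- The query runs in its own transaction, independent of the caller's work.
  SELECT COUNT(*)
    INTO v_cnt
    FROM m."CHECK"
   WHERE "CODE" = p_code;
  COMMIT;  -- close the autonomous transaction before returning
  RETURN v_cnt;
END check_exists;
/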

Similar Messages

  • CDC Subscription question concerning transactional consistency

    Hi
    Hopefully a quick question: can anyone confirm whether or not it is the subscription that controls transactional consistency when accessing change records in the change views?
    For example, if you have 20 source tables that you are capturing changes for, and you create one change source, one change set containing 20 change tables, and one subscription for which you have 20 change views, then, because the subscription is what you specify when performing the extend and purge operations, it is that one subscription that ensures that when an extend is issued, all change records across the 20 change views are transactionally consistent.
    I have had an alternative design proposed to me that is to use 20 separate subscriptions - one for each source table and change table. My concern is that this will not ensure transactional consistency across the 20 tables and that any ETL design (for example 20 separate threads running in parallel and each doing an extend, process, purge sequence) cannot ensure that the change records in the change views correspond to the same transaction across the tables in the source database.
    I hope that this is clear - any views and opinions on this will be very gratefully received.
    Many thanks
    Pete

    >
    Apologies if this appears to be belabouring the point - but it is an important bit of understanding for me.
    >
    The issue is not that you are belabouring the point, but that you are not reading the doc quote I cited or the last paragraph of my last reply.
    Creating a consistent set of data and USING (querying) a consistent subset of that data are two different things. The publisher is responsible for creating a change set that includes the data you will want and the change set will make sure that a consistent set of data is available.
    Whether a subscriber makes proper use of that change set data or not is another thing altogether.
    If you create 20 subscriptions then those are totally independent of one another, just like 20 people subscribing to the Wall Street Journal; those subscribers and subscriptions have NOTHING to do with one another. If you want to try to synchronize the 20 yourself, have at it, but as far as Oracle is concerned each subscriber is unique and independent.
    If you want to subscribe to, and use, a consistent subset of the change set then, as the doc quote said, you have to JOIN the tables together.
    Read the documentation - you can't understand it if you don't read the doc and go through the examples in it.
    Step 4 Subscribe to a source table and the columns in the source table shows you exactly how you have to do the join.
    The second step of that example says this
    >
    Therefore, if the subscriber wants to subscribe to columns in both publications, using EMPLOYEE_ID to join across the subscriber views, then the subscriber must use two calls, each specifying a different publication ID:
    >
    '. . . join across the subscriber views . . .'
    Don't be afraid of breaking Oracle by trying things!
    The SCOTT EMP and DEPT tables might currently have a consistent set of data in the two tables. But if I query EMP from one session (subscription) and query DEPT from another session (subscription), the results I get might not be consistent between them, because I used two independent operations to get the data and the data might have changed in between the two steps.
    However if I query DEPT and EMP using a JOIN then Oracle guarantees that the data from both tables will reflect the same instant in time.
    Not a perfect analogy, but close to the subscription use. If you want to subscribe to data from multiple tables/views in the change set AND get a consistent set of data, you need to join the tables/views together. The mechanism for doing this is not the same as SQL, but is shown in the above example in the doc.
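    To make that concrete, here is a rough sketch of the single-subscription approach (all names are invented for illustration; see the DBMS_CDC_SUBSCRIBE examples in the Change Data Capture documentation for the authoritative calls):
    BEGIN
      -- One subscription over the change set...
      DBMS_CDC_SUBSCRIBE.CREATE_SUBSCRIPTION(
        change_set_name   => 'MY_CHANGE_SET',
        description       => 'One consistent window over all tables',
        subscription_name => 'MY_SUB');
      -- ...with one SUBSCRIBE call per source table (two shown, EMP and DEPT).
      DBMS_CDC_SUBSCRIBE.SUBSCRIBE(
        subscription_name => 'MY_SUB',
        source_schema     => 'SCOTT',
        source_table      => 'EMP',
        column_list       => 'EMPNO, ENAME, DEPTNO',
        subscriber_view   => 'EMP_CV');
      DBMS_CDC_SUBSCRIBE.SUBSCRIBE(
        subscription_name => 'MY_SUB',
        source_schema     => 'SCOTT',
        source_table      => 'DEPT',
        column_list       => 'DEPTNO, DNAME',
        subscriber_view   => 'DEPT_CV');
      DBMS_CDC_SUBSCRIBE.ACTIVATE_SUBSCRIPTION(subscription_name => 'MY_SUB');
      -- Each ETL cycle: one EXTEND_WINDOW, read/join EMP_CV and DEPT_CV,
      -- then one PURGE_WINDOW, so both views reflect the same consistent window.
      DBMS_CDC_SUBSCRIBE.EXTEND_WINDOW(subscription_name => 'MY_SUB');
      -- ... query the subscriber views here, joining on DEPTNO ...
      DBMS_CDC_SUBSCRIBE.PURGE_WINDOW(subscription_name => 'MY_SUB');
    END;
    /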

  • The Question about stock transfer between HU-Management and WM-Management

    Hi,
    There is a scenario involving a stock transfer between an HU-managed and a WM-managed storage location. I use transaction MB1B, movement types 313 and 315. After the goods issue from the WM-managed storage location, an outbound delivery is generated; then I pack, create/confirm the TO, and finally post goods issue for the outbound delivery. But when I do movement type 315, there is a warning message "Data of preceding document was not transmitted", and from the F1 help I find this system response: "You can maintain an indicator that makes information about preceding documents in this delivery available under delivery type in Customizing. For some characteristics of this indicator, the type of preceding document and the related document and item numbers must be transmitted to delivery creation. At least one of these parameters is missing.".
    So, my questions are:
    1. Generally, an outbound delivery is created from an SO and an inbound delivery is created with reference to a PO, but how does the two-step stock transfer generate the outbound delivery and the inbound delivery? Could you please tell me where I can configure this in the IMG?
    2. What does "Data of preceding document was not transmitted" mean? How do I fix this issue?
    Best Regards
    Boxer Du
    I am an SAP fan, focused on MM and WM. I am interested in TRM Yard Management and Cross Docking now. I would be very glad to talk to you about these areas. I want to exchange knowledge with you and would like to be a good friend of yours. Please contact me; you can find my MSN in my profile. Thanks.

    Hi,
    Sure, the inbound delivery type is set up in the IMG -> Logistics General -> HU Management -> Basics -> Delivery Type -> Delivery Type Determination.
    For inbound deliveries, delivery type 'HID' is maintained in this view.
    Best Regards.
    Anyone who wants to discuss the details can contact me. Thanks.

  • Some questions about the integration between BIEE and EBS

    Hi, dear,
    I'm new to BIEE. These days I have had a look at the BIEE architecture and the BIEE components. In the next project there is some work on BIEE development based on an EBS application. I have some questions about the integration:
    1) Generally, are the BIEE database and application server separate from the EBS database and application? Can both BIEE 10g and 11g be integrated with EBS R12?
    2) In the BIEE Administration Tool, the first step is to create physical tables. If the source application is EBS, is it still necessary to create the physical tables?
    3) If the physical tables need to be created, how is the data transfer from the EBS source tables to the BIEE physical tables completed? Which ETL tool do most developers prefer: Warehouse Builder or Oracle Data Integrator?
    4) During the data transfer phase, if there is a very large volume of data to transfer, how is completeness maintained? For example, if 1 million rows need to be transferred from the source database to the BIEE physical tables and users open a BIEE report when 50% is completed, can they see the new 50% of the data on the reports? Is there some transaction control in the ETL phase?
    Could anyone give me some guidance? I would also appreciate any other information you can give.
    Thanks in advance.

    1) Generally, are the BIEE database and application server separate from the EBS database and application? Can both BIEE 10g and 11g be integrated with EBS R12?
    You should consider OBI Applications here, which uses OBIEE as a reporting tool with different pre-built modules. Both 10g and 11g come with different versions of BI Apps, which support sources like Siebel CRM, EBS, PeopleSoft, JD Edwards, etc.
    2) In the BIEE Administration Tool, the first step is to create physical tables. If the source application is EBS, is it still necessary to create the physical tables?
    This is independent of any source. This is OBIEE modeling: creating the RPD with all its layers. If you build it from scratch, you will need to create all the layers; if BI Apps is used, you get a pre-built RPD along with the other pre-built components.
    3) If the physical tables need to be created, how is the data transfer from the EBS source tables to the BIEE physical tables completed? Which ETL tool do most developers prefer: Warehouse Builder or Oracle Data Integrator?
    BI Apps comes with pre-built ETL mappings, mainly for use with Informatica. Only BI Apps 7.9.5.2 comes with ODI, but Oracle plans to have only ODI for further releases.
    4) During the data transfer phase, if there is a very large volume of data to transfer, how is completeness maintained? For example, if 1 million rows need to be transferred from the source database to the BIEE physical tables and users open a BIEE report when 50% is completed, can they see the new 50% of the data on the reports? Is there some transaction control in the ETL phase?
    Users will still see the old data, because it is good practice to turn on the cache and purge it after every load.
    Refer to http://www.oracle.com/us/solutions/ent-performance-bi/bi-applications-066544.html
    and many more docs on Google.
    Hope this helps

  • Question About Sharing Apps Between My iPhone and My Wife's iPhone

    Hi, I have a question about downloading apps to my iPhone.
    I ordered two iPhone 3g's - one for my wife and one for me, and I've discovered that when I download an app (even free ones) on my iPhone and then later sync it to iTunes, my wife cannot put it on her phone. However if I download it on iTunes on my computer and later sync it to my phone, then my wife is able to sync it to her phone as well. It's no big deal when they are free because she can just go get it if she wants it. I haven't purchased any apps yet, but is there a way to be able to give my wife apps that I purchase on my iPhone? Or do I need to purchase all of my [pay] apps and songs on iTunes on my computer to ensure we both have access? Or do I have it all wrong completely LOL?? This rights management stuff is confusing.
    Thanks in advance for your help,
    Jimmy

    I had my iTunes account and phone before I got my wife hers. When I first started, I synced her phone from her account on my computer, brought in my libraries, and authorized her phone as one of the 5 allowed. She had all my purchased apps and music. Over time we each bought music or apps on our separate accounts. Of course, we told each other about cool apps or music we bought and wanted on our phones. When I went to sync, iTunes was going to delete all music and apps and sync with the library I was logged into. I didn't want that, so the only workaround I have found is to copy new music or new apps from one library to the other. Updating on the phone is also difficult because you can't switch between accounts, so updates have to be done at the computer for apps bought under different accounts. I am sure there is a better way to do this and would welcome any insight anyone has. I hope it works out.
    Message was edited by: AfoUare for typos

  • Question about transaction

    In one of my screen flows, there is one action that calls an external EJB. The EJB may be deployed on a different application server.
    How can I make that EJB call participate in the same transaction as OBPM?
    For example, if any error happens after the action that calls the EJB, I want the EJB call to be rolled back.
    How can I do that?
    Thanks a lot!

    Hi,
    All transaction codes are stored in table TSTC.
    Through SE93, a transaction code is created for applications such as
          Dialog programming
          Reports
          OO transaction
          Parameter transaction
          Table maintenance
    You can even change the transaction code here.
    Regards,
    Raj.

  • Question about inputting data to a pivot table

    Hi,
    I have 2 questions about using the ADF pivot table component (I would like it for data input).
    1. Is it possible to paste into multiple cells (i.e. using cut in Excel for example, and then pasting the cells into the pivot table) ?
    2. What is the recommended approach for the case where there is no data values (no rows on the database table)? For example, if I have regions, products, measures and time periods and then want to be able to select some regions and some products to enter new sales figures - and then create the rows on the database with save button, is there a recommended way to do this?
    (Jdeveloper version: 11.1.2.0.0)

    Anybody able to help with this?

  • Question about exporting FBL3N data in Pivot table format

    Dear all
    I have a question about the "exporting file" function of ECC 6.0.
    We started to use ECC 6.0 system, and it appears that we are no longer able to export "FBL3N data" to a local PC in pivot table format.
    Is there a way to do it?
    We used to be able to do so when the system is 4.6.
    I appreciate your support!
    Kind regards

    Dear:
                 You can do that. Execute report FBL3N with the desired GL account in it. Once the report has been executed, right-click on it; there will be an option 'Spreadsheet'. Select 'All available formats', in the drill-down select 08 (Excel in existing XXL format), and press OK. There you will get the pivot table option for downloading. Hope this helps you resolve your query.
    Regards

  • Question about Transactions...  Part 1 (Urgent)

    Hi all!
    Is there a transaction that I can use to check the status of ALL the objects that I activated in the BI system? If there is, what is it? Can this transaction return data about who activated each object, when, and what is wrong with it, if anything?
    Is there a transaction that can return a list of unactivated objects in the system (tables, DataSources, InfoSources and InfoProviders)? Can this transaction tell me why a selected object cannot be activated?
    Thank you!
    Philips

    Something NEW to learn in nw2004s
    Content Analyzer is available in SAP NetWeaver 2004s from BI Content Add-On Version 2.
    The program provides tools to monitor and enforce project quality standards. Content Analyzer (transaction RSBICA) checks for inconsistencies and errors of customer-defined objects throughout your entire landscape.
    This provides a way to analyze the objects and flush out hard-to-find problems. This tool assists BI project teams in identifying many problems that they need to address, such as InfoObjects that were created and not assigned to an InfoObject Catalog, or objects still in the temporary development class.
    In the past, it was nearly impossible to analyze and find all the issues that Content Analyzer now can unearth for you. BI users just did not have the tools or techniques to handle all the issues that surfaced. As a result, BI teams often did not thoroughly address and resolve all problems. You can run Content Analyzer across the entire BI landscape to research possible errors. In some cases, the produced report provides data and directions for resolving the problem.
    You configure Content Analyzer in transaction RSBICA
    Content Analyzer allows you to check for the following errors:
    1) InfoObjects not assigned to an InfoObject catalog
    2) Objects that do not comply with the configured naming convention
    3) Objects that are in the temporary development class
    4) Query elements that have multiple global unique identifiers (GUIDs)
    5) Object status with inconsistencies between A (active) and D (delivered) versions
    Hope it Helps
    Chetan
    @CP..

  • Question about transactions...

    Halo...
    I am still in the process of understanding Oracle transactions, and this is the question:
    If I do this in a stored procedure:
    CREATE OR REPLACE PROCEDURE SomeProc IS
      varField NUMBER(15,2);
    BEGIN
      SAVEPOINT UndoAll;
      UPDATE aTable SET aField = 1000, bField = 'YES' WHERE cField = 'id10';
      SELECT aField INTO varField FROM aTable WHERE bField = 'YES' AND cField = 'id10';
    EXCEPTION
      WHEN OTHERS THEN ROLLBACK TO UndoAll;
    END SomeProc;
    Should I do a COMMIT in between the UPDATE and the SELECT?
    Or, because it is a single transaction, does the SELECT statement fetch the changes made by the previous UPDATE statement?

    Should I do a COMMIT in between the UPDATE and the SELECT?
    Or, because it is a single transaction, does the SELECT statement fetch the changes made by the previous UPDATE statement?
    When you do the UPDATE, you lock those rows. If you do not COMMIT and then do a SELECT on those same rows, you'll see your updates.
    If you do the UPDATE and then COMMIT, you may or may not see your changes. As soon as you commit, your locked rows become unlocked. Then anyone else may update those rows. Some of your rows may have been updated in the (very brief) time between your COMMIT and your SELECT.
    Of course, in your example that's not likely to happen. But in a highly concurrent environment...who knows?
    The reason you do a COMMIT is to make permanent a logical unit of work.

  • Question about the difference between HDDs and SSDs

    Hi. I currently have a MacBook (the black one) that has been overheating a lot for the past few months. I'm thinking about purchasing a MacBook Pro for school because it seems to be more stable. I have a question though. What is the difference between HDDs and SSDs? Which one is better? All I know is that with my MacBook and my iBook I had hard drive failures (the iBook hard drive clicking, the MacBook hard drive sounding loud and making my computer overheat). Are hard drive failures normal with Macs?
    Thank you

    astoldbywesley wrote:
    Is a 128GB Solid State Drive better than a 500GB Serial ATA Drive @ 5400 rpm or 500GB Serial ATA Drive @ 7200 rpm?
    Depends what you mean by "better." Faster? Yes. The 500 has 3x more storage capacity.
    Message was edited by: tjk

  • Question about creating association between Sharepoint BDC entities using Visual Studio 2010

    I am following this tutorial:
    http://blogs.msdn.com/b/vssharepointtoolsblog/archive/2010/08/02/walkthrough-of-creating-association-between-sharepoint-bdc-entities-using-visual-studio-2010.aspx
    I am getting the following error:
     There is an error in the TypeDescriptors of Parameters on Method with Name 'CustomerToOrder' on Entity (External Content Type) with Name 'Customer' in Namespace 'BdcAssociationSample.BdcModel1'. The TypeDescriptors incompletely define where the Foreign
    Identifiers of Entity 'Order' in Namespace 'BdcAssociationSample.BdcModel1' are to be read. That Entity expects exactly '1' Identifiers, but only '0' TypeDescriptors with the correct Association were found from which to read them.
    I have viewed the following:
    http://lightningtools.com/business_connectivity_services/the-typedescriptors-incompletely-define-where-the-identifiers-of-entity-x-are-to-be-readbcs-error/
    After reading that article, I am still not sure what setting I am missing. Does anyone have the full working source code from the MSDN article linked above?

    Hi,
    According to your post, my understanding is that you got the 'symbols not loaded' error.
    First, try rebuilding your project by right-clicking the project > Rebuild. If that doesn't work, try a clean of the project (right-click the project > Clean).
    If that didn't work, check this:
    1. Right-click your project
    2. Select [Properties]
    3. Select the [Build] tab
    4. Make sure [Define DEBUG constant] and [Define TRACE constant] are checked
    5. Click the [Advanced] button at the bottom of the Build tab page
    6. Make sure that [Debug Info:] is set to [full]
    7. Click [OK] and rebuild the project
    Hope that works for you! (Step 6 generates the .pdb files; these are the debugging symbols.)
    What's more, you can clean and re-compile the code, then install the respective DLLs one more time to the GAC (just remove them before adding them to the GAC again), and do an IISReset to check whether it works.
    More reference:
    http://stackoverflow.com/questions/2155930/fixing-the-breakpoint-will-not-currently-be-hit-no-symbols-have-been-loaded-fo
    http://stackoverflow.com/questions/2301216/the-breakpoint-will-not-currently-be-hit-no-symbols-have-been-loaded-for-this-d
    http://geekswithblogs.net/dbutscher/archive/2007/06/26/113472.aspx
    Thanks,
    Jason
    Forum Support
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected]
    Jason Guo
    TechNet Community Support

  • Question about main difference between Java bean and Java class in JSP

    Hi All,
    I am new to JavaBeans and wonder what the main difference is between using a Bean and an Object in a JSP. I have searched the forum and found some posts asking the same question, but they still haven't resolved my doubt. Indeed, what is the real advantage of using a bean in JSP?
    Let me give an example to illustrate my question:
    <code>
    <%@ page errorPage="errorpage.jsp" %>
    <%@ page import="ShoppingCart" %>
    <!-- Instantiate the Counter bean with an id of "counter" -->
    <jsp:useBean id="cart" scope="session" class="ShoppingCart" />
    <html>
    <head><title>Shopping Cart</title></head>
    <body bgcolor="#FFFFFF">
    Your cart's ID is: <%=cart.getId()%>.
    </body>
    </html>
    </code>
    In the above code, I could also create a ShoppingCart object with the new operator and then get the id in the following way.
    <code>
    <%
    ShoppingCart cart = new ShoppingCart();
    out.println(cart.getId());
    %>
    </code>
    Now my question is: what is the difference between the two methods? To my mind, a normal class can also have setter and getter methods for its properties. Someone may say that there is scope="session", which cannot be declared for a normal object; it may be a point, but it can easily be addressed by putting the object into the session with "session.setAttribute("cart", cart)".
    I have been searching this issue on the internet for a long time and most answers just say something like "persistence of state", "beans follow some naming conventions", "beans must implement Serializable" and so on. All of the above can be achieved by other means; for example, a normal class can also follow the conventions. I am really confused by this, and really want to know what the main point(s) of using a JavaBean are.
    Any help will be highly appreciated. Thanks!!!
    Best Regards,
    Alex

    Hi All,
    I am new to Java Bean and wonder what is the main difference to use a Bean or an Object in the jsp.
    The first thing to realize is that JavaBeans are just Plain Old Java Objects (POJOs) that follow a specific set of semantics (get/set methods, etc...). So what is the difference between a Bean and an Object? Nothing.
    <jsp:useBean id="cart" scope="session" class="ShoppingCart" />
    In the above code, I can also create a object of
    ShoppingCart by new operator then get the id at the
    following way.
    ShoppingCart cart = new ShoppingCart();
    out.println(cart.getId());
    ...Sure you could. And if the Cart was in a package (it has to be) you also need to put an import statement in. Oh, and to make sure the object is accessible in the same scope, you have to put it into the PageContext scope. And to be totally equal, you first check to see if that object already exists in scope. So to get the equivalent of this:
    <jsp:useBean id="cart" class="my.pack.ShoppingCart"/>
    Then your scriptlet looks like this:
    <%@ page import="my.pack.ShoppingCart" %>
    <%
      ShoppingCart cart = (ShoppingCart) pageContext.getAttribute("cart");
      if (cart == null) {
        cart = new ShoppingCart();
        pageContext.setAttribute("cart", cart);
      }
    %>
    So it is a lot more work.
    As in my mind, a normal class can also have it setter and getter methods for its properties.
    True... see below.
    But someone may say that, there is a scope="session", which can be declared in an normal object.
    As long as the object is serializable, yes.
    It may be a point but it can be easily solved but putting the object in session by "session.setAttribute("cart", cart)".
    Possible, but if the object isn't serializable it can be unsafe. As I mentioned above, the useBean tag allows you to check whether the bean already exists and use it, or make a new one if it does not yet exist, all in one line. That is a lot easier than the code you need to use otherwise.
    I have been searching on this issue on the internet for a long time and most of them just say someting like "persistance of state", "bean follow some conventions of naming", "bean must implement ser" and so on.
    Right, that would go along the lines of the definition of what a JavaBean is.
    All of above can be solved by other means, for example, a normal class can also follow the convention.
    And if it does - then it is a JavaBean! A JavaBean is any Object whose class definition would include all of the following:
    1) A public, no-argument constructor
    2) Implements Serializable
    3) Properties are revealed through public mutator methods (void return type, start with 'set' have a single Object parameter list) and public accessor methods (Object return type, void parameter list, begin with 'get').
    4) Contain any necessary event handling methods. Depending on the purpose of the bean, you may include event handlers for when the properties change.
    I am really get confused with it, and really want to know what is the main point(s) of using the java bean.
    JavaBeans are normal objects that follow these conventions. Because they do, you can access them through simplified means. For example, one way of having an object in session that contains data I want to print out might be:
    <%@ page import="my.pack.ShoppingCart" %>
    <%
      ShoppingCart cart = (ShoppingCart) session.getAttribute("cart");
      if (cart == null) {
        cart = new ShoppingCart();
        session.setAttribute("cart", cart);
      }
    %>
    Then later, where I want to print a total:
    <% out.print(cart.getTotal()); %>
    Or, if the cart is a JavaBean, I could do this:
    <jsp:useBean id="cart" class="my.pack.ShoppingCart" scope="session"/>
    Then later on:
    <jsp:getProperty name="cart" property="total"/>
    Or perhaps I want to set some properties on the object that I get from the URL's parameters. I could do this:
    <%
      ShoppingCart cart = (ShoppingCart) session.getAttribute("cart");
      if (cart == null) {
        cart = new ShoppingCart();
        cart.setCreditCard(request.getParameter("creditCard"));
        cart.setFirstName(request.getParameter("firstName"));
        cart.setLastName(request.getParameter("lastName"));
        cart.setBillingAddress1(request.getParameter("billingAddress1"));
        cart.setBillingAddress2(request.getParameter("billingAddress2"));
        cart.setZipCode(request.getParameter("zipCode"));
        cart.setRegion(request.getParameter("region"));
        cart.setCountry(request.getParameter("country"));
        pageContext.setAttribute("cart", cart);
        session.setAttribute("cart", cart);
      }
    %>
    Or you could use:
    <jsp:useBean id="cart" class="my.pack.ShoppingCart" scope="session">
      <jsp:setProperty name="cart" property="*"/>
    </jsp:useBean>
    The second seems easier to me.
    It also allows you to use your objects in more varied cases - for example, JSTL (the standard tag libraries) and EL (expression language) only work with JavaBeans (objects that follow the JavaBeans conventions) because they expect objects to have the no-arg constructor, and properties accessed/changed via getXXX and setXXX methods.
    >
    Any help will be highly apprecaited. Thanks!!!
    Best Regards,
    Alex

  • Question about renaming XDCAM source and targets

    Using XDCAM, is it okay to rename the source folder after you've transferred it? Same with the folder you're transferring to. Is it okay to rename it before or after you've transferred the footage?

    Hi,
    I copy the SxS card to an external drive into a folder with a given name like CH-1.  Then I change it to CH-Howe.  I've thought about it and I don't think that has any bearing on anything.  And I believe transfer is the wrong word.  This is an XDCAM import (possibly capture).  Once it's imported, the question then becomes what if I change the name of the folder the files were imported to?  That seems straightforward, too.  I simply reconnect or relink the media.

  • Question about Transaction scope

    Hello all
    I failed to find the answer in the Forte documentation.
    What happens when
    - a client GUI uses transactions,
    - services use message duration AND dependent transactions?
    Do we have
    - two independent transactions,
    - an undetected error,
    - something in between?
    Thank you for any reference to the right document or an explanation.
    Jean-Claude Bourut
    s-mail 72-78 Grande-Rue 92310 Sevres
    e-mail [email protected]
    Tel (33-1) 41 14 86 41

    If you want to access variables outside of a method then you have to use class variables.
    regards,
    Owen
    class TestClass {
      String a;

      public void init() {
        a = "aba";
      }

      public void output() {
        System.out.println(a);
      }

      public static void main(String[] args) {
        TestClass test = new TestClass();
        test.init();
        test.output();
      }
    }
