DSO records.

I have created a DSO as a staging area and loaded data from the PSA. The PSA contains 63,500 records, whereas the DSO has only 644 records.
My question: should the DSO contain the same number of records as the PSA? If so, why do I have a different number of records in the DSO compared with the PSA? Your inputs will be greatly appreciated. Thanks for your help in advance.
Regards
Rajkumar

Not necessarily.
The PSA has a technical key made up of Request Number, Data Packet Number and Record Number, so every record in the PSA is distinct.
The DSO, on the other hand, contains unique records based on its own key.
Let's say you have Material, Calday and Customer as the key fields of the DSO.
In the PSA you might have 100 records for one combination of these, but in the DSO each combination will have exactly one record; otherwise the primary key constraint would be violated.
After you run the DTP, do not activate the DSO data yet; check the New table of the DSO via Manage -> Contents -> New Table data.
The number of entries in the New table should be higher than in your active DSO data, as the New table also has a technical key.
The active table of a standard DSO, however, has the business keys you defined.
Based on those keys, key figures will either be overwritten (if their update type is overwrite) or aggregated.
But before all this, please check whether you have a DTP filter, or a start routine in the transformation that deletes some records, along the lines of the sketch below.
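For example, a start routine like this (purely hypothetical; the status field /bic/zstatus and the value 'A' are invented for illustration) would silently shrink the record count on the way from PSA to DSO:
* Hypothetical start routine in the transformation:
* drops every record whose (made-up) status field is not 'A'.
DELETE SOURCE_PACKAGE WHERE /bic/zstatus <> 'A'.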
Regards
Anindya
Edited by: Anindya Bose on Feb 21, 2012 6:50 AM

Similar Messages

  • DB Connect DataSource PSA records and DSO records are not matching...

    Dear All,
    I'm working with SAP NetWeaver BW 7.3 and, for the first time, I have loaded from a DB Connect source system. I created a DataSource, pulled all the records, and found 8,136,559 records in the PSA. When I designed and created a DSO with key fields 0CALDAY, Item No and Company Code, it transferred about 8,136,559 records but added only about 12,534. Similarly, the following InfoCube has about 12,534 records in its fact table. When I tried to reconcile the records with the source DBMS for one month, they did not match.
    1. What could be the reason behind the issue? Why am I unable to load the records correctly?
    2. Have I defined the key fields of the DSO incorrectly?
    3. Is it possible to load the records into a DSO without specifying any field as a key field?
    4. How should I resolve this issue?
    5. Could this be about the DSO overwrite and summation functions? If yes, how do I use them?
    Many thanks,
    Tariq Ashraf

    Dear Tariq,
    1. What could be the reason behind the issue? Why am I unable to load the records correctly?
    Ans: Check the transformation once. Is there any start routine, or are the rules direct assignments? What DTP settings have you made?
    Check the messages in the DTP monitor; you will surely find some clue. Check whether any duplicate records are being detected, and whether you are using semantic keys in your DTP.
    2. Have I defined the key fields of the DSO incorrectly?
    Ans: Are the transformation key and the DSO key the same in your case?
    What kind of DSO is it? For a sales order DSO, for example, you take the order number as a key field. So you have to define the key fields according to the business semantics, I suppose. Do you agree?
    3. Is it possible to load the records into a DSO without specifying any field as a key field?
    Ans: I don't think so, as the keys you define are what ensure unique data records, aren't they?
    4. How should I resolve this issue?
    Ans: Please check the points under answer 1 and share your observations.
    5. Could this be about the DSO overwrite and summation functions? If yes, how do I use them?
    Ans: Overwriting of key figures in a DSO is useful when you have full loads in the picture. Are you always going to perform full loads?
    For reference, you may want to check this thread: Data fields and key fields in DSO
    Let's see what inputs the experts give.
    Thank You...

  • DSO record gets modified or overwritten after data upload

    Dear BI consultants,
    Can anyone answer this?
    A record from view 2 has already been uploaded to the target with one barcode number, using the date as the upload criterion.
    If the record from view 1 is uploaded today with the same barcode number, what happens: is the record overwritten, or is the existing record modified?
    view 1  >  DataSource  \
                            >  DSO
    view 2  >  DataSource  /
    The problem is that the fields operationt, operation, zusr1 and zusr are empty; they must be filled with the values from view 2.
    The data is loaded based on the date criterion in the InfoPackage; that is where the data loss happens and why the data in the target mismatches.
    The process chain runs daily, but the data from view 2 is not loaded correctly through the InfoPackage, and the PSA does not have the records.
    The date field in view 1 does not match the date field in view 2.
    The key field here is the barcode number only (so only one record exists per barcode).

    Hi,
    Check what the key fields of your DSO are, as data is loaded based on the key fields.
    Also please check whether you have set the overwrite option in your DSO transformation.
    Regards,
    Antony Jerald.

  • Comparisons of records between two DSOs using an ABAP program

    Hello Experts,
    As per my business requirements, we implemented 30 DSOs (EDW layer) for X Region with reference to the Y Region DSOs. For the X Region DSOs we created transformations and DTPs, and we loaded the data into them through process chains. After loading, I want to compare the records of these two sets of DSOs with an ABAP program.
    Here my source is: X Region DSOs
    Here my target is: Y Region DSOs
    These two are the mandatory fields, and the optional fields are:
    1. Sales org
    2. Sales division
    3. Document creation date.
    For this requirement I want to implement an ABAP program. If anyone has ABAP code for a similar requirement, please send it.
    Thanks & Regards,

    Hi Saurabh,
    If your requirement is to compare the values of the two DSOs based on sales org, sales division and date, you can build a report; if instead you want to look up both DSOs and perform some operation during the load, you need to write a routine at the transformation level.
    I'm sending you a sample I used to look up two ODSs and delete the already existing sales docs in BI:
    DATA: BEGIN OF i_udlit OCCURS 0,
            /bic/zaw LIKE /bic/aZD_UDLIT00-/bic/zaw,
          END OF i_udlit.
    DATA: wa_udlit LIKE i_udlit.
    DATA: BEGIN OF i_udan OCCURS 0,
            /bic/zaw LIKE /bic/aZD_UDAN00-/bic/zaw,
          END OF i_udan.
    DATA: wa_udan LIKE i_udan.
    DATA: wa_srcpack TYPE tys_sc_1.
    * Case 1: drop source records that already exist in the first ODS
    * (assumes SOURCE_PACKAGE is not empty when FOR ALL ENTRIES runs)
    CLEAR i_udlit[].
    SELECT /bic/zaw
      FROM /bic/aZD_UDLIT00
      INTO TABLE i_udlit
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE /bic/zaw = SOURCE_PACKAGE-/bic/zcustomer.
    LOOP AT SOURCE_PACKAGE INTO wa_srcpack.
      READ TABLE i_udlit INTO wa_udlit
           WITH KEY /bic/zaw = wa_srcpack-/bic/zcustomer.
      IF sy-subrc = 0.
        DELETE SOURCE_PACKAGE.    " record already exists - drop it
      ENDIF.
    ENDLOOP.
    * Case 2: the same look-up against the second ODS
    CLEAR i_udan[].
    SELECT /bic/zaw
      FROM /bic/aZD_UDAN00
      INTO TABLE i_udan
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE /bic/zaw = SOURCE_PACKAGE-/bic/zcustomer.
    LOOP AT SOURCE_PACKAGE INTO wa_srcpack.
      READ TABLE i_udan INTO wa_udan
           WITH KEY /bic/zaw = wa_srcpack-/bic/zcustomer.
      IF sy-subrc = 0.
        DELETE SOURCE_PACKAGE.
      ENDIF.
    ENDLOOP.
    Hope this helps...
    Regards
    KP

  • How to dynamically and selectively update DSO based on values in a csv file

    Hi,
    I'm loading a CSV file into a DSO. When loading the flat file in FULL mode, I need to do a pseudo-delete of records that were previously loaded but are not in the new flat file.
    Is it possible to dynamically determine the unique set of records (say Pk1, Pk2, Pk3) in the CSV file and then set all the corresponding DSO records' quantities to 0, perhaps in a start routine? After that, I can load the CSV file with the correct quantities (effectively updates and inserts). The net result should be that the change log carries exactly these changes through to the next DSO.
    Example: I load 10 records yesterday and today reload 9 records. The 10th record must have its quantity set to 0. The other 9 records will have the quantity values from today's CSV file; some will be the same and some will be different. The net change log of all 10 records must be loaded into the next DSO.
    Any suggestions on how to do this logic?
    Thanks!

    Hi Gregg,
    You can create one transformation from the DataStore to itself. In the "Technical" rule group, set 0RECORDMODE = 'X' (before image) or 'R' (reverse image). Then, when you execute the corresponding DTP, all existing records should be set to zero.
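    If you implement that technical rule as a routine rather than as a constant in the rule group, it is a one-liner (a minimal sketch; a constant achieves the same):
    * Technical rule for 0RECORDMODE in the DataStore-to-itself transformation:
    * 'R' (reverse image) makes activation zero out the existing records.
    RESULT = 'R'.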
    Then, as a second step, you can execute the DTP which is related to the transformation between the DataStore and the DataSource, thus loading the new records.
    I hope this helps you.
    Regards,
    Maximiliano

  • Duplicate records in BW Data Loads

    In my project I am facing duplicate records in data loads when I compare the PSA and the DSO. How can I check whether these are duplicates? Is there any mechanism, for example via an Excel sheet? Please help me out. Thanks in advance for your quick response.
    Edited by: svadupu on Jul 6, 2011 3:09 AM

    Hi ,
    Getting duplicate records in the PSA is fine, because no keys are set in the PSA and all records come directly from the source.
    In the case of a standard DSO, records are always overwritten, so you would not get any duplicates.
    In case you are getting duplicate records in the PSA and need to find them:
    Go to PSA -> Manage -> PSA maintenance, and change the number of records shown from 1,000 to the actual number of records that came in. In the menu, go to List -> Save -> File, change the path from the SAP directory to some other path, and save the file.
    Open the file, put the columns forming the DSO keys next to each other, and sort ascending; you will find the duplicate records in the PSA.
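    If you would rather script the check than use Excel, the same idea in ABAP (a sketch; the row type zpsa_row and the key fields matnr, calday and customer are placeholders for your DSO keys):
    * lt_psa holds the downloaded PSA rows.
    DATA: lt_psa   TYPE STANDARD TABLE OF zpsa_row,
          lt_uniq  TYPE STANDARD TABLE OF zpsa_row,
          lv_total TYPE i,
          lv_uniq  TYPE i,
          lv_dups  TYPE i.
    lt_uniq = lt_psa.
    * Sort by the DSO key fields, then drop duplicates with respect to them.
    SORT lt_uniq BY matnr calday customer.
    DELETE ADJACENT DUPLICATES FROM lt_uniq COMPARING matnr calday customer.
    * Every row removed here is a duplicate that the DSO would overwrite.
    DESCRIBE TABLE lt_psa LINES lv_total.
    DESCRIBE TABLE lt_uniq LINES lv_uniq.
    lv_dups = lv_total - lv_uniq.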

  • 50 million records Data extract to CSV files

    Hi Experts,
    Could you please suggest an approach: we need to extract 50 million DSO records to CSV files using Open Hub in BI 7.0.
    The creation date is a key field in my DSO.
    Thanks
    Vijay

    Hi,
    Give some value ranges on the creation date and download the data to files one slice at a time, along the lines of the sketch below.
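    A DTP filter routine on the creation date could look roughly like this (a sketch following the shape of the generated routine template; the field name CREATEDON and the dates are assumptions):
    * Filter routine: restrict each Open Hub run to one quarter.
    DATA: l_idx LIKE sy-tabix.
    READ TABLE l_t_range WITH KEY fieldname = 'CREATEDON'.
    l_idx = sy-tabix.
    l_t_range-sign   = 'I'.
    l_t_range-option = 'BT'.
    l_t_range-low    = '20100101'.   " adjust per run
    l_t_range-high   = '20100331'.
    IF l_idx <> 0.
      MODIFY l_t_range INDEX l_idx.
    ELSE.
      APPEND l_t_range.
    ENDIF.
    p_subrc = 0.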
    Hope it helps....

  • Does activation fail in an ODS when it is acting as source & loading data

    Hi All,
    If I am loading data from an ODS to a cube as a repair full request (i.e. data from the active table), and I bring a delta into the ODS and start ODS activation of the delta request, will the activation fail due to the lock on the active table?
    Thanks in advance

    Hi,
    The data load happens this way in the DSO:
    A DSO consists of three tables, namely the New, Active and Change Log tables.
    When you load data into the DSO, the records sit in the New table.
    When you activate the data, the records move to the Active table, creating before and after images in the Change Log.
    When you run a load from the DSO to the cube, the records are read from the Active table or the Change Log, depending on the DTP setting.
    So these tables are locked while you are using them to load the cube.
    Activating the data in the DSO means moving data from the New table to the Active table.
    So while loading from the DSO the tables are locked, and activation would try to move data within those same tables, which is not possible.
    Therefore we cannot activate a DSO at the same time as records are being loaded from that DSO to a cube.
    Hope this helps.
    Regards,
    Haritha.

  • Overwrite in Cube

    I know a cube is additive compared to a DSO, but I was wondering whether it is possible to aggregate multiple records into just one record in the cube. I have key figures which depend on a value type, and I would like to see them as one record in the cube instead of three. My scenario is as follows:
    DSO
    (record # / Value type /KF1/ KF2/ KF3)
        1 - 00 - 50 - X - X
        2 - 10 - X - 100 - X
        3 - 20 - X - X - 25
    In the cube i would like to see it aggregate as follows:
    Cube
    (record # / vtype1 / vtype2 / vtype3 / KF1 / KF2 / KF3)
        1- 00 - 10 - 20 - 50 - 100 - 25

    Hi Mark,
    I am assuming that you are using BI 7.0.
    In BI 7.0, within one transformation you can define as many transformation groups as you want. "Transformation groups" means multiple rule groups between the same source and target, each with different definitions.
    Consider this,
    Source
    Vtype, KF
    00, 100
    10, 200
    20, 300
    Target
    Vtype1, Vtype2, Vtype3, KF1, KF2, KF3
    You can define three different transformation groups, one for each Vtype to Vtypex (1, 2 or 3). In each transformation group the Vtype is hardcoded:
    Vtype1 = 00, KF1 = KF
    Vtype2 = 10, KF2 = KF
    Vtype3 = 20, KF3 = KF
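    To make sure only the matching rows feed each key figure, the KF rule in each group can be a small routine; for group 1 it might look like this (a sketch; the field names follow the example above and are not from the original thread):
    * Rule for KF1 in transformation group 1: only value type 00 contributes.
    IF SOURCE_FIELDS-vtype = '00'.
      RESULT = SOURCE_FIELDS-kf.
    ELSE.
      RESULT = 0.
    ENDIF.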
    Once you have defined this, you can schedule the DTP, and the system will load the data through each of these transformation groups into the target cube. After that you can compress the request to form one single record.
    - Danny

  • Issue with data

    Hello ,
    I have one doubt to clear up regarding selective deletion; once I get the idea, I can apply it to the real-time issue and the job is done.
    Suppose there is an upload from a DSO to a cube using a delta DTP.
    Suppose in the database there are two employees with the same name 'Rahul' but different employee codes:
    Rahul (Employee Code - 1)
    Rahul (Employee Code - 2)
    DSO records
    Rahul M 23  1     (Employee Code - 1)
    Rahul M 23 -1     (Employee Code - 1)
    Rahul M 25  1     (Employee Code - 2)
    InfoCube
    Rahul M 23  1     (Employee Code - 1)
    Rahul M 23 -1     (Employee Code - 1)
    Rahul M 25  1     (Employee Code - 2)
    Now a query has been developed on the InfoCube. When we execute it, the record displayed is Rahul M 25 1, which is correct.
    But when we open the employee filter in the query, we get two Rahuls. If we select the first Rahul we get data; if we select the second we get "no data available", which is perfectly fine.
    Problem -
    The user questions why two names appear in the employee filter.
    Analysis -
    If I do a selective deletion on the InfoCube for Rahul with Employee Code 1, then I think the issue will be solved.
    But please confirm that this record will not be picked up again from the DSO with the delta.

    Hello,
    Thanks for the answers to the post.
    I have checked, and the setting is maintained as "Only Posted Values".
    The record is still being fetched, but with no changes, as the record is closed in the source system.
    So I mean there are no new changes.
    If I do a selective deletion, do I need to delete and create the index again? This step is involved in the process chain.
    So, after I do the selective deletion, the chain runs and the index will be adjusted automatically?
    Please suggest.
    Marks assigned.
    Thanks,
    Rahul

  • CUBE Not getting All records from DSO

    Hi Experts ,
    We have a situation where we have to load data from a DSO to a cube. The DSO contains only 9 records, and while we are loading the data into the cube, the cube gets only 2 records; thus 7 records are missing. The cube also contains more fields than the DSO. In the transformations we have written an end routine for the extra fields, which get their data by reading master data. Any pointers on how to get the missing records, or on what the error is?
    Sam

    Why multiple threads ????
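    One pattern worth checking in a setup like this: an end routine that enriches from master data can silently drop records if it deletes rows whose look-up fails. A purely hypothetical sketch (the table and field names are made up):
    * End routine: enrich the extra fields from master data.
    FIELD-SYMBOLS: <result_fields> TYPE _ty_s_tg_1.
    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
      SELECT SINGLE /bic/zattr
        FROM /bic/pzmat                    " made-up master data table
        INTO <result_fields>-/bic/zattr
        WHERE /bic/zmat = <result_fields>-/bic/zmat
          AND objvers = 'A'.
      IF sy-subrc <> 0.
        DELETE RESULT_PACKAGE.             " this is where records vanish
      ENDIF.
    ENDLOOP.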

  • How to differentiate the EMPTY Records and Null Values in DSO

    Hello, how is everyone here?
    I am trying to load data from a flat file which contains some empty cells and some null values. The data type of the InfoObject for the "Quantity" field is number. The sample data from the flat file (CSV) is as below:
    Food          Quantity
    Hamburger   - 12
    Cheese      - 0
    Vegetable   - (empty)
    When I load the above sample data into the DSO, I get the following result:
    Food          Quantity
    Hamburger   - 12.000
    Cheese      - 0.000
    Vegetable   - 0.000
    In this case, how can the user tell whether a record contains an empty value or a null value in the DSO? It is quite hard to tell the two scenarios apart. Is there any way to differentiate them?
    Thanks a lot =)
    Thanks alot =)

    Hi Fluffy,
    It depends on the initial values of the data type.
    For quantity/currency/number fields the initial value is 0; for CHAR it is SPACE.
    So we cannot differentiate between a space and a null value.
    If you have to force this, then define the quantity as CHAR and load the data. You will not have units or aggregation in that case, but the distinction survives, as in the sketch below.
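    With the quantity loaded into a CHAR field, a rule routine can then preserve the distinction explicitly (a sketch; the source field /bic/zqty_c is made up):
    * An empty CSV cell arrives as SPACE, an explicit zero arrives as '0'.
    IF SOURCE_FIELDS-/bic/zqty_c IS INITIAL.
      RESULT = 'NULL'.                      " cell was empty in the file
    ELSE.
      RESULT = SOURCE_FIELDS-/bic/zqty_c.   " '0' or an actual value
    ENDIF.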
    Hope this helps.
    PV

  • Different keys in DSO transformation nullify record info with RecordMode 'R'

    Two DSOs: DSO1 and DSO2, and we load data from DSO1 to DSO2.
    DSO1 has the following fields:
    Delivery Doc Number (key field), Delivery Doc Item (key field), SO (data field), SO Item (data field), Delivery Quantity (data field)
    DSO2 has the following fields:
    SO (key field), SO Item (key field), Delivery Doc Number (data field), Delivery Item (data field), Delivery Quantity (data field)
    The problem is that a record with Record Mode = 'R' in DSO1 is transferred to DSO2 and leaves the corresponding record in DSO2 with only the SO/SO Item values; all other field values are set to null. In other words, a cancelled delivery also cancels its corresponding SO, which is absolutely incorrect!
    We are thinking of placing some code in the start routine to overcome this problem. Maybe check whether Record Mode = 'R' and then set it to a null value? But a null value for Record Mode means after image, so I am not sure whether that works. Anyone's ideas are greatly appreciated!

    Hello ,
    Please follow these steps:
    1. Remove the 0RECORDMODE mapping in the technical group of the transformation between the DSOs.
    2. In that transformation, write logic like the following in the rule routine of each of the key figures (Delivery Qty etc.; deliv_qty stands for your delivery quantity source field):
    IF SOURCE_FIELDS-recordmode = '' OR SOURCE_FIELDS-recordmode = 'N'.
      RESULT = SOURCE_FIELDS-deliv_qty.
    ELSEIF SOURCE_FIELDS-recordmode = 'X' OR SOURCE_FIELDS-recordmode = 'R'.
      RESULT = -1 * SOURCE_FIELDS-deliv_qty.
    ENDIF.
    Hope this helps.
    Thanks,
    Ravi.

  • No. of records in DSO don't match the no. in PSA

    Hi SDNer's,
    I have a data mismatch happening between the PSA and the DSO: not all the data is getting loaded into the DSO, and a lot of data is being filtered out by the transformation, which I am unable to figure out.
    To analyze this better, I deleted the data in the DSO and the PSA and loaded only 1 G/L account for period 05/2007.
    RSA3 has 12 records -> PSA has 12 records (InfoPackage - full) -> DSO has only 9 (full DTP)
    The DSO is missing 3 records which exist in the PSA.
    * No filter used in the DTP.
    ** It is all 1:1 mapping between DSO and PSA.
    *** No start routines at all.
    **** No aggregation of data in the DSO.
    Yet the transformation is filtering the records from 12 down to 9.
    RSMO Details tab:
    RSDS 0FI_GL_10 D05CLNT400 -> ODSO ZFIGLO10 : 12 -> 9 Data Records
    Records in PSA
    GLAcct/CompCd/COArea/Crtyp/Compny/FiscPer/Bal/Deb/Cre/Sales
    0001151015/1001/1000/00/----/2007005/0,00/0,00/0,00/0,00
    0001151015/1001/1000/10/----/2007005/0,00/0,00/0,00/0,00
    0001151015/1001/1000/30/----/2007005/0,00/0,00/0,00/0,00
    0001151015/2001/1000/00/----/2007005/0,00/100,00/100,00/0,00
    0001151015/2001/1000/10/----/2007005/0,00/100,00/100,00/0,00
    0001151015/2001/1000/30/----/2007005/-100,00/0,00/100,00/-100,00
    0001151015/2003/1000/00/----/2007005/100,00/0,00/100,00/100,00   <- missing in DSO
    0001151015/2003/1000/10/----/2007005/100,00/0,00/100,00/100,00   <- missing in DSO
    0001151015/2003/1000/30/----/2007005/100,00/0,00/100,00/100,00   <- missing in DSO
    0001151015/2003/1000/00/001001/2007005/100,00/0,00/100,00/100,00
    0001151015/2003/1000/10/001001/2007005/100,00/0,00/100,00/100,00
    0001151015/2003/1000/30/001001/2007005/100,00/0,00/100,00/100,00
    Totals: ----/500,00/800,00/300,00/500,00
    Records in DSO
    GLAcct/CompCd/COArea/Crtyp/Compny/FiscPer/Bal/Deb/Cre/Sales
    0001151015/1001/1000/00/----/2007005/0,00/0,00/0,00/0,00
    0001151015/1001/1000/10/----/2007005/0,00/0,00/0,00/0,00
    0001151015/1001/1000/30/----/2007005/0,00/0,00/0,00/0,00
    0001151015/2001/1000/00/----/2007005/0,00/100,00/100,00/0,00
    0001151015/2001/1000/10/----/2007005/0,00/100,00/100,00/0,00
    0001151015/2001/1000/30/----/2007005/-100,00/0,00/100,00/-100,00
    0001151015/2003/1000/00/001001/2007005/100,00/0,00/100,00/100,00
    0001151015/2003/1000/10/001001/2007005/100,00/0,00/100,00/100,00
    0001151015/2003/1000/30/001001/2007005/100,00/0,00/100,00/100,00
    Totals: ----/200,00/500,00/300,00/200,00
    The three records marked "<- missing in DSO" above exist in the PSA but are missing in the DSO.
    Key fields are GLAccount, Company Code, CURTYPE.
    Key figures are of update type "Overwrite".
    Company (001001) is a data field in the DSO.
    Modified the original post accordingly.
    Any hints/inputs would really help. Thanks all!
    Message was edited by: Jr Roberto

    Kamaljeet,
    There is no difference at all in the records except for the Company; I also tried making it a key field, which gave the same result.
    None of the fields are flagged as key fields in the DataSource.
    Example:
    2007 USD 001 0001151015 2003 1000 10 00 2007010 K4 CAMT USD 4.100,00 6.000,00 2.000,00 4.000,00
    2007 USD 001 0001151015 2003 1000 10 10 2007010 K4 CAMT USD 4.100,00 6.000,00 2.000,00 4.000,00
    2007 USD 001 0001151015 2003 1000 10 30 2007010 K4 CAMT USD 4.100,00 6.000,00 2.000,00 4.000,00
    2007 USD 001 0001151015 2003 1000 001001 10 00 2007010 K4 CAMT USD 100,00 0,00 0,00 0,00
    2007 USD 001 0001151015 2003 1000 001001 10 10 2007010 K4 CAMT USD 100,00 0,00 0,00 0,00
    2007 USD 001 0001151015 2003 1000 001001 10 30 2007010 K4 CAMT USD 100,00 0,00 0,00 0,00

  • Full load from a DSO to a cube processes less records than available in DSO

    We have a scenario where, every Sunday, I have to make a full load from a DSO with on-hand stock information to a cube, in which I register a counter at material and store level if stock is available.
    The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
    The key in the DSO is:
    0MATERIAL
    0PLANT
    0STOCKTYPE
    0STOR_LOC
    0BOM
    of which only 0MATERIAL, 0PLANT and 0STOR_LOC are later used in the transformation.
    As we had a growing number of records, we decided to delete, in the START routine, all records where the inventory is not GT zero, thus eliminating zero and negative inventory records.
    Now comes the funny part of the story:
    Prior to these changes I would [in a test system, just copied from PROD] read some 33 million records and write out the same amount. Of course, after the change we expected to write out less. To my total surprise, I was now reading 45 million records with the same unchanged DTP, while writing out the expected smaller number.
    When checking the number of records in the DSO I found the 45 million, but I cannot explain why the earlier loads retrieved only some 33 million from the same unchanged set of records.
    When checking in PROD, same result: we have some 45 million records in the DSO, but when we do the full load from the DSO to the cube, the DTP only processes some 33 million.
    What am I missing? Is there a compression going on? Why would the number of records in a DSO differ from the number of records processed in the data packages when I am making a FULL load without any filter restrictions and with only a semantic grouping on part of the DSO key?
    Any idea or thought is appreciated.

    Thanks, Gaurav.
    I did check whether any loads were done in between; there were none in the test system. As I mentioned, it was a new copy from PROD to TEST, and I compared the number of entries in the DSO: it matches between TEST and PROD (OK, a few more in PROD, but they can be accounted for). In TEST I loaded the day before the changes were imported, to have a comparison, and between that load and the one after the changes were imported, nothing in the DSO changed.
    Both DTPs, in TEST and PW2, load from the active DSO data [without archive]. The DTPs had not been changed in quite a while, so I ruled that out. Same with activation of data in the DSO: this DSO gets loaded and activated in PROD daily via a process chain, and we load daily deltas into the cube in question. Only on Sundays, for the beginning of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million records, but the difference between the number of records in the DSO and the amount processed in the data packages is more than 10 million per full load, even in PROD.
    I really appreciate the knowledgeable answer; I just wish you had pointed out something that I missed.
