Restricting source data for GL Analytics

Hi All,
I need to run the GL Analytics ETL process, but I want my warehouse tables to be populated only with recent data (e.g., from 2007 onwards), while my source system has data going back to 2003. Is there any way to do this, or a parameter I can set in DAC?
Thank you

Hello,
If it is a matter of authorization, Atif's answer is right.
If it is a matter of validation:
To restrict G/L account(s) to certain profit center(s),
you need to create a validation for accounting documents using transaction GGB0.
Then you need to activate it through this path:
SAP Customizing Implementation Guide - Financial Accounting (New) - Financial Accounting Global Settings (New) - Tools - Validation/Substitution - Validation in Accounting Documents.
Note that the callup point (event) is very important; for this requirement, define the validation at line-item level.
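As a purely illustrative sketch (the account interval, profit center value, and message number below are invented), a line-item-level validation maintained via GGB0 could look like:

```text
Callup point:  2 (line item)
Prerequisite:  BSEG-HKONT >= '0000400000' AND BSEG-HKONT <= '0000409999'
Check:         BSEG-PRCTR = '0000001000'
Message:       E 123  "Account & must be posted to profit center 1000"
```

The prerequisite limits the rule to the relevant account range, the check enforces the profit center, and message type E blocks the posting when the check fails.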
Regards,
Edited by: Tarek Elkiki on Dec 11, 2011 10:51 AM

Similar Messages

  • There is no source data for this data record, Message FZ205

    Hi Experts,
    I am facing a problem with the DME file download. This problem started all of a sudden in our production system last month; it never occurred before. Our system landscape has not changed, although our Basis consultant has added two or three new application servers to the production client. We do not have this problem in our test clients.
    Please note that we have been using output medium '1' from day one, so the system generates the DME in the 'File System', which we download to the desktop and upload to the bank online. After running the payment run, when we try to download the DME file, the system gives the error "There is no source data for this data record, Message FZ205".
    I have tried to fix this issue in many ways but have not been able to. Can you please let me know the reason for this error and how to fix it?
    With best regards,
    BABA

    Hi Shailesh,
    Please share how you solved this problem.
    Many Thanks,
    Lakshmi

  • Source data for Legal and Management Consolidation

    Hi,
    I'm in ECC5, using BCS 4.0 and BW 3.5.
    Our current design requires two types of consolidation: company consolidation and profit centre consolidation. Note that the profit centre consolidation also requires balance sheet and profit/loss data.
    Now, I know that basically the data coming from R/3 is sourced from the special ledger table FAGLFLEXT. Both company and profit centre data share this table in order to maintain data consistency.
    My question is:
    1. Is my understanding about FAGLFLEXT correct?
    2. What are the prerequisite steps so that the table FAGLFLEXT contains the profit centre data?
    Any advice please?
    regards,
    Halim

    Hi Halim,
    Yes, you are right.
    As a prerequisite, you need to activate new General Ledger Accounting in Customizing for Financial Accounting in the OLTP system:
    http://help.sap.com/saphelp_nw04/helpdata/en/be/928f40f5767d17e10000000a1550b0/frameset.htm
    http://help.sap.com/saphelp_erp2005/helpdata/en/b6/5f58405e21ef6fe10000000a1550b0/frameset.htm
    See here an example of configuration:
    http://help.sap.com/bp_bblibrary/500/documentation/N70_BB_ConfigGuide_EN_DE.doc
    here a presentation on GL in mySAP ERP:
    http://www.asug.com/client_files/DocUploads/search/doc1194.ppt
    and here a thread about dataflow from R/3 to BCS:
    http://eai.ittoolbox.com/groups/technical-functional/sap-r3-sem/dataflow-from-r3-to-sem-bcs-950671
    Best regards,
    Eugene

  • Source Data for Import Map

    I inherited some MDM work from a consultant who rolled off our SRM project.  He did not leave any source input files to use when making changes to maps, and I read in the SAP MDM guides that you should always revert to your original source file when making map changes so you don't lose any mappings (if the newer source file does not contain every segment that the original did, etc.).
    Am I off base?  Is there a safe way to make map changes without having the original source file?
    Thank you in advance,
    Keith

    Hi Keith,
    You are absolutely right. This is a common problem faced during subsequent loads into MDM. Generally the map is created in Import Manager, and then, if more values come in for value mapping or more fields are mapped due to a new business requirement, the map sometimes reports that it is out of date.
    The solution we came up with was to create a value-mapping template (you can also include the field mapping). This holds the complete list of fields and values in one map. If a new value comes in, first add it to the template and then map it in the original map.
    Now in your case, you can either create a template, or use the SAVE UPDATE option in Import Manager every time you face an exception via MDIS. SAVE UPDATE will apply the additional mapping onto the original map, so if a similar file comes in again it will be processed successfully via MDIS.
    You can refer to my Article on this:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/80ad0cff-19ef-2b10-54b7-d4c7eb4390dd
    Hope it helps.
    Thanks and Regards
    Nitin Jain

  • Source data for Record Group

    Hello,
    I am very new to Oracle Forms and have been tasked with pointing some forms we have to a new server and adding a couple of columns to some areas. Everything was going fine until I got to the point where I have to add a new column to an area on a form. The forms are pointing to the new tables and the searches are working, or at least seem to be. How can I tell the data source for a record group? I checked the properties for the record group (RECORD_STATISTICS) that populates a certain area on a form: it has Query selected as the record group type, but there is no query showing. I added the needed column to the column specifications list, but it does not show up when I run the form. There is a spot for it, because the extra hyphen is there.
    Here is the code that populates the fields on the form. The field I added is the AHS_SITE column. As mentioned earlier, I added that field to the RECORD_STATISTICS record group as well as to all the procedures I can find, but I am missing something.
    DECLARE
      htree             ITEM;
      num_selected      NUMBER;
      current_node      FTREE.NODE;
      v_note_value      NUMBER;
      v_node_depth      NUMBER;
      total_rows        NUMBER;
      group_id          RecordGroup;
      v_selection_count NUMBER;
    BEGIN
      -- Find the tree itself.
      htree := Find_Item('BLOCK_STATISTICS_TREE.TREE_ITEM_STAT');
      v_selection_count := Ftree.Get_Tree_Property(htree, Ftree.SELECTION_COUNT);
      IF v_selection_count > 0 THEN
        v_note_value := Ftree.Get_Tree_Node_Property(htree, :SYSTEM.TRIGGER_NODE, Ftree.NODE_VALUE);
        IF v_note_value IS NOT NULL THEN
          group_id     := Find_Group('RECORD_STATISTICS');
          total_rows   := Get_Group_Row_Count(group_id);
          v_node_depth := to_number(Get_Group_Number_Cell('RECORD_STATISTICS.NODE_DEPTH', v_note_value));
          -- :BLOCK_BUDGET_PARAMETER.DI_SELECTED2 := v_node_depth;
          GO_BLOCK('BLOCK_STATISTICS_DETAIL');
          CLEAR_BLOCK;
          FOR i IN v_note_value .. total_rows LOOP
            IF v_node_depth = 4 THEN
              :BLOCK_STATISTICS_DETAIL.DI_TEMPLATE_SEQ := Get_Group_Number_Cell('RECORD_STATISTICS.NODE_SEQ', v_note_value);
              :BLOCK_STATISTICS_DETAIL.DI_DESCRIPTION :=
                   Get_Group_Number_Cell('RECORD_STATISTICS.SITE', v_note_value)
                || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.AHS_SITE', v_note_value)
                || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.PRIMARY_CD', v_note_value)
                || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.SECONDARY_CD', v_note_value)
                || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.SECONDARY_CD_DESC', v_note_value);
              :BLOCK_STATISTICS_DETAIL.DI_YR_AND_MNTH := Get_Group_Number_Cell('RECORD_STATISTICS.YR_AND_MNTH', v_note_value);
              :BLOCK_STATISTICS_DETAIL.TI_QUANTITY_STAT := Get_Group_Number_Cell('RECORD_STATISTICS.QUANTITY', v_note_value);
            ELSE
              IF Get_Group_Char_Cell('RECORD_STATISTICS.LEAF_NODE', i) = 'Y'
                 AND v_node_depth < to_number(Get_Group_Number_Cell('RECORD_STATISTICS.NODE_DEPTH', i)) THEN
                :BLOCK_STATISTICS_DETAIL.DI_TEMPLATE_SEQ := Get_Group_Number_Cell('RECORD_STATISTICS.NODE_SEQ', i);
                :BLOCK_STATISTICS_DETAIL.DI_DESCRIPTION :=
                     Get_Group_Number_Cell('RECORD_STATISTICS.SITE', i)
                  || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.AHS_SITE', i)
                  || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.PRIMARY_CD', i)
                  || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.SECONDARY_CD', i)
                  || ' - ' || Get_Group_Char_Cell('RECORD_STATISTICS.SECONDARY_CD_DESC', i);
                :BLOCK_STATISTICS_DETAIL.DI_YR_AND_MNTH := Get_Group_Number_Cell('RECORD_STATISTICS.YR_AND_MNTH', i);
                :BLOCK_STATISTICS_DETAIL.TI_QUANTITY_STAT := Get_Group_Number_Cell('RECORD_STATISTICS.QUANTITY', i);
                Next_Record;
              ELSIF v_note_value <> i AND v_node_depth = to_number(Get_Group_Number_Cell('RECORD_STATISTICS.NODE_DEPTH', i)) THEN
                EXIT;
              END IF;
            END IF;
          END LOOP;
          First_Record;
        END IF;
      END IF;
    END;
    I hope that made sense. I do not understand how data flows through forms just yet, or how to phrase my question in understandable terms. I do have some screenshots I could send to anyone willing to help.
    Thank you.

    Adding a column to the column specification does nothing.
    First of all, check the record group query in the record group properties:
    1) In the Forms Builder object tree, find the record group, then right-click > Property Palette.
    2) Look for the property (I can't remember its exact name) where the SELECT query is specified.
    3) Add the column you need to the query. The column specification will refresh automatically.
    There is one more way to specify a query for a record group. Look for calls of the POPULATE_GROUP_WITH_QUERY built-in in the form code.
    Forms 6i: menu Program > Find and Replace PL/SQL; Forms 10g: Edit > Find and Replace PL/SQL. In the search field, type POPULATE_GROUP_WITH_QUERY, then check the results for where your record group RECORD_STATISTICS is being populated programmatically. If no calls are found, the only data source is in the record group's properties.
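    If the record group does turn out to be populated programmatically, the call will look roughly like the sketch below (the view name STATISTICS_VIEW is hypothetical; the column list is taken from the trigger code in the question). The new column must be added to this query string as well, since a query passed at runtime replaces the design-time query:

```plsql
-- Sketch: runtime repopulation of a record group in Oracle Forms.
-- A column added only in the property palette never reaches the group
-- if the group is repopulated like this.
DECLARE
  rg_id  RecordGroup;
  status NUMBER;
BEGIN
  rg_id  := Find_Group('RECORD_STATISTICS');
  status := Populate_Group_With_Query(rg_id,
              'SELECT node_seq, node_depth, leaf_node, site, ahs_site,'
           || '       primary_cd, secondary_cd, secondary_cd_desc,'
           || '       yr_and_mnth, quantity'
           || '  FROM statistics_view');  -- hypothetical view name
  IF status <> 0 THEN
    Message('Failed to populate RECORD_STATISTICS');
  END IF;
END;
```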

  • Source data for pipeline report

    Greetings,
    I want to create a pipeline report that shows:
    - Number of Leads
    - Number of Leads converted into Opportunities
    - Number of Opportunities
    - Number of Won Opportunities
    I used the funnel design key figure for leads, but it lacks a filter for how many opportunities have been won.
    Would anyone have a suggestion?
    Thanks,
    Julio Zarnitz

    Hi Julio,
    Since a single report doesn't satisfy all your field requirements, you can use a combined data source and map the desired key figures and characteristics to get the required results
    (e.g., you can try a combination of the Lead funnel and the Opportunity funnel, as together they will have all your required fields).
    Refer below thread for detailed steps
    http://scn.sap.com/docs/DOC-63151
    Regards,
    Surjeet Bhati

  • Source data for Goods Receipt ledger transactions

    Hi,
    I've been tasked with writing a custom report similar to transaction KSB1 (Display Actual Cost Line Items for Cost Centers), but with vendor number and name added.
    I can write the ABAP but need some help identifying the source tables.
    The fields on the report below look like they come from the accounts payable ledger, but I'm not sure which table that is.  Any help is greatly appreciated.
    Cost Center
    Cost Element
    Period
    Cost Element Name
    Document Type
    Document No.
    User
    Purchase Order Text
    Purchasing Document
    Document Header Text
    Value in Report Currency

    Hi,
    You will find this information in the MSEG table: item details for material documents (you will find the header information in MKPF).
    I think this is a better strategy than using the G/L line items to find the vendor.
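    As a rough sketch of that approach (table and field names follow the standard ECC dictionary; the movement types and the join to LFA1 for the vendor name are assumptions to adapt to your requirements):

```abap
* Sketch: goods-receipt line items with vendor number and name.
* MSEG = material document items, MKPF = headers, LFA1 = vendor master.
TYPES: BEGIN OF ty_gr,
         mblnr TYPE mseg-mblnr,  " material document number
         zeile TYPE mseg-zeile,  " item
         kostl TYPE mseg-kostl,  " cost center
         lifnr TYPE mseg-lifnr,  " vendor number
         name1 TYPE lfa1-name1,  " vendor name
         budat TYPE mkpf-budat,  " posting date
       END OF ty_gr.
DATA lt_gr TYPE STANDARD TABLE OF ty_gr.

SELECT m~mblnr m~zeile m~kostl m~lifnr l~name1 k~budat
  INTO CORRESPONDING FIELDS OF TABLE lt_gr
  FROM mseg AS m
  INNER JOIN mkpf AS k ON k~mblnr = m~mblnr
                      AND k~mjahr = m~mjahr
  LEFT OUTER JOIN lfa1 AS l ON l~lifnr = m~lifnr
  WHERE m~bwart IN ('101', '102').  " goods receipt and its reversal
```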
    Goodluck,
    Paul

  • Defining Authorizations for User to restrict the data in report.

    Hi Gurus,
    I have no idea about the authorization concept in BI. Can anyone please give me the steps for creating authorization objects, roles, and profiles to restrict data for users?
    Ex.
    I have the functional location InfoObject checked as authorization-relevant, with the data below.
    FL001
    FL002
    FL003
    FL004
    FL005
    FL006
    FL007
    FL008
    FL009
    We have users like below.
    User1
    User2
    User3
    Now, if User1 is analysing a report, he should see only FL001, FL005 and FL009; the rest have to be omitted.
    If User2 is analysing that report, he should see only FL002, FL003 and FL009, and likewise for the others.
    So please help me by providing the complete steps. I have tried something but failed.
    Thanks in advance
    Peter.

    Hello Peter,
    Please go through the following links
    Authorization :
    http://help.sap.com/saphelp_nw70/helpdata/en/59/fd8b41b5b3b45fe10000000a1550b0/frameset.htm
    SAP Authorization Concept :
    http://help.sap.com/saphelp_nw70/helpdata/en/52/671285439b11d1896f0000e8322d00/frameset.htm
    Thanks.
    With regards,
    Anand Kumar

  • Using sqlldr when source data column is 4000 chars

    I'm trying to load some data using sqlldr.
    The table looks like this:
    col1 number(10) primary key
    col2 varchar2(100)
    col3 varchar2(4000)
    col4 varchar2(10)
    col5 varchar2(1)
    ... and some more columns ...
    For current purposes, I only need to load columns col1 through col3. The other columns will be NULL.
    The source text data looks like this (tab-delimited) ...
    col1-text<<<TAB>>>col2-text<<<TAB>>>col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    END-OF-RECORD
    There's nothing special about the source data for col1 and col2.
    But the data for col3 is (usually) much longer than 4000 chars, so I just need to truncate it to fit varchar2(4000), right?
    The control file looks like this ...
    LOAD DATA
    INFILE 'load.dat' "str 'END-OF-RECORD'"
    TRUNCATE
    INTO TABLE my_table
    FIELDS TERMINATED BY "\t"
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    col1 "trim(:col1)",
    col2 "trim(:col2)",
    col3 char(10000) "substr(:col3,1,4000)"
    I made the column 3 specification char(10000) to allow sqlldr to read text longer than 4000 chars.
    And the subsequent directive is meant to truncate it to 4000 chars (to fit in the table column).
    But I get this error ...
    Record 1: Rejected - Error on table COL3.
    ORA-01461: can bind a LONG value only for insert into a LONG column
    The only solution I found was ugly.
    I changed the control file to this ...
    col3 char(4000) "substr(:col3,1,4000)"
    And then I hand-edited (truncated) the source data for column 3 to be shorter than 4000 chars.
    Painful and tedious!
    Is there a way around this difficulty?
    Note: I cannot use a CLOB for col3. There's no option to change the app, so col3 must remain varchar2(4000).

    You can load the data into a staging table with a CLOB column, then insert into your target table using SUBSTR, as demonstrated below. I have truncated the data display to save space.
    -- load.dat:
    1     col2-text     col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    XYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY
    YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY
    END-OF-RECORD
    -- test.ctl:
    LOAD DATA
    INFILE 'load.dat' "str 'END-OF-RECORD'"
    TRUNCATE
    INTO TABLE staging
    FIELDS TERMINATED BY X'09'
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    col1 "trim(:col1)",
    col2 "trim(:col2)",
    col3 char(10000)
    SCOTT@orcl_11gR2> create table staging
      2    (col1 varchar2(10),
      3       col2 varchar2(100),
      4       col3 clob)
      5  /
    Table created.
    SCOTT@orcl_11gR2> host sqlldr scott/tiger control=test.ctl log=test.log
    SCOTT@orcl_11gR2> select * from staging
      2  /
    COL1
    COL2
    COL3
    1
    col2-text
    col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    XYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY
    YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY
    1 row selected.
    SCOTT@orcl_11gR2> create table my_table
      2    (col1 varchar2(10) primary key,
      3       col2 varchar2(100),
      4       col3 varchar2(4000),
      5       col4 varchar2(10),
      6       col5 varchar2(1))
      7  /
    Table created.
    SCOTT@orcl_11gR2> insert into my_table (col1, col2, col3)
      2  select col1, col2, substr (col3, 1, 4000) from staging
      3  /
    1 row created.
    SCOTT@orcl_11gR2> select * from my_table
      2  /
    COL1
    COL2
    COL3
    COL4       C
    1
    col2-text
    col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    more-col3-text
    XYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY
    YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY
    1 row selected.

  • Source data is blank

    Hi,
    We are using Web ADI to generate letters in Fastpath -> Assignment form in HR.
    We follow the following steps
    1. Create a view in apps schema.
    2. Create HR Integrator
    3. Create the Form function association
    4. Shutdown and restart the instance.
    5. Define a layout for the Integrator.
    6. Navigate to Fastpath Assignment -> click on the Export Data icon -> select the Integrator name from the dropdown -> download the source data into the Excel file.
    7. This Excel file is used as source data for a mail merge with the Word document used to generate the letters.
    We have used these steps successfully for at least 10 different integrators.
    Recently, I am trying to create a new integrator but the source file that I am downloading is blank. It is not getting any data.
    But the view is running fine in sqlplus and returning data.
    I have used the same view earlier for other Integrators and it was working fine. This time I just added a few more columns.
    Any idea why the source is coming blank ?
    Any help would be highly appreciated.
    Regards,
    -Deb

    I'd start by ensuring that the Web ADI logging level is set to Trace. Then run through a download and check the log file to see what errors/warnings are occurring.
    This probably isn't the type of error that an end-user would appreciate, and it sounds like a development-type error, so the assumption is that the details go to the log file, where a developer would look, rather than being redirected to the limited amount of space in the end-user UI.

  • Source Data option won't display in combo box properties.

    I am trying to use a combo box to display a series of data, but when activated, the source data for the combo box is not available. Any idea what the problem might be and how I can resolve it?
    Thanks.

    The source data option is not available for certain insert types because they do not need source data.
    These options are position, label and status list.
    If you switch to any of the other insert types, you can then bind your source data.

  • Important tables used for Configuring BI apps Irrespective of source data

    Hello Folks,
    I am trying to find out what common tables, columns, or subject areas are available in ETL/BI Apps,
    irrespective of the source application the data comes from.
    For example, whether I am using Siebel CRM, Oracle Apps, or PeopleSoft,
    these will have pre-configured adapters in the ETL workflow designer,
    but irrespective of the source, I think there are some common tables or columns used by DAC or the ETL for internal purposes.
    So I want to know what they are, where they are, and what their purpose is.
    If possible, please explain in detail.
    thank you
    kumr

    Hi,
    This information is all available in the data model reference for the applications version you are looking at, for instance:
    Oracle® Business Analytics Warehouse
    Naming Conventions and Domain Values Guide
    Version 7.9.5
    E12088-01
    Navigate to the section:
    2.3 Internal Tables in Oracle Business Analytics Warehouse
    This shows the internal tables used by BI Apps in the transactional database and the data warehouse.
    Regards,
    Matt
    Edited by: mod100 on 07-Sep-2009 07:03 - Added tags

  • Restrict WebI data sources

    Hi All,
    Is it possible to restrict data source access to a particular set of users in webi?
    For example:
    User A should see only excel as a data source.
    User B should see only Bex
    And so on.
    Thanks
    Vinayak

    Hi,
    You can prevent users from seeing the BEx and Analysis View sources by denying the right "Interfaces - enable Rich Internet Application" on the Web Intelligence application. After that, users can only see universes and no other data source.
    For individual restrictions on data sources, I don't think there is an option.
    Another way around it: if you restrict the users' access to the Connection/Universe folders, then users cannot create anything, and there will be no point in selecting the data sources. In this case, depending on the access requirements, you need to save the OLAP/relational connections and analysis views in different folders.
    Regards,
    Amit

  • Sharepoint Web Analytics no data for reports

    Hi all,
    I would like some assistance in troubleshooting the web analytic problem I am experiencing with one of my farms.
    My web analytics reports stopped being generated recently, and the ULS logs have not been helpful so far.
    I have checked the following: 
    Verified that the Web Analytics Data Processing Service and Web Analytics Web Service are started on the server.
    Made sure web analytics service application is started.
    .USAGE files are being generated on the WFE
    Noticed that the SharePoint logging database (LoggingDB) does NOT contain any information.
    I think this is where my problem resides, I have tried to create a new Web Analytics services application, with a new staging and logging database, but no luck. 
    I have also tried to create a new wss_logging db, still no luck.
    I have made sure that the account that runs web analytics has the correct SQL db right on the logging db. still no luck.
    Your assistance is highly appreciated.
    Clifford - South Africa

    Generally it takes 24 hours to generate analytics data.
    Check if below jobs are active:
    Diagnostic Data Provider: Event Log
    Diagnostic Data Provider: Performance Counters – Database Servers
    Diagnostic Data Provider: Performance Counters – Web Front Ends
    Microsoft SharePoint Foundation Usage Data Import
    Microsoft SharePoint Foundation Usage Data Processing
    http://pinchii.com/home/2012/03/site-web-analytics-reports-show-no-data-in-sharepoint-2010/
    Also try steps from:
    http://blogs.msdn.com/b/sharepoint_strategery/archive/2012/03/16/troubleshooting-sharepoint-2010-web-analytics.aspx
    http://social.technet.microsoft.com/Forums/sharepoint/en-US/0728b722-29e1-4de3-87ee-45ac40242ccf/web-analytics-no-data-for-sites?forum=sharepointadminprevious

  • Data Source to upload plan data for CO_OM_CCA_: CC: Costs and Allocations

    Hi Guru's,
    We have a DataSource that uploads actual data for CCA (0CO_OM_CCA_1 - Cost Centers: Costs and Allocations).  It is a full upload every time; before loading the data, the previous load request is deleted.
    One more piece of information: I have checked the BW cube, and plan data is available in the InfoCube for one cost center, while for the rest of the cost centers there is no data. Now the users are requesting that we upload the plan data for the rest of the cost centers.
    As per the business requirement, the users want the plan data uploaded into the same cube, and I am not sure whether this DataSource will pull the plan data along with the actuals into the BW cube. Is there an alternative with the same DataSource, or is there a specific DataSource for uploading Costs and Allocations plan data?
    Please suggest how I can go ahead with this requirement.
    Thanks in Advance.
    Ganni

    You can use the same DataSource to load plan/budget data to the cube for all the cost centers. Regarding the actuals, you can use the 0CO_OM_CCA_9 DataSource to load the actuals cube, and create a MultiProvider on top of both the plan and actuals cubes for reporting purposes.
