BI Export Import

ERROR: 2011-06-12 01:50:01 com.sap.inst.migmon.LoadTask run
Unloading of 'SAPDODS' export package is interrupted with R3load error.
Process 'E:\bin\uc\R3load.exe -e SAPDODS.cmd -datacodepage 4103 -l SAPDODS.log -stop_on_error' exited with return code 2.
For mode details see 'SAPDODS.log' file.
Standard error output:
sapparam: sapargv( argc, argv) has not been called.
sapparam(1c): No Profile used.
sapparam: SAPSYSTEMNAME neither in Profile nor in Commandline
==============SAPDODS.log FILE===============
(EXP) TABLE: "/BIC/AZSD_O2040" #20110612013025
(EXP) ERROR: DbSlPrepare/BegRead failed
  rc = 103, table "/BIC/B0000104000"
  (SQL error 208)
  error message returned by DbSl:
Invalid object name '/BIC/B0000104000'.
Statement(s) could not be prepared.
(DB) INFO: disconnected from DB
E:\bin\uc\R3load.exe: job finished with 1 error(s)
E:\bin\uc\R3load.exe: END OF LOG: 20110612013026
===========================================
While doing the export/import process I am getting an error. Can you please advise what the reason could be?
-Addi

The indexes of all fact and E-fact tables were not correct. After confirmation from SAP, we ran the program RSDD_MSSQL_CUBEANALYZE to fix the issue.
After that, the issue was resolved.

Similar Messages

  • Memory Limitation on EXPORT & IMPORT Internal Tables?

    Hi All,
    I need to export and import internal tables to memory. I do not want to export them to any database tables. Is there a limitation on the amount of memory that can be used for EXPORT & IMPORT? I will free the memory once I import it. The maximum I expect would be 13,000,000 lines.
    Thanks,
    Alex (Arthur Samson)

    You don't have a fixed limitation, but try to keep your table as small as possible.
    Otherwise, if you are familiar with the ABAP OO context, try using Shared Objects instead of EXPORT/IMPORT (a plain memory-based sketch follows below this reply).
    SAP Help on Shared Objects: http://help.sap.com/saphelp_erp2004/helpdata/en/13/dc853f11ed0617e10000000a114084/frameset.htm
    Hope this helps,
    Roby.
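    A minimal sketch of the memory-based variant, assuming an illustrative table LT_DATA and memory ID 'ZBIG' (neither is from this thread). The practical ceiling is the work process's session memory (profile parameters such as ztta/roll_extension), which is why Shared Objects are suggested for very large tables:

      * Hedged sketch: EXPORT/IMPORT via ABAP memory. LT_DATA and the
      * memory ID 'ZBIG' are illustrative assumptions.
      DATA lt_data TYPE TABLE OF string.

      * In the exporting step:
      EXPORT tab = lt_data TO MEMORY ID 'ZBIG'.

      * In the importing step (same user session):
      IMPORT tab = lt_data FROM MEMORY ID 'ZBIG'.
      IF sy-subrc <> 0.
        " Nothing was exported under this ID.
      ENDIF.

      * Release the memory once the data has been imported:
      FREE MEMORY ID 'ZBIG'.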

  • Using set/get parameters or export/import in BSP.

    Hi All,
    Is it possible to use set/get or export/import in BSP?
    We need to set/export some variables from a BADI and get/import them in the BSP application.
    Code snippet will be of great help..
    Thanks,
    Anubhav

    Hi Anubhav,
    You can use the EXPORT/IMPORT statements for your requirement. From the BADI, use EXPORT to send the variable data to a unique memory location with IDs, e.g.:

      * Data declaration required for background processing
      DATA: wa_indx TYPE indx.

      * Here CNAME is the variable you want to export
      EXPORT pname = cname TO DATABASE indx(xy)
             FROM wa_indx CLIENT sy-mandt ID 'ZVAR1'.

    In the BSP application, use the IMPORT statement to fetch back the values set with the IDs above:

      IMPORT pname = lv_cname FROM DATABASE indx(xy)
             TO wa_indx CLIENT sy-mandt ID 'ZVAR1'.

    Delete the data afterwards so it does not waste memory:

      DELETE FROM DATABASE indx(xy) CLIENT sy-mandt ID 'ZVAR1'.

    A sketch of the set/get parameter variant mentioned in the question follows below.
    Regards,
    Samson Rodrigues
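    Since the question also asked about set/get parameters, here is a hedged sketch of that variant; the parameter ID 'ZVR' is an assumption and would have to exist in table TPARA. Note that SPA/GPA parameters live in the SAP GUI user session memory, so they generally do not carry over into a BSP request, which is why the database cluster approach above is the more reliable choice here:

      * Hedged sketch of the SET/GET parameter alternative. The
      * parameter ID 'ZVR' is an assumption (it must exist in TPARA).
      * In the BADI:
      SET PARAMETER ID 'ZVR' FIELD cname.

      * In the receiving program (same SAP GUI session only):
      GET PARAMETER ID 'ZVR' FIELD lv_cname.
      IF sy-subrc <> 0.
        " No value was set under this parameter ID.
      ENDIF.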

  • Issue with Memory ID export / import

    Hi Experts,
    We are facing a strange issue during export/import via a memory ID.
    We are exporting a value to a memory ID from a module pool program and trying to import the same value in a SAP workflow method.
    While importing the value from the memory ID, the import fails with sy-subrc = 4.
    Any idea how we can do the export/import from a module pool program to a workflow method?
    Regards,
    Sanjana

    Hi sanjana,
    Please check the link. Here you can find some examples also.
    http://wiki.scn.sap.com/wiki/display/Snippets/Import+and+Export+to+Cluster+Databases
    P.S.
    http://help.sap.com/saphelp_nw04/helpdata/en/fc/eb3bf8358411d1829f0000e829fbfe/content.htm
    You just need to create a key and an ID to save the data into the INDX table. Once it is saved, you can use the same key and ID to import it.
    Again, don't forget to delete the data using the same key and ID once you have imported it, else it may give an error (see the sketch below this reply).
    Regards,
    Sandeep Katoch
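    A likely cause of the sy-subrc = 4: EXPORT ... TO MEMORY ID writes to ABAP memory, which exists only within one user session, and a workflow method usually runs in a different session (for example via tRFC or in the background), so the IMPORT finds nothing. Below is a minimal sketch of the INDX cluster route described in the links above; the area 'ZW' and the ID 'ZWF_KEY' are illustrative assumptions, and in practice the ID should carry a unique key (for example the workflow object key):

      * Hedged sketch: hand data over via cluster table INDX instead
      * of ABAP memory. Area 'ZW' and ID 'ZWF_KEY' are assumptions.
      DATA: wa_indx  TYPE indx,
            lv_value TYPE string.

      * In the module pool program:
      EXPORT value = lv_value TO DATABASE indx(zw)
             FROM wa_indx CLIENT sy-mandt ID 'ZWF_KEY'.

      * In the workflow method:
      IMPORT value = lv_value FROM DATABASE indx(zw)
             TO wa_indx CLIENT sy-mandt ID 'ZWF_KEY'.
      IF sy-subrc <> 0.
        " Key/ID mismatch, or the export never ran.
      ENDIF.

      * Clean up after a successful import:
      DELETE FROM DATABASE indx(zw) CLIENT sy-mandt ID 'ZWF_KEY'.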

  • Best choice for exporting / importing EUL

    Hi all
    I have been tasked with migrating an EUL from an R11 to an R12 environment. The Discoverer version in both environments is 10.1.2, and the OS is Solaris on Oracle DBs.
    Unfortunately I am not experienced with Discoverer, and there seems to be no one available to assist for various reasons. So I have been reading the manual and forum posts and viewing Metalink articles.
    I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen and the log file just ended.
    I assumed a memory problem or slow network was causing this delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
    So I decided to try the full import on the server to negate the first problem I experienced. Due to the client's security policies I am not able to open the source EUL and send it to our dev. I was able to get it from their dev R11 system, but I dismissed this because the dev reports were not working and the only reliable EUL is the Prod one. I managed to get a Prod eex file from a client resource, but the upload to my server was extremely slow.
    I asked the DBA to assist with the third option of exporting a db dump of EUL_US and importing this into my R12 dev environment. I managed this, but had to export the db file using sys, which alleviated a privilege problem when logging in. I have reports that run and my user can see the reports, but reports that were not shared to sysadmin in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her reports, only the sysadmin ones.
    I refreshed the BAs using a shell script I made up which uses the java cmd with parameters.
    After some rereading I tried selecting all the options in the Validate menu and refreshing in the Discoverer admin tool.
    If I validate and refresh the BA using the admin tool, I get the hanging screen and a lot of warnings that items are missing (so much for my java cmd refresh!), and now the report will not open and I see the substitute-missing-item dialogue boxes.
    My question to the forum: which would be the best approach to migrate the entire EUL from an R11 instance to an R12 instance in these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The OS and DB details are as follows:
    Source:
    eBus 11.5.2
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Yes, the DBA initially did an exp for me using EUL_US as the owner, but a strange thing happened with privileges, and also some of the imported tables appeared in the target environment under the apps schema (21 tables) even though the eul_us exp had contained 48 tables.
    I also had a problem on the db with "eul_us has insufficient privileges on tablespace discoverer" type errors.
    I checked the eul_us db privileges and was unable to resolve this initial privilege error, even though the privileges were granted to eul_us.
    The DBA managed to exp as system and then import it with the full=y flag in the import command, which seems to bring in the privileges.
    Then I ran eul5_id.sql, made up a list of the business areas and wrote a sh script to refresh the business areas as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully, and I can log in, select a business area and grant access to the users. The reports return data.
    Then one of the users said she can't see all her reports. If I opened Desktop, I noticed some that were sitting there prefixed with a hash and her version 11 user id.
    So back to the manuals; in the Disco admin help the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about items that are missing. I assume this is because the item identifiers brought across in the db dump are the version 11 ones and thus are not found in the new system.
    Any suggestions?
    Many thanks
    Nick

  • Project duplication:  Copy/Paste vs. Export/Import

    Anyone have any thoughts on project duplication?
    Presently I am copying a project and then pasting it into multiple new locations, then renaming them. Recently I was in a training session where the instructor suggested exporting the project and then importing it, claiming database size savings. Can anyone shed any insight?
    Thanks in advance.

    I can't imagine why an export/import would leave a smaller footprint within the database than a project copy/paste. Not to mention that, for a relatively large project, it can take a substantial amount of time to export and import. One of our schedules, at about 18k activities, takes approximately 1.5 hours to export/import, whereas a project copy/paste only takes about 10 minutes. Note that we do not copy the baseline schedules when replicating projects.

  • Error when exporting/importing

    I am using MDM 7.1
    When I export the schema from my Dev system to import into QA I get the following Error:
    "This repository requires additional steps before transport. See the MDS log for details.
    In the log my issue is that I am trying to export an Assignment that includes an expression that uses "look-ups".
    In my Dev system I removed the expression to confirm if this is the issue, once I no longer have expressions with look-ups then it allows me to export the schema. I then tries to import it to QA (since the expressions are not changing I planned on excluding them from the import as a temporary workaround.
    however I get the same error message when trying to import. It seems that I can not export or import with a system that has an assignment with an expression that uses look-ups.
    Is there some config I am missing?

    Hi Brad,
    assignments/validations are a general problem when it comes to schema exports/imports! What you can do, in case there are not too many assignments, is to delete the assignments and recreate them manually after you have imported the schema.
    Hope this helps a little.
    Regards,
    Erdal

  • Export/Import parameters disappear when using a user exit

    I am using some import/export parameters in a dynamic action when I create a new record (infotype). I am also using a user exit to avoid modifying BEGDA and ENDDA when I modify the record (IPSYST = 'MOD'). With this user exit, the parameters disappear from memory, so the dynamic action does not execute properly. What can I do to keep using the user exit? Anything to add?

    In the dynamic actions, when I create a record, I delimit records on the infotype with export/import parameters defined in the infotype module pool. When I delete, I avoid deleting the record if it is not the last one. With the user exit, the modification of BEGDA/ENDDA in the infotype is not allowed. If I use the user exit, the dynamic actions which use export/import parameters don't work.
    I have tried to do in the module pool what I do in the user exit, but it is not easy because I haven't got what I need in PSAVE.

  • Error in Client export/import

    Dear all,
    I have a test system, and I would like to perform a client export/import within this single system; correct me if I have made a mistake.
    I exported the client, which created three requests, namely SIDKOxxxxxx, SIDKTxxxxxx and SIDKXxxxxx. I selected the same SID as the target system because I have only one system.
    Then I logged in again, created a new client with SCC4, logged into that client with SAP* and its password, and ran STMS_IMPORT to import all three transports into that client; as I said, my message server gets disconnected. Correct me if I am wrong:
    should I instead run SCC7 directly, since the requests are already in /usr/sap/trans, so that these errors won't occur? I have a backup of the database.
    Right now I am getting an "Oracle disconnected" error.
    Regards

    Dear Pavan
    I tried last night as you said: I made a new client 300 in IDES and copied a user profile to it.
    When I tried to configure STMS by setting CTC = 1 and then configured the transport routes, I did not find any client number
    where I could provide the source and target client.
    Kindly send some screenshots if you have them; that would be a great support.
    1. Delete the existing TMS configuration.
    2. In client 000, using the user DDIC, create the TMS configuration using STMS.
    3. Click on the system; in the Transport Tool tab, add the parameter CTC and set it to 1.
    4. Now configure the transport routes; there you can find the client number in the consolidation routes.
    5. Provide your source and destination client and create the routes.
    Regards

  • Problem with EXPORT IMPORT PROCESS in ApEx 3.1

    Hi all:
    I'm having a problem with the EXPORT IMPORT PROCESS in ApEx 3.1
    When I export an application and try to import it again, I get this error message:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 16, column 28: PLS-00103: Encountered the symbol "牃慥整㈰㈯⼴〲㐰〠㨷㐵㈺′䵐" when expecting one of the following: ( - + case mod new not null <an identifier> <a double-quoted delimited-identifier> <a bind variable> avg count current exists max min prior sql stddev sum variance execute forall merge time timestamp in
    As a workaround, I checked the exported file and found this:
    wwv_flow_api.create_flow
    p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ
    And when I replace with this
    p_documentation_banner=> ' ',
    I can import the application without the error.
    Does somebody know why I have to do this?
    Thank you all.
    Nicolas.

    Hi,
    This issue seems to have been around for a while:
    Re: Error importing file
    I've had similar issues and made manual changes to the file to get it to install correctly. In my case, I got:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful.
    ORA-02047: cannot join the distributed transaction in progress
    begin execute immediate 'alter session set nls_numeric_characters='''||wwv_flow_api.g_nls_numeric_chars||'''';end;
    There are several suggestions, if you follow that thread, about character sets or reviewing some of the line breaks within PL/SQL code within your processes etc. Not sure what would work for you.

  • Oracle 9 spatial index export/import

    Hi,
    when exporting/importing a user via exp/imp, I encounter a problem with the numeric character encoding during creation of a spatial index.
    The imp tool produces a script like this:
    "BEGIN "
    "execute immediate 'INSERT INTO USER_SDO_GEOM_METADATA values (''POINTS'',''GEOMETRY'',mdsys.SDO_dim_array(MDSYS.SDO_DIM_ELEMENT(''X'',50000000,160000000,,005),MDSYS.SDO_DIM_ELEMENT(''Y'',450000000,600000000,,005)),,005) ' ; "
    "COMMIT; END;"
    The problem is the wrong representation of the numeric value 0.005 as ',005'.
    Originally, the index metadata was created via:
    INSERT INTO USER_SDO_GEOM_METADATA
    VALUES (
    'points',
    'geometry',
    MDSYS.SDO_DIM_ARRAY(
    MDSYS.SDO_DIM_ELEMENT('X',-125000000,-115000000, '0.005'),
    MDSYS.SDO_DIM_ELEMENT('Y', 30000000, 42100000, '0.005')),
    NULL);
    Any hints on how to reimport the index correctly?
    Thanks,
    Bogart

    You might need to set the NLS_LANG environment variable to get character set conversion done on import (I don't know this for sure, unfortunately; it has never been a problem for me).
    There is a section on character set conversions in the Utilities manual.

  • Offline Instantiation of a Materialized View Site Using Export/Import

    Has anyone had any success performing offline instantiation of a materialized view site using export/import in Oracle9?

    This is what I wanted to ask in the forum. I want to use Data Pump for the initial instantiation because I believe that indexes (the actual index data, not just the definitions) are also replicated using Data Pump.
    So, is it possible to do the instantiation using Data Pump?

  • Upgrade from 8.1.6 to 9.2.0.7 via export/import

    Has anyone done this, or does anyone know if it's possible to simply build a 9.2.0.7 instance and then use export/import to perform the upgrade from 8.1.6 to 9.2.0.7, since a direct manual upgrade from 8.1.6 is not supported?
    TIA

    Yes. Building a database in the new release, exporting the existing database using its own version of the exp utility, performing a binary copy of the dump file to the new server, and then running the new version's imp utility on the transferred dmp file was the standard method of moving databases to new platforms/versions for many years. The same process can be used on the same server between equal releases or from a current to a new release.
    HTH -- Mark D Powell --

  • System copy using SAPinst (export/import database-independent process) failed

    Hello,
    I am doing a system copy using the SAPinst export/import process.
    Source system: SAP NetWeaver '04 (BW 3.5, kernel: UC 640, patch level 196)
    The export process fails at Phase 2 - Database Export, with R3load jobs: running 1, waiting 0. Below are the log details:
    SAPSDIC.log
    (EXP) INFO:  entry for BAPICONTEN                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICONTENT255                    in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20031127161249
    (EXP) INFO:  entry for BAPICONVRS                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20010131174038
    (EXP) INFO:  entry for BAPICREATORDATA                   in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICRMDH1                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMDH2                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMEXP                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175513 > 20031211120627
    (EXP) INFO:  entry for BAPICRMEXT                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120627
    (EXP) INFO:  entry for BAPICRMKEY                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMKEY_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMMSG                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMMSG_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMOBJ                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120628
    (EXP) INFO:  entry for BAPICRMPAREX_T                    in DDNTT is newer than in DDNTT_CONV_UC: 20051229175452 > 20031211120305
    (EXP) INFO: limit reached, 5000 tables in DDNTT are newer than in DDNTT_CONV_UC
    (EXP) INFO: NameTab check finished. Result=2  #20100113131216
    (EXP) INFO: check for inactive NameTab entries: Ok.
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 20100113131216
    ***SAPCLUST.log ****
    (NT)  Warn:  EDIDOC: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "EDIDOC"
    (NT)  Warn:  PCDCLS: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "PCDCLS"
    (NT)  Warn:  SFHOA: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHOA"
    (NT)  Warn:  SFHYT: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHYT"
    (NT)  Warn:  UMG_TEST_C: normal NameTab from 20040211095029 younger than alternate NameTab from 20031113150115!
    (EXP) TABLE: "UMG_TEST_C"
    myCluster (55.22.Exp): 712: error when retrieving alternate nametab description for physical table UMG_TEST_F.
    myCluster (55.22.Exp): 713: return code received from nametab is 32
    myCluster (55.22.Exp): 299: error when retrieving physical nametab for table UMG_TEST_F.
    (CNV) ERROR: data conversion failed.  rc = 2
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 2010011312563
    Please suggest.
    Thanks & Regards
    Ganesh

    Is your DB Unicode? If so, did you select the Unicode flag in SAPinst?
    The thread "System Copy Error while exporting ABAP" might offer some help.
    -Zach

  • Regarding Distribution Monitor for export/import

    Hi,
    We are planning to migrate a 1.2 TB database from Oracle 10.2 to MaxDB 7.7, and we are currently testing the migration on a test system.
    First we tried a simple export/import, i.e. without Distribution Monitor: we were able to export the database in 16 hrs, but the import had been running for more than 88 hrs, so we aborted it. Later we found that we could use Distribution Monitor to spread the export/import load across multiple systems so that the import completes within a reasonable time. We used two application servers for the export/import; the export completed within 14 hrs, but again the import ran for more than 80 hrs, so we aborted it. We also did table splitting for the big tables, but no luck. Eight parallel processes were running on each of the servers, i.e. one CI and two app servers. We followed the DistributionMonitorUserGuide document from SAP.
    I observed that on the central system, CPU and memory utilization was above 94%, but on the two added application servers the CPU and memory utilization was very low, around 10%. Please find the system configuration below:
    Central Instance - 8CPU (550Mhz) 32GB RAM
    App Server1 - 8CPU (550Mhz) 16GB RAM
    App Server2 - 8CPU (550Mhz) 16GB RAM
    Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state, while the other seven R3load processes were sleeping; on the central instance, all eight R3load processes were in the run state. I think the fact that not all eight R3load processes were running at a time on the app servers could be the reason for the very slow import.
    Can someone please let me know how to improve the import time? Also, if someone has done a database migration from Oracle 10.2 to MaxDB, it would be helpful to hear how they did it, and whether any specific document is available for a database migration from Oracle to MaxDB.
    Thanks,
    Narendra

    > Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state, while the other seven were sleeping; on the central instance, all eight were running. I think that could be the reason for the very slow import.
    > Can someone please let me know how to improve the import time?
    R3load connects directly to the database and loads the data. The question here is: how is your database configured (in terms of caches and memory)?
    > Also, if someone has done a database migration from Oracle 10.2 to MaxDB, it would be helpful to hear how they did it, and whether any specific document is available.
    There are no such documents available, since the process of migrating to another database is called a "heterogeneous system copy". This process requires a certified migration consultant to be on-site to do/assist the migration. Those consultants are trained specially for certain databases and know the tips and tricks for improving migration time.
    See
    http://service.sap.com/osdbmigration
    --> FAQ
    For MaxDB there's a special service available, see
    Note 715701 - Migration to SAP DB/MaxDB
    Markus

  • Client Copy Export/Import on large system. 1.5 TB

    Hi,
    We have a 3 tier landscape. DEV, TST, PRD.
    Database size in PRD is 1.5 TB (1500 GB)
    Our developers need fresh master and trans data in TST and DEV.
    We are thinking about making a
    PRD --> TST (1.5 TB): we do a system copy.
    TST --> DEV (1.5 TB): we can't do a system copy. (Company policy says not to touch repository objects, version management, whatsoever in DEV...)
    1. How long will a client export/import take from TST --> DEV, let's say we have "good HW"? Will it work?
    2. I've read about TDMS, but it's too expensive... Is there maybe any 3rd-party tool that is good for this purpose? Transferring 1.5 TB of data...
    Br,
    Martin

    Hi,
    > 1. How long will a client export/import take from TST --> DEV, let's say we have "good HW"? Will it work?
    In my opinion, a 1.5 TB database is much too big for a client copy.
    It will work, but it may last a full week and would degrade the production system's performance.
    We have 1.3 TB databases and stopped using client copies when the system was 300 GB...
    We only use database restores and renaming for system refreshes, even for the dev system.
    There is a procedure to keep the versioning of sources.
    Regards,
    Olivier
