Some thoughts on Active Notes

Not sure if anyone from Nokia software development reads these fora, but if they do, here's what I'd want to tell them regarding the Active Notes application.
The program is too slow. Sure, this may depend on the hardware it runs on, but on my E51 it's not quick enough. When I type a note the letters appear on the screen with a delay - even when I use multitap rather than T9.
The application needs a "view" mode so that one could browse long notes one page at a time, for instance by tapping the right and left keys of the "navi scroll" key. Currently, when one wants to read a note the application goes directly into edit mode, and the only way to scroll (on the E51) is to go line by line - and even that is slow.
There's a bug where, when I write new text in an already-existing note, the old text in that note appears to be bold but it isn't - it's just a bigger style of font. The new text is very thin in comparison, despite being the same point size.
There needs to be an option to select all text in a note.
I would personally skip the bold, italics, etc. options, but I can understand why people want them.
There should be an undo feature.
/ph

These seem more like early firmware problems. Wait for an upgrade, and maybe it will work better.

Similar Messages

  • HELP! I had iPhoto on my Mac; for some reason it was not responding. I downloaded a new version but can't find my photos. Any ideas, please? I thought once on the Mac, always on the Mac.

    HELP! I had iPhoto on my Mac; for some reason it was not responding or damaged. I downloaded a new version but can't find my photos. Any ideas, please? I thought once on the Mac, always on the Mac.

    Sorry, but we need details to help.
    What version of the OS do you have? What has changed since iPhoto worked?
    What exactly is the problem? You say "it was not responding", you downloaded a new version (what version?), and you can't find your photos - none of which is very specific.
    Look in your Applications folder - is there an iPhoto application there? If so, Get Info and report the version.
    Look in the Pictures folder - is there an iPhoto library there? If so, what exactly happens (including exact error messages) when you double-click it?
    What else can you tell us that might help us assist you?
    LN

  • TS1292 How do I redeem my gift card if some of the activation code numbers peeled off and now I cannot read them? Will I be able to use the serial code?

    I got an iTunes gift card for my birthday. Some of the activation code peeled off the card when I removed the sticker, and I am unable to figure out the code. How do I redeem it now? Can I use the serial code instead? If so, where?

    http://support.apple.com/kb/TS1292

  • Just some thoughts

    Just some thoughts.
    Adobe says that piracy is one of the reasons for going to the cloud. CC has to verify the subscription periodically, so why not add that process to the perpetual license for a nominal yearly fee? On the other hand, why can't a user subscribe to CC for a month to get the latest version and then go on the web to get a crack?
    I also own Elements 10 (unless they move that to a subscription too). It has lots of the functions CS6 offers for photo enhancement. I found some plugins that really enhance Elements, e.g. ElementsXXL, which adds new menu items, icons, buttons, keyboard shortcuts and dialogs that integrate seamlessly into the Photoshop Elements user interface. ElementsXXL bridges the gap between Photoshop Elements and Photoshop and greatly enhances the image editing experience in Photoshop Elements.
    I am sure other plugins will appear to make Elements more like Photoshop.

    "Adobe says that piracy is one of the reasons for going to the cloud. CC has to verify the subscription periodically why not add that process to the perpetual license for a nominal yearly fee."
    Why on earth would anyone want to pay a "nominal yearly fee" for another nuisance created by Adobe? They can already shut down perpetual license holders' software if they pull the plug on the activation servers. Don't you know that their servers check our software now as it is?
    Just recently with the release of the CS2 software to previous owners, many people could no longer use their already-installed CS2 software because Adobe's CS2 servers were shut down.  What does that tell you?
    I'm not going to go cruising the internet for cracks. That's for disillusioned kids who think everything should be free. When my CS6 software no longer works I'm going with another company - probably Corel for some of the programs. CorelDRAW was always better than Illustrator anyway. Industry standard doesn't always mean it's great; it just means everyone is using it and they expect you to as well. Personally, I think Macromedia made better software before Adobe took them over, and MM software was a lot cheaper.

  • Email Activity Not sending email when BPEL process fails

    In 11g SOASuite I have a BPEL process which does the following,
    1) Receive a request from the client
    2) Email activity - notify with an ID received in the request as the content
    3) Insert data into a table using the DB adapter
    Whenever the DB adapter hits a key violation or similar issue, I notice that no emails get dispatched. The EM console shows that the transaction has been rolled back after the third step. I understand that, but it makes me ask this question:
    How can I make the process send the email at step 2 irrespective of an error that happens downstream at step 3? How do I make the email activity not part of this transaction scope?
    It works as expected (emails are dispatched) when there are no issues at step 3.

    Hi,
    In order not to roll back your transaction, you need to hit a dehydration point, meaning the process is saved to the DB store and a commit is performed up to that point.
    It sounds like a simple process, so you can drag a checkpoint activity in after the email activity.
    That is one option. Try it and see if it works for you.
    Arik

  • iPad activation not working, please help (urgent)

    iPad activation is not working. I got this message: "This iPad is currently linked to an Apple ID. Sign in with the Apple ID that was used to set up this iPad." I enter my iCloud account and user name on the computer and it's OK, but it is not working for the iPad.

    See if these Apple Support documents help you ...
    https://www.apple.com/support/appleid/
    Find My iPhone Activation Lock: Removing a Device from a Previous Owner's Account
    http://support.apple.com/kb/ts4515
    If not those, then go to Apple Support for some official Apple Support help ...
    Apple Support
    http://www.apple.com/support/
    Apple Retail Store - Genius Bar
    http://www.apple.com/retail/geniusbar/

  • Some Thoughts On An OWB Performance/Testing Framework

    Hi all,
    I've been giving some thought recently to how we could build a performance tuning and testing framework around Oracle Warehouse Builder. Specifically, I'm looking at ways in which we can use some of the performance tuning techniques described in Cary Millsap/Jeff Holt's book "Optimizing Oracle Performance" to profile and performance tune mappings and process flows, and to use some of the ideas put forward in Kent Graziano's Agile Methods in Data Warehousing paper http://www.rmoug.org/td2005pres/graziano.zip and Steven Feuerstein's utPLSQL project http://utplsql.sourceforge.net/ to provide an agile/test-driven way of developing mappings, process flows and modules. The aim of this is to ensure that the mappings we put together are as efficient as possible, work individually and together as expected, and are quick to develop and test.
    At the moment, most people's experience of performance tuning OWB mappings is firstly to see if it runs set-based rather than row-based, then perhaps to extract the main SQL statement and run an explain plan on it, then check to make sure indexes etc. are being used OK. This involves a lot of manual work, doesn't factor in the data available from the wait interface, doesn't store the execution plans anywhere, and doesn't really scale out to encompass entire batches of mappings (process flows).
    For some background reading on Cary Millsap/Jeff Holt's approach to profiling and performance tuning, take a look at http://www.rittman.net/archives/000961.html and http://www.rittman.net/work_stuff/extended_sql_trace_and_tkprof.htm. Basically, this approach traces the SQL that is generated by a batch file (read: mapping) and generates a file that can be later used to replay the SQL commands used, the explain plans that relate to the SQL, details on what wait events occurred during execution, and provides at the end a profile listing that tells you where the majority of your time went during the batch. It's currently the "preferred" way of tuning applications as it focuses all the tuning effort on precisely the issues that are slowing your mappings down, rather than database-wide issues that might not be relevant to your mapping.
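    As a rough illustration of what this looks like in practice (a minimal sketch, not something lifted from the book - the tracefile identifier value and the choice of level 8 are just assumptions for the example), extended SQL trace can be switched on and off around the mapping's session like this:
        -- Tag the trace file so it is easy to find in user_dump_dest afterwards
        ALTER SESSION SET tracefile_identifier = 'owb_mapping_test';
        -- Level 8 captures wait events; level 12 would also capture bind variables
        ALTER SESSION SET EVENTS '10046 trace name context forever, level 8';
        -- ... execute the mapping (or its extracted SQL) here ...
        ALTER SESSION SET EVENTS '10046 trace name context off';
    The resulting trace file can then be summarised with tkprof (e.g. tkprof <tracefile> mapping.prf sort=exeela) or fed into a profiler such as the Hotsos Profiler mentioned in the resources below.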
    For some background information on agile methods, take a look at Kent Graziano's paper, this one on test-driven development http://c2.com/cgi/wiki?TestDrivenDevelopment , this one http://martinfowler.com/articles/evodb.html on agile database development, and the SourceForge project for utPLSQL http://utplsql.sourceforge.net/. What this is all about is having a development methodology that builds in quality but is flexible and responsive to changes in customer requirements. The benefit of using utPLSQL (or any unit testing framework) is that you can automatically check your altered mappings to see that they still return logically correct data, meaning that you can make changes to your data model and mappings whilst still being sure that everything compiles and runs.
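    To make that idea concrete, here is a minimal sketch of the kind of check such a unit test might perform, written as a plain PL/SQL block rather than against any particular utPLSQL version; the table names SRC_ORDERS and TGT_FACT_ORDERS and the LOAD_DATE column are hypothetical stand-ins for a real mapping's source and target:
        -- Reconciliation check: rows loaded today should match the source row count
        DECLARE
          l_src NUMBER;
          l_tgt NUMBER;
        BEGIN
          SELECT COUNT(*) INTO l_src FROM src_orders      WHERE load_date = TRUNC(SYSDATE);
          SELECT COUNT(*) INTO l_tgt FROM tgt_fact_orders WHERE load_date = TRUNC(SYSDATE);
          IF l_src <> l_tgt THEN
            RAISE_APPLICATION_ERROR(-20001,
              'Row count mismatch: source=' || l_src || ', target=' || l_tgt);
          END IF;
        END;
        /
    In a utPLSQL-style framework the same assertion would sit inside a test procedure so it can be re-run automatically whenever the mapping or the data model changes.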
    Observations On The Current State of OWB Performance Tuning & Testing
    At present, when you build OWB mappings, there is no way (within the OWB GUI) to determine how "efficient" the mapping is. Often, when building the mapping against development data, the mapping executes quickly and yet when run against the full dataset, problems then occur. The mapping is built "in isolation" from its effect on the database and there is no handy tool for determining how efficient the SQL is.
    OWB doesn't come with any methodology or testing framework, and so apart from checking that the mapping has run, and that the number of rows inserted/updated/deleted looks correct, there is nothing really to tell you whether there are any "logical" errors. Also, there is no OWB methodology for integration testing, unit testing, or any other sort of testing, and we need to put one in place. Note - OWB does come with auditing, error reporting and so on, but there's no framework for guiding the user through a regime of unit testing, integration testing, system testing and so on, which I would imagine more complete developer GUIs come with. Certainly there's no built-in ability to use testing frameworks such as utPLSQL, or a part of the application that lets you record whether a mapping has been tested and changes the test status of mappings when you make changes to ones that they are dependent on.
    OWB is effectively a code generator, and this code runs against the Oracle database just like any other SQL or PL/SQL code. There is a whole world of information and techniques out there for tuning SQL and PL/SQL, and one particular methodology that we quite like is the Cary Millsap/Jeff Holt "Extended SQL Trace" approach that uses Oracle diagnostic events to find out exactly what went on during the running of a batch of SQL commands. We've been pretty successful using this approach to tune customer applications and batch jobs, and we'd like to use this, together with the "Method R" performance profiling methodology detailed in the book "Optimising Oracle Performance", as a way of tuning our generated mapping code.
    Whilst we want to build performance and quality into our code, we also don't want to overburden developers with an unwieldy development approach, because we know what will happen: after a short amount of time, it won't get used. Given that we want this framework to be used for all mappings, it's got to be easy to use, cause minimal overhead, and have results that are easy to interpret. If at all possible, we'd like to use some of the ideas from agile methodologies such as eXtreme Programming, SCRUM and so on to build in quality but minimise paperwork.
    We also recognise that there are quite a few settings that can be changed at a session and instance level, that can have an effect on the performance of a mapping. Some of these include initialisation parameters that can change the amount of memory assigned to the instance and the amount of memory subsequently assigned to caches, sort areas and the like, preferences that can be set so that indexes are preferred over table scans, and other such "tweaks" to the Oracle instance we're working with. For reference, the version of Oracle we're going to use to both run our code and store our data is Oracle 10g 10.1.0.3 Enterprise Edition, running on Sun Solaris 64-bit.
    Some initial thoughts on how this could be accomplished
    - Put in place some method for automatically / easily generating explain plans for OWB mappings (issue - this is only relevant for mappings that are set-based, and what about pre- and post-mapping triggers?) - see the EXPLAIN PLAN sketch after this list
    - Put in place a method for starting and stopping an event 10046 extended SQL trace for a mapping
    - Put in place a way of detecting whether the explain plan / cost / timing for a mapping changes significantly
    - Put in place a way of tracing a collection of mappings, i.e. a process flow
    - The way of enabling tracing should either be built in by default, or easily added by the OWB developer. Ideally it should be simple to switch it on or off (perhaps levels of event 10046 tracing?)
    - Perhaps store trace results in a repository? reporting? exception reporting?
    - At an instance level, come up with some stock recommendations for instance settings
    - identify the set of instance and session settings that are relevant for ETL jobs, and determine what effect changing them has on the ETL job
    - put in place a regime that records key instance indicators (STATSPACK / ASH) and allows reports to be run / exceptions to be reported
    - Incorporate any existing "performance best practices" for OWB development
    - define a lightweight regime for unit testing (as per agile methodologies) and a way of automating it (utPLSQL?) and of recording the results so we can check the status of dependent mappings easily
    - Other ideas around testing?
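    Referring back to the explain plan item above, the sort of thing I have in mind is nothing more sophisticated than the standard EXPLAIN PLAN / DBMS_XPLAN route, wrapped up so it can be run against the main statement generated for a set-based mapping. A minimal sketch follows; the statement ID and the SELECT shown are hypothetical placeholders for the SQL extracted from the generated mapping package:
        -- Capture and display the plan for the mapping's main statement
        EXPLAIN PLAN SET STATEMENT_ID = 'MAP_SALES_FACT'
          FOR SELECT /* extracted mapping SQL goes here */ * FROM dual;
        SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY('PLAN_TABLE', 'MAP_SALES_FACT'));
    The interesting part is then storing those plans per execution so that significant changes can be spotted, as per the point above about detecting plan/cost/timing changes.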
    Suggested Approach
    - For mapping tracing and generation of explain plans, a pre- and post-mapping trigger that turns extended SQL trace on and off, places the trace file in a predetermined spot, formats the trace file and dumps the output to repository tables (a sketch of the on/off part follows this list).
    - For process flows, something that does the same at the start and end of the process. Issue - how might this conflict with mapping level tracing controls?
    - Within the mapping/process flow tracing repository, store the values of historic executions, have an exception report that tells you when a mapping execution time varies by a certain amount
    - get the standard set of preferred initialisation parameters for a DW, use these as the start point for the stock recommendations. Identify which ones have an effect on an ETL job.
    - identify the standard steps Oracle recommends for getting the best performance out of OWB (workstation RAM etc) - see OWB Performance Tips http://www.rittman.net/archives/001031.html and Optimizing Oracle Warehouse Builder Performance http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - Investigate what additional tuning options and advisers are available with 10g
    - Investigate the effect of system statistics & come up with recommendations.
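    To make the pre-/post-mapping tracing idea above a little more concrete, here is a minimal sketch of the kind of helper the pre- and post-mapping processes could call; the package name TRACE_CONTROL is made up for the example, and the loading of the finished trace file into repository tables is deliberately left out:
        CREATE OR REPLACE PACKAGE trace_control AS
          PROCEDURE start_trace (p_mapping_name IN VARCHAR2);
          PROCEDURE stop_trace;
        END trace_control;
        /
        CREATE OR REPLACE PACKAGE BODY trace_control AS
          PROCEDURE start_trace (p_mapping_name IN VARCHAR2) IS
          BEGIN
            -- Tag the trace file with the mapping name so it can be located later
            EXECUTE IMMEDIATE
              'ALTER SESSION SET tracefile_identifier = ''' || p_mapping_name || '''';
            EXECUTE IMMEDIATE
              'ALTER SESSION SET EVENTS ''10046 trace name context forever, level 8''';
          END start_trace;
          PROCEDURE stop_trace IS
          BEGIN
            EXECUTE IMMEDIATE
              'ALTER SESSION SET EVENTS ''10046 trace name context off''';
          END stop_trace;
        END trace_control;
        /
    The pre-mapping process would call trace_control.start_trace with the mapping name, and the post-mapping process would call trace_control.stop_trace before the trace file is formatted and loaded.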
    Further reading / resources:
    - "Diagnosing Performance Problems Using Extended Trace", Cary Millsap
    http://otn.oracle.com/oramag/oracle/04-jan/o14tech_perf.html
    - "Performance Tuning With STATSPACK" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-sep/index.html?o50tun.html
    - "Performance Tuning with Statspack, Part II" Connie Dialeris and Graham Wood
    http://otn.oracle.com/deploy/performance/pdf/statspack_tuning_otn_new.pdf
    - "Analyzing a Statspack Report: A Guide to the Detail Pages" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-nov/index.html?o60tun_ol.html
    - "Why Isn't Oracle Using My Index?!" Jonathan Lewis
    http://www.dbazine.com/jlewis12.shtml
    - "Performance Tuning Enhancements in Oracle Database 10g" Oracle-Base.com
    http://www.oracle-base.com/articles/10g/PerformanceTuningEnhancements10g.php
    - Introduction to Method R and Hotsos Profiler (Cary Millsap, free reg. required)
    http://www.hotsos.com/downloads/registered/00000029.pdf
    - Exploring the Oracle Database 10g Wait Interface (Robin Schumacher)
    http://otn.oracle.com/pub/articles/schumacher_10gwait.html
    - Article referencing an OWB forum posting
    http://www.rittman.net/archives/001031.html
    - How do I inspect error logs in Warehouse Builder? - OWB Exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case10.pdf
    - What is the fastest way to load data from files? - OWB exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case1.pdf
    - Optimizing Oracle Warehouse Builder Performance - Oracle White Paper
    http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - OWB Advanced ETL topics - including sections on operating modes, partition exchange loading
    http://www.oracle.com/technology/products/warehouse/selfserv_edu/advanced_ETL.html
    - Niall Litchfield's Simple Profiler (a creative commons-licensed trace file profiler, based on Oracle Trace Analyzer, that displays the response time profile through HTMLDB. Perhaps could be used as the basis for the repository/reporting part of the project)
    http://www.niall.litchfield.dial.pipex.com/SimpleProfiler/SimpleProfiler.html
    - Welcome to the utPLSQL Project - a PL/SQL unit testing framework by Steven Feuerstein. Could be useful for automating the process of unit testing mappings.
    http://utplsql.sourceforge.net/
    Relevant postings from the OTN OWB Forum
    - Bulk Insert - Configuration Settings in OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=291269&tstart=30&trange=15
    - Default Performance Parameters
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=213265&message=588419&q=706572666f726d616e6365#588419
    - Performance Improvements
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270350&message=820365&q=706572666f726d616e6365#820365
    - Map Operator performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=238184&message=681817&q=706572666f726d616e6365#681817
    - Performance of mapping with FILTER
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=273221&message=830732&q=706572666f726d616e6365#830732
    - Poor mapping performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=275059&message=838812&q=706572666f726d616e6365#838812
    - Optimizing Mapping Performance With OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=269552&message=815295&q=706572666f726d616e6365#815295
    - Performance of the OWB-Repository
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=66271&message=66271&q=706572666f726d616e6365#66271
    - One large JOIN or many small ones?
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=202784&message=553503&q=706572666f726d616e6365#553503
    - NATIVE PL SQL with OWB9i
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270273&message=818390&q=706572666f726d616e6365#818390
    Next Steps
    Although this is something that I'll be progressing with anyway, I'd appreciate any comments from existing OWB users as to how they currently perform performance tuning and testing. Whilst these are perhaps two distinct subject areas, they can be thought of as the core of an "OWB Best Practices" framework, and I'd be prepared to write the results up as a freely downloadable whitepaper. With this in mind, do you have any existing best practices for tuning or testing, have you tried using SQL trace and TKPROF to profile mappings and process flows, or have you used a unit testing framework such as utPLSQL to automatically test the set of mappings that make up your project?
    Any feedback, add it to this forum posting or send directly through to me at [email protected]. I'll report back on a proposed approach in due course.

    Hi Mark,
    interesting post, but I think you may be focusing on the trees, and losing sight of the forest.
    Coincidentally, I've been giving quite a lot of thought lately to some aspects of your post. They relate to some new stuff I'm doing. Maybe I'll be able to answer in more detail later, but I do have a few preliminary thoughts.
    1. 'How efficient is the generated code' is a perennial topic. There are still some people who believe that a code generator like OWB cannot be in the same league as hand-crafted SQL. I answered that question quite definitely: "We carefully timed execution of full-size runs of both the original code and the OWB versions. Take it from me, the code that OWB generates is every bit as fast as the very best hand-crafted and fully tuned code that an expert programmer can produce."
    The link is http://www.donnapkelly.pwp.blueyonder.co.uk/generated_code.htm
    That said, it still behooves the developer to have a solid understanding of what the generated code will actually do, such as how it will take advantage of indexes, and so on. If not, the developer can create such monstrosities as lookups into an un-indexed field (I've seen that).
    2. The real issue is not how fast any particular generated mapping runs, but whether or not the system as a whole is fit for purpose. Most often, that means: does it fit within its batch update window? My technique is to dump the process flow into Microsoft Project, and then to add the timings for each process. That creates a critical path, and then I can visually inspect it for any bottleneck processes. I usually find that there are no more than one or two dogs. I'll concentrate on those, fix them, and re-do the flow timings. I would add this: the dogs I have seen, I have invariably replaced. They were just garbage; they did not need tuning at all - just scrapping.
    Gee, but this whole thing is minimum effort and real fast! I generally figure that it takes maybe a day or two (max) to soup up system performance to the point where it whizzes.
    Fact is, I don't really care whether there are a lot of sub-optimal processes. All I really care about is the performance of the system as a whole. This technique seems to work for me. 'Course, it depends on architecting the thing properly in the first place. Otherwise, no amount of tuning is going to help worth a darn.
    Conversely (re. my note about replacing dogs) I do not think I have ever tuned a piece of OWB-generated code. Never found a need to. Not once. Not ever.
    That's not to say I do not recognise the value of playing with deployment configuration parameters. Obviously, I set auditing=none and operating mode=set based, and sometimes I play with a couple of different target environments to fool around with partitioning, for example. Nonetheless, if it is not a switch or a knob inside OWB, I do not touch it. This is in line with my diktat that you shall use no other tool than OWB to develop data warehouses. (And that includes all documentation!) (OK, I'll accept MS Project.)
    Finally, you raise the concept of a 'testing framework'. This is a major part of what I am working on at the moment. This is a tough one. Clearly, the developer must unit test each mapping in a design-model-deploy-execute cycle, paying attention to both functionality and performance. When the developer is satisfied, that mapping will be marked as 'done' in the project workbook. Mappings will form part of a stream, executed as a process flow. Each process flow will usually terminate in a dimension, a fact, or an aggregate. Each process flow will be tested as an integrated whole. There will be test strategies devised, and test cases constructed. There will finally be system tests, to verify the validity of the system as a production-grade whole (stuff like recovery/restart, late-arriving data, and so on).
    For me, I use EDM (TM). That's the methodology I created (and trademarked) twenty years ago: Evolutionary Development Methodology (TM). This is a spiral methodology based around prototyping cycles within Stage cycles within Release cycles. For OWB, a Stage would consist (say) of a dimensional update. What I am trying to do now is to graft this onto a traditional waterfall methodology, and I am having the same difficulties I had when I tried to do it back then.
    All suggestions on how to do that grafting gratefully received!
    To sum up, I'm kinda at a loss as to why you want to go deep into OWB-generated code performance stuff. Jeepers, architect the thing right, and the code runs fast enough for anyone. I've worked on ultra-large OWB systems, including validating the largest data warehouse in the UK. I've never found any value in 'tuning' the code. What I'd like you to comment on is this: what will it buy you?
    Cheers,
    Donna
    http://www.donnapkelly.pwp.blueyonder.co.uk

  • Change logs in CRMD_ORDER: showing changes to the activity notes area

    Hi
    Does anyone know if there is a way to configure the change documents function in transaction CRMD_ORDER, when you have a business activity open, so that it shows the changes made to the activity notes section? This change documents function is accessed via the top menu when you have an activity open: Extras -> Change Documents.
    We would like to, if at all possible, show changes that have happened in the notes/text area of this transaction as well as the other areas.
    Many thanks
    Cara

    Hi Masayuki
    Thanks for your reply, this note looks like it will work very well for what we want to achieve. However, I can't quite figure out what I need to do to get this to work. Do you understand this functionality enough to give me some guidance at all?
    I have found everything I need to change in Tx COMC_TEXT, with text obj = CRM_ORDERH, Text Det Proc = ACT0001, Text type = A002 Note. For the Text type, the settings currently maintained are Sequence = 0001, Changes = edit, transfer type = not yet identified.
    The SAP note specifies that Changes needs to equal both P and R (and I don't understand how to make both available), and I am not sure what the transfer type setting should be. I have tried to change this in our dev system, but it doesn't seem to flow through to the Business Activities transaction. Are you able to assist?
    Many thanks
    Regards
    Cara

  • Some HTML Emails Do Not Display Properly

    Recently, some HTML emails do not display properly in Mail. All I see is HTML. Yet, 98% of my HTML emails render properly.
    Any thoughts on why this happens with some emails? Is it a Mail issue or are these emails not coded properly?

    Hello Tony.
    Could be a combination, but since Mail renders 98% of all HTML messages properly (using Safari as the engine), this indicates it is a coding problem.
    I'm not a big fan of using HTML with email since it introduces a number of problems. If everyone used plain text for message composition, the majority, if not all, of the problems experienced with email would be eliminated.

  • Managed system configuration: Some required roles have not been granted

    Hi experts,
    I face a strange problem in my SolMan 7.1 SPS 4:
    In the configuration of managed system everything is green except the last step "Check configuration" => "Configuration Check".
    The error is:
    Some required roles have not been granted to user SAPSUPPORT  (ID: MANAGED.DUAL.SAPSUPPORT)
    Action: Execute step Managed System Configuration / Create Users
    Check Context: Managed systems users/roles | Managed users/roles | SYSTEM =LSM~ABAP
    and the same for SMDAGENT_xyz (ID: MANAGED.ABAP.WILYAGT).
    The funny thing is that step no. 7 (Create Users) completes successfully. I have also deleted SAPSUPPORT manually and re-executed step 7, but with no success.
    Any ideas? Thanks and best regards, Basti

    Hello,
    Just a thought... it could be due to a UME role that is not assigned to the user. Have you checked the security guide? There are a couple of steps to do in Visual Admin for this user, as per the guide.
    Cheers,
    Diego.

  • iPhoto 6 likes some photos but not others

    Hi All,
    I've a real problem with iPhoto and thought some of you gurus could maybe help. I've around 3500 photos in my library. I populated my library by importing photos from my folders on my Windows PC (networked to my Mac) and more recently directly from my camera (all done as I'm supposed to - through iPhoto). The photos are all there, no problems opening iPhoto etc. - the application is swift and neat. Now here's the problem.
    I have created a number of albums and have organised my photos (or am trying to) by album. I'm dragging the photos from the main library into the appropriate album; however, some photos do not like being dragged and dropped at all - they eventually do (after, say, 5 minutes of spinning beach ball) place a copy (or a reference to the main original file) in the album. There may be some reason behind this, but from my perspective it appears random - there's no telling which photos iPhoto will and will not like. This reduces my productivity heaps - what should be a very slick and easy task turns into a hair-pulling, eye-poppingly frustrating one.
    This of course also mucks up all the other iLife applications, e.g. iDVD - I'm using the albums to create slideshows. Dragging and dropping the albums into a slideshow created in iDVD - easy? Nope, it will not import some photos. The same goes for iWeb.
    There should be no problems with the imported photos, as I kept all photos on my old PC organised in a similar way to how the Mac does it anyway, i.e. by month and then by original and by edited.
    I've also tried cleaning the thumbnails and the entire library (marginal performance gain achieved) and running OnyX to clean permissions etc. I've taken out RAW files and movies taken on cameras to see if that would work... nope. Running out of ideas now...
    MacBook Pro   Mac OS X (10.4.7)  

    do you know if it is possible to open multiple instances/windows of iPhoto
    As Terence stated, no. The best you can do is to use iPLM or iPhoto Buddy to quickly jump between libraries. Or use iTLM to merge the one library into the other and keep them all in one. In any case, take a look at the tip at the end of my signature. It may help protect you against damaged database files, which has been an all too common problem here.
    Do you Twango?
    TIP: For insurance against the iPhoto database corruption that many users have experienced, I recommend making a backup copy of the Library6.iPhoto database file and keeping it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean backing up after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That ensures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.

  • BAdi: Meaning of "active not switchable through custom" ?

    Looking at a BAdI implementation on the tab named "Enh. Implementation Elements" in SE18, there is a checkbox labelled "active not switchable through custom". What exactly does this mean? And if I uncheck it, where in the IMG might I be able to customize BAdI activity?
    I have observed that a BAdI implementation was active in one client (the one in which it was developed) and inactive in another one (the test client) when this box was unchecked. So it doesn't seem to have anything to do with having an active version of the code.
    I'd also appreciate a pointer to documentation. SAP help wasn't much use to me.
    -- Sebastian

    Hi Bill,
    If I may ask, is the column already referenced correctly?
    http://social.msdn.microsoft.com/Forums/sharepoint/en-US/1084935b-0367-4814-b4fc-390670806bd3/content-query-custom-fields-not-showing-up-even-after-following-all-the-steps-all-steps-listed?forum=sharepointdevelopmentprevious
    http://msdn.microsoft.com/en-us/library/ms497457%28v=office.14%29.aspx
    If possible, please try creating some calculated columns that reference the InfoPath content type columns and use the calculated ones.
    You may also try this CodePlex project, for testing:
    http://enhancedcqwp.codeplex.com/
    Regards,
    Aries
    Microsoft Online Community Support
    Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.

  • TechNet Wiki Activity not Showing in our Profiles

    This has been reported in the "MSDN and TechNet Profile and Recognition System Discussions" forum, but I thought it would be appropriate to also report the issue here:
    No Wiki activity has been reflected in our profiles for several days, perhaps since September 30. Also, our statistics (number of new articles, edits, and comments) are not changing. And the leaderboards (most points and most activities) seem not to reflect anything in the last 3 days. We still see leaders, but if you compare, everyone's activities and points are steadily declining as the 7-day window includes more days with no activity.
    I hope the metadata still reflects our activity, so our activity, statistics, and achievements can be brought up to date later.
    Richard Mueller - MVP Directory Services

    I concur with everybody's thoughts here - it's not getting reflected. Richard, I guess points are also not getting updated.
    Please mark this reply as answer if it solved your issue or vote as helpful if it helped so that other forum members can benefit from it
    My Technet Wiki Article
    MVP

  • Help on LiveCycle/DSN... an API or some type of ActiveX solution to help make the connection

    I have a solution that I'm trying to come up with.
    LiveCycle is a relatively handy tool for automating input from ODBC data sources for download to forms.
    What we would like to do is automate the process of setting up the DSN used by LiveCycle, and automate the process of making the connection to that DSN to populate the fields in the forms. Automating the DSN setup is done. But what I'm looking for is an API or some type of ActiveX solution to help with making the connection, populating the forms using LiveCycle and printing the PDF output file. Can you help me with this?
    Currently I'm using LiveCycle Designer 8.0 as part of the Adobe Acrobat suite. Can I use this, or do I need to look at a different software package?
    Thank you for any assistance you can provide.

    Hiho Ashley,
    I'm sorry that I can't answer your question about the ActiveX stuff, and I truly doubt that you will look in this thread again. But I'm interested in your solution for the automated setup of a DSN. Is it within the PDF file, so that not every user who wants the PDF to be filled in has to set up an extra connection? And does it work if the user has no permission to create a DSN on their own?
    Hope someone can reply.
    Greetings

  • Nokia X/X2 -- Some thoughts and questions

    Moved from Symbian Belle, some thoughts:
    1. The email widget doesn't automatically refresh. I have to manually tap to load new mail despite specifying a sync interval.
    2. It asks me to sign in to a Microsoft, Exchange, Nokia or Google account etc. to use the calendar. Can I not get a local one which I can choose to sync if desired, similar to Symbian?
    3. Does the X2 have a play-via-radio capability? I don't see the app, but I wanted to know if the hardware capability is there.
    4. How do I know whether the phone is using GPS, GLONASS or BeiDou? The specifications listed these positioning capabilities.
    5. I don't see the MixRadio app. Possibly it's not available in certain countries.
    6. Belle had an option to automatically switch between GSM and 3G where available. The X2 only allows a 3G-only setting and GSM/CDMA.
    7. The People and Phone apps need to be bundled together. While scrolling through contacts via the Phone app, if you accidentally tap one, it starts dialling. Plus it doesn't group entries together and repeats contacts with more than one number.
    8. I didn't really like the Metro and Fastlane UIs. One reason was that the wallpaper was not changeable, so if you have gaps, all you see is black.
    9. The Glance screen doesn't seem customizable. The time appears on 2 lines, which seems a bit odd.
    Reply to 9:
    Glance Screen on X v2 is almost amazing. http://developer.nokia.com/resources/library/nokia-x-ui/essentials.html
    A past Nokia X user, now an X user.
