Just some thoughts

Adobe says that piracy is one of the reasons for going to the cloud.  CC has to verify the subscription periodically, so why not add that process to the perpetual license for a nominal yearly fee?  On the other hand, why couldn't a user subscribe to CC for a month to get the latest version and then go on the web to get a crack?
I also own Elements 10 (at least until they move that to a subscription as well).  It has lots of the functions CS6 has for photo enhancement, and I found some plugins that really enhance Elements.  For example, ElementsXXL adds new menu items, icons, buttons, key shortcuts and dialogs that integrate seamlessly into the user interface of Photoshop Elements.  ElementsXXL bridges the gap between Photoshop Elements and Photoshop and greatly enhances the image editing experience in Elements.
I am sure other plugins will appear to make Elements more like Photoshop.

"Adobe says that piracy is one of the reasons for going to the cloud. CC has to verify the subscription periodically, so why not add that process to the perpetual license for a nominal yearly fee?"
Why on earth would anyone want to pay a "nominal yearly fee" for another nuisance created by Adobe?  They can already shut down perpetual license holders' software if they pull the plug on the activation servers.  Don't you know that their servers check our software now as it is?
Just recently with the release of the CS2 software to previous owners, many people could no longer use their already-installed CS2 software because Adobe's CS2 servers were shut down.  What does that tell you?
I'm not going the way of cruising the internet for cracks.  That's for disillusioned kids who think everything should be free.  When my CS6 software no longer works I'm going with another company.  Probably Corel for some of the programs.  CorelDRAW was always better than Illustrator anyway.  Industry standard doesn't always mean it's great.  It just means everyone is using it and they expect you to do as well.  Personally, I think Macromedia made better software before Adobe took them over and MM software was a lot cheaper.

Similar Messages

  • Just some thoughts for Win Users.

    I am not an expert, just a regular forum browser.
    Is your system motherboard designed for Win XP? If yes,
    then that is what you have got: a max of 3GB of usable RAM. Your system board may accept two 2GB sticks, but the system will only use a max of 3GB. Do not overclock your processor or activate the /3GB option; it will not help you in PS or LR and may only cause problems.
    You only get what you pay for, no more, no less.
    Win Vista: same advice, use what the system is designed for.
    Win 7: make sure your motherboard is designed for the system and go for the full install, not the upgrade. If not, be prepared to troubleshoot to get the performance optimized.
    Do not expect Lightroom to sort out your OS problems.

    Lots of good replies here; this is typically an OS issue, not an architecture issue. As long as the motherboard and CPU combo supports the (typically -- few of us run the Intel 64-bit server architecture) "amd64" architecture the 64-bit versions of the OS should work fine to see a "flat" (well, flat enough for Windows, which is rather lumpy in its approach to memory) 64-bit memory space.
    With two caveats:
    1. The OS still determines how much memory you see, and Microsoft has tweaked different versions and releases to see varying amounts of physical memory. Some see this as a defect.  See here: http://msdn.microsoft.com/en-us/library/aa366778(VS.85).aspx
    2. Not all 64-bit operating systems are "64-bit clean." As far as I know, Windows 7 is the only 64-bit clean OS Microsoft has released. That is, the various pieces that make up the OS (kernel, services, drivers, etc.) are all 64-bit, with some of those pieces designed to handle 32-bit addressing if they have to.
    (1) may affect user applications like Lr. (2) will almost certainly not, but it will affect overall system performance.
    Lr does not require 64-bit to work (I assume it will in a future release -- the writing is on the wall) but the right architecture will allow you to run applications with larger address spaces, which means more resources and more efficient use of those resources.
    Ironically, some folks will note that 64-bit has meant some things have gotten much larger. The process stack, threads and other internals have grown much larger with 64-bit platforms. This can sometimes be seen as a larger application footprint and even, in some cases, slower access to some system resources for 64-bit applications. These are extreme cases, however, and the best thing to do is run modest hardware that puts the emphasis on generally efficient processor and memory architecture and an OS that can take advantage of the physical memory you provide.
    The point of the OP is well taken, however. Your "64-bit" OS as installed on your hardware may not, in fact, be able to see anything more than 2-4Gb of your physical memory. Older releases and OEM installs are infamous for setting things up so that all you end up with is a moderately more expensive and slow 32-bit platform.
    Buy a real copy of the OS and install it yourself with none of the taskbar garbage OEMs love to give you.  Make sure you can disable real-time anti-virus protection, or remove it altogether (practice safe computing and you won't need it).  If you purchased from an OEM like Dell, demand your OS media (they are supposed to sell you installation media with pre-installed systems) and install it yourself (or get your local Windows enthusiast to do it for you).  Take regular backups -- actually, take two regular backups; hard drive "bathtub curves" are getting shorter by the year, while the cost per GB shrinks monthly.
    Lr is a demanding app, but nowhere near as demanding as many. It will use as much physical memory as the OS can offer under the right circumstances, but the OS can only ration so much to various processes until something gives. Just because Lr is lagging on your system does not make it the problem! An OS is an ecosystem, and the kernel does its best to ration out resources to running processes.  Hungry foreground processes will just suffer more obviously from strained resources.
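As a quick sanity check for the "your 64-bit OS may still leave you with a 32-bit platform" point above, here is a minimal Python sketch that reports the width of the running process. A 64-bit OS will happily run 32-bit applications, and it is the process width, not the OS, that caps a single application's address space:

```python
import struct


def process_bits():
    """Pointer width of the current process, in bits.

    A 64-bit OS can still run 32-bit applications; this reports only
    what the running process can address, which is the limit that
    matters for a single app such as Lr.
    """
    return struct.calcsize("P") * 8


print(f"This process is {process_bits()}-bit")
```

Run it under the interpreter (or application runtime) you care about; a result of 32 on a "64-bit" machine is exactly the moderately-more-expensive-32-bit-platform situation described above.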

  • Some Thoughts On An OWB Performance/Testing Framework

    Hi all,
    I've been giving some thought recently to how we could build a performance tuning and testing framework around Oracle Warehouse Builder. Specifically, I'm looking at ways in which we can use some of the performance tuning techniques described in Cary Millsap/Jeff Holt's book "Optimizing Oracle Performance" to profile and performance-tune mappings and process flows, and to use some of the ideas put forward in Kent Graziano's Agile Methods in Data Warehousing paper http://www.rmoug.org/td2005pres/graziano.zip and Steven Feuerstein's utPLSQL project http://utplsql.sourceforge.net/ to provide an agile/test-driven way of developing mappings, process flows and modules. The aim of this is to ensure that the mappings we put together are as efficient as possible, work individually and together as expected, and are quick to develop and test.
    At the moment, most people's experience of performance tuning OWB mappings is firstly to see if it runs set-based rather than row-based, then perhaps to extract the main SQL statement and run an explain plan on it, then check to make sure indexes etc are being used ok. This involves a lot of manual work, doesn't factor in the data available from the wait interface, doesn't store the execution plans anywhere, and doesn't really scale out to encompass entire batches of mappings (process flows).
    For some background reading on Cary Millsap/Jeff Holt's approach to profiling and performance tuning, take a look at http://www.rittman.net/archives/000961.html and http://www.rittman.net/work_stuff/extended_sql_trace_and_tkprof.htm. Basically, this approach traces the SQL that is generated by a batch file (read: mapping) and generates a file that can be later used to replay the SQL commands used, the explain plans that relate to the SQL, details on what wait events occurred during execution, and provides at the end a profile listing that tells you where the majority of your time went during the batch. It's currently the "preferred" way of tuning applications as it focuses all the tuning effort on precisely the issues that are slowing your mappings down, rather than database-wide issues that might not be relevant to your mapping.
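To make the "switching tracing on" step concrete, here is a sketch of the statements that would bracket a mapping run with event 10046 extended SQL trace. The identifier name is made up, and generating the statements in Python is just a placeholder for whatever pre-/post-mapping mechanism eventually fires them (level 4 adds bind variables, level 8 adds wait events, level 12 adds both):

```python
def trace_statements(level=8, identifier="owb_mapping"):
    """Build the ALTER SESSION statements that bracket a mapping run
    with event 10046 extended SQL trace.

    The tracefile_identifier just makes the resulting trace file easy
    to spot in user_dump_dest afterwards.  Returns (start, stop) lists
    of SQL statements.
    """
    start = [
        f"ALTER SESSION SET tracefile_identifier = '{identifier}'",
        ("ALTER SESSION SET EVENTS "
         f"'10046 trace name context forever, level {level}'"),
    ]
    stop = ["ALTER SESSION SET EVENTS '10046 trace name context off'"]
    return start, stop
```

The resulting trace file would then be fed to TKPROF (or a profiler such as Hotsos Profiler) to produce the response-time profile described above.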
    For some background information on agile methods, take a look at Kent Graziano's paper, this one on test-driven development http://c2.com/cgi/wiki?TestDrivenDevelopment , this one http://martinfowler.com/articles/evodb.html on agile database development, and the sourceforge project for utPLSQL http://utplsql.sourceforge.net/. What this is all about is having a development methodology that builds in quality but is flexible and responsive to changes in customer requirements. The benefit of using utPLSQL (or any unit testing framework) is that you can automatically check your altered mappings to see that they still return logically correct data, meaning that you can make changes to your data model and mappings whilst still being sure that it'll still compile and run.
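The essence of such a unit test is small: after a mapping runs, assert that the target contains what the source predicts. A sketch in Python (utPLSQL does the equivalent in PL/SQL; `run_query` here is a hypothetical stand-in for whatever executes SQL and returns a scalar, and the table names are made up):

```python
def check_mapping(run_query, source_sql, target_sql):
    """Check that a mapping produced logically correct output by
    comparing a scalar computed on the source (expected) with the same
    scalar computed on the target (actual) -- typically a row count or
    a checksum over a key column.
    """
    expected = run_query(source_sql)
    actual = run_query(target_sql)
    assert expected == actual, f"expected {expected}, got {actual}"
    return True
```

A suite of such checks, re-run after every change, is what lets you alter the data model and mappings while staying sure they still compile, run, and return the right data.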
    Observations On The Current State of OWB Performance Tuning & Testing
    At present, when you build OWB mappings, there is no way (within the OWB GUI) to determine how "efficient" the mapping is. Often, when building the mapping against development data, the mapping executes quickly and yet when run against the full dataset, problems then occur. The mapping is built "in isolation" from its effect on the database and there is no handy tool for determining how efficient the SQL is.
    OWB doesn't come with any methodology or testing framework, and so apart from checking that the mapping has run, and that the number of rows inserted/updated/deleted looks correct, there is nothing really to tell you whether there are any "logical" errors. Also, there is no OWB methodology for integration testing, unit testing, or any other sort of testing, and we need to put one in place. Note - OWB does come with auditing, error reporting and so on, but there's no framework for guiding the user through a regime of unit testing, integration testing, system testing and so on, which I would imagine more complete developer GUIs come with. Certainly there's no built-in ability to use testing frameworks such as utPLSQL, or a part of the application that lets you record whether a mapping has been tested, and changes the test status of mappings when you make changes to ones that they are dependent on.
    OWB is effectively a code generator, and this code runs against the Oracle database just like any other SQL or PL/SQL code. There is a whole world of information and techniques out there for tuning SQL and PL/SQL, and one particular methodology that we quite like is the Cary Millsap/Jeff Holt "Extended SQL Trace" approach that uses Oracle diagnostic events to find out exactly what went on during the running of a batch of SQL commands. We've been pretty successful using this approach to tune customer applications and batch jobs, and we'd like to use this, together with the "Method R" performance profiling methodology detailed in the book "Optimising Oracle Performance", as a way of tuning our generated mapping code.
    Whilst we want to build performance and quality into our code, we also don't want to overburden developers with an unwieldy development approach, because we know what will happen: after a short amount of time, it won't get used. Given that we want this framework to be used for all mappings, it's got to be easy to use, cause minimal overhead, and produce results that are easy to interpret. If at all possible, we'd like to use some of the ideas from agile methodologies such as eXtreme Programming, SCRUM and so on to build in quality but minimise paperwork.
    We also recognise that there are quite a few settings that can be changed at a session and instance level, that can have an effect on the performance of a mapping. Some of these include initialisation parameters that can change the amount of memory assigned to the instance and the amount of memory subsequently assigned to caches, sort areas and the like, preferences that can be set so that indexes are preferred over table scans, and other such "tweaks" to the Oracle instance we're working with. For reference, the version of Oracle we're going to use to both run our code and store our data is Oracle 10g 10.1.0.3 Enterprise Edition, running on Sun Solaris 64-bit.
    Some initial thoughts on how this could be accomplished
    - Put in place some method for automatically / easily generating explain plans for OWB mappings (issue: this is only relevant for mappings that are set-based, and what about pre- and post-mapping triggers?)
    - Put in place a method for starting and stopping an event 10046 extended SQL trace for a mapping
    - Put in place a way of detecting whether the explain plan / cost / timing for a mapping changes significantly
    - Put in place a way of tracing a collection of mappings, i.e. a process flow
    - The way of enabling tracing should either be built in by default, or easily added by the OWB developer. Ideally it should be simple to switch it on or off (perhaps levels of event 10046 tracing?)
    - Perhaps store trace results in a repository? reporting? exception reporting?
    - At an instance level, come up with some stock recommendations for instance settings
    - Identify the set of instance and session settings that are relevant for ETL jobs, and determine what effect changing them has on the ETL job
    - Put in place a regime that records key instance indicators (STATSPACK / ASH) and allows reports to be run / exceptions to be reported
    - Incorporate any existing "performance best practices" for OWB development
    - Define a lightweight regime for unit testing (as per agile methodologies), a way of automating it (utPLSQL?), and a way of recording the results so we can check the status of dependent mappings easily
    - Other ideas around testing?
    Suggested Approach
    - For mapping tracing and generation of explain plans, a pre- and post-mapping trigger that turns extended SQL trace on and off, places the trace file in a predetermined spot, formats the trace file and dumps the output to repository tables.
    - For process flows, something that does the same at the start and end of the process. Issue - how might this conflict with mapping level tracing controls?
    - Within the mapping/process flow tracing repository, store the values of historic executions, have an exception report that tells you when a mapping execution time varies by a certain amount
    - Get the standard set of preferred initialisation parameters for a DW and use these as the starting point for the stock recommendations. Identify which ones have an effect on an ETL job.
    - Identify the standard steps Oracle recommends for getting the best performance out of OWB (workstation RAM etc.) - see OWB Performance Tips http://www.rittman.net/archives/001031.html and Optimizing Oracle Warehouse Builder Performance http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - Investigate what additional tuning options and advisers are available with 10g
    - Investigate the effect of system statistics & come up with recommendations.
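For the exception report on mapping execution times mentioned above, the check itself could be as simple as this sketch against the historic executions stored in the repository (the two-standard-deviation threshold is an assumption for illustration, not a recommendation):

```python
from statistics import mean, stdev


def is_exception(history, latest, threshold=2.0):
    """Flag a mapping execution whose elapsed time differs from the
    historical mean by more than `threshold` standard deviations.

    `history` is a list of past elapsed times (seconds) for this
    mapping; `latest` is the most recent execution's elapsed time.
    """
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold
```

Anything flagged this way would appear on the exception report for a closer look with the trace/profile data.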
    Further reading / resources:
    - "Diagnosing Performance Problems Using Extended Trace", Cary Millsap
    http://otn.oracle.com/oramag/oracle/04-jan/o14tech_perf.html
    - "Performance Tuning With STATSPACK" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-sep/index.html?o50tun.html
    - "Performance Tuning with Statspack, Part II" Connie Dialeris and Graham Wood
    http://otn.oracle.com/deploy/performance/pdf/statspack_tuning_otn_new.pdf
    - "Analyzing a Statspack Report: A Guide to the Detail Pages" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-nov/index.html?o60tun_ol.html
    - "Why Isn't Oracle Using My Index?!" Jonathan Lewis
    http://www.dbazine.com/jlewis12.shtml
    - "Performance Tuning Enhancements in Oracle Database 10g" Oracle-Base.com
    http://www.oracle-base.com/articles/10g/PerformanceTuningEnhancements10g.php
    - Introduction to Method R and Hotsos Profiler (Cary Millsap, free reg. required)
    http://www.hotsos.com/downloads/registered/00000029.pdf
    - Exploring the Oracle Database 10g Wait Interface (Robin Schumacher)
    http://otn.oracle.com/pub/articles/schumacher_10gwait.html
    - Article referencing an OWB forum posting
    http://www.rittman.net/archives/001031.html
    - How do I inspect error logs in Warehouse Builder? - OWB Exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case10.pdf
    - What is the fastest way to load data from files? - OWB exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case1.pdf
    - Optimizing Oracle Warehouse Builder Performance - Oracle White Paper
    http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - OWB Advanced ETL topics - including sections on operating modes, partition exchange loading
    http://www.oracle.com/technology/products/warehouse/selfserv_edu/advanced_ETL.html
    - Niall Litchfield's Simple Profiler (a creative commons-licensed trace file profiler, based on Oracle Trace Analyzer, that displays the response time profile through HTMLDB. Perhaps could be used as the basis for the repository/reporting part of the project)
    http://www.niall.litchfield.dial.pipex.com/SimpleProfiler/SimpleProfiler.html
    - Welcome to the utPLSQL Project - a PL/SQL unit testing framework by Steven Feuerstein. Could be useful for automating the process of unit testing mappings.
    http://utplsql.sourceforge.net/
    Relevant postings from the OTN OWB Forum
    - Bulk Insert - Configuration Settings in OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=291269&tstart=30&trange=15
    - Default Performance Parameters
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=213265&message=588419&q=706572666f726d616e6365#588419
    - Performance Improvements
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270350&message=820365&q=706572666f726d616e6365#820365
    - Map Operator performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=238184&message=681817&q=706572666f726d616e6365#681817
    - Performance of mapping with FILTER
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=273221&message=830732&q=706572666f726d616e6365#830732
    - Poor mapping performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=275059&message=838812&q=706572666f726d616e6365#838812
    - Optimizing Mapping Performance With OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=269552&message=815295&q=706572666f726d616e6365#815295
    - Performance of the OWB-Repository
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=66271&message=66271&q=706572666f726d616e6365#66271
    - One large JOIN or many small ones?
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=202784&message=553503&q=706572666f726d616e6365#553503
    - NATIVE PL SQL with OWB9i
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270273&message=818390&q=706572666f726d616e6365#818390
    Next Steps
    Although this is something that I'll be progressing with anyway, I'd appreciate any comment from existing OWB users as to how they currently perform performance tuning and testing. Whilst these are perhaps two distinct subject areas, they can be thought of as the core of an "OWB Best Practices" framework, and I'd be prepared to write the results up as a freely downloadable whitepaper. With this in mind, does anyone have existing best practices for tuning or testing? Have you tried using SQL trace and TKPROF to profile mappings and process flows, or have you used a unit testing framework such as utPLSQL to automatically test the set of mappings that make up your project?
    Any feedback, add it to this forum posting or send directly through to me at [email protected]. I'll report back on a proposed approach in due course.

    Hi Mark,
    interesting post, but I think you may be focusing on the trees, and losing sight of the forest.
    Coincidentally, I've been giving quite a lot of thought lately to some aspects of your post. They relate to some new stuff I'm doing. Maybe I'll be able to answer in more detail later, but I do have a few preliminary thoughts.
    1. 'How efficient is the generated code' is a perennial topic. There are still some people who believe that a code generator like OWB cannot be in the same league as hand-crafted SQL. I answered that question quite definitively: "We carefully timed execution of full-size runs of both the original code and the OWB versions. Take it from me, the code that OWB generates is every bit as fast as the very best hand-crafted and fully tuned code that an expert programmer can produce."
    The link is http://www.donnapkelly.pwp.blueyonder.co.uk/generated_code.htm
    That said, it still behooves the developer to have a solid understanding of what the generated code will actually do, such as how it will take advantage of indexes, and so on. If not, the developer can create such monstrosities as lookups into an un-indexed field (I've seen that).
    2. The real issue is not how fast any particular generated mapping runs, but whether or not the system as a whole is fit for purpose. Most often, that means: does it fit within its batch update window? My technique is to dump the process flow into Microsoft Project, and then to add the timings for each process. That creates a Critical Path, and then I can visually inspect it for any bottleneck processes. I usually find that there are not more than one or two dogs. I'll concentrate on those, fix them, and re-do the flow timings. I would add this: the dogs I have seen, I have invariably replaced. They were just garbage; they did not need tuning at all - just scrapping.
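The critical-path idea described above can be sketched mechanically: given each process's duration and its dependencies, the longest chain through the flow is the one worth inspecting first. A minimal Python illustration (the process names are made up, and this assumes the flow is a DAG, as a process flow should be):

```python
def critical_path(durations, deps):
    """Longest dependency chain through a process flow.

    durations: process name -> elapsed time
    deps:      process name -> list of prerequisite process names
    Returns (total_time, path) for the critical path.
    """
    memo = {}

    def longest(node):
        # Longest chain ending at `node`, memoised for efficiency.
        if node not in memo:
            best = max((longest(d) for d in deps.get(node, [])),
                       key=lambda p: p[0], default=(0, []))
            memo[node] = (best[0] + durations[node], best[1] + [node])
        return memo[node]

    return max((longest(n) for n in durations), key=lambda p: p[0])
```

The processes on that path are the candidates for fixing (or, as above, replacing); everything off the path can be sub-optimal without affecting the batch window.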
    Gee, but this whole thing is minimum effort and real fast! I generally figure that it takes maybe a day or two (max) to soup up system performance to the point where it whizzes.
    Fact is, I don't really care whether there are a lot of sub-optimal processes. All I really care about is performance of the system as a whole. This technique seems to work for me. 'Course, it depends on architecting the thing properly in the first place. Otherwise, no amount of tuning is going to help worth a darn.
    Conversely (re. my note about replacing dogs) I do not think I have ever tuned a piece of OWB-generated code. Never found a need to. Not once. Not ever.
    That's not to say I do not recognise the value of playing with deployment configuration parameters. Obviously, I set auditing=none, and operating mode=set based, and sometimes, I play with a couple of different target environments to fool around with partitioning, for example. Nonetheless, if it is not a switch or a knob inside OWB, I do not touch it. This is in line with my dictat that you shall use no other tool than OWB to develop data warehouses. (And that includes all documentation!). (OK, I'll accept MS Project)
    Finally, you raise the concept of a 'testing framework'. This is a major part of what I am working on at the moment. This is a tough one. Clearly, the developer must unit test each mapping in a design-model-deploy-execute cycle, paying attention to both functionality and performance. When the developer is satisfied, that mapping will be marked as 'done' in the project workbook. Mappings will form part of a stream, executed as a process flow. Each process flow will usually terminate in a dimension, a fact, or an aggregate. Each process flow will be tested as an integrated whole. There will be test strategies devised, and test cases constructed. There will finally be system tests, to verify the validity of the system as a production-grade whole. (stuff like recovery/restart, late-arriving data, and so on)
    For me, I use EDM (TM). That's the methodology I created (and trademarked) twenty years ago: Evolutionary Development Methodology (TM). This is a spiral methodology based around prototyping cycles within Stage cycles within Release cycles. For OWB, a Stage would consist (say) of a Dimensional update. What I am trying to do now is to graft this within a traditional waterfall methodology, and I am having the same difficulties I had when I tried to do it then.
    All suggestions on how to do that grafting gratefully received!
    To sum up, I'm kinda at a loss as to why you want to go deep into OWB-generated code performance stuff. Jeepers, architect the thing right, and the code runs fast enough for anyone. I've worked on ultra-large OWB systems, including validating the largest data warehouse in the UK. I've never found any value in 'tuning' the code. What I'd like you to comment on is this: what will it buy you?
    Cheers,
    Donna
    http://www.donnapkelly.pwp.blueyonder.co.uk

  • Just some child in a webform?

    Hi,
    Can somebody please explain to me how to display in a webform just some children (not all children) of a member of a dimension?
    Example:
    Dimension: Account
    Member:
    Generation1: Product
    Generation2 : Sales
    Generation3 :
    - Quantity
    - price
    - 711111 (Revenue)
    I want to display in my webform just Sales, Quantity and Price. How can I do that?
    PS: I tried IChildren(Sales) but it didn't work
    Thank you

    Can I assume you have thought about using the Descendants function with missing blocks and data suppressed? Otherwise, the only option that comes to my mind is to manually select members, but I know it is not ideal.
    Also, I would post this on the Planning and Budgeting forum; you may get more responses.
    Cheers,
    Mehmet

  • Discussions: Some Thoughts From Abroad

    Hi all,
    I would like to add some thoughts on the points debate. I think we all recognise that the majority of users of Apple Discussions are in the USA; I don't have a problem with that, it's a fact of life. However, the new points system does have a bias towards those replying from the same country as the OP, due to time zone differences.
    I don't ever see the day when I can imbibe in the Lounge with the 4s and 5s. By the time I get to my computer each day, all the questions I am able to answer have already been answered unless there are a few insomniacs around, or I become one myself!
    I liked the old system of awarding points. Not just because I was able to get a few myself, but mainly because I was able to give recognition to someone who had helped me expand my knowledge of Macs, even when I was not the OP or after the question had been answered. I realise that this system was open to abuse, but only by a few surely. Limiting the number of votes one could award each day according to your Level would address that.
    In closing I would just like to say a big thankyou to all who keep these Discussions going. Your work is appreciated.
    Cheers
    NikkiG

    Hi NikkiG,
    "I liked the old system of awarding points. Not just because I was able to get a few myself, but mainly because I was able to give recognition to someone who had helped me expand my knowledge of Macs, even when I was not the OP or after the question had been answered. I realise that this system was open to abuse, but only by a few surely. Limiting the amount of votes one could award each day according to your Level would address that."
    And this possibility of awarding others was shaping a very different spirit:
    A lot of tips could be found among "side-replies" that were on topic but coming later, after the OP was already satisfied with the urgent fix.
    It was also shaping a spirit of sharing knowledge in a deeper, more elaborate way. To me this was really about a community of people who shared this kind of enthusiasm for good knowledge, for sense and for general improvement.
    Now that we live in immediateness, the "game" is much more about Shooting First, the one short and "simplified" answer that would, yes, quick-fix, but also kill most of the "side-discussion" as considered almost off-topic...
    "By the time I get to my computer each day, all the questions I am able to answer have already been answered"
    Well, the general feeling, anyway, no matter what time is it in Australia, is more like
    "By the time I'm done reading the question as it appears on top of a forum, one or two "Link-to-TheXLab"-type of quick answer have already been copy-pasted within the same minute"!
    (not only Donald but some others who are better at self-censorship, who manage equally well in the new system or who don't speak much because they don't know enough about true reasons/motivations, have left a word or two here, in their own way, about your feeling)
    "A bias towards those replying from the same country as the OP due to time zone differences"
    I see what you mean,
    although for me, as a European, on the contrary, there is no longer that period of being shut out of the "20% chances" thing that I felt in the Old Discussions
    you still can give quick replies to some newbies from Japan/New Zealand/Australia, but I agree there may not be as many as those from the US...
    Your suggestions:
    I second and support them, but I'm not sure, now that the system works again smoothly with no slowing down
    - Thanks so much Mods!! -
    whether they are very high anymore on the list of priorities?
    This depends also on the "political" point of viewing them. If with Tuttle we consider that "many of the new features mainly enhance the forums' usefulness to Apple"
    (my main personal guess about this aspect, would be about the gain in Phone Technical Support Crew management)
    possibly there will be no more changes about this?
    Possibly even more immediateness in the future,
    after all we are seriously entering the "iPod world" now, although let's hope the good old times of computing being a matter of a few Cupertino geniuses and a bunch of truly enthusiastic pure Mac people, is not too far gone now?
    Let me finish with your words, as they are exactly those I'd have liked to write myself:
    In closing I would just like to say a big thankyou to all who keep these Discussions going. Your work is appreciated.
    Axl

  • Just some feedback on the 2014 CC Update

    Just some feedback. When I installed the 2014 Adobe CC apps last evening, I was expecting them to "update" (as it was labelled an update) the current CC apps on my hard drive. Instead, it installed all-new 2014 versions alongside the existing ones. This ended up being a minor issue, as I am running my OS and apps on an SSD and have limited space. It just would have been nice to have seen CC 2014 clearly labelled as a new version of the apps instead of as an "update". That could have been a misunderstanding on my part, but it could be better labelled. Other than that, everything is fantastic, thank you very much.

    martyr2 wrote:
    1) Do I NEED to "update" or INSTALL?
    2) Is this the Adobe way?
    3) We sure thought we were signing up for "always current apps" - updates, yes - but a complete new installation?
    1) If you don't have a lot of money tied up in plug-ins that don't work under the new programs, then you may want to update/install. If you have need of the new features, then you may want to update/install. However, need is subjective, and only you can determine that.
    2) For this update, it appears that this is the Adobe way, for the moment. From the information I have gleaned (Caution! Assumption!), it appears that they are doing it this way because of the lack of backward plug-in compatibility with the new CC 2014 apps. I can see why they would not want to alienate people who have several hundred dollars or more tied up in plug-ins.
    3) That was my understanding as well, but I can understand why they did what they did... I just wish they would have let us know.

  • How do I set up my iTunes Genius to play selections from all, not just some, of my songs in my currently playing playlist?

    Last night, when I played my playlist songs with iTunes Genius (one of my playlists selected, and set to "Shuffle On," "Genius Shuffle On," "Genius On," "Match On," and "Play"), my iTunes played selections from all of the songs in that playlist. Today, my iTunes Genius went back to the same problem it was giving me before: it has been playing selections from just a few of my playlist albums. How do I set up my iTunes Genius to play selections from all, not just some, of my songs in my currently playing playlist?

    Having no backup is a huge mistake.
    You can redownload some iTunes purchases in some countries:
    Downloading past purchases from the App Store ... - Apple - Support

  • Back up just SOME of the music library to an external hard drive

    I want to move just some of my music from my iTunes library onto an external hard drive (reason? the music cannot be obtained again, but right now I am not sure whether I need it all or not).
    So I want to move, say, 50 albums from the main iTunes library to an external drive and then be able to move some or all of them back again at a later date.
    Do I have to copy the whole library over to the external drive and then delete the albums there that I did not want to copy over?  Or is there a better way?
    And how will I copy back just some of the backed-up albums at a later date? Is it the normal "Add folder/file to [main] library" command from the File menu?

    OK, I don't think I have been making it very clear what I am trying to achieve - let me try putting it differently!!
    I have one iTunes folder and, at the moment, that is all backed-up via SyncToy.  I only have music in iTunes.
    Think of that music as being of two types. Part A is music I definitely want; all of it is also on my iPod (which I have set to sync manually with iTunes), and it is probably well over 100 albums. Part B is music that is in iTunes but not on the iPod, and that I am not sure yet whether I want to keep in iTunes and on the iPod (this was a load of music copied over from another iPod several months ago); this might be a further 50 albums.
    So, part A I want to keep and want fully backed up. SyncToy currently does that by backing up 100% of iTunes.
    But for part B, I want to copy the music to a different location, and then keep there only the part B albums, which, album by album, I will either delete completely (if I don't want them) OR copy back to iTunes. Once this second copy is set up, I would delete all the part A material from it, and also delete all the part B material from iTunes.
    I would then have:
    - part A stuff on iTunes, iPod and back-up location 1;
    - part B stuff on back-up location 2.
    So the main question is: from what level in the iTunes folder do I need to copy over to the part B backup, so as to ensure that when I subsequently copy albums back from the part B backup to iTunes, I get all the information, artwork etc. that already exists for the albums? Do I copy the music files, the media files, or the whole iTunes folder?
    This may all sound terribly long-winded but, for various reasons, I am not in a position to decide all at once which of the part B albums I need to keep.
    Is that any clearer?
    Cheers!

  • Speakerphone dynamics.....Just a thought.

    I wonder: is the volume problem with the speakerphone Apple's way of handling feedback? In my limited experience with sound reinforcement, where you place a mic and a speaker, and what they face, can change their characteristics. When the bottom of the phone faces a hard surface like a wall or a hand, it could cause feedback, and Apple must have anti-feedback electronics that lower the volume. With nothing obstructing the bottom of the phone, there should be less of a feedback problem, and thus louder volume.
    (try the iPhone on a table that has no obstruction in front of the bottom of the handset and test your volume)
    With cell or landline phones that have speakerphone capabilities, and even car speakerphone kits, feedback is always an issue, and where the speaker sits relative to the mic makes a large difference (like the Razr Vxx, with the speaker on the opposite side from the microphone, and its great speakerphone volume).
    Why Apple, or someone representing Apple here, isn't addressing this issue, we are left to guess, getting more frustrated until others (like me) come up with explanations as to what the issues are and how to address them.
    I guess that if Apple had an official response on this forum, way too many people would respond with their perceived problems and real issues concerning Apple's inability to address its hardware. Apple hasn't come up with a Pandora's box that closes automatically.
    Just a thought. Try our new "table" test with your iPhone and get back to us.

    Latoque wrote:
    The mail counter is one of those features in Comcast email that is just wonky sometimes. I sure wouldn't count on it to be accurate at any particular moment. Does it clear if you refresh the page?
    Yeah, it clears when I refresh the page, but that's the point. I always have to go back to email to check that those aren't new emails, ones that came in after I deleted the ones I just read. (I hope that made sense!) I wish that when I go to email, read it, then delete it, my homepage would show no email. But that's the "problem": I HAVE TO refresh the homepage to make it current (otherwise it still shows the email I just deleted). It's not a big deal, it just bugs me, frankly. And I know there must be a way to fix that. (Sorry, I just noticed I was repeating myself!)

  • Am suddenly getting error code 3250 when trying to download certain podcasts...not all, just some. How can I fix? Help.

    Am suddenly getting error code 3250 when trying to download certain podcasts...not all, just some. How can I fix? Help.

    I am getting the same error, and have been since the last iTunes update. It's driving me crazy, as some podcasts download without issue, while others won't download at all and give the 3250 error code.

  • How could I set the proxy settings for just some URLs and not for all?

    Hello,
    I am using HttpURLConnection to establish an HTTP connection. The connection passes through a proxy, and it requires security.
    I know that I can set the proxy settings in the system properties, and this works perfect.
    But I don't want to set the proxy settings in the system properties, because those settings would apply to ALL URLs, and I only want them for a few URLs.
    How could I set the proxy settings for just some URLs and not for all?
    Thanks

    java.net.URL.openConnection(java.net.Proxy proxy)
    @since 1.5
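That overload is the key: passing a java.net.Proxy to URL.openConnection() routes just that one connection through the proxy, without touching the JVM-wide system properties. A minimal sketch follows; the proxy host proxy.example.com:8080 and the URLs are placeholders, not values from the original post.

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.URL;

public class PerUrlProxy {
    // Hypothetical proxy address -- substitute your own host and port.
    private static final Proxy PROXY =
            new Proxy(Proxy.Type.HTTP, new InetSocketAddress("proxy.example.com", 8080));

    // Route the connection through the proxy only when requested.
    // Proxy.NO_PROXY forces a direct connection, regardless of any
    // http.proxyHost/http.proxyPort system properties that may be set.
    public static HttpURLConnection open(URL url, boolean viaProxy) throws IOException {
        return (HttpURLConnection) url.openConnection(viaProxy ? PROXY : Proxy.NO_PROXY);
    }

    public static void main(String[] args) throws IOException {
        // openConnection() does no network I/O until connect() or
        // getInputStream() is called; this only prepares the connections.
        HttpURLConnection direct = open(new URL("http://example.com/"), false);
        HttpURLConnection proxied = open(new URL("http://internal.example.com/"), true);
        System.out.println(direct != null && proxied != null);
    }
}
```

If the proxy also requires credentials (the "security" mentioned above), a java.net.Authenticator registered with Authenticator.setDefault(...) is consulted for proxy authentication as well.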

  • Nokia X/X2 -- Some thoughts and questions

    Moved from Symbian Belle, some thoughts:
    1. The email widget doesn't automatically refresh. I have to manually tap to load new mail, despite specifying a sync interval.
    2. It asks me to sign in to an MS Exchange, Nokia or Google account etc. to use the calendar. Can I not get a local one that I can choose to sync if desired, similar to Symbian?
    3. Does the X2 have a play-via-radio capability? I don't see the app, but wanted to check whether the hardware capability is there.
    4. How do I know whether the phone is using GPS, GLONASS or Beidou? The specifications list positioning capabilities.
    5. I don't see the MixRadio app. Possibly not available in certain countries.
    6. Belle had an option to automatically switch between GSM and 3G where available. The X2 only allows a 3G-only setting and GSM/CDMA.
    7. The People and Phone apps need to be bundled together. While scrolling through contacts via the Phone app, if you accidentally tap one, it starts dialing. Plus it doesn't show items together, and it repeats contacts that have more than one number.
    8. I didn't really like the Metro and Fastlane UIs. One reason was that the wallpaper was not changeable, so if you have gaps, all you see is black.
    9. The Glance screen doesn't seem customizable. The time appears on two lines, which seems a bit odd.

    Reply to 9:
    Glance Screen on X v2 is almost amazing. http://developer.nokia.com/resources/library/nokia-x-ui/essentials.html
    A past Nokia X user, now an X user.

  • ZCM Device Map Designer or something, just a thought

    Hi!
    A very useful option with OES is the Novell iPrint Map Designer. So I thought something like this could also be useful for ZCM devices. I mean, it could sometimes be much easier for admins to find clickable devices laid out on a building map! Also useful for admin substitutes (during vacations etc.) and for a general overview. Anyway, just a thought. Of course, it's not very hard to do without a special tool.
    Anyway, yes, did write a few lines for Novell.
    Alar.

    NovAlf,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Visit http://support.novell.com and search the knowledgebase and/or check all
    the other self support options and support programs available.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://forums.novell.com)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://forums.novell.com/faq.php
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://forums.novell.com/

  • Back up just some music from iTunes onto an external hard drive

    I want to move just some of my music from my iTunes library onto an external hard drive (reason? the music cannot be obtained again, but right now I am not sure whether I need it all or not).
    So I want to move, say, 50 albums from the main iTunes library to an external drive and then be able to move some or all of them back again at a later date.
    Do I have to copy the whole library over to the external drive and then delete the albums there that I did not want to copy over?  Or is there a better way?
    And how will I copy back just some of the backed-up albums at a later date? Is it the normal "Add folder/file to [main] library" command from the File menu?


  • Animation of just some parts of a CAD assembly group

    Hello again,
    Is it possible to animate just some parts of an assembly group (e.g. rotating a shaft between two static bearing cases)?
    I have loaded the geometry as a .obj file, and each part is named by the parameter "g" in the .obj file, as far as I understand it. So I wonder whether one could use these named parts to form a transform group?
    Would there be a better file format than .obj for something like this?
    Thanks for your advice!!!

    see GearTest.java in the j3d demos.
