Advice needed about lens correction plugins for Aperture 2.1

Please can I have some advice about lens correction plugins for Aperture 2.1?
I see that the Apple Aperture Downloads site promotes two:
1) LensFix 4.3
An external editor. The Aperture Downloads site calls it LensFix 4.3, but when you go to the developer's site (Kekus Digital) it only shows LensFix CI. Is this the same thing? Am I supposed to download the PTLens database from epaperpress.com and install it manually before it is any good?
OR
2) Edit FixLens 1.6
Their site looks very poorly designed, and the link from the Apple page does not go to the Aperture plugin; you have to drill down until you find it! Does it automatically use the EXIF data about your lens and the zoom setting it was on, or do you need to configure it each time?
Which one did you buy? Was it worth it?
Thanks for your time!

LensFix CI is the same thing. It has the standalone version, the PS plug-in and the Aperture plug-in.
I've used LensFix via PS for a long time. It's great, but it can be a bit buggy if the image in use is 16-bit. Often it will work on the first image or two and then crash on the next.
I am told it's not compatible with PS CS4.
Another option is PTLens. It's also a PS plug-in and an Aperture plug-in. It's supposed to be CS4 compatible.
http://epaperpress.com/ptlens/index.html
It uses the same database of lens corrections as LensFix.
FWIW, I've tended to do this kind of work in Photoshop, either round-tripping an image in Aperture or exporting an image from Aperture. Since Aperture plug-ins go to 16-bit TIFF files anyway, and since most images (if they are final files for clients) need some more PS work, I just go that route.
Jon Roemer
site: http://www.jonroemer.com/
blog: http://jonroemer.typepad.com/jon_roemer/

Similar Messages

  • Need a lens correction for the Canon 8mm-15mm f/4L zoom; it's not in the list of lenses in LR

    Need a lens correction for the Canon 8mm-15mm f/4L zoom; it is not in the list of lenses in Lightroom.
    How can I add it?

    It's available with ACR 8.4 RC, so I would imagine it will be available with LR 5.4 RC/5.4. Just when that will be is not known by anybody outside of Adobe. Hopefully soon.

  • Premiere Pro needs a lens correction for the GoPro cameras

    Premiere Pro needs a lens correction for the GoPro cameras

    Please add your request here:
    https://www.adobe.com/cfusion/mmform/index.cfm?name=wishform
    Best,
    Peter Garaway
    Adobe
    Premiere Pro

  • Is there a pitch correction plugin available for GarageBand?

    Is there a pitch correction plugin available for GarageBand?

    For the record, I've been using Melodyne in GB for a couple of years now and couldn't get along without it. Had a singer scheduled for four songs that I needed finished demos for in two days. He came recommended, but I'd never actually heard him. Turns out he had a voice like Robert Preston, if Robert Preston couldn't carry a tune or keep time. I was facing total failure. But I went ahead and recorded him, warts and all, and hoped Melodyne would be up to the task. I even recorded part of one song in B-flat because he was stronger in that key than in G, which is what my other singer had recorded in.
    Melodyne turned four utterly unworkable, unusable tracks into decent representations. I was even able to engineer harmony lines that were never recorded. It gives you more editing ability for singing tracks than you get with midi notes.
    It's worth the price.

  • What do I do about the Microsoft DRM plugin update for Windows 7 Service Pack 1?

    What do I do about the Microsoft DRM plugin update for Windows 7 Service Pack 1? Do I even need it for my computer? It says next to the download button (to the left) that it is out of date.

    Hi,
    Are you referring to this KB?
    KB947821
    Fix Windows Update corruption errors such as 0x80070002 and 0x80070057
    Could you please share the error messages shown when SP1 failed to install?
    And, now I have multiple copies of Windows 7 Service Pack 1 installed and, the computer wants to continually install another copy upon Shutdown.
    Do you mean the computer wants to install another copy every time it shuts down?
    Please make sure there is enough space on the system drive (where the OS is installed) and perform a cleanup of that drive, then try booting into
    clean boot and installing SP1 manually; download from here: Windows 7 and Windows Server 2008 R2 Service Pack 1 (KB976932)
    Best regards
    Michael Shao
    TechNet Community Support

  • Advice needed - optical bay caddy and HDD for T530

    Since I rarely use my optical drive and frequently back up files to an external hard drive, I'd like to put a HDD in the optical bay and just swap it out for the DVD player whenever that is needed.
    My T530 (i5-3210M Win7HP) has a 256GB Crucial mSSD for the boot drive and programs;  the original 500GB/7200 rpm in the drive bay has data files, Outlook and Quicken files, music, photos and documents.
    For the optical bay HDD 500 GB would be plenty, but the 750 GB is only about $10 more, so why not?  (I'd probably put the 750 in the drive bay and make it the data drive, with the 500 in the optical bay for a back up drive... and this is about convenience more than need.)
    Best I can tell the optical drive supports SATA 6Gb/sec but most of the 2.5" 750GB x 7200 rpm drives are SATA 3Gb/sec.  (On the other hand, the Hitachi MK5061GSY 500GB x 7200 that came in my T530 appears to be 'only' SATA II, and most forum posters suggest there's not a noticeable difference.)
    Of the 750 GB drives it seems the Seagate Momentus ($82) and Western Digital Scorpio Black ($78) would be the top two choices.  I think they're both 9.5mm and would fit. Some reviews indicate the Western Digital is quieter.
    The caddy is more problematic, as there are so many choices, and many appear to be poorly-fitting Chinese knockoffs.  I'm not inclined to pay $70 for the Lenovo, when the others are in the $10-15 range, but I do want one that fits well and works well and lasts.  (And is at least SATA 3GB/sec.)  I can't tell much from the prices or pictures, and there's contradicting information in the 'product reviews.'
    So I need a 12.7mm optical bay (Ultrabay) caddy for a 9.5mm HDD that fits well without gaps, matches the T530 case, and supports at least SATA 3GB/sec.
    Will I need the rubber 'bumpers' or is that just for SSD drives?  If I need them, do they come with the caddy or drive?  If not, what do I get and where?
    Does either the Seagate or Western Digital offer an advantage in function or reliability?
    Please correct anything that I've misstated or misunderstood.  Thank you for your advice and recommendations.

    How the four screws work. 
    HDs and SSDs usually have four threaded holes on the bottom and also two threaded holes on each long side. The video you may have seen, with the screws going through holes in the bottom of the caddy into the threaded bottom holes of the HD, is a video of the $45 Newmodus caddy. That's how they affix the drive to the caddy.
    The Chinese caddy I linked uses the screws in a different way, and it is actually pictured in the top right illustration on the caddy.
    While the drive is out of the caddy, you just thread the screws (through nothing) into the side holes on the drives. So, the drive now has two little protruding screw heads on each side. The only purpose is to create those four slight protrusions - two on each side. 
    Then you insert the drive into the caddy by sliding from one end. The thing on the bottom of the caddy picture is like a little door that folds up and down. You lift up the door and slide the drive into the other end. The protruding screw heads slide under little rail protuberances that are molded inside the sides of the caddy. When you have slid the drive all the way forward into the electrical connectors, you then snap down the door behind it. The door prevents the drive from moving forward or back. The screw heads wedged under the rails prevent the drive from moving up or down. You can do the entire operation in 5 seconds.
    Ny-compu-tek seems to have the same caddy listed under various titles and prices, including one that specifically says T530. I think that's just a marketing gimmick, as the T530 takes the same caddy as the T520. Anyway the one I linked is the least expensive from that source and fits perfectly. 
    It was also shipped out in one day with a tracking number. 

  • Advice needed on choosing the best SSD for a 13-inch MacBook 5,1.

    I'm thinking about upgrading my MacBook. It's a late 2008 aluminum version. I want to replace the old HD with a nice SSD. I'm struggling to choose between the following two SSDs.
      1. Crucial M500 (960GB)  SATA III 6Gbps
      2. Intel DC S3700 (800GB)
    I can get both of these at a similar price, around £300, for just these two days. But I don't know which would work best in my MacBook. Need advice, please.

    Thanks for the link; unfortunately there's no option for the Intel DC S3700.
    After seeing the Crucial M500's score there, I'm not that impressed. I heard that the Intel DC S3700 (800GB) is worth over £1000. Is it worth it just for the sake of getting the Intel at a much cheaper price?  I don't know why I've got the impression that the Intel DC S3700 is actually better than the Crucial M500.

  • Is there a lens correction plugin?

    Hi All,
    I am looking for something Aperture-related that has built-in lens profiles, similar to LR, that can correct distortion quickly. I tried unsuccessfully to install the PTLens plugin three times. It simply doesn't work. Are there any other options that are decent?
    Thanks for the help!

    Did you install all three files in the correct locations as follows?
    /Applications folder:
    PTLens.app
    PTLensEdit.app
    /Users/<YourLogin>/Library/Application Support/Aperture/Plug-Ins folder:
    PTLens.ApertureEdit
    Note - the user Library is hidden by default in OS X 10.7.x. If you press and hold the Option key while using the 'Go' menu in OS X menu bar, you will be able to select and navigate to the Library folder in the latter address above.
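    If you want to confirm that the three files actually landed where Aperture expects them, a quick check along these lines should do it. This is only a sketch: it simply tests that the paths listed above exist, and it assumes the default install locations.

```python
import os

# Expected locations, taken from the install instructions above.
# Adjust if your install differs.
paths = [
    "/Applications/PTLens.app",
    "/Applications/PTLensEdit.app",
    os.path.expanduser(
        "~/Library/Application Support/Aperture/Plug-Ins/PTLens.ApertureEdit"
    ),
]

def check_install(paths):
    """Return (path, exists) for each expected file."""
    return [(p, os.path.exists(p)) for p in paths]

for p, ok in check_install(paths):
    print(("OK       " if ok else "MISSING  ") + p)
```

    Anything reported MISSING needs to be copied into place (remembering that the user Library is hidden, as noted above).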
    Unfortunately, with PTLens (or any other plug-in), Aperture makes a TIFF (or PSD) master file to work on. I don't know of any program or plug-in that doesn't add to the workflow in Aperture for lens correction.
    FWIW, I use DxO Optics Pro 7.5 for my kit lens, but I output in JPEG format and then import into Aperture. It works well for me, but is not necessarily for everyone.

  • Help or advice needed about converting my movies... I think?! Thanx!

    Hi, hope someone can help...
    I'm not a real web designer, but over the last few years I've taught myself how to make web pages and do some coding so that I can make basic sites and galleries for me and my girls...
    Well, I've just realized what a huge collection of movies we've got now, and I've started putting them onto web pages - they are WMV...
    But half the time they don't work, especially on a Mac etc...
    Then I came across this blog at Lady Sonia dot net and all her movie clips work perfectly, all on the same page, and when I right-clicked them they say
    'about adobe flash player 10'
    So my question is: would anyone be kind enough to tell me how I can change all our WMV movie clips into whatever they need to be so they work like that???
    Or maybe someone does that kind of job and I could pay someone to do them for me???!!! lol
    I doubt anyone wants to do that lol, so in that case would anyone point me to the help page I need to read for this exact issue? I have tried F1, as many people on here say, and it took me to Firefox and was no help at all. I've been searching Flash help sites all night and I have no idea what I'm reading, it's soooo confusing, so if anyone knows a place they can point me to, that would be really great!
    Thanx!
    xxx Honey xxx

    Hey, thanx for explaining - and for helping now!
    OK, the code is pasted below, but what I actually did in Dreamweaver was
    > Insert > Media > Flash > inserted the Flash clip
    but that didn't work, so I tried
    Insert > Media > Plugin > inserted the FLV movie clip
    but it says
    'unable to find the plugin that handles this type of media'
    although I now have 2 clips playing automatically!!!
    <p> </p>
    <p>
      <embed src="hoover-cam-51s-1.flv" width="320" height="240"></embed>
      <object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,29,0" width="320" height="240">
        <param name="movie" value="player.swf">
        <param name="quality" value="High">
        <param name="swliveconnect" value="true">
        <embed src="player.swf" quality="High" pluginspage="http://www.macromedia.com/go/getflashplayer" type="application/x-shockwave-flash" width="320" height="240" swliveconnect="true"></embed>
      </object>
      <object classid="clsid:166B1BCA-3F9C-11CF-8075-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/director/sw.cab#version=8,5,0,0" width="320" height="240">
        <param name="src" value="hoover-cam-51s-1.flv">
        <embed src="hoover-cam-51s-1.flv" pluginspage="http://www.macromedia.com/shockwave/download/" width="320" height="240"></embed>
      </object>
    </p>
    <body>
    <p>
      <object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,29,0" width="32" height="32">
        <param name="movie" value="dom-jet-13s.flv">
        <param name="quality" value="high">
      </object>
      <object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,29,0" width="32" height="32">
        <param name="movie" value="54s.flv">
        <param name="quality" value="high">
        <embed src="54s.flv" quality="high" pluginspage="http://www.macromedia.com/go/getflashplayer" type="application/x-shockwave-flash" width="32" height="32"></embed>
      </object>
      <embed src="player.swf" width="320" height="240"></embed>
    </p>
    <p> </p>
    <p>
      <embed src="hoover-cam-51s-1.flv" width="320" height="240"></embed>
      <object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,29,0" width="320" height="240">
        <param name="movie" value="player.swf">
        <param name="quality" value="High">
        <param name="swliveconnect" value="true">
        <embed src="player.swf" quality="High" pluginspage="http://www.macromedia.com/go/getflashplayer" type="application/x-shockwave-flash" width="320" height="240" swliveconnect="true"></embed>
      </object>
      <object classid="clsid:166B1BCA-3F9C-11CF-8075-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/director/sw.cab#version=8,5,0,0" width="320" height="240">
        <param name="src" value="hoover-cam-51s-1.flv">
        <embed src="hoover-cam-51s-1.flv" pluginspage="http://www.macromedia.com/shockwave/download/" width="320" height="240"></embed>
      </object>
    </p>
    </body>

  • Question about Windows Media plugin(s) for MBP

    I like to listen to baseball games via Gameday Audio at MLB.com; however, it seems that I am unable to do so using my MacBook Pro because their service requires a Windows Media Player plugin for your web browser, and the WMP installer does not install one.
    I tried installing Flip4Mac, but that's not universal yet, so I've come up empty. Any other suggestions as to how I can work around this problem and listen to the games?
    Thanks in advance,
    Crissy
    MacBook Pro 2GHz Intel Core Duo   Mac OS X (10.4.6)   1 GB DDR2 SDRAM

    I can't get Gameday Audio to play on my MacBook Pro either. I've installed the Flip4Mac patch (via Rosetta); the Flash plugin; Windows Media Player; Firefox for Intel-based Macs. I've tried opening Safari in Rosetta. I've had some success playing free video and audio from Major League Baseball in both Safari and Firefox. But whenever I log into the subscription audio service, it fails. Firefox tells me, "Additional plugins are required" but, when I agree to install them, it informs me that "no suitable plugins were found." For its part, Safari says, "Some content on this page requires an Internet plug-in that Safari doesn’t support. The application “Windows Media Player” may be able to display this content. Would you like to try?" I do try, but WMP is never up to the task: the rainbow spinner goes round and round and Force Quit confirms that WMP is "not responding." I'm invariably reduced to pulling out my old computer to hear the game. I've talked to MLB about this, and suspect that their stock advice for Mac users does not yet take into account Intel-based Macs. Do I have to wait on Flip4Mac to produce a patch specifically for Intel-based Macs? Can anyone recommend anything else?
    MacBook Pro   Mac OS X (10.4.6)  

  • Advice needed: is BDB a good fit for what I aim at?

    Hello everyone,
    I'm not a BDB user (yet), but I really think that the BDB library
    IS the perfect fit for my needs.
    I'm designing an application with a "tricky" part, that requires a very fast
    data storage/retrieval solution, mainly for writes (but for reads too).
    Here's a quick summary of this tricky part, that should at least use
    2 databases:
    - the first db will hold references to contents, with a few writes per hour
    (the references being "pushed" to it from a separate admin back end), but
    expected high numbers of reads
    - the second db will log requests and other events on the references
    contained in the first db: it is planned that, on average, one read from DB1
    will produce five times as many writes into DB2.
    To illustrate:
    DB1 => ~25 writes / ~100 000 reads per hour
    DB2 => ~500 000 writes / *(60?) reads per hour
    (*will explain about reads on DB2 later in this post)
    Reads and writes on both DBs are not linear, say that for 500 000 writes
    per hour, you could have the first 250 000 being done within 20 minutes,
    for instance. There will be peaks of activity, and low-activity phases
    as well.
    That being said, do the BDB experts here think that BDB is a good fit for
    such a need? If so or if not, could you please let me know what makes you
    think what you think? Many thanks in advance.
    Now, about the "*(60?) reads per hour" for BD2: actually, data from DB2
    should be accessed in real time for reporting. As of now, here is what
    I think I should do to ensure and preserve high throughput, so as not to
    miss any write in DB2 => once per minute another "DB2" is created that will
    now record new events. The "previous" DB2 is now dumped/exported into another
    database which will then be queried for real-time (not exactly real-time,
    but up to five minutes is an acceptable delay) reporting.
    So, in my first approach, DB2 is "stopped" then dumped each minute, to another
    DB (not necessarily BDB, by the way - data could probably be re-structured another
    way into another kind of NoSQL storage to facilitate querying and retrieval
    from the admin back end), which would make 60 reads per hour (but "entire"
    reads, full db)
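    For what it's worth, the "renew DB2 once per minute" idea can be sketched independently of the storage engine. In this sketch (Python, with a plain dict standing in for a BDB handle, and an `archive` callback standing in for the dump/export step - all names here are mine, not BDB API):

```python
import threading
import time

class RotatingStore:
    """Sketch of the 'renew DB2 every interval' scheme described above.

    Writes always go to the current store; once `interval` seconds
    have elapsed, the current store is retired and handed to
    `archive` (the dump/merge step) before the write proceeds.
    """

    def __init__(self, archive, interval=60.0):
        self.archive = archive              # callable taking the retired store
        self.interval = interval
        self.current = {}                   # stand-in for the live "DB2"
        self.opened_at = time.monotonic()
        self.lock = threading.Lock()

    def put(self, key, value):
        with self.lock:
            self._maybe_rotate()
            self.current[key] = value

    def _maybe_rotate(self):
        if time.monotonic() - self.opened_at >= self.interval:
            retired, self.current = self.current, {}
            self.opened_at = time.monotonic()
            # In production the dump/merge would likely run on a
            # background thread so writes are never blocked by it.
            self.archive(retired)

archived = []
store = RotatingStore(archive=archived.append, interval=60.0)
store.put("req-1", "GET /contents/42")
```

    The open question of whether rotating that often strains performance would come down to how expensive open/close and the dump step are in the real engine.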
    The questions are:
    - do you think that renewing DB2 that often would improve or strain performance?
    - is BDB good and fast at doing massive dumps/exports? (OK: 500 000 entries per
    hour would make ~8300 entries per minute on average, so let's say that a dump's
    max size is 24 000 rows of data)
    - would it or not be better to read directly into the current DB2 as it is
    storing (intensively) new rows, which would then avoid the need to dump each
    minute and then provide more real-time features? (then would just need a daily
    dump, to archive the "old" data)
    Anyone who has had to face such questions already is welcome, as well as
    any BDB user who thinks they can help on this topic!
    Many thanks in advance for your advice and knowledge.
    Cheers,
    Jimshell

    Hi Ashok
    Many thanks for your fast reply again :)
    Ashok_Ora wrote:
    Great -- thanks for the clarification.
    Thank YOU, my first post was indeed a bit confusing, at least about the reads on DB2.
    Ashok_Ora wrote:
    Based on this information, it appears that you're generating about 12 GB/day into DB2, which is about a terabyte of data every 3 months. Here are some things to consider for ad-hoc querying of about 1 TB of data (which is not a small amount of data).
    That's right, this is quite a huge lot of data, and it will keep growing, and growing... Although the main goal of the app is to be able to achieve (almost) real-time reporting, it will also need to be able (potentially) to compute data over different time ranges, including yearly ranges for instance - but in this case, the real-time capabilities wouldn't be relevant, I guess: if you look at some data over a year's span, you probably don't need it to be accurate on a daily interval, for instance (well, I guess), so this part of the app would probably only use the "very old" data (not the current day's data), whatever it is stored in...
    Ashok_Ora wrote:
    Query performance is dramatically improved by using indexes. On the other hand, indexing data during the insert operation is going to add some overhead to the insert - this will vary depending on how many fields you want to index (how many secondary indices you want to create). BDB automatically indexes the primary key. Generally, any approach that you consider for satisfying the reporting requirement will benefit from indexing the data.
    Thanks for pointing that out! I did envisage using indexes, but my concern was (and you guessed it) the expectable overhead that they bring. At this stage (but I may be wrong, this is just a study in progress that will also need proper tests and benchmarking), I plan to favour write speed over everything else, to ensure that all the incoming data is indeed stored, even if it is quite tough to handle in its primary stored form.
    I prefer to envisage (but again, it's not certain that this is the right way of doing it) very fast inserts, then possibly re-processing (sort of) the data later, and (maybe? certainly?) elsewhere, in order to have it more "query friendly" and efficient for moderately complex queries for legible reports/charts.
    Ashok_Ora wrote:
    Here are some alternatives to consider, for the reporting application:
    - Move the data to another system like MongoDB or CouchDB as you suggest and run the queries there. The obvious cost is the movement of data and maintaining two different repositories. You can implement the data movement in the way I suggested earlier (close "old" and open "new" periodically).
    This is pretty much "in line" with what I had in mind when posting my question here :).
    I found out in several benchmarks (there are not a lot, but I did find some ^^) that BDB, amongst others, is optimized for bulk queries, meaning that retrieving a whole lot of data at once is faster than, for instance, retrieving the same row n times. Is that right? Now, I guess that this is tightly related to the configuration and the server's performance...
    The process would then feed data into a new "DB2" instance every 60 seconds, and "dumping"/merging the previous one into another DB (BDB or else), which would grow until some defined limit.
    Would the "old DB2" > "main, current archive" merge be a heavy/tricky process, in your view? Especially as the "archive" DB keeps growing and growing - what would be a decent "limit" to take into account? I guess that 1 TB for 3 months of data would be a bit big, wouldn't it?
    Ashok_Ora wrote:
    - Use BDB's SQL API to insert and read data in DB1 and DB2. You should be able to run ad-hoc queries using SQL. After doing some experiments, you might decide to add a few indices to the system. This approach eliminates the need to move the data and maintain separate repositories. It's simpler.
    I read a bit about it, and these are indeed very interesting capabilities - especially as I know how to write decent SQL statements.
    That would mean that DB2 could grow beyond just a 60-second time span - but would this growth alter the write throughput? I guess so... This will require proper tests, definitely.
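    Since BDB's SQL API presents the SQLite interface, Python's sqlite3 module gives a rough feel for the ad-hoc querying idea. The table and column names below are invented for illustration, and the secondary index shows the insert-overhead-vs-query-speed trade-off discussed above:

```python
import sqlite3

# In-memory database as a stand-in for the DB2 event log.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        ts      INTEGER NOT NULL,   -- event timestamp
        ref_id  TEXT NOT NULL,      -- reference in DB1 that was read
        payload TEXT                -- the JSON-ish event detail
    )
""")
# Secondary index: speeds up per-reference reporting queries,
# at some cost on every insert.
conn.execute("CREATE INDEX idx_events_ref ON events (ref_id, ts)")

conn.executemany(
    "INSERT INTO events (ts, ref_id, payload) VALUES (?, ?, ?)",
    [(1, "doc-42", "{}"), (2, "doc-42", "{}"), (3, "doc-7", "{}")],
)
# A typical ad-hoc reporting query: hits per reference.
rows = conn.execute(
    "SELECT ref_id, COUNT(*) FROM events GROUP BY ref_id ORDER BY ref_id"
).fetchall()
```

    Whether the real write throughput holds up with such an index in place is exactly the kind of thing the planned benchmarks would have to measure.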
    Now, I plan the "real" data (the "meaningfull part of the data"), except timestamps, to be stored in quite a "NoSQL" way (this term is "à la mode"...), say as JSON objects (or something close to it).
    This is why I envisaged MongoDB for instance as the DB layer for the reporting part, as it is able to query directly into JSON, with a specific way to handle "indexes" too. But I'm no MongoDB expert in any way, so I'm not sure at all, again, that it is a good fit (just as much as I'm not sure right know what the proper, most efficient approach is, at this stage).
    Ashok_Ora wrote:
    - Use the Oracle external table mechanism (Overview and how-to - http://docs.oracle.com/cd/B28359_01/server.111/b28319/et_concepts.htm) to query the data from Oracle database. Again, you don't need to move the data. You won't be able to create indices on the external tables. If you do want to move data from the BDB repository into Oracle DB, you can run a "insert into <oracle_table> select * from <external_table_in_DB2>;". As you know, Oracle database is excellent database for all sorts of applications, including complex reporting applications.
    This is VERY interesting. VERY.
    And Oracle DB is, as you say, a very powerful and flexible database for every kind of process.
    I'll look into the docs carefully, many thanks for pointing that out (again!) :)
    I have not yet decided if the final application will be free or open source, but this will eventually be a real question. Right now, I don't want to think about it; I just want to find the best technical solution(s) to achieve the best possible results.
    And BDB and Oracle DB are very serious competitors, definitely ;)
    Ashok_Ora wrote:
    Hope this was helpful. Let me know your thoughts.
    It definitely is very useful! It makes things clearer and allows me to get more into BDB (and Oracle as well, with your latest reply), and that's much appreciated. :)
    As I said, my primary goal is to ensure the highest write throughput - I cannot miss any incoming data, as there is no (easy/efficient) way to re-ask for what would be lost and get it again while being sure that it hadn't changed (the simple act of re-asking would induce data flaws, actually).
    So, everything else (including reporting, stats, etc.) IS secondary, as long as what comes in is always stored for sure (almost) as soon as it comes in.
    This is why, in this context, "real" real time is not really crucial, and can be "1 minute delayed" real time (it could even be "5 minutes delayed", actually, but let's be a bit demanding ^^).
    Ashok_Ora wrote:
    Just out of curiosity, can you tell us some additional details about your application?
    Of course, I owe you a bit more detail, as you have helped me a lot in my research/study :)
    The application is sort of a tracking service. It is primarily intended to serve the very specific needs of a client of mine: they have several applications that all use the same "contents". Those contents can be anything - text, HTML, images, whatever - and they need to know, almost in real time, which application (used by which external client/device) is requesting resources, which ones, from where, in which locale/area and language, etc.
    Really a kind of "Google Analytics" thing (which I pointed out at the very beginning, but they need something more specific, and, above all, they need to keep all the data with them, so GA is not a solution here).
    So, as you can guess, this is pretty much... big. On paper, at least. Not sure if this will ever be implemented one day, to be honest with you, but I really want to do the technical study seriously and bring the best options so that they know where they plan to go.
    As for me, I would definitely love it if this could become reality; this is very interesting and exciting stuff. Especially as it requires seeing things as they are and not falling into the "NoSQL fashion" for the sake of being "cool". I don't want a cool application, I want an efficient one that fits the needs ;) What is very interesting here is that BDB is not new at all, yet it's one of the most serious identified players so far!
    Ashok_Ora wrote:
    Thanks and warm regards.
    ashok
    Many thanks again, Ashok!
    I'll leave this question open, in order to keep posting as I progress (and, above all, to be able to get your thoughts and rewarding comments and advice :) )
    Cheers,
    Jimshell

  • Advice needed: what does your company log for SAP security role changes?

    My client has a situation where, for many years, they never logged changes to SAP security roles.  By that I mean they never logged even basic details, like who requested a change, who tested it, who approved it, and what changed!!  Sadly their ticketing system is terrible: completely free-form text and not even searchable. 
    Does anyone here use Word docs, Excel sheets, or some other way to capture security role change details?   What details do you capture?  What about projects that involve dozens of changes and testing over several months?
    I plan to recommend, at least, that they use a unique # (a ticket #, or whatever) for every change and record it in the PFCG role description tab, plus in the CTS description of transports... but what about other details, given their bad ticketing system?  I spoke with internal audit and the change management "manager" about it, and they are clueless and will not make recommendations.  It's really weird, but they will get into big trouble eventually without any logs for security changes!
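    Pending a better ticketing system, even a minimal structured record per role change goes a long way. Here is a sketch of the fields worth capturing (the field names are my suggestion, not an SAP convention), serialized to CSV so the log at least stays searchable:

```python
import csv
import io
from dataclasses import dataclass, fields

@dataclass
class RoleChange:
    """One security-role change record; field names are suggestions only."""
    ticket_id: str       # unique number, also placed in the PFCG role
                         # description tab and the transport (CTS) description
    role_name: str
    requested_by: str
    tested_by: str
    approved_by: str
    transport_id: str
    change_summary: str  # what actually changed (auth objects, tcodes, ...)
    project: str = ""    # blank for BAU changes, project code otherwise

def to_csv(records):
    """Serialize records to CSV so the log can be filtered in Excel."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow([f.name for f in fields(RoleChange)])
    for r in records:
        writer.writerow([getattr(r, f.name) for f in fields(RoleChange)])
    return buf.getvalue()
```

    The key design point is the shared unique # tying the spreadsheet row, the PFCG description, and the transport together, so an auditor can trace any one of them to the other two.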

    Does anyone here use Word docs, Excel sheets, or some other way to capture security role change details? What details do you capture? What about Projects, that involve dozens of changes and testing over several months?
    I have questions:
    a) Do you want to set things straight?
    b) Do you want to implement a versioning mechanism?
    c) Can you not implement anything technical, and so are asking about best "paper" practice?
    The mentioned scenarios can be well maintained if you use SAP GRC Solutions 10 (Business Role Management):
    task-based, with approvals, risk analysis, SoD, and role generation and maintenance in a structured way. A workflow-based, staged process with approvals.
    PFCG transaction usage will be curtailed to a minimum if implemented fully.
    Do we really want to do things "outside" PFCG?
    @all:
    a) Do you guys use custom approval workflows for roles?
    b) How tight are your processes? How much paperwork, workflow, tickets, requests and incidents do you have to go through to change a role?
    c) Who is a friend of GRC here? Raise your hand.
    Cheers Otto
    p.s.: very interesting discussion, I would like to learn something here about how it works out there in the wild

  • Advice needed with audio punch-in workflows for news

    I'm testing a new FCP installation for a News Broadcaster and reached a major stumbling block with the recording of Voice Over in tight turnaround situations. Previously the broadcaster fully utilized the 'Audio Punch-in' functionality of Avid Media Composer but we are yet to find a satisfactory equivalent in FCP or Soundtrack.
    The Avid workflow currently involves playing out our programme mix on tracks 1 and 2 to our external mixer whilst simultaneously recording the programme/VO mix from the external mixer on Avid Mojo i/p 3 and 4 onto tracks 3 and 4. They also record the clean VO from the external mixer via Avid Mojo i/p 5 and 6 to tracks 5 and 6. At any point the journalist could make a VO mistake, so they would stop the VO recording, change the punch-in point on the timeline, and proceed to record over the required portion of VO, with a defined playback pre-roll before the VO begins to record.
    These tools ensure they can record VO very quickly and fix mistakes on the fly without the need for further editing. The broadcaster relies on this functionality for tight-turnaround bulletins, and without an equivalent toolset in FCP this could be reason enough not to make the switch :O(
    To summarize they need the following features for voice over:
    - simultaneous recording of 4 audio channels
    - playing back the recorded track during pre-roll and then switching over to the inputs like a kind of EE operation
    - defining the tracks for recording and overwriting the previously recorded material if you record the same part again.
    Has anyone on this forum found a way to record VO like this on FCP or Soundtrack? Any advice would be gratefully received.
    Independent Editor, Technician and Trainer

    Open the Help menu in FCP, and search the online user's manual with the words "Voice Over Tool"... FCP has something similar to Avid's tool... you can keep nat sound etc, and record VO right to the timeline window. Even do multiple takes. Video can be routed externally to a monitor in a VO booth too during the record. The talent can have a headphone as well.
    Jerry

  • Advice needed about slow broadband speed

    Hi there,
    I was wondering if you could help me. I live in the countryside and, wouldn't you guess it, I get slow internet speeds. Obviously I don't expect superfast broadband, but I just want to see if there is any reason for my slow speeds other than my distance from the exchange (which I believe is 4.3 km, a long distance I know).
    I am posting my router statistics (BT HomeHub 3) and the BT Wholesale Performance Test report below. Essentially I am curious whether old wiring or noise on the line is playing a part. I'm certainly no expert, but my home is quite old and I'm not sure when the wiring was installed, so I'd really appreciate any advice. I've also just emailed BT customer support about resetting the IP profile, as the connection has been extremely slow ever since last week's weather played havoc with it. Any advice appreciated.
    Thanks very much.
    Nick
    ADSL Line Status
    Connection Information
    Line state: Connected
    Connection time: 4 days, 11:07:43
    Downstream: 2.125 Mbps
    Upstream: 448 Kbps
    ADSL Settings
    VPI/VCI: 0/38
    Type: PPPoA
    Modulation: G.992.1 Annex A
    Latency type: Fast
    Noise margin (Down/Up): 5.8 dB / 12.0 dB
    Line attenuation (Down/Up): 63.5 dB / 31.5 dB
    Output power (Down/Up): 17.2 dBm / 12.3 dBm
    FEC Events (Down/Up): 0 / 91
    CRC Events (Down/Up): 6871 / 77
    BT Wholesale Broadband Performance Test:
    Download Speed (Mbps): 1.25
    Upload Speed (Mbps): 0.31
    Ping Latency(ms): 43.88
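    As a rough sanity check on the figures above: a downstream attenuation of 63.5 dB is very high, and interpolating it against a rule-of-thumb attenuation-to-speed table lands close to the sync rate the router reports. The table below is an indicative assumption (real-world figures vary by line quality and ISP), not an official BT or ITU specification; a minimal sketch in Python:

```python
# Rough ADSL (G.992.1) sync-speed estimate from downstream line
# attenuation. The table is an indicative rule of thumb (assumed
# values, not an official specification), interpolated linearly.

ATTEN_TO_MBPS = [
    (20.0, 8.0),   # short lines: near full-rate ADSL
    (35.0, 6.0),
    (45.0, 4.0),
    (55.0, 3.0),
    (60.0, 2.5),
    (65.0, 2.0),
    (75.0, 1.0),   # very long lines
]

def estimate_sync_mbps(attenuation_db: float) -> float:
    """Linear interpolation over the indicative table above."""
    points = ATTEN_TO_MBPS
    if attenuation_db <= points[0][0]:
        return points[0][1]
    if attenuation_db >= points[-1][0]:
        return points[-1][1]
    for (a1, r1), (a2, r2) in zip(points, points[1:]):
        if a1 <= attenuation_db <= a2:
            t = (attenuation_db - a1) / (a2 - a1)
            return r1 + t * (r2 - r1)

print(round(estimate_sync_mbps(63.5), 2))  # -> 2.15
```

    The estimate of roughly 2.15 Mbps is close to the reported 2.125 Mbps downstream, which suggests the line is syncing about where a 4.3 km run would be expected to, rather than being dragged down by internal wiring.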

    Thank you for such a fast reply.
    The further diagnostic test is below. The strange thing is that I was getting around 3–3.5 Mbps downstream a week ago, but with the frequent disconnections due to the weather I have a bad feeling the exchange has downgraded my line.
    1. Best Effort Test: 
    Download Speed : 1.5 Mbps
    Your speed test has completed and the results are shown above, however during the test an error occurred while trying to retrieve additional details regarding your service. As a result we are unable to determine if the speed you received during the test is acceptable for your service. Please re-run the test if you require this additional information.
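    On the IP-profile point: for ADSL Max lines the BT IP profile is commonly quoted as roughly 83.5% of the sync rate, floored to a banded value, with measured throughput sitting a little below the profile. The band list and percentage below are assumptions based on commonly published figures, not an official specification; a quick sketch:

```python
# Commonly quoted BT ADSL Max profile bands in kbps (assumed list,
# not an official specification).
BANDS = [135, 250, 500, 750] + list(range(1000, 7250, 250))

def ip_profile_kbps(sync_kbps: float) -> int:
    """Profile is often quoted as ~83.5% of sync, floored to a band."""
    raw = sync_kbps * 0.835
    return max(b for b in BANDS if b <= raw)

# Router reports 2.125 Mbps downstream sync:
print(ip_profile_kbps(2125))  # -> 1750
```

    A 1750 kbps profile would make a best-effort test of around 1.5 Mbps plausible. If the line resynced at lower rates during last week's disconnections, the profile steps down quickly but can take days to recover, which would fit the drop from ~3 Mbps.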

  • Advice needed on which processor to buy for a Mega PC Pro

    Hello all, just registered,
    Could anyone recommend which processor I should buy for the MEGA PC 865 Pro I just ordered? I can see an Intel Pentium 4 630 3 GHz Socket 775 800 MHz 2 MB retail boxed on Ebuyer for £124, which is in my price range, but I'm not sure why it's good. The only criteria I have are that I will be doing video editing and I want my PC to stay quiet, so I suppose there will be a trade-off somewhere. I'm confused about why there are so many processors with so little apparent difference between them.
    Also, is Windows XP really the best software for a Pentium 4? I prefer Windows 2000, but I read that XP takes advantage of Hyper-Threading. But will I really notice any difference?
    Thanks

    richmitch, the thread you referred to is specifically for the Mega 865; the 865PRO is a slightly different kettle of fish.
    Win2K should work with no problems, though I'm not sure about Hyper-Threading. MSI say they only support customers running WinXP, but you can still get drivers for Win2K.
    Having said that, Win2K and WinXP are still very much the same. If it's the graphical look of WinXP you don't like, you can always set it to the "Classic" theme, and it looks and feels like Win2K anyway.
    All info on the Mega 865PRO is here; click the links at left for drivers, CPU support, etc.
    http://www.msi.com.tw/program/products/slim_pc/slm/pro_slm_detail.php?UID=602
    you can find info on differences between P4 CPUs in this chart
    http://www.intel.com/products/processor_number/index_view_p4.htm
