Best practice - over-issue of move orders in WIP

Hi business analyst gurus,
I work for an electronics manufacturer and I am trying to use the work order functionality across our organizations. I am stuck on a piece of logic; please advise what you would do in this scenario.
BOM: I have a board assembly which requires 10 components, say one each of 10 components on the BOM.
Setups:
All components have WIP supply type Push, and the supply subinventory is blank for all BOM components.
'Release backflush components' is not enabled in the WIP parameters.
One day I want to build 100 board assemblies, so I create a work order for 100 and release the move order through the Component Pick Release form with transaction type WIP Component Issue.
The warehouse picker then goes into the warehouse with a move order of 10 lines to pick 100 of each component. Two of the components come on reels of 1000 which he cannot split, and a split reel cannot be used on the production floor, so he issues 1000 against those move order lines.
All the material goes to production and 100 assemblies are built, but the two reels are left over with 900 of each component. At this point I could do a WIP component return against the job.
The expectation, though, is that if I build another work order (say for a quantity of 100 again), the 900 remaining of those two components should be available to it; when I run component pick release, the move order should request only the other 8 components.
Is this possible? If not, what is the best practice to avoid the WIP component return transaction and avoid splitting the reels?
Thanks
Veerakumar

1) If you have these items as Push, you will get a move order line for 100. Therefore, even if you move a whole reel to the floor, only 100 will get transacted out of the warehouse. Someone will have to manually transact the additional 900 to the floor, otherwise your inventory accuracy will go for a toss.
Have you considered making those items pull?
1) As and when the worker needs a reel, he raises a signal (there are different ways of doing this - it could be Oracle kanban, a visual signal, a holler - whatever works for you).
2) You transfer 1000 to the floor.
3) As and when jobs complete, the 100 units get issued to the work order.
4) Whatever is left over and not needed (say 800 after the two work orders) is transferred back to the warehouse with a subinventory transfer transaction.
2) If you can't make them pull, then you will be forced to move the 900 back to the warehouse when the first job is done.
3) If you can't make them pull, can you do a component pick release (CPR) for multiple jobs at a time? You can group your pick tickets by destination operation. This way, upon component pick release, you will have one move order line with qty = 200; the picker transacts the move order line for 200 and does a subinventory transfer to WIP for the remaining 800 (see the sketch after this list).
4) Here is the best case scenario. Don't know if your floor layout or factory processes will support this.
You make the items pull on the BOM. You have a temporary holding area on the floor (aka supermarket). When an operator needs the item, a visual signal is raised. The supervisor (aka spider) checks the supermarket and brings a reel to the operator. Upon completing the job(s), whatever is left of the reel goes back to the supermarket. Once the reel is no longer needed for the day (or week or month), you do a subinventory transfer from the supermarket to the warehouse for whatever is left. The components get issued to the work order upon completion (of the operation/assembly).
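To illustrate the quantity math behind these options, here is a minimal sketch in plain Python (hypothetical quantities; this only shows the arithmetic, not any Oracle WIP/INV functionality):

```python
# Net-requirement arithmetic for the reel scenario discussed above.
# Hypothetical data; not Oracle code.

REEL_SIZE = 1000          # reel components can only be moved as full reels
QTY_PER_ASSEMBLY = 1      # one of each component per board, per the BOM

def move_order_requirement(job_qty, floor_stock):
    """Net quantity a component pick release should request for one component."""
    gross = QTY_PER_ASSEMBLY * job_qty
    return max(gross - floor_stock, 0)

def reels_to_move(net_requirement):
    """Whole reels needed to cover the net requirement (ceiling division)."""
    return -(-net_requirement // REEL_SIZE)

# Job 1: nothing on the floor yet -> request 100, move one full reel, 900 left on the floor.
req1 = move_order_requirement(job_qty=100, floor_stock=0)          # 100
left_over = reels_to_move(req1) * REEL_SIZE - req1                 # 900

# Job 2: 900 already on the floor -> net requirement is 0, so no move order line is needed
# for these two components; only the other 8 components would be requested.
req2 = move_order_requirement(job_qty=100, floor_stock=left_over)  # 0
print(req1, left_over, req2)
```

Whether that left-over 900 is tracked on the floor (pull/supermarket) or returned to the warehouse is exactly the choice between the options above.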
Do the best you can out of this scenario. :)
Sandeep Gandhi

Similar Messages

  • Best practice - material staging for production order

    Hi Experts,
    Could any of you please support me with some hints on best practice for handling material staging over the WM-PP interface in a certain case?
    Up until now we had a system where production had no separate storage location in IM; one location covered both the raw material warehouse and production. At the same time, in WM we had separate storage types for production and raw materials, hence material staging only transferred goods within one IM location, between different WM storage types. The material staging should be done based on separate production orders.
    Now this needs to change and a separate storage location needs to be handled in IM for production, which means staging should be done between different IM locations and the WM administration also needs to be handled.
    Up until now we used LP10 for staging, then LB13 for TO creation, etc. We can keep going like that, but if we do, another step is required in IM: movement 311, where material numbers and quantities need to be entered manually to finish the whole procedure. I would like to avoid this, as it makes the administrative procedure quite long.
    I have been checking the following possibilities:
    1. Set release order parts staging in the control cycle and use MF60 for staging - but I cannot select requirements based on production orders here (I can only find the demand if the component is included in the selection).
    2. Two-step transfer 313/315 - but this is not a supported procedure (313 TI/TO / 315).
    3. Try to find a solution to create the 311 movement based on the TO, or based on WM stock at a certain storage type / dynamic bin.
    I have failed.
    So could any of you please support me with some useful ideas on how to handle material staging where the 311 movement is included as the last step of the procedure, but the administrator does not need to enter items manually one by one in MIGO?
    All answers will be appreciated

    Hi,
    Storage location control should be able to take care of your problem.
    If you want to stage the material to a different IM location than the WM location, then make the following settings.
    Say location XXXX is your WM location and location YYYY is your production storage location.
    You have defined production storage type ZZZ for production storage location YYYY and have maintained the supply area for it.
    In WM configuration - For Interfaces - IM Interface - Control of Assignment "Plant / Stor.Loc. - Whse Number":
    assign location XXXX as the standard location, and maintain the 'do not copy sloc in TR' entry for location YYYY.
    In WM configuration - For Interfaces - IM Interface - Storage Location Control for Warehouse, maintain the corresponding entry.
    This entry ensures that a WM transfer posting is made automatically between your WM and production storage locations when you confirm your TO. You can also have this done via a batch job if you want cumulative posting (schedule job RLLQ0100).

  • Best practice for exporting a .mov file for YouTube

    I am using FCPX 10.0.9 with a MacBook Pro. We need to upload relatively small high school football videos as .mov files to YouTube for my newspaper. I have been choosing the Export File option with the setting set to H.264, exporting the file to the desktop and then uploading that .mov file to YouTube. (I do not have Compressor installed on this Mac.) The other night the first of three files done like this uploaded fine, but the next two just kept processing and processing at YouTube and never finished. The following morning the files processed fine. Any thoughts about why this might be happening?
    And would it be better to export in another file format instead, such as .mp4, that might cause less of a problem? If so, which of the export options would be best?
    Thanks,
    Douglas

    douglas i wrote:
    Am I correctly uploading a .mov file to YouTube by using the Export File command with the export setting set to .h264 then if I wish a reasonable size file for upload?
    Sure. There are many ways to get videos to YT and yours is a reasonable approach.
    "Reasonable size" should be thought of in terms of upload time. Again, YT will re-compress whatever you give it. So depending on how quickly you need to get it up on the Web, it's better to upload files that have more information than less information. (To that end, some folks upload very large Pro Res files; they also have a lot of patience.) 
    Just to let you know, the majority of FCPX users follow one of two workflows:
    1) Use the Share > YouTube option and have FCP handle the uploading;
    2) Export as a Pro Res Master File; bring the master file into Compressor and apply the user's custom settings; upload using YT's uploader.
    FWIW, I prefer the second workflow.
    Russ

  • SQL 2012 Best Practice Analyzer issue with nothing available in pulldown on Microsoft Baseline Configuration Analyzer V2.0

    We have tried using both a Windows 7 and a Windows 8 machine and still cannot see any items available in the pulldown (i.e. no SQL 2012 or anything). Is this a known issue, and does the BPA not work for SQL 2012? Any suggestions? I've seen several posts with the same issue, but no resolution.
    Laura

    Hi Laura,
    I installed Microsoft Baseline Configuration Analyzer 2.0 successfully. I can select a product: SQL Server 2012 BPA. Do you mean this?
    Thanks.
    Maggie Luo
    TechNet Community Support

  • Autodiscover best practice over multiple domains

    Hi guys,
    Domain A is an SME with Exchange 2007; we recently installed 2010 and are working in mixed mode. All working well.
    A company took over my company, which looks after Domain A. A domain migration was done into Domain B; however, the mail environment was left "as is" for now in the original domain.
    So all objects are in a single domain (the parent company), but we still have the old Domain A with the users' mailboxes.
    Domain B is now about to deploy a 2010 infrastructure. Currently I use an SRV record on the split-brain internal DNS to sort out Autodiscover. I assume that when Domain B publishes its SCP into the domain that all the users sit in, my users (who still have their mail on Domain A) will pick up the SCP and get errors.
    So everyone is in domainb.com at the moment, but a set of users have their mailboxes on domaina.org, and their SMTP addresses are still domaina.org too. When domainb.com rolls out their Exchange 2010 CAS and publishes an SCP, how can I prevent my users from getting tripped up?
    How would/do you implement Autodiscover so that different mail users in a single domain get directed to their local CAS if they use the local domain's email, but get pushed to a trusted domain if they use a different email domain?
    Ideas welcome! :)
    Thanks - Steve

    Hi Steve,
    Domain A migrated into Domain B, and the legacy Domain A still exists, right?
    “We have forwarders in domainb to resolve requests to domaina, and we have a stub in domaina to point back to domainb.”
    How do you do this?
    By the way, what’s your goal? And do you want the users to log on via the new CAS server?
    Wendy Liu
    TechNet Community Support

  • Best Practice - Order Types for Consumers

    I was curious what the best practice is for entering orders within the CRM Interaction Center.   Do you have a separate order type within the CRM system and pass it to the SAP system for "Consumer" orders and then have a different order type that is defined in the ECC system for distributor/wholesale orders?
    We are trying to determine if it is best practice to have a separate order type for consumer orders or not.
    Any help on this topic would be greatly appreciated.
    Thanks,
    Darcie

    Hello Darcie,
    I received the following guidance from one of our CRM consulting practice managers at SAP:
    "My recommendation to customers is that they separate out the number ranges and order types.  Especially if they are being created via different sales channels.  The reason is two-fold:
    1. Gives greater visibility to what is created in one channel vs another channel (easy to report by order/document type and visually, someone would see an order that starts with 7 and know that it's for consumers whereas one that starts with a 4 would be for wholesale accounts).
    2. Even if the order/document types behave the exact same way today, they may want the flexibility to be able to change how a consumer order behaves without affecting the wholesale orders. They could always add the new order type later, of course, but it's easier to set that structure in place now."
    Regards,
    John

  • Return against Move order issue

    We need to get material back into inventory stock that was issued by a move order transaction.
    Example:
    10 bags of cement were issued by a move order, so my stock is now lower by 10 bags.
    Now 2 of these 10 bags are returned to stock, which should increase the stock by 2 bags.
    How can we do this in Purchasing/Inventory?

    Account alias and miscellaneous receipt transactions are the same kind of transaction. Account aliases are created in the system for ease of use by the business.
    Example: if items of a particular category should always hit a specific account code, you can define an alias name with that account tied to it.
    In this example, say cement is a category. You can then create an account alias called Cement and assign an account to it.
    Depending on whether you do an issue or a receipt, the account tied to this Cement account alias will be debited or credited respectively (see the sketch below).
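    For illustration, here is a minimal sketch of that debit/credit logic in plain Python (the account names and the helper function are hypothetical, not Oracle functionality):

    ```python
    # Sketch of the posting logic described above: on an account alias issue the
    # alias account is debited and inventory is credited; on an account alias
    # receipt it is the other way around. Account names are hypothetical.

    ALIAS_ACCOUNT = "Cement offset account"
    INVENTORY_ACCOUNT = "Inventory valuation account"

    def alias_posting(transaction_type, quantity, unit_cost):
        """Return (debit_account, credit_account, amount) for an alias transaction."""
        amount = quantity * unit_cost
        if transaction_type == "account alias issue":
            return (ALIAS_ACCOUNT, INVENTORY_ACCOUNT, amount)   # stock goes down
        if transaction_type == "account alias receipt":
            return (INVENTORY_ACCOUNT, ALIAS_ACCOUNT, amount)   # stock goes up
        raise ValueError("unknown transaction type")

    # Returning 2 bags to stock at 5.00 each: inventory is debited, the alias credited for 10.00.
    print(alias_posting("account alias receipt", 2, 5.00))
    ```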

  • Does anyone have "Best Practices" to move GRC 5.2 to its own environment?

    Hello,
    Has anyone else put the GRC applications on a shared server and then decided to move them? We have been running Access Enforcer and Compliance Calibrator on a box shared with Solution Manager. We now have permission to put them on a separate server, and we have been looking for "best practices" for making this type of change. If you have experience doing this, would you be willing to share your knowledge?
    Also, what little advice or knowledge we have found indicates "there is no way to do a J2EE system copy from a dual-stack system into a J2EE-only system." Does anyone know if this is true?
    We want to move GRC 5.2 to its own environment before we upgrade to GRC 5.3.

    Hi Renee,
    There is no best practice or methodology for moving GRC AC 5.2 easily. AC 5.3 has import/export functionality in place to move most of the data from one environment to another. I would recommend you upgrade to AC 5.3 and then move everything to the new server.
    Another way is to ask your Basis person whether he can back up and restore the DB on another server.
    As for "there is no way to do a J2EE system copy from a dual-stack system into a J2EE-only system" - does anyone know if this is true? Yes, this is true.
    Regards,
    Alpesh

  • Hyperion Essbase Best Practices Document...

    Hi All,
    Is there any document out there that covers best practices for Hyperion Essbase development? I am looking for methodologies, naming conventions and similar information, and wondering whether any Essbase gurus have a document outlining the best approaches for Essbase outline development. Searching this forum for "best practice" yields a lot of threads on specific issues, but I couldn't find such a document.
    Any pointer is most appreciated.
    Regards,
    Upendra.

    Various consulting organizations have different guidelines, each with their own strengths and weaknesses. To get them to cough it up without bringing them in for a paid project might be difficult, but not impossible.
    I agree with Doug here. Many of these consulting companies have developed their best practices over a number of years and view them as a competitive advantage over the other consulting firms. I would be highly surprised if you managed to get hold of such a document very easily.
    That being said, those same consultants share information here in bits and pieces, so you can learn at least some of the best practices here (along with best-practice tips from consultants/developers/customers who don't have an 'official' best practices guide).
    Tim Tow
    Applied OLAP, Inc

  • Joining Macs to a Windows domain - what are the best practices?

    Hi,
    I work in an MNC environment where we have been using Windows-based systems: 95% of our servers are on Windows and, as of now, 100% of our users are on Windows. We are now looking to give our management some Macs. I wanted to know the best practice to follow in order to add Macs to our existing domains and use our AD. At the same time, we have Windows-based file servers that are mapped for users via Windows scripts in their user profiles.
    Thanks & Regards,
    Aj_Mac

    1) Use section name instead of Title View to name your report. This way sections can be collapsed and user can still see report name.
    2) Enable alternate coloring in tables and pivots for easy readability and set table and pivot widths to 100% (for reports in dashboards) to reduce white space and achieve a more "professional look."
    3) Use column selectors and view selectors to reduce the width of reports and reduce the amount of columns user sees to a "practical minimum."

  • Prevent fractional Move Order Allocation

    We use "Move Order Issue" type Move orders in our system.
    We allocate Lot or serial Items, which are transacted as whole numbers. But the Move order creation/allocation API allows me to request and allocate fractional quantities for lot items. This should ideally fail allocation. SO for LOT Number XXXXX which has 10 quantites, I could potentially request .3, and get an allocation for .3 quantity. I want to prevent this from happening. Is there a setup which prevents fractional allocation.
    Thanks
    PHK

    PHK,
    Are you using the API or the form? If it is the form, you can easily achieve this using form personalization.
    With the API, you can skip calling the allocation logic if the available quantity is less than the requested quantity, or if the requested quantity is not a whole number (a sketch of this kind of check follows).
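    For illustration, a minimal sketch of that kind of pre-check in plain Python (the allocation call shown is a hypothetical stand-in for the real allocation API, and the attribute names are assumptions):

    ```python
    # Guard to run before calling the move order allocation logic, as described above.
    # Hypothetical wrapper; this is not the actual Oracle allocation API.

    def allocation_request_is_valid(requested_qty, available_qty, lot_or_serial_controlled):
        """Allow the allocation call only for sensible quantities."""
        if lot_or_serial_controlled and requested_qty != int(requested_qty):
            return False                  # reject fractional qty for lot/serial items
        if requested_qty > available_qty:
            return False                  # reject requests above the available quantity
        return True

    def allocate_if_valid(requested_qty, available_qty, lot_or_serial_controlled):
        if not allocation_request_is_valid(requested_qty, available_qty, lot_or_serial_controlled):
            raise ValueError("Allocation rejected: fractional or excess quantity")
        # call_allocation_api(...)  # hypothetical placeholder for the real allocation call

    # Lot XXXXX has 10 on hand: a request for 0.3 is rejected, a request for 3 passes.
    print(allocation_request_is_valid(0.3, 10, True))   # False
    print(allocation_request_is_valid(3, 10, True))     # True
    ```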
    Thanks
    Nagamohan

  • Home Movie Cataloging - BEST PRACTICES

    I have about 200 hours of old home movies on VHS which I am in the process of adding to my iMac. I am wondering about 'best practices' for how much video can be stored inside iMovie '08, at what point the amount of video becomes too much for the program, etc.
    In a perfect world, I'd like to simply import all of my home videos into iMovie, leave them in the 'library' section, and make 2-5 minute long clips in the 'projects' section for sharing with family members, but never deleting anything from the 'library'. Is this a good way to store original data? Would it be smarter to export all of the original video content to .DV files or something like that for space saving, etc?
    Can I use iMovie to store and catalog all of my old home movies in the same way I use iPhoto to store ALL of my photos, and iTunes to store ALL of my music/hollywood-movies, etc?

    We-ell, since no-one else has replied:
    1 hour of DV (digital video in the file system which iMovie uses) needs 13GB of hard disc space.
    You have 200 hours of video. 200 x 13 = 2,600 gigabytes. Two point six terabytes. If you put all that on one-and-a-bit 2TB hard discs, and a hard disc fails - oops! - where's your backup? ..Ah, on another one-and-a-bit 2TB hard discs ..or, preferably, spread over several hard discs, so that if one fails you haven't lost everything!
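    As a quick back-of-the-envelope check of those numbers (assuming roughly 13 GB per hour of DV, as above):

    ```python
    # Rough storage estimate for digitising 200 hours of VHS to DV.
    HOURS_OF_VIDEO = 200
    GB_PER_HOUR_DV = 13          # approximate size of one hour of DV footage

    total_gb = HOURS_OF_VIDEO * GB_PER_HOUR_DV    # 2600 GB
    total_tb = total_gb / 1000                    # 2.6 TB
    discs_2tb = -(-total_gb // 2000)              # 2 x 2 TB discs, before any backup copies
    print(total_gb, total_tb, discs_2tb)
    ```

    Double that (at least) for a backup copy, plus working space for editing.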
    iMovie - the program - can handle video stored on external discs. But are you willing to pay the price for those discs? If so; fine! Digitise all your VHS and store it on computer discs (prices come down month by month).
    Yes, you can "mix'n'match" clips between different projects, making all sorts of "mash-ups" or new videos from all the assorted video clips. But you'll need more hard disc space for the editing, too. You could use your iMac's internal hard disc for that ..or use one of the external discs for doing the editing on. That's how professionals edit: all the video "assets" on external discs, and edit onto another disc. That's what I do with my big floorstander PowerMac, or whatever those big cheesegraters were called..
    So the idea's fine, as long as you have all the external storage you'd need, plus the backup in case one of those discs fails, and all the time and patience to digitise 200 hours of VHS.
    Note that importing from VHS will import material as one long, continuous take - there'll be no automatic scene breaks between different shots - so you'll have to spend many hours chopping up the material into different clips after importing it.
    Best way to index that? Dunno; there have been several programs which supposedly do the job for you (..I can't remember their names; I've tried a few: find them by Googling..) but they've been more trouble - and taken up more disc space - than I've been prepared to bother with. I'd jot down the different clips as you create them, either by jotting in TextEdit (simplest) or in a database or spreadsheet program such as Excel or Numbers or similar ..or even in a notebook.
    Jot down the type of footage (e.g; 16th Birthday party), name of clip (e.g; 016 party), duration (e.g; 06:20 mins and seconds) and anything else you might need to identify each clip.
    Best of luck!

  • Best practices for managing movies (iPhoto, iMovie) to iPhone

    I am looking for some basic recommendations and best practices on managing the syncing of movies to my iPhone. Most of my movies come either from a digital camcorder into iMovie or from a digital camera into iPhoto.
    Issues:
    1. If I do an export or a share from iPhoto, iMovie, or QuickTime, what formats should I select? I've seen 3gp and m4v.
    2. When I add a movie to iTunes, where is it stored? I've seen some folder locations like iMovie Sharing/iTunes. Can I copy movies directly there, or should I always use Add to Library in iTunes?
    3. If I want to get a DVD I own into a format for the iPhone, how might I do that?
    Any other recommendations on best practices are welcome.
    Thanks
    mek

    1. If you type "iphone" or "ipod" into the help feature in imovie it will tell you how.
    "If you want to download and view one of your iMovie projects to your iPod or iPhone, you first need to send it to iTunes. When you send your project to iTunes, iMovie allows you to create one or more movies of different sizes, depending on the size of the original media that’s in your project. The medium-sized movie is best for viewing on your iPod or iPhone."
    2. Mine appear under "movies" which is where imovie put them automatically.
    3. If you mean movies purchased on DVD, then copying them is illegal and cannot be discussed here.
    From the terms of use of this forum:
    "Keep within the Law
    No material may be submitted that is intended to promote or commit an illegal act.
    Do not submit software or descriptions of processes that break or otherwise ‘work around’ digital rights management software or hardware. This includes conversations about ‘ripping’ DVDs or working around FairPlay software used on the iTunes Store."

  • Best practice on 'from dev to test' move.

    Hello.
    My Repository 10.2.0.1.0
    My client 10.2.1.31
    I am writing to ask what would be the best practice and most common-sense way to move OWB objects from a dev to a QA/test environment.
    I have read a number of recommendations on this forum and in other Oracle docs, and it seems a somewhat tedious exercise...
    At the moment, on some of my smaller projects, I simply copy and paste (yes, not very professional, but it works a treat!) and then just re-sync the tables to point to the correct location.
    Now I have a huge project with hundreds of maps with different source locations etc.
    I want to move it into test.
    My test environment is where we test the ETL process before implementing it in live, as opposed to UAT testing.
    I imported the tables into OWB from the test environment; now I want to move my maps from dev to test, and this is where my 'how to' question comes in.
    I have different runtime repositories on test, as per Oracle's recommendation (the same names apply to the dev, test and live repositories for consistency). Importing the maps from a dev export into test doesn't really work, and I don't really want to start tweaking export files.
    For some reason the import only imports them back into the project the export was taken from... (which is as useful as a smack on the head, in my humble opinion).
    Copy-and-paste then re-sync of all the tables would be madness, misery and pain all in one!
    So what do I need to do?
    1) OK, I imported all the tables and views from the test environment into OWB.
    2) How do I move my maps from dev to test?
    3) Even if I copy them over, would I honestly then have to re-sync the tables in every single map (I am already crying at the thought of it)?!
    It seems a little tedious to me.
    I imagine there is no silver bullet and everyone has different ideas, but could someone please share your experience of how you would do it?
    Here is something from the user guide and no matter how many times i read this - i just don't get how i can relate it to what i need to achieve.
    The following quote is from "OWB User Guide", Chapter 3.
    "Each location defined within a project can be registered separately within each Runtime Repository, and each registration can reference different physical information. Using this approach, you can design and configure a target system one time, and deploy it many times with different physical characteristics. This is useful if you need to create multiple versions of the same system such as development, test, and production."
    As I said, I have all my tables imported from the DB into OWB; now how do I make my maps appear in the repository on test? I can see the relevance of locations for deploying maps into the test runtime repository, but before that I somehow have to make the maps appear in my test runtime repository in the Design Center and make sure they are referencing the correct tables, etc.
    Any help would be greatly appreciated.
    Kind Regards
    Vix

    Hello Oleg.
    Thank you very much for such detailed and very helpful reply.
    You are correct - I have my Design Center and within it two projects - dev and test.
    Dev has all the locations pointing to the development DB and has its own runtime repository/control center configured.
    Test has all the locations pointing to the test DB and also has its own runtime repository/control center configured.
    I have one Design Center and two runtime environments.
    Both dev and test have identical tables, etc. I moved the logic over from dev to test (all the functions, procedures, etc.), and I have also imported the tables and logic from the TEST DB into the test project.
    All I want to do now is move the maps over from dev to test. That is not a problem (copy and paste is helpful), but the copied maps still point to the tables in dev, which means I have to sync them with the test tables - I hope I am making sense here!
    I was hoping there is some clever way of just changing something to effectively tell a table in the map to point to the table in this database. If the map is already configured, the only way to do it is to sync the tables, which lets you select the DB and table you want the table in the map to point to.
    The reason I do not use import/export between projects is that it is not really reliable; you then have to jump through hoops ensuring all the constraints etc. are there. It is safer to just import whatever I need from the DB, ensuring all my constraints are there.
    I do regular exports as a means of having a backup copy of the project, but I have never managed to import anything from one project into another (it was easier with OWB 9, where it was possible to amend the .mdl file). Importing back into the project the export was taken from works fine.
    I don't have problems with the locations etc. - it took me hours to set everything up the way I wanted it, and now all the deployments go to the right schemas, DBs, etc.
    Is there any other way of re-pointing the tables in a map to another DB? With flat files, for example, there is an option to choose the location of the file, so once the location is defined/registered you can choose whichever one is needed from the drop-down on the left of the map.
    I hoped there would be something similar for tables - a big bulk option along the lines of 'tick here if you want all tables in the maps to point to identical tables in another DB'. I guess something like a bulk sync option...
    Oh well, I guess I just have to stick with the sync option (sobbing uncontrollably), and it hasn't stopped raining here for days!
    Once again - thank you very much for all your kind help and advice.
    Kind Regards
    Vix

  • Best Practice for Roaming Profiles over Branch Offices

    Hello Everybody,
    I was hoping I could gain advice from experienced engineers on an issue me and my team are currently experiencing.
    The issue:
    We have a client whose main office is in London, and the company has other remote offices around the world: Paris, Milan, Luxembourg, etc. Each remote office has a local DC and file server, installed either as two separate servers or with both roles on the same server. Everything is centralised in London: all the main file shares that the company uses are based in London, and the terminal servers are based in London too.
    We have DFS-R & N set up on the London file servers to replicate the DFS shares to the remote offices, which works fine; we don't generally have any issues with this, and it works well when users in remote offices access file shares from London.
    However (and I didn't set this up), this client also has DFS-R & N set up for roaming profiles! The issue we are having here is only with the terminal servers. For example, I will log in as one of the London users on the terminal server and the profile will load fine. I will log that user off, log in as the admin and remove the profile through advanced system settings. I will then log back on as that same user and be given a temporary profile. If I repeat the first step, the profile loads fine again, so every other logon gives the user a temporary profile. I know this is the case because, for that user, if I change the profile path in AD from \\xxxx.com\public to \\lon-fs3v\profiles$\user, it then loads every time with no issues. Before anyone asks, I have rebuilt the terminal server(s) to rule out the possibility that they were misconfigured. You may ask why not do that for everyone? We can't do that for people based in Milan, otherwise their profile will forever take time to load, and that's where the DFS replication comes into play for the profiles.
    Unfortunately they are a very stubborn client, so some users (important people) have very large profiles which sometimes take a while to load.
    I have done some reading on the web and have seen the unsupported scenario from Microsoft regarding this (https://support.microsoft.com/en-gb/kb/2533009), so I am unsure of the best way to do it. The link I've put in does say you can have issues with the profiles loading (which we do) if there are too many connections; we have 10 connections replicating profiles around the remote offices.
    I have done some reading into the hosted BranchCache method, but I am not sure whether it can be used with roaming profiles or not.
    We generally want to eliminate the issue of users getting a temporary profile every other logon attempt when logging on remotely. I must add, though, that this issue never occurs on the workstations, just the terminal server. Our client is in the private equity market, so a user may spend just one day in a remote office and then come back to London to carry on as they normally would.
    So that's the background to the issue, and I was generally trying to work out what methods there are, and whether this is possible with the BranchCache method for roaming profiles?
    Thank you to anyone who replies.
    Best,
    Liam

    Hi UC3ngineer,
    Agree with Luca.
    If the branch users need to do a lot of conferencing, the best practice is to deploy a new Front End pool and an Edge Server in the branch office; otherwise you must have a 100% reliable WAN connection to your central site.
    Best regards,
    Eric
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact [email protected]
