Best Practice for Plan for Every Part (PFEP) Database/Dashboard?

Hello All-
I was wondering if anyone has experience with implementing/developing a Plan for Every Part (PFEP) database in SAP. My company is looking to migrate its existing PFEP solution (a custom-developed Excel/Access system) into SAP. If you are unfamiliar with it, a PFEP is a dashboard view of a part/material that provides various business groups with dedicated views of data from Material Masters, Info Records, and Vendor Master Records and combines it with historical/forecasting information. The goal is to provide a single source for all the part/material settings for a given part.
Is there a Best Practice PFEP in SAP, or is this something that most companies custom develop in ERP or BI?
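By way of illustration only, here is roughly the kind of consolidation such a PFEP view performs, sketched in Python with placeholder file and column names (these are not SAP table definitions; a real implementation would read from SAP tables such as MARA/MARC, EINA/EINE and LFA1, or from a BW layer):

    import pandas as pd

    # Placeholder extracts; a real PFEP would pull these from SAP or BW,
    # not from CSV files. All file and column names here are assumptions.
    materials = pd.read_csv("material_master.csv")   # part_number, description, base_uom, mrp_type
    info_recs = pd.read_csv("info_records.csv")      # part_number, vendor, price, lead_time_days
    vendors   = pd.read_csv("vendor_master.csv")     # vendor, vendor_name, payment_terms
    history   = pd.read_csv("usage_history.csv")     # part_number, period, quantity

    # Average usage per period, to sit alongside the master-data settings.
    usage = (history.groupby("part_number", as_index=False)["quantity"]
                    .mean()
                    .rename(columns={"quantity": "avg_period_usage"}))

    # One row per part: master settings joined with sourcing data and usage,
    # which is essentially the "single source" view a PFEP dashboard presents.
    pfep = (materials
            .merge(info_recs, on="part_number", how="left")
            .merge(vendors, on="vendor", how="left")
            .merge(usage, on="part_number", how="left"))

    pfep.to_csv("pfep_view.csv", index=False)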
Thanks in advance.
-Ron

I think you will likely get a response in SAP ERP - Logistics Materials Management (SAP MM).
Additionally, you might want to do some searches on SAP Lean Inventory, perhaps Kanban. I am assuming you are not using WM or EWM either?
Where I have seen PFEP incorporated into the supply chain strategy, it typically requires not inconsiderable additions to the alternate units of measure in MM, dropping automatic replenishment levels (reorder points), and rethinking aspects of the MRP plan, so be prepared for significant additional data management work if you haven't already started on that. I believe Ryder Logistics uses PFEP and their SAP infrastructure is managed by IBM; it might be an idea to try to find a LinkedIn resource from there. You may also find one of the ASUG supply chain, logistics, MM, or WM SIGs a good place to ask questions and look for answers.

Similar Messages

  • Best Practice for Removing Zeroes from Database

    Does anyone have some clever bits of code or best practices for evaluating a database for instances of zeroes? I'm working on cleaning up our rules file and am thinking the best way to start would be to write some code to look for zeroes and write them to a log file. This would at least indicate whether there is even a problem with zeroes (which there may or may not be).
    Any suggestions out there / utilities / code samples?
    Thanks.

    We accomplished this using data extracts from a subset of scenarios/years/entities/accounts so that all of our potential rules could be checked to ensure they were not writing zeroes. This worked pretty well for our purposes; a text editor called EmEditor supports VB macros, and we could write a quick macro to check for strings ending in "; 0". You may also want to review the "calculated" checkbox in your extract and see if the zeroes are a result of calculations. A rule output could work pretty well too, although it would take some defining, as you would have to write it out in a sub and make sure you capture the data of all subroutines, whether your zeroes are rule driven or actual inputs. You may also want to check whether very small, insignificant values are getting written; we have seen items with a value 13 places to the right of the decimal that were not really significant.
    JTF
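    For what it's worth, a minimal sketch of that kind of scan outside a text editor, assuming the extracts are plain-text files in which a line that wrote a zero ends in "; 0" (the extracts/*.txt location and the log file name are assumptions):

        import glob

        # Scan every extract file for lines ending in "; 0" and log the hits,
        # to judge whether zeroes are actually a problem before touching the rules.
        with open("zero_hits.log", "w") as log:
            for path in glob.glob("extracts/*.txt"):       # assumed extract location
                with open(path) as f:
                    for lineno, line in enumerate(f, start=1):
                        if line.rstrip().endswith("; 0"):
                            log.write(f"{path}:{lineno}: {line.rstrip()}\n")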

  • Looking For Guidance: Best Practices for Source Control of Database Assets

    Database Version: 11.2.0.3
    OS: RHEL 6.2
    Source Control: subversion
    This is a general question aimed at database professionals; it is not specific to any Oracle version. It's a leadership question for other Oracle shops regarding source control.
    The current trunk in my client's source control is the implementation of a previous employee who used ER Studio. After walking the batch scripts and subordinate files, it was determined that there would be no formal or elegant way to recreate the current version of the database from our source control - the engineers who contributed to these assets are no longer employed or available for consulting. The batch scripts are stale, if you will.
    To clean this up and to leverage best practices, I need some guidance on whether or not to baseline the current repository and on how to move forward with additions of assets: tables, procs, pkgs, etc. I'm really interested in how larger Oracle shops organize their repository - what directories do you use, how are they labeled... are they labeled with respect to version?
    Assumptions:
    1. repository (database assets only) needs to be baselined (?)
    2. I have approval to change this database directory under the trunk to support best practices and get the client steered straight in terms of recovery and
    Knowns:
    1. the current application version in the database is 5.11.0 (that's my client's application version)
    2. this is for one schema/user of a database (other schemas under the database belong to different trunks)
    This is the layout that we currently have; for the privacy of the client I've made this rather generic. I'd love to have a fresh start... how do I go about doing that? Initially, I like using SQL Developer's ability to create SQL scripts from a connected target.
    product_name
      |_trunk
         |_database
           |_config
           |_data
           |_database
           |_integration
           |_patch
           |   |_5.2A.2
           |   |_5.2A.4
           |   |_5.3.0
           |   |_5.3.1
           |
           |_scripts
           |   |_config
           |   |_logs
           |
           |_server
    Thank you in advance.
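    One way to get a fresh baseline is to script the DDL extraction rather than use SQL Developer's GUI. A rough sketch under stated assumptions (the connection details, schema, and directory layout are placeholders; it uses the cx_Oracle driver and DBMS_METADATA, which are generic Oracle tools, not anything from your existing trunk):

        import os
        import cx_Oracle  # assumes the Oracle client libraries and cx_Oracle are installed

        # Map object types to repository directories (layout is only an example).
        DIRS = {"TABLE": "tables", "VIEW": "views", "PROCEDURE": "procs",
                "FUNCTION": "funcs", "PACKAGE": "pkgs", "TRIGGER": "triggers"}

        conn = cx_Oracle.connect("app_owner", "password", "dbhost/ORCL")  # placeholder credentials
        cur = conn.cursor()

        # Write one .sql file per object owned by the schema; committing these
        # files gives a clean, reproducible baseline for the trunk.
        for obj_type, subdir in DIRS.items():
            os.makedirs(os.path.join("database", subdir), exist_ok=True)
            cur.execute("SELECT object_name FROM user_objects WHERE object_type = :t",
                        t=obj_type)
            for (name,) in cur.fetchall():
                ddl = cur.callfunc("DBMS_METADATA.GET_DDL", cx_Oracle.CLOB, [obj_type, name])
                with open(os.path.join("database", subdir, name + ".sql"), "w") as f:
                    f.write(ddl.read())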

    Hi, we are using Data ONTAP 8.2.3P3 on our FAS8020 in 7-Mode and we have two aggregates, a SATA and a SAS aggregate. I want to decommission the SATA aggregate because I want to move that tray to another site. If I have a FlexVol containing 3 qtree CIFS shares, can I use Data Motion (vol copy) to move the FlexVol to a different aggregate on the same controller without major downtime? I know this article is old and it says that CIFS is not supported; however, I am reading mixed messages that the version of Data ONTAP we are now on does support CIFS with Data Motion, though there will be a small outage while the CIFS shares terminate. Is this correct? Thanks

  • Best Practices for zVM/SLES10/zDB2 environment for dialog instances.

    Hi, I am a zSeries system programmer who has just completed an IBM-led proof of concept which demonstrated the viability of running SAP instances on SUSE SLES10 Linux booted in zVM guests and accessing zDB2 data via HiperSockets. Before we build a Linux infrastructure using the 62 IFLs we just procured, we are wondering whether any best practices for this environment have been developed as an OSS note or something else by SAP. Below you will find an email which was sent to and responded to by IBM and Novell on these topics...
    "As you may know, Home Depot has embarked on an IBM led proof of concept using SUSE SLES10 running in zVM guests on IBM zSeries hardware to host SAP server instances.  The Home Depot IT organization is currently in the midst of a large scale push to modernize our merchandising and people systems on SAP platforms.  The zVM/SUSE/SAP POC is part of that effort, as is a parallel POC of an Intel Blade/Red Hat/SAP platform.  For our production financial systems we now use a pSeries/AIX/SAP platform.
          So far in the zVM/SUSE/SAP POC, we have been able to create four zVM LPARS on IBM z9 hardware, create twelve zVM guests on those LPARS, boot SLES10 in those guests, install and run SAP instances in those guests using hipersockets for access to our DB2 SAP databases running on zOS, and direct user workloads to the SAP instances with good results.  We have also successfully developed cloning scripts that have made it possible to create new SLES10 instances, configured and ready for SAP installs, in about 10 seconds using FLASHCOPY and IBM DASD.
          I am writing in the hope that you can direct us to technical resources at IBM/Novell/SAP who may be able to field a few questions that have arisen.  In our discussions about optimization of the zVM/SUSE/SAP platform, we wondered if any wisdom about the appropriateness of and support for using zVM capabilities to virtualize SAP has ever been developed or any best practices drafted.  Attached you will find an IBM Redbook and a PowerPoint presentation which describes the use of the zVM discontiguous shared segments and the zVM named saved system features for the sharing of reentrant code and other  elements of Linux and its applications, thereby conserving storage and disk resources allocated to guest machines.   The specific question of the hour is, can any SAP code be handled similarly?  Have specific SAP elements eligible for this treatment been identified? 
          I've searched the SUSE Knowledgebase for articles on this topic to no avail.  Any similar techniques that might help us reduce the total cost of ownership of a zVM/SUSE/SAP platform as we compare it to Intel Blade/Red Hat/SAP and pSeries/AIX/SAP platforms are of great interest as we approach the end of our POC.  Can you help?
          Greg McKelvey is a Client I/T Architect at IBM.  He found the attached IBM documents and could give a fuller account of our POC.  Pat Downs, IBM zSeries IT Architect, has also worked to guide our POC. Akshay Rao, IBM Systems IT Specialist - Linux | Virtualization | SOA, is acting as project manager for the POC.  Jim Hawkins is the Home Depot Architect directing the POC.  I've CC:ed their email addresses.  I am sure they would be pleased to hear from you if there are questions about what I am asking here.  And while writing, I thought of yet another question that I am hoping somebody at SAP might weigh in on: are there any performance or operational benefits to using Linux LVM to apportion disk to filesystems vs. using zVM to create appropriately sized minidisks for filesystems without LVM getting involved?"
    As you can see, implementation questions need to be resolved.  We have heard from Novell that the SLES10 Kernel and other SUSE artifacts can reside in memory and be shared by multiple operating system images.  Does SAP support this configuration?  Also, has SAP identified SAP components which are eligible for similar treatment?  We would like to make sure that any decisions we make about the SAP platforms we are building will be supportable.  Any help you can provide will be greatly appreciated.  I will supply the documents referenced above if they are not known to any answerer.  Thanks,  Al Brasher 770-433-8211 x11895 [email protected]

    Hello Al,
    First, let me welcome you on board. I am sure you won't be disappointed with your choice to run SAP on z/OS.
    As for your questions, it wasn't easy to find them in this long post, so I suggest you take the time to write a short summary containing a very short list of questions.
    As for answers, here are a few useful sources of information:
    1. The SAP on DB2 for z/OS SDN page:
    SAP on DB2 for z/OS
    In it you can find two relevant docs:
    a. best practices for ...
    b. database administration for DB2 UDB for z/OS.
    This second publication is excellent; apart from DB2-specific info, it contains information on all the components of SAP on DB2 for z/OS, like zLinux, z/VM and so on.
    2. I can see that you are already familiar with the IBM Redbooks, but it seems that you haven't taken the time to get the most out of that resource. From your post it is clear that you have found one useful publication, but I know there are several.
    3. A few months ago I wrote a short post on a similar subject. I'm sure it's not exactly what you are looking for at this moment, but it's a good start, and with some patience you may be able to get some answers. Here's a link:
    http://blogs.ittoolbox.com/sap/db2/archives/index-of-free-documentation-on-sap-db2-administration-14245
    Good luck,
    Omer Brandis

  • Best practice for the test environment & DBA plan activities documents

    Dear all,
    In our company, we did sizing for the hardware.
    We have three environments (Test/Development, Training, Production).
    But the test environment has fewer servers than the production environment.
    My question is:
    What is the best practice for setting up the test environment?
    (Are there any recommendations from Oracle related to this, or any PDF files that could help me?)
    Also, can I have a detailed document regarding the DBA plan activities?
    I appreciate your help and advice.
    Thanks
    Edited by: user4520487 on Mar 3, 2009 11:08 PM

    Follow your build document for the same steps you used to build production.
    You should know where all your code is. You can use the deployment manager to export your configurations. Export customized files from MDS. Just follow the process again, and you will have a clean instance not containing production data.
    It only takes a lot of time if your client is lacking documentation or if you're not familiar with all the parts of the environment. What's 2-3 hours compared to all the issues you will run into if you copy databases or import/export schemas?
    -Kevin
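    As a rough illustration of the "export customized files from MDS" step, here is a WLST (Jython) sketch, assuming a Fusion Middleware installation where the MDS commands are available (the application name, server name, export path, and credentials are placeholders):

        # Run with the Fusion Middleware WLST, e.g. $ORACLE_HOME/common/bin/wlst.sh
        connect('weblogic', 'password', 't3://adminhost:7001')   # placeholder Admin Server URL

        # Export the customization documents stored in MDS for one application so
        # they can be re-imported into a freshly built test instance.
        exportMetadata(application='MyApplication',     # assumed deployed application name
                       server='soa_server1',            # assumed managed server
                       toLocation='/tmp/mds_export',    # where the exported files land
                       docs='/**')                      # everything under the MDS root

        disconnect()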

  • Best practice for Plan and actual data

    Hello, what is the best practice for plan and actual data? Should they both be in the same app or in different apps?
    Thanks.

    Hi Zack,
    It will be easier for you to maintain the data in a single application. Every application must have the category dimension, so you can use this dimension to keep the actual and plan data separate (see the sketch after this reply).
    Hope this helps.
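    As a language-neutral illustration of what using the category dimension means in practice (the member names and figures below are made-up examples, not from any specific product):

        import pandas as pd

        # One fact table keyed by a Category dimension: the same application holds
        # both data sets, and reports simply filter or compare on Category.
        data = pd.DataFrame([
            {"entity": "US01", "account": "Sales", "category": "Actual", "value": 1200},
            {"entity": "US01", "account": "Sales", "category": "Plan",   "value": 1000},
        ])

        # Variance report: Plan vs. Actual side by side from the single application.
        report = data.pivot_table(index=["entity", "account"],
                                  columns="category", values="value")
        report["variance"] = report["Actual"] - report["Plan"]
        print(report)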

  • What is best practice for dealing with Engineering Spare Parts?

    Hello All,
    I am after some advice regarding the process for handling engineering spare parts in PM. (We run ECC 5)
    Our current process is as follows:
    All materials are set up as HIBEs
    Each material is batch managed
    The Batch field is used for the Bin location
    We are now looking to roll out PM to a site that has in excess of 50,000 spare parts and want to make sure we use best practice for handling the spare parts. We are now considering using a basic WM setup to handle the movement of parts.
    Please can you provide me with some feedback on what you feel the best practice is for dealing with these parts?
    We are looking to set up a solution that will allow us to generate pick lists etc. and implement a scanning solution to move parts in and out of stores.
    Regards
    Chris

    Hi,
    I hope all the 50000 spare parts are maintained as stock items.
    1. Based on the usage of those spare parts, try to define a safety stock and set the MRP type to reorder point planning; this way you can avoid petty-cash purchases (see the sketch after this reply for the basic reorder-point arithmetic).
    2. By keeping the spare parts (at least the critical components) in stock, planned maintenance as well as unplanned maintenance will not get delayed.
    3. By doing goods issue against a reservation, the quantity can be tracked against the order and the equipment.
    As this question is MM and WM related, those forums can give better clarity on this.
    Regards,
    Maheswaran.
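    A minimal sketch of the reorder-point arithmetic behind point 1, assuming average daily usage, replenishment lead time, and a safety stock are known (all figures are made-up examples, not recommendations):

        # Reorder point = expected demand over the replenishment lead time + safety stock.
        avg_daily_usage = 4      # units consumed per day (example figure)
        lead_time_days = 10      # days from requisition to goods receipt (example figure)
        safety_stock = 15        # buffer against demand/lead-time variability (example figure)

        reorder_point = avg_daily_usage * lead_time_days + safety_stock
        print(f"Set the reorder point to {reorder_point} units")   # -> 55 units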

  • Best Practice for Planning and BI

    What's the best practice for Planning and BI infrastructure - set up combined on one box or separate? What are the factors to consider?
    Thanks in advance..

    There is no way that question could be answered with the information that has been provided.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Best Practice for Using Static Data in PDPs or Project Plan

    Hi There,
    I want to make custom reports using PDPs & Project Plan data.
    What is the Best Practice for using "Static/Random Data" (which is not available in MS Project 2013 columns) in PDPs & MS Project 2013?
    Should I add that data in Custom Field (in MS Project 2013) or make PDPs?
    Thanks,
    EPM Consultant
    Noman Sohail

    Hi Dale,
    I have a project-level custom field "Supervisor Name" that is used for Project Information.
    For the purpose of viewing that project-level custom field data in project views, I have made a task-level custom field "SupName" and used the formula:
    [SupName] = [Supervisor Name]
    That shows Supervisor Name in Schedule.aspx.
    ============
    Question: I want that project-level custom field "Supervisor Name" in My Work views (Tasks.aspx).
    The field is enabled in Tasks.aspx, but the data is not present / the column is blank.
    How can I get the data into My Work views?
    Noman Sohail

  • What is the best practice for APO - Demand planning implementation

    Hi,
    My client wants to implement demand planning.
    The client has come up with one scenario: a new customer is created in ECC, and if I use the BI-then-APO flow for demand planning, the user will have to wait another day (as BI always has a one-day delay).
    For this scenario the user is insisting on a direct ECC to APO-DP interface.
    Can anybody suggest what the best practice for demand planning should be:
    ECC -> Standalone BI -> Planning area (Planning is done in APO) -> Stand alone BI
    Or ECC -> APO-DP (Planning is done in APO) -> Standalone BI system
    I hope I am able to explain my scenario.
    Regards,
    Saurabh

    Any suggestions !!

  • Best practice for implementing Manufacturing Cost Planning ( MCP)

    Is there any best practice for implementing Manufacturing Cost Planning (MCP) using BI-IP?

    Hi:
            Both options are viable. If you reverse the posting in FB50, the FI G/L account postings will be reversed along with the cost center postings. The advantage here is that the cost center reversal will reference the original document with which the wrong posting was made. The disadvantage is that you will have to post the entry again in FB50. In KB11N you simply transfer the amount from the wrong cost center to the new one that should be in its place, but here you will have no reference. I personally think reversing the posting through FB50 is the more viable option; the reverse postings can also be seen in KSB1 against that cost center.
    Regards

  • Best practice: Deployment plan for cluster environment

    Hi All,
    I want to know which way is the best practice for preparing and deploying a new configuration in a WLS cluster environment. How can I plan a simultaneous deployment to ALL nodes, without a single point of failure?
    Regards,
    Moh

    Hi All,
    I got the answer as follows:
    When you deploy an application OR redeploy an application, the deployment is initiated from the Admin Server, and it is initiated on all targets (managed servers in the cluster) at the same time, based on the targets (which are expected to be the cluster).
    We recommend that applications should be targeted to a cluster instead of individual servers whenever a cluster configuration is available.
    So, as long as you target the application to the cluster, the Admin Server will initiate the deployment on all the servers in the cluster at the same time, so the application is in sync on all servers.
    Hope that answers your queries. If not, please let me know what exactly you mean by synchronization.
    Regards,
    Moh
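    For illustration, a minimal WLST (Jython) sketch of targeting an application at the cluster so the Admin Server pushes it to every member (the admin URL, credentials, application name, archive path, and cluster name are placeholders, not values from this thread):

        # Run with: java weblogic.WLST deploy_to_cluster.py
        connect('weblogic', 'password', 't3://adminhost:7001')   # placeholder Admin Server URL

        # Targeting the cluster (not individual servers) lets the Admin Server
        # initiate the deployment on every managed server in the cluster at once.
        deploy(appName='myApp',
               path='/opt/builds/myApp.ear',   # placeholder path to the application archive
               targets='myCluster')            # the cluster name, not a server list

        disconnect()
        exit()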

  • Best practices for loading apo planning book data to cube for reporting

    Hi,
    I would like to know whether there are any Best practices for loading apo planning book data to cube for reporting.
    I have seen 2 types of Design:
    1) The Planning Book Extractor data is loaded first to a cube within the APO BW system, and then gets transferred to the Actual BW system. Reports are run from the Actual BW system cube.
    2) The Planning Book Extractor data is loaded directly to a cube within the Actual BW system.
    We do these data loads during evening hours, once a day.
    Rgds
    Gk

    Hi GK,
    What I have normally seen is:
    1) Data would be extracted from APO Planning Area to APO Cube (FOR BACKUP purpose). Weekly or monthly, depending on how much data change you expect, or how critical it is for business. Backups are mostly monthly for DP.
    2) Data extracted from APO planning area directly to DSO of staging layer in BW, and then to BW cubes, for reporting.
    For DP monthly, SNP daily
    You can also use the option 1 that you mentioned below. In this case, the APO cube is the backup cube, while the BW cube is the one that you could use for reporting, and this BW cube gets data from APO cube.
    The benefit in this case is that we have to extract data from the planning area only once, so the planning area is available to jobs/users for more time. However, the backup and reporting extractions are mixed in this case, so issues in the flow could impact both the backup and the reporting. We have used this scenario recently and have yet to see the full impact.
    Thanks - Pawan

  • Best practices for setting up projects

    We recently adopted using Captivate for our WBT modules.
    As a former Flash and Director user, I can say it’s
    fast and does some great things. Doesn’t play so nice with
    others on different occasions, but I’m learning. This forum
    has been a great source for search and read on specific topics.
    I’m trying to understand best practices for using this
    product. We’ve had some problems with file size and
    incorporating audio and video into our projects. Fortunately, the
    forum has helped a lot with that. What I haven’t found a lot
    of information on is good or better ways to set up individual
    files, use multiple files and publish projects. We’ve decided
    to go the route of putting standalones on our Intranet. My gut says
    yuck, but for our situation I have yet to find a better way.
    My question for discussion, then, is: what are some best
    practices for setting up individual files, using multiple files and
    publishing projects? Any references or input on this would be
    appreciated.

    Hi,
    Here are some of my suggestions:
    1) Set up a style guide for all your standard slides. Eg.
    Title slide, Index slide, chapter slide, end slide, screen capture,
    non-screen capture, quizzes etc. This makes life a lot easier.
    2) Create your own buttons and captions. The standard ones
    are pretty ordinary, and it's hard to get a slick looking style
    happening with the standard captions. They are pretty easy to
    create (search for add print button to learn how to create
    buttons). There should be instructions on how to customise captions
    somewhere on this forum. Customising means that you can also use
    words, symbols, colours unique to your organisation.
    3) Google elearning providers. Most use captivate and will
    allow you to open samples or temporarily view selected modules.
    This will give you great insight on what not to do and some good
    ideas on what works well.
    4) Timings: Using the above research, I got others to
    complete the sample modules to get a feel for timings. The results
    were clear, 10 mins good, 15 mins okay, 20 mins kind of okay, 30
    mins bad, bad, bad. It's truly better to have a learner complete
    2-3 short modules in 30 mins than one big monster. The other
    benefit is that shorter files equal smaller size.
    5) Narration: It's best to narrate each slide individually
    (particularly for screen capture slides). You are more likely to
    get it right on the first take, it's easier to edit and you don't
    have to re-record the whole thing if you need to update it in
    future. To get a slicker effect, use at least two voices: one male,
    one female and use slightly different accents.
    6) Screen capture slides: If you are recording filling out
    long window-based database pages where the compulsory fields are
    marked (eg. with a red asterisk) - you don't need to show how to
    fill out every field. It's much easier for the learner (and you) to
    show how to fill out the first few fields, then fade the screen
    capture out, fade the end of the form in with the instructions on
    what to do next. This will reduce your file size. In one of my
    forms, this meant the removal of about 18 slides!
    7) Auto captions: they are verbose (eg. 'Click on Print
    Button' instead of 'Click Print'; 'Select the Print Preview item'
    instead of 'Select Print Preview'). You have to edit them.
    8) PC training syntax: Buttons and hyperlinks should normally
    be 'click'; selections from drop down boxes or file lists are
    normally 'select': Captivate sometimes mixes them up. Instructions
    should always be written in the correct order: eg. Good: Click
    'File', Select 'Print Preview'; Bad: Select 'Print Preview' from
    the 'File Menu'. Button names, hyperlinks, selections are normally
    written in bold
    9) Instruction syntax: should always be written in an active
    voice: eg. 'Click Options to open the printer menu' instead of
    'When the Options button is clicked on, the printer menu will open'
    10) Break all modules into chapters. Frame each chapter with
    a chapter slide. It's also a good idea to show the Index page
    before each chapter slide with a progress indicator (I use an
    animated arrow to flash next to the name of the next chapter), I
    use a start button rather than a 'next' button for the start of each
    chapter. You should always have a module overview with the purpose
    of the course and a summary slide which states what was covered and
    they have completed the module.
    11) Put a transparent click button somewhere on each slide.
    Set the properties of the click box to take the learner back to the
    start of the current chapter by pressing F2. This allows them to
    jump back to the start of their chapter at any time. You can also
    do a similar thing on the index pages which jumps them to another
    chapter.
    12) Recording video capture: best to do it at normal speed
    and be conscious of where your mouse is. Minimise your clicks. Most
    people (until they start working with captivate) are sloppy with
    their mouse and you end up with lots of unnecessary slides that
    you have to delete out. The speed will default to how you recorded
    it and this will reduce the amount of time you spend on changing
    timings.
    13) Captions: My rule of thumb is minimum of 4 seconds - and
    longer depending on the amount of words. Eg. Click 'Print Preview'
    is 4 seconds, a paragraph is longer. If you are creating knowledge
    based modules, make the timing long (eg. 2-3 minutes) and put in a
    next button so that the learner can click when they are ready.
    Also, narration means the slides will normally be slightly longer.
    14) Be creative: Capitvate is desk bound. There are some
    learners that just don't respond no matter how interactive
    Captivate can be. Incorporate non-captivate and desk free
    activities. Eg. As part of our OHS module, there is an activity
    where the learner has to print off the floor plan, and then wander
    around the floor marking on the map key items such as: fire exits;
    first aid kit, broom and mop cupboard, stationery cupboard, etc.
    Good luck!

  • Best practice for managing a Windows 7 deployment with both 32-bit and 64-bit?

    What is the best practice for creating and organizing deployment shares in MDT for a Windows 7 deployment that has mostly 32-bit computers, but a few 64-bit computers as well? Is it better to create a single deployment share for Windows 7 and include both
    versions, or is it better to create two separate deployment shares? And what about 32-bit and 64-bit versions of applications?
    I'm currently leaning towards creating two separate deployment shares, just so that I don't have to keep typing (x86) and (x64) for every application I import, as well as making it easier when choosing applications in the Lite Touch installation. But I know
    each deployment share has the option to create both an x86 and x64 boot image, so that's why I am confused. 

    Supporting two task sequences is way easier than supporting two shares. Two shares means two boot media, or maintaining a method of directing the user to one or the other. Everything needs to be imported or configured twice. Not to mention doubling storage
    space. MDT is designed to have multiple task sequences, why wouldn't you use them?
    Supporting multiple task sequences can be a pain, but not bad once you get a system. Supporting app installs intelligently is a large part of that. We have one folder per app install, with a wrapper vbscript that handles OS detection. If there are separate
    binaries, they are placed in x86 and x64 subfolders. Everything runs from one folder via the same command, "cscript install.vbs" (see the sketch after this reply for the detection logic). So, import once, assign once, and forget it. It's the same install package we use for Altiris, and we'll be using a PowerShell
    version of it when we fully migrate to SCCM.
    Others handle x86 and x64 apps separately, and use the MDT app details to select what platform the app is meant for. I've done that, but we have a template for the VBScript wrapper and it's a standard process, so I believe it's easier. YMMV.
    Once you get your apps into MDT, create bundles. Core build bundle, core deploy bundle, Laptop deploy bundle, etcetera. Now you don't have to assign twenty apps to both task sequences, just one bundle. When you replace one app in the bundle, all TS'es are
    updated automatically. It's kind of the same mentality as Active Directory. Users, groups and resources = apps, bundles and task sequences.
    If you have separate build and deploy shares in your lab, great. If not, separate your apps into build and deploy folders in your lab MDT share. Use a selection profile to upload only your deploy side to production. In fact I separate everything (except
    drivers) into Build and deploy folders on my lab server. Don't mix build and deploy, and don't mix Lab/QA and production. I also keep a "Retired" folder. When I replace an app, TS, OS, etcetera, I move it to the retired folder and append "RETIRED - " to the
    front of it  so I can instantly spot it if it happens to show up somewhere it shouldn't.
    To me, the biggest "weakness" of MDT is its flexibility. There are literally a dozen different ways to do everything, and there are no fences to keep you on the path. If you don't create some sort of organization for yourself, it's very easy to get lost as things
    get complicated. Tossing everything into one giant bucket will have you pulling your hair out.
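    The wrapper described above is VBScript; purely to illustrate the same detection logic (not the poster's actual script), here is a Python sketch that picks the x86 or x64 subfolder from the Windows environment and runs the installer found there (the folder layout, installer name, and silent switch are assumptions):

        import os
        import subprocess

        # On 64-bit Windows a 32-bit process sees PROCESSOR_ARCHITEW6432; otherwise
        # PROCESSOR_ARCHITECTURE reports the native architecture directly.
        arch = os.environ.get("PROCESSOR_ARCHITEW6432",
                              os.environ.get("PROCESSOR_ARCHITECTURE", "x86"))
        subfolder = "x64" if arch.upper() in ("AMD64", "IA64") else "x86"

        # One folder per application, with platform-specific binaries in x86/x64
        # subfolders; the wrapper always runs from the application's root folder.
        app_root = os.path.dirname(os.path.abspath(__file__))
        installer = os.path.join(app_root, subfolder, "setup.exe")   # assumed installer name

        # Run the installer silently and hand its exit code back to MDT.
        result = subprocess.run([installer, "/quiet"])               # assumed silent switch
        raise SystemExit(result.returncode)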
