Best practice to Maintain Folders in Portal

Hello Everyone,
Our portal folders are a bit messed up, so we are planning to rearrange all of them properly. Before we do that, I thought I would ask the experts for advice on how folders copied or delta-linked from the SAP standard ESS folders should be maintained.
It would be very much appreciated if anyone could advise us on the best way of copying or delta-linking the SAP standard ESS folders.
Regards,
Gopal.

Hi Gopal,
I'm not sure I got your question completely: whether you want to know if copying or delta-linking is the better approach, or whether you want more details on how to arrange the folders. The latter I couldn't answer.
For the first question: as long as you don't have very many roles and you don't maintain long chains of delta links, delta links are of course better from the point of view of administration and of maintaining things centrally.
Long delta-link chains can be a problem from a performance point of view. So if you have many roles, big navigation structures within the roles, and the targets are delta links pointing to delta links pointing to ... pointing to the original iViews (or whatever), this can become a performance problem.
Anyhow, that last case points to a problematic structure in how you maintain your PCD content, and as you are just tidying up, I expect that using delta links, with their advantage in maintainability, is the best way to go.
Hope it helps
Detlev

Similar Messages

  • Best practice to maintain code across different environments

    Hi All,
    We have a portal application and we use
    JDEV version: 11.1.1.6
    Fusion Middleware Control 11.1.1.6
    In our application we have created many portlets by using an iframe inside our jspx files, and a few are in the navigation file as well. The URLs corresponding to these portlets are different
    across the environments (dev, test and prod). We are using Subversion to maintain our code.
    The problem we are having is: apart from changing environment details while deploying to test and prod, we also have to change the portlet URLs manually from the dev URLs to those of the corresponding environment.
    So is there any best practice to avoid this cumbersome task? Can we achieve this by creating a deployment profile?
    Thanks
    Kotresh

    Hi.
    Please post a sample of two of the differing URLs. In any case, you can use an EL expression to get the current host instead of hardcoding it; see the sketch below. In addition, you could consider using a common DNS alias, mapped in the hosts file, for all environments.
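    To make the EL/current-host idea concrete, here is a minimal sketch of a managed bean that rebuilds the environment-specific part of the URL from the incoming request; the bean name, package and portlet path are assumptions for illustration, not part of the original application:

        package portal.example;

        import javax.faces.context.FacesContext;
        import javax.servlet.http.HttpServletRequest;

        // Hypothetical managed bean: derives scheme, host and port from the
        // current request so the same jspx / navigation entry works unchanged
        // in dev, test and prod.
        public class PortletUrlBean {

            // Referenced from a jspx, e.g. source="#{portletUrlBean.reportsPortletUrl}"
            public String getReportsPortletUrl() {
                HttpServletRequest req = (HttpServletRequest) FacesContext
                        .getCurrentInstance().getExternalContext().getRequest();

                String base = req.getScheme() + "://" + req.getServerName()
                        + ":" + req.getServerPort();

                // Only the environment-independent path stays in source control.
                return base + "/reports/faces/summaryPortlet.jspx";   // assumed path
            }
        }

    The iframe source would then point at the EL expression rather than a literal dev URL, so nothing needs to be edited when deploying to test or prod.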
    Regards.

  • Best Practice Regarding Maintaining Business Views/List of Values

    Hello all,
    I'm still in the process of learning to use BOXI to run our Crystal Reports. I was never familiar with the BO environment before, but I have recently learned that for every dynamic parameter we create for a report, the Business View, Data Connector and LOV objects are created in the Enterprise Repository the moment the Crystal Report is uploaded.
    All of our reports are authored from a SQL Command statement, and oftentimes different reports will use the same field name from the database. For example, we have several reports that use the field name "LOCATION", which exists in a good number of tables in the database.
    When looking at the Repository, I've noticed there are several variations of LOCATION, each of which I'm assuming belongs to one specific report. Having said that, I can see it starting to become a nightmare to figure out which variation of LOCATION belongs to which report. Sooner or later the Repository will need to be maintained a bit more cleanly, and with the rate at which we author reports, I foresee a huge amount of headache down the road.
    With that being said, what's the best practice, in a nutshell, for maintaining these repository items? Is it done indirectly on the Crystal Report authoring side, where you name your parameter field so it is identifiable to a specific report? Or is it done directly on the Repository side?
    Thank you.

    Eric, you'll get a faster, qualified response if you post to the Business Objects Enterprise Administration forum, as that forum is monitored by qualified support for BOE.

  • Best Practice in maintaining multiple apps and user logins

    Hi,
    My company is just starting to use APEX, and none of us (the developers) has worked with it before. It would be greatly appreciated if we could get some help here.
    We have developed quite a few applications in the same workspace. Now we are going to set up UAT and PRD environments and are also trying to understand the best practice for maintaining multiple apps and user logins.
    Many of you have already worked in an APEX environment for some time; can you please provide some input?
    Should we create multiple apps (projects) for one department, or should we create one app per department?
    Currently we have created multiple apps for one department, but we are not sure if a user can log in once and access all of the authenticated apps.
    Thank you,
    LC

    LC,
    I am not sure how much of this applies to your situation - but I will share what I have done.
    I built a single 700+ page application for my department - other areas create separate smaller applications.
    The approach I chose is flexible enough to accommodate both.
    I built a separate access control application (Control) in its own schema.
    We use database authentication for this app; an Oracle account is required.
    We prefer to use LDAP for authentication for the user applications.
    For users for whom LDAP is not an option, an encrypted password is stored and reset via email.
    We use position-based security; privileges are based on job functions.
    We have applications, applications have roles, and roles have access to components (tabs, buttons, unmasked card numbers, etc.).
    We have positions that are granted application roles - they inherit access to the role components.
    Users have a name, a login, a position, and a site.
    We have users on both the East Coast and the West Coast; we use the site in a sys_context
    and views to emulate VPD. We also use the role components, sys_contexts and views to mask/unmask
    card numbers without rewriting the dependent objects (queries, reports, views, etc.).
    The position-based security has worked well: when someone moves,
    we change the position they are assigned to and they immediately have the privileges they need.
    If you are interested I can provide more detail.
    Bill

  • "best practice to maintain the SAP OM Org Structure"

    Hi SAP Experts,
    My client wants a best practice, or a safe process, to update, improve and maintain their existing SAP HCM organizational structure. In a way, you could say I am doing a process-oriented job.
    Our client's system is not up to date due to a lack of user awareness and of complete knowledge of the system. Because of this, they are unsure of the accuracy of the reports that come out of the system.
    As an HCM functional consultant I can look at this from the technical perspective, but not from this process-oriented role. I need your guidance in this regard; please suggest how I can move ahead and make some really valuable recommendations. I am confused about where and how to start. Please help me in this regard.
    Thanks in advance,
    Amar

    The only thing you need to keep in mind is the relationships between the objects in OM.
    Check transaction OOVK for the relationships, and PP01/PP02 for maintaining and assigning those objects.
    Re: Organization Structure
    This thread may help you.
    Let us know if there is anything else.

  • Best Practice? - Implementing different SAP portals on the same hardware

    We have a very large intranet portal implementation today spanning multiple boxes with 30k+ users on it.
    A different business group is asking us to build an SAP vendor portal system, but would like to know if we can run it on the same equipment.
    The intranet uses LDAP, whereas the vendor portal will authenticate/authorize against the database. Aside from this, other configurations will be different. My gut feeling is that this is something we should not do (mixing both intranet and vendor systems on the same hardware with different configurations).
    Is there a best practice document that outlines whether this is something that should be done or avoided? Also, if you have run into this and have an answer, I would appreciate the feedback.
    Thanks in advance for the assistance,
    Todd

    Hi Todd,
    Technically there isn't a reason you couldn't run both portals on the same hardware, assuming it is sized properly. You could even use the same portal if you wanted to.
    The thing I would be concerned with is security. I assume you have more stringent security requirements for external-facing applications than for internal ones, such as the need for additional firewalls and reverse proxies. Usually, once you pursue those security requirements, you will find the need for separate portal hardware.
    Hope this helps
    John

  • Best Practices for Maintaining SSAS Projects

    We started using SSAS recently, and we maintain one project that we deploy to both the DEV and PROD instances by changing the deployment properties. However, this gets messy when we introduce new fact tables into the DEV data warehouse (that are not promoted to the production data warehouse). While we work on adding new measure groups and calculations (based on the new fact tables in DEV), we are unable to make any changes to the production cube (such as changes to calculations, formatting, etc.) requested by business users. Sorry for the long question, but is there a best practice for managing projects and migrations? Thanks.

     While we work on adding new measure groups and calculations (based on new fact tables in DEV) we are unable to make any changes to production cube (such as changes to calculations, formatting etc) requested by business users.
    Hi Sbc_wisc,
    You can create a new project by importing the metadata from the production cube on the server, using the Import from Server (Multidimensional and Data Mining) Project template in SQL Server Data Tools (SSDT), then make your changes in that project
    and redeploy it to the production server.
    Reference:
    Import a Data Mining Project using the Analysis Services Import Wizard
    Regards,
    Charlie Liao
    TechNet Community Support

  • Best practice for maintaining URLs between Dev, Test, Production servers

    We sometimes send order confirmations which include links to other services in RequestCenter.
    For example, we might use the link <a href="http://#Site.URL#/myservices/navigate.do?query=orderform&sid=54">Also see these services</a>.
    However, the service ID (sid=54) changes between our dev, test, and production environments.  Thus we need to manually go through notifications when we deploy between servers.
    Any best practices out there?

    Your best practice in this instance depends a bit on how much work you want to put into it at the front end and how tied to the idea of a direct link to a service you are.
    If your team uses a decent build sheet and migration checklist, then updating the various URLs can just be part of the process. This is cumbersome, but it's the least "technical" solution if you want to continue using direct links.
    A more technical solution would be to replace your direct links with links to a "broker page". It's relatively simple to create an asp page that accepts the name of the service as a parameter, executes an SQL query against the DB to return the ServiceID, constructs the appropriate link and passes the user through; a sketch of the idea follows below.
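    As a rough sketch of that broker idea, written as a Java servlet rather than an ASP page (the JNDI data source name and the table/column names are assumptions for illustration, not the actual RequestCenter schema):

        import java.io.IOException;
        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import javax.naming.InitialContext;
        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;
        import javax.sql.DataSource;

        // Looks up a service by name and redirects to the sid used in *this*
        // environment, so notifications can carry an environment-agnostic link.
        public class ServiceBrokerServlet extends HttpServlet {
            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                    throws ServletException, IOException {
                String serviceName = req.getParameter("service");
                try {
                    DataSource ds = (DataSource) new InitialContext()
                            .lookup("jdbc/RequestCenterDS");               // assumed JNDI name
                    try (Connection con = ds.getConnection();
                         PreparedStatement ps = con.prepareStatement(
                             "SELECT ServiceID FROM Dir_Services WHERE Name = ?")) { // assumed table
                        ps.setString(1, serviceName);
                        try (ResultSet rs = ps.executeQuery()) {
                            if (rs.next()) {
                                // Path prefix is assumed; adjust to the real site URL structure.
                                resp.sendRedirect(req.getContextPath()
                                    + "/myservices/navigate.do?query=orderform&sid=" + rs.getInt(1));
                                return;
                            }
                        }
                    }
                    resp.sendError(HttpServletResponse.SC_NOT_FOUND,
                            "Unknown service: " + serviceName);
                } catch (Exception e) {
                    throw new ServletException(e);
                }
            }
        }

    A notification would then link to something like /broker?service=Order%20New%20Laptop, and each environment resolves its own ServiceID at click time.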
    A less precise, but typically viable, option would be to use links that take advantage of the built in search query functionality. Your link might display more results than just one service but you can typically tailor your search query to narrow it down. For example:
    If you have a service called Order New Laptop or Desktop and you want to provide a link that will get the user to that service you could use: http://#Site.URL#/RequestCenter/myservices/navigate.do?query=searchresult&&searchPattern=Order%20New%20Desktop%20or%20Laptop
    The above would open the site and present the same results as if the user searched for “Order New Desktop or Laptop” manually. It’s not as exact as providing a direct link but it’s quick to implement, requires no special technical expertise and would be “environment agnostic”.

  • Best Practices for data storage in portal database

    Hi,
    I need to store some structured data which is related to the portal only. This data may grow day by day and may reach a huge amount at some point in time.
    I am wondering which is the best way to handle it from a maintenance point of view.
    I think I can store it in R/3 as a custom table, which is easy to maintain and to read/write using RFC.
    The other option is to store it in the portal database (dictionary), which may not be easy to handle.
    Suggestions, please?
    Thanks.

    The best way is to maintain the (growing) data in R/3 and use JCA to write a simple portal application (JSPDynpage or Web Dynpro) to get the data back into the portal; a sketch of the RFC round trip follows below.
    Using JCA won't noticeably affect portal performance.
    It is not advisable to store large amounts of data in the dictionary.
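    Just to make that RFC round trip concrete, here is a minimal sketch using standalone SAP JCo 3 instead of the portal's JCA connector framework; the destination name, the function module Z_PORTAL_DATA_READ and its parameters are made-up placeholders, not delivered objects:

        import com.sap.conn.jco.JCoDestination;
        import com.sap.conn.jco.JCoDestinationManager;
        import com.sap.conn.jco.JCoException;
        import com.sap.conn.jco.JCoFunction;
        import com.sap.conn.jco.JCoTable;

        public class PortalDataReader {

            // Reads rows of portal-related data from a custom R/3 table via RFC.
            public static void main(String[] args) throws JCoException {
                // "PORTAL_R3" is an assumed destination (e.g. a .jcoDestination file).
                JCoDestination dest = JCoDestinationManager.getDestination("PORTAL_R3");

                // Z_PORTAL_DATA_READ is a hypothetical RFC-enabled function module
                // wrapping a SELECT on the custom table.
                JCoFunction fn = dest.getRepository().getFunction("Z_PORTAL_DATA_READ");
                fn.getImportParameterList().setValue("IV_KEY", "NEWS_ITEMS");
                fn.execute(dest);

                JCoTable rows = fn.getTableParameterList().getTable("ET_DATA");
                for (int i = 0; i < rows.getNumRows(); i++) {
                    rows.setRow(i);
                    System.out.println(rows.getString("CONTENT"));
                }
            }
        }

    Inside the portal application the same call would typically go through the connector framework (system alias) rather than a standalone destination file, but the function module and parameter handling look the same.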
    Regards,
    N.

  • Best Practice for Maintaining a web scoped feature

    I am in the process of creating a web-scoped feature that creates a few lists and some event receivers for those lists.
    My plan was to have the lists created on feature activation. That way, users can simply activate the feature if they want to take advantage of this functionality on a site-by-site basis.
    While trying to flesh out this idea, I realized that if we ever need to make changes to this feature in the future, we'll likely have to uninstall and reinstall the solution. After uninstalling/reinstalling the solution, the feature will
    no longer be active on any site by default. This means that all of the users who were using the functionality will have to go in and re-activate the feature to continue using the event receivers.
    What I'd like is that the sites that had this feature activated still have it activated after I uninstall and reinstall the solution, and the sites that didn't have the feature active still don't have it activated.
    I know that there is the option to update a solution, which will do what I want, but updating is limited in that I cannot add additional items to the solution, which means eventually I'll have to uninstall and reinstall.
    Is there a good way to go about this?

    If you are modifying or expanding the features, then your best option would be to use a feature upgrade:
    http://www.sharepointnutsandbolts.com/2010/06/feature-upgrade-part-1-fundamentals.html
    >> I know that there is the option to update a solution, which will do what I want, but updating is limited in that I cannot add additional items to the solution, which means eventually I'll have to uninstall and reinstall.
    Yes, this is correct. Updating a solution works only for the same set of files and features in the existing solution. If there is any change, then you have to retract and redeploy the new solution.
    My blog: http://www.sharepoint-journey.com

  • TechNet Wiki - Best Practice Blog Posts

    Lately, we've had some great blog posts about best practices on TechNet Wiki. So we're going to share them with you here...
    Wiki Life: Commenting on Comments... Care to Comment? - 10/16/14 by Ed Price
    How to write a great post on the Wiki - For Dummies - 10/12/14 by Gokan Ozcifci
    Wednesday - Wiki Life: The Importance of Longer, High-Quality Articles - 10/8/14 by Ed Price
    Wednesday - Wiki Life: 10 ways to become the most hated Wiki ninja on the planet - 10/1/14 by Peter Geelen
    Wiki Life: PowerShell PowerPack! - 9/17/14 by Matthew Yarlett
    The most unseen and unspoken TechNet Wiki roles: The mentor Role - 6/22/14 by Sandro Periera
    Wiki Life: Smart Tags - 6/18/14 by Matthew Yarlett
    Wiki Life: Ownership and Credibility - 6/11/14 by Matthew Yarlett
    Wiki Life: Best Practices for building TechNet Wiki Portals - 6/4/14 by Horizon Net
    Wiki life: Technet Wiki tagging, the ugly truth. - 5/29/14 by Peter Geelen
    Wiki Life: Getting too Personal! - 5/14/14 by Matthew Yarlett
    Wiki Life: YOU edited MY article??! - 4/30/14 by Matthew Yarlett
    Wiki Life: Are you right in making it a rite to write? - 4/16/14 by Matthew Yarlett
    Wiki Life - Alerts - 4/9/14 by Alan Carlos
    Wiki Life: Speling an gamma, it is umpotant? - 4/2/14 by Matthew Yarlett
    Wiki Life: How to Translate TechNet Wiki Articles - 4/2/14 by Horizon Net
    Wiki Life: Attention to Detail - 3/19/14 by Matthew Yarlett
    Wednesday - Wiki Life - Mobility - 3/12/14 by Alan Carlos
    Wiki Life: A Picture is Worth a 1000 Words - 3/5/14 by Matthew Yarlett
    Wiki Life: Cut'N'Paste - 2/19/14 by Matthew Yarlett
    Wiki Life: How to Join Leadership - 2/19/14 by Horizon Net
    Wiki Life: Featured Articles in the TechNet Wiki - 2/12/14 by Durval Ramos
    Wiki Life: Code.Format() - 2/5/14 by Matthew Yarlett
    Wiki Life: The CodePlex Corner - 2/5/14 by Horizon Net
    Did you know that we have a layout article? - 1/29/14 by Durval Ramos
    Wiki Life: Get to the point, keep it short! - 1/22/14 by Matthew Yarlett
    Wiki Life: Planning a Great Article - 1/8/14 by Matthew Yarlett
    Wiki Life: Best Practices for converting an MSDN / TechNet Forum thread into a Wiki Article!!! - 12/25/13 by Ed Price
    Wiki Life: Best Practices for Giving Credit - 12/18/13 by Horizon Net
    Wiki Life: How To Fix a Wiki Article TOC - 12/4/13 by Benoit Jester
    Wiki Life: How To Detect Missing Tags Without any Effort - 11/20/13 by Benoit Jester
    Wiki Life: How To Import an Microsoft Excel Spreadsheet Into a Wiki Article - 10/30/13 by Markus Vilcinskas
    Wiki Life: Cross Linking - 10/9/13 by Horizon Net
    Wiki Life: User Groups Portal - 10/2/13 by Horizon Net
    Ed Price, Azure & Power BI Customer Program Manager (Blog, Small Basic, Wiki Ninjas, Wiki)
    Answer an interesting question? Create a wiki article about it!

    Respected sensei Wiki Ninja,
    what else do you need to start a Wiki article?
    Put your signature into practice!
    So I kindly invite you all to continue your braindump over here:
    http://social.technet.microsoft.com/wiki/contents/articles/27905.technet-wiki-best-practices-blog-posts-articles.aspx
    Peter Geelen (Microsoft Belgium) - Premier Field Engineer Security & Identity

  • Best practice for photo format: RAW+PSD+JPEG?

    What is the best practice for maintaining file formats while editing?
    I shoot in RAW and import into PS CS5. After editing, it allows me to save in various formats, including PSD and JPEG. PS says that if you want to re-edit the file, you should save as PSD, as all the layers are maintained as-is. Hence I'd prefer to save as .PSD. However, in most cases the end objective is to share the image with others, and JPEG is the most suitable format. Does this mean that for each image it's important to save it in three formats, viz. RAW, PSD and JPEG? Won't this increase the total space occupied tremendously? Is this how most professionals do it? Please advise.

    Thanks everyone for this continued discussion in my absence over two weeks. Going through it, I realize it's helpful stuff. During this period I downloaded the Aperture trial and have learnt it (there's actually not much learning; it's so incredibly intuitive and simple, yet incredibly powerful, and since I used iPhoto in the past, it just makes it easier).
    I have also started editing my pics to put them up on my photo site. Over the past 10 days, here is the workflow I have developed:
    - Download RAW files onto my laptop using Canon software into a folder where I categorize and maintain all my images
    - Import them into Aperture, but let the photos reside in the folder structure I defined (rather than have Aperture use its own structure)
    - Complete editing of all required images in Aperture (this takes care of 80-90% of my pics)
         - From within Aperture, open in PS CS5 those images that require editing that cannot be done in Aperture
         - Edit in CS5 and do 'Save'; this brings them back to Aperture
         - Now I have two versions of these images in Aperture: the original RAW and the new .PSD
    - Select the images that I need to put up on my site and export them to a new folder from which I upload them
    I would be keen to know if someone else follows a more efficient or robust workflow than this; I would be happy to incorporate it.
    There are still a couple of questions I have:
    1 - Related to PS CS5: why do files opened in CS5 jump up in file size? Any RAW or JPEG file originally between 2-10 MB shows up as a minimum of 27 MB in CS. The moment you do some edits and/or add layers, it reaches 50-150 MB. This is ridiculous; I am sure I am doing something wrong. Or is this how CS5 works for everyone?
    2 - After editing a file in CS by launching it from Aperture, I now end up with two versions in Aperture: the original file and the new .PSD file (which is usually 100 MB+). I tried exporting the .PSD file to a folder to upload it to my site, and wasn't sure what format and size it would end up with. I got it as a JPEG file within reasonable file-size limits. Is this how Aperture works? Does Aperture give you options for which format you want to export the file in?

  • Searching for Best Practice links that work

    Hi,
    For the past few years I have been able to access SAP Best Practices documents such as SAP Best Practices for CP and Wholesale Industries
    (that one still works and guides me to the building block and process overview documents!).
    Recently, any link I can find to SAP Industry or Baseline Best Practices ends up dead. See, for example, trying to get from the page SAP Best Practices Baseline packages – SAP Help Portal Page
    to the Localized for Netherlands V1.607 SAP Best Practices package further down on that page; the result is the screenshot attached. I have seen the same in many more examples (different countries, or Industry Best Practices packages instead of country Baseline packages).
    Does anyone know whether and how SAP has redesigned access to the Best Practices documents (Configuration Guides, eCATTs, Scenario Process Overviews, etc.)?
    Thanks for your reply.
    Thijs

    Hi, Thijs,
    There is currently a problem with Best Practices on the Help Portal.  On the home page of the portal (http://help.sap.com/) there is a message that reads "Stay Tuned - There are temporary problems when accessing some content types, for example PDF documents or Best Practices. We are working on a solution."
    Our Wholesale Distribution industry group does not manage the Help Portal pages, so, unfortunately, I don't know the status of the problem or when it might be resolved.
    Lynn

  • Item Master Data Best Practice

    Hello all,
    We have now been using SBO for more than a year, and yet we still constantly add new items to our item master data. What is the best practice for maintaining the item master data? To help you understand, this is the scenario: in the factory/mill there are a lot of spare parts and pieces of equipment, and if some of that equipment is damaged, we have to buy a new one. Here the problem occurs: if it differs only in part number, we use another item code for it. With this practice we later found out that we have more than one item code for a single item because of the naming convention, so we have to put the other item code on hold and use the other one, because we can't delete it anymore. Sometimes an item code occurs only once in the item history.
    Please suggest what the best practice is on this matter:
    1. Item Grouping
    2. Naming Convention
    etc..
    NOTE:
    Our goal is to minimize the adding of items to the item master data.
    FIDEL

    FIDEL,
    From what I understand, you have to replace broken/damaged components of items like bulldozers, payloaders and mill turbines. This is the reason you defined the parts as new items.
    From your item code examples, I am not clear why you have two different names for the same item, and also what you mean by "these two item codes are actually the same".
    If you are just buying parts to replace components, and if you do not need to track them, then I would suggest you create generic item codes in the item master and simply change the description when you buy/sell them.
    Example:  Same Item different description.
    REPL101  OIL FILTER
    REPL101  FUEL FILTER
    REPL101  xxxxx
    This way you are not going to keep creating items in the database and also you can see the description and know what it was.
    Simply change the ItemName in the marketing document, and instead of pressing Tab to move to the next column, press CTRL+Tab so that SAP does not auto-check the newly typed name against the item master.
    Let me know if your scenario is otherwise.
    Suda

  • Request info on Archive log mode Best Practices

    Hi,
    Could anyone, from their personal experience, share with me the best practices for maintaining archiving on any version of Oracle? Please tell me:
    1) Whether to place archive logs and redo log files on the same disks.
    2) How many LGWR processes to use.
    3) The checkpoint frequency.
    4) How to maintain the speed of a server running in archivelog mode.
    5) Which errors to look for.
    Thanks,

    1. Use a separate mount point for the archive logs, such as /archv.
    2. Start with 1 and check the performance.
    3. This depends on the redo log file size. Size your redo log files such that at most 5-8 log switches happen per hour; try to keep it below 5 log switches per hour (see the worked example below).
    4. Check the redo log file size.
    5. Check the space allocation of the archive log mount point. Back up the archive logs with RMAN and delete the backed-up archive logs from the archive destination.
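    A rough worked example for point 3, assuming (purely as an illustration, not a figure from this thread) about 2 GB of redo generated per hour and a target of at most 5 switches per hour:

    \[ \text{redo log size} \;\gtrsim\; \frac{\text{redo per hour}}{\text{target switches per hour}} \;=\; \frac{2048\ \text{MB}}{5} \;\approx\; 410\ \text{MB per redo log group} \]

    The actual switch rate can be measured from V$LOG_HISTORY before resizing.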
    Regards
    Asif Kabir
