Manage Portal data? Best-practice-type thing

Hello,
I am looking into how best to set up a Portal system in Production, and in particular a
good process for backing up and re-creating a Portal system.
I list some possibilities below. Does anyone know if this is how it is typically
done? Does this cover all the data that should be backed up / migrated?
Thanks!
1 - 'Entitlements' data. As far as I know, this is stored in the embedded LDAP.
Can this be extracted? (See the sketch after this list.)
2 - DataSynch data:
- the DataSynch web application
- extract with an FTP-like command
- upload as a JAR file
3 - Users and Groups.
Export to a .dat file. (I suddenly forget how to do this, though I think I saw it
somewhere.)
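For point 1, here is a minimal sketch of one way to pull entries out of an LDAP store from plain Java using JNDI. The URL, port, bind DN, password and search base are all assumptions; how the embedded LDAP is exposed (and whether the admin console's own export is the better route) depends on your server version:

    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.NamingEnumeration;
    import javax.naming.directory.DirContext;
    import javax.naming.directory.InitialDirContext;
    import javax.naming.directory.SearchControls;
    import javax.naming.directory.SearchResult;

    // Hypothetical connection details: adjust host, port, bind DN,
    // password and search base for your environment.
    public class LdapDump {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://portalhost:7001");
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL, "cn=Admin");
            env.put(Context.SECURITY_CREDENTIALS, "secret");

            DirContext ctx = new InitialDirContext(env);
            SearchControls sc = new SearchControls();
            sc.setSearchScope(SearchControls.SUBTREE_SCOPE);

            // Dump every entry under the base DN so it can be re-imported later.
            NamingEnumeration<SearchResult> results =
                    ctx.search("dc=mydomain", "(objectClass=*)", sc);
            while (results.hasMore()) {
                SearchResult entry = results.next();
                System.out.println(entry.getNameInNamespace());
                System.out.println(entry.getAttributes());
            }
            ctx.close();
        }
    }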

Similar Messages

  • Portal backup best practice!!

    Hi ,
    Can anyone let me know where I can get Portal backup best practices? We are using SAP EP 7.
    Thanks,
    Raghavendra Pothula

    Hi Tim: Here's my basic approach for this -- I create either a portal dynamic page or a stored procedure that renders an HTML parameter form. You can connect to the database and render whatever sort of drop-downs, check boxes, etc. you desire. To tie everything together, just make sure that when you create the form, the names of the fields match those of the page parameters created on the page. This way, when the form posts to the same page, it appends the values for the page parameters to the URL.
    By coding the entire form yourself, you avoid the inherent limitations of the simple parameter form. You can also use advanced JavaScript to dynamically update the drop-downs based on the values selected, or have the form submitted and the other drop-downs updated from the database if desired.
    Unfortunately, it is beyond the scope of this forum to give you full technical details, but that is the approach I have used on a number of portal sites. Hope it helps!
    Rgds/Mark M.

  • Number ranges - Import of Legacy Data - best practice

    Hi,
    we are planning to move legacy data objects to our SAP CRM.
    These objects have an external key (numeric, 6 digits) in a number range that is relatively full and highly fragmented.
    Is there a best practice for implementing internal number assignment for this kind of pre-filled number range?
    The internal key in SAP would be different and under our control; the external key is the interesting one.
    Cheers,
    Andreas

    Hi Luís,
    The scenario is in the context of the insurance business.
    The setup: SAP CRM as the central Business Partner system. In CRM we keep the policy numbers of the surrounding (non-SAP) policy systems as references (I'm talking about insurance policies...).
    For each policy we create a One Order object containing, among others, the LOB, the policy type and the policy number.
    These policy number ranges are to be maintained in the central CRM system in the future.
    And in one of these systems they have the situation described above:
    a 6-digit key in a number range that is relatively full and highly fragmented. They are managing their numbers in an Excel sheet right now, but we would also have those numbers migrated into our system.
    After the migration we would be responsible for finding an unused number whenever a new policy is to be created (see the sketch below).
    Cheers,
    Andreas
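    To make the "find an unused number" step concrete, a minimal plain-Java sketch of the gap-finding logic only (this is not an SAP API; the migrated numbers are assumed to be available as a sorted set):

        import java.util.NavigableSet;
        import java.util.TreeSet;

        // Hypothetical helper: find the lowest unused key in a fragmented
        // 6-digit external number range (000001..999999).
        public class NumberRangeGap {
            static int firstFreeNumber(NavigableSet<Integer> used) {
                int candidate = 1;
                for (int n : used) {          // iterates in ascending order
                    if (n > candidate) break; // gap found before n
                    candidate = n + 1;
                }
                if (candidate > 999999) throw new IllegalStateException("range exhausted");
                return candidate;
            }

            public static void main(String[] args) {
                NavigableSet<Integer> used = new TreeSet<>();
                used.add(1); used.add(2); used.add(4);
                System.out.println(firstFreeNumber(used)); // prints 3
            }
        }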

  • Single template supporting multiple formats of data - best practice

    I have a requirement to create Invoices in a single PDF file.
    The invoices belong to different categories - debit notes, credit notes, invoices with a single product,
    invoices with multiple products, etc. - and each has a different format.
    I initially thought the right way to create a single PDF is to use a
    single template, with the different invoice formats separated by conditional formatting.
    Then I saw from reading the blogs that the other way is to create sub-templates
    (one each for credits, invoices, debits, etc.) and plug them into the
    main template.
    I would like to know what is the best practice that is followed in the above case.
    If I were to use sub-templates, how would I make it possible to view the invoice stub only on the first page,
    since the data from the sub-template would flow onto multiple pages?
    Is adding the stub data to the footer the only option? Could someone please share an example
    template with me?
    Thanks
    Shandrila

    Shandrila
    If the various document types share a single XML format, i.e. the same structure with just document-type differences, and the layout is the same with only minimal differences that can be handled with conditional formatting, then I think it would be OK to have a single report for all document types.
    If the data structures are very different and the layout requirements are different, then I would create separate reports for each document type. If the data structure is the same but the document-type layouts are different, then go for separate layout formats.
    Going down the sub-template path can be a little difficult; you might end up with a very complex set of templates that are almost as much of a pain to manage as the original report you are trying to replace.
    Here's the best scenario IMHO:
    - one data extract, parameterized to pull invoice, CM, etc. data based on the user request;
    - multiple layout templates, one for each document type - if you have common layout sections across the layouts, e.g. address blocks, break them out as sub-template components that all of the layouts can access and share;
    - multiple report definitions, sharing the data extract, each with a single layout template associated with it.
    cheers
    Tim

  • Portal Design - Best Practices for Role and Workset Tab Menu

    We are looking to identify and promote best practices in SAP Portal design.
    First, is there a maximum number of tabs that should exist on the highest-level tab menu, commonly called the role menu? Does a large number of tabs on this menu cause performance issues? Are there any other issues associated with a large number of tabs on this menu?
    Second, can the workset tab menu be customized to show two lines of tabs? Our goal is to prevent tab scrolling.
    Thanks

    Debra,
    Not aware of any performance issues with the number of tabs in the Level 1 or 2 menus, particularly if you have portal navigation caching enabled.
    From an end user perspective I guess "best practice" would be to avoid scrolling in the top level navigation areas completely if possible.
    You can do a number of things to avoid this, including:
    - Keep the role/folder/workset names as short as possible.
    - If necessary break the role down into multiple level 1 entry points to reduce the number of tabs in level 2.
    An example of the second point would be MSS. Instead of creating a role with a single workset (i.e. one level 1 tab), we usually split it into two folders called something like "My Staff" and "My Finance" and define these folders as entry points. We therefore end up with two tabs in level 1 for the MSS role, and consequently a smaller number of tabs in level 2.
    Hope that helps.
    Regards,
    John

  • Storing data - best practice?

    Hi,
    I wonder if there is any best practice for storing data in my EP 6.0 portal? For instance, on a standard website, if you have a list of events, each event can be stored in a related SQL database and can then be fetched and updated whenever necessary.
    What is the best way to go about developing portal content? The reason I am asking is that I want to develop a Web Dynpro application where I can select a date and then display all registered events on that day in my portal.
    Best regards
    Øyvind Isaksen

    Okay, and then use an RFC call from the Web Dynpro application to fetch data from the SAP database?
    This answered my question:
    Best regards
    Øyvind Isaksen
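    For reference, a hedged sketch of what such an RFC call looks like from plain Java using SAP JCo 3 (a Web Dynpro application would normally go through an adaptive RFC model instead; the destination "MY_BACKEND" and the function module "Z_GET_EVENTS" with its parameters are hypothetical):

        import com.sap.conn.jco.JCoDestination;
        import com.sap.conn.jco.JCoDestinationManager;
        import com.sap.conn.jco.JCoException;
        import com.sap.conn.jco.JCoFunction;
        import com.sap.conn.jco.JCoTable;

        public class EventFetcher {
            public static void main(String[] args) throws JCoException {
                // Destination configured outside the code (hypothetical name).
                JCoDestination dest = JCoDestinationManager.getDestination("MY_BACKEND");
                // Z_GET_EVENTS stands in for a remote-enabled function module.
                JCoFunction fn = dest.getRepository().getFunction("Z_GET_EVENTS");
                if (fn == null) throw new IllegalStateException("RFC not found in backend");
                fn.getImportParameterList().setValue("IV_DATE", "20240101");
                fn.execute(dest); // synchronous RFC call
                JCoTable events = fn.getTableParameterList().getTable("ET_EVENTS");
                for (int i = 0; i < events.getNumRows(); i++) {
                    events.setRow(i);
                    System.out.println(events.getString("DESCRIPTION"));
                }
            }
        }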

  • Portal build best practices

    Hello everyone,
    I'm soon starting a new WLP 9.2 project and I would like to know the recommended best practices for building a portal project.
    I have experience with WLP 8 projects which, frankly, weren't easy to build.
    Is the following approach still valid?
    1 - export the Ant build using Workshop
    2 - customize the build if needed (test, etc.)
    Is anyone using continuous integration with Cruisecontrol?
    What about Maven2? Is anyone using the approach suggested in the dev2dev article (http://dev2dev.bea.com/pub/a/2007/03/maven-weblogic-portal.html)?
    This is quite a broad topic. It would be very interesting if we can use this thread to share common experience about build best practices.
    Thanks
    Luciano

  • Using WebI with SAP BW Data - Best practice for introducing BW Accelerator

    Hi,
    We have a significant investment in using BOE XI 3.1 SP2 integrated with SAP BW 7.0 EHP 1 SPS05.
    Now we intend to introduce BW Accelerator to improve data-fetch performance for ad hoc (WebI) analysis and for the formatted reports built using WebI (InfoView).
    The data volume in question is approx. 2 million+ records for each WebI report / ad hoc analysis (20 to 30 columns).
    The solution could be BW Cubes --> BW Accelerator --> BW Queries --> BO Universe --> WebI using Infoview
    Does introducing BW Accelerator really help in a case like the one described above?
    I understand that BW Accelerator could improve the performance of the underlying data access, and hence the BW queries would run faster; but does it really give a 9x to 10x performance improvement for the MDX queries generated by the BO Universe (BOE XI 3.1 SP2) and WebI?
    What is the roadmap for the future with respect to BW Accelerator and SAP BI / BO integration, if we intend to use WebI?
    Or should we migrate to BO Explorer as the front end for ad hoc analysis?
    Is BO Explorer able to present 1 million+ records with 20-30 columns?
    Which of the following is the best practice / better on performance, as an integrated product / solution?
    1) BW Cubes --> BW Accelerator --> BW Queries --> SAP Integ Kit --> BO Universe --> WebI
    2) BW Cubes --> BW Accelerator --> ??? --> BO Explorer --> ??? --> WebI ???
    3) BW Cubes --> BW Accelerator --> ??? --> BO Pioneer --> ??? --> WebI ???
    4) BW Cubes --> BW Accelerator --> ??? --> BO Explorer
    5) BW Cubes --> BW Accelerator --> ??? --> BO Pioneer
    6) BW Cubes --> BW Accelerator --> BW Queries --> SAP Integ Kit --> Crystal Reports (to handle above data volume)
    7) BW Multiproviders --> BW Accelerator --> BW Queries --> SAP Web Analyzer (to handle above data volume)
    regards,
    Rajesh K Sarin

    Hi,
    We have a mix of ad hoc analysis (60%) and formatted reports (40%). We selected WebI as the tool for the purpose and used it for requirements which process approx. 2 million records. We ran into performance bottlenecks (we are on BO XI 3.1 SP2 and SAP BW 7.0 EHP1 SP05).
    We are now analyzing the possibility of introducing BWA, to see whether it can handle similar record volumes while preserving our investment in OLAP Universes, WebI, the SAP Integration Kit and the training of users on the WebI front end.
    I see a lot of documentation suggesting "BO Explorer and BWA" - I understand that BWA would improve the DB time and BO Explorer would help on the front-end / OLAP time.
    Request your guidance on the roadmap and on continuing our investment with BWA + WebI.
    regards,
    Rajesh K Sarin

  • No Data Best Practice

    Hi Gurus,
    What is the best practice for handling "no data found" on a SELECT? With SQL Server you can use ISNULL and branch on that, but with Oracle it appears you need to do a COUNT INTO or use the NO_DATA_FOUND exception. Which is best practice - one of the two above, or something different? I just want to make sure we are handling this scenario properly; it is a topic that has come up many times with our dev group.
    --S

    There is no real "best practice" - only reality and fact.
    When using an implicit SQL statement in PL/SQL, you tell the PL engine that you are not interested in the cursor handle for that SQL. You do not want to fetch from that cursor handle manually. You do not want to reference the projection (the columns returned) via that cursor handle.
    PL says this is fine. But as you do not want to deal with the cursor handle, you also cannot deal with the return codes of that cursor. So it will raise an exception when no data is returned by that SQL.
    When you define an explicit cursor, the exact opposite is true, as you do define a cursor handle on the "client" side (PL is a SQL client/caller). This means that you do want to deal with the actual cursor yourself. Thus no exception is raised when there is no data returned, as you can test for that yourself using that very same cursor handle you have defined.
    Thus no "best practice". Only fact. You deal with NO_DATA_FOUND exceptions when using implicit SQL cursors in PL. You deal with a cursor handle when using explicit SQL cursors in PL.
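    A minimal sketch of both cases, assuming a hypothetical employees table:

        -- Implicit cursor: PL/SQL raises NO_DATA_FOUND, so handle the exception.
        DECLARE
          l_name employees.last_name%TYPE;
        BEGIN
          SELECT last_name INTO l_name FROM employees WHERE employee_id = 42;
        EXCEPTION
          WHEN NO_DATA_FOUND THEN
            l_name := NULL; -- decide what "no data" means for the caller
        END;
        /

        -- Explicit cursor: no exception; test the handle yourself via %NOTFOUND.
        DECLARE
          CURSOR c IS SELECT last_name FROM employees WHERE employee_id = 42;
          l_name employees.last_name%TYPE;
        BEGIN
          OPEN c;
          FETCH c INTO l_name;
          IF c%NOTFOUND THEN
            l_name := NULL;
          END IF;
          CLOSE c;
        END;
        /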

  • Item Master Data Best Practice

    Hello all,
    We have now been using SBO for more than a year, and we still constantly add new items to our item master data. What is the best practice for maintaining item master data? To help you understand, this is the scenario: in the factory/mill there are a lot of spare parts and pieces of equipment. If a piece of equipment is damaged, we have to buy a new one, and here the problem occurs: if it differs only in part number, we create another item code for it. With this practice, we later found that we had more than one item code for a single item because of the naming convention. So we have to put the extra item code on hold and use the other one, since we cannot delete it anymore. Sometimes an item code occurs only once in the item history.
    Please suggest the best practice on this matter, e.g.:
    1. Item grouping
    2. Naming convention
    etc.
    NOTE:
    Our goal is to minimize the addition of new items to the item master data.
    FIDEL

    FIDEL,
    From what I understand, you have to replace broken / damaged components of items like bulldozers, payloaders and mill turbines. This is the reason why you defined the parts as new items.
    From your item code examples, I am not clear why you have two different names for the same item, and also what you mean by "these two item codes are actually the same".
    If you are just buying parts to replace components, and if you do not need to track them, then I would suggest you create generic item codes in the item master and simply change the description when you buy / sell them.
    Example: same item, different descriptions.
    REPL101  OIL FILTER
    REPL101  FUEL FILTER
    REPL101  xxxxx
    This way you do not keep creating items in the database, and you can still see the description and know what the part was.
    Simply change the ItemName in the marketing document and, instead of pressing Tab to move to the next column, press CTRL+Tab so that SAP does not check the newly typed name against the item master.
    Let me know if your scenario is otherwise.
    Suda

  • Best Practice type question

    Our environment currently has one connection factory defined per JMS module. We also have multiple queues per JMS module.
    In other J2EE app server environments I have worked in, we defined one connection factory per queue.
    Can someone explain whether there is a best practice, or at least a good reason, for doing one over the other?
    The environment here is new enough that we can change how things are set up if it makes sense to do so.

    My two cents: a CF allows configuration of client load-balancing behavior, flow-control behavior, default QOS, etc. I think it's good to have one or more CFs configured per module, as presumably all destinations in the module are related in some way, and therefore likely all require basically the same client behavior. If you have very few destinations, then it might help to have one CF per destination, but this places a bit more burden on the administrator to configure the extra CFs in the first place, and on everyone to remember which CF is best for communicating with which destination.
    Tom
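    To make the shared-CF arrangement concrete, a minimal JMS 1.1 client sketch; the JNDI names (jms/OrdersCF, jms/NewOrdersQueue, jms/ReturnsQueue) are hypothetical:

        import javax.jms.Connection;
        import javax.jms.ConnectionFactory;
        import javax.jms.Queue;
        import javax.jms.Session;
        import javax.naming.InitialContext;

        public class SharedCfExample {
            public static void main(String[] args) throws Exception {
                InitialContext ctx = new InitialContext();
                // One CF serves every destination in the module...
                ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/OrdersCF");
                // ...while each queue keeps its own JNDI name.
                Queue newOrders = (Queue) ctx.lookup("jms/NewOrdersQueue");
                Queue returns = (Queue) ctx.lookup("jms/ReturnsQueue");

                Connection con = cf.createConnection();
                try {
                    Session session = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
                    session.createProducer(newOrders).send(session.createTextMessage("order #1"));
                    session.createProducer(returns).send(session.createTextMessage("return #1"));
                } finally {
                    con.close();
                }
            }
        }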

  • Management IP address: best practices?

    Hi,
    What are the advantages of assigning the management IP address to the service profile rather than to the blade server?
    Can we do both? And for what purpose?
    What are the best practices for each specific use?
    Many thanks in advance for your feedback.
    Nicolas.

    The ability to assign the IP address to the SP was added at the request of users. This allows the KVM IP address to follow the SP (and the OS associated with that SP). Customers wanted the KVM IP to stay associated with their OS.
    The IP associated with the blade can be used at any time for a KVM session. An IP address associated with the SP can only be used while the SP is associated with a blade.
    Both can be used. I don't believe there is a best practice for their assignment.
    Thank You,
    Dan Laden
    Cisco PDI Data Center
    Want to know more about how PDI can assist you?
    http://www.youtube.com/watch?v=4BebSCuxcQU&list=PL88EB353557455BD7
    http://www.cisco.com/go/pdihelpdesk

  • Adobe LiveCycle Process Management Overview and Best Practices

    To get familiar with the best practices of process management watch this recording of a webinar hosted by Avoka Technologies.

    To get familiar with the best practices of process management watch this recording of a webinar hosted by Avoka Technologies.

  • OWB Change Management/Version Control Best Practice

    Hi
    I am about to start developing a data warehouse using OWB 10g R2, and I've been doing quite a lot of research into the various deployment/change management/version control techniques that can be used, but am still unsure which is the best to use.
    We will have 2-3 developers working on the project, and will be deploying from Development, to Test, to Production (each will have a separate repository). We want to be able to easily identify changes made between 1 release and the next to have a greater degree of control and awareness of what goes into each release. We also wish to use a source control system to track changes (we'll probably use SVN, but I don't think that the actual SCS tool makes a big difference to our decision at this point).
    The options available (that I'm aware of), are:
    1. Full MDL export/import.
    2. Snapshot MDL export/import.
    3. Manual coding of everything using OMB Plus.
    I am loath to use the full MDL export/import functionality since it will be difficult, if not impossible, to identify easily the changes made between 1 release and the next.
    The snapshot MDL export/import functionality is a little better at comparing releases, but it's still difficult to see exactly what has changed between 1 version and the next - particularly when a change to a transformation has been made. It also doesn't cope that well with tracking individually made changes to different components of the model.
    The manual coding using OMB Plus seems like the best option at the moment, though I keep thinking "What's the point of using a GUI tool, if I'm just going to code everything in scripts anyway?".
    I know that you can create OMB Plus code generation scripts to create your 'creation' scripts, but the code generation of the Alteration scripts seems that it would be more complicated than just writing the Alteration scripts manually.
    Any thoughts anyone out there has would be much appreciated.
    Thanks
    Liffey

    Well, you can also do per-object MDL exports and then manage those in your version control system. With a proper directory structure it would be fairly simple to code an OMB+ Script that scans a release directory tree and imports the objects one by one. I have done this before, although if you are using OWB as the primary metadata location for database objects then you have to come up with some way to manage object dependency order issues.
    The nice thing about this sort of system is that a patch can be easily shipped with only those objects that need to be updated.
    And if you force developers to put object-level MDL into your version control system then your system should also have pretty reporting on what objects were changed for a release and why.
    At my current job we do full exports of the project MDL and have a deployment script that drops the pre-existing deployed version of the project before importing and deploying the new version, which also works quite well - although as you note the tracking of what has changed in a release then needs to be carefully managed elsewhere. But we don't deploy any of our physical database objects through OWB. Those are deployed from Designer, and our patch script applies all physical changes first before we replace the mappings from the OWB project. We don't even bother synching the project metadata for tables / views / etc. at deployment. If the OWB project's metadata for database objects is not in sync with Designer, then we wind up with deployment errors. But on the whole it works pretty well.

  • Large amount of data best practices.

    Hello Experts,
    I have a scenario where I have to extract a large volume of data from an SAP system to an external database using SAP PI. The process has to extract about 400,000 rows from SAP and send them to this external database. I guess the best way to insert the data into the database is using the JDBC adapter, but I'm wondering what the best adapter is to connect SAP R/3 and SAP PI. What is the best way to send a message of 400,000 rows to SAP PI - files, IDocs, proxies? Could you please tell me if there's any documentation on the topic?
    Thank you in advance.

    Hi,
    In your case, a client proxy to JDBC is best for performance.
    Please see the link below; it explains the proxy-to-JDBC scenario in detail.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e0ac1a33-debf-2c10-45bf-fb19f6e15649?quicklink=index&overridelayout=true
    Regards,
    Rajesh
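    As an aside on the database end of such a volume: the PI JDBC adapter performs the actual inserts, but the reason batching matters at 400,000 rows can be seen in a plain-JDBC sketch (the connection URL, credentials and table are hypothetical):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        public class BatchInsert {
            private static final int BATCH_SIZE = 1000;

            public static void main(String[] args) throws Exception {
                try (Connection con = DriverManager.getConnection(
                        "jdbc:postgresql://dbhost/target", "user", "secret")) {
                    con.setAutoCommit(false); // one commit at the end, not per row
                    try (PreparedStatement ps = con.prepareStatement(
                            "INSERT INTO target_table (id, payload) VALUES (?, ?)")) {
                        for (int i = 0; i < 400000; i++) {
                            ps.setInt(1, i);
                            ps.setString(2, "row " + i);
                            ps.addBatch();
                            // flush every BATCH_SIZE rows to bound memory and round-trips
                            if ((i + 1) % BATCH_SIZE == 0) ps.executeBatch();
                        }
                        ps.executeBatch(); // flush any final partial batch
                    }
                    con.commit();
                }
            }
        }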
