Import best practices

Hello everybody,
I want to import the SAP Best Practices. How do I import them?
Can I use transaction code SAINT?
I read the Quick Guide document, but I can't understand how to upload the files, i.e. which transaction code is used.
Also, how do I directly create a transport request and a workbench request?
Can anybody please suggest how to do this?
Thanks
ganesh

Hi,
Go through SAP Note 847091; at the bottom of the note you will find "BL_Quick_Guide_EN_UK.zip".
You can follow that document for the Best Practices installation.
Note: this note is for SAP Best Practices Baseline Package UK, SG, MY V1.500, but your requirement may be different, so search accordingly.
--Kishore

Similar Messages

  • Need best practices advice

    Hey guys,
    Can anyone share with me the best practices for the setup of an Oracle database? I know that the amount of redo, grouping, file system layout, etc. depend on the size of your DB, so to help, here is the spec of mine:
    oradata: 200GB
    change rate: 50k/s (I got that by dividing the size of my archived redo logs by the amount of time between the first and last archive log).
    This is a standard database (not OLTP or data warehouse) used to store client information.
    My RPO (Recovery Point Objective) is 30 minutes.
    Some quick questions:
    1. How should I lay out the file system?
    2. How many redo logs/groups, and what size?
    3. How many control files, and where should I put them?
    4. How should I set up log switching?
    Any doc or quick answer is welcome; I don't want to read a 300-page Oracle document :-) This is why I'm relying on your knowledge.
    Thanks
    Edited by: Sabey on 9-Feb-2011 8:01 AM
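The change-rate figure above comes from a simple division; a quick sketch of the arithmetic (the log volume and time window below are illustrative numbers, not taken from the post):

```python
# Estimate redo change rate from archived-log volume, the way the poster
# describes: total archived redo divided by the time between the first
# and last archive log. All numbers below are illustrative.
archived_mb = 900.0        # total size of archived redo logs in the window
window_hours = 5.0         # time between first and last archive log

elapsed_s = window_hours * 3600
rate_kb_per_s = archived_mb * 1024 / elapsed_s
print(f"change rate = {rate_kb_per_s:.1f} KB/s")  # prints: change rate = 51.2 KB/s
```

The same figure is available more directly from v$archived_log timestamps and sizes, but the hand calculation is enough for capacity planning at this scale.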

    Sabey wrote:
    > Ok, a bit more information.
    > Storage: SAN, RAID 5 disk only
    Since it's SAN, the RAID 5 (which is generically bad for performance in any update environment) will have minimal adverse effect, because the RAID 5 is hidden by massive cache. Just try to spread the data files across as many disks as possible.
    Oracle works best with datafiles on 'SAME' (Stripe And Mirror Everything). Spread the data files across all possible disks and mix data and index to try to get randomization.
    > No ASM
    Pity. A lot of potential transparency will be side-stepped.
    > OS: Solaris 10 on an M4000 (2 SPARC 2.1GHz CPUs, 4 cores each), 16GB RAM
    Finally some meat. ;-)
    I assume Enterprise Edition, although for the size, the transaction rate proposed, and the configuration, Standard Edition would likely be sufficient, assuming you don't need EE-specific features.
    You don't talk about the other things that will be stealing CPU cycles from Oracle, such as the app itself or batch jobs. As a result, it's not easy to suggest an initial guess for memory size. App behaviour will dictate PGA sizing, which can be as important as SGA size, if not more so. For the bland description of the app you provide, I'd leave 2GB for the OS, subtract whatever else is required (app, batch, other stuff running on the machine) and split the remaining memory 50/50 between SGA and PGA until I had stats to change that.
    > Like I said, I expect a change rate of 50k/s; is there a rule of thumb for the size and number of redo logs, etc.? No bulk load; data is entered by people from a user interface, no machine-generated data. Queries for reports, but not a lot.
    Not too much to worry about then. I'd shoot for a minimum of 8 redo logs, mirrored by Oracle software to separate disks if at all possible, and size the log files to switch roughly every 15 minutes under typical load. From the looks, that would be (50k/s * 60 s/min * 15 min), or about 45M: moderately tiny. And set ARCHIVE_LAG_TARGET to 15 minutes so you have a predictable switch frequency.
    > BTW, what about direct I/O? Should I mount all Oracle file systems in that mode to prevent use of the OS buffer cache?
    Again, this question would be eliminated by using ASM, but here is Tom Kyte's answer confirming direct I/O: http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:4159251866796
    Your environment is very, very small in Oracle terms. Not too much to fuss over. Just make sure you have a decent backup/recovery/failover strategy in place and tested. Use RMAN for backup and recovery, and either Data Guard (or DBVisit for Standard Edition) for failover.
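The log-size rule of thumb in the reply (size logs so a switch happens about every 15 minutes at the observed change rate) works out as follows; a minimal sketch, using 1 MB = 1000 KB for a round figure:

```python
# Size redo logs so a switch happens roughly every 15 minutes at the
# observed redo generation rate, per the rule of thumb in the reply.
change_rate_kb_s = 50      # observed change rate from the question
target_switch_min = 15     # desired minutes between log switches

log_size_kb = change_rate_kb_s * 60 * target_switch_min
log_size_mb = log_size_kb / 1000   # rough figure: 1 MB = 1000 KB
print(f"redo log size = {log_size_mb:.0f} MB")  # prints: redo log size = 45 MB
```

Setting ARCHIVE_LAG_TARGET to the same 15-minute interval (900 seconds) then bounds the switch frequency even during quiet periods.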

  • ADF Faces & BC: Best practices for project layout

    Season greetings my fellow JDevelopers!
    Our software group has been working with ADF for around 5 years and through the years we have accumulated a good amount of knowledge working with JDeveloper and ADF. Much of our current application structure has been resurrected in the early days of JDeveloper 10 where there were more samples codes floating around then there were "best pratice" documentation. I understand this is a subjective topic and varies site to site, but I believe there is a set of common practices our group has started to identify as critical to streamlining a development process(reusable decorated ui components, modular common biz logic, team development with svn, continuous integration/build, etc..). One of our development goals is to minimize dependency between each engineer as everyone is responsible for both client and middle layer implementation without losing coding consistency. After speaking with a couple of the aces at the last openworld, I understand much of our anticipated architectural requirements are met with JDeveloper 11(with the introduction of templates, declarative components, bounded task flows, etc..) but due to time constraints on upcoming deliverables we are still about an year away before moving on with that new release. The following is a little bit about our group/application.
    JDeveloper version: 10.1.3.4
    Number of developers: 7
    Developer responsibilties: Build both faces & bc code
    We have two applications currently in our production environments.
    1.A flavor of Steve Muench's dynamic jdbc credentials login module
    2.Core ADF Faces & BC application
    In our Core ADF Faces application, we have the following structure:
    OurApplication
         -OurApplicationLib (Common framework files)
         -OurApplicationModel (BC project)
              -src/org/ourapp/module1
              -src/org/ourapp/module2
         -OurApplicationView (Faces project)
              public_html/ourapp/module1
              public_html/ourapp/module2
              src/org/ourapp/backing/module1
              src/org/ourapp/backing/module2
              src/org/ourapp/pageDefs/
    Total Number of Application Modules: 15 (Including one RootApplicationModule which references module specific AMs)
    Total Number View Objects: 171
    Total Number of Entities: 58
    Total Number of BC Files: 1734
    Total Number of JSPs: 246
    Total Number of pageDefs: 236
    Total Number of navigation cases in faces-config.xml: 127
    Total Number of application files: 4183
    Total application size: 180megs
    Are there any other ways to divide up this application? I.e., module-specific projects with separate faces-config files/databindings? If so, how can these files be "hooked" together? A couple of the ACEs have recommended that we separate all the entity files into their own project, which makes sense. Also, we are looking into Maven builds, which should remove those pesky Model.jpr files that constantly get "touched". I would love to hear how other groups are organizing their applications, and anything else they would like to share as an ADF best practice.
    Cheers,
    Wes

    After discussions over the summer/autumn by members of the ADF Methodology Group, I have published an ADF Coding Standards wiki page that people may find useful:
    [http://wiki.oracle.com/page/ADF+Coding+Standards]
    It's aimed at ADF 11g and is intended to be a living document; if you have comments or suggestions, please post them to the ADF Methodology Google group ( [http://groups.google.com/group/adf-methodology?hl=en] ).

  • Best practices for ODI interfaces

    I was wondering how everyone is handling the errors that occur when running an interface with ODI.
    Our scenario:
    We have customer data that we want to load each night via ODI. The data is in a flat file, and a new file is provided each night.
    We have come across an issue where a numeric field had non-numeric data in it, so ODI created a bad file with the bad record in it, and an error file with the error message. We also had some defined constraints that forced records into the E$ table.
    My question is: how does everyone handle looking for these errors? We would like them to be reported to just one place (an Oracle table), so when the process runs we can look at that one table and then act on the issues. As shown above, ODI puts errors in two different places: DB-level ones in a flat file, and user-defined ones in the E$ tables.
    I was wondering if anyone has come across this issue and might be able to tell me what was done to handle the errors that occur, or what the best practices might be for handling these errors?
    Thanks for any assistance.
    Edited by: 832187 on Sep 29, 2011 1:18 PM

    If you have only a few fields affected by the conversion problem, you could try inserting an ODI constraint, or you could modify the LKM to load the bad file if present.
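One way to get both kinds of rejects into the single Oracle table the poster asks for is a small post-load step that merges the loader's bad-file records with the rows ODI puts into the E$ table. A minimal sketch in Python; the bad-file layout and the E$ column names (`record`, `err_mess`) are assumptions for illustration, not ODI's actual formats:

```python
def consolidate_errors(bad_file_text, e_table_rows):
    """Merge DB-level rejects (from the loader's .bad file) and
    user-defined constraint rejects (rows pulled from the E$ table)
    into one uniform list, ready to insert into a single errors table.
    The bad-file layout and E$ column names are assumed, not ODI's real ones."""
    errors = []
    # DB-level rejects: one raw rejected record per line in the .bad file
    for line in bad_file_text.splitlines():
        if line.strip():
            errors.append({"source": "bad_file",
                           "record": line,
                           "reason": "rejected during load"})
    # User-defined constraint rejects captured by ODI in the E$ table
    for row in e_table_rows:
        errors.append({"source": "e$_table",
                       "record": row["record"],
                       "reason": row["err_mess"]})
    return errors
```

A scheduled ODI procedure (or any nightly script) could run a merge like this and insert the result into the one table the operators check.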

  • Best practices for customizing with regard to performance

    Hello,
    I would like to know the list of best practices for customizing BPC NW 7.5 with regard to performance.
    Best regards
    Bastien

    Hi,
    There are a few how-to guides on SDN which will give you a basic idea of script logic. Apart from this, you can refer to the help guide on help.sap.com.
    The templates might also affect performance: the number of EVDRE functions, the number of expansion dimensions, and the number of members on which expansion takes place will all affect performance. Complex formatting in the template will have an effect as well.
    Hope this helps.

  • Best practices for GATP by SAP

    Hi all,
    I am not able to download the best practices for GATP by SAP from http://help.sap.com/bp_scmv250/BBLibrary/HTML/ATP_EN_DE.htm. It seems the documents have been removed. Can someone who already downloaded them share them with me?
    Also, can you provide working links for the best practices for SNP and PP/DS?
    Thank you,
    Ram

    Hello Ram
    Please check this wiki page; it has good content and some useful links:
    APO-GATP General Information - Supply Chain Management (SCM) - SCN Wiki
    and
    find out more on the RDS solution for GATP at http://service.sap.com/rds-gatp
    If you search http://service.sap.com/bestpractices you will find documents about best practices in GATP. The GATP section of help.sap.com is a good resource to start with as well.
    Also you can read below blog written by me
    Global Available To Promise (GATP) Overview
    Hope this will help
    Thank you
    Satish Waghmare

  • Linux Native Multipath and ASM Best Practices questions

    Hallo,
    I'd like to know your opinions about some questions I have.
    I am using Linux native multipath without ASMLib, and I wonder:
    1-
    Is it mandatory/best practice to partition (with fdisk) device-mapper LUNs before using them to build an ASM diskgroup, or does Oracle ask to partition them because ASMLib works better on partitions? In other words, are there any issues with using /dev/mapper/mpath1 directly, or do I have to use /dev/mapper/mpath1p1 with a 1 MB offset?
    2-
    Is it best to give the proper user/group to multipath LUNs via rc.local or via udev rules? Is there any difference?
    Please write what you have experienced.
    Thanks and bye
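On question 2, the usual argument for udev over rc.local is that udev rules are re-applied whenever the device nodes are recreated (reboot, multipathd reload), while rc.local sets ownership only once at boot. A rule might look like the sketch below; the LUN name, owner, and group are examples, not taken from the post:

```
# /etc/udev/rules.d/99-oracle-asm.rules (sketch; adjust names to your setup)
ACTION=="add|change", ENV{DM_NAME}=="mpath1p1", OWNER:="oracle", GROUP:="dba", MODE:="0660"
```

Matching on ENV{DM_NAME} targets the device-mapper node regardless of which /dev/dm-N number it gets on a given boot, which is exactly what rc.local cannot guarantee.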

    ottocolori wrote:
    > Hallo,
    > I'm trying to get a clearer picture of it, and as far as I know:
    > 1 -
    > Assuming you need to use the whole disk, partitioning it is mandatory only if you use ASMLib, as it works only on partitioned disks.
    Yes, you need to partition the disk before presenting it to ASMLib.
    > 2 -
    > There is no need to skip the first cylinder, or at least I can't find official info about that. What do you think? TIA
    No need on the Linux platform to skip the first cylinder. If I remember correctly, you'd need to skip the first cylinder on Solaris, as there is a bug there.
    Cheers

  • VMware Data Recovery best practices

    Hi,
    I am looking for VMware Data Recovery best practices. Everywhere on the internet I find the following link: http://viops.vmware.com/home/docs/DOC-1551
    But this is not a valid link, and I can't find the document anywhere...
    Thanks

  • Analog to Digital import - Best Settings

    I, like so many others, have decided to move our old VHS and Hi8 movies into digital format. I've acquired a Grass Valley ADVC300 for this purpose and have begun transferring the tapes. Now that I have half a dozen imported, I realize I may be losing some of the original viewing area (a small part at the top of the field of view). I know this could be ADVC-related, but it raised another question.
    My question is how best to import them into iTunes to maintain the maximum amount of data (max screen size, quality, etc.). Where can I find information on the ramifications of each of the import file types? (Is a decision-tree flow chart too much to hope for?) At my knowledge level there seem to be so many options with different outcomes, and I'd like to know where each option will take me before spending more hours on this.
    Here's the setup. Due to location, I begin by using iMovie HD 5 to create a project to import into. I'm immediately given 6 options for the project: DV, DV widescreen, HDV 1080i, HDV 720p, MPEG-4, and iSight. I've been using DV.
    Once onto this HD, I travel to where I have newer Macs: an iMac and a MBP with iMovie 11 (v9), and import this video as an iMovie HD project. On the MBP I'm given the options of no optimization, optimization at full original size, or large (960x540), with a cautionary note stating "Selecting full on this computer may result in degraded video playback." Wow, what are all these options doing to my raw data? Is this critical enough that I should import into one project on the iMac directly? Should it be optimized video? (eeek)
    ** My goal is to get the maximum raw data into a DV format archive, regardless of file size, so that as time allows I can review, edit, and work with the maximum data possible before the tapes degrade further (like RAW or TIFF data for digital stills). Are there good reference titles out there on this subject?
    Message was edited by: crtolson

    > Is there a difference in import quality between iMovie 5 and 6?
    Both will import at full quality.
    > By the way, I have tried to install iMovie HD v6.0.1 from the bundled software disk onto my new iMac and MacBook Pro, which are running 10.6.7, and it will not load on either.
    If by "bundled software" you mean the disc that came with a different computer, you may be able to extract iMovie 06 using Pacifist:
    http://www.charlessoft.com/
    If you have the iLife 06 disc, it should install on any Macintosh with a G4 processor or newer.
    You can purchase iLife 06 on Amazon or eBay; it will cost more than iLife 11.

  • Which "Presentation Software" Imports Best?

    Hello,
    I am looking to create videos in iMovie which incorporate slides from PowerPoint, Keynote, Camtasia, whichever. What I've found is that it's very hard to get text to appear clear no matter which you use (I export/import those files as movies into iMovie). I've even tried exporting individual slides as pictures to then import through iPhoto; still blurry. Titles don't provide enough options in terms of amount of content or placement on screen; same with mattes created in Photoshop, etc.
    Any ideas are much appreciated!

    Klaus' method of producing high quality slide show movies/DVDs:
    There are many ways to produce slide shows using iPhoto, iMovie, or iDVD, but they all have one thing in common: they reduce the quality of the photos to that of a movie still frame, and sometimes limit the number of photos you can use.
    If what you want is what I want, namely to be able to use high resolution photos (even 300 dpi TIFF files), to pan and zoom individual photos, use a variety of transitions, add and edit music or commentary, and end up with a DVD that looks good on both your Mac and a TV (in other words, an end result that does not look like an old-fashioned slide show from a projector), you may be interested in how I do it. You don't have to do it my way, but the following may be food for thought!
    Firstly you need proper software to assemble the photos, decide on the duration of each, the transitions you want to use, and how to pan and zoom individual photos where required, and add proper titles. For this I use Photo to Movie. You can read about what it can do on their website:
    http://www.lqgraphics.com/software/phototomovie.php
    (Other users here use the alternative FotoMagico: http://www.boinx.com/fotomagico/homevspro/ which you may prefer - I have no experience with it.)
    Neither of these are freeware, but are worth the investment if you are going to do a lot of slide shows. Read about them in detail, then decide which one you feel is best suited to your needs.
    Once you have timed and arranged and manipulated the photos to your liking in Photo to Movie, it exports the file to iMovie 6 as a DV stream. You can add music in Photo to Movie, but I prefer doing this in iMovie where it is easier to edit. You can now further edit the slide show in iMovie just as you would a movie, then send it to iDVD 7, or Toast, for burning.
    You will be pleasantly surprised at how professional the results can be!

  • File import best practice

    I need some outside input on a process. We get a file from a bank, and I have to take it and move it along to where it needs to go. Pretty straightforward.
    The complexity is the import process from the bank. It's a demand-pull process where an exe needs to be written that pulls the file from the bank and drops it into a folder. My issue is that they want me to kick the exe off from inside SSIS and then use a file watcher to import the file into a database once the download is complete. My opinion is that the SSIS package that imports the file and the exe that gets the file from the bank should be totally divorced from each other.
    Does anybody have an opinion on the best practice for how this should be done?

    Here it is: http://social.msdn.microsoft.com/Forums/sqlserver/en-US/bd08236e-0714-4b8f-995f-f614cda89834/automatic-project-execution?forum=sqlintegrationservices
    Arthur My Blog

  • Importing best practices baseline package (IT) ECC 6.0

    Hello,
    I hope this is the right forum.
    I have an SAP release ECC 6.00 with ABAP stack 14.
    In this release I have to install the preconfigured smartforms that are now called
    Best Practices Baseline Package. These packages are localized, and mine is for Italy:
    SAP Best Practices Baseline Package (IT)
    The documents about the installation say that the required support package level has to be stack 10.
    It also says:
    "For cases when the support package levels do not match the Best Practices requirements, especially when HIGHER support package levels are implemented, only LIMITED SUPPORT can be granted"
    (Note 1044256)
    In your experience, is it possible to do this installation at this support package level?
    Thanks
    Regards
    Nicola Blasi

    Hi,
    A company wants to implement the preconfigured smartforms in an ECC 6.0 landscape.
    I think these smartforms can be implemented using SAP Best Practices, in particular the Baseline Package (see service.sap.com/bestpractices -> Baseline Package); once it is installed, you can configure the scenario you want.
    The package to download differs by localization, for example Italy or another country, but this is not important at the moment.
    The problem is Note 1044256: it says that to implement this, I must have the support package level requested in the note, not lower and above all not higher.
    Before starting with this Baseline Package installation, I'd like to know if I can do it, because I have an SP level of 14 for ABA and BASIS, for example, while the note says it wants an SP level of 10 for ABA and BASIS.
    What can I do?
    I hope it is clear now... let me know.
    Thanks
    Nicola

  • iPhoto photo import - best image type for max resolution (jpg vs tiff)

    What is the recommended image type for maximum resolution of photos brought into iPhoto albums, given that I will be enlarging them? (I.e., assuming everything brought into iPhoto is set/reduced to a standard resolution, jpg might allow more digital data for later enlargement; on the other hand, tiff is often recommended for best resolution, though it requires more space.) Thanks, - D

    Thanks, Terence. OK, let's assume I shoot a 5-meg photo of a painting and adjust the parallax etc. in Photoshop so it is perfectly set inside the rectangular format. Now I want to import it into iPhoto, knowing that I will use the crop tool to make two blow-up enlargements to go along with the original full-size image: a total of 3 related images of the artwork. As you mentioned, these will eventually go into iWeb as part of a series inside an album, with a hyperlink connecting the two enlargements to the full-size image, so viewers can tap to see finer detail and then go back to the overall composition. My question is: for the original photo image of the painting, should I drag a 10-meg tiff into iPhoto or a 2-meg jpg? (I do not know what importing the image into iPhoto does to the original in terms of final image resolution and data size. Will a 2-meg jpg provide sufficient data to work with and enlarge via cropping, or would the 10-meg tiff be better as a starting point? Or, if everything gets translated to a certain pre-ordained size anyway, would the tiff get watered down to, say, a 2-meg-size image anyway?) Thanks, - D

  • Limited Access to MiniDV camcorder, need to import best

    I've had many versions of Photoshop Elements, but have just purchased Adobe Premiere Elements 10 (I have never used any Premiere/Premiere Elements software).
    I have about 50 miniDV tapes, but no camcorder.  Recently, I talked an acquaintance into lending me their camcorder, which will play these tapes; that really took some talking, as I'm not a friend, just an acquaintance.
    I ordered and *just* received a Canopus/Grass Valley ADVC110 bidirectional analog/digital video converter.  I intend to convert the tapes to DVDs, but with some editing, removing some scenes, etc.
    I had intended to read forum postings, view the tutorials, experiment, etc.  BUT now the person wants their camcorder back after this weekend, so I need to import all the tapes this weekend.  So, with no time to learn anything, what's the best way to import the tapes into a format that I can play with/experiment with later?  Like I said, I had intended to import various ways, experimenting until I found what works best.  But now I need a "huge, uncompressed" file, or whatever; I'm not even sure what I need.
    I wasn't expecting them to want their camcorder back anytime soon, so I got kind of blindsided (long story).  When I was looking on eBay before, camcorders like this were either too expensive, or you were taking chances as to whether they worked or not.
    I know the usual answer would be to "read the manual", view tutorials, etc., but I'm going to be hard pressed just to import the tapes.  Even though it's a US holiday weekend, I have to work it.
    Any tips would be appreciated, thanks.
    RedDog

    You don't say... IF you run Windows 7, you need to do this:
    Legacy Driver and Capture http://forums.adobe.com/thread/869277
    - And a Picture http://forums.adobe.com/thread/727755
    IF you have problems figuring out how to use Premiere Elements, there are a couple of other programs that are single-purpose, so MAYBE easier to use... I think they are free, but I don't use them, so I'm not 100% sure.
    I have NOT used either, but many say to try these for SD capture: http://windv.mourek.cz/ or http://www.exsate.com/products/dvcapture/
    Be aware that DV AVI is "about" 13 Gig per hour, so you need LOTS of space on your 2nd, dedicated data drive... if you don't have a dedicated data drive, you NEED one.
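The "13 Gig per hour" figure makes it easy to estimate the space a whole collection needs; a quick sketch for the 50-tape scenario from the question (assuming standard 60-minute miniDV tapes, which is an assumption of mine, not stated in the post):

```python
# Rough disk-space estimate for capturing miniDV tapes as DV-AVI,
# using the ~13 GB/hour figure quoted above. Tape length is assumed.
gb_per_hour = 13
tape_count = 50          # the poster's collection
hours_per_tape = 1.0     # assume standard 60-minute miniDV tapes

total_gb = gb_per_hour * tape_count * hours_per_tape
print(f"about {total_gb:.0f} GB needed")  # prints: about 650 GB needed
```

That comfortably exceeds a single small drive, which is why a large dedicated data drive (like drive 3 in the layout below) matters for this job.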
    My 3 hard drives are configured as... (I built a computer for Win7... you need at least 2 hard drives... a BIG one for your data files)
    1 - 320Gig Boot for Win7 64bit Pro and ALL program installs (2)
    2 - 320Gig data for Win7 paging swap file and video project files
    3 - 1Terabyte data for all video files... input & output files (1)
    (1) for faster input/output with 4 drives
    - use drive 3 for all source files
    - use drive 4 for all output files
    (2) only 60Gig used, for Win7 & CS5 MC & MS Office & other smaller programs
    Search Microsoft to find out how to redirect your Windows paging swap file
    http://search.microsoft.com/search.aspx?mkt=en-US&setlang=en-US
    And... some other reading
    Steve's Basic Training Tutorials http://forums.adobe.com/thread/537685
    FAQ http://forums.adobe.com/community/premiere_elements/premiere_elements_faq
    TIPS http://forums.adobe.com/community/premiere_elements/premiere_elements_tips
    http://www.amazon.com/Muvipix-com-Guide-Premiere-Elements-Version/dp/1466286377/
    User Guide PDF http://help.adobe.com/en_US/premiereelements/using/index.html
    Right click the PDF link in the upper right corner and select to save to your hard drive

  • Newbie with common best practice questions

    I swear I have looked through the Oracle docs but have not found
    the answers. These are a few burning questions I have. I have
    a dev system: Red Hat 7.2, a single 80GB HD, and 1GB of RAM. I am
    using 9i for now but may move back to 8iR3. The application is
    a small database with usage similar to that of a data warehouse
    (three users, 80,000 records per table, record size 128, 2,000
    tables; not my schema).
    1) Should the DB_DOMAIN contain the machine name? I.e., for machine M1: M1.sales.US.Oracle.com, so the full names would be ORCL01.M1.Sales.US.Oracle.com and ORCL02.M1.Sales.US.Oracle.com?
    2) What's up with the JRE? My DB runs 200MB of JRE. If I run the Change Management Pack or an OEM Repository, this can go as high as 800+MB. Is this normal? Is it the same on Sun and NT (jrew), or is this just a limit of Linux?
    3) Why do I need an OEM Repository to run the Export and Import tool?
    4) If I use an OEM Repository, should it exist in the same instance as a DB already on the machine, in a new instance, or on a separate machine? I know a separate machine would be best, but it would require another license that I can't afford.
    5) Why can I not select the OLAP tools in the DBCA? I have Enterprise, but it will never install.
    6) If I did get OLAP running, would it be better to have it in its own instance or in the same instance as my primary database? Does OLAP rely on the JRE like the other tools? I have been told OLAP in 9i is views and other tricks, not stored cubes and rollups; can I create standard cubes and rollups on a scheduled basis? (Only asking this because I have been unable to get OLAP running.)
    7) Is DBSNMP needed for the agent to work if I only use OEM and no SNMP tools?
    8) Is it better to use LDAP (Oracle Internet Directory) on such a small system?
    9) If I had a RAID5 array, would this be enough to stop contention, or should I put data on the RAID, indexes on IDE 1 with a mirror, and the system on a SCSI drive?
    Thanks for any answers. I am sure they are in a FAQ somewhere, but I have been unable to find it. If someone could point me to a good tech FAQ I would love it.
    Thanks Again
    A

    Try posting them in the various newsgroups as well:
    comp.databases.oracle.server
    comp.databases.oracle.tools
    comp.databases.oracle.misc
    You may have better luck there.
    hth
