Best practice: let ASM name your datafiles?

Hello,
Wanted to get opinions on what DBAs prefer: let ASM handle naming, or explicitly name your datafiles?
+DATA/orcl/datafile/products.133.84672930
or
+DATA/orcl/datafile/products01.dbf
Reasons? Ease of maintenance, backup/recovery, etc.?
Thanks

Hello,
Please refer to this: http://docs.oracle.com/cd/B13789_01/server.101/b10739/omf.htm
Your questions are answered in that document. I have most of my databases on OMF and I leave the naming to Oracle.
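To make the difference concrete, here is a minimal sketch of the two approaches (the disk group and names come from the post; sizes are illustrative only, and the two CREATE TABLESPACE statements are alternatives, not meant to be run together):

    -- Option 1: OMF. Point the instance at the disk group and let Oracle
    -- generate names such as +DATA/orcl/datafile/products.133.84672930
    ALTER SYSTEM SET db_create_file_dest = '+DATA';
    CREATE TABLESPACE products DATAFILE SIZE 10G AUTOEXTEND ON;

    -- Option 2: explicit, user-chosen name (stored as an ASM alias)
    CREATE TABLESPACE products
        DATAFILE '+DATA/orcl/datafile/products01.dbf' SIZE 10G AUTOEXTEND ON;

With OMF, Oracle also removes the underlying file automatically when you drop the tablespace, which is one of the maintenance arguments usually made for letting ASM/OMF do the naming.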

Similar Messages

  • Best Practice on Not Exposing your internal FQDN to the outside world

    Exchange Server 2010, sitting in a DMZ, internet-facing. The server is currently using the Default Receive Connector, which exposes the internal FQDN to the outside world (EHLO). Since you should not (and can't) change the FQDN on your Default Receive Connector, what is the best practice here?
    The only solution I can see is the following:
    1. Change the Network on the Default Receive Connector to only internal IP addresses.
    2. Create a new Internet Receive Connector port 25 for external IP addresses (not sure what to put in Network tab?) and use my external FQDN for ehlo responses (e.g. mail.domain.com)
    3. What do I pick for Auth and Permissions: TLS and Anonymous only?
    Michael Maxwell

    Yes, it fails PCI testing/compliance. I shouldn't be able to see my internal server and domain. I understand that is the recommendation, but my client doesn't want to host in the cloud or go with a Trend IHMS (trust me, I like that better, but it's not my choice). I have to work with the deck of cards dealt to me. Thanks, just want a solution with what I have now.
    Michael Maxwell
    Understood. I won't go into the value of those tests :)
    If the customer is really concerned about exposing the internal name, then create a new receive connector with a different FQDN (and a corresponding cert) for anonymous connections as you mention above. Know that it also means internal clients can connect to the server on port 25 as well if you don't have the ability to scope it to a set of IP addresses (i.e. an SMTP gateway).
    The internal names of the servers will also be in the internet headers of messages sent out:
    http://exchangepedia.com/2008/05/removing-internal-host-names-and-ip-addresses-from-message-headers.html
    http://www.msexchange.org/kbase/ExchangeServerTips/ExchangeServer2007/SecurityMessageHygiene/HowtoremoveinternalservernamesandIPaddressesfromSMTPheaders.html

  • The best practice when backing up your files

    Hi,
    I recently started using Carbon Copy Cloner after using only Time Machine as my backup solution. I do understand the purpose of each application (TM and a cloner utility such as SuperDuper or CCC), but I was wondering what the best process is when using these two methods to back up your files.
    For instance, I use TM to back up my files as frequently as possible to keep all my recent changes covered, but I don't see how I would keep my clone updated and make sure that when something happens I will have a workable boot disk and not something that contains corrupted files. In other words, these cloner utilities have features to keep your clone drive updated every time something changes, but that got me wondering: what if you update your clone drive and one of the updated files is corrupted, without you knowing it of course? Now you have a backup that contains bad files and may affect your system (corrupted font files, etc.), and when you realize that something is not working right and want to recover from your clone, you basically end up with the same problem because the bad files were also backed up.
    How do you ensure that your clone will always be ready and that it will not contain bad files?
    What is your backup process?  Be so kind and share your method.
    Again, I'm ok with TM; I just need to know how you guys are managing your clone drives using either SuperDuper or Carbon Copy Cloner.
    Thanks a lot!

    I use CCC exclusively and update my clones every couple of days or after installing updates. I have no use for TM, since I've never had to go back and get something I deleted or changed. I do, however, boot into those clones on a routine basis to ensure that they look and act like the originals. This has worked for over seven years, but YMMV.

  • Best Practices To Size the Datafile on ASM in Oracle 11gR2

    Hi DBAs,
    I need to know what the next extent size for datafiles on ASM should be. I have created an 11gR2 database with ASM. Now some certified DBA told me to create the datafiles as small as possible and make them autoextend with a next extent size of 1M. The expected data growth is about 20-25 GB per month.
    Please let me know the best practices for sizing datafiles and extents in 11g using ASM. Any good document/note reference would be a great help.
    Thanks
    -Samar-

    Hi Samar,
    >I need to know what the next extent size for datafiles on ASM should be. I have created an 11gR2 database with ASM. Now some certified DBA told me to create the datafile as small as possible and make it autoextend with a next extent size of 1M. The expected data growth every month is about 20-25 GB.
    >Please let me know the best practices for sizing the datafiles and extents in 11g using ASM. Any good document/note reference would be a great help.
    I don't think there is any kind of recommendation for that; at least, I've never seen one. As you probably know, ASM already divides ASM files (any sort of file written to an ASM diskgroup) into extents, and those extents are composed of AUs. So no matter the datafile size, ASM will balance the I/O across all ASM disks in a diskgroup.
    I guess this thread ( AU_SIZE and Variable Extent Size ), even though it is not exactly the subject of your question, will help you.
    I do believe there are more important factors for ASM performance, like stripe size, LUN size, storage configuration, etc.
    Hope it helps,
    Cerreia
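    For what it's worth, the two knobs discussed above look roughly like this. This is a hedged sketch only: the disk paths, the 4M AU, and all sizes are illustrative assumptions rather than recommendations, and the CREATE DISKGROUP statement is run in the ASM instance:

      -- AU size is a disk group attribute and can only be set at creation time.
      CREATE DISKGROUP data EXTERNAL REDUNDANCY
          DISK '/dev/rdsk/c0t0d0s4', '/dev/rdsk/c0t1d0s4'   -- placeholder device paths
          ATTRIBUTE 'au_size' = '4M', 'compatible.asm' = '11.2';

      -- With roughly 20-25 GB of growth per month, a 1M NEXT increment means
      -- near-constant file extensions; a larger increment with an explicit cap
      -- is easier to manage.
      CREATE TABLESPACE app_data
          DATAFILE '+DATA' SIZE 10G AUTOEXTEND ON NEXT 1G MAXSIZE 30G;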

  • Best practice for searching on surname/lastname/name in Dutch

    I'm looking for a best practice to store names of persons, but also names of companies, in my database.
    I always store them as is (seems logical since you need to be able to display the original input-name) but I also want to store them transformed in some sort of way so I can easily search on them with LIKE! (Soundex, Metaphone, Q-Gram, ...)
    I know SOUNDEX and DIFFERENCE are included in SQLServer, but they don't do the trick.
    If somebody searches for the phrase "BAKKER", you should find names like "Backer", "Bakker", ... but also "De Backer", "Debecker", ... and this is where SOUNDEX fails ...
    Does anyone know of websites to visit, or has someone already written a good function to transform a string, that I can use both to store the names and to transform my search data?
    (Example:  (Pseudo lang :-))
    static string MakeSearchable(string sString)
    {
        sString = sString.Replace(" ", "");   // Remove spaces
        sString = sString.Replace("CK", "K");
        sString = sString.Replace("KK", "K");
        sString = sString.Replace("C", "S");
        sString = sString.Replace("SS", "S");
        return sString;
    }
    Greetz,
    Tim

    Thanks for the response, but unfortunately the provided links are not much help:
    - The first link is about an article I don't have access to (I'm not a registered user)
    - The second link is about Integration Services. This is nice for integration work, but I need the functionality within a frontend.
    - The third link is for use in Excel.
    Maybe I'm looking for the wrong thing when wanting to create an extra column with "cleaned" up data. Maybe there's another solution from within my frontend or business layer, but I simply want a textbox on a form where users can type a search-value like
    "BAKKER". The result of the search should return names like "DEBACKER", "DE BEKKER", "BACKER", "BAKRE", ...
    I used to work in a hospital where they wrote their own SQL-function (on an Interbase database) to do this: They had a column with the original name, and a column with a converted name:
    => DEBACKER => Converted = DEBAKKER
    => DE BEKKER => Converted = DEBEKKER
    => BACKER => Converted = BAKKER
    => BAKRE => Converted = BAKKER
    When you searched for "BAKKER", you did a LIKE operation on the converted column ...
    What I am looking for is a good function to convert my data as above.
    Greetz,
    Tim
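    In case it helps, here is a minimal T-SQL sketch of that "converted column" idea. The table dbo.Person, its LastName column, and the replacement rules are placeholders rather than anything from this thread; a real implementation would encode proper Dutch phonetic rules (and would still miss transpositions like BAKRE):

      -- Placeholder normalization; swap in real Dutch phonetic rules as needed.
      CREATE FUNCTION dbo.MakeSearchable (@name NVARCHAR(200))
      RETURNS NVARCHAR(200)
      WITH SCHEMABINDING   -- needed so the computed column below can be PERSISTED
      AS
      BEGIN
          DECLARE @s NVARCHAR(200) = UPPER(@name);
          SET @s = REPLACE(@s, ' ',   '');    -- "DE BACKER" -> "DEBACKER"
          SET @s = REPLACE(@s, 'CK',  'KK');  -- "BACKER"    -> "BAKKER"
          SET @s = REPLACE(@s, 'C',   'K');
          SET @s = REPLACE(@s, 'KKK', 'KK');  -- collapse triples created above
          RETURN @s;
      END;
      GO

      -- Store the converted value once, then search it with LIKE.
      ALTER TABLE dbo.Person
          ADD SearchName AS dbo.MakeSearchable(LastName) PERSISTED;

      SELECT LastName
      FROM   dbo.Person
      WHERE  SearchName LIKE N'%' + dbo.MakeSearchable(N'BAKKER') + N'%';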

  • Datafile sizing best practice ??

    Hi,
    I have a Tablespace of size 82 GB with 3 datafiles. The datafile config is given below.
    'prod01.dbf' SIZE 33554112K AUTOEXTEND ON NEXT 8K MAXSIZE UNLIMITED'
    'prod02.dbf' SIZE 33554416K AUTOEXTEND ON NEXT 20M MAXSIZE UNLIMITED'
    'prod03.dbf' SIZE 18064M AUTOEXTEND ON NEXT 20M MAXSIZE UNLIMITED'
    My server config is given below.
    HP AMD 4xCPU 6 cores blade server
    Sun Solaris 10 - 64 bit
    10g R2 10.2.0.4 64 bit
    Native local HDD 2x300GB having RAID 1
    What is the best practice for sizing the above datafiles to attain maximum performance?
    Regards,
    Ashok Kumar.G

    I have a Tablespace of size 82 GB with 3 datafiles. The datafile config is given below.
    'prod01.dbf' SIZE 33554112K AUTOEXTEND ON NEXT 8K MAXSIZE UNLIMITED'
    'prod02.dbf' SIZE 33554416K AUTOEXTEND ON NEXT 20M MAXSIZE UNLIMITED'
    'prod03.dbf' SIZE 18064M AUTOEXTEND ON NEXT 20M MAXSIZE UNLIMITED'
    You can have different extent settings on each datafile, but it depends on the RAID level performance too.
    HP AMD 4xCPU 6 cores blade server
    Sun Solaris 10 - 64 bit
    10g R2 10.2.0.4 64 bit
    Native local HDD 2x300GB having RAID 1
    What is the best practice for sizing the above datafiles to attain maximum performance?
    Check this note:
    *I/O Tuning with Different RAID Configurations [ID 30286.1]*
    http://www.dba-oracle.com/t_datafile_management.htm
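    As a concrete illustration (values are examples only, not recommendations): an 8K NEXT increment forces a file extension on almost every space request, so a larger increment and an explicit ceiling are usually easier to live with. Use the full file names as recorded in DBA_DATA_FILES; the short names below simply follow the post:

      ALTER DATABASE DATAFILE 'prod01.dbf' AUTOEXTEND ON NEXT 256M MAXSIZE 32767M;
      ALTER DATABASE DATAFILE 'prod02.dbf' AUTOEXTEND ON NEXT 256M MAXSIZE 32767M;
      ALTER DATABASE DATAFILE 'prod03.dbf' AUTOEXTEND ON NEXT 256M MAXSIZE 32767M;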

  • Best Practices - Telco - PM

    Dear All,
    I am in need of some "best practices" in asset management in the telecommunications industry.
    One of my LE clients would like to implement asset management. The concentration will be PM – equipment tracing & tracking in the system. A "best-practice" asset management system in their SAP ECC is the idea.
    An insight into best practices of Plant Maintenance and equipment trace & tracking in the telecommunications industry is needed.
    •     Asset coding/naming design in the telecommunications industry (especially for "network assets" – if there is some kind of best practice for how to name the assets (hierarchies, naming conventions, etc.))
    •     Insight into Plant Maintenance's core functionality in Telco
    •     Any tracing & tracking system proposal – barcode and/or RFID technologies for telecommunication asset management – any insight into partners working for this purpose.
    They are specifically interested in Deutsche Telekom
    Best regards
    Yavuz Durgut - SAP Turkey

    Hi,
    You have a good start.  What you need to do is 1.) Find out what the requirements are -- what does your user want.... if this is a fact finding mission (e.g., they want to see what's in the system) then your requirement becomes load the data in R/3, so figure out what they configured in PM and use those definitions as your requirements.   2.) Use those requirements to find data in the fields listed in the MultiProvider, InfoCubes, or DataStore Objects sections in your first link ... in other words, now that you know what data to look for, look for it in the Data Targets (MP's, Cubes and DSO's).  If you find some of the data you want, then trace back the infosources and determine what datasources in R/3 load the data you are looking for.  After all that, check those datasources for any additional fields you may need and add them in.
    So, if your company doesn't maintain equipment costs or maintenance costs for equipment, then you don't have to worry about 0PM_MP04. Use this type of logic to whittle down to what you really need and want, then activate only those objects.
    Good Luck,
    Brian

  • Best practices to reduce downtime for Database releases(rolling changes)

    Hi,
    What are best practices to reduce downtime for database releases on 10.2.0.3? What DB changes can be rolling and what can't?
    Thanks in advance.
    Regards,
    RJiv.

    I would be very dubious about any sort of universal "best practices" here. Realistically, your practices need to be tailored to the application and the environment.
    You can invest a lot of time, energy, and resources into minimizing downtime if that is the only goal. But you'll generally pay for that goal in terms of developer and admin time and effort, environmental complexity, etc. And you generally need to architect your application with rolling upgrades in mind, which necessitates potentially large amounts of redesign to existing applications. It may be perfectly acceptable to go full-bore into minimizing downtime if you are running Amazon.com and any downtime is unacceptable. Most organizations, however, need to balance downtime against other needs.
    For example, you could radically minimize downtime by having a second active database, configuring Streams to replicate changes between the two master databases, and configuring the middle tier environment so that you can point different middle tier servers at one or the other database. When you want to upgrade, you point all the middle tier servers at database A, other than one that lives on a special URL. You upgrade database B (making sure to deal with the Streams replication environment properly, depending on requirements) and do the smoke test against the special URL. When you determine that everything works, you configure all the app servers to point at B (with the Streams replication process configured to replicate changes from the old data model to the new data model), upgrade A, repeat the smoke test, and then return the middle tier environment to the normal state of balancing between databases.
    This lets you upgrade with 0 downtime. But you've got to license another primary database. And configure Streams. And write the replication code to propagate the changes on B during the time you're smoke testing A. And you need the middle tier infrastructure in place. And you're obviously going to be involving more admins than you would for a simpler deploy where you take things down, reboot, and bring things up. The test plan becomes more complicated as well since you need to practice this sort of thing in lower environments.
    Justin

  • Best practice for maintaining URLs between Dev, Test, Production servers

    We sometimes send order confirmations which include links to other services in requestcenter.
    For example, we might use the link <a href="http://#Site.URL#/myservices/navigate.do?query=orderform&sid=54">Also see these services</a>
    However, the service ID (sid=54) changes between our dev, test, and production environments.  Thus we need to manually go through notifications when we deploy between servers.
    Any best practices out there?

    Your best practice in this instance depends a bit on how much work you want to put into it at the front end and how tied to the idea of a direct link to a service you are.
    If your team uses a decent build sheet and migration checklist then updating the various URLs can just be part of the process. This is cumbersome, but it’s the least “technical” solution if you want to continue using direct links.
    A more technical solution would be to replace your direct links with links to a “broker page”. It’s relatively simple to create an ASP page that accepts the name of the service as a parameter, executes an SQL query against the DB to return the ServiceID, constructs the appropriate link, and passes the user through.
    A less precise, but typically viable, option would be to use links that take advantage of the built in search query functionality. Your link might display more results than just one service but you can typically tailor your search query to narrow it down. For example:
    If you have a service called Order New Laptop or Desktop and you want to provide a link that will get the user to that service you could use: http://#Site.URL#/RequestCenter/myservices/navigate.do?query=searchresult&&searchPattern=Order%20New%20Desktop%20or%20Laptop
    The above would open the site and present the same results as if the user searched for “Order New Desktop or Laptop” manually. It’s not as exact as providing a direct link but it’s quick to implement, requires no special technical expertise and would be “environment agnostic”.
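    If you go the broker-page route, the lookup itself is only a few lines. A sketch under assumptions: ServiceDefinition, ServiceName, and ServiceID are placeholder names rather than the actual RequestCenter schema, and the parameter comes from the broker page:

      -- Resolve a stable service name to the environment-specific ServiceID,
      -- then redirect to .../myservices/navigate.do?query=orderform&sid=<ServiceID>
      SELECT ServiceID
      FROM   ServiceDefinition
      WHERE  ServiceName = @ServiceName;

    Because the name stays the same across Dev, Test, and Production, the notification links never need to change when you deploy.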

  • Best Practices for Professional video editing

    Hi
    I'd like to know your thoughts on the most professional/efficient method for editing. At the moment, I archive all the footage from a DV tape through iMovie (I just find iMovie easier for doing this), save/archive all the imported segments of clips I need, name them, then import them into FCP.
    When I finish an edit I export an uncompressed QuickTime movie, then back up the entire project on an external drive.
    Is this good practice? Should I export the final edit to tape?
    I've just started out as a video-maker as a paid profession and I'd like to know the most 'by the book' methods.
    Thanks
    G5 Dual   Mac OS X (10.4.8)  

    Sounds to me like you're doing a whole lot of extra steps using iMovie as your import. You're going to lose some of FCP's best media features by not digitizing with FCP. Batch Capture in FCP isn't hard to learn.
    I wouldn't say there's any "rulebook" for professional editors. We all work a little differently, but here are some of my "best practices":
    Always clearly name and label all of the tapes that you are using in a fashion that makes sense to you. When I cut a large project I may have multiple tapes. If I lose a piece of media accidentally, it's easier to go back and re-digitize if I have organized the project early on.
    Clearly label bins and use them wisely. For example, on a small project I might have a "video" bin, a "music" bin and a "graphics" bin. This saves searching through one large bin.
    On larger projects, I try to think ahead to how I will edit and make bins accordingly. For example I might have bins as follows, interviews, b-roll location a, b-roll location b and so on. Then I'll have music bins, animation bins and still graphic bins. I generally try to save all to one hard drive which saves me looking through three or four drives. This isn't always possible depending upon the size of the project.
    As for backup: lots of people buy hard drives for each project and then store them until they need them next. Of course, keep all of your raw footage and you can always re-digitize.
    When I'm done with a project I save the completed project to tape...this is for dubs and library. I save the FCP information on a DVD and I burn the media from the drive, because I can't afford multiple hard drives. I would rather re-digitize my raw if I need to re-do the project in the future.
    That's how I do it, but other editors have other methods. I would highly suggest digitizing in FCP and not iMovie, but that's entirely up to you. You're not doing anything "wrong."
    G4 Dual Processor   Mac OS X (10.4.1)  

  • Transport Best Practices - Cumulative Transports

    Hi All,
    I am looking for some sort of authoritative guide from SAP on ECC Transport Best Practices, especially around merging/combining/accumulating multiple transports into fewer ones. We are a very large project, but we haven't adopted CVS-like tools such as ChaRM, so we still deal with individual transports.
    The reason I am asking is that we have some development leads on our project who insist that ALL transports that leave the Development system must be imported into Production. They claim this is the SAP best practice. An SAP consulting review of our system also left a vague note that "we are orphaning transports". This could mean that 1. we are not importing all the stuff that leaves the Dev system, or 2. we are not keeping track of our code changes across all environments. Proponents of "all transports must get to PRD" are interpreting it as "1".
    I have my team cumulate transports for subsequent changes into a newer transport and only take the newest transport to Production. The continuous rolling of the old transport into the new one using the SE01 "Include Objects" option ensures that all changes that are part of the current development are in a single TR. Fewer transports mean less housekeeping across all systems, and less chance of something going out of sequence or being missed. This is for Workbench transports only; I understand Config transports could get a little tricky with rolling in.
    If you can't point me to a link, what is your take on "send everything to Prod" vs. "combine changes into fewer transports"? I can't think of any software packaging methodology that suggests putting everything, including some junk/crap, into the production build. I have looked at SAP enhancement packs, SPS, and Notes. I haven't found any evidence of SAP including older buggy code in what it releases to its customers.
    Thank you all!

    Jānis, Christian,
        I think we are all on the same page.  Let me clarify my specific scenario a little bit more.
    We are a team of about 15 ABAP developers doing production support. We don't do huge changes. Our code updates are mostly limited to a handful of objects (on average 2-5). We often have multiple iterations on the same objects. For large-scale development, this approach may not work very well; those projects should really utilize SolMan or other CVS tools.
    Here is how I have my team put together the final transport.
    step 1. Change is done to object X. Transport #1 created and released after standard checks.
    step 2. More change is needed for object X. Transport #2 started. Transport #1 is brought in using SE01 "Include Objects" at the main task. Changed objects are in lower tasks. This way, I can tell what was brought over and what really changed this time. Release of Transport #2 inspects all objects and ensures all objects from #1 are included in #2, and that there are no other changes between #1 and #2 to any of the objects. This is a very easy check, mostly from Version History.
    Step 3. More changes needed to object X and Y. Transport #3 started.  Transport #2 brought in at main task. Same check from Step 2 is done to ensure no other changes exist.
    step 4....6. Step 6 ended at Transport #6.  All changes verified in QA system.
    Only the Transport #6 needs to be sent to Production since previous Transports 1 to 5 were rolled into #6.
    Jānis, the deletions will be covered automatically just like standard SAP.  No special manual steps needed.
    Christian,
    The transport of copies works in a similar way. The only difference is that the main/cumulative transport is released at the very end, possibly after QA tests have been confirmed.
    I had thought about doing copies vs. cumulative. I preferred to have the transport already in QA at the time of approval; I would have a hard time explaining to our client why a transport was released out of Dev after all tests were done and signed off.
    However, copies have the advantage that intermediate versions or parallel changes to the same objects are easily recognized upfront, vs. us having to check each transport before release.
    Jānis, the "copies of transport" approach also creates extra versions in version history, just like manually adding objects to a transport.
    My quick comparison between copies and cumulative transports only turned up one difference: 'copies' have a different flag/attribute in table E070. I am sure the flag means something to SAP, but for version history it makes no difference.
    Regardless of whether I cumulate or not: based on your experience, have you come across the notion that everything that leaves Dev must get to PRD as an SAP best practice? What would your reply be if someone insists that is the only way to move code to production?

  • New Best Practice for Titles and Lower Thirds?

    Hi everyone,
    In the days of overscanned CRT television broadcasts, the classic Title Safe restrictions and the use of larger, thicker fonts made a lot of sense. These practices are described in numerous references and forum posts.
    Nowadays, much video content will never be broadcast, CRTs are disappearing, and it's easy to post HD video on places like YouTube and Vimeo. As a result, we often see lower thirds and other text really close to the edge of the frame, as well as widespread use of thin (not bold) fonts. Even major broadcast networks are going in this direction.
    So my question is, what are the new standards? How would you define contemporary best practice?
    Thanks for your thoughtful replies!
    Les

    stuckfootage wrote:
    I wish I had a basket of green stars...
    Quoted for stonedposting.
    Bzzzz, crackle..."Discovery One, what is that object?
    Bzz bzz."Not sure, Houston, it looks like a basket...." bzzz
    Crackle...."A bas...zzz.. ket??"
    Bzzz. "My God, It's full of stars!" bzz...crackle.
    Peeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeep!

  • Best Practice for SUP and WSUS Installation on Same Server

    Hi Folks,
    I have a question, I am in process of deploying SCCM 2012 R2... I was in process of deploying Software Update Point on SCCM with one of the existing WSUS server installed on a separate server from SCCM.
    A debate has started with a colleague who says that using a remote WSUS server is recommended by Microsoft for scalability and security: WSUS would download the updates from Microsoft and SCCM would work as a downstream server fetching updates from the WSUS server.
    But in my view it is recommended to install the WSUS server on the same server where SCCM is installed... actually it is recommended to install WSUS on a site system, and you can use the same SCCM server to deploy WSUS.
    Please advise me on the best practices for deploying SCCM and WSUS... what does Microsoft say: should WSUS be installed on the same SCCM server, or should WSUS be on a separate server from the SCCM server?
    Awaiting your advice ASAP :)
    Regards, Owais

    Hi Don,
    Thanks for the information. Another quick one: is the above-mentioned configuration I did correct in terms of planning and best practices?
    I agree with Jorgen, it's ok to have WSUS/SUP on the same server as your site server, or you can have WSUS/SUP on a dedicated server if you wish.
    The "best practice" is whatever suits your environment, and is a supported-by-MS way of doing it.
    One thing to note, is that if WSUS ever becomes "corrupt" it can be difficult to repair and sometimes it's simplest to rebuild the WSUS Windows OS. If this is on your site server, that's a big deal.
    Sometimes, WSUS goes wrong (not because of ConfigMgr)..
    Note that if you have a very large estate, or multiple primary site servers, you might have a CAS, and you would need a SUP on the CAS. (this is not a recommendation for a CAS, just to be aware)
    Don

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3 – 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch Jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    5.     Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
    Logistics & How to Register
    Nov. 3 – 5: SAP America, Downers Grove, IL
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 8AM – 3PM
                     Address:
                     SAP America – Buckingham Room
                     3010 Highland Parkway
                     Downers Grove, IL USA 60515
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultant and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch Jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
    October 06 – 08: Berlin, Germany
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultant and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please follow the hyperlink below
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald
