Migrating printers from XP to Win7. Best Practice?

I am currently using PRINTMIG to migrate mapped printers from XP to Win7 on workstations, but I am experiencing failures mapping Dell and Ricoh printers. I just read that Printbrm is supposed to replace PRINTMIG, but does it also work with XP?
I'm also curious what others are currently using to migrate and remap printers from XP to Win7.
tconners

I don't know that I've ever found a really great way to do printer migrations from one OS to another. For one, you often have issues going from 32-bit drivers on the XP side to 64-bit drivers on the Windows 7 and beyond side. Plus, depending on your current method of installing printers (i.e., print server with global shares vs. local port connection, TCP/IP, etc.), different challenges can arise. We've explored the idea of using a print server with a heavy tie-in to our AD environment. In this case, users would be assigned to a group, likely based on location or department, and this would map the given printers for them automatically (see the sketch below). Depending on the size and complexity of your environment, this could be a potential idea to consider.
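To illustrate that AD-group idea, here is a minimal logon-script sketch; it is not from the thread, and the server \\printsrv, the share Accounting-HP, and the group CN are placeholders you would need to adapt:

' Hedged sketch: map a shared printer based on the user's AD group membership.
' \\printsrv\Accounting-HP and "cn=accounting" are hypothetical placeholders.
On Error Resume Next
Set objSysInfo = CreateObject("ADSystemInfo")
Set objUser    = GetObject("LDAP://" & objSysInfo.UserName)
Set objNetwork = CreateObject("WScript.Network")
arrGroups = objUser.GetEx("memberOf")   ' distinguished names of the user's groups
For Each strGroupDN In arrGroups
    If InStr(LCase(strGroupDN), "cn=accounting") > 0 Then
        objNetwork.AddWindowsPrinterConnection "\\printsrv\Accounting-HP"
        objNetwork.SetDefaultPrinter "\\printsrv\Accounting-HP"
    End If
Next

Deployed as a Group Policy logon script, this works for both XP and Win7 clients, provided the print server hosts both 32-bit and 64-bit drivers for each queue.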

Similar Messages

  • Problem with Crystal Reports from the All in One Best Practice package

    Hi,
    I tried to use the Financial Statements Crystal Report downloaded from the All in One Best Practice package but encountered the errors below. I followed the guide in Manual Steps for Additional Datasource Creation to set up the additional data source. When I preview the report, I tried setting all the values to Null, but the error messages still appear. I believe this has something to do with the InfoSet, but I am not sure how to resolve it. The reports which connect to an InfoSet don't seem to work for me.
    Errors:
    Failed to retrieve data from database. (then I press OK)
    database connector error: 'no item structure data' (then I press OK)
    database connector error: 'RFC_Closed'
    Please advise
    Thank you

    Hi Afzal,
    I think you misunderstood. The Crystal Reports I am talking about are the 23 report templates I got from the All in One Best Practice package. All I need to do is follow the "Quick Guide to Implement SAP Best Practices for Business Intelligence V4.31" to make those templates work. The templates that connect to database type SAP Table, Cluster or Function work for me. The problem I am facing now is with the templates (for example, Financial Statements) that connect to database type SAP InfoSets. The errors I receive are stated in my first post.
    Please advise

  • Upgrade and migration from NetWare to Linux best practice

    Hi folks,
    We've been running NSM 2.5 for a few years on NetWare and have been very satisfied with the product and performance. We are planning to migrate to SLES11 with OES11 very soon to finally move away from NetWare, and I have some issues I can't seem to find answers to. Of course I want to use NSM to move my home directories from my NetWare server to my SLES server, so I want this working well when we migrate.
    I've searched but have not found a good resource showing how to move the NSM engine from a NetWare server to a Linux server following best practices. Anyone have experience with this, and any gotchas?
    The other question, of course, is whether to upgrade NSM to a new version before or after the move to Linux/OES11. We are at version 2.5.0.43 and want to move to the latest version. Upgrade procedure and best practices would be handy.
    The last, of course: is NSM compatible with SLES/OES11? I presume it is, and I certainly hope so, because we want to move all of our users to SLES11 NSS.

    On 5/9/2012 11:06 AM, jlauzon wrote:
    >
    > Hi folks,
    > We've been running NSM 2.5 for a few years on NetWare and have been
    > very satisfied with the product and performance. We are planning to
    > migrate to SLES11 with OES11 very soon to finally move away from
    > NetWare, and I have some issues I can't seem to find answers to. Of
    > course I want to use NSM to move my home directories from my NetWare
    > server to my SLES server, so I want this working well when we migrate.
    > I've searched but have not found a good resource showing how to move
    > the NSM engine from a NetWare server to a Linux server following best
    > practices. Anyone have experience with this, and any gotchas?
    The NSM 3.0.x Engine setup process actually handles the migration from
    NSM 2.5 fairly easily. Our 3.0 Installation Guide (available at
    http://www.novell.com/documentation/storagemanager3 ) includes all the
    information you should need regarding migration, including the
    suggestions I'll list here.
    You'll want to leave NSM 2.5 running during the migration so that the
    NSM 3.0 setup wizard can connect to that Engine and import its policies
    and pending events. You'll also want to have as few pending events as
    possible in NSM 2.5 -- deferred deletes are fine, but all pending events
    will slow down the migration process, since they'll have to be
    transferred over.
    > The other question, of course, is whether to upgrade NSM to a new
    > version before or after the move to Linux/OES11. We are at version
    > 2.5.0.43 and want to move to the latest version. Upgrade procedure and
    > best practices would be handy.
    Again, this information is available in the 3.0 Installation Guide. To
    migrate from 2.5 to 3.0.x, you'll have to have at least one OES11 server
    in your tree to install it on; but you'll also have to leave the NSM 2.5
    Engine running on its NetWare host long enough to migrate from it.
    > The last, of course: is NSM compatible with SLES/OES11? I presume it
    > is, and I certainly hope so, because we want to move all of our users
    > to SLES11 NSS.
    We are about to release version 3.0.4 of NSM, which provides full
    support for OES11 on SLES11. The NFMS Support Team can also provide you
    with builds of NSM 3.0.3 which support OES11; if you need those for
    early testing, please send an email to storagemanager[at]novell[dot]com.
    Hope this helps!
    - NFMS Support Team

  • PrintBrm.exe - Migrating printers from Server 2008 R2 to 2012 R2

    Hi everyone,
    I'm trying to migrate our printers from a Windows Server 2008 R2 to a brand new Windows Server 2012 R2. While doing this, we also want to switch the drivers to a newer version. As I had successfully tried this a year before with PrintBrm.exe (though from Server 2008 R2 to another Server 2008 R2), I chose this tool again.
    Unfortunately it fails at some point this time. The printer export and import succeed every time, but when I try to print a test page on the new server, it won't stop printing hieroglyphs... Only if I install a printer manually (same driver) does it work fine.
    Do you have any solutions or ideas for that? For better troubleshooting, I attached an image of one failed test page and the commands I used, too. Looking forward to some ideas.
    Best regards, Andreas
    Export: PrintBrm.exe -B -S \\server -F C:\Tools\filename.printerExport -nobin
    Import: PrintBrm.exe -R -F filename.printerExport -C BrmConfig.xml -O FORCE
    BrmConfig.xml:
    <BrmConfig>
      <PLUGINS>
      </PLUGINS>
      <LanguageMonitors>
      </LanguageMonitors>
      <DriverMap>
        <DRV old="HP Universal Printing PCL 6" new="HP Universal Printing PCL 6 (v5.8.0)" />
        <DRV old="HP Universal Printing PCL 6 (v5.2)" new="HP Universal Printing PCL 6 (v5.8.0)" />
        <DRV old="HP Color LaserJet CP1510 series PCL6" new="HP Universal Printing PCL 6 (v5.8.0)" />
        <DRV old="HP LaserJet P3011/P3015 PCL6" new="HP Universal Printing PCL 6 (v5.8.0)" />
      </DriverMap>
    </BrmConfig>
    (Image of the failed test page was attached to the original post.)

    Hi Andreas Remdt,
    Based on your description, did you mean that the migration process completed successfully? Would you please let me know which version of the Printbrm.exe command-line tool you used to migrate?
    In addition, please open Event Viewer and check whether you can find any related events (Source: PrintBRM). If there is any update, please feel free to let me know.
    Best regards,
    Justin Gu
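    To check those events from a command prompt, a hedged sketch; the exact channel name is an assumption, so use whatever the first command actually reports:

    rem List event logs whose names mention PrintBRM, then dump the 20 newest entries.
    rem "Microsoft-Windows-PrintBRM/Admin" is an assumed channel name; substitute the real one.
    wevtutil el | findstr /i printbrm
    wevtutil qe "Microsoft-Windows-PrintBRM/Admin" /c:20 /f:text /rd:true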

  • Unicode Migration using National Characterset data types - Best Practice ?

    I know that Oracle discourages the use of the national character set and the national character set data types (NCHAR, NVARCHAR), but that is the route my company has decided to take, and I would like to know the best practice regarding this, specifically in relation to stored procedures.
    The database schema is being converted by changing all CHAR, VARCHAR and CLOB data types to NCHAR, NVARCHAR and NCLOB data types respectively, and I would appreciate any suggestions regarding the changes that need to be made to stored procedures, and whether there are any hard and fast rules that need to be followed.
    Specific questions that I have are :
    1. Do CHAR and VARCHAR parameters need to be changed to NCHAR and NVARCHAR types?
    2. Do CHAR and VARCHAR variables need to be changed to NCHAR and NVARCHAR types?
    3. Do string literals need to be prefixed with 'N' in all cases? e.g.
    in variable assignments - v_module_name := N'ABCD'
    in variable comparisons - IF v_sp_access_mode = N'DL'
    in calls to other procedures passing string parameters - proc_xyz(v_module_name, N'String Parameter')
    in database column comparisons - WHERE COLUMN_XYZ = N'ABCD'
    If anybody has been through a similar exercise, please share your experience and point out any additional changes that may be required in other areas.
    Database details are as follows; the application is written in COBOL, and it is also being changed to be Unicode compliant:
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    NLS_CHARACTERSET = WE8MSWIN1252
    NLS_NCHAR_CHARACTERSET = AL16UTF16

    ##1. While doing a test conversion I discovered that VARCHAR parameters need to be changed to NVARCHAR2 and not VARCHAR2; the same applies to VARCHAR variables.
    VARCHAR columns/parameters/variables should not be used, as Oracle reserves the right to change their semantics in the future. You should use VARCHAR2/NVARCHAR2.
    ##3. Not sure I understand; are you saying that Unicode columns (NVARCHAR2, NCHAR) in the database will only be able to store character strings made up from WE8MSWIN1252 characters?
    No, I meant literals. You cannot include non-WE8MSWIN1252 characters in a literal. Actually, you can include them under certain conditions, but they will be transformed to an escaped form. See also the UNISTR function.
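    As a hedged illustration of those two points (the variable name echoes the original post; everything else is a placeholder):

    -- NVARCHAR2 variable plus N'...' literals; the literal itself may only
    -- contain characters representable in the database character set (WE8MSWIN1252)
    DECLARE
      v_module_name NVARCHAR2(30);
    BEGIN
      v_module_name := N'ABCD';
      IF v_module_name = N'ABCD' THEN
        NULL;
      END IF;
    END;
    /
    -- A character outside WE8MSWIN1252 (here U+4E2D) has to go through UNISTR:
    SELECT UNISTR('\4E2D') AS n_char FROM dual;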
    ## Reason given for going down this route is that our application works with SQL Server and Oracle and this was the best option
    ## to keep the code/schemas consistent between the two databases
    First, you have to keep two sets of scripts anyway, because DDL syntax differs between SQL Server and Oracle. There is therefore little benefit in keeping just the data type names the same while so many other things need to be different. If I designed your system, I would use a DB-agnostic object repository and a script generator to produce either SQL Server or Oracle scripts with the appropriate data types, or at least I would use some placeholder syntax and have the application installer replace the placeholders with the appropriate data types per target system.
    ## I don't know if it is possible to create a database in SQL Server with a Unicode characterset/collation like you can in Oracle, that would have been the better option.
    I am not an SQL Server expert but I think VARCHAR data types are restricted to Windows ANSI code pages and those do not include Unicode.
    -- Sergiusz

  • Migrating iTunes from XP to Win7

    Greetings,
    I'm in the process of migrating from XP to Win7 and need to know how I can safely move my iTunes library, playlists, etc. from one to the other.
    XP, Win7 and my iTunes music folder are all on separate drives. I can point iTunes at the folder the music is stored in via the preferences, but before I do anything, I need to know how to transfer the data, such as the playlists, ratings, etc., into Win7.
    Will a simple export from iTunes on the XP drive do the trick, or is there another path I should take?
    Thanks!

    1 - Copy the entire \Music\iTunes\ folder from the C: drive to the D: drive.
    2 - Copy everything from C:\Documents and Settings\All Users\Shared Music\iTunes Music\ into the \iTunes\iTunes Music\ folder you copied to the D: drive.
    3 - Move everything in your current iTunes Music Library folder on the D: drive into the \iTunes\iTunes Music\ folder you copied to the D: drive.
    Not sure where the database is sitting? It is the iTunes Library.itl file, in \Music\iTunes\ on the C: drive.
    Finally, copy the entire \iTunes\ folder (now with everything in it) from the D: drive to the \Music\ folder in your user folder on the new computer.
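    For reference, a hedged sketch of steps 1 and 2 as commands; the source paths follow the steps above and will differ on your machine:

    :: Step 1: copy the iTunes folder (library database, playlists, ratings) to D:
    xcopy "C:\Music\iTunes" "D:\iTunes\" /E /I /H
    :: Step 2: merge the shared music into the copied iTunes Music folder
    xcopy "C:\Documents and Settings\All Users\Shared Music\iTunes Music" "D:\iTunes\iTunes Music\" /E /I /H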

  • Migrating from TFVC to Git best practices

    Hi, I've been tasked with migrating a large number of team projects from TFVC to Git in TFS 2013. Can you please suggest any tools or documents to get me started? It seems the only suggestion so far is to create a new Git project, but you'd lose the history.

    Hi Beaglehound24,
    You cannot change the version control system after a team project is created. As you mentioned, the way to do it is to create a new Git team project and then move the source code. This approach will not keep your history.
    A very detailed instruction, together with a PowerShell script that migrates source code, work items and test plans, can be found here:
    Migrating a TFS TFVC based team project to a Git team project - a practical example
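    If keeping full history matters, one commonly used option (not mentioned in the reply above) is the community git-tfs tool; a hedged sketch, with the server URL and project paths as placeholders:

    rem Clone the TFVC history into a local Git repository, then push it to the new Git team project.
    git tfs clone http://tfsserver:8080/tfs/DefaultCollection $/OldTeamProject/Main
    cd Main
    git remote add origin http://tfsserver:8080/tfs/DefaultCollection/_git/NewGitProject
    git push -u origin master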
    Best regards,

  • Migrate Firefox from WinXP to Win7 and keep settings

    I just replaced my ASUS laptop running WinXP Pro with a Samsung laptop running Win7 Home Basic. I would like to install Firefox 4 on the new Samsung and transfer my bookmarks and cookies from the old computer. Can anyone help? Win7 doesn't have Documents and Settings any more, so it's not totally straightforward. Thanks!

    For bookmarks alone, it is easy.
    On the source machine:
    Bookmarks > Show All Bookmarks (or Organize Bookmarks) -> Import & Backup -> Export HTML... -> to a USB stick
    Then on the destination machine:
    Bookmarks > Show All Bookmarks (or Organize Bookmarks) -> Import & Backup -> Import HTML... -> from the file on that USB stick
    For the broader topic being discussed in this thread, the "automated" solution is MozBackup.
    http://mozbackup.jasnapaka.com/
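    Alternatively, a hedged sketch for copying the whole profile by hand (bookmarks, cookies, saved passwords); %APPDATA% resolves to Documents and Settings\<user>\Application Data on XP and to Users\<user>\AppData\Roaming on Win7, so the same commands work on both (E: is an assumed USB drive):

    rem On the XP machine: back up the entire Firefox profile folder
    xcopy "%APPDATA%\Mozilla\Firefox" "E:\FirefoxBackup\" /E /I /H
    rem On the Win7 machine, with Firefox installed but not running: restore it
    xcopy "E:\FirefoxBackup" "%APPDATA%\Mozilla\Firefox\" /E /I /H /Y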

  • Change OBIEE Source from Temp to Permanent (Best Practices)

    All - I am struggling with the following:
    I would like to do a quick POC (proof of concept) for a client using OBIEE and a temporary source. However, within 6 months this POC will need to start pointing to a more permanent source.
    The temp and permanent sources will be slightly different: perm may include more fact columns and more dimension tables.
    My question for the OBIEE architects is:
    - What precautions can I take to make sure work done for the POC will not be a throwaway?
    Thanks,

    If you change to a source where you find all the existing objects, then there is no problem in changing the source; you can carry on creating reports normally. Care needs to be taken when the new source will have a few objects missing. In the presentation layer, rename the objects whose source will change in the future and give them a common or generic name. When moving, for all those objects in the BMM whose sources change, replace the LTS source with the new one. Try to keep the one-to-one mapping between physical objects and BMM objects the same before and after moving the source.

  • SAP ASE Best Practice latest update

    Hello experts,
    just wondering if somebody has already thoroughly reviewed the latest guide for best practices on SAP Sybase ASE?
    I am talking about the document from note 1680803 - SYB: Migration to SAP Adaptive Server Enterprise - Best Practice (formerly note 1722359 - SYB: Running SAP applications on SAP ASE - Best Practice).
    The guide for normal runtime operation was merged with the guide for migration, but there are some contradictory statements.
    Apart from the fact that the case study is again designed for a server with huge memory and a lot of CPU cores (not such a realistic case normally; I wonder who sets up such huge servers so often...), I have found some inconsistencies.
    E.g. in the part "Reconfigure Engines and Parallel Processing", the text talks about limiting ASE engines to 16, but the command configures 32:
    alter thread pool syb_default_pool with thread count = 32, idle timeout = 2000
    That is no change from the previous setup for migration. Is this just a typo? I understand it should be 16, and then the number of network tasks for normal operation would also be 4 (as mentioned at the beginning of the guide, you normally set up 1 per 3-4 engines). If this is not a typo, then the number of network tasks is wrong, as it should be 8.
    They also introduced an idle timeout, but only talk about ERP and a possibly lower value for Solman - does this mean that for BW you keep the default value (which, if I am not mistaken, is 100)? As per ADM540 you should even decrease this timeout when the SAP system shares a server with the database - I know that document is old, but it is again contradictory; I am not saying it is wrong, but it is not well explained.
    If anybody has checked the new version of the guide, please let me know; I think it is a bit messed up, and it is a bit difficult to distinguish what you should set up for the migration case and what for normal operation.
    Thanks!
    Regards,
    Matus

    Actually, quite a few customers run with that many engines and that much memory. In fact, it is difficult these days to even buy a server with less than 128 GB of memory and 16 cores/32 threads. Pretty much the only time we see less is when the install is in a VM. Interestingly, we had comments on the first version suggesting the numbers were not realistic because the typical systems being deployed were much larger... In addition, in my experience with customers on SAP systems, they were not aware of how much memory was necessary to really support medium to large systems, based on the configurations they were attempting.
    I am sorry that you feel some of the examples are contradictory. You are correct in pointing out that the text refers to 16 engines while the example configures 32. So yes, for that specific example, it should have been 16.
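    For clarity, the example as the text apparently intends it (only the thread count changed, per the reply):

    alter thread pool syb_default_pool with thread count = 16, idle timeout = 2000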
    Secondly, not having seen ADM540, I think there is a bit of a problem if it suggests that. In my opinion (and I have spent a lifetime tuning ASE), the idle timeout for ERP and BW should likely both be 1000+, and 2000 is not unreasonable. The comment in ADM540 likely applies when ASE and a NW CI are sharing the same cores - e.g. you have a 4-core box, ASE is running on 2 cores (we will ignore threads for this discussion), and you have 30 NW worker processes, which obviously will need to bump ASE off the CPU in order to run. This may be fine in a test/dev or even a Solution Manager system, but bumping ASE off the core is NOT a good thing for a production system. In fact, I would encourage using numactl or similar to fence off the cores used for ASE from NW worker processes if at all possible. We have seen cases of overloaded NW installations with multiple CI instances, each with hundreds of worker processes, starving CPU away from ASE... so I would tend to be a bit more than firm in suggesting that 100 is a very bad starting point.
    Given the number of client-side joins that SAP uses to avoid [DBMS proprietary] temp tables, it is critical that ASE's (or any DBMS's) response time be minimized as much as possible. Having ASE yield the core practically as soon as it finishes processing one task (and puts it to sleep pending an IO) just really causes things to run slow. Think of a typical query that returns 10 rows, each wide enough to fill one packet. If the packet transmit time (and client ACK) takes more than 100 microseconds of CPU time (almost a given for network interactions, as clock ticks are in nanoseconds and networking is minimally milliseconds - 1000 microseconds), ASE would yield the CPU every time it sent a packet. When the client wanted the next packet, the OS would have to wake up the ASE process (an interrupted sleep), which is a nasty heavyweight operation. Hence it is best for ASE to hang out on the CPU until it is reasonably sure that nothing more is going to happen very soon, and on current CPUs, having it run for 1-2 ms (1000-2000 microseconds) shouldn't be a hardship.
    If you created a separate thread pool for batch worker processes, then I could see maybe using a lower idle timeout such as 200 or 250 (see the sketch below). 100 is just plain too low in my mind; it is like saying ASE is expecting an odd query every few seconds vs. a steady workload. Basically, at that level, there had better be a task in the ASE job queue or one already on the way over the network, or that engine is going to sleep.
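    A hedged sketch of that separate-pool idea; the pool name and the counts are placeholders, and the syntax mirrors the alter command quoted earlier:

    create thread pool batch_pool with thread count = 4, idle timeout = 250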
    As for ADM540 itself, I have not seen the class; one customer did show me the notebook from a class (ASE Sys Admin) they attended, and it was really targeted at non-SAP installations more than SAP installations, from a reality/experience standpoint. Part of the issue with the class the customer showed me was that it borrowed liberally from the old SY classes as a starting point, but at the point the class was developed there was not a lot of experience with running SAP installations on ASE to really point out the fine tuning areas such as idle timeout.
    However, the document was really aimed primarily at Business Suite rather than BW systems or a Solution Manager install (which are much smaller); there are a lot of other considerations for BW that the guide doesn't get into, although some of the sizing is a better start than the defaults provided by SAPINST.
    The former runtime guide was essentially just merged into the Post-Migration Steps section.
    We may do a quick refresh in the near future (due to some recent experiences), so if you have other specific examples of the text and SQL not aligning, please let me know.

  • OS X Server 3.0 new setup -- best practices?

    Alright, here's what I'm after.
    I'm setting up a completely new OS X Server 3.0 environment. It's on a fairly new (1.5-year-old) Mac Mini with plenty of RAM and disk space, etc. This server will ONLY be used internally. It will have a private IP address such as 192.168.1.205, which is outside of my DHCP server's range (192.168.1.10 to .199) to prevent any IP conflicts.
    I am using Apple's Thunderbolt-to-Ethernet dongle for the primary network connection. The built-in NIC will be used strictly for a direct iSCSI connection to a brand new Drobo B800i storage device.
    This machine will provide the following services, roughly in order of importance:
    1. A Time Machine backup server for about 50 Macs running Mavericks.
    1a. Those networked Macs will authenticate individually to this computer for the Time Machine service.
    1b. This server will get its directory information from my primary server via LDAP/Open Directory.
    2. Caching server for the same network of computers.
    3. Serving a NetInstall image which is used to set up new computers when a new employee arrives.
    4. Maybe calendaring and contacts services; I am still considering that as a possibility.
    Can anyone tell me the recommended "best practices" for setting this up from scratch? I've done it twice so far and have faced problems each time. My most frequent problem, once it's set up and running, is with Time Machine Server. With nearly 100 percent consistency, once I get Time Machine Server set up and running, I can't administer it. After a few days, I'll try to look at it via the Server app. About half the time, there'll be the expected green dot by "Time Machine" indicating it is running; other times it won't be there. Regardless, when I click on Time Machine, I almost always get a blank screen simply saying "Loading." On rare occasions I'll get this:
    Error Reading Settings
    Service functionality and administration may be affected.
    Click Continue to administer this service.
    Code: 0
    Either way, sometimes if I wait long enough, I'll be able to see the Time Machine server setup, but not every time.  When I am able to see it, I'll have usability for a few minutes and then it kicks back to "Loading."
    I do see this apparently relevant entry in the logs as seen by Console.app (happens every time I see the Loading screen):
    servermgrd:  [71811] error in getAndLockContext: flock(servermgr_timemachine) FATAL time out
    servermgrd:  [71811] process will force-quit to avoid deadlock
    com.apple.launchd: (com.apple.servermgrd[72081]) Exited with code: 1
    If I fire up Terminal and run "sudo serveradmin fullstatus timemachine" it'll take as long as a minute or more and finally come back with:
    timemachine:command = "getState"
    timemachine:state = "RUNNING"
    I've tried to do some digging on these issues and have been greeted with almost nothing to go on.  I've seen some rumblings about DNS settings, and here's what that looks like:
    sudo changeip -checkhostname
    Primary address = 192.168.1.205
    Current HostName = Time-Machine-Server.local
    The DNS hostname is not available, please repair DNS and re-run this tool.
    dirserv:success = "success"
    If DNS is a problem, I'm at a loss how to fix it.  I'm not going to have a hostname because this isn't on a public network.
    I have similar issues with Caching, NetInstall, etc.
    So clearly I'm doing something wrong. I'm not upgrading; again, this is an entirely clean install. I'm about ready to blow it away and start fresh again, but before I do, I'd greatly appreciate any insight from others on some "best practices" or an ordered list of the best way to get this thing up and running smoothly and reliably.

    Everything in OS X is dependent on proper DNS. You probably should start there. It is the first service you should configure, and it is the most important to keep right. Don't configure any services until you have DNS straight. In OS X, DNS really stands for Do Not Skip.
    This may be your toughest decision.  Decide what name you want the machine to be.  You have two choices.
    1: Buy a valid domain name and use it on your LAN devices. You may not have a need now for external use, but in the future when you use VPN, Profile Manager, or web services, at least you are prepared. This method is called split-horizon DNS. An example would be apple.com: internally you may name the server tm.apple.com, and then you may alias vpn.apple.com to it. Externally, users can access the service via vpn.apple.com, but tm.apple.com remains a private address only.
    2: Create an invalid private domain name. This will never route on the web, so if you decide to host content for internal/external use, you may run into trouble, especially with services that require SSL certificates. Examples might be ringsmuth.int or andy.priv. These types of domains are non-routable and can result in issues of trust when communicating with other servers, but it is possible.
    Once you have the name sorted out, you need to configure DNS. If you are on a network with other servers, just have the DNS admin create an A and a PTR record for you. If this is your only server, then you need to configure and start the DNS service on Mavericks. The DNS service is the best Apple has ever created: a ton of power in a compact tool. For your needs, you likely just need to hit the + button and fill out the New Device record. Use a fully qualified host name in the first field and the IP address of your server (LAN address). You did use a fixed IP address and disable the wireless card, right?
    Once you have DNS working, you can start configuring your other services; a sketch of the verification loop follows. Time Machine should be pretty simple: a share point will be created automatically for you. But before you get there, I would encourage starting Open Directory. Don't do that until DNS is right and you pass the sudo changeip -checkhostname test.
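    As a hedged sketch of that verification loop (tm.example.priv is a placeholder name following option 2 above; the IP is the one from the thread):

    # Give the server its fully qualified host name
    sudo scutil --set HostName tm.example.priv
    # After creating the A/PTR records in the Server app's DNS pane, check both directions
    dig +short tm.example.priv @127.0.0.1      # should print 192.168.1.205
    dig +short -x 192.168.1.205 @127.0.0.1     # should print tm.example.priv.
    # Re-run the check from the thread; it should stop asking you to repair DNS
    sudo changeip -checkhostname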
    R-
    Apple Consultants Network
    Apple Professional Services
    Author, "Mavericks Server – Foundation Services" :: Exclusively in the iBooks Store

  • Best practices for apps integration with third-party systems?

    Hi all
    I would like to know if there is any document, from Oracle or from your own experience, regarding best practices for Apps integration with third-party systems.
    For example, let's say a customization in a given module (e.g. Payables) needs to provide data to a third-party system; consider the following:
    Outbound interface:
    1) Should the third-party system be given direct access to the Oracle database, to query a particular payments table/view for data?
    2) Should Oracle create a file for the third-party system, so that it can read it and do what it needs to do?
    Inbound:
    1) Should the third party directly log in and insert data into the tables which hold response data?
    2) Again, should the third party create a file that Oracle Apps will pick up for further processing?
    Again, there could be a lot of company-specific requirements, such as whether it has to be real time or not, etc.
    How do companies make sure third-party systems are not directly dipping into other systems (Oracle Apps/others), so that certain integration best practices are followed?
    What role does enterprise architecture play in this? Can we apply SOA standards? Should we use request/reply via Tibco, etc.?
    Many Oracle Apps customizations interact more or less directly with third-party systems, by including code to log into the respective third-party systems and vice versa.
    Let me know if you have done this differently; that would help the Oracle Apps community.
    thanks
    rrb.

    You want to send an IDoc to a third-party (non-SAP) system.
    What kind of system is it? Can it handle HTTP requests,
    or can it handle a web service?
    Which version of R/3 are you using?
    What mechanism does the receiving system have to receive data?
    Regards
    Raja

  • Best practices in Queue table maintenance

    Hi Fellow AQ Users,
    I am looking to hear from the community about best practices in queue table maintenance.
    I have been mining Metalink for various Oracle recommendations and putting together a set of recommendations as a starting point for my DBAs.
    I am looking to answer questions like these:
    How often (in relation to messaging load) would you coalesce and rebuild the indexes?
    How often would you rebuild the table itself to get rid of high-water-mark issues, and what procedure would you use to do that?
    I would really love to learn from your experiences in this area. We are using a 9.2.0.7 64-bit DB and have plans to go to 10g over the next year, so I am looking at 9i-related material first and then 10g.
    Thanks
    Vijay

    Hello,
    In general you coalesce once per day, ideally during a quiet time, to avoid ORA-54 errors, as per <Note:271855.1>. Some customers do it more often than that, but once per day is a good starting point (a sketch follows below).
    In terms of shrinking the queue tables, you can use the procedure in <Note:304522.1> with a null third parameter. This is an offline procedure, so you could only run it during a maintenance window. From 10.2 onwards you can dynamically shrink the queue table and IOTs. Again, how often you might need to do this depends on exactly what you are doing with your queue tables.
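    As a hedged illustration of that daily coalesce (MY_QUEUE_TABLE is a placeholder; which indexes matter depends on your queue table):

    -- Coalesce every index belonging to the queue table, one by one
    BEGIN
      FOR ix IN (SELECT index_name
                   FROM user_indexes
                  WHERE table_name = 'MY_QUEUE_TABLE') LOOP
        EXECUTE IMMEDIATE 'ALTER INDEX ' || ix.index_name || ' COALESCE';
      END LOOP;
    END;
    /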
    Thanks
    Peter

  • Best Practice for Retail BP-ERP05/BP-INSTASS

    Hi SAP gurus,
    We have installed ECC 6.0 SR2 with the BP Baseline and Installation Assistant:
    BP-ERP05     600VD   0000
    BP-INSTASS   600V1   0000
    I have applied these add-ons via SAINT for Retail, and now my questions are:
    1) Is there a need to upgrade any component on the server related to these add-ons?
    2) How do I check the Retail pre-configuration?
    I think that after applying these components the Retail pre-configuration should appear on the server, but it is not coming up,
    so is there some setting/process for getting the Retail pre-configuration?

    Hi,
    These two components are required to call the installation assistant from SAP to implement Best Practices building blocks.
    Now you have to call transaction /n/SMB/BBI, and from there you need to call up the blocks that are appropriate for Retail. You need to download the required Best Practices blocks and the related documentation from the SAP Service Marketplace, since you have to follow the sequence to call the pre-customized modules suitable for your scenario.
    Follow the document provided along with the software, or download it from the marketplace.
    Regards,
    Raju.

  • BI Best Practice for Chemical Industry

    Hello,
    I would like to know if anyone is aware of an SAP BI Best Practice for Chemicals, and if so, can anyone please post a link as well.
    Thanks

    Hi Naser,
    The information below explains the Chemicals package in detail.
    SAP Best Practices packages support best business practices that quickly turn your SAP ERP application into a valuable tool used by the entire business. You can evaluate and implement specific business processes quickly – without extensive customization of your SAP software. As a result, you realize the benefits with less effort and at a lower cost than ever before. This helps you improve operational efficiency while providing the flexibility you need to be successful in highly demanding markets. SAP Best Practices packages can benefit companies of all sizes, including global enterprises creating a corporate template for their subsidiaries.
    Extending beyond the boundaries of conventional corporate divisions and functions, the SAP Best Practices for Chemicals package is based on SAP ERP; the SAP Environment, Health & Safety (SAP EH&S) application; and the SAP Recipe Management application. The business processes supported by SAP Best Practices for Chemicals encompass a wide range of activities typically found in a chemical industry practice:
    • Sales and marketing
      – Sales order processing
      – Presales and contracts
      – Sales and distribution (including returns, returnables, and rebates, with quality management)
      – Inter- and intracompany processes
      – Cross-company sales
      – Third-party processing
      – Samples processing
      – Foreign trade
      – Active-ingredient processing
      – Totes handling
      – Tank-trailer processing
      – Vendor-managed inventory
      – Consignment processing
      – Outbound logistics
    • Supply chain planning and execution
      – Supply and demand planning
    • Manufacturing planning and execution
      – Manufacturing execution (including quality management)
      – Subcontracting
      – Blending
      – Repackaging
      – Relabeling
      – Samples processing
    • Quality management and compliance
      – EH&S dangerous goods management
      – EH&S product safety
      – EH&S business compliance services
      – EH&S industrial hygiene and safety
      – EH&S waste management
    • Research and development
      – Transformation of general recipes
    • Supplier collaboration
      – Procurement of materials and services (including quality management)
      – Storage tank management
      – E-commerce (Chemical Industry Data Exchange)
    • Enterprise management and support
      – Plant maintenance
      – Investment management
      – Integration of the SAP NetWeaver Portal component
    • Profitability analysis
    More Details
    This section details the most common business scenarios – those that benefit most from the application of best practices.
    Sales and Marketing
    SAP Best Practices for Chemicals supports the following sales- and marketing-related business processes:
    Sales order processing – In this scenario, SAP Best Practices for Chemicals supports order entry, delivery, and billing. Chemical industry functions include the following:
    • Triggering an available-to-promise (ATP) inventory check on bulk orders after sales order entry and automatically creating a filling order (Note: an ATP check is triggered for packaged material.)
    • Selecting batches according to customer requirements
    • Processing internal sales activities that involve different organizational units
    Third-party and additional internal processing – In this area, the SAP Best Practices for Chemicals package provides an additional batch production step that can be applied to products previously produced by either continuous or batch processing. The following example is based on further internal processing of plastic granules:
    • Purchase order creation, staging, execution, and completion
    • In-process and post-process control
    • Batch assignment from bulk to finished materials
    • Repackaging of bulk material
    SAP Best Practices for Chemicals features several tools that help you take advantage of chemical industry best practices. For example, it provides a fully documented and reusable prototype that you can turn into a productive solution quickly. It also provides a variety of tools, descriptions of business scenarios, and proven configuration of SAP software based on more than 35 years of working with the chemical industry.
    The package can also be used to support external toll processing such as that required for additional treatment or repackaging.
    Tank-trailer processing – In this scenario, SAP Best Practices for Chemicals helps handle the selling of bulk material, liquid or granular. It covers the process that automatically adjusts the differences between the original order quantities and the actual quantities filled in the truck. To determine the quantity actually filled, the tank trailer is weighed before and after loading. The delta weight – or quantity filled – is transmitted to the SAP software via an order confirmation. When the delivery for the sales order is created, the software automatically adjusts the order quantity to the confirmed filling quantity. The customer is invoiced for the precise quantity filled and delivered.
    Supply Chain Planning and Execution
    SAP Best Practices for Chemicals supports supply chain planning as well as supply chain execution processes:
    Supply and demand planning – Via the SAP Best Practices for Chemicals package, SAP enables complete support for commercial and supply-chain processes in the chemical industry, including support for integrated sales and operations planning, planning strategies for bulk material, and a variety of filling processes with corresponding packaging units. The package maps the entire supply chain – from sales planning to material requirements planning to transportation procurement.
    Supplier Collaboration
    In the procurement arena, best practices are most important in the following scenario:
    Procurement of materials and services:
    In this scenario, SAP Best Practices for Chemicals describes a range of purchasing processes, including the following:
    • Selection of delivery schedules by vendor
    • Interplant stock transfer orders
    • Quality inspections for raw materials, including sampling requests triggered by goods receipt
    Manufacturing Scenarios
    SAP Best Practices for Chemicals supports the following manufacturing-related business processes:
    Continuous production – In a continuous production scenario, SAP Best Practices for Chemicals typifies the practice used by basic or commodity chemical producers. For example, in the continuous production of plastic granules, production order processing is based on run-schedule headers. This best-practice package also describes batch and quality management in continuous production. Other processes it supports include handling of byproducts, co-products, and the blending process.
    Batch production – For batch production, SAP Best Practices for Chemicals typifies the best practice used by specialty chemical producers. The following example demonstrates batch production of paint, which includes the following business processes:
    • Process order creation, execution, and completion
    • In-process and post-process control
    • Paperless manufacturing using XML-based process integration sheets
    • Alerts and events
    • Batch derivation from bulk to finished materials
    Enterprise Management and Support
    SAP Best Practices for Chemicals also supports a range of scenarios in this area:
    Plant maintenance – SAP Best Practices for Chemicals allows for management of your technical systems. Once the assets are set up in the system, it focuses on preventive and emergency maintenance. Tools and information support the setup of a production plant with assets and buildings.
    Revenue and cost controlling – The package supports the functions that help you meet product-costing requirements in the industry. It describes how cost centers can be defined, attached to activity types, and then linked to logistics. It also supports costing and settlement of production orders for batch and continuous production. And it includes information and tools that help you analyze sales and actual costs in a margin contribution report.
    The SAP Best Practices for Chemicals package supports numerous integrated business processes typical of the chemical industry, including the following:
    • Quality management – Supports integration of quality management concepts across the entire supply chain (procurement, production, and sales), including batch recall and complaint handling
    • Batch management – Helps generate batches based on deliveries from vendors or on company production or filling, with information and tools for total management of batch production and associated processes, including batch derivation, the batch information cockpit, and a batch where-used list
    • Warehouse management – Enables you to identify locations where materials or batch lots are stored, recording details such as bin location and other storage information on dangerous goods, to help capture all information needed to show compliance with legal requirements
    Regards
    Sudheer
