Import to new Server with Standard Database Failing at 5%

Since the new database edition (Basic, Standard, Premium) preview went live I have been trying to import an automatically exported bacpac onto a new server and into a Standard S1 database using the portal UI. So far every attempt has stalled
at roughly 5% and never moves past it. The bacpac is roughly 26 GB and was created by the automatic export on a Business edition database, and I selected Standard, S1 and 30 GB for the new database. I have tried three times now with
no success and have no idea what to try next.
Any ideas on how to get this database into the new edition?

Hello,
I had the same problem with much smaller databases. The only solution for the moment is to use Premium (maybe Premium 2 or 3) to do the import.
The reason is that while the import is running, the dashboard monitor shows I am using 100% of the available Write_log metric for the edition, so the server is throttling my import requests to keep me within the edition's limits. With
this policy it is almost impossible to import relatively large databases into these editions. I hope the Microsoft engineers find a solution to this problem, or remove the limit on this metric.
Regards,
Dimitris
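A rough sketch of that workaround in PowerShell, in case it is useful. The idea is to import the bacpac into a Premium database first (where the log-write cap is much higher), then scale it down to Standard S1 once the import finishes. The server, database and credential values below are placeholders, and the ALTER DATABASE syntax is an assumption to verify against the current Azure SQL documentation; the throttling Dimitris describes can also be watched from T-SQL via sys.dm_db_resource_stats.
# Sketch only. While the import is running you can watch the throttling with:
#   SELECT end_time, avg_log_write_percent FROM sys.dm_db_resource_stats ORDER BY end_time DESC;
# (run against the database being imported into). Once the import into a Premium database
# has finished, scale it down to Standard S1; the ALTER DATABASE is issued against master.
$connectionString = "Server=tcp:yourserver.database.windows.net,1433;Database=master;" +
                    "User ID=youradmin@yourserver;Password=<password>;Encrypt=True;"
$scaleDown = "ALTER DATABASE [ImportedDb] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S1', MAXSIZE = 30 GB);"
$conn = New-Object System.Data.SqlClient.SqlConnection $connectionString
try {
    $conn.Open()
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = $scaleDown
    [void]$cmd.ExecuteNonQuery()   # the tier change is applied asynchronously; the database stays online
}
finally {
    $conn.Dispose()
}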

Similar Messages

  • SBS2008: Move email from Exchange 2007 to new server with Exchange 2013

    We have an old server (SBS2008) and plan to buy a new server with Server 2012. I need to move all the Exchange emails, contacts & calendars to the new server. We will no longer use the old server.
    Is there a document or migration tool that will help me understand how to move this data from the old Exchange server to the new one?
    Old Server:
    SBS2008 running Exchange 2007
    New Server:
    Server 2012
    Exchange 2013
    Any help is appreciated!

    Hi Dave,
    It can be done, and as Larry suggested you should consider two Server 2012 installs in order to achieve an environment that matches your current SBS roles; Exchange 2013 on an Active Directory domain controller isn't a good long-term solution (SBS did this for
    you in the past).
    For an operation of your size, a virtualization host with a Windows Server 2012 license and two virtual machines would probably be a suitable design model. The Server 2012 license permits one physical install (used as the virtualization host)
    plus up to two virtual machines on the same host.
    There's no migration tool; that part comes with experience and usually trial and error. You earn the skills on this migration path, and the average SBS support person should plan on spending 3x (or more) their initial estimate in hours planning the migration.
    You can find a recommended migration path at the link below to give you an idea of the steps. It is not going to cover an SBS 2008 to Server 2012 with Exchange 2013 migration point by point, but the high points are there. If it looks
    like something you would be comfortable with, then research further.
    http://blogs.technet.com/b/infratalks/archive/2012/09/07/transition-from-small-business-server-to-standard-windows-server.aspx
    Guidance specifically on integrating Exchange 2013 into an Exchange 2007 environment can be found here:
    http://technet.microsoft.com/en-us/library/jj898582(v=exchg.150).aspx
    If that looks like something beyond your comfort level, then you might consider building the new Server 2012 with Exchange 2013 environment out as new, manually exporting your Exchange 2007 mailbox contents (to PST), importing them into the new mail server,
    and migrating your workstations out of the old domain into the new domain. Whether this is more or less work at your workstation count depends on a lot of variables.
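    For the PST route, a rough sketch of the two halves in PowerShell; the cmdlets below are the usual ones for Exchange 2007 SP1 (Export-Mailbox) and Exchange 2013 (New-MailboxImportRequest), but the user and path names are placeholders and the exact parameters should be treated as assumptions and checked with Get-Help in your environment:
    # On the old side, run from a 32-bit workstation with the Exchange 2007 SP1 management tools and Outlook installed:
    Export-Mailbox -Identity "jane.doe" -PSTFolderPath "\\fileserver\pst" -Confirm:$false
    # On the new Exchange 2013 server (the admin account needs the "Mailbox Import Export" role first):
    New-ManagementRoleAssignment -Role "Mailbox Import Export" -User "administrator"
    New-MailboxImportRequest -Mailbox "jane.doe" -FilePath "\\fileserver\pst\jane.doe.pst"
    Get-MailboxImportRequest | Get-MailboxImportRequestStatistics   # watch progress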
    If you have more questions about the process, update the thread and we'll try to assist.
    Hopefully this info answered your original question.
    Cheers,
    -Jason
    Jason Miller B.Comm (Hons), MCSA, MCITP, Microsoft MVP

  • Migrating Non ASM, Non RMAN to New Server with ASM and RMAN - Possible?

    We currently have a database ( Oracle 10g R1 ) on a Sun Solaris server that is NOT using ASM or RMAN. The database is about 300GB. We are getting a new server and we want to install Oracle 10g R2 with ASM and RMAN and migrate the database.
    I have seen the documentation on migrating non ASM to an ASM server but the methods all use RMAN. Is it possible to migrate to an ASM database without using RMAN? Would datapump import/export work if I created a new database on the new server with all the same tablespaces? Or, do I have to bite the bullet, install RMAN on the old server and do the backup?
    Thanks.

    Not using RMAN day-to-day doesn't mean you can't use it to perform a single backup; RMAN is included in every Oracle RDBMS installation, 10g or higher.
    This is only a sample of how to do it:
    RMAN> CONFIGURE CHANNEL DEVICE TYPE DISK FORMAT '<file_system_path>/%U.DBF';
    -- first configure the default disk channel format, then take a consistent (cold) backup:
    RMAN> RUN
    {
      ALLOCATE CHANNEL defaultchannel TYPE DISK;
      SHUTDOWN IMMEDIATE;
      STARTUP MOUNT;
      BACKUP DATABASE;
      SHUTDOWN;
    }
    Once you have that backup, you can do what you want with it.
    It should also be possible to manually restore the database from the original datafiles but it's better to follow the solution involving RMAN.
    Bye Alessandro

  • Import on new server

    Hi All,
    I just want to know:
    I have taken an export from one server. On that server, all SQL and PL/SQL runs very well.
    Now I'm trying to import that dump on a new server with the same configuration as the previous one,
    but the SQL and PL/SQL do not run well on this server; everything takes longer to execute.
    What do I need to check?
    Is it database statistics or something else that needs to be checked?
    Thanks ,
    Raj

    Raj wrote:
    Hi All,
    I just want to know that,
    I have taken an export from one server.On this server all SQL and PL/SQL running very good.
    Now I'm trying to import that dumps on new server with same configuration as the previous one.
    But all SQL and PL/SQL are not running good on this server. It takes time to execute.
    What do I need to check.
    Is it a database statistics or something else need to be checked ???
    Thanks ,
    Raj
    See the advice in the thread "Reports Take more longer time when changing the database version".

  • Remote PowerShell Connection to Lync Server With Kerberos authentication Fails

    Hi everyone,
    Remote PowerShell to a Lync server with Kerberos authentication fails. Is there any reason for not being able to connect when the authentication is specified as Kerberos? Exactly the same code works when Authentication is specified as "Negotiate".
    E.g :
    Error -
    $session=New-PSSession -ConfigurationName Microsoft.Powershell -ConnectionUri https://serverName.lync.com/ocspowershell/ -Credential $cred -Authentication Kerberos
    [serverName.lync.com] Connecting to remote server failed with the following error message : The WinRM client cannot process the request. The authentication mechanism requested by the client is not supported by the server or unencrypted traffic is disabled in
    the service configuration. Verify the unencrypted traffic setting in the service configuration or specify one of the authentication mechanisms supported by the server.  To use Kerberos, specify the computer name as the remote destination. Also verify
    that the client computer and the destination computer are joined to a domain.To use Basic, specify the computer name as the remote destination, specify Basic authentication and provide user name and password. Possible authentication mechanisms reported by
    server:   Digest Negotiate For more information, see the about_Remote_Troubleshooting Help topic.
        + CategoryInfo          : OpenError: (System.Manageme....RemoteRunspace:RemoteRunspace) [], PSRemotingTransportException
        + FullyQualifiedErrorId : PSSessionOpenFailed
    Works  -
    $session=New-PSSession -ConfigurationName Microsoft.Powershell -ConnectionUri https://serverName.lync.com/ocspowershell/ -Credential $cred -Authentication Negotiate

    Hi,
    Please double-check that Windows Update has installed the latest updates; if not, please update and then test again.
    Please also ensure that the workstation you are using has network access to the Certificate Authority that signed the certificate.
    Best Regards,
    Eason Huang
    TechNet Community Support
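    As an aside, the error text itself points at why Negotiate works while Kerberos does not: Kerberos wants the remote destination specified by computer name and both machines joined to a domain, which a hosted endpoint reached by URI such as serverName.lync.com cannot satisfy, so the service only advertises Digest and Negotiate. If you want to see which mechanisms your local WinRM client and service are configured to allow, a quick check (the standard WSMan: provider, nothing Lync-specific, shown here as a sketch) is:
    # Authentication mechanisms the local WinRM client will attempt
    Get-ChildItem WSMan:\localhost\Client\Auth
    # Authentication mechanisms the local WinRM service accepts from remote callers
    Get-ChildItem WSMan:\localhost\Service\Auth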

  • Does Transfer EBS to new server with New specifications (Ram,Pross ,..)

    Does transferring EBS, with the same release, to a new server with new specifications (RAM, processor, ...) need new licences?

    Micel 811 wrote:
    Does transferring EBS, with the same release, to a new server with new specifications (RAM, processor, ...) need new licences?
    Is your licence CPU-based? If it is user-based, then there is no need for new licences.
    Please contact Oracle Sales, as they are the best people to answer your question:
    http://www.oracle.com/us/corporate/pricing/applications-price-list-070574.pdf
    ;) AppsMasti ;)
    Sharing is Caring

  • Moving DP Content to a New Server With Same Name

    We are currently in the middle of a server OS refresh from 2k8 to 2k12 R2. Some of the servers are DPs so I need to move the content from one server to another. The big problem is that the new server must have the same name.
    I've read this thread http://social.technet.microsoft.com/Forums/en-US/1ffcaa47-9bf8-476d-965e-28350a6bef1b/move-content-from-old-dp-to-new-dp-at-same-remote-site?forum=configmanagerapps which kind of covers my issue but I presume this person was going to
    a server with a new name.
    Has anyone done this? Is it possible?
    Unless there is something clever the only way I see to do this is to uninstall the DP role from the current server, remove the current server from the domain, add the new server with the same name, re-install the DP role and then re-distribute all the content
    to the new server. I'd rather not do this as the DP is on a remote site so all the content will be going over the WAN.
    I am running SCCM 2012 SP1 CU3.
    Any ideas?

    Hi,
    >>So can I pre-stage from the old server to the new server or does the pre-staging have to be done from the site server?
    Yes, you can prestage content from the old server to the new server. Prestaged content files can be created from the Configuration Manager administrator console.
    >>What is the order of the process?
    The following blog post walks through content prestaging:
    http://blogs.technet.com/b/inside_osd/archive/2011/04/11/configuration-manger-2012-content-prestaging.aspx
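    If you prefer PowerShell to the console wizard, something along these lines should produce the prestaged content (.pkgx) files. The cmdlet is Publish-CMPrestageContent, but the parameter names below are from memory and the IDs and server names are placeholders, so verify them with Get-Help in your site's PowerShell window:
    # Run from the Configuration Manager PowerShell drive (Connect via Windows PowerShell from the console)
    Publish-CMPrestageContent -PackageId "ABC00001" -DistributionPointName "olddp.contoso.com" -FileName "D:\Prestage\ABC00001.pkgx"
    # Copy the .pkgx file to the new DP, then extract it locally on that server with ExtractContent.exe /P:<path to the .pkgx>
    # (ExtractContent.exe ships with the DP role; check its /? output for the exact switches).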
    Best Regards,
    Joyce
    We are trying to better understand customer views on social support experience, so your participation in this interview project would be greatly appreciated if you have time.
    Thanks for helping make community forums a great place.

  • Moving oracle database to a new server with an upgraded oracle software

    Hi Guys,
    Here is the scenario: we have an existing Oracle 9.2 demo database on a Solaris 5.9 server. I was asked to transfer this to another server with the same OS but with the Oracle 10g software installed. The original plan was to mount the existing Oracle home from the original server on the new server, and from there copy all the database files and bring up the database.
    Here are the steps:
    - Issue an "alter database backup controlfile to trace", get the script from the trace file
    - Copy all database files to the new server
    - Edit parameter files
    - Do a startup mount, recreate the control files, recover the database, and start up the database on the new server.
    I suggested that we do a transportable tablespace migration, but due to limited resources, and because the 9.2 database does not have this feature, we can't go that route. I also can't do an exp/imp of the database since we have very limited disk space. I was wondering whether the steps I enumerated above are correct, or whether they would result in an error?

    Hi,
    Why do you want to recreate the controlfile? Are the mount point locations changing from the old server to the new one?
    Keep in mind that if you recreate the controlfile, a new incarnation starts.
    You have the downtime to upgrade, so: :)
    - shutdown
    - take a cold backup and move it to the new server
    - create the same directory structure as in the PFILE, including the control, redo and data files
    - install the software
    - startup upgrade
    - upgrade the database (you need to add a SYSAUX tablespace)
    - shutdown immediate
    - startup
    - @?/rdbms/admin/utlrp.sql
    - change the hostname in listener.ora/tnsnames.ora
    Thanks

  • Poor performance of new SQL Azure Standard database

    This is not new information. The revised SQL models (Basic | Standard | Premium) perform poorly compared to the earlier Web | Business databases. We are talking orders of magnitude: over 4 minutes to perform an update vs. 19 seconds (UPDATE Invoice SET
    SalesOrderID = O.SalesOrderID FROM Invoice INNER JOIN SalesOrder AS O ON Invoice.InvoiceID = O.InvoiceID, for 196,043 rows).
    Microsoft is saying we can only use the Web database until September 2015. Moving to the new model (we tried Standard S2) will cause the project to fail.
    There are numerous posts on the Internet identifying this problem. How do we get Microsoft's attention? This is an Azure killer. Fortunately for us, there are a number of other hosting solutions
    available.
    If this problem is not resolved in the next few months, we will be forced to abandon Microsoft Azure!
    Jim Rand

    Our application is a desktop application that communicates with the web role using a single WCF call. In the server pipeline, a call is made to a method that looks like this (all the error trapping is removed here for brevity):
    public static Response Process(Request request)
    {
        DateTime startDate = DateTime.UtcNow;
        Agents.Agent agent = Agents.AgentFactory.GetAgent(request);
        Response response = agent.ProcessRequest();
        response.ServiceTime = DateTime.UtcNow - startDate;  // server-side processing time returned to the client
        return response;
    }
    While building this application over the last year, we did occasional performance testing, with the Windows client reporting to the server on logout the mean service time for a complete session. Quite frankly, I was amazed at the performance. While
    slightly slower than the development machine, the performance was acceptable from the user perspective over the Internet.
    Not so anymore. The mean service time on the Azure server has increased dramatically, resulting in timeouts.
    We will be sticking with the Web edition for one more month during development. At that time, we will switch to Premium (P1) for user acceptance testing. It should be interesting to see what the mean, median and standard deviation of the session server statistics
    are.
    The performance of SqlAzure web edition is no longer acceptable.  I sure hope Premium(P1) makes it.
    Jim Rand

  • Add New Server with Configtool Questions

    Hello All,
    We are running EP7 SP12 on W2K3 with 16GB RAM and 4 CPUs.
    As part of one the Go Live checks done to the system a while back, it was recommended that we increase the number of Server processes to the instance.
    So, I followed the information here (http://help.sap.com/saphelp_nw70/helpdata/en/68/dcde416fb3c417e10000000a155106/frameset.htm) to add 3 additional servers to the instance.  When I restarted the J2EE engine, it basically sat there in a 'Starting Apps' status.  I waited about 2 hours before I stopped the startup process.  I went back into Configtool and did a 'Remove Server' on the last, Server3, process and restarted J2EE.  This time, everything did start OK, but now I have a few questions I hope someone can answer.
    1.  Is there a limit to the number of Server processes that can exist in a single instance?  If not, what would cause the J2EE servers to essentially hang on startup although they didn't look hung, just extremely slow starting up.
    2.  I thought when you did the initial 'Add Server' it was supposed to create a duplicate Server process.  But, when I check the directory structures of the newly created Server processes, they aren't anywhere close to being the same as the original Server0 process.  For example, we have a custom redirect in place when people logon to the Portal, but this wasn't transferred/copied to the new Server1 or Server2.  The same was true for other customizations.  Also, when looking at the structure, Server0 has approx 80,000 files and 15,000 sub-folders.  When looking at the new Server1 & Server2, they are both different in size in both files and folders in respect to each other and compared to Server0.  Shouldn't they all be the same?
    3.  A follow up to question 2.  If they aren't the same and they are supposed to be the same, can we just copy the missing files & folders from Server0 to the new Servers?
    4.  Lastly, since I did a 'Remove Server' in Configtool for Server3, it does not appear in the MMC when the J2EE engine starts.  This I expect.  But, it did not remove the directory structure for Server3.  If I try to manually delete the Server3 directory, it simply says it is in use and won't let me delete the structure.  So, is it safe to delete this structure since it isn't being used anymore?  If so, I'll stop SAP and delete the structure offline.  Do I have to do any database cleanup once I do this?  If so, can someone point me to some documentation as to what needs to cleaned up in the DB and how?
    Thanks,
    Tom

    Thanks for the info.
    Interesting though. I opened a message with SAP and posed these same questions. Their response was similar to yours, but it opened up a whole new set of questions. What follows is the text of that exchange, for others to benefit from (clipped for clarity):
    SAP's response to the original set of questions posted here:
    =====================
    ....1. The number of server nodes depends upon the CPU speed. The J2EE Engine
    can support up to 21 server nodes in an instance.
    If the number of server nodes is large, it will slow down the startup process.
    2. When you create the "New Server" it is not a duplicate server.
    It is a new server. You cannot copy missing files from one server to the other;
    the J2EE engine synchronises the server nodes in the instance.
    3. Yes. You need to manually delete the file system. Configtool deletes
    the server node from the DB....
    =====================
    To which I replied:
    =====================
    .....2. I thought when you did the initial 'Add Server' it was supposed to
    create a duplicate Server process. But, when I check the directory
    structures of the newly created Server processes, they aren't anywhere
    close to being the same as the original Server0 process. For example,
    we have a custom redirect in place when people logon to the Portal, but
    this wasn't transferred/copied to the new Server1 or Server2. The same
    was true for other customizations. Also, when looking at the structure,
    Server0 has approx 80,000 files and 15,000 sub-folders. When looking at
    the new Server1 & Server2, they are both different in size in both
    files and folders in respect to each other and compared to Server0.
    Shouldn't they all be the same?"
    You replied:
    "2. When you create the "New Server" it is not a duplicate server.
    It is a new server. You cannot copy missing files from one server to the other;
    the J2EE engine synchronises the server nodes in the instance."
    But, from actual experience, it does NOT synchronize the nodes; rather,
    it is a partial synchronization.
    So, are the following statements true:
    - When a new server process is added, ONLY standard SAP-delivered files
    & folders are synchronized. True or False?
    - No custom files/folders are synchronized. True or False?
    - Custom values, whether part of standard SAP deliverables or custom
    deliverables, are NOT synchronized. True or False?
    - Each server node needs to be configured independently. True or False?
    The reason I'm asking such specific questions is because of what we are
    seeing. I'll use the same example from before. We modified the
    index.html file for the Portal in Server0. When we added the new
    server nodes, this index.html file was not synchronized. Instead, the
    default index.html file was created for the new server nodes. That is
    an example at the file level. Here is an example from a configuration
    perspective. In Server0, in the Visual Admin tool, we have a TREX
    server specified in TREX service properties. When the new server nodes
    were created, this setting was NOT transferred. Therefore, TREX didn't
    work until I specifically went back into VA and added the TREX settings
    to the new server nodes. This is a major problem and opens up three
    more questions:
    1.How do we know what settings were transferred vs. those that weren't
    transferred?
    2.Also, if you can't copy missing files/folders from one server to
    another, then how are we supposed to get those files/folders into the
    new server(s)?
    3. Is there a way to do a comparison between the server nodes to see
    what settings are missing?.....
    =====================
    SAP's reply:
    =====================
    .....- When a new server process is added, ONLY standard SAP-delivered files
    & folders are synchronized. True or False? TRUE
    - No custom files/folders are synchronized. True or False? TRUE
    - Custom values, whether part of standard SAP deliverables or custom
    deliverables, are NOT synchronized. True or False? TRUE
    - Each server node needs to be configured independently. True or False?
    YES.
    If any new server node is added, the J2EE Engine synchronises the information
    at the time of the next restart......
    ======================
    My Reply:
    ======================
    .....1.How do we know what settings were transferred vs. those that weren't
    transferred?
    2.Also, if you can't copy missing files/folders from one server to
    another, then how are we supposed to get those files/folders into the
    new server(s)?
    3. Is there a way to do a comparison between the server nodes to see
    what settings are missing?.....
    ======================
    SAP's Reply:
    ======================
    .....1. How do we know what settings were transferred vs. those that weren't
    transferred?
    No, there is no way.
    2. Also, if you can't copy missing files/folders from one server to
    another, then how are we supposed to get those files/folders into the
    new server(s)?
    You need to deploy the applications again on the new server.
    3. Is there a way to do a comparison between the server nodes to see
    what settings are missing?
    No, there is no way......
    ======================
    So, although I didn't really get the answers I was looking for, I confirmed the message. It seems ludicrous to me that there is no way of doing a comparison or manual sync between two server nodes. I also can't believe you have to configure each node independently, let alone that when you do a deployment you now have to deploy to each server individually, and I haven't found any docs explaining how to do that. So if someone here has a suggestion I'd appreciate it. At this point, given that the server nodes are so out of sync as to make them almost unusable, we might just delete them all and go back to one node, although I don't want to have to do that.
    Thanks,
    Tom

  • New MVC with Standard Tags and DB sample

    I have posted an initial build of a new learning application on SourceForge, "basicPortal", under CVS:
    http://cvs.sourceforge.net/cgi-bin/viewcvs.cgi/basicportal/basicportal
    It is using Struts with Standard Tags and X:Transform from DB, Realm-based security, CRUD, etc., etc. I have big plans for this: to be the 20% of code that gets used 80% of the time, and to be "faces" and expression-language compliant.
    (Err.. this build is not for newbies, but I will update it .... a lot and often.)
    To keep up on this and related topics, there is a mailing list
    http://www.netbean.net/mailman/listinfo/mvc-programmers
    that also gets you a password to other baseBeans.com features,
    and a newsgroup reader at news.baseBeans.com.
    Hope you like it,
    Vic C.

    Looking at the XFILES.5.xml file and Install.vbs files of the XFILES installer, it appears that the XDBPM_SUPPORT and XFILES_DBA_TASKS actions will only be run if you enter a SYSDBA user in the "DBA User" field of the installer. However, for some reason, Oracle seems to have commented out the command that checks to ensure that the DBA user you entered is a SYSDBA user, and the "DBA User" field defaults to SYSTEM, which is not a SYSDBA user. Since I got things working I haven't tried rerunning the installer with a SYSDBA user, but maybe it will work better. The installation documentation and/or the installation scripts need to be updated to reflect the need for a SYSDBA user.
    Regarding the APEX version of XFILES, this has no installation instructions whatsoever, and there are prerequisite steps to get it to install correctly as well. I tried importing the f102.sql file into APEX and it failed the prerequisite validation because an "XFILES/Apex" folder wasn't found in XML DB. I went looking for anything that creates this directory and found that there is another XFILES_DBA_TASKS.sql script specifically for the APEX version; it is included in the f102.sql file. So I cut and pasted the contents into SQL*Plus and ran it as SYS. I was then able to install the f102 application.

  • Restore on new server with new DB_NAME/SID?

    My goal is to restore a database from an RMAN backup on a server that has no connectivity to the production database (I don't want to use the DUPLICATE command).
    I restored a database from RMAN using the same DB_NAME/SID and everything was OK.
    -     restore spfile
    -     restore controlfile
    -     restore database
    -     recover database
    -     open with reset logs
    Now I want to restore the database but with a different DB_NAME/SID (my production database is PROD; I want to restore this PROD RMAN backup on the new server and call the new database DEV).
    How do I restore a database with a new DB_NAME/SID?
    Oracle 10g, Windows Server 2003
    Thanks in advance!

    user9106065 wrote:
    My goal is to restore database with the RMAN backup on a server that has no connectivity to the production database (I don’t want to use duplicate command).
    I restore a database from RMAN using a same DB_NAME/SID and everything was ok.
    -     restore spfile
    -     restore controlfile
    -     restore database
    -     recover database
    -     open with reset logs
    Now I want to restore database but with different DB_NAME/SID (My production database is PROD and I want to restore this PROD RMAN backup on new server and this new database I want to be called DEV).
    How to restore database with new DB_NAME/SID?
    Oracle 10g, Windows Server 2003
    Thanks in advance!
    You can use the NID utility.
    Run NID without arguments to see the available options,
    or see "How to Change the DBID, DBNAME Using NID Utility in version 10gR2 onwards" [ID 863800.1]
    Cheers

  • 10GR1 GridControl  with existing database fails

    Can anyone explain how the install with an existing database works? I have tried to reason through it, but it always fails because it cannot find an oradata file in the directory where I am installing the software.... This makes no sense....
    thanks,

    You have to create the $ORACLE_HOME/oradata directory ahead of time. The installer
    warns you that the directory exists, but lets you continue and specify the locations where
    you would like to place the repository data files.
    For more information, check out this MetaLink forum message:
    http://metalink.oracle.com/metalink/plsql/for_main.expandThreads?p_thread_id=600102.992&p_forum_id=93&p_after_post=N&p_forum_scope=a&p_message_id=600102.992&p_forum_time=7&p_myThread=1
    or this other forum message:
    Re: 10g Grid Control on Solaris box

  • Building new server with old home folders

    I have a dying 10.5.5 server with OD that is in dire need of a rebuild. I am going to install 10.5.8 unlimited on a new machine and create all the same user accounts (names) in WGM, and I then want to re-link all the old home folders to the new accounts.
    Is this as simple as naming the new accounts identically to the old ones and then making the home folder location the same as the old?
    Will I need to do anything regarding permissions on the old home folders etc??

    Hi,
    I have had to do the same thing many times what with server upgrades, crashes etc and I have found the most effective way is as follows:
    1. Create the OD on the new machine and create the accounts making sure that the shortnames are the same so that the home folder names match in the new location.
    2. As the root user (type su - in Terminal and enter the root password), use rsync in the terminal to copy the old folders to the new location, e.g.:
    rsync -av --progress root@old.server.address:/Volumes/userdata/homefolders/ /Volumes/userdata/homefolders/
    That could take some time but it is better than using a gui as it can tend to corrupt a few things.
    NOW, you will find that there are permission issues if the user IDs have changed across servers, so I always run a little script to correct this. I will explain how to do it in the terminal in case you are baffled by the terminal; my apologies if you already know this, but it may help someone else.
    Open Terminal and type: vi permissions.sh
    You will then be in a vi editing window. Press i (that's an i, not an L) to start editing and type in the following (adjust for your own environment):
    for i in /Volumes/userdata/homefolders/*
    do
      # the folder name is the user's shortname (5th path component)
      u=`echo $i | cut -d/ -f5`
      chown -R $u /Volumes/userdata/homefolders/$u
    done
    Now press ESC to get out of editing mode, type :x and hit return to save and exit.
    Back in the terminal window, type chmod 777 permissions.sh (this makes the script executable).
    Now you should be ready to run the script, which will take the name of each folder and change the ownership of everything in that folder to the matching user, fixing any permissions issues.
    Type: ./permissions.sh
    You should be sorted now.
    Alternatively you can try Passenger. http://macinmind.com/?pid=2&progid=1&subpid=1 which can do all of the above but I find it quicker to do it manually.
    I do use passenger for bulk account creation though, admins best friend.
    Hope I never lost the plot there and that it helps someone on their way
    Message was edited by: PsyMan2009 to rectify smileys at vital parts LOL

  • Installation with Existing Database Fails

    We're installing SCVMM 2012 R2 and upgrading an existing 2012 R2 CU4 database. The installation fails after running for about 30 minutes with these lines in the SetupWizard.log
    03:53:43:Failed sql script: Threw Exception.Type: Microsoft.VirtualManager.DB.CarmineSqlException, Exception.Message: Unable to connect to the VMM database because of a general database failure.
    Ensure that the SQL Server is running and configured correctly, then try the operation again.
    03:53:43:StackTrace:   at Microsoft.VirtualManager.DB.SqlRetryCommand.ExecuteNonQuery()
       at Microsoft.VirtualManager.Setup.DBConfigurator.ExecuteScript(SqlContext ctx, String fileName)
    03:53:43:InnerException.Type: System.Data.SqlClient.SqlException, InnerException.Message: Incorrect syntax near 'MERGE'. You may need to set the compatibility level of the current database to a higher value to enable this feature. See help for the SET COMPATIBILITY_LEVEL
    option of ALTER DATABASE.
    Must declare the scalar variable "@error".
    Must declare the scalar variable "@error".
    03:53:43:InnerException.StackTrace:   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
       at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
       at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
       at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async, Int32 timeout, Boolean asyncWrite)
       at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(TaskCompletionSource`1 completion, String methodName, Boolean sendToPipe, Int32 timeout, Boolean asyncWrite)
       at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
       at Microsoft.VirtualManager.DB.SqlRetryCommand.ExecuteNonQuery()
    03:53:43:VMMPostinstallProcessor threw an exception: Threw Exception.Type: Microsoft.VirtualManager.Setup.Exceptions.DatabaseConfigurationException, Exception.Message: The Virtual Machine Manager database can not be upgraded.
    The Virtual Machine Manager database SCVMM2008Prod was not upgraded. Try a new database and run Setup again.
    Has anyone seen this before? The SQL is remote but is reachable.
    Orange County District Attorney

    We have to dig deeper on this one. Can you check the SQL errorlogs for related information?
    You must verify that the database is working, has enough space, is healthy, etc.
    The logs should tell you if everything is all right.
    -kn
    Kristian (Virtualization and some coffee: http://kristiannese.blogspot.com )
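    One guess based purely on the inner exception text rather than anything confirmed in this thread: the MERGE error (and the follow-on "Must declare the scalar variable" errors in the same batch) suggests the restored VMM database may still be at an old compatibility level, and MERGE needs level 100 or higher. Checking it (and raising it, if necessary) is quick with the SQL Server PowerShell module; the server name below is a placeholder and the database name is taken from the log:
    Import-Module SQLPS -DisableNameChecking   # or the newer SqlServer module
    Invoke-Sqlcmd -ServerInstance "remoteSqlServer" -Database "master" -Query "SELECT name, compatibility_level FROM sys.databases WHERE name = 'SCVMM2008Prod';"
    # If it reports 80 or 90, raising it before re-running Setup may help:
    # Invoke-Sqlcmd -ServerInstance "remoteSqlServer" -Database "master" -Query "ALTER DATABASE [SCVMM2008Prod] SET COMPATIBILITY_LEVEL = 100;"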
