What would be the best approach to migrate millions of records from an on-premises SQL Server to Azure SQL DB?

Team,
In our project, we have a data migration requirement. We have the following scenario and would really appreciate any suggestions from you all on the implementation.
Scenario:
We have millions of records that need to be migrated to the destination SQL database after some transformation.
The source SQL Server is on-premises in the partner's domain and the destination server is in Azure.
Can you please suggest what would be the best approach to do so?
thanks,
Bishnu
Bishnupriya Pradhan

You can use SSIS itself for this
Have batch logic that identifies data batches within the source, and then include data flow tasks to transfer the data to Azure. The batch size should be chosen based on available buffer memory, the number of parallel tasks executing, etc.
You can use an ODBC or ADO.NET connection to connect to Azure.
http://visakhm.blogspot.in/2013/09/connecting-to-azure-instance-using-ssis.html
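If it helps to see the batching idea outside of SSIS, below is a minimal sketch in plain JDBC (not part of the original reply): it reads the source table in keyed batches and pushes each batch to Azure SQL with batched inserts and a commit per batch. The connection strings, table names (dbo.SourceTable, dbo.TargetTable) and columns are placeholders made up for illustration.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BatchedCopyToAzure {
    // Placeholder connection strings - replace with your real servers and credentials.
    static final String SRC = "jdbc:sqlserver://onprem-host;databaseName=SourceDb;user=USER;password=PASSWORD";
    static final String DST = "jdbc:sqlserver://yourserver.database.windows.net:1433;databaseName=TargetDb;user=USER;password=PASSWORD;encrypt=true";

    public static void main(String[] args) throws Exception {
        final int batchSize = 10_000; // tune to available buffer memory and parallelism
        long lastId = 0;

        try (Connection src = DriverManager.getConnection(SRC);
             Connection dst = DriverManager.getConnection(DST);
             PreparedStatement read = src.prepareStatement(
                     "SELECT TOP (?) Id, Name, Amount FROM dbo.SourceTable WHERE Id > ? ORDER BY Id");
             PreparedStatement write = dst.prepareStatement(
                     "INSERT INTO dbo.TargetTable (Id, Name, Amount) VALUES (?, ?, ?)")) {

            dst.setAutoCommit(false);

            while (true) {
                // Keyset pagination: pull the next batch of rows after the last Id we saw.
                read.setInt(1, batchSize);
                read.setLong(2, lastId);
                int rows = 0;
                try (ResultSet rs = read.executeQuery()) {
                    while (rs.next()) {
                        lastId = rs.getLong("Id");
                        String name = rs.getString("Name");
                        // Apply the transformation step here before writing.
                        write.setLong(1, lastId);
                        write.setString(2, name == null ? null : name.trim());
                        write.setBigDecimal(3, rs.getBigDecimal("Amount"));
                        write.addBatch();
                        rows++;
                    }
                }
                if (rows == 0) {
                    break;              // no more batches left in the source
                }
                write.executeBatch();   // push one batch to Azure
                dst.commit();           // commit per batch to keep transactions small
            }
        }
    }
}

Committing per batch keeps transactions small on the Azure side; the same idea maps onto an SSIS data flow where the batch size is driven by buffer memory and parallelism, as noted above.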
Please Mark This As Answer if it solved your issue
Please Vote This As Helpful if it helps to solve your issue
Visakh
My Wiki User Page
My MSDN Page
My Personal Blog
My Facebook Page

Similar Messages

  • What is the best approach to insert millions of records?

    Hi,
    What is the best approach to insert millions of records into a table?
    If an error occurs while inserting, how can we know which record failed?
    Thanks & Regards,
    Sunita

    Hello 942793
    There isn't a single best approach if you do not provide us with the requirements and the environment...
    It depends on what "best" means for you.
    Questions:
    1.) Can you disable the Constraints / unique Indexes on the table?
    2.) Is there a possibility to run parallel queries?
    3.) Do you need to know which rows could not be inserted while the constraints are enabled, or is that not necessary?
    4.) Do you need it to be fast, or do you have time to do it?
    What does "best approach" mean for you?
    Regards,
    David
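    To illustrate the "which record failed" part outside of any specific tool, here is a hedged JDBC sketch (not from David's reply): it inserts rows in a batch and, on failure, uses BatchUpdateException.getUpdateCounts() to map the failed entries back to the rows that were added. The connection string, the dbo.People table and the sample data are made up for illustration, and whether the driver stops at the first error or keeps going is driver-dependent.

    import java.sql.BatchUpdateException;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;
    import java.util.Arrays;
    import java.util.List;

    public class BatchInsertWithErrorTracking {
        public static void main(String[] args) throws Exception {
            // Placeholder connection string - replace with your real server and credentials.
            String url = "jdbc:sqlserver://localhost;databaseName=TestDb;user=USER;password=PASSWORD";
            // Sample rows; the null entry is meant to violate a NOT NULL constraint.
            List<String> names = Arrays.asList("Alice", "Bob", null, "Carol");

            try (Connection con = DriverManager.getConnection(url);
                 PreparedStatement ps = con.prepareStatement(
                         "INSERT INTO dbo.People (Name) VALUES (?)")) {
                con.setAutoCommit(false);
                for (String name : names) {
                    ps.setString(1, name);
                    ps.addBatch();
                }
                try {
                    ps.executeBatch();
                    con.commit();
                } catch (BatchUpdateException e) {
                    // Update counts line up with the order rows were added to the batch;
                    // EXECUTE_FAILED marks the ones that were rejected.
                    int[] counts = e.getUpdateCounts();
                    for (int i = 0; i < counts.length; i++) {
                        if (counts[i] == Statement.EXECUTE_FAILED) {
                            System.out.println("Row " + i + " failed: " + names.get(i));
                        }
                    }
                    con.rollback();
                }
            }
        }
    }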

  • What's the best approach to migrate to Snow Leopard using the time machine?

    I have a Time Capsule, and noticed the restore-from-backup option when I had to replace the HD on my MacBook Pro.
    I am about to buy a new MacBook Pro which will probably ship with Snow Leopard, and am wondering whether, when I set up the new OS, I can restore from a backup of my old system?
    Thanks
    Miguel

    Don't use the full system restore from backup option with a new computer. That can only be used on the same exact computer. Also, that option erases the destination drive and replaces it with the copy of the backed-up system. This is not what you want.
    When you first turn on the new computer you'll get a setup assistant which will give you an option to migrate your user data and applications from a TM backup. You can also do it later using Migration Assistant, located in Applications/Utilities.

  • What is the best approach to migrate SharePoint farm from one data center to other datacenter

    We have two web front-end servers, one application server, and two database server instances, and we have to migrate the complete farm from one data center to the other with minimal downtime and end-user impact.
    Please provide your best input on this.
    Thanks in advance.

    Create a new farm in the secondary Data Center at the same patch level with the desired configuration. Replicate the databases using the method of choice (Mirroring, AlwaysOn, etc.). Create a downtime window in which you can then attach the databases to the new farm's Web Application(s)/Service Application(s).
    Trevor Seward
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

  • What is the best way to migrate settings/apps/files from late 2008 MBP to new 2012 MBA?

    I have the 2008 MBP backed up via TM to an external HD that is FW800 but also USB 2.0 capable. I was hoping the TB/FW adapter would come out from Apple soon, and maybe it will. Bottom line, my new MBA arrives next week, and I'm not sure when the TB/FW adapter will hit the stores. Can I use USB 2.0 to link the HD to my new MBA, or will it be too slow to conduct the migration? I have about 110GB on the MBP HD and the new MBA is the i7/8G/256GB model, so no worries there.

    The FireWire adapter is due for release in "July" according to the specs. You can make do with your TM backup over USB 2.0. It's slower than FireWire but should complete within 2 hours.

  • What is the best approach to invoke secured REST Services from SOA

    Hi there,
    I have a REST service that expects a username and password for access.
    To invoke it, I'm passing the credentials as properties in composite.xml under the <reference> section, as shown below.
    <property name="oracle.webservices.auth.username" type="xs:string"
                    many="false" override="may">USERNAME</property>
    <property name="oracle.webservices.auth.password" type="xs:string"
                    many="false" override="may">PASSWORD</property>
    Is there any way to use OWSM to achieve the same?
    Please suggest if there is a better approach. I'm using SOA Suite PS6.
    Thanks
    JGun

    For external entities, the best practice is to access the REST service in OSB (as a Business Service), if needed virtualize the access via a Proxy Service (a Web Service transport can be used), and then consume the OSB service in SOA Suite.
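    Purely as a client-side illustration (this is not the OWSM or OSB setup discussed above), the sketch below shows what passing a username and password to a secured REST endpoint looks like with plain HTTP Basic authentication in Java; the endpoint URL and credentials are placeholders.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public class BasicAuthRestCall {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint and credentials - replace with the real values.
            String endpoint = "https://example.com/api/orders";
            String user = "USERNAME";
            String password = "PASSWORD";

            // HTTP Basic auth: base64-encode "user:password" and send it in the Authorization header.
            String token = Base64.getEncoder()
                    .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(endpoint))
                    .header("Authorization", "Basic " + token)
                    .GET()
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
            System.out.println(response.body());
        }
    }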

  • Best approach to migrate DB2 UDB V7 for z/OS

    It seems that the latest migration workbench still doesn't work for DB2 UDB V7 on z/OS. What is the best approach to migrate DB2 UDB V7 for z/OS (about 10 GB) to Oracle 10g? Any suggestions/advices are greatly appreciated. Thanks.
    George

    SQL*Loader/external tables always work, assuming you can get text files out of your old DB.
    Gints Plivna
    http://www.gplivna.eu

  • What is the best approach to trying to find high freq hits in a file?

    Let's say I have a text document that has millions of rows of information like "Name, address, last time checked in:"
    What is the best approach if I were to look for the top 5 people who appear the most on this huge list?
    Thanks!

    "If it is not in a database and it's just one file"
    You can still put it into a DB.
    "With all those data, what approach would be good in the realm of Java?"
    I thought I already said that.
    "Would Map still be the best choice?"
    Simplest? Probably. Best? Only you can determine that.
    "Would the complexity be n^2 since you would need to put everything in, then compare all the sizes?"
    No, it should be O(2N) (which is really O(N)). Inserting into the map is O(N), and then iterating once over the entries and adjusting your running top 5 is O(N).
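    As a rough sketch of the approach described above (not part of the original reply), the Java snippet below counts occurrences with a HashMap in one pass and then keeps a running top 5 with a small min-heap; the file name and the "Name, address, ..." line format are assumptions taken from the question.

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.PriorityQueue;
    import java.util.stream.Stream;

    public class TopFiveNames {
        public static void main(String[] args) throws Exception {
            // Hypothetical input file in the "Name, address, last checked in" format from the question.
            Path input = Path.of("checkins.txt");

            // Pass 1: count occurrences per name - O(N) over the lines.
            Map<String, Long> counts = new HashMap<>();
            try (Stream<String> lines = Files.lines(input)) {
                lines.forEach(line -> {
                    String name = line.split(",", 2)[0].trim();
                    if (!name.isEmpty()) {
                        counts.merge(name, 1L, Long::sum);
                    }
                });
            }

            // Pass 2: keep a running "top 5" with a size-bounded min-heap - O(M) over distinct names.
            PriorityQueue<Map.Entry<String, Long>> top5 =
                    new PriorityQueue<>(Map.Entry.<String, Long>comparingByValue());
            for (Map.Entry<String, Long> e : counts.entrySet()) {
                top5.offer(e);
                if (top5.size() > 5) {
                    top5.poll(); // drop the current smallest count
                }
            }

            List<Map.Entry<String, Long>> result = new ArrayList<>(top5);
            result.sort(Map.Entry.<String, Long>comparingByValue().reversed());
            result.forEach(e -> System.out.println(e.getKey() + " -> " + e.getValue()));
        }
    }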

  • What is the best approach to converting LV7.1 tags to LV2012 shared variables in multiple VIs?

    What is the best approach to upgrading from LV7.1/DSC tags to LV2012/DSC shared variables, in multiple VIs running on multiple platforms? Our system is composed of  about 5 PCs running Windows 2000/LV7.1 Runtime, plus a PLC, and a main controller running XP/SP3/LV2012. About 3 of the PCs publish sensor information via tags across the LAN to the main controller. Only the main controller is currently being upgraded. Rudimentary questions:
    1. Will the other PCs running the 7.1 RTE (with tags) be able to communicate with the main controller running 2012 (shared variables)?
    2. Is it necessary to convert from tags to shared variables, or will the deprecated legacy tag VIs from LV7.1 work in LV2012?
    3. Will all the main controller VIs need to be incorporated into a project in order to use shared variables?
    4. Is the only way to do this to find all tag items and replace them with shared variable items?
    Thanks in advance with any information and advice!
    lb

    Hi lb,
    We're glad to hear you're upgrading, but because there was a fundamental change in architecture since version 7.1, there will likely be some portions that require a rewrite. 
    The RTE needs to match the version of DSC you're using. Also, the tag architecture used in 7.1 is not compatible with the shared variable approach used in 2012. Please see the KnowledgeBase article Do I Need to Upgrade My DSC Runtime Version After Upgrading the LabVIEW DSC Module?
    You will also need to convert from tags to shared variables. The change from tags to shared variables took place in the transition to LabVIEW 8. The KnowledgeBase article Migrating from LabVIEW DSC 7.1 to 8.0 gives the process for changing from tags to shared variables.
    Hope this gets you headed in the right direction.  Let us know if you have more questions.
    Thanks,
    Dave C.
    Applications Engineer
    National Instruments

  • What's the best approach to work with Excel and CSV files?

    Hi gurus, I've got a question for you. In your experience, what's the best approach to working with Excel or CSV files that have to be uploaded through DataServices to your data warehouse?
    Let's say your end user, who is not a programmer, creates a group of 4 Excel files with different calculations on a monthly basis, so they can generate a set of reports from their data warehouse once the files have been uploaded to tables in your DWH. The calculations vary from month to month. The user doesn't have a front-end to upload the Excel files directly to Data Services. The end user needs to keep track of which person uploaded the files for a given month.
    1. The end user should place their 4 excel files in a shared directory that will be seen by DataServices.
    2. DataServices will execute certain scheduled job that will read the four files and upload them to the Datawarehouse at a determined time, lets say at 9:00pm.
    It makes me wonder... what happens if the user needs to present their reports immediately and can't wait until 9:00pm? Is it possible for the end user to execute some kind of action (outside the DataServices environment) so DataServices "could know" that it has to process those files right now, instead of waiting for the night schedule?
    Is there a way that DS will track who was the person who uploaded those files?
    Would it be better to build a front-end for the end user so they can upload their four files directly to the data warehouse?
    Waiting for your comments to resolve this dilemma.
    Best Regards
    Erika

    Hi,
    There are functions in DS that capture input files automatically. You could use the file_exists() or wait_for_file() functions to do that. Schedule the job to run every few minutes and, if the file exists, process it. This could be done by using a file name convention with a date and timestamp, etc., or by moving the old files to an archive after each run so that DS waits for new files to show up.
    Check this - Selective Reading and Postprocessing - Enterprise Information Management - SCN Wiki
    Hope this helps.
    Arun
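    As a hedged sketch of the polling pattern described above (check for a file, process it if present, then clear/archive it so the next poll waits for a fresh upload), here is the same idea in Java; the directory, marker file name and one-minute interval are assumptions, and in the real setup the check would live in the scheduled Data Services job itself.

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.time.Duration;

    public class WaitForInputFiles {
        // Hypothetical locations: the shared directory the users drop files into,
        // and a marker file they create once all four Excel files are in place.
        private static final Path DROP_DIR = Path.of("/shared/excel_drop");
        private static final Path MARKER = DROP_DIR.resolve("ready.txt");

        public static void main(String[] args) throws Exception {
            // Poll every minute, mirroring "schedule the job every few minutes and run only if the file exists".
            while (true) {
                if (Files.exists(MARKER)) {
                    System.out.println("Marker found - trigger the load job here.");
                    // After processing, remove the marker so the next poll waits for a
                    // fresh upload (analogous to archiving the old files).
                    Files.delete(MARKER);
                }
                Thread.sleep(Duration.ofMinutes(1).toMillis());
            }
        }
    }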

  • Best approach to migrate from 9.2.0.1 to 9.2.0.6

    Hi friends.
    Actually I have a production database running on Oracle 9.2.0.1 on SUSE Linux. I bought a new server and I have to migrate my database from the old one. On this new server I'll have Oracle 9.2.0.6 and Red Hat. What is the best approach to make the migration?
    - By export/import? Will I have problems with users/grants/synonyms, etc., since I won't be able to use a full import because of the system users?
    - Upgrade 9.2.0.1 to 9.2.0.6 and then copy the database to the new server?
    - Copy the database to the new server and upgrade only the database to 9.2.0.6 by scripts?
    - Anything else??? :)
    Thx in advance.

    I'm not sure what you mean by "I won't be able to use a full import because of the system users". I can't seem to understand what you're trying to say here.
    If you have a relatively small database, exporting and importing is probably the simplest option. You can also restore a backup from the old machine to the new machine and upgrade the new machine in place, use transportable tablespaces, patch the old database and set up the new system as a standby database, etc. There are plenty of options; we'd need more information about how much data we're talking about, what sort of downtime window you have, whether you're concerned about speed, administrative complexity, etc.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • What is the best way to migrate zoning from an MDS9216i to a Nexus 5596?

    I am migrating from an MDS9216i to a Nexus 5596. We have dual paths, so I can migrate one leg at a time, and the sole function of these switches is Fibre Channel attached storage.
    What is the best way to migrate the zoning information? I have been told I can ISL the new and old switch together and then push the zoning to the new switch.  Is that better than just cutting and pasting the zoning information to the new switch?
    Also, I have to move four Brocade M4424 switches to the new switches - they are currently attached via interop mode 1, but will now be attached using NPIV.  Has anyone done this before, and did you have any issues?
    Any help or advice would be appreciated. Thanks!

    Use an ethernet cable to connect the two computers, and set up file sharing. After that copying files from one computer to the other is exactly like copying from one hard drive to another:
    http://docs.info.apple.com/article.html?artnum=106658

  • What is the best approach to capture TBOM's for a SAP SRM system/functionality?

    Hello SCN Community,
    It would be much appreciated if somebody could share some information about the following....
    What is the best approach to create TBOMs for a SAP SRM system? The SRM functionality basically consists of multiple ABAP Web Dynpros that are connected as a process via a SAP Portal (as I understand it). The entry point to the SRM functionality is via the SAP Portal.
    Do I first have to create a link to the Portal via an SAP Web Application link in SOLAR01 and then start recording? Will it record only the portal objects or also the ABAP Web Dynpro objects?
    Do I have to list all the separate ABAP Web Dynpros in SOLAR01 and use those as a starting point?
    I am myself more familiar with the more classical SAP ABAP ECC systems and transactions. I could hardly find any information on the use of BPCA and the required TBOMs in the area of SRM... Any help would be much appreciated!
    Kind Regards,
    Guido Jacobs

    Hi Guido,
    a new blog was released today; maybe this helps:
    BPCA - Powerful Risk Eliminator
    Best Regards,
    Christoph

  • What is the best approach to setup intranet and internet sites in SharePoint 2013?

    I am planning to set up an internet and an intranet website for one of our clients. What is the best approach to set up this kind of environment?
    Some of the users (registered users) from the internet should be able to access information on the intranet site. I have created two web applications, one for the intranet and one for the internet. Is this the right way to go forward?
    Thanks in advance! :)
    LM

    Hi Laemon,
    Creating two separate web applications, one for the Internet site and the other for the Intranet, is the right approach.
    1. Properly planning your web application, site collection, and website structure is of utmost importance to ensure you build your site in a professional and recommended way. Go through this article from TechNet, which will help you plan your site in SharePoint 2013.
    https://technet.microsoft.com/en-us/library/cc263267.aspx
    2. Planning and choosing the right authentication type is also a very important decision. I recommend you go through the article below if you have not already done so.
    Plan for user authentication methods in SharePoint 2013
    3. Plan the licensing for your SharePoint 2013 Internet-facing website.
    Licensing Internet Sites Built on SharePoint 2013
    SharePoint 2013 licensing for Internet facing sites
    4. To grant registered users access to the Intranet site (as you mentioned in your question): if you created both web applications in the same farm (same domain), then it is easy to grant access using site permissions with Windows authentication enabled for both web applications. If the web applications are created on different domains and there is a two-way trust in place, and the SharePoint servers have the necessary port access to the remote domain's Domain Controller, then it is automatic. If it is a one-way trust, then you need to follow these directions:
    http://technet.microsoft.com/en-us/library/cc263460(v=office.12).aspx
    If there is no domain trust in place, then you either need to create one or look at alternative technologies, such as ADFS.
    Please remember to upvote if it helps you, or click 'Mark as Answer' if the reply answers your query.

  • What is the best approach to portalize a struts application to Oracle Portal

    Hey guys,
    what is the best approach to portalize a struts application to Oracle Portal... would it be by creating PDK portlets or JSR 168 portlets... and is there any direction on how to do so... can someone please help this girl in need...

    "Hello!!! Welcome back!! I think you should use deploy...."
    Hey, thanks.
    Is there any network congestion or any other problem that I can anticipate before I use the "deploy" utility? I have heard of some problems (I couldn't remember them now... because, honestly, I couldn't understand them at all when a BEA consultant told me about them...).
    So, is there any problem that may arise... that I need to think about before deploying ~10 applications to something like ~70-80 clusters... all at a time?
    Thanks again for your advice. I am learning to see the big picture of application deployment.
    -sangita
