Data centre connectivity options

Hello,
I am currently investigating a dual data centre design running in active/active mode. The data centres will each have connectivity to our MPLS WAN and to the Internet, and they will also have dedicated links to each other for site replication etc.
Having read a few of the Cisco SRNDs, what I am still a little unclear about is whether it is better to connect the two data centres over the dedicated link at layer 2 or layer 3, and what the pros and cons of each are. I would appreciate any experiences (good and bad) that people have had in this area.
My instinct is to go layer 3, eliminating a potential spanning tree issue that could affect both data centres, but I am sure there are more issues than this to take into account.
Many thanks

I have redundant data centers, and they have been set up as follows for specific reasons.
(These data centers are not separated by a WAN; if they were, a T3 or better would be required in my case, but I'd opt for a metro fiber type of solution to provide gigabit or better.)
Using the three-tier hierarchical network design (core, distribution, access):
1) The CORE is L3/routed; we do not want an L2/switched core, for a few reasons. One is to alleviate STP and its inherent problems.
(The core should be moving packets as fast and as predictably as possible; STP can interrupt this and cause complete packet-forwarding delay or worse. With today's routers, routing packets is just as fast as switching them, or faster in some cases.)
2) The distribution layer is switched, with fully meshed gigabit or faster trunks to both cores. It also provides redundant inter-VLAN routing for all the VLANs controlled in its specific 'distribution blocks'; I have 5 fully redundant distribution blocks with VLAN routing and VLAN load balancing via HSRP.
(I channel up to 6 gigabit trunks in a given link.)
3) The access layer is switched, with fully meshed gigabit or faster trunks to at least two distribution switches per access switch; at least one trunk towards each core.
(No routing is performed at the access layer.)
Other factors, such as the routing operation, the location and number of distribution switches, administration and speed, also affect the design.
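A minimal sketch of the HSRP-based VLAN load balancing mentioned in (2) above, with assumed interface names, VLAN IDs and addresses (none of these values come from the original post); the peer distribution switch would carry mirrored priorities so each VLAN's active gateway alternates across the pair:

```
! Distribution switch A: active gateway for VLAN 10, standby for VLAN 20.
interface Vlan10
 ip address 10.1.10.2 255.255.255.0
 standby 10 ip 10.1.10.1          ! virtual gateway address for VLAN 10
 standby 10 priority 110          ! higher than the peer -> active here
 standby 10 preempt
!
interface Vlan20
 ip address 10.1.20.2 255.255.255.0
 standby 20 ip 10.1.20.1
 standby 20 priority 90           ! lower than the peer -> standby here
 standby 20 preempt
```

With hosts in VLAN 10 defaulting through switch A and hosts in VLAN 20 through its peer, both distribution switches carry traffic in normal operation rather than one sitting idle.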

Similar Messages

  • Open Hub (SAP BW) to SAP HANA through a DB connection: data loads, but the 'Delete data from table' option is not working. Please can anyone from this forum help?

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB Connection (named HANA).
    Whenever I create an Open Hub destination as a DB table using this DB connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the 'Delete data from table' option:
    16 records loaded from BW to HANA, the same on both sides.
    The second time I executed it from BW to HANA, 32 records arrived (it appends).
    I then executed the Open Hub service with the 'Delete data from table' option checked:
    now I am getting a short dump, DBIF_RSQL_TABLE_KNOWN.
    If I test SAP BW system to SAP BW system, it works fine.
    Is this option supported through a DB connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub level (definition level, DESTINATION tab and FIELD DEFINITION): I have already selected the check box there, but even though it is selected, the deletion is not performed at the target.
    SAP BW to SAP HANA via DB connection:
    1. The first time from BW, say 16 records: DTP executed, loaded to HANA, 16 records on both sides.
    2. The second time it was executed from BW, the HANA side appended, so 16+16 = 32.
    3. So I selected the check box at the Open Hub level, 'Deleting data from table'.
    4. Now executing the DTP throws a short dump: DBIF_RSQL_TABLE_KNOWN.
    Please tell me how to resolve this. Is the 'Deleting data from table' option applicable for HANA?
    Thanks
    Santhosh Kumar

  • What connection options do I have in transfer of data from my macbook pro 13" to my iMac

    What connection options do I have in transfer of data from my MacBook Pro 13" to my iMac?

    If available, use a Firewire cable and put the origin in Target Disk Mode.  That's how I rescued all my stuff, all 150GB of it, from my now dead Early '08 MBP to my current one.
    With the new crop, I believe you can use the Thunderbolt ports as well.

  • Why do I have so many connection to Apple data centres?

    My home network was a little slow tonight, so I checked the traffic flow on the router. I found hundreds of secure connections to a range of addresses; after checking, I noted that most of the remote addresses are Apple data centres and all the internal addresses are from iOS devices. I have around 6 devices at home, all on iOS 8.
    These connections are not sending large amounts of data and are not slowing things down too much, but I am keen to know what they are for.

    What services do you have enabled in your iGadget?
    Location? Find My iPad? Background App Refresh? Handoff? The iCloud services? To mention a few. And that's just the OS.

  • SQL 2012 AlwaysOn Dual Data Centre (an instance in each data centre with a secondary in each other respectively)

    Hi, hopefully someone will be able to find my scenario interesting enough to comment on!
    We have two instances of SQL Server; for this example I will call them 'L' and 'J'. We also have two data centres; for this example I will call them 'D1' and 'D2'. We are attempting to create a new solution, and our hardware budget is rather large. The directive from the company is that they want to be able to run either instance from either data centre. Preferably the primary for each will be separated, so for example:
    Instance 'L' will sit in data centre 'D1' with the ability to move to 'D2', and...
    Instance 'J' will sit in data centre 'D2' with the ability to move to 'D1' on request.
    My initial idea was to create a 6-node cluster: 3 nodes in each data centre. Let's name these D1-1, D1-2, D1-3 and D2-1, D2-2, D2-3 to signify which data centre they sit in.
    'L' could then sit on (for example) D1-1, with the option to move to D1-2 (synchronously) or D2-1/D2-2 (asynchronously).
    'J' could sit on D2-3, with D2-2 as a synchronous secondary and D1-3/D1-2 as the asynchronous secondaries.
    Our asynchronous secondaries in this solution are our full DR options; our synchronous secondaries are our DR option without moving to another data centre site. The synchronous secondaries will be set up as automatic fail-over partners.
    In theory, that may seem like a good approach. But when I took it to the proof-of-concept stage, we had issues with quorum...
    Because there are three nodes on each side of the fence (3 in each data centre), neither side has the 'majority' (the number of votes required to take control of the cluster). To get around this, we used WSFC with Node and File Share Majority, with the file share sitting in the D1 data centre. Now the D1 data centre has 4 votes in total, and D2 only has 3.
    This would be a great setup if one of our data centres were defined as the 'primary', but the business requirement is to have two primary data centres, with the ability to fail over to one another.
    In the proof of concept, I tested the theory by building the example solution and dropping the connection which divides the two data centres. The data centre with the file share stayed online (as it had the majority), but the other data centre lost its availability group listeners. SQL Server stayed online, just not via the AG listener's name, i.e. we could connect to the instances via their hostnames rather than the shared 'virtual' name.
    So I guess really I'm wondering: did anyone else have experience of this type of setup? Or are there any adjustments that can be made to the example solution, or to the quorum settings, to provide a nice outcome?

    So if all nodes lost connectivity to the file share, there would still be a total of 6 votes visible to each node. Think of people holding up their hands, where each node counts every hand it can see. If the link between the two sites then went down as well, each node on each side would only see 3 hands being held up. Since the quorum maximum is 7 votes, the majority a node needs to see is 4, so in that scenario every node would realise it had lost majority and would go offline from the cluster.
    Remember that the quorum maximum (and therefore the majority) never changes *unless* YOU change node weights. Failures just mean there is one less vote that can be cast, but the required majority remains the same.
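The vote arithmetic above can be written out as a toy model in a few lines of Python (just the arithmetic, not WSFC itself):

```python
def majority(total_votes):
    # WSFC-style majority: strictly more than half of the configured votes.
    return total_votes // 2 + 1

TOTAL_VOTES = 7              # 6 nodes + 1 file-share witness
need = majority(TOTAL_VOTES)  # 4

# File share lost: each node still sees the 6 node votes -> cluster stays up.
print(6 >= need)  # True

# Inter-site link also down: each side sees only its own 3 votes.
# The required majority does not shrink, so both sides go offline.
print(3 >= need)  # False
```

This is why failures alone never lower the bar: only changing node weights (or the witness configuration) changes what `majority()` returns.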
    Thanks for the compliment, by the way, very kind! I am presuming from your tag that you might be based in the UK. If so, and you are ever nearby, make sure you drop by and say hello! I'll be talking at the London SQL UG two weeks from today if you are around.
    Regards,
    Mark Broadbent.

  • SAP Data Centres: Data At Rest Encrypted?

    http://help.sap.com/saphelp_byd1502/en/ktp/Products/96e54f70e7b2450895d71966b531edf7/Publishing/Guides/SecurityGuide.pdf
    Hello everyone,
    A quick one: I was having a look through the PDF at the above URL ('SAP Business ByDesign, SAP Cloud For Customer, and SAP Cloud For Travel Expense: Security Guide'), and it struck me that I couldn't find a single reference to data at rest in their data centres being encrypted. There's lots of talk about transport security, IDSs, physical security etc., but I didn't spot anything about actual encryption of data at rest. Is anyone knowledgeable on this subject able to enlighten me? It seems like a no-brainer that they would do so, of course, but I don't like to assume anything and would like to know what their confidentiality, integrity, etc. measures are for this. Thanks!
    Lewis

    Hi,
    My scenario is similar.
    Let us assume user A, who is not supposed to see a particular BEx query connection in the BI Launch Pad in SAP BO BI 4.0.
    This can be accomplished at:
    1) the BI end, by creating an analysis authorization for a BI/BEx query, unless SAP Single Sign-On is enabled.
    Question A) Can this be done without SAP Single Sign-On enabled? If so, what are the other options from the SAP BI end?
    Question B) Can you please advise the steps in the CMC for how this can be done?
    Is it possible for a user in the BI Launch Pad not to have the option "Select a BEx Query as data source" at all?
    If so,
    Question C) Can you please advise the steps in the CMC for how this can be done?
    Thank you,
    Venu

  • How to access a Network Share between two servers in same data centre

    I have two dedicated servers (both Windows Server 2012) hosted in a data centre somewhere. I want to share a folder on one server with the other server, but it's obviously not as straightforward as one might think. My servers are called "Maximus" and "Apprentice".
    On Maximus I shared a folder by right-clicking on it and choosing "Share with... / Specific people", then typed in the name of a local user account which also exists on Apprentice with the same name and password (so each server has a local user account with the same name and password).
    Then, on Apprentice, I was hoping I could access the share (while logged in as the user with whom the folder was shared) by simply typing "\\ipaddress\sharename" into the address bar in File Explorer. Unfortunately it comes back with "Windows cannot access [ip address]".
    Now, I do have a website set up on the IP address for Maximus. This is actually the reason I want to create this share: I need the second server for load balancing and need to share the IIS config as well as the website itself. (So I will need two shares eventually, but for now I'm just trying to get one to work.) I don't know if the fact that the IP address is pointing to the website is causing me problems here, or if it's something else.
    Are there any network gurus out there who can tell me what the issue is and how to resolve it?

    I can ping both servers in either direction, but I believe I may have found the problem: apparently my host is blocking port 445, which Windows wants to use to connect to the share, and they will not unblock it.
    Is there a way to connect to the share through a different port?
    To my knowledge, you cannot change the port. However, you can try disabling your security software for testing. If this fixes the problem, then you need to adjust your security software configuration so that traffic on this port is not blocked or filtered.
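One quick way to confirm that port 445 is indeed being blocked is a plain TCP connect test; this sketch uses Python's standard socket module (the address in the comment is a placeholder, not the real server):

```python
import socket

def tcp_port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, etc.
        return False

# Placeholder address standing in for the 'Maximus' server:
# tcp_port_open("203.0.113.10", 445)
```

If the connect fails from Apprentice but a test against a port the host does allow succeeds, that points at an upstream filter rather than a Windows configuration issue, which would match the host's statement about port 445.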
    Ahmed MALEK

  • Error while importing the tables from MySQL using the data source connection

    Hi,
    I am trying to import tables from MySQL into PowerPivot using the data source connection. If I use the import with the Query option it works fine, but not with the 'select a list of tables' option.
    When I click on the 'select a list of tables' option, I get the error below after selecting all the tables to be imported:
    OLE DB or ODBC error.
    An error occurred while processing table 'XXXXXXXXXX'.
    The current operation was cancelled because another operation in the transaction failed.

    Hi Bharat17an,
    Please provide detailed information about how you created the MySQL connection in your PowerPivot model. Here is a good article regarding how to use MySQL and Microsoft PowerPivot together for your reference; please see:
    http://www.datamensional.com/2011/09/how-to-use-mysql-and-microsoft-powerpivot-together-2/
    If this issue still persists, please help to collect Windows event log information. It may be helpful for us to troubleshoot this issue.
    Regards,
    Elvis Long
    TechNet Community Support

  • WAS 6.1 JNDI data source connection CRXI R2 Developer

    Hi all,
    I've been tasked to get a JNDI connection from a WAS 6.1 server to an Oracle DB to use in developing a report.
    The CR developer I'm using is CR Developer; Product Type: Full; Product Version: 11.0.0.1282.
    I have created a simple Java 'main' class to test the JNDI connection using the following jars:
    com.ibm.ws.admin.client_6.1.0.jar
    com.ibm.ws.runtime_6.1.0.jar
    com.ibm.ws.webservices.thinclient_6.1.0.jar
    ibmorb.jar
    ojdbc-14.jar
    rsadbutils.jar
    rsahelpers.jar
    I have been able to retrieve data from the Oracle DB through the 'main' class and display it on the console.
    I then set the CRConfig.xml file's classpath to use these jars, along with a jar I created that contains a jndi.properties file setting org.omg.CORBA.ORBClass=com.ibm.CORBA.iiop.ORB, because the JVM that the CR developer is pointing at is a Sun 1.6 HotSpot JVM.
    When I try getting the connection (Database -> Database Expert -> Create New Connection -> JDBC (JNDI), select the JNDI Connection option and enter the relevant information), the developer finds the JNDI names that I'm looking for. Then I click the Finish button and receive the following error pop-up:
    Logon Failed.
    Details: SQL Exception: SQL State: null ERROR Message: Invalid argument(s) in callDSRA0010E: SQL State= null, Error Code = 17,068 Database Vendor Code: 17068
    I have noticed that I can replicate this error message, through the 'main' class I used to test the connection, if I remove the username/password arguments from the method I use to get the DB connection.
    Is there anything that I'm missing to successfully use this WAS data source?
    I have tested a WebLogic 10.3 connection, and the CR Developer had no problem finding/using that connection.
    Any help that can be provided would be greatly appreciated.
    Below is the JDBC tag from the CRConfig.xml file that I'm using:
    <JDBC>
         <CacheRowSetSize>100</CacheRowSetSize>
         <JDBCURL></JDBCURL>
         <JDBCClassName></JDBCClassName>
         <JDBCUserName></JDBCUserName>
         <JNDIURL>iiop://server:bootstrap_address-port</JNDIURL>
         <JNDIConnectionFactory>com.ibm.websphere.naming.WsnInitialContextFactory</JNDIConnectionFactory>
         <JNDIInitContext>cell/clusters/clusterName/jdbc</JNDIInitContext>
         <JNDIUserName>schema-username</JNDIUserName>
         <GenericJDBCDriver>
              <Option>No</Option>
              <DatabaseStructure>catalogs,tables</DatabaseStructure>
              <StoredProcType>Standard</StoredProcType>
              <LogonStyle>Oracle</LogonStyle>
         </GenericJDBCDriver>
    </JDBC>
    Thanks in Advance
    Edited by: java_geek on Jul 28, 2010 3:31 AM


  • How to change the Data base connection Dynamically

    Hi
    I created a Crystal Report using Crystal Reports 12.0, using the standard wizard and a local database connection, and built a .NET program for it. Now I need to change the database, because I want to install it at a client site and they use a different connection. How is this possible? Please tell me, it is urgent.
    Thanks in advance

    Hi
    To change the database you can use the Set Datasource Location option that is present in the menu bar:
    Go to Database
    -> Set Datasource Location
    -> Replace the current data source with the new data source.
    Hope this helps
    Shraddha

  • "Delete data and restore" option in Nokia N8

    I want to restore my N8 to a fresh state, as I have installed and uninstalled too many apps and they have all created and left a lot of files on the C drive even after uninstalling. They have occupied some space on the C drive.
    I am going to use the "Delete data and restore" option in phone management. I will back up all data with Nokia Suite before the procedure. I just want to know which of the apps installed by default will be lost. Please enumerate every app or item that will be lost.
    I know that the default wallpapers and songs will be lost; I'm interested in the default apps only, like Dictionary, Zip, Communicator, Video editor, Photo editor, My Nokia, Recorder, Shazam, Msg. reader, Adobe Reader, Nat Geo, Movie Teasers, YouTube, E! etc. I have mentioned most here, but may have missed some. Please list them in detail.
    Thanks a lot.
    One more thing: this area should be improved by Nokia. Uninstalling any app should remove it completely, without any footprints.

    If you go to this page (and are logged in to the store) you will find all apps your previously downloaded and these are the apps that will be removed when you do a full reset. You can re-install them again afterwards provided you have not done so 10 times before.
    Please follow the steps below if you decide to completely reset your N8 and start with a fresh device:
    If you have data on the N8-00 you wish to keep create a backup using Nokia Suite but when asked deselect the option to include settings in the backup. You can start a backup through Tools>Backup in the Nokia Suite menu.
    Now disconnect your N8-00 from the PC and select the offline profile, press the on/off switch and from the menu which appears select and activate the profile offline.
    Enter the code *#7370# followed by the security code 12345. Please be aware this reset may take a while. Your N8-00 will restart when finished.
    After the N8-00 restarts go through the Device First Use sequence to setup the N8-00 correctly.
    Now connect the N8-00 to your PC again and let Nokia Suite install the required files on the N8-00. After this, (re)install the latest system update. When you go to the update screen in Nokia Suite you will either be presented with a system update, or you can choose to re-install the current (up to date) version again.
    After the update verify if there are any additional updates and install these as well.
    After this you can restore the backup if you made one.
    Kosh

  • Self Service Restore - Data Centre Failure

    I'm evaluating the new 'Standard' tier of SQL Database and the self-service restore feature. My question is about where the backups for self-service restore are stored, and whether they help in a full data centre failure. E.g. if my Standard SQL DB is in Europe West and the only backup I am using is the built-in self-service restore, do I have any protection if Europe West has a major DR event? Or do I still need to export the databases to blob storage to handle this kind of event (whereby I would need to restore to Europe North manually)?

    As indicated in the "Service Tiers Details" table in this blog posting, we have a couple of new features which are not yet enabled for the new tiers.
    One feature will be the ability to restore a database to an alternate data center in case the primary data center you are using is not available, or for any other reason you may want to restore a database in an alternate location. This recovery option uses geo-replicated backups. For this feature we only geo-replicate the full and daily backups, not the log backups, so point-in-time restore is not possible, but you will have an RPO of less than 24 hours. Think of it as always having access to daily off-site backups from any location. This feature will be available for Basic, Standard and Premium.
    The second feature, which we will enable later, is geo-replication for Standard. This will allow you to opt in to geo-replicating your Standard database to a passive secondary in an alternate location. We will have more details about this feature in a few weeks.
    Until you have access to one of the above two features, you will have to use export-to-blob for the data center failure case. We are working hard to deliver both features as soon as we can.
    I hope this helps.
    Tonyp

  • DMZ VLANs in the Data Centre - Physical or Logical Separation

    I am building a new DMZ in my data centre and I'm looking at the merits of logical separation rather than physical separation.
    Instead of putting in some new DMZ switches and then physically cabling all the DMZ devices and servers to those switches, so that they are physically separate from the rest of the DC, I'm thinking of connecting them to the existing DC switches and just using a different set of VLANs, with the routed interfaces for these on physical firewalls.
    Can people please apprise me of the concerns or issues with this? Are there any articles or design papers on this?
    Thanks

    Come on guys I expected someone to at least make some form of comment!
    It looks like either the community doesn't know or doesn't care!

  • The Data Model connection could not be created

    Hello, 
    Using Power Query, I am trying to access daily logs conveniently; there are over 24 million records in a month.
    I have selected the columns I need and checked the 'Load data to Model' box. Once the query has run, it returns the error "The Data Model connection could not be created".
    Thank you for any help on this :)

    This is a tough one, because as an add-in we can't get much information from the API for failures like this. Our general recommendation for people working with big data sets is to use 64-bit Excel. I realise that's a big hammer, but it might be your only option if you need to work with a 24-million-row data set.
    Another option (which you may have already done) is to filter and aggregate that dataset as much as possible before adding it back into Excel. The analysis done inside the Power Query editor is less likely to negatively affect the working set on your machine. (It does depend on how the query is built and where the data comes from. If we have to pull all the data into memory to do some of the operations, then it won't help at all.)

  • Data packet connection received

    On a fairly regular basis, approximately once an hour, my N86 beeps and a message appears on the screen stating that a "data packet connection" was received. This also happens when I'm on a call, beeping in my ear.
    This is only a minor annoyance, but an annoyance nevertheless. Any idea what this is and what I can do about it?

    I also have this problem with my N5800. I have noticed on my phone bill that I have connected to the internet every 30 minutes.
    When I hold the menu button to see what apps are active, I only ever see the "home" and "menu" logos. Under Admin settings > Packet data > Packet data connection, the only options are "When needed" and "When available".
    What is the best way to find out which app is connecting to the internet and skyrocketing my bills?!
    Nokia 5800 XpressMusic / FW 50.0.005
