Internal and External application tier File move

Hi all,
I am new to Oracle Apps R12 with a 3-tier instance.
We have created an Oracle Apps R12 instance with 3 tiers: two application tiers and one database tier.
The database tier is common to both application tiers. The issue is that we have moved all of our customized development work onto only one application tier (the internal tier), and none of it onto the second application tier (the external tier).
My question is: if we do not move all the customized work to both application tiers, will any issue occur?
And how can we handle our customized work across both application tiers (internal and external)?
Please help,
Thanks,
Prab.

Hi;
They are questioning why we have to do that. The answer is easy: you can mention that for EBS stability all nodes should be at the same level :) If they push you further, show them the Metalink notes already shared by Hussein Sawwan. If they still persist, tell them you will raise an SR and confirm with Oracle Support ;)
PS: Please don't forget to change the thread status to Answered when you believe your thread has been answered; otherwise other forum users waste time going through open questions that have already been resolved. Thanks for understanding.
Regards,
Helios
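
On the database side, a quick sanity check (a hedged sketch, not from the replies above; the FND_NODES columns used here are the standard R12 ones, but verify them in your own instance) is to confirm that both application tier nodes are registered and enabled for the services you expect before comparing their file systems:

    -- list every registered node and the services it is flagged to run
    select node_name, support_web, support_forms, support_cp, support_admin
      from fnd_nodes
     order by node_name;

If both the internal and external nodes show the web/forms services you expect, the remaining work is keeping the customizations (forms, reports, OA pages, CUSTOM_TOP files) identical on both file systems.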

Similar Messages

  • Can I split my iTunes music library between an internal and external hard drive?

    Is it possible to have music on the two different drives and still be able to make playlists from both drives to put on my ipod?  Here's my situation:  I'm using a Dell 32 bit computer running Vista Home Premium SP2 which I share with other users.  I have a large itunes library (40g+ of music only) that's starting to hog the hard disk space from other users. If possible, I would like to split off part of my itunes library to an external hard drive while still having access to the entire library for my ipod. Ideally I would like to have all my downloaded songs on both the internal and external drives so I'll have backups in case a drive fails, while putting all the music I have hard copies of solely on the external drive. Any help on how this can be done would be appreciated.

    If you manually manage your iTunes library, you can store parts of it on an external drive or on an internal drive. Just keep in mind that iTunes requires you to set a default location for it to manage its files, and that will be the default location where iTunes stores anything it downloads. You would need to manually move things from there if that is not where you want them stored.
    Perhaps a "better" solution would be for you to invest a little money into 2 external drives: one to hold your entire iTunes library, and the second to be your offline backup of the first. This is the method I have gone with and I like how it works for me. My entire iTunes library is on an external (portable) 320GB drive. Then, when iTunes is not running, I clone my external drive onto a second drive I happen to have; this way, if my primary drive ever fails, I lose at most about a week's worth of content. I purchase little content during a week, but I do move a fairly large volume of podcasts. With the entire library on the external drive, if I ever have to swap to my backup drive, iTunes will just think it hasn't run and downloaded anything for a week, and all the "lost" content will be re-downloaded automatically. This won't work for apps or purchased content, but for subscribed podcasts it is great.

  • Internal and external DVD burner ?

    Hi all, I have a general question. I currently have an HP ProBook 6455b, which obviously has an internal DVD burner, and I am looking to add an external DVD burner. Is it possible to run an internal and an external burner at the same time? Would I need additional burning software? Burning is quite slow; it takes over an hour to burn one DVD. I'm currently using Windows DVD burner because it is simple and effective. Or should I just use another program that is quicker? Any ideas or suggestions on fast burning programs? Thanks, you guys are the best!

    You're very welcome.
    I have never burned a movie with mine but I have burned large 3+ GB ISO images using CD Burner XP, and it takes just a few minutes.
    Now, if you mean can you insert a disk in one drive, and copy that disk via the other drive using the same program set to copy or grab disk, that should be possible.
    It should show one drive as the one to copy from and one to copy to, just like if you had two DVD drives in a desktop PC.
    If you are asking can you simultaneously burn two different files on two different burners using the same program at the same time, that I have no clue. I guess you would have to try it and find out.  You would open the program and burn one file to a disk and open the program again and burn another file to the other DVD burner. If the program won't let you open two instances of it at the same time, then you probably could open a different DVD burning program such as the one I posted above and use that to burn the other files with.
     Paul

  • Internal and external facing applications on same infrastructure

    I'm looking for suggestions on the best way to architect an apex production environment where you may have two or three apps open to the public and 10 or more for internal access only. All of the apps (regardless of public or private) are running on the same APEX instance, DB, app tier and web tier.
    We are using the APEX Listener on Weblogic for the app tier with an OHS webserver and Load Balancer in front of everything.
    The Load Balancer houses all of our certificates and has the ability to perform iRules to make more friendly urls.
    Our approach is to assign each app (i.e., https://someurl.com/apex/f?p=APPID) a static IP from the load balancer and then firewall public/private based on APPID to prevent internal-only apps from being reached outside the network.
    Unfortunately the iRule friendly-URL rewrite isn't able to mask the APPID in the URL (https://someurl.com/apex/f?p=200), which currently allows anyone to change the APPID parameter of the URL and cycle through all the apps, regardless of the firewall rule in place to prevent them from being publicly accessible.
    For example, if we have the following apps deployed and the only one allowed open to the internet is app 100, the URL rewrite isn't able to mask the APPID of 100 (or the app alias, if used).
    Publicly accessible:
    https://someurl.com/apex/f?p=100 (192.168.25.100)
    Internal only access:
    https://somedifferenturl.com/apex/f?p=200 (192.168.25.200)
    https://anotherurl.com/apex/f?p=250 (192.168.25.250)
    https://subdomain.someurl.com/apex/f?p=300 (192.168.25.300)
    I could navigate to the publicly accessible URL https://someurl.com/apex/f?p=100, change the APPID to one of 200, 250 or 300, and still access apps which should not be open to the internet.
    Browsing directly from the internet to https://somedifferenturl.com/apex/f?p=200, https://anotherurl.com/apex/f?p=250 or https://subdomain.someurl.com/apex/f?p=300 would result in a page-not-found error, since their IPs are not reachable directly from the internet.
    What is the best practice to overcome the above scenario and utilize shared infrastructure for internal and external facing applications? Is mod_rewrite my only other option to accomplish this setup and bypass the load balancer?

    Hi Jeff,
    I'm not sure if this is the ideal recommendation, but I know of a way you could block the "internal-only" applications from being accessed externally.
    1) Create a function which inspects the CGI environment variables, e.g., HTTP_HOST, HTTP_PORT, etc. Using this information, you determine if the request is emanating from an internal server name or an external server name.
    2) Create an authorization scheme which returns FALSE if the host/port/other CGI isn't what you expect.
    3) Apply this authorization scheme to every application you wish to keep from an external site.
    I know this isn't ideal, as you have to add this to every "internal-only" application. And if you forget an application, then this application suddenly becomes available on the Internet. But it's one way. If all of the applications are in the same workspace, you could define this authorization scheme in one application and subscribe to it from the other applications.
    Joel
    P.S. From SQL Commands, you can see all of the CGI environment variables at your disposal using:
    begin
        owa_util.print_cgi_env;
    end;
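    A minimal sketch of the kind of function described in step 1 above (an illustration only, not code from the thread: the function name, the choice of HTTP_HOST as the deciding CGI variable, and the host names are all assumptions to adapt to your environment):
    create or replace function is_internal_request
      return boolean
    is
      l_host varchar2(4000);
    begin
      -- HTTP_HOST is the host header the client used to reach APEX; compare it
      -- against the names that are only published internally (strip any :port
      -- suffix first if your listener includes one).
      l_host := lower(owa_util.get_cgi_env('HTTP_HOST'));
      return l_host in ('somedifferenturl.com', 'anotherurl.com', 'subdomain.someurl.com');
    end is_internal_request;
    The authorization scheme would then be of type "PL/SQL Function Returning Boolean" with a body such as return is_internal_request; and applied to every internal-only application, as Joel describes in steps 2 and 3.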

  • SharePoint 2013 - Office Web Apps - Internal and External Use

    I have successfully installed SharePoint 2013 and Office Web Apps on Azure VMs inside an Azure Virtual Network (IaaS model). Everything is working well. However, my testing has shown that external users and internal users can't use Office Web Apps at the same time.
    Office Web Apps, installed on its own VM, accommodates an external and an internal URL quite well. However, SharePoint 2013 appears to allow only one setting for the WOPI zone, either internal or external but not both. I've set the WOPI zone to Internal-HTTPS (Set-SPWOPIZone -Zone "internal-https"). OWA works just fine if accessed from inside the Azure Virtual Network. However, if I try to access it from outside the Virtual Network, from the Internet, Office Web Apps fails. The exact opposite is also true: I can set the WOPI zone to External-HTTPS and accessing from the Internet works fine, but accessing from inside the Virtual Network fails.
    Am I missing something? I obviously want Office Web Apps to function properly for both internal and external users simultaneously.
    I appreciate any help anyone can provide here.
    Glenn

    Hi Glenn,
    To have both Internet and internal access available to your end users, you first need to configure the AAM settings. Open Central Administration > Application Management > Configure alternate access mappings. Let's say there is an existing web application named http://sharepoint and my end users on the local network are able to access it using the URL http://sharepoint (root site collection). Here you need to add the Internet URL by selecting the web application and clicking Edit Public URLs. Add the Internet domain to the web application, e.g. http://sharepoint.abc.com. You don't necessarily have to edit the binding settings in IIS. Before continuing with the next steps, make sure you are able to access http://sharepoint.abc.com from the Internet while still being able to access http://sharepoint from the local network (aka internal).
    On the machine where Office Web Apps (OWA) Server 2013 is installed, open PowerShell, add the OWA module, and use the following command to re-create a new OWA server farm if you've completed configuring it previously:
    New-OfficeWebAppsFarm -InternalUrl "http://owa" -ExternalUrl "http://owa.abc.com" -EditingEnabled
    In this case, I'm not using an SSL certificate to encrypt data over the Internet. You can use the Internet-public IP of the OWA server, like -ExternalUrl "http://198.xxx.xxx.xx". Add the CertificateName parameter if you want to use either a CA-issued certificate or a self-signed certificate.
    On your SharePoint machine, you need to re-bind all WFE machines to the WAC farm using the cmdlet New-SPWOPIBinding. Next, you need to set the WOPI zone for both internal and external:
    Set-SPWOPIZone -zone "external-http"
    Note: I'm not using certificates at all in this guidance, but configuring them is just a matter of adding a few more parameters.
    I've recently deployed OWA multi-server farms for both internal and Internet use for two big clients. In a real-world scenario, OWA should ideally be published through a firewall (Forefront UAG, TMG, F5, etc.). Please let me know if you still have issues after following my steps. My email: [email protected]
    Regards,
    -T.s
    Thuan Soldier

  • Unable to activate internal and external urls at the same time

    Hi,
    We have configured EBS R12 in a DMZ setup as described in Figure F-9 of Metalink note 380490.1, Option 2.4: Using Reverse Proxy with no External Web Tier,
    referring to note 726953.1, Case History: Implementing a Reverse Proxy Alone in the DMZ Configuration - R12.
    However, we are not able to activate the internal and external URLs at the same time in this configuration. Only the node where AutoConfig was last run gets activated as the web node.
    When trying to access the URL of the other node, it gets redirected to the URL of the node where AutoConfig was last run, and the error observed is: Error Code 502 Proxy Error. The specified Secure Sockets Layer (SSL) port is not allowed. (12204).
    Both the external and internal services are up; opmn status is live with no errors.
    We are using Apache as the reverse proxy.
    EXTERNAL Reverse proxy settings:
    s_login_page http://LONWEB01.process.com:81/OA_HTML/AppsLogin
    <TIER_DB oa_var="s_isDB">NO</TIER_DB>
    <TIER_ADMIN oa_var="s_isAdmin">NO</TIER_ADMIN>
    <TIER_WEB oa_var="s_isWeb">YES</TIER_WEB>
    <TIER_FORMS oa_var="s_isForms">YES</TIER_FORMS>
    <TIER_NODE oa_var="s_isConc">NO</TIER_NODE>
    <TIER_FORMSDEV oa_var="s_isFormsDev">YES</TIER_FORMSDEV>
    <TIER_NODEDEV oa_var="s_isConcDev">NO</TIER_NODEDEV>
    <TIER_WEBDEV oa_var="s_isWebDev">YES</TIER_WEBDEV>
    INTERNAL Middle Tier settings:
    s_login_page http://stprojapp01.test.com:8005/OA_HTML/AppsLogin
    <TIER_DB oa_var="s_isDB">NO</TIER_DB>
    <TIER_ADMIN oa_var="s_isAdmin">YES</TIER_ADMIN>
    <TIER_WEB oa_var="s_isWeb">YES</TIER_WEB>
    <TIER_FORMS oa_var="s_isForms">YES</TIER_FORMS>
    <TIER_NODE oa_var="s_isConc">YES</TIER_NODE>
    <TIER_FORMSDEV oa_var="s_isFormsDev">YES</TIER_FORMSDEV>
    <TIER_NODEDEV oa_var="s_isConcDev">YES</TIER_NODEDEV>
    <TIER_WEBDEV oa_var="s_isWebDev">YES</TIER_WEBDEV>
    Are we missing anything?
    Thanks & Regards

    Hi,
    Finally it's resolved. Sharing the solution in the forum:
    The configuration of the E-Business Suite environment for DMZ requires the profile options hierarchy type to be set to SERVRESP.
    To change the profile options hierarchy type to SERVRESP, execute the following SQL script as shown below:
    sqlplus / @/patch/115/sql/txkChangeProfH.sql SERVRESP
    After the script completes successfully, run AutoConfig on all nodes to complete the profile options configuration.
    It was resolved after doing this.
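    If you want to confirm the change took effect before running AutoConfig on all nodes, a hedged check (not part of the original resolution; HIERARCHY_TYPE is the standard column in FND_PROFILE_OPTIONS, but verify it in your instance) is to list the profiles whose hierarchy type is now SERVRESP:
    -- profiles converted to the server-responsibility hierarchy by the DMZ setup
    select profile_option_name, hierarchy_type
      from fnd_profile_options
     where hierarchy_type = 'SERVRESP'
     order by profile_option_name;
    You would expect the web agent and login related profiles (for example APPS_FRAMEWORK_AGENT and ICX_FORMS_LAUNCHER) to appear in this list once the script has run.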

  • Cannot install Yosemite as it keeps telling me both my internal and external drives are time machine backups

    Cannot install Yosemite. Keeps telling me that both my internal and external
    drives are backing up with TIme Machine. Any suggestions?

    Is it actively backing up, or are you encountering a dialog stating that Yosemite cannot be installed because the disk is being used for Time Machine backups?
    Read OS X: Cannot install on a volume used by Time Machine for backups
    If for some reason you have a folder called Backups.backupdb at the root level of the hard disk on which you want to install Yosemite, move it to the Trash.

  • ILife with both internal and external hard drives?

    I've been considering switching from a homebrew, multi-boot desktop to a MacBook for my primary computer, in part so I can hang out with my family in the living room rather than be exiled to the home office when I want to compute.
    But here's my concern: I have media. We have about 50 GB of iTunes; maybe 30 GB of iPhoto; and tons and tons of digital video that would be stored in iMovie. Obviously the libraries are all interlinked. And it's all growing. I also like to rip DVDs and re-encode them for my iPod and AppleTV. Right now, my desktop has 480 GB of internal storage and that's just about enough.
    I have discovered that the MacBook only comes with an option up to 250 GB. I absolutely need AppleCare, so I can't get an aftermarket hard drive. (All my Macs break - this one from the office that I'm on right now has a bum DVD drive, and my wife's has needed both fan and logic board replacements.)
    While I'm aware of the existence of external hard drives, I'm concerned about Apple's non-external-hard-drive-friendly way of storing iLife data. If I wanted to keep more recent or useful music and photos on the internal drive but older stuff on an external, and still be able to use iLife seamlessly, would that be possible? (I see myself editing recent video in the living room, but then hooking back into the external HD in the office if I need older stuff.)
    What solutions are out there for integrating data stores on both internal and external hard drives into an iLifestyle?
    Thanks!

    Sascha Segan1 wrote:
    .. What solutions are out there for integrating data stores on both internal and external hard drives into an iLifestyle?
    All the iApps (iPhoto, iTunes, iMovie '08) support using external drives as 'mass storage' devices: you can tell each app which drive to use for its library, and there are some tools out there which even allow the use of two or more different libraries in iTunes/iPhoto.
    For iMovie in detail: the Projects are small files and should stay internal (although I describe a 'hack' on my site, http://karsten.schluter.googlepages.com/im08tricks, for putting the Project Library (and Events) on an external hard drive); the Events (= GBs) can be located on as many external HDDs as you want.
    But ...
    all the iApps are single-user: you can NOT 'share' libraries between two or more different users; the idea of a 'media server' which hosts/shares all kinds of data to all kinds of users is not part of the iLife concept.

  • Best practises regarding Internal and External access to SIM

    Currently we have two separate Active Directories, one internal and one in the DMZ, and we plan to have one SIM on a segmented network, allowing our internal users direct access to the SIM UI and external users access through portlets that talk to SIM.
    The external AD hosts some internal users that also need access to the DMZ applications, so having a single SIM saves the effort of managing two separate SIM environments (development, tests, upgrades, unique UIDs, etc.).
    What is best practice on the market: is a single SIM the preferred choice, or one SIM internally and one SIM in the DMZ hosting suppliers, customers, etc.?
    With a single SIM environment, do you allow internal users accessing SIM from the Internet to change their internal AD password, or have you restricted that functionality in some way for internal users accessing SIM from the Internet?
    How about challenge-response questions: do you allow users to have the same ones both internally and externally, or do you set up different ones for different user interfaces?
    Is anyone willing to share how your environment is set up for internal and external access?

    Yes, for handling access to SIM we will probably need to look into some kind of access management solution to make it work in a secure way.
    The question is a bit complex, with many different factors controlling the outcome of the SIM implementation, but I hope to get some ideas from this thread on how we can solve it.
    The question still remains whether it is common to have one SIM or two, and what internal users are allowed to do in SIM from the Internet.
    For example, are internal users allowed to change their password in the internal Active Directory through SIM from the Internet, or what have others done to limit that functionality?

  • Backup internal and external hard drives-TC and offsite

    I now have Mavericks 10.9.3 on my iMac with a 2TB (1.25 TB used) internal hard drive.
    I also have some external drives attached to my iMac with older iPhoto libraries and other files (total file sizes 940 GB and 505 GB).  I primarily use Aperture now on my internal hard drive, but still access those iPhoto libraries on my external drives on occasion.  I have Time Machine backup my iMac internal hard drive on my 2TB Time Capsule regularly.
    My primary question is in regards to getting another copy onto a larger external drive that covers my internal and external drives so I can have a backup off-site.  Online backup services seem to always exclude external drives.  So a physical drive I can have offsite seems to be the best option.
    A year ago I changed the destination of my Time Machine backup to be on a 3TB external drive (backed up iMac internal hard drive and the two externals).  However, when I changed the destination of the backup back to the Time Capsule, the TM initiated a brand new backup (it did not recall that I had backed up prior to that on my Time Capsule).
    I want to backup monthly, if not quarterly for my off-site storage.  But, if every time I change the destination drive for TM, a new backup profile is created, it will overwork my drives unnecessarily.
    Is there a backup program or a process on "disk utility" I could run parallel to TM that I just use quarterly capturing only the changes/additions in those few months for both the internal and external hard drives?  Also, is there a way to add an external drive to my Time Capsule that is solely used to wirelessly backup the two externals on a regular basis (i.e. keep the internal 2TB drive backing up to the Time Capsule; and the external hard drive attached to Time Capsule via USB used as the backup drive for the external hard drives)?
    Summary:  I need to backup regularly to the local Time Capsule/additional external hard drive.  The data will come from my internal hard drive and my two external hard drives.  I also want to do quarterly backups of the additions/changes to all three drives to have on an offsite external drive that I manually backup to quarterly.  Any help is greatly appreciated.

    Carbon Copy Cloner is not on the App store.
    Correct, it is not approved because Apple does not like the fact that CCC (and most likely SuperDuper), among the most popular backup software for the Mac, makes a bootable clone. Apple will never approve of that. But let me assure you that is the genius of it: if the internal disk fails, you simply boot from the external. It is $40, but you can use it on all the computers in your home.
    CCC makes a clone, i.e. when it does the backup, any changes on the drive are applied to the clone. It does not work like Time Machine, which simply piles up incrementals until the drive fills up. The idea of CCC is a backup of the drive as it exists at a given point in time. TM, by the way, is also not a reliable archive (it thins backups constantly), so you should never rely on it to archive old versions; but in the middle of a project it does a good job of keeping various versions of your files. That is why I specifically said in my last post not to stop using it.
    and you can keep using the TC just for the internal drive.
    Keep TM running to the TC.. that will then keep a current hourly incremental of your drive. You can set CCC in a way which is a lot more flexible. ie backup just at the end of the day. There is no need for constant hourly backups. So to answer the second question.. you are still using your TC and TM.. but I suggest you only backup the internal drive.
    Please read a bit from forum expert Pondini on the value of clones and TM.
    http://pondini.org/TM/Clones.html
    That's why many folks use both Time Machine and a bootable clone, to have two separate, independent backups, with the advantages of both.  If one fails, the other remains.
    Now the ports issue.
    You can of course continue to use USB2. Just that moving large volumes around will be slow.. as doubtless you already know.
    On your particular Mac, since you missed out on USB 3 (which is a pain), you can buy a Thunderbolt to USB 3 adapter like the Belkin.
    http://www.belkin.com/au/p/P-F4U055/
    I also suggested the Thunderbolt to eSATA adapter (eSATA is an older interface and the adapter is rather cheaper, though I hear it is more reliable; check reviews for both).
    http://store.apple.com/au/product/H8875ZM/A/lacie-esata-hub-thunderbolt-series
    SATA is the interface of the hard disk itself; eSATA just means external SATA, so it is native and needs no conversion.
    I was merely suggesting ways to speed things up. As long as you don't run CCC more than once a day, I think you will be fine with just USB 2. It will take a while on the day you swap over to the archive volume, but if you turn off power saving on the Mac and let it run overnight it should get most of it done. You need to realise it has to deep-scan both disks to compare files, but CCC is based on rsync and is extremely fast and efficient; I am just not sure how long it will take. Anyway, there is a plan: tweak and adapt as you see fit.

  • Hierarchies: time-dependent and internal and external hierarchies

    Hi Gurus,
    Please let me know about time-dependent hierarchies and about internal and external hierarchies.
    Regards
    Karan

    OK, to clarify,
    Coflaher,
    My end goal is to use Jabber for Windows as an internal IM client, which currently works fine as "saMAccountName"@domain.loc. I would also like to be able to federate to the services listed under the Interdomain Federation section of the document you referred to above.
    How should my IM address field read? Since you say it needs to be resolvable from external sources, this leads me to believe my IM field should read "firstname.lastname"@domain.ca instead of "saMAccountName"@domain.loc.
    Yet when I add the following to my jabber-config.xml file, it seems to lose its mapping to all the other fields in AD.
    This part of the documentation doesn't seem entirely clear. Your assistance is much appreciated.
       true

  • Internal and External sources for OBIEE

    Hi,
    When we say OBIEE can integrate data feeds from internal and external sources, what exactly does this mean? Can OBIEE even do that? Thanks

    Hi,
    We have a requirement where the data comes from both internal and external sources. I was not sure about it either, so I raised the question on the forums. I am assuming internal sources would be Oracle etc. within the system, and external would be outside the application. I'm not sure exactly what is meant by it.

  • Shared Application Tier File System in Oracle Applications R12

    Hi
    Recently I came to an environment where we have 3 nodes with a shared application tier file system: one is the Admin & DB node and the other two are application nodes (Apache and Forms) with load balancing. Since it is a shared application tier file system, APPL_TOP is NFS-mounted on the two application nodes. But I see that the ADMIN node's INST_TOP is also mounted on these two application nodes. Could you please let me know what this is for, or for what reasons it might be kept like this?

    The point is that the Admin node and the two application nodes should each have their INST_TOP locally on the node itself; the best practice is to have the $INST_TOP directory located on a local mount directory on each server.
    Re: Shared Application File System on Linux
    As for why the ADMIN node's INST_TOP is mounted on these two application nodes, you will have to check with the admins who implemented it initially, but again the best practice is to have the $INST_TOP directory located on a server-local mount point.
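    If you want to see from the database where each node's context file (and therefore its INST_TOP) actually lives, a hedged check you could run (FND_OAM_CONTEXT_FILES and the columns below are what AutoConfig normally populates, but verify them in your instance) is:
    -- applications context files registered by AutoConfig; PATH points under each node's INST_TOP
    select node_name, name, path
      from fnd_oam_context_files
     where name not in ('TEMPLATE', 'METADATA')
     order by node_name;
    If the ADMIN node's context file path also shows up mounted on the application nodes, that is likely just how the NFS mounts were laid out at install time rather than an EBS requirement.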

  • DNS records to be created for Lync deployment (Internal and External)

    Hi There,
    If I want the Lync server environment to work both internally and externally in all aspects (autodiscover, meetings, A/V conferencing, web conferencing, voice integration, mobility, etc.), please answer the questions below and explain the purpose of each record.
    I'm not sure whether the answer varies between the 2010 and 2013 versions.
    1. What are the internal and external (public) DNS records to be created for the reverse proxy (assume I'm using TMG servers), and what is their purpose?
    2. What are the internal and external (public) DNS records to be created for the Lync Edge server, and what is their purpose?

    I'll try to answer as well.
    1) For the reverse proxy, you'll need to publish the following:
    External:
    lyncdiscover.sipdomain.com (You'll need this record for every sip domain you have).  This is for client autodiscover.
    external web services FQDN (You'll need one of these per pool, you get to choose the name).  This is for address book downloads, web conferencing, etc.
    Meet.sipdomain.com (You can choose the name here, and have one per sip domain or one for the whole org).  This is for web conferencing.
    Dialin.sipdomain.com (You'll just need one here, it doesn't have to be dialin).  This is for changing your conferencing/phone pin, resetting conference info, and general conferencing info.
    For Lync 2013 only, you may want the Office Web Application server pool name as well for PowerPoint sharing.  Lync 2010 doesn't use this.  
    Internal:
    The external web services FQDN.  You'll need this available internally through the reverse proxy so you can redirect requests on port 443 to port 4443.  This will be used for mobile devices on WiFi.
    2) For the Edge server:
    Externally:
    sip.sipdomain.com (you'll need one per sip domain) this is an autodiscover/multi use FQDN and should point to your access edge IP.
    webedge.sipdomain.com (edge web conferencing, you can pick any name you like).
    avedge.sipdomain.com (av edge, you can pick any name you like).
    accessedge.sipdomain.com (you'll need a name for the access edge role, however you can just use sip.sipdomain.com and save a name in your certificate request).
    Internally:
    edgepool.sipdomain.com (you can pick any name you want; it's just the name assigned to the internal edge interface).
    If you choose to have a single ip for the external edge, you can get away with just an access edge name and/or sip.sipdomain.com
    SWC Unified Communications

  • AutoDiscover working Internally and Externally?

    What is Autodiscover and how does it work in Exchange 2007 and 2010, internally and externally?
    Aditya Mediratta

    Hi,
    Based on my knowledge, in Exchange 2007 and Exchange 2010, Autodiscover is hosted on the CAS server and helps the Outlook client automatically find multiple settings, including the EWS URL and so on.
    There are four connectivity methods for Autodiscover: SCP, DNS, a local XML file, and SRV records. Internal Outlook clients will try all four methods, while external clients will try only the latter three.
    In addition to the articles above, here are more references you can refer to:
    http://blogs.technet.com/b/exchdxb/archive/2012/05/10/troublshooting-autodiscover-exchange-2007-2010.aspx
    http://support.microsoft.com/kb/2644437
    http://support.microsoft.com/kb/2212902
    http://support.microsoft.com/kb/2404385
    Thanks,
    Angela Shi
    TechNet Community Support
