Best practice for exposing internal data to external world?

Currently we have our Internet server sitting in our corporate DMZ taking website and web service requests from the outside world.  Class libraries with compiled connection strings exist on that server.  That server then has a connection through
the firewall to the database server.  I'm told that this is no longer the secure/recommended best practice.
I'm told to consider having that Internet server make requests not of the database server directly, but of a layer in between (an application server, intranet server, whatever) that exposes those same Web UI methods, and then THAT server (being inside the firewall)
connects to the database server.
Is this the current recommended best practice for letting external users interact with internal data?  It seems like a lot of hoops -- an outside app queries the Web UI methods on the Internet server, which in turn queries the same (duplicated) method on the intranet
server, which then talks to the database.
I'm just trying to determine the simplest practice that is also appropriately secure for our ASP.NET applications and services.
Thanks.

IMO this has little to do with SOA and everything to do with DMZs. What you are trying to stop is the same communication protocol accessing the database as accessed the web site. As long as you fulfil that, great. WCF can help here because it helps with configuring
the transport of calls. Another mechanism is to use identities, but IMO it's easier to use firewalls and transports.
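As a rough sketch of what that layering can look like (illustrative only - IProductService, the AppDb connection string and the app-server address are made-up names, not anything from your setup): the DMZ web server calls a WCF service hosted on the internal application server, and only that internal tier holds the database connection string, so the only traffic crossing the inner firewall is the WCF transport rather than SQL.

    using System.Configuration;
    using System.Data.SqlClient;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    // Contract shared by the DMZ web server (client) and the internal application server (host).
    [ServiceContract]
    public interface IProductService
    {
        [OperationContract]
        Product GetProduct(int id);
    }

    [DataContract]
    public class Product
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }
    }

    // Hosted on the application server inside the firewall; only this tier holds
    // the connection string, so the DMZ server never speaks SQL/TDS through the firewall.
    public class ProductService : IProductService
    {
        public Product GetProduct(int id)
        {
            using (var conn = new SqlConnection(
                ConfigurationManager.ConnectionStrings["AppDb"].ConnectionString))
            using (var cmd = new SqlCommand("SELECT Name FROM Products WHERE Id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", id);
                conn.Open();
                return new Product { Id = id, Name = (string)cmd.ExecuteScalar() };
            }
        }
    }

    // On the DMZ web server, the same operation is reached over WCF instead of ADO.NET:
    // var factory = new ChannelFactory<IProductService>(
    //     new BasicHttpBinding(),
    //     new EndpointAddress("http://appserver.internal/ProductService.svc"));
    // IProductService proxy = factory.CreateChannel();
    // Product p = proxy.GetProduct(42);

Yes, the method effectively exists twice (the contract on the DMZ side and the implementation inside), which is the duplication you describe, but the DMZ box now holds no connection strings and the inner firewall only has to allow the WCF endpoint's port.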
http://pauliom.wordpress.com

Similar Messages

  • Best practice for integrating oracle atg with external web service

    Hi All
    What is the best practice for integrating oracle atg with external web service? Is it using integration repository or calling the web service directly from the java class using a WS client?
    With Thanks & Regards
    Abhishek

    Using the Integration Repository might cause performance overhead depending on the operation you are doing. I have never used the Integration Repository for 3rd-party integration, so I am not able to comment on that.
    Calling the web service directly from a Java client is an easy approach, and you can use the ATG component framework to support it by making the endpoint, security credentials, etc. configurable properties.
    Cheers
    R

  • Best Practice for Initial Load Data

    Dear Experts,
    I would like to know the best practices or factors to consider when performing an initial load.
    For example,
    1) requirement from business stakeholders for data analysis
    2) age of data to meet tactical reporting
    3) data dependencies across SAP modules
    4) Is there any best practice for loading master data?

    Hi,
    Check these links:
    Master Data loading
    http://searchsap.techtarget.com/guide/allInOne/category/0,296296,sid21_tax305408,00.html
    http://datasolutions.searchdatamanagement.com/document;102048/datamgmt-abstract.htm
    Regards,
    Shikha

  • Best practice for putting binary data on the NMR

    Hi,
    We're creating a component that will consume messages off the NMR, encode them, and subsequently put them back on the NMR. What's the best practice for sending binary data over the NMR?
    1. setContent()?
    2. addAttachment()?
    3. setProperty()?
    If NormalizedMessage.setContent() is the desired approach, then how can you accomplish that?
    Thanks,
    Bruce

    setContent() is used only for XML messages. The recommended way to accommodate binary data is to use addAttachment().

  • Best Practice for Using Static Data in PDPs or Project Plan

    Hi There,
    I want to make custom reports using PDPs & Project Plan data.
    What is the Best Practice for using "Static/Random Data" (which is not available in MS Project 2013 columns) in PDPs & MS Project 2013?
    Should I add that data in Custom Field (in MS Project 2013) or make PDPs?
    Thanks,
    EPM Consultant
    Noman Sohail

    Hi Dale,
    I have a Project Level custom field "Supervisor Name" that is used for Project Information.
    For the purpose of viewing that Project Level custom field data in
    Project views, I have made a Task Level custom field
    "SupName" and used the formula:
    [SupName] = [Supervisor Name]
    That shows Supervisor Name in Schedule.aspx
    ============
    Question: I want that Project Level custom field "Supervisor Name" in
    My Work views (Tasks.aspx).
    The field is enabled in Tasks.aspx BUT the data is not present / the column is blank.
    How can I get the data in "My Work views" ?
    Noman Sohail

  • Best Practice for disparately sized data

    2 questions in about 20 minutes!
    We have a cache which holds approx 80K objects, which expire after 24 hours. It's a rolling population, so the number of objects is fairly static. We're spread over a 64 node cluster, with high units set, giving ample space. But the data has a wide size range, from a few bytes to 30Mb, and everywhere in between. This causes some very hot nodes.
    Is there a best practice for handling a wide range of object size in a single cache, or can we do anything on input to spread the load more evenly?
    Or does none of this make any sense at all?
    Cheers
    A

    Hi A,
    It depends... if there is a relationship between keys and sizes, e.g. if this or that part of the key means that the size of the value will be big, then you can implement a key partitioning strategy, possibly together with key association on the key, in a way that evenly spreads the large entries across the partitions (and have enough partitions).
    Unfortunately you would likely not get a totally even distribution across nodes because you have a fairly small number of entries compared to the square of the number of nodes (btw, which version of Coherence are you using?)...
    Best regards,
    Robert

  • Best practice for retraction of data from BPC 10.0 to General Ledger

    Hi All,
    I have the requirement to retract data from BPC 10.0 to General Ledger.
    What is the best practice for doing this?
    I have read the "How To... Retract data from BPC 7.5 NetWeaver to SAP ERP Cost Center Accounting, Part I", but this is prepared to transfer data to Cost Center accounting. Will General Ledger be populated automatically?
    What is your opinion on this?
    Best regards,
    JA

    Hi Rich,
    In BPC 10 NW, the data entered in Input Forms has to be retracted to ECC.
    For this retraction we are following the link below.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c029accf-6d95-2d10-79aa-cc291180bf73?QuickLink=index&overridelayout=true&59180354379512
    In this document, I am unable to get the details of the class ZCL_BPC_RTRCT_SUPERCLASS.
    As it is a Z class, I am unable to create it in my system.
    Could you please help me?

  • Best practices for submitting CF data to an AJAX page?

    Hi everyone,
    I've got a project I'm working on for work and have hit a
    little problem.
    I am extracting data from my database and then after each
    piece of data (just numbers, usually 10 chunks of numbers), I tack
    a "|" onto the end of each number. Then, I output the data to the
    page. Back on my AJAX enabled page, I get the "responseText" from
    that page, and then split it up using javascript and the
    pre-inserted "|".
    This seems to work fine, but it is quite messy.
    Also, it would really be nice to be able to do sorting and
    various other operations on the data with javascript instead of
    having to rely on CF's icky code logic.
    Can someone please enlighten me as to best practices for this
    type of thing? I get the suspicion that I'll probably be using XML
    somehow, but I'd like your opinion.
    Thanks!

    Check out the Samples and Documentation portions of Adobe's
    Spry website for client side use of JSON with Spry.
    http://labs.adobe.com/technologies/spry/home.html
    Here is link to Adobe's Spry Forums:
    http://www.adobe.com/cfusion/webforums/forum/categories.cfm?forumid=72&catid=602
    If you are using CF8 you can use the SerializeJSON function
    to convert a variable to JSON. You might also be interested in the
    cfsprydataset tag. CF 8 documentation:
    http://livedocs.adobe.com/coldfusion/8/htmldocs/
    If you are using a previous version of CF, there is 3rd-party
    JSON support. You can find links at
    http://json.org.

  • Best practices for exposing BPEL as web service on the Internet

    Hello, can anyone tell me if there are any best practices for exposing a BPEL process to the Internet using SOA Suite 11g? We want to create a separate domain that will contain the few processes that will need to be accessed by external services. We obviously have to do this in a secure manner, so any help in this area on how to do so would be greatly appreciated.
    Thanks in advance.
    Jim


  • Best Practice for Flat File Data Uploaded by Users

    Hi,
    I have the following scenario:
    1.     Users would like to upload data from flat file and subsequently view their reports.
    2.     SAP BW support team would not be involved in data upload process.
    3.     Users would not go to RSA1 and use InfoPackages & DTPs. Hence, another mechanism for data upload is required.
    4.     Users consist of two groups, external and internal users. External users would not have access to the SAP system. However, access via a portal is acceptable.
    What are the best practice we should adopt for this scenario?
    Thanks!

    Hi,
    I can share what we do in our project.
    We get the files from the web to the application server, into a path dedicated to this process. The file placed on the server has a naming convention based on your project; you can name it as you like. Every day a file with the same name is placed on the server with different data. The path in the InfoPackage is fixed to that location on the server. After this the process chain triggers and loads the data from that particular path on the application server. After the load completes, a copy of the file is taken as a backup and the file is deleted from that path.
    So this happens everyday.
    Rgds
    SVU123

  • Best practice for migrating eLearning data from Sol Mgr 3 to 4?

    Greetings,
    What is the recommended method for moving eLearning data when migrating from Solution Manager 3 to version 4?
    Thanks in advance,
         Ken Henderson

    948115 wrote:
    Dear All,
    This is Priya.
    We are using ODI 11.1.1.6 version.
    In my ODI project, we have separate installations for Dev, Test and Prod, i.e. the master repositories are not common between the three. Now my code is ready in Dev. The Test environment has just been installed with ODI, and the master and work repositories are created. That's it.
    Now, I need to know and understand the simplest and best way to import the code from Dev and migrate it to the Test environment. Can someone describe this as a step-by-step procedure in 5-6 lines? If this is the first time you are moving to QA, it is better to export/import the complete work repositories. If it is not the first time, then create scenarios of the specific packages and export/import them to QA. With scenarios you need not bother about models/datastores. Keep in mind that the logical schema name in QA should be the same as the one used in your Dev.
    Some questions on current state.
    1. Do the IDs of the master and work repositories in Dev and Test need to be the same? They should be different.
    2. I usually see in the export file a repository id of 999 and fail to understand what it is exactly. None of my master or work repositories are named with that id. It is required to ensure object uniqueness across several work repositories. For more understanding you can refer to
    http://docs.oracle.com/cd/E14571_01/integrate.1111/e12643/export_import.htm
    http://odiexperts.com/odi-internal-id/
    3. Logical Architecture objects and contexts do not have an export option. What is the suitable alternative for this? If you are exporting the topology then you will get the logical connection and context details. If you are not exporting the topology then you need to manually create the context and the other physical/logical connections.
    Thanks,
    Priya

  • Best Practices for Loading Master Data via a Process Chain

    Currently, we load attributes, texts, and hierarchies before loading the transactional data.  We have one meta chain.  Loading the master data takes more than 2 hours.  Most of the master data is full loads.  We've noticed that a lot of the master data, especially text, has not changed or has changed very little since we implemented 18 months ago.  Is there a precedent or best practice to follow, such as removing these processes from the chain?  If so, how often should it be run?  We would really like to reduce the master data loading time.  Is there any documentation that I can refer to?  What are other organizations doing to reduce the amount of time to load master data?
    Thanks!
    Debby

    Hi Debby,
    I assume you're loading Master Data from a BI system? The forums here are related to SAP NetWeaver MDM, so maybe you should ask this question in a BI forum?
    Nevertheless, if your data isn't changing this much, maybe you could use a delta mechanism for extraction? This would send only the changed records and not all the unchanged ones every time. But this depends on your master data and of course on your extractors.
    Cheers
    Michael

  • Best Practice for attaching Meta Data, Organizing & Archiving Photos

    I am about to start the daunting task of organizing and archiving ALL of my photos so that I can access them from an external hard drive (or two) when I need them. I want to know/understand the best way to do this before I start so that thousands of hours later I don't realize that I have not considered something critical. I have already spent hours trying to find this information online to no satisfactory avail. I have both iPhoto ('09 v8.1.2) and Aperture 3, which I am just learning to use. I am wanting to back up to my WD My Book. I have already transferred some pictures and realized that, though I had them nicely organized in Albums in iPhoto, that information doesn't transfer over to the My Book so the file names are unrecognizable as any type of descriptor. I believe that I need to assign some Meta Data (?) to each photo so that I can do searches to find particular people or subjects when needed.
    Here are all of the things I want/need to know:
    1. Which Mac program would be best to accomplish attaching MetaData or adjusting file names: iPhoto or Aperture 3?
    2. What is the best way to attach MetaData to pictures so that the information is permanent? Tagging (allowing multiple search options)? Batch Name Change (allowing limited search capabilities)? Other?
    a. If I TAG all of my photos and then try to use that information outside of iPhoto or Aperture 3 (from the external HD) will it still be accessible for searches?
    3. After attaching all of this information, what do I need to know about transferring the photos to an external HD? Are there multiple copies of the same photo (original, edited) and if so, are both transferred? Are they both deleted when I move the pictures to the trash?
    4. Is there anything else that I need to know??
    Your expertise would be greatly appreciated.
    Thank you.

    You are trying to defeat the photo management of iPhoto - my suggestion is to learn iPhoto and use it, or choose another program - iPhoto does not work well unless it manages the entire library and you access it using the tools built into the OS - click here for a discussion of the many correct and safe ways to access your iPhoto library photos.
    Aperture may or may not be a better choice - try the Aperture forum for detailed info
    Backing up iPhoto is trivial - simply drag the iPhoto library intact as a single entity to another hard drive - or use an incremental backup program to do it as changes are made - I use Time Machine to one backup drive and SuperDuper to another
    I believe that I need to assign some Meta Data (?) to each photo so that I can do searches to find particular people or subjects when needed.
    This is an integral part of iPhoto and one of its strengths
    1. Which Mac program would be best to accomplish attaching MetaData or adjusting file names: iPhoto or Aperture 3?
    iPhoto does a great job of this - file names are only adjusted when you export - and there is no need for them to be adjusted within the iPhoto library since you never should be going into it directly and never need to see the files directly
    2. What is the best way to attach MetaData to pictures so that the information is permanent?
    Exporting from iPhoto
    Tagging (allowing multiple search options)?
    Using iPhoto and the OS tools
    Batch Name Change (allowing limited search capabilities)?
    Done during export if desired
    Other?
    Depends
    a. If I TAG all of my photos and then try to use that information outside of iPhoto or Aperture 3 (from the external HD) will it still be accessible for searches?
    In iPhoto yes so long as you are not going directly into the iPhoto library but are using the provided tools
    3. After attaching all of this information, what do I need to know about transferring the photos to an external HD?
    Simply drag the iPhoto library intact as a single entity to the EHD - the database is a single entity and it will be the same there (so long as the EHD is formatted Mac OS Extended (Journaled)).
    Are there multiple copies of the same photo (original, edited) and if so, are both transferred?
    Yes - Yes
    Are they both deleted when I move the pictures to the trash?
    You do not move photos to the trash - you select them in iPhoto and press the delete key then empty the iPhoto trash (iPhoto menu ==> empty iPhoto trash)
    4. Is there anything else that I need to know??
    Lots - if you are going to use iPhoto stop thinking of managing files and start thinking of using a SQL database which is what iPhoto is -- otherwise you need to use a different program
    LN

  • Best Practice for Launching Internal Training?

    Hello.  I have a series of 12-20 internal trainings that employees are required to take when they start with our company, and some of the trainings are required every year for different groups around the company.  So, I have these 20 pieces of content built, and currently have just one Training set up for each, and would launch that same link to folks every time I needed to.  More and more, the start/stop times on these are all over the place.  I may have Sally and Tom starting the trainings this week with 3 weeks to complete, but in a week I could need Harry to take only 2 of the trainings with a 1 week window.  Using just that one training link negates my ability to set open/close dates and set reminders within the system...
    So, my question is for those of you that run this type of program - do you set-up a new training each time you launch a piece of content out to folks?  I feel like this could get messy quickly, and cause the need for me to run a ton of reports.  Am I missing an easier way to do this?
    Thank you.

    Hello!
    Apologies for the late reply on this; have you considered setting up a new training course for each piece of content? This to me would seem to be much easier to manage. You could then set up new versions of the course for each year (month, etc.) they are required. A little bit of extra work at the start, but easier in the long run. Keeping the same course for multiple pieces of content is a tricky practice and is generally not recommended. For example, what if Sally needed to take your course with Content A at the start of the year, then Content B in the later half, using the same course? You would have to reset Sally's training transcript in order for her to retake the course, and she would lose that transcript data.
    If you want a single point of reference for all your training courses, you could consider using the Connect Training Catalog, or developing your own using the Connect APIs.
    Hope this helps!
    Lauren

  • Best Practice for Storing Spatial Data in Multiple Projections?

    From what I've been able to determine, MS SQL can't do reprojections - is that correct? Assuming it is: the state of Washington requires that our GIS data be managed in Washington State Plane South,
    but our web mapping sites need the data in Web Mercator.  Is it OK then to have two geometry columns in a spatially enabled table, say a county table, one in WSPS and the other in WM?
    I’m coming at this from a 30 year background in GIS using ESRI software. 
    Usually we would store the shape / geometry in one projection and project it to other on the fly when needed. 
    That way there is only one definitive source.  I don’t see a way to do this in MS SQL.

    Hi Scott.
    Storing two columns of spatial data is fine in SQL Server. And you are correct that there is no built-in reprojection in SQL Server, most folks do that as part of data loading. There was talk of porting parts of ogr2ogr to SQLCLR, but I don't think anyone
    did that.
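    A rough sketch of the load-time approach (illustrative only - the County table, column names, SRIDs and the ProjectToWebMercator helper are assumptions; the helper would have to be backed by a projection library, since SQL Server itself can't transform coordinates): store both columns and populate the Web Mercator copy while loading.

        using System;
        using System.Configuration;
        using System.Data.SqlClient;

        static class CountyLoader
        {
            // Assumed helper: coordinate transformation happens in application code
            // (e.g. via a projection library); SQL Server has no built-in reprojection.
            static string ProjectToWebMercator(string wktStatePlane)
            {
                throw new NotImplementedException("call a projection library here");
            }

            // Assumed SRIDs: 2927 = Washington State Plane South (ftUS), 3857 = Web Mercator;
            // adjust to whatever your data actually uses.
            public static void InsertCounty(string name, string wktStatePlane)
            {
                string wktWebMercator = ProjectToWebMercator(wktStatePlane);

                using (var conn = new SqlConnection(
                    ConfigurationManager.ConnectionStrings["Gis"].ConnectionString))
                using (var cmd = new SqlCommand(
                    @"INSERT INTO County (Name, ShapeStatePlane, ShapeWebMercator)
                      VALUES (@name,
                              geometry::STGeomFromText(@wktSp, 2927),
                              geometry::STGeomFromText(@wktWm, 3857))", conn))
                {
                    cmd.Parameters.AddWithValue("@name", name);
                    cmd.Parameters.AddWithValue("@wktSp", wktStatePlane);
                    cmd.Parameters.AddWithValue("@wktWm", wktWebMercator);
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }

    That keeps the state plane column as the single definitive source, with the Web Mercator column being a derived copy that is simply rewritten whenever the data is reloaded.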
    Cheers, Bob
