Source and Destination Port

Hi, please clarify my query about port assignment over a WAN connection.
I have Host A and Host B, and both are trying to access Host C on the same port (let us say port 20).
Will this work?
Please clarify my query.

Hello Rajendran.
Happy new year,
Yes, you can access different hosts on the same internal port, but each one will need to be mapped to a different port on the WAN IP address.
Let's say you have 2 web servers that both receive traffic on port 80, and you only have one available public IP (1.1.1.1).
How would you do it?
With port forwarding
static (inside,outside) tcp 1.1.1.1 80 192.168.11.2 80
static (inside,outside) tcp 1.1.1.1 81 192.168.11.3 80
So all traffic to ports 80 and 81 on the WAN IP address will be redirected to the right web server on port 80.
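The idea above can be sketched in a few lines of Python: the firewall keeps a table keyed by (WAN IP, WAN port) and rewrites each connection to the mapped internal address (the addresses here mirror the hypothetical ones in the example):

```python
# Port-forwarding table: (wan_ip, wan_port) -> (inside_ip, inside_port).
FORWARDING_TABLE = {
    ("1.1.1.1", 80): ("192.168.11.2", 80),
    ("1.1.1.1", 81): ("192.168.11.3", 80),
}

def forward(wan_ip, wan_port):
    """Return the internal (ip, port) a connection is redirected to, or None."""
    return FORWARDING_TABLE.get((wan_ip, wan_port))

print(forward("1.1.1.1", 81))  # -> ('192.168.11.3', 80)
```

This only models the lookup, not the actual ASA packet rewriting.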
Do rate helpful posts
Julio

Similar Messages

  • How to keep sort order in Shuttle item's source and destination boxes?

    I have defined a shuttle item on a page, and the list of values in the source box is populated by a SQL query with a sort order specified. In the Settings of the shuttle item, the Show Controls value is set to Moving Only instead of All.

    When a user moves selected values from the source box to the destination box or vice versa, the list of values does not maintain the original sort order. For example, if I have 26 values A through Z, a user can put value Z before A in the destination box. And when a user moves A from the destination box back to the source box, A becomes the last item in the source box.

    The reset button clears everything in the destination box and resets the source box to the original sorted list, but that is not the solution I am looking for. I would like the items in the source and destination boxes to maintain their original sort order at all times.
    Is it possible?
    APEX v4.0.1.00.03
    Oracle 11g (v11.2.0.1.0)

    Hi,
    See if this post helps:
    Re: Sort Shuttle Right
    Regards,
    Jari

  • Dbms_lob.getlength() returns different source and destination lengths

    I am fairly new to PL/SQL, so maybe this is an obvious problem, but here goes. I am updating a CLOB field with a text file ~5KB in size. The field updates fine (as far as I can tell). Before I update the field, I open the source file as a BFILE and then query its length using dbms_lob.getlength(). I then update the CLOB field using dbms_lob.loadclobfromfile(). This seems to work fine. However, when I use dbms_lob.getlength() on the destination object returned by dbms_lob.loadclobfromfile(), I get a length 3 characters less than the source object (5072 vs 5075). Both the source and destination offsets are set to 1.
    Probing on what documentation I could find, I found this at http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28419/d_lob.htm#i998484:
    "The length returned for a BFILE includes the EOF, if it exists. Any 0-byte or space filler in the LOB caused by previous ERASE or WRITE operations is also included in the length count. The length of an empty internal LOB is 0."
    I did not create the source file and I believe it is a Unix type file (because of the lack of CRs) and I am running on Windows 7. I am also using 11g Express. Could the use of a Unix-type file for a CLOB on a Windows system be causing this character count difference?
    Once I found this issue I can work around it. I just want to understand what is going on.
    Thanks to all who look at this.

    The EOF and the LF versus CR/LF difference could influence the count, yes.
    Another explanation could be character set conversion. A BFILE counts bytes, while a CLOB counts "characters" - so if the source happens to contain a few multibyte (UTF) characters, the byte count would be larger than the character count.
    To help you find the cause for your exact file, here are a couple of things you might do to explore the issue:
    - Load the file into a BLOB instead of a CLOB and see what getlength() returns for the BLOB. A BLOB also counts bytes and does not try to treat the source as text.
    - Save the CLOB back to the filesystem and compare the original file with the exported CLOB using a file-compare tool.
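    The byte-versus-character point is easy to see outside the database; in Python, purely as an illustration of the counting (not of Oracle's internals):

```python
# A CLOB length counts characters; a BFILE length counts bytes.
text = "abc\u00e9\u20ac"          # "abce..." : 5 characters, some multibyte
print(len(text))                  # character count -> 5
print(len(text.encode("utf-8")))  # byte count -> 8 (2-byte and 3-byte chars)

# Line endings matter too: CR/LF is two bytes per newline, LF only one.
crlf = "a\r\nb"
lf = "a\nb"
print(len(crlf) - len(lf))        # -> 1 extra byte per CR/LF line ending
```

    A trailing EOF byte plus a few converted line endings or multibyte characters would account for a 3-count difference like 5075 vs 5072.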

  • HPCM custom driver use source and destination

    Hi All,
    I'm using HPCM 12.1.2, and I need your help to create a custom driver that uses information from both source and destination. I need to use 1 dimension of the source and 2 of the destination.
    I have the situation below:
    STAGE1 (SOURCE)
    Dim1 Machine
    Dim2 Account
    STAGE2 (DESTINATION)
    Dim1 Process
    Dim2 Operation
    Dim 3 Account
    Measure: Time used in the Machine
    The information is extracted as:
    Machine1 - Process1 - Operation1 = 10 Hours
    Machine1 - Process1 - Operation2 = 9 Hours
    Machine2 - Process1 - Operation1 = 13 Hours
    How can I create a driver that uses this information? Is it possible?
    Thanks in Advance
    Diogo

    Hi Alex,
    You have to increase the max row size limit in Xcelsius.
    Go to File menu >> Preferences >> increase the "Max Row Size"; by default it is 500, and you can increase it as the requirement demands.
    Regards,
    AnjaniKumar C.A.

  • Could not complete your request because source and destination files are the same

    Hi, thank you for reading.
    I'm having this problem and it's driving me nuts.
    I'm actually following a tutorial that you can check out here: http://nightshifted.tumblr.com/post/2559360661/tutorial-paused-animations
    Basically I'm trying to make an animated GIF with canvas (I'm sorry if my English is not so great). When I try to drag the layers into the canvas (step 2 of the tutorial), I get the error: "could not complete your request because source and destination are the same".
    Can anybody help me? I have both CS3 and CS5, and the error appears in both.
    Thank you in advance.

    I think they mean to select the layers and frames and, using the move tool, drag inside the document (click inside the document window and drag) to move the selected layers to the top half (the transparent area) - not to drag the layers from the layers palette into the document, which would give that error.
    MTSTUNER

  • Dynamic source and destination tables

    Hi all
    I've got to import 142 tables from csv into SQL Server 2008 on a regular basis.
    I was looking at building a 142-part SSIS package to do this, then thought there must be a dynamic way of doing it.
    Is there any way of dynamically changing the source and destination tables?
    The csv filenames will remain identical, and the SQL tables will have the same names but with "_Staging" at the end (e.g. SRSection.csv will always go into SRSection_Staging).
    I can then write MERGE statements to update the main tables from the staging data.
    Any help on this would be greatly appreciated.
    I get the feeling I would need a FOREACH LOOP container, but I'd really appreciate a step-by-step guide if you can.
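    The naming rule described above (SRSection.csv goes to SRSection_Staging) is mechanical, so however the loop is built, the destination can be derived from the file name; a minimal sketch:

```python
from pathlib import Path

def staging_table_for(csv_filename):
    """Derive the staging table name from a csv file name
    (naming rule from the question: <name>.csv -> <name>_Staging)."""
    return Path(csv_filename).stem + "_Staging"

print(staging_table_for("SRSection.csv"))  # -> SRSection_Staging
```

    In SSIS itself the same rule would live in an expression or a Script Task that sets the destination-table variable on each iteration of the loop.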

    Please check this: http://sql-bi-dev.blogspot.com/2010/07/dynamic-database-connection-using-ssis.html
    STEP 1:
    To begin, create the two tables shown below in one of the environments:
    -- Table to store list of Sources
    CREATE TABLE SourceList (
       ID [smallint],
       ServerName [varchar](128),
       DatabaseName [varchar](128),
       TableName [varchar](128),
       ConnString [nvarchar](255)
    )
    GO
    -- Local Table to store Results
    CREATE TABLE Results (
       TableName   [varchar](128),
       ConnString  [nvarchar](255),
       RecordCount [int],
       ActionTime  [datetime]
    )
    GO
    STEP 2:
    Insert all connection strings in SourceList table using below script:
    INSERT INTO SourceList
    SELECT 1 ID,
           '(local)' ServerName,  -- define the required server
           'TestHN' DatabaseName, -- define the DB name
           'TestTable' TableName,
           'Data Source=(local);Initial Catalog=TestHN;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False;' ConnString
    Insert as many connections as you want.
    STEP 3:
    Add a new package to your project and rename it ForEachLoopMultipleServers.dtsx. Add the following variables:
    - ConnString (String). Value: Data Source=(local);Initial Catalog=TestHN;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False; Purpose: stores the default connection string.
    - Query (String). Value: SELECT '' TableName, N'' ConnString, 0 RecordCount, GETDATE() ActionTime. Purpose: default SQL query string; this can be modified at runtime based on the other variables.
    - SourceList (Object). Value: System.Object. Purpose: stores the list of connection strings.
    - SourceTable (String). Value: any table name (it can be blank). Purpose: stores the table name of the current connection string; this table will be queried at run time.
    STEP 4:
    Create two connection managers as shown below:
    Local.TestHN: for the local database, which has the SourceList table. This connection will also be used to store the result in the Results table.
    DynamicConnection: this connection will be used for setting up a dynamic connection to multiple servers.
    Now click on DynamicConnection in the connection manager and click the ellipsis to set up the dynamic connection string. Map the connection string to the variable User::ConnString.
    STEP 5:
    Drag and drop an Execute SQL Task and rename it "Execute SQL Task - Get List of Connection Strings". Now open its properties and set the following values as shown in the snapshot:
    Result Set: Full Result Set
    Connection: Local.TestHN
    ConnectionType: Direct Input
    SQL Statement: SELECT ConnString,TableName FROM SourceList
    Now click on Result Set to store the result of SQL Task in variable User::SourceList.
    STEP 6:
    Drag and drop a ForEach Loop container from the toolbox and rename it "Foreach Loop Container - DB Tables". Double click the ForEach Loop container to open the Foreach Loop Editor. Click on Collection and select Foreach ADO Enumerator as the Enumerator. In the enumerator configuration, select User::SourceList as the ADO object source variable as shown below:
    STEP 7: Drag and drop a Script Task inside the ForEach Loop container and double click it to open the Script Task Editor. Select User::ConnString, User::SourceTable as ReadOnlyVariables and User::Query as ReadWriteVariables. Now click the Edit Script button and write the following code in the Main function:
    public void Main()
    {
        try
        {
            // Read the current table name and connection string from package variables.
            String Table = Dts.Variables["User::SourceTable"].Value.ToString();
            String ConnString = Dts.Variables["User::ConnString"].Value.ToString();
            MessageBox.Show("SourceTable = " + Table +
                            "\nCurrentConnString = " + ConnString);
            // Build the query to run against the current source:
            // SELECT '' TableName, N'' ConnString, 0 RecordCount, GETDATE() ActionTime
            string SQL = "SELECT '" + Table + "' AS TableName, N'" + ConnString +
                         "' AS ConnString, COUNT(*) AS RecordCount, GETDATE() AS ActionTime FROM " +
                         Table + " WITH (NOLOCK)";
            Dts.Variables["User::Query"].Value = SQL;
            Dts.TaskResult = (int)ScriptResults.Success;
        }
        catch (Exception e)
        {
            Dts.Log(e.Message, 0, null);
            Dts.TaskResult = (int)ScriptResults.Failure;
        }
    }
    STEP 8:
    Drag and drop a Data Flow Task and double click it to open the Data Flow tab. Add an OLE DB Source and an OLE DB Destination. Double click the OLE DB Source to configure its properties. Select DynamicConnection as the OLE DB connection manager and "SQL command from variable" as the data access mode. Select User::Query as the variable name. Now click on Columns to generate the metadata.
    Double click the OLE DB Destination to configure its properties. Select Local.TestHN as the OLE DB connection manager and "Table or view - fast load" as the data access mode. Select [dbo].[Results] as the name of the table or view. Now click on Mappings to map the columns from the source. Click OK and save the changes.
    Finally DFT will look like below snapshot:
    STEP 9: We are done with package development, and it is time to test the package.
    Right click the package in Solution Explorer and select Execute. A message box will display the current connection string.
    Once you click OK, it will execute the Data Flow Task and load the record count into the Results table. This process iterates until all the connections are done. Finally the package will execute successfully.
    You can check the data in the Results table:
    Here is the result:
    SELECT *
    FROM SourceList
    SELECT *
    FROM Results
    Regards, Pradyothana DP (http://www.dbainhouse.blogspot.in/)

  • Source and destination clarification needed

    Hi Guys,
    We are using SRM as an add-on component to ECC 6.0, on SRM Server 5.5. In the definition of the backend system, SAP asked us to create 4 entries.
    1. ONECLNTERP : RFC tick is on
    2. ONECLNTEBP:
    3. ONECLNTSUS:
    4. XXXCLNTYYY: Local tick is on.
    So this means 'ONECLNTERP' is my backend system, i.e. ECC 6.0, and 'XXXCLNTYYY' is my local system.
    Now, while defining the backend system for the product category, there is a 'Source system' and a 'Target system'. Source system: the system from which master data is replicated. So should this be my SRM system (as the product category is a part of SRM), or should it be ECC (as my product categories are copies of the material groups from ECC)?
    The target system is the system into which the follow-on document of the SC is transferred, i.e. the backend ECC system.
    Am I correct in saying the above? I'm confused about which system is the source and which is the destination.
    While defining the account assignment category and the PR document type I face the same problem, as I have to define the source system there too.
    Can anybody clarify?
    Thanks
    Debashish

    Hello Deb,
    I have not worked on ECC 6.0.
    But logically speaking:
    Source system - this has to be your ECC, as SRM takes the reference of the material masters from there.
    Target system - this would be the same ECC again if you are using only one backend (and I think you are).
    (If you are using multiple backends and you want the follow-on document in a different backend, then the SID of that backend would go here.)
    BR
    Dinesh

  • Association must have both source and destination ends.

    I am getting this error when trying to develop an Oracle Applications Framework page using ICX and FND information. I am trying to find out if there is something missing, but I don't see an association that is missing a source or destination.


  • Setting up a project where source and destination are different formats

    Howdy...
    I'm a recent Vegas user who is just switching over to FCP 6.
    My question is this: when your source footage and destination format are two unrelated formats (for instance, HDV for source footage and Apple TV H.264 for the destination), is it better to set up the project for your source format or for your destination format?
    In Vegas it didn't really matter, but it was good to use the destination format because you could preview what it would look like at the size and frame rate you were going to end up with.
    In FCP 6, however, it seems that importing HDV into an H.264 project may not work as well, because it wants to render the footage into the correct format before you can preview it.
    FYI: the final destination is my gadget podcast, http://www.neo-fight.tv, which I have shot on HDV and edited in Vegas for the past year. FCP is very new to me, but I'm enjoying learning something new.
    Let me know your thoughts...
    Best,
    Benjamin
    http://www.neo-fight.tv [The TV Show for The 'Not-So-Geeky']
    MacBook, MacPro   Mac OS X (10.4.9)  

    Hi Benjamin - prior to FCS 2 I worked in whatever my capture format was, and I will continue to do that. I plan to use ProRes now, but if I were coming from an HDV source I'd probably opt to capture using DVCPRO HD and work in that.
    "In FCP6, however, it seems that importing HDV into a H264 project may not work as well, becuase it wants to render the footage into the correct format before you can preview it."
    As far as working in H.264 and combining other formats into it, I really don't know.
    H.264 for me is a delivery format, so I'd still opt to work in the native capture format and then transcode/convert at the end. More options that way.

  • SQL Server replication and size differences of source and destination databases

    I set up snapshot replication for a DB between two SQL instances. On the source instance, the DB shows as 106612.56 MB with 34663.75 MB of available free space. I expected that the replica would end up being 71948.81 MB (106612.56 - 34663.75), because the white space would not be replicated. The resultant replica database shows as 35522.94 MB. The required data appears to be present in the replicated DB, as the SSRS reports that use it are able to find the data they look for. But why the large discrepancy in size between the source and the replicated DB? The replicated DB is less than half the size of the source DB. I've searched around and can't seem to find any explanation. I realize this isn't mirroring, so the DBs will not be identical in size, but I did not expect to see such a large difference between the two. I am replicating almost all articles (tables, stored procs, etc.), with the exception of a handful of stored procedures and user-defined functions that either reference invalid column names in a table (vendor bug) or reference another DB that is not present on the replica's instance. I would not expect these 4-5 articles to account for a 37000 MB size difference between the two DBs.
    Please note that this has nothing to do with transaction log size. I am specifically talking about the database size, not the size that combines both the DB and the transaction log.
    Any insight?
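    For reference, the arithmetic behind the expectation in the question checks out (figures are those quoted in the post):

```python
# Size figures from the post, in MB.
source_size = 106612.56
free_space = 34663.75
replica_size = 35522.94

expected_replica = round(source_size - free_space, 2)
print(expected_replica)                           # -> 71948.81
print(round(expected_replica - replica_size, 2))  # the ~37000 MB gap
```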

    Another factor could be that on the publisher the data is distributed across pages and extents. Depending on your fill factor, the number of deletes, and your data types, there could be space in those pages and extents that has not been reclaimed.
    During the bcp process, which is part of the snapshot application on the subscriber, all the data is written into the tables in a contiguous fashion. I would suspect this is why you see the difference in space usage.

  • Source and destination are deep structures.. can't seem to map the detail table

    So, both my source data structure and my destination structure are very similarly built:
    They both have a HEADER and BODY component. The HEADER has the standard AIF fields (MSGGUID, NS, IFNAME, IFVER).
    I have defined the BODY as a deep structure composed of a Header Structure + a Detail Table. Each row in the BODY has a structure with a few header fields, then a table of detail data attached to it.
    In my Structure Mappings, I've created 2 rows:
    The first to map BODY-HEADER to my header structure.
    The second to map BODY-DETAILS to my details table. Here, I've chosen "Indirect Mapping".
    Selecting that row and navigating to "Define Field Mappings", I choose my destination table (DTL) and enter the Sub-table from my source structure.
    At this point, I only have 1 action - to save the data in Z-tables. I'm seeing that the header data is passed in fine, but there aren't any detail lines coming through.
    Any ideas?
    Thanks a lot.
    RP.

    Christoph,
    In your reply, I only have the ability to perform the First Structure Mapping (please see the below screenshots):
    Source Structure; BODY
    Destination Structure: BODY
    Field Mapping:
    1. Field in Dest.Struc=HDR, Sub-Table = SCH_HEADER
    2. Field in Dest.Struc=DTL,  Sub Table = SCH_DETAILS
    I presume I'm following your suggestion, but when I try to create the other two structure mappings, I only have HEADER or BODY as available options to choose from: I've attached screenshots of the F4 dropdown of the Source Structure and also the Destination Structure fields.
    I believe this is because my Interface Definition is defined as a HEADER and BODY:
    Source Data Structure
    Destination Data Structure

  • Drag & Relate Relation Editor isn't accessible for source and destination

    Dear all,
    we've installed EP 7 SPS 15 and want to use the relationship editor. After importing the business objects from a system where we already had a connection with EP6, the objects are available, but when we want to add them as source/destination in the relationship editor, a website error comes up:
    mtxSelect error no object
    We tried this with IE 7 and IE 6 and with Firefox, but in Firefox there wasn't any message at all.
    So we need some ideas about why this happens, or what we can do to fix it / find the reason for this error.
    Best regards
    Thorsten Stracke

    I am getting the same error. If anyone knows what might be causing this, please help.
    Thanks,
    Alex

  • Migrating schemas - Different tablespace names on source and destination

    Hi,
    I am migrating database schemas with exp/imp from 8i on Solaris to 9i on Linux and also from one 9i on Linux to another 9i on another Linux machine.
    The tablespaces and schemas (empty) already exist in the destination, so the schemas now need to be filled using imp. However, the default tablespace of the schemas has a different name in the source than in the destination. Is this going to cause errors during the import?
    Thanks.

    Not at all, I think. Just verify that the source tablespace is not present in the destination database. If you do have a tablespace with the same name as the source, set the user's quota on it to zero, so imp will switch to the default tablespace.
    Dba Z.

  • Strcpy problem overlapping source and destination in 64 bit.

    A simple program like the following, when compiled for 64-bit, gives the wrong result on some 64-bit machines.
    /* a.c */
    #include <stdio.h>
    #include <string.h>
    int main(void)
    {
        char st[50];
        char *p = st + 2;
        strcpy(st, "XY1234");
        strcpy(st, p);
        printf("\n%s\n", st);
        return 0;
    }
    cc -o check_copy -m64 a.c
    On running check_copy, the expected answer is "1234", but on some machines this gives "1434".
    The libc.so.1 on the two machines is different.
    The result is correct on older SunOS.
    uname -a on the two machines (old_mc and new_mc):
    SunOS old_mc 5.10 Generic_127112-05 i86pc i386 i86pc [gives correct result]
    SunOS new_mc 5.10 Generic_147441-01 i86pc i386 i86pc [gives wrong result]
    Both machines have the 118855-36 patch, which I believe is for libc.so.1.
    As our software is a large one, a code fix will be difficult, as there are many instances of the above scenario, triggered only in very rare use cases.
    Any help regarding the patch details which can fix this issue would be appreciated.

    Hi,
    Based on the error message, we should first confirm whether the Oracle client tool has been installed on the Report Server. If the Report Server is 64-bit, we should also install a 64-bit Oracle client tool on it. Besides, we can also try to bypass the tnsnames.ora file when connecting to Oracle by specifying all the information in the data source, instead of using the TNS alias to connect to the Oracle database. Please see the sample below:
    Data Source=(DESCRIPTION=(CID=GTU_APP)(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST= server01.mydomain.com)(PORT=1521)))(CONNECT_DATA=(SID=OracleDB)(SERVER=DEDICATED)));
    Reference: http://blogs.msdn.com/b/dataaccesstechnologies/archive/2010/06/30/ora-12154-tns-could-not-resolve-the-connect-identifier-specified-error-while-creating-a-linked-server-to-oracle.aspx
    Hope this helps.
    Regards,
    Heidi Duan
    TechNet Community Support

  • L2L VPN with source and destination NAT

    Hello,
    I am new to the ASA 8.4 and was wondering how to tackle the following scenario.
    The diagram is
    Customer ---->>> Firewall --->> L2L VPN --->> Me --->> MPLS ---> Server
    The server is accessible through other tunnels already in place, with no NAT needed there. For the tunnel we are talking about, NAT is needed.
    The Customer connects the following way
    Source: 198.1.1.1
    Destination: 192.168.1.1
    It gets to the outside ASA interface which should translate the packets to:
    Source: 10.110.110.1
    Destination: 10.120.110.1
    On the way back, 10.120.110.1 should be translated to 192.168.1.1 only when going to 198.1.1.1
    I did the following configuration, which I will not be able to test until tomorrow during the migration:
    object network obj-198.1.1.1
    host 198.1.1.1
    object network obj-198.1.1.1
    nat (outside,inside) dynamic 10.110.110.1
    For the inside to outside NAT depending on the destination:
    object network Real-IP
      host 10.120.110.1
    object-group network PE-VPN-src
    network-object host 198.1.1.1
    object network Destination-NAT
    host 192.168.1.1
    nat (inside,outside) source static Real-IP Destination-NAT destination static PE-VPN-src PE-VPN-src
    The question is whether I should also create the following for the outside-to-inside flow, or whether the NAT is already handled by the inside-to-outside statement even though the traffic is always initiated from the outside interface.
    object network obj-192.168.1.1
    host 192.168.1.1
    object network obj-192.168.1.1
    nat (outside,inside) dynamic 10.120.110.1

    Let's use a spare IP address in the same subnet as the ASA inside interface for the NAT (assuming that 10.10.10.251 is free; please double check and use a free IP address accordingly):
    object network obj-10.10.10.243
      host 10.10.10.243
    object network obj-77.x.x.24
      host 77.x.x.24
    object network obj-10.10.10.251
      host 10.10.10.251
    object network obj-pcA
      host 86.x.x.253
    nat (inside,outside) source static obj-10.10.10.243 obj-77.x.x.24 destination static obj-10.10.10.251 obj-86.x.x.253
    Hope that helps.
