4503/4506 - data handling and data flow

I have been tasked with finding a document that details, from start to finish, how a packet goes through a 4503/4506 unit. I'm talking about a level of detail that includes what portion of RAM the packet goes into once it hits the inbound interface, what parts of the switch handle the analysis (ACLs, et al.), and so on, right until the packet is either dropped or forwarded to the outgoing interface. As detailed a description as possible, and if a non-model-specific equivalent is available and applicable to this unit, that works as well.
I have been looking through the TechDocs and the like, as well as several attempts at Google (which is well-nigh useless), and no luck thus far.
Thanks in advance for any information provided.

I am not aware of any CCO documents explaining path of a packet/CAT4500 architecture. However, there was a presentation on this at Networkers 2005. If you attended it, you can check it out at
http://www.cisco.com/networkers/nw05/nwol.html
Here is the session information for RST-4500.
Session Title: Cisco Catalyst 4500 Switch Architecture
Length: 2 Hours
Level: Intermediate
Related Sessions:
RST-3031 Troubleshooting LAN Protocols
RST-3042 Troubleshooting Cisco Catalyst 4000 and 4500 Series Switches
Abstract: This session presents an in-depth study of the architecture of the Cisco Catalyst 4500 switches and how the various components work together. The focus of this session is to present information that helps the audience understand the architecture well enough to design and implement the Catalyst 4500 in their network and to troubleshoot it more effectively.
Topics include a discussion of the architecture, information about the latest Cisco Catalyst 4500 supervisors and switching modules such as the Supervisor V-10GE, the NetFlow Feature Card (NFL), the Catalyst 4948, and the Supervisor II+TS and PoE linecards, as well as key features such as CEF/Multicast Forwarding, DHCP Snooping, IP Source Guard, Dynamic ARP Inspection, 802.1X, Redundancy (SSO), NetFlow, ACL/TCAM, and QoS (Per-Port/Per-VLAN and UBRL).
This session is designed for network designers and senior network operations engineers who are considering deploying, or already have, Cisco Catalyst 4500 series switches in enterprise and service provider networks.
* Prerequisites:
1. Knowledge of LAN protocols is required
2. Basic understanding of Cisco Catalyst switches is required.
Speakers: Balaji Sivasubramanian
Escalation Eng
Cisco Systems
Balaji Sivasubramanian is an escalation engineer in Cisco's Gigabit Switching Business Unit. Balaji, who is a CCNP, is also co-author of "CCNP Self-Study: Building Cisco Multilayer Switched Networks, 2nd Edition" (ISBN 1-58705-150-8). Balaji is an expert in Catalyst 4500 switch architecture and in troubleshooting LAN protocols and Catalyst switches, including the Catalyst 4500, Catalyst 6500 and Catalyst 3500. In his 5+ years with Cisco, Balaji has also held the positions of TAC Technical Leader/Expert in LAN/Campus switching, Worldwide Subject Matter Expert in LAN technologies for the Cisco TAC, and TAC support engineer in LAN/Campus switching.

Similar Messages

  • Analysing Task Audit, Data Audit and Process Flow History

    Hi,
    Internal Audit dept has requested a bunch of information that we need to compile from Task Audit, Data Audit and Process Flow History logs. We do have all the info available, however not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export required audit information?
    We do have housekeeping in place, so the logs are partly "live" db tables and partly purged tables that were exported to Excel to archive the historical log info.
    Many Thanks.

    I thought I posted this Friday, but I just noticed I never hit the 'Post Message Button', ha ha.
    The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly, or move the data to another appropriate table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
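    For what it's worth, the periodic move itself can be as simple as an insert/delete pair in a scheduled job. Here is a minimal sketch, assuming an archive table (task_audit_archive is my name, not HFM's) that has been pre-created with the same columns as the application's task audit table, and using the same "-2" time conversion explained below:
    -- Copy rows older than a cutoff into the archive table, then remove them
    -- from the live audit table; run both steps in one transaction so they
    -- succeed or fail together.
    begin transaction;
    insert into task_audit_archive
    select *
      from YourAppNameHere_task_audit
     where cast(StartTime - 2 as smalldatetime) < '1/1/2012';
    delete from YourAppNameHere_task_audit
     where cast(StartTime - 2 as smalldatetime) < '1/1/2012';
    commit transaction;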
    I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information as it is not 'human readable' in the database.
    For instance, if I wanted to pull Metadata Load, Rules Load, Member List load, you could run a query like this. (NOTE: strAppName should be equal to the name of your application .... )
    The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name. (HFM stores the times as OLE automation dates, counted from 30 December 1899, while SQL Server casts numbers to smalldatetime from a base of 1 January 1900; hence the "-2" in the casts below.)
    -- Declare working variables --
    declare @dtStartDate as nvarchar(20)
    declare @dtEndDate as nvarchar(20)
    declare @strAppName as nvarchar(20)
    declare @strSQL as nvarchar(4000)
    -- Initialize working variables --
    set @dtStartDate = '1/1/2012'
    set @dtEndDate = '8/31/2012'
    set @strAppName = 'YourAppNameHere'
    -- Get Rules Load, Metadata Load and Member List Load from the task audit table.
    -- The application name is part of the table name, so the query is built dynamically;
    -- "StartTime-2" converts the stored time to SQL Server's date base (see note above).
    set @strSQL = '
    select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (1)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (21)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (23)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
    -- Execute the assembled query
    exec sp_executesql @strSQL
    In regards to the activity codes, here's a quick breakdown of those (a join-based alternative follows the list):
    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null
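    If you need more than a handful of activities, one option is to materialize that list into a lookup table and join to it, instead of writing one UNION branch per code. A minimal sketch, using the same task audit and hsv_activity_users tables as the query above; the lookup table and its (abridged) contents are my own addition:
    -- Hypothetical lookup table holding the activity codes listed above (abridged)
    create table activity_lookup (
        ActivityID   int primary key,
        ActivityName nvarchar(50)
    );
    insert into activity_lookup (ActivityID, ActivityName)
    values (1, 'Rules Load'), (9, 'Data Load'), (21, 'Metadata Load'), (23, 'Member List Load');
    -- One query now covers every activity; filter on al.ActivityID as needed
    select au.sUserName as "User",
           al.ActivityName as Activity,
           cast(ta.StartTime - 2 as smalldatetime) as "Time Start",
           cast(ta.EndTime - 2 as smalldatetime) as "Time End",
           ta.ServerName, ta.strDescription, ta.strModuleName
      from YourAppNameHere_task_audit ta
      join hsv_activity_users au on au.lUserID = ta.ActivityUserID
      join activity_lookup al on al.ActivityID = ta.activitycode
     where cast(ta.StartTime - 2 as smalldatetime) between '1/1/2012' and '8/31/2012';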

  • Data Models and Data Flow diagrams.

    Hi  Gurus,
    Can anybody brief me on the concept of data models and data flow diagrams and their development, with illustrations? And whose responsibility is it, a technical or a functional consultant's, to translate business requirements and functional specifications into technical specifications, data flow diagrams and data models?
    Your valuable answers will be rewarded.
    Thanks in advance.

    Hi,
    Concept of Data Models
    Data model or data modelling is basically how you define or design your BW architecture based on business requirements. It deals with designing and creating an efficient BW architecture sticking to standard practices.
    Multi-Dimensional Modeling with SAP NetWeaver BI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/6ce7b0a4-0b01-0010-52ac-a6e813c35a84
    /people/githen.ronney3/blog/2008/02/13/modeling-strategies
    Modeling the Data Warehouse Layer with BI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3668618d-0c01-0010-1ab5-aa75c3a4dfc2
    /people/gilad.weinbach2/blog/2007/02/23/a-beginners-guide-to-your-first-bi-model-in-nw2004s
    Data Flow Diagrams
    This shows the path of data flow for each individual object in BW: how data gets loaded into that object, how it goes out of the object, etc.
    Right click on the data target > show data flow .
    It shows all the intermediate layers through which data comes into that particular object.
    Responsibility of a Technical or a Functional consultant
    This is generally done in the designing phase itself by a senior technical consultant, with the help of a functional consultant or a techno-functional consultant interacting with the business.
    Hope this helps.
    Thanks,
    JituK

  • SQL DM - Data Transformation and Data Movement option?

    SQL DM - Data Transformation and Data Movement option?
    I am using SQL DM 3.0.0.665. I need your thoughts on the following.
    We find that Erwin introduced Data Transformation and Data Movement functionality to support ETL need. We were able to generate ETL spec using this feature.
    Does SQL DM have any plan to introduce such features?
    How do we use the current SQL DM to build an ETL spec?
    Thanks for helping us out.

    Hello,
    I am currently experimenting with SQL Data Modeler to produce high level solution designs and ETL specifications.
    Have not completed what I am doing but so far have come up with the following:
    Current assumption I am working on:
    All objects specified within SQL Data Modeler will export to the Reporting Schema tables set up in an Oracle database. Once the data is within these tables, it will be a simple task to develop a SQL report to extract the data in any format required.
    1) There is nothing in the physical (Relational) Model section that supports this
    - though I have yet to fully use the Dimensional Modelling functionality which may have the mapping functionality required to specify an ETL
    2) We need diagrams of the processes as well as the ETL mapping
    - Process modelling is available in the Logical
    - Reverse Engineer all Physical objects to become Logical object i.e. one Table to one Entity
    - For each Entity set up an Information Structure
    (Currently this can only be done in a convoluted method via creating a diagram, creating a Flow and editing the Flow then drilling down)
    MESSAGE to SQL Data Modeler Support: Can things be set up so that Information Structures can be created directly from the Browser? The current method is a bit nonsensical.
    - You are now set up to use the Logical Process Modeling functionality to capture the ETL requirements
    - Advise that you reference the training to understand what primitive, composite and transformation processes objects are
    - Also, take the time to understand what an external agent object is
    - Will assume you know what a Data Store is
    Here is the standard I am heading towards that seems feasible, will need to run a proof of concept within the larger team to ensure it works though:
    - A Logical model is kept that is one for one with the Physical
    (the only reason for this is that there is no process modeling functionality for the Physical objects)
    MESSAGE to SQL Data Modeler Support: Can you duplicate the Process Modeling available for the Logical objects for the Physical objects too? It would be a great help in specifying ETL jobs.
    - An External Agent is used to represent an external source e.g. Billing application
    - A primitive process is used to represent the high Level design
    - A composite process is used to specify processes which can be further broken down to ETL jobs
    - A transformation process is used to represent an ETL job
    Within a Transformation process you can specify the mapping from multiple sources to a target table
    There are some negatives to this approach:
    - You lose the physical schemas the tables are part of, though a naming convention will get round this
    - You need to maintain a logical that is one for one with the physical, this is not a big overhead
    However, as I have stated in my message to the SQL Data Modeler support team, this would all be resolved if the Process Modeling functionality were also made available within the Physical objects environment.
    Please note that we have not yet adopted the above approach and are still assessing whether SQL Data Modeler will meet this requirement to our satisfaction. The critical bit will be whether the data exports to the Reporting Schema; if it does, then we have plenty of SQL resource that can produce the reports required, provided the data can be captured.
    Hope that all helps.
    Also, hope I have not missed the point of your email.
    Kind regards,
    Yusef

  • Difference b/w DATA TYPE and DATA OBJECT & differences b/w TYPE and LIKE

    hi,
    can anyone explain the differences between a data type and a data object,
    and also the differences between TYPE and LIKE?
    thanks
    Gani

    hi,
    _Data Types and Data Objects_
          Programs work with local program data – that is, with byte sequences in the working memory. Byte sequences that belong together are called fields and are characterized by a length, an identity (name), and – as a further attribute – by a data type. All programming languages have a concept that describes how the contents of a field are interpreted according to the data type.
          In the ABAP type concept, fields are called data objects. Each data object is thus an instance of an abstract data type. There are separate name spaces for data objects and data types. This means that a name can be the name of a data object as well as the name of a data type simultaneously.
    Data Types
       As well as occurring as attributes of a data object, data types can also be defined independently. You can then use them later on in conjunction with a data object. The definition of a user-defined data type is based on a set of predefined elementary data types. You can define data types either locally in the declaration part of a program (using the TYPES statement) or globally in the ABAP Dictionary. You can use your own data types to declare data objects or to check the types of parameters in generic operations.
         All programming languages distinguish between various types of data with various uses, such as character-type data for storing or displaying values and numeric data for calculations. The attributes in question are described using data types. You can define, for example, how data is stored in the repository and how ABAP statements work with the data.
    Data types can be divided into elementary, reference, and complex types.
    a. Elementary Types
    These are data types of fixed or variable length that are not made up of other types.
    The difference between variable-length data types and fixed-length data types is that the length and the memory space required by data objects of variable-length data types can change dynamically during runtime, so their length is not fixed irreversibly when the data object is declared.
    Predefined and User-Defined Elementary Data Types
    You can also define your own elementary data types in ABAP using the TYPES statement. You base these on the predefined data types. This determines all of the technical attributes of the new data type. For example, you could define a data type P_2 with two decimal places, based on the predefined data type P. You could then use this new type in your data declarations.
    b.  Reference Types
    Reference types are deep data types that describe reference variables, that is, data objects that contain references. A reference variable can be defined as a component of a complex data object such as a structure or internal table as well as a single field.
    c. Complex Data Types
    Complex data types are made up of other data types. A distinction is made here between structured types and table types.
    Data Objects
          Data objects are the physical units with which ABAP statements work at runtime. The contents of a data object occupy memory space in the program. ABAP statements access these contents by addressing the name of the data object and interpret them according to the data type. For example, statements can write the contents of data objects in lists or in the database, they can pass them to and receive them from routines, they can change them by assigning new values, and they can compare them in logical expressions.
           Each ABAP data object has a set of technical attributes, which are fully defined at all times when an ABAP program is running (field length, number of decimal places, and data type). You declare data objects either statically in the declaration part of an ABAP program (the most important statement for this is DATA), or dynamically at runtime (for example, when you call procedures). As well as fields in the memory area of the program, the program also treats literals like data objects.
            A data object is a part of the repository whose content can be addressed and interpreted by the program. All data objects must be declared in the ABAP program and are not persistent, meaning that they only exist while the program is being executed. Before you can process persistent data (such as data from a database table or from a sequential file), you must read it into data objects first. Conversely, if you want to retain the contents of a data object beyond the end of the program, you must save it in a persistent form.
    Declaring Data Objects
          Apart from the interface parameters of procedures, you declare all of the data objects in an ABAP program or procedure in its declaration part. These declarative statements establish the data type of the object, along with any missing technical attributes. This takes place before the program is actually executed. The technical attributes can then be queried while the program is running.
         The interface parameters of procedures are generated as local data objects, but only when the procedure is actually called. You can define the technical attributes of the interface parameters in the procedure itself. If you do not, they adopt the attributes of the parameters from which they receive their values.
    ABAP contains the following kinds of data objects:
    a.  Literals
    Literals are not created by declarative statements. Instead, they exist in the program source code. Like all data objects, they have fixed technical attributes (field length, number of decimal places, data type), but no name. They are therefore referred to as unnamed data objects.
    b.  Named Data Objects
    Data objects that have a name that you can use to address them in the ABAP program are known as named data objects. These can be objects of various types, including text symbols, variables and constants.
    Text symbols are pointers to texts in the text pool of the ABAP program. When the program starts, the corresponding data objects are generated from the texts stored in the text pool. They can be addressed using the name of the text symbol.
    Variables are data objects whose contents can be changed using ABAP statements. You declare variables using the DATA, CLASS-DATA, STATICS, PARAMETERS, SELECT-OPTIONS, and RANGES statements.
    Constants are data objects whose contents cannot be changed. You declare constants using the CONSTANTS statement.
    c.  Anonymous Data Objects
    Data objects that cannot be addressed using a name are known as anonymous data objects. They are created using the CREATE DATA statement and can be addressed using reference variables.
    d.  System-Defined Data Objects
    System-defined data objects do not have to be declared explicitly - they are always available at runtime.
    e.  Interface Work Areas
    Interface work areas are special variables that serve as interfaces between programs, screens, and logical databases. You declare interface work areas using the TABLES and NODES statements.
    What is the difference between Type and Like?
    Answer1:
    With TYPE, you assign a data type directly to the data object while declaring it.
    With LIKE, you assign the data type of another object to the data object being declared. The data type is referenced indirectly.
    Answer2:
    Type is a keyword used to refer to a data type, whereas Like is a keyword used to copy the properties of an already existing data object.
    Answer3:
    TYPE refers to an existing data type;
    LIKE refers to an existing data object.
    reward if useful
    thanks and regards
    suma sailaja pvn

  • Data Integrator and Data Quality

    Hi,
    I worked on BI 7.0 and am now exploring BusinessObjects. I wanted to know how the Data Integrator and Data Quality products relate to SAP BI 7. What is the purpose of using them? Are these integration and quality features not embedded in SAP BI 7?

    BI and Data Services share many integration features, but Data Services provides better connectivity to third-party data sources. The data quality features of Data Services are not available in BI.

  • Data integrator for HP-UX missing data quality and data profiling

    Hi All,
    I have installed ODI 10.1.3.40.0 from odi_unix_generic_10.1.3.4.0.zip on HP-UX 11.23.
    Data quality and data profiling are missing from that zip. Could anyone please help me find the installer for data quality and data profiling for Oracle Data Integrator on HP-UX 11.23?
    Any response will be highly appreciated.
    Thanks in advance.
    regards
    Umapada
    Edited by: user10612738 on Nov 28, 2008 1:54 AM

    The integrated package (ODI, Data Quality and Data Profiling) for HP-UX won't be available until 10.1.3.5.0, which will be released by the end of this year.

  • ODI Data Profiling and Data Quality

    Hi experts,
    Searching for ODI features for data profiling and data quality, I found (I think) many... extensions? for the product, which confuse me, because I think there are at least three different ways to do data profiling and data quality:
    In the first place, I found that ODI has out-of-the-box features for data profiling and data quality, but, according to the paper, these features are too limited.
    The second way I found was the product Oracle Data Profiling and Oracle Data Quality for Oracle Data Integrator 11gR1 (11.1.1.3.0), which is on the ODI download page. According to the page, this product extends the existing inline Data Quality and Data Profiling features of ODI.
    Finally, the third way is Oracle Enterprise Data Quality, which is another product that can be integrated with ODI.
    I don't know if I have understood my alternatives correctly. In fact, I need a general explanation of what ODI offers for Data Quality and Data Profiling. Can you help me understand this?
    Many thanks in advance.

    Hi, after the 11.1.1.3 release of ODI, Oracle no longer supports ODP/ODQ, which is a Trillium Software product not owned by Oracle. Oracle recommends using OEDQ for data quality purposes. It's better to spend time on OEDQ rather than trying to learn and implement ODP/ODQ in ODI.

  • Oracle Data Profiling and Data Quality

    Hi,
    How do I create a metabase for Oracle Data Profiling and Data Quality? Are the metabase and the repository the same thing?

    Hi,
    You can create a metabase in the Metabase Manager:
    - Expand Control Admin
    - Click on Metabases
    - in the Metabases window, right-click on the white area and select Add...
    - go through the wizard to create your metabase
    This is documented in the ODQ/ODP tutorial (http://www.oracle.com/technology/products/oracle-data-quality/pdf/oracledq_tutorial.pdf) and in the Documentation (in Metabase Manager or Oracle Data Quality go to Help and then Manuals).
    Thanks,
    Julien

  • Is there a way to manually custom-edit the "Date created" and "Date modified" attributes of files in Mac OS 10.6?

    In "List View" in a Finder window, among the many column options are "Date created" and "Date modified." In "View Options" (command-J) for any folder, one can add these columns, along with the standard ones "Size," "Kind," "Label," etc.
    A user can alter a file's name, and a file's "label" (i.e. its color). On the other hand, a user can NOT arbitrarily edit/alter a file's official "size" -- other than by physically altering the contents of the file itself, obviously.
    But what about a file's "Date created" and "Date modified"? Can either of those be manually edited/changed, just as a file's name can be changed -- or is a file's creation-date an immutable attribute beyond the editorial reach of the user, just as a file's "size" is?
    And yes, a person can "alter" a file's "Date modified" by simply modifying the file, which would change its "Date modified" to be the moment it was last altered (i.e. right now). But (and here's the key question) can a user somehow get inside a file's defining attributes and arbitrarily change a file's modification date to be at any desired point in the past that's AFTER its creation date and BEFORE the present moment? Or will so doing cause the operating system to blow a gasket?
    If it is possible to arbitrarily manually alter a file's creation date or modification date, how would it be done (in 10.6)? And if it is NOT possible, then why not?

    sanjempet --
    Whew, that's a relief!
    But as for your workaround solution: All it will achieve is altering the created and modified dates to RIGHT NOW. What I'm looking to do is to alter the modification/creation dates to some point in the past.
    I'm not doing this for any nefarious reason. I just like to organize my work files chronologically according to when each project was initiated, but sometimes I forget to gather the disparate documents into one folder right at the beginning. As a result, sometimes after I finish a job I will create a new folder to permanently house all the files in an old project, and when that folder is placed in a bigger "completed projects" folder and then organized by "Date created" or "Date modified" in list view, it is out of order chronologically, because the creation and modification dates of that particular project folder reflect when the folder was created (i.e. today), not when the files inside the folder were created (i.e. weeks or months ago).
    The simplest solution would simply to be able to back-date the folder's creation or modification date to match the date that the project actually started!

  • I have had my iPhone 5s for 2 months, and all the photos and data (which were not backed up) were erased after I restored it from my old iPhone 4s backup. Can I get my unsaved photos and data back, and if so, how?

    I have had my iPhone 5s for 2 months, and all the photos and data (which were not backed up) were erased after I restored it from my old iPhone 4s backup. Can I get my unsaved photos and data back, and if so, how?

    If you never backed up your phone to iTunes and never backed it up to iCloud, all of your content is gone forever.

  • Data Carriers and Data carrier type

    Hi SAP gurus,
    Please let me know whether there is any necessity of defining any new data carrier types other than PC,
    and if so, under what circumstances and requirements.
    Also, please throw some light on data carriers: when is a new data carrier required to be created,
    and under what circumstances and requirements?
    Is it to access a document from a different server or a different system? If so, why is this required, as the documents are always stored in a central content server and, once checked in, anyone can access the document from the server?
    2) Can we make use of application software installed in one system and make it run in another system? For example,
    if there are, say, 4 systems in total, and among these system S1 has AutoCAD installed
    while the other 3 systems do not, can systems 2, 3 and 4 edit AutoCAD drawings on their systems? Is this the situation where data carriers and data carrier types come into the picture?
    Please throw some light on this.
    Thanks and regards
    Sathish

    Dear Christoph Hopf,
    Thanks a lot for your quick response.
    As you have explained, I assume it is mainly used to store a local or extra copy of the original application files on the system we use to view them, or on any other system where we want to store a copy.
    Example: say there are 3 systems, S1, S2 and S3.
    If we are viewing an original file on system S1 and want the same copy to be stored on system S2 or S3,
    then this data carrier configuration is required, if I am right?
    But sir, apart from this functionality, can I also know the purpose of the SAP definition, as it defines the data carrier type as below:
    The system requires the data carrier type in order to exchange data with other data carriers.
    This is the case, for example, if the user:
    wants to process original application files stored on another computer
    wants to use an application installed on another computer
    If I interpret this the right way,
    it tells us that, by use of data carriers and data carrier types,
    we can access an original file stored in different systems.
    If so, why is this required, as the documents are always stored in a central content server and, once checked in, anyone can access the document from the server?
    2) Can we make use of application software installed in one system and make it run in another system? For example,
    if there are, say, 3 systems in total, and among these system S1 has AutoCAD installed
    while the other 2 systems do not, can systems 2 and 3 edit AutoCAD drawings on their systems? Is this the situation where data carriers and data carrier types come into the picture?
    If so, how is this made possible?
    Please explain with an example, or screenshots in the Wiki if possible.
    Can you tell me the difference between a data carrier and data carrier types, with examples as well?
    Thanks and regards
    Sathish
    Edited by: sathish kumar on May 22, 2008 2:17 AM

  • Document on XML date rules and date management

    Hi,
    Could you please send a document on XML date rules and date management? I really need it. You may send it to [email protected]

    Hello
    Could you please email this to me:  [email protected]
    Thanks in advance
    Kevin

  • Calendar using Date From and Date To

    Hi,
    I'm trying to build a calendar based on a table that has a user_name, date_from, date_to and comments.
    Basically I want the calendar to show all the dates between the date from and the date to. Is there any way of getting the calendar to use those dates and all the days in between, or will I have to build a view that returns one row for each day between the start and end?
    What's the best way of doing this, and does anyone have an example?
    Thanks in advance

    Andy,
    See my blog post here:
    http://deneskubicek.blogspot.com/2007/05/create-pseudo-tables.html
    and the corresponding example:
    http://htmldb.oracle.com/pls/otn/f?p=31517:83
    Denes Kubicek
    http://deneskubicek.blogspot.com/
    http://htmldb.oracle.com/pls/otn/f?p=31517:1
    -------------------------------------------------------------------
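    For reference, the usual trick behind those links (generating "pseudo table" rows) is to produce one row per day covered by each range, so the calendar has a single date column to work from. A minimal sketch in Oracle SQL, assuming a table named bookings with the user_name, date_from, date_to and comments columns described in the question:
    -- Expand each (date_from, date_to) range into one row per calendar day
    select b.user_name,
           b.date_from + d.rn - 1 as cal_date,   -- day N of the range
           b.comments
      from bookings b,
           (select level as rn
              from dual
            connect by level <= 366) d           -- number generator; caps each range at a year
     where d.rn <= b.date_to - b.date_from + 1   -- keep only days inside the range
     order by b.user_name, cal_date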

  • How to view "date from" and "date to"

    Hi,
    I have loaded EMP data. In the mapping, "date from" and "date to" are visible, but these fields are not seen when I display the EMP data. Can someone please tell me how to view these fields?
    Thanks,
    Williams

    I got your question. Please find my understanding below.
    In the transformation, please check whether a routine or formula is written that is stopping the data
    from being updated to the target.
    I mean, the data is there in the source, but if it's not updating to the target, it seems some rules have been set up in the transformation. Please check.
    Assigning marks is the best way to appreciate help.
